Sample records for point information measuring

  1. 46 CFR 153.908 - Cargo viscosity and melting point information; measuring cargo temperature during discharge...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 5 2014-10-01 2014-10-01 false Cargo viscosity and melting point information; measuring... Cargo viscosity and melting point information; measuring cargo temperature during discharge: Categories... lading, a written statement of the following: (1) For Category A or B NLS, the cargo's viscosity at 20 °C...

  2. 46 CFR 153.908 - Cargo viscosity and melting point information; measuring cargo temperature during discharge...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 5 2013-10-01 2013-10-01 false Cargo viscosity and melting point information; measuring... Cargo viscosity and melting point information; measuring cargo temperature during discharge: Categories... lading, a written statement of the following: (1) For Category A or B NLS, the cargo's viscosity at 20 °C...

  3. 46 CFR 153.908 - Cargo viscosity and melting point information; measuring cargo temperature during discharge...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 5 2012-10-01 2012-10-01 false Cargo viscosity and melting point information; measuring... Cargo viscosity and melting point information; measuring cargo temperature during discharge: Categories... lading, a written statement of the following: (1) For Category A or B NLS, the cargo's viscosity at 20 °C...

  4. The small low SNR target tracking using sparse representation information

    NASA Astrophysics Data System (ADS)

    Yin, Lifan; Zhang, Yiqun; Wang, Shuo; Sun, Chenggang

    2017-11-01

    Tracking small targets, such as missile warheads, from a remote distance is a difficult task because the targets are "points" that resemble the sensor's noise points. As a result, traditional tracking algorithms use only the information contained in a point measurement, such as position and intensity, as characteristics to distinguish targets from noise points. In fact, because of photon diffusion, a small target is not a point in the focal plane array: it occupies an area larger than one sensor cell. If this geometric characteristic can be taken into account as a new dimension of information, it helps distinguish targets from noise points. In this paper, we use sparse representation (SR) to describe the geometric information of the target intensity and define it as the SR information of the target. By modeling the intensity spread and solving for its SR coefficients, the SR information is represented through its likelihood function. This SR likelihood is then incorporated into the conventional Probability Hypothesis Density (PHD) filter with point measurements. To illustrate the performance of the algorithm with and without the SR information, detection capability and estimation error were compared through simulation. Results demonstrate that the proposed method achieves higher estimation accuracy and a higher probability of detecting the target than the conventional algorithm without SR information.

  5. Two-point method uncertainty during control and measurement of cylindrical element diameters

    NASA Astrophysics Data System (ADS)

    Glukhov, V. I.; Shalay, V. V.; Radev, H.

    2018-04-01

    The article addresses the pressing problem of the reliability of measurements of the geometric specifications of technical products. Its purpose is to improve the quality of linear-size control of parts by the two-point measurement method, and its task is to investigate the methodical expanded uncertainties in measuring the linear sizes of cylindrical elements. The investigation method is geometric modeling of the shape and location deviations of the element surfaces in a rectangular coordinate system. The studies were carried out for elements of various service uses, taking into account their informativeness, which corresponds to the classes of kinematic pairs in theoretical mechanics and to the number of constrained degrees of freedom in the datum element function. Cylindrical elements with informativeness of 4, 2, 1 and 0 (zero) were investigated. The uncertainty of two-point measurements was estimated by comparing the results of linear-dimension measurements with the maximum and minimum functional diameters of the element material. Methodical uncertainty arises when cylindrical elements with maximum informativeness have shape deviations of the cut and curvature types, and when the average size of the element is measured for any type of shape deviation. The two-point measurement method cannot take into account the location deviations of a dimensional element, so its use for elements with less than maximum informativeness creates unacceptable methodical uncertainties in measurements of the maximum, minimum and mean linear dimensions. Similar methodical uncertainties also arise in the arbitration control of the linear dimensions of cylindrical elements by limit two-point gauges.
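
A known failure mode discussed in this record, odd-lobed form deviation that a two-point (caliper-type) measurement cannot detect, can be sketched numerically. The three-lobed profile below is an illustrative model, not data from the paper:

```python
import numpy as np

# Illustrative three-lobed profile: r(theta) = R + a*cos(3*theta)
R, a = 10.0, 0.1          # nominal radius and lobing amplitude, mm
theta = np.linspace(0.0, 2 * np.pi, 3600, endpoint=False)
r = R + a * np.cos(3 * theta)

# Two-point size across the centre: r(theta) + r(theta + pi)
r_opposite = R + a * np.cos(3 * (theta + np.pi))
two_point_size = r + r_opposite

# cos(3*(theta + pi)) = -cos(3*theta), so the lobing cancels exactly:
# every two-point reading equals 2*R although the form deviates by +/- a.
print(two_point_size.max() - two_point_size.min())  # ~0 (rounding only)
print(r.max() - r.min())                            # 2*a = 0.2
```

Because opposite radii cancel the odd lobing, every two-point reading returns the mean size 2R despite the real form error, which is exactly the kind of methodical uncertainty the study quantifies.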

  6. The guidance methodology of a new automatic guided laser theodolite system

    NASA Astrophysics Data System (ADS)

    Zhang, Zili; Zhu, Jigui; Zhou, Hu; Ye, Shenghua

    2008-12-01

    Spatial coordinate measurement systems such as theodolites, laser trackers and total stations are widely applied in manufacturing and certification processes. Traditional theodolite operation is manual and time-consuming, which does not meet the needs of online industrial measurement; laser trackers and total stations need reflective targets and therefore cannot realize noncontact, automatic measurement. A new automatic guided laser theodolite system is presented to achieve automatic, noncontact measurement with high precision and efficiency. It comprises two sub-systems: the basic measurement system and the control and guidance system. The former is formed by two motorized laser theodolites that accomplish the fundamental measurement tasks, while the latter consists of a camera and vision-system unit mounted on a mechanical displacement unit to provide azimuth information for the measured points. The mechanical displacement unit can rotate horizontally and vertically to direct the camera to the desired orientation, so that the camera can scan every measured point in the measuring field; the azimuth of the corresponding point is then calculated so that the motorized laser theodolites can move accordingly to aim at it. In this paper the composition and measuring principle of the whole system are analyzed, with emphasis on the guidance methodology by which the laser points from the theodolites are moved toward the measured points. The guidance process is implemented through the coordinate transformation between the basic measurement system and the control and guidance system. From the field-of-view angle of the vision-system unit and the world coordinates of the control and guidance system, obtained through coordinate transformation, the azimuth information of the measurement area at which the camera points can be attained.
    The momentary horizontal and vertical changes of the mechanical displacement movement are also considered and calculated to provide real-time azimuth information of the pointed measurement area, according to which the motorized theodolite moves. This methodology realizes predetermined location of the laser points within the camera-pointed scope, accelerating the measuring process and replacing manual operation with approximate guidance. Simulation results show that the proposed automatic guidance method is effective and feasible, providing good tracking performance for the predetermined location of the laser points.
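
The guidance step ultimately reduces to converting a measured point's coordinates, expressed in the control-and-guidance frame, into pan and tilt angles for the displacement unit. A minimal sketch, with the frame conventions assumed rather than taken from the paper:

```python
import math

def point_to_azimuth_elevation(x, y, z):
    """Horizontal (azimuth) and vertical (elevation) angles, in degrees,
    of a point expressed in the guidance unit's own coordinate frame."""
    azimuth = math.degrees(math.atan2(y, x))
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
    return azimuth, elevation

# A point 1 m ahead, 1 m left and 1 m up -> pan 45 deg, tilt ~35.26 deg
az, el = point_to_azimuth_elevation(1.0, 1.0, 1.0)
print(round(az, 2), round(el, 2))  # 45.0 35.26
```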

  7. Control surface in aerial triangulation

    NASA Astrophysics Data System (ADS)

    Jaw, Jen-Jer

    With the increased availability of surface-related sensors, collecting surface information is easier and more straightforward than ever before. In this study, the author proposes a model in which surface information is integrated into the aerial triangulation workflow by hypothesizing plane observations in object space; the object points estimated via photo measurements (or matching), together with the adjusted surface points, provide a better point group describing the surface. The algorithms require no special structure of the surface points and involve no interpolation. The suggested measuring strategy (pairwise measurements) results in a fluent and favorable working environment when taking measurements. Furthermore, the extension of the model employing the surface plane proves useful in tying photo models. The proposed model has been proven to work by simulation and has been carried out in the photogrammetric laboratory.

  8. A fingerprint classification algorithm based on combination of local and global information

    NASA Astrophysics Data System (ADS)

    Liu, Chongjin; Fu, Xiang; Bian, Junjie; Feng, Jufu

    2011-12-01

    Fingerprint recognition is one of the most important technologies in biometric identification and has been widely applied in commercial and forensic areas. Fingerprint classification, the fundamental procedure in fingerprint recognition, can sharply decrease the number of candidates for fingerprint matching and improve the efficiency of fingerprint recognition. Most fingerprint classification algorithms are based on the number and position of singular points. Because singular-point detection methods commonly consider only local information, these classification algorithms are sensitive to noise. In this paper, we propose a novel fingerprint classification algorithm combining the local and global information of the fingerprint. First, we use local information to detect singular points and measure their quality, considering the orientation structure and image texture in adjacent areas. Then a global orientation model is adopted to measure the reliability of the singular-point group. Finally, the local quality and global reliability are weighted to classify the fingerprint. Experiments demonstrate the accuracy and effectiveness of our algorithm, especially for poor-quality fingerprint images.
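
The final weighting step can be sketched as a convex combination of per-class scores. The weight value and score semantics below are assumptions for illustration, not the paper's tuned parameters:

```python
def classify(local_scores, global_scores, w=0.6):
    """Weighted combination of per-class local singular-point quality and
    global orientation-model reliability; returns the best class."""
    assert local_scores.keys() == global_scores.keys()
    combined = {c: w * local_scores[c] + (1.0 - w) * global_scores[c]
                for c in local_scores}
    return max(combined, key=combined.get), combined

# Hypothetical per-class scores for the five Henry classes
local_q  = {"arch": 0.2, "tented": 0.1, "left": 0.8, "right": 0.3, "whorl": 0.4}
global_r = {"arch": 0.1, "tented": 0.2, "left": 0.7, "right": 0.5, "whorl": 0.3}
best, scores = classify(local_q, global_r)
print(best)  # left
```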

  9. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    NASA Astrophysics Data System (ADS)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For many applications, a point cloud of good perceptual quality must be extracted from the scanned points. Most existing methods extract important points at a fixed scale, yet the geometric features of a 3D object arise at various scales. We propose a multi-scale construction method based on radial basis functions. At each scale, important points are extracted from the point cloud according to their importance. We apply the perceptual metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments were undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from scanned objects.

  10. Three-dimensional displacement measurement of image point by point-diffraction interferometry

    NASA Astrophysics Data System (ADS)

    He, Xiao; Chen, Lingfeng; Meng, Xiaojie; Yu, Lei

    2018-01-01

    This paper presents a method for measuring the three-dimensional (3-D) displacement of an image point based on point-diffraction interferometry. An object point-light-source (PLS) interferes with a fixed PLS, and their interferograms are captured at an exit pupil. When the image point of the object PLS is slightly shifted to a new position, the wavefront of the image PLS changes, and so do its interferograms. Processing the interferograms captured before and after the movement yields the wavefront difference of the image PLS, which contains the information on its 3-D displacement. However, this displacement cannot be calculated until the distance between the image PLS and the exit pupil has been calibrated. We therefore use a plane-parallel plate of known refractive index and thickness to determine this distance, based on Snell's law for small angles of incidence. With the distance between the exit pupil and the image PLS known, the 3-D displacement of the image PLS can be calculated from two interference measurements. Preliminary experimental results indicate a relative error below 0.3%. With the ability to accurately locate an image point (whether real or virtual), a fiber point-light-source can act as the reticle by itself in optical measurement.
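
For the plane-parallel-plate calibration, the standard small-angle (paraxial) result is that a plate of thickness t and refractive index n shifts an image longitudinally by Δz = t(1 − 1/n). A sketch of that relation only; the paper's full calibration procedure involves more than this one formula:

```python
def axial_shift(thickness_mm, n):
    """Paraxial (small-angle) longitudinal image shift introduced by a
    plane-parallel plate: delta_z = t * (1 - 1/n)."""
    return thickness_mm * (1.0 - 1.0 / n)

# BK7-like plate, 5 mm thick, n = 1.5168: shift of about 1.70 mm
print(round(axial_shift(5.0, 1.5168), 4))
```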

  11. Fusion of light-field and photogrammetric surface form data

    NASA Astrophysics Data System (ADS)

    Sims-Waterhouse, Danny; Piano, Samanta; Leach, Richard K.

    2017-08-01

    Photogrammetry-based systems are able to produce 3D reconstructions of an object from a set of images taken at different orientations. In this paper, we implement a light-field camera within a photogrammetry system in order to capture additional depth information alongside the photogrammetric point cloud. Compared with a traditional camera, which only captures the intensity of the incident light, a light-field camera also provides angular information for each pixel. In principle, this additional information allows 2D images to be reconstructed at a given focal plane, and hence a depth map to be computed. Through the fusion of light-field and photogrammetric data, we show that it is possible to reduce the measurement uncertainty for a millimetre-scale 3D object compared with that of the individual systems. By imaging a series of test artefacts from various positions, individual point clouds were produced from depth-map information and from triangulation of corresponding features between images. Using both measurements, data fusion methods were implemented to provide a single point cloud with reduced measurement uncertainty.
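
When the two systems provide independent estimates of the same coordinate with known variances, the textbook inverse-variance fusion rule yields a combined estimate whose variance is below either input's. A sketch with invented numbers; the paper's actual fusion method may weight the data differently:

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent measurements;
    the fused variance is always below the smaller input variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var

# Light-field depth estimate vs. photogrammetric triangulation (made-up values, mm)
z, var = fuse(10.30, 0.04, 10.20, 0.01)
print(round(z, 3), round(var, 4))  # 10.22 0.008
```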

  12. Electric-Field Instrument With Ac-Biased Corona Point

    NASA Technical Reports Server (NTRS)

    Markson, R.; Anderson, B.; Govaert, J.

    1993-01-01

    Measurements indicative of incipient lightning yield additional information, and the new instrument gives reliable readings. A high-voltage ac bias applied to the needle point through a high-resistance capacitance network provides corona discharge at all times, enabling the more slowly varying component of the electrostatic potential of the needle to come to equilibrium with the surrounding air. The high resistance of the high-voltage coupling makes the instrument insensitive to wind. The improved corona-point instrument is expected to yield additional information assisting in safety-oriented forecasting of lightning.

  13. PL-VIO: Tightly-Coupled Monocular Visual–Inertial Odometry Using Point and Line Features

    PubMed Central

    Zhao, Ji; Guo, Yue; He, Wenhao; Yuan, Kui

    2018-01-01

    To address the problem of estimating the camera trajectory and building a structural three-dimensional (3D) map from inertial measurements and visual observations, this paper proposes point–line visual–inertial odometry (PL-VIO), a tightly-coupled monocular visual–inertial odometry system exploiting both point and line features. Compared with point features, lines provide significantly more geometrical structure information about the environment. To obtain both computational simplicity and representational compactness for a 3D spatial line, Plücker coordinates and an orthonormal representation of the line are employed. To tightly and efficiently fuse the information from inertial measurement units (IMUs) and visual sensors, we optimize the states by minimizing a cost function which combines the pre-integrated IMU error term with the point and line re-projection error terms in a sliding-window optimization framework. Experiments on public datasets demonstrate that PL-VIO, which combines point and line features, outperforms several state-of-the-art VIO systems that use point features only. PMID:29642648
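
The Plücker coordinates mentioned here represent a 3D line by a direction vector and a moment vector bound by an orthogonality constraint. A minimal sketch of the construction; the orthonormal representation and IMU pre-integration machinery are not shown:

```python
import numpy as np

def plucker_from_points(p1, p2):
    """Plücker coordinates (d, m) of the 3D line through p1 and p2:
    direction d = p2 - p1, moment m = p1 x p2. They satisfy d . m = 0."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    m = np.cross(p1, p2)
    return d, m

# Line through (1,0,0) and (1,1,0): direction (0,1,0), moment (0,0,1)
d, m = plucker_from_points([1.0, 0.0, 0.0], [1.0, 1.0, 0.0])
print(d, m, np.dot(d, m))  # the constraint d . m is exactly 0
```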

  14. Set membership experimental design for biological systems.

    PubMed

    Marvel, Skylar W; Williams, Cranos M

    2012-03-21

    Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. The practicability of our approach is illustrated with a case study. 
This study shows that our approach is able to 1) identify candidate measurement time points that maximize information corresponding to biologically relevant metrics and 2) determine the number at which additional measurements begin to provide insignificant information. This framework can be used to balance the availability of resources with the addition of one or more measurement time points to improve the predictability of resulting models.
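
The core idea, propagating bounded (interval) uncertainty to candidate time points and preferring measurements where the predicted range is widest, can be sketched on a toy monotone decay model. The model, intervals, and widest-range selection rule below are illustrative stand-ins for the paper's interval-analysis machinery and biologically relevant metrics:

```python
import math

def state_interval(t, x0_int, k_int):
    """Bounds on x(t) = x0 * exp(-k*t) when x0 and k are only known to lie
    in intervals; for this monotone model the extremes are attained at the
    interval endpoints (a simple stand-in for full interval analysis)."""
    x0_lo, x0_hi = x0_int
    k_lo, k_hi = k_int
    return x0_lo * math.exp(-k_hi * t), x0_hi * math.exp(-k_lo * t)

# Candidate time points: prefer the one with the widest predicted range,
# i.e. where a new measurement would cut the most uncertainty.
candidates = [0.5, 1.0, 2.0, 4.0]
widths = {}
for t in candidates:
    lo, hi = state_interval(t, (0.9, 1.1), (0.4, 0.6))
    widths[t] = hi - lo
best_t = max(widths, key=widths.get)
print(best_t)  # 1.0
```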

  15. Set membership experimental design for biological systems

    PubMed Central

    2012-01-01

    Background Experimental design approaches for biological systems are needed to help conserve the limited resources that are allocated for performing experiments. The assumptions used when assigning probability density functions to characterize uncertainty in biological systems are unwarranted when only a small number of measurements can be obtained. In these situations, the uncertainty in biological systems is more appropriately characterized in a bounded-error context. Additionally, effort must be made to improve the connection between modelers and experimentalists by relating design metrics to biologically relevant information. Bounded-error experimental design approaches that can assess the impact of additional measurements on model uncertainty are needed to identify the most appropriate balance between the collection of data and the availability of resources. Results In this work we develop a bounded-error experimental design framework for nonlinear continuous-time systems when few data measurements are available. This approach leverages many of the recent advances in bounded-error parameter and state estimation methods that use interval analysis to generate parameter sets and state bounds consistent with uncertain data measurements. We devise a novel approach using set-based uncertainty propagation to estimate measurement ranges at candidate time points. We then use these estimated measurements at the candidate time points to evaluate which candidate measurements furthest reduce model uncertainty. A method for quickly combining multiple candidate time points is presented and allows for determining the effect of adding multiple measurements. Biologically relevant metrics are developed and used to predict when new data measurements should be acquired, which system components should be measured and how many additional measurements should be obtained. Conclusions The practicability of our approach is illustrated with a case study. 
This study shows that our approach is able to 1) identify candidate measurement time points that maximize information corresponding to biologically relevant metrics and 2) determine the number at which additional measurements begin to provide insignificant information. This framework can be used to balance the availability of resources with the addition of one or more measurement time points to improve the predictability of resulting models. PMID:22436240

  16. Spatial and spectral imaging of point-spread functions using a spatial light modulator

    NASA Astrophysics Data System (ADS)

    Munagavalasa, Sravan; Schroeder, Bryce; Hua, Xuanwen; Jia, Shu

    2017-12-01

    We develop a point-spread function (PSF) engineering approach to imaging the spatial and spectral information of molecular emissions using a spatial light modulator (SLM). We show that a dispersive grating pattern imposed upon the emission reveals spectral information. We also propose a deconvolution model that allows the decoupling of the spectral and 3D spatial information in engineered PSFs. The work is readily applicable to single-molecule measurements and fluorescence microscopy.

  17. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan W.

    2014-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
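
The residual-monitoring core of such an architecture can be sketched as follows; the trim-point model, noise levels and 3σ threshold are illustrative assumptions, not the paper's engine model:

```python
import numpy as np

def residual_anomalies(sensed, predicted, threshold_sigma=3.0):
    """Flag samples whose residual (sensed minus model-predicted output)
    deviates by more than threshold_sigma standard deviations."""
    residuals = np.asarray(sensed) - np.asarray(predicted)
    sigma = residuals.std() or 1.0  # guard against all-zero residuals
    return np.where(np.abs(residuals - residuals.mean()) > threshold_sigma * sigma)[0]

# Streaming engine output vs. piecewise-linear model prediction (synthetic)
rng = np.random.default_rng(0)
predicted = np.full(200, 550.0)             # trim-point output, e.g. a temperature
sensed = predicted + rng.normal(0, 1.0, 200)
sensed[120] += 25.0                         # seeded fault
print(residual_anomalies(sensed, predicted))  # flags sample 120 only
```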

  18. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.

  19. EAST kinetic equilibrium reconstruction combining with Polarimeter-Interferometer internal measurement constraints

    NASA Astrophysics Data System (ADS)

    Lian, H.; Liu, H. Q.; Li, K.; Zou, Z. Y.; Qian, J. P.; Wu, M. Q.; Li, G. Q.; Zeng, L.; Zang, Q.; Lv, B.; Jie, Y. X.; EAST Team

    2017-12-01

    Plasma equilibrium reconstruction plays an important role in tokamak plasma research. With high temporal and spatial resolution, the POlarimeter-INTerferometer (POINT) system on EAST has provided effective measurements for 102 s H-mode operation. Based on the internal Faraday rotation measurements provided by the POINT system, equilibrium reconstruction with a more accurate core current-profile constraint has been demonstrated successfully on EAST. Combining other experimental diagnostics with external magnetic-field measurements, the kinetic equilibrium has also been reconstructed on EAST. By taking the pressure and edge-current information from kinetic EFIT into the equilibrium reconstruction with the Faraday rotation constraint, the new equilibrium reconstruction not only provides a more accurate internal current profile but also contains edge-current and pressure information. A result for one time slice using the new kinetic equilibrium reconstruction with POINT data constraints is demonstrated in this paper; it shows a reversed shear of the q profile and also captures the pressure profile. The improved equilibrium reconstruction will be of great help to future theoretical analysis.

  20. A robust approach to using of the redundant information in the temperature calibration

    NASA Astrophysics Data System (ADS)

    Strnad, R.; Kňazovická, L.; Šindelář, M.; Kukal, J.

    2013-09-01

    Calibration laboratories use standard procedures for calculating calibration-model coefficients based on well-described standards (EN 60751, ITS-90, EN 60584, etc.). In practice, sensors are usually calibrated at more points than the model requires, and the redundant information is used to validate the model. This paper presents the influence of including all measured points, with respect to their uncertainties, in the fitted models using standard weighted least-squares methods. A special case concerning the different uncertainty levels of the measured points under a robust approach is discussed; this leads to different minimization criteria and a different uncertainty-propagation methodology. The approach also eliminates the influence of outlier measurements on the calibration. In the practical part, three cases of this approach are presented: industrial calibration according to EN 60751, an SPRT according to the ITS-90, and a thermocouple according to EN 60584.
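
The weighting idea, namely that points measured with smaller uncertainty constrain the fitted model more, can be sketched for the EN 60751 Pt100 characteristic R(t) = R0(1 + At + Bt²), using the standard's nominal coefficients to synthesize data:

```python
import numpy as np

# Nominal EN 60751 Pt100 coefficients (t >= 0 degC): R = R0*(1 + A*t + B*t^2)
R0, A, B = 100.0, 3.9083e-3, -5.775e-7

t = np.array([0.0, 50.0, 100.0, 150.0, 200.0])     # calibration points, degC
u = np.array([0.002, 0.005, 0.005, 0.010, 0.010])  # resistance uncertainties, ohm
rng = np.random.default_rng(1)
R = R0 * (1 + A * t + B * t**2) + rng.normal(0, u)  # simulated readings

# Weighted least squares: scaling each row by 1/u_i makes ordinary least
# squares minimize sum_i ((R_i - model_i)/u_i)^2, so the most precise
# points constrain the fit the most.
X = np.column_stack([np.ones_like(t), t, t**2])
w = 1.0 / u
coef, *_ = np.linalg.lstsq(X * w[:, None], R * w, rcond=None)
print(coef)  # approximately [R0, R0*A, R0*B]
```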

  1. Fusing Satellite-Derived Irradiance and Point Measurements through Optimal Interpolation

    NASA Astrophysics Data System (ADS)

    Lorenzo, A.; Morzfeld, M.; Holmgren, W.; Cronin, A.

    2016-12-01

    Satellite-derived irradiance is widely used throughout the design and operation of a solar power plant. While satellite-derived estimates cover a large area, they also have large errors compared to point measurements from sensors on the ground. We describe an optimal interpolation routine that fuses the broad spatial coverage of satellite-derived irradiance with the high accuracy of point measurements. The routine can be applied to any satellite-derived irradiance and point measurement datasets. Unique aspects of this work include the fact that information is spread using cloud location and thickness and that a number of point measurements are collected from rooftop PV systems. The routine is sensitive to errors in the satellite image geolocation, so care must be taken to adjust the cloud locations based on the solar and satellite geometries. Analysis of the optimal interpolation routine over Tucson, AZ, with 20 point measurements shows a significant improvement in the irradiance estimate for two distinct satellite image to irradiance algorithms. Improved irradiance estimates can be used for resource assessment, distributed generation production estimates, and irradiance forecasts.
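
The optimal interpolation update is the standard analysis step xa = xb + K(y − Hxb) with gain K = BHᵀ(HBHᵀ + R)⁻¹. The sketch below uses an invented three-cell background covariance in place of the paper's cloud-location-and-thickness correlation structure:

```python
import numpy as np

def optimal_interpolation(xb, B, y, H, Rm):
    """One OI update: xa = xb + K (y - H xb), K = B H^T (H B H^T + R)^-1.
    xb is the satellite-derived field, y the ground point measurements."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + Rm)
    return xb + K @ (y - H @ xb)

# Three grid cells; a sensor in cell 0 reads 820 W/m^2 while the satellite
# estimate there is 700. Spatial correlation in B spreads the correction.
xb = np.array([700.0, 700.0, 700.0])
B = 100.0 * np.array([[1.0, 0.5, 0.1],
                      [0.5, 1.0, 0.5],
                      [0.1, 0.5, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])   # observe cell 0 only
Rm = np.array([[25.0]])           # sensor error variance
y = np.array([820.0])
print(optimal_interpolation(xb, B, y, H, Rm))  # cell 0 pulled toward 820;
                                               # neighbours corrected via B
```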

  2. ASRDI oxygen technology survey. Volume 4: Low temperature measurement

    NASA Technical Reports Server (NTRS)

    Sparks, L. L.

    1974-01-01

    Information is presented on temperature measurement between the triple point and critical point of liquid oxygen. The criterion selected is that all transducers which may reasonably be employed in the liquid oxygen (LO2) temperature range are considered. The temperature range for each transducer is the appropriate full range for the particular thermometer. The discussion of each thermometer or type of thermometer includes the following information: (1) useful temperature range, (2) general and particular methods of construction and the advantages of each type, (3) specifications (accuracy, reproducibility, response time, etc.), (4) associated instrumentation, (5) calibrations and procedures, and (6) analytical representations.

  3. An Examination of College Students' Receptiveness to Alcohol-Related Information and Advice

    ERIC Educational Resources Information Center

    Leahy, Matthew M.; Jouriles, Ernest N.; Walters, Scott T.

    2013-01-01

    This project examined the reliability and validity of a newly developed measure of college students' receptiveness to alcohol related information and advice. Participants were 116 college students who reported having consumed alcohol at some point in their lifetime. Participants completed a measure of receptiveness to alcohol-related…

  4. Mild cognitive impairment: baseline and longitudinal structural MR imaging measures improve predictive prognosis.

    PubMed

    McEvoy, Linda K; Holland, Dominic; Hagler, Donald J; Fennema-Notestine, Christine; Brewer, James B; Dale, Anders M

    2011-06-01

    To assess whether single-time-point and longitudinal volumetric magnetic resonance (MR) imaging measures provide predictive prognostic information in patients with amnestic mild cognitive impairment (MCI). This study was conducted with institutional review board approval and in compliance with HIPAA regulations. Written informed consent was obtained from all participants or the participants' legal guardians. Cross-validated discriminant analyses of MR imaging measures were performed to differentiate 164 Alzheimer disease (AD) cases from 203 healthy control cases. Separate analyses were performed by using data from MR images obtained at one time point or by combining single-time-point measures with 1-year change measures. Resulting discriminant functions were applied to 317 MCI cases to derive individual patient risk scores. Risk of conversion to AD was estimated as a continuous function of risk score percentile. Kaplan-Meier survival curves were computed for risk score quartiles. Odds ratios (ORs) for the conversion to AD were computed between the highest and lowest quartile scores. Individualized risk estimates from baseline MR examinations indicated that the 1-year risk of conversion to AD ranged from 3% to 40% (average group risk, 17%; OR, 7.2 for highest vs lowest score quartiles). Including measures of 1-year change in global and regional volumes significantly improved risk estimates (P = .001), with the risk of conversion to AD in the subsequent year ranging from 3% to 69% (average group risk, 27%; OR, 12.0 for highest vs lowest score quartiles). Relative to the risk of conversion to AD conferred by the clinical diagnosis of MCI alone, MR imaging measures yield substantially more informative patient-specific risk estimates. Such predictive prognostic information will be critical if disease-modifying therapies become available. http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11101975/-/DC1. RSNA, 2011

  5. Information measures in nonlinear experimental design

    NASA Technical Reports Server (NTRS)

    Niple, E.; Shaw, J. H.

    1980-01-01

    Some different approaches to the problem of designing experiments which estimate the parameters of nonlinear models are discussed. The assumption in these approaches that the information in a set of data can be represented by a scalar is criticized, and the nonscalar discrimination information is proposed as the proper measure to use. The two-step decay example in Box and Lucas (1959) is used to illustrate the main points of the discussion.
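    The scalar summary criticized here is typically a D-optimality criterion: the determinant of the Fisher information matrix det(JᵀJ) built from the model's parameter sensitivities. As an illustrative sketch (not taken from the paper), the code below evaluates that scalar for the Box-Lucas two-step decay model η(t) = θ₁/(θ₁−θ₂)·(e^(−θ₂t) − e^(−θ₁t)); the nominal parameter values and the central-difference scheme are assumptions made for the example.

```python
import numpy as np

def model(t, th1, th2):
    # Box-Lucas two-step decay: yield of the intermediate species.
    return th1 / (th1 - th2) * (np.exp(-th2 * t) - np.exp(-th1 * t))

def design_information(times, th=(0.7, 0.2), eps=1e-6):
    """Scalar D-optimality measure det(J^T J), computed by central
    differences at a nominal parameter guess (an assumption here)."""
    th1, th2 = th
    J = np.column_stack([
        (model(times, th1 + eps, th2) - model(times, th1 - eps, th2)) / (2 * eps),
        (model(times, th1, th2 + eps) - model(times, th1, th2 - eps)) / (2 * eps),
    ])
    return float(np.linalg.det(J.T @ J))

# A two-point design near the classical optimum beats a naive one on this
# scalar criterion -- exactly the kind of single-number summary the paper
# argues can hide information about the design.
print(design_information(np.array([1.23, 6.86])) >
      design_information(np.array([1.0, 2.0])))  # True
```

    The point of the paper is that ranking designs by one such scalar discards structure; the sketch only shows what the conventional scalar approach computes.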

  6. Alternative Methods for Estimating Plane Parameters Based on a Point Cloud

    NASA Astrophysics Data System (ADS)

    Stryczek, Roman

    2017-12-01

    Non-contact measurement techniques using triangulation optical sensors are increasingly popular in measurements performed with industrial robots directly on production lines. The result of such measurements is often a cloud of measurement points characterized by considerable measurement noise, the presence of a number of points that differ from the reference model, and gross errors that must be eliminated from the analysis. To obtain vector information from the points in the cloud that describe reference models, the data obtained during a measurement must be subjected to appropriate processing operations. The present paper analyzes the suitability of methods known as RANdom SAmple Consensus (RANSAC), the Monte Carlo Method (MCM), and Particle Swarm Optimization (PSO) for the extraction of the reference model. The effectiveness of the tested methods is illustrated by examples of measuring the height of an object and the angle of a plane, based on experiments carried out under workshop conditions.
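    As a rough illustration of the RANSAC approach named above (a minimal numpy sketch under assumed noise and tolerance settings, not the implementation evaluated in the paper), a plane can be extracted from a noisy, outlier-contaminated cloud by repeatedly fitting minimal three-point samples and keeping the candidate with the most inliers:

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.05, rng=None):
    """Estimate a plane (unit normal n, offset d with n.x = d) from a noisy
    point cloud by random sample consensus."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        # 1. Draw a minimal sample of 3 points and fit a candidate plane.
        sample = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        normal /= norm
        d = normal @ sample[0]
        # 2. Count points within the distance tolerance of the candidate.
        inliers = np.abs(points @ normal - d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

# Synthetic workshop-style cloud: a z = 1.0 plane plus noise and outliers.
rng = np.random.default_rng(0)
plane = np.column_stack([rng.uniform(0, 1, 300), rng.uniform(0, 1, 300),
                         1.0 + rng.normal(0, 0.01, 300)])
outliers = rng.uniform(0, 2, (60, 3))
(normal, d), inliers = ransac_plane(np.vstack([plane, outliers]), tol=0.03, rng=1)
print(float(abs(normal[2])), float(abs(d)))  # both close to 1 for the z = 1 plane
```

    MCM and PSO attack the same estimation problem by sampling or evolving candidate plane parameters directly rather than by minimal-sample consensus.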

  7. The algorithm of fast image stitching based on multi-feature extraction

    NASA Astrophysics Data System (ADS)

    Yang, Chunde; Wu, Ge; Shi, Jing

    2018-05-01

    This paper proposes an improved image registration method that combines Hu invariant moment contour information with feature point detection, aiming to solve problems of traditional image stitching algorithms such as time-consuming feature point extraction, redundant invalid information, and inefficiency. First, the neighborhood of each pixel is used to extract contour information, with the Hu invariant moment serving as a similarity measure for extracting SIFT feature points in similar regions. Then the Euclidean distance is replaced with the Hellinger kernel function to improve initial matching efficiency and reduce mismatched points, and the affine transformation matrix between the images is estimated. Finally, a local color mapping method is adopted to correct uneven exposure, and an improved multiresolution fusion algorithm is used to fuse the mosaic images and achieve seamless stitching. Experimental results confirm the high accuracy and efficiency of the proposed method.

  8. [Analysis of different health status based on characteristics of the facial spectrum photometric color].

    PubMed

    Xu, Jiatuo; Wu, Hongjin; Lu, Luming; Tu, Liping; Zhang, Zhifeng; Chen, Xiao

    2012-12-01

    This paper aims to observe differences in the facial color of people with different health status using spectral photometric color measurement, according to the theory of facial color diagnosis in the Internal Classic. We gathered facial color information from persons in a healthy group (183), a sub-healthy group (287) and a disease group (370). The information included L, a, b and C values and reflectance at wavelengths of 400-700 nm, measured at 8 points with a CM-2600D spectral photometric color measuring instrument. The results indicated that the overall complexion color values of the three groups differed significantly: persons in the disease group appeared dark in complexion, while those in the sub-healthy group appeared pale. The L, a, b and C values showed significant differences to varying degrees (P < 0.05) at 6 points among the groups, with the central position of the face showing the most significant differences in all groups. By comparing the facial color information at the same points across the three groups, we obtained each group's diagnostically distinctive points. Spectral photometric color measurement thus has some diagnostic value in distinguishing disease status from various states of health, and the present method provides a promising quantitative basis for the Chinese medical inspection of the complexion.

  9. Quantifying Complexity in Quantum Phase Transitions via Mutual Information Complex Networks

    NASA Astrophysics Data System (ADS)

    Valdez, Marc Andrew; Jaschke, Daniel; Vargas, David L.; Carr, Lincoln D.

    2017-12-01

    We quantify the emergent complexity of quantum states near quantum critical points on regular 1D lattices, via complex network measures based on quantum mutual information as the adjacency matrix, in direct analogy to quantifying the complexity of electroencephalogram or functional magnetic resonance imaging measurements of the brain. Using matrix product state methods, we show that network density, clustering, disparity, and Pearson's correlation obtain the critical point for both quantum Ising and Bose-Hubbard models to a high degree of accuracy in finite-size scaling for three classes of quantum phase transitions: Z2, mean-field superfluid to Mott insulator, and a Berezinskii-Kosterlitz-Thouless crossover.
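    The network measures named above are standard graph quantities applied to a thresholded mutual-information matrix. The toy sketch below (an illustration with an invented matrix and threshold, not the authors' matrix product state pipeline) computes density and average clustering from such an adjacency matrix with plain numpy:

```python
import numpy as np

def network_measures(mi, threshold):
    """Binarize a mutual-information matrix at `threshold`; return network
    density and the average local clustering coefficient."""
    A = (mi > threshold).astype(int)
    np.fill_diagonal(A, 0)                       # no self-links
    n = len(A)
    density = A.sum() / (n * (n - 1))
    deg = A.sum(axis=1)
    triangles = np.diagonal(A @ A @ A) / 2       # closed triangles through each node
    possible = deg * (deg - 1) / 2               # connected neighbour pairs
    clustering = np.where(possible > 0, triangles / np.maximum(possible, 1), 0.0)
    return float(density), float(clustering.mean())

# Toy fully connected 4-site "mutual information" matrix: every pair of
# sites shares information above the threshold.
print(network_measures(np.full((4, 4), 0.5), threshold=0.1))  # (1.0, 1.0)
```

    Near a critical point the mutual-information matrix develops long-range structure, which is what shifts these network measures.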

  10. A quality-based payment strategy for nursing home care in Minnesota.

    PubMed

    Kane, Robert L; Arling, Greg; Mueller, Christine; Held, Robert; Cooke, Valerie

    2007-02-01

    This article describes a pay-for-performance system developed for Minnesota nursing homes. In effect, nursing homes can retain a greater proportion of the difference between their costs and the average costs on the basis of their quality scores. The quality score is a derived and weighted composite measure currently composed of five elements: staff retention (25 points), staff turnover (15 points), use of pool staff (10 points), nursing home quality indicators (40 points), and survey deficiencies (10 points). Information on residents' quality of life and satisfaction, derived from interviews with a random sample of residents in each Minnesota nursing home, is now available for inclusion in the quality measure. The new payment system was designed to create a business case for quality when used in addition to a nursing home report card that uses the same quality elements to inform potential consumers about the quality of nursing homes. Although the nursing home industry has announced general support for the new approach, it has lobbied the legislature to delay its implementation, claiming concerns about operational details.

  11. Evaluating Groundwater-Surface Water Exchange With A New Point Measurement Device

    NASA Astrophysics Data System (ADS)

    Cremeans, M.; Devlin, J. F.; McKnight, U. S.; Bjerg, P. L.; Nairn, R.

    2017-12-01

    Estimating exchange at the groundwater-surface water interface (GWSWI) could be crucial to designing effective remediation measures. The StreamBed Point Velocity Probe (SBPVP), a new point measurement device, measures in situ groundwater velocities at the GWSWI without reliance on estimations of hydraulic conductivity, porosity, or gradient information. The SBPVP has been applied to natural and engineered interfaces at contaminated sites: a stream and a vertical flow bioreactor, respectively. Velocity data (18 cm/day to 2600 cm/day in the stream, and 54 cm/day to 161 cm/day in the bioreactor) were used to determine water and solute fluxes (as well as potential contaminant attenuation rates) at these sites. Analysis of the spatial distribution of velocity values in a streambed illustrated the extremely heterogeneous nature of that environment, while the engineered system was found to be relatively homogeneous by comparison. Combining SBPVP velocity data with geochemical data supports the calculation of mass discharges and mass removal rates. The wide range of exchange rate variability (within and between these sites) suggests that detailed characterization of the GWSWI is useful information for remediation in both cases.

  12. Nonrigid mammogram registration using mutual information

    NASA Astrophysics Data System (ADS)

    Wirth, Michael A.; Narhan, Jay; Gray, Derek W. S.

    2002-05-01

    Of the papers dealing with the task of mammogram registration, the majority approach it by matching corresponding control points derived from anatomical landmark points. One of the caveats of pure point-matching techniques is their reliance on accurately extracted anatomical feature points. This paper proposes an approach to matching mammograms which combines the use of a similarity measure with a point-based spatial transformation. Mutual information is used as a cost function to determine the degree of similarity between the two mammograms. An initial rigid registration is performed to remove global differences and bring the mammograms into approximate alignment. The mammograms are then subdivided into smaller regions, and each of the corresponding subimages is matched independently using mutual information. The centroids of the matched subimages are then used as corresponding control-point pairs in association with the Thin-Plate Spline radial basis function. The resulting spatial transformation generates a nonrigid match of the mammograms. The technique is illustrated by matching mammograms from the MIAS mammogram database. An experimental comparison is made between mutual information incorporating purely rigid behavior and mutual information incorporating more nonrigid behavior. The effectiveness of the registration process is evaluated using image differences.
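    Mutual information between two image regions is commonly estimated from their joint grey-level histogram. A compact numpy sketch of that cost function (illustrative only; the bin count and random patches are assumptions, and the paper's subimage matching adds a spatial search on top of this score):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information (nats) between two equally sized image patches,
    estimated from their joint grey-level histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of patch a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of patch b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# A patch is maximally informative about itself; rolling it breaks the
# pixelwise correspondence and the score drops.
rng = np.random.default_rng(0)
patch = rng.random((64, 64))
rolled = np.roll(patch, 8, axis=0)
print(mutual_information(patch, patch) > mutual_information(patch, rolled))  # True
```

    Registration then amounts to searching over candidate displacements of each subimage for the one that maximizes this score.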

  13. Evaluation of Information Leakage from Cryptographic Hardware via Common-Mode Current

    NASA Astrophysics Data System (ADS)

    Hayashi, Yu-Ichi; Homma, Naofumi; Mizuki, Takaaki; Sugawara, Takeshi; Kayano, Yoshiki; Aoki, Takafumi; Minegishi, Shigeki; Satoh, Akashi; Sone, Hideaki; Inoue, Hiroshi

    This paper presents the possibility of Electromagnetic (EM) analysis against cryptographic modules outside their security boundaries. The mechanism behind the information leakage is explained from the viewpoint of Electromagnetic Compatibility: electric fluctuation released from cryptographic modules can conduct to peripheral circuits via ground bounce, resulting in radiation. We demonstrate the consequence of the mechanism through experiments where the ISO/IEC standard block cipher AES (Advanced Encryption Standard) is implemented on an FPGA board and EM radiations from power and communication cables are measured. Correlation Electromagnetic Analysis (CEMA) is conducted in order to evaluate the information leakage. The experimental results show that secret keys are revealed even though there are various disturbing factors such as voltage regulators and AC/DC converters between the target module and the measurement points. We also discuss information-suppression techniques as electrical-level countermeasures against such CEMAs.

  14. Using information Theory in Optimal Test Point Selection for Health Management in NASA's Exploration Vehicles

    NASA Technical Reports Server (NTRS)

    Mehr, Ali Farhang; Tumer, Irem

    2005-01-01

    In this paper, we present a new methodology that measures the "worth" of deploying an additional testing instrument (sensor) in terms of the amount of information that can be retrieved from such measurement. This quantity is obtained using a probabilistic model of RLVs that has been partially developed at the NASA Ames Research Center. A number of correlated attributes are identified and used to obtain the worth of deploying a sensor at a given test point from an information-theoretic viewpoint. Once the information-theoretic worth of sensors is formulated and incorporated into our general model for IHM performance, the problem can be formulated as a constrained optimization problem in which the reliability and operational safety of the system as a whole are considered. Although this research is conducted specifically for RLVs, the proposed methodology in its generic form can easily be extended to other domains of systems health monitoring.
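    The information-theoretic "worth" of a sensor can be phrased as its expected reduction in the entropy of the system health state. A toy two-state sketch of that idea (the health prior and sensor error rates below are invented for illustration and are not taken from the NASA model):

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical health states (nominal, degraded) and a binary alarm sensor.
prior = [0.9, 0.1]
p_alarm_given = [0.05, 0.8]        # P(alarm | state): false alarms, detections
p_alarm = sum(pr * pa for pr, pa in zip(prior, p_alarm_given))
post_alarm = [pr * pa / p_alarm for pr, pa in zip(prior, p_alarm_given)]
post_quiet = [pr * (1 - pa) / (1 - p_alarm) for pr, pa in zip(prior, p_alarm_given)]
# Expected information gain = prior entropy - expected posterior entropy.
gain = entropy(prior) - (p_alarm * entropy(post_alarm)
                         + (1 - p_alarm) * entropy(post_quiet))
print(round(gain, 2))  # 0.21 bits expected from deploying this sensor
```

    Ranking candidate test points by such an expected gain, subject to reliability and safety constraints, is the shape of the optimization problem described above.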

  15. Second Iteration of Photogrammetric Pipeline to Enhance the Accuracy of Image Pose Estimation

    NASA Astrophysics Data System (ADS)

    Nguyen, T. G.; Pierrot-Deseilligny, M.; Muller, J.-M.; Thom, C.

    2017-05-01

    In the classical photogrammetric processing pipeline, automatic tie point extraction plays a key role in the quality of the achieved results. The image tie points are crucial to pose estimation and have a significant influence on the precision of the calculated orientation parameters. Therefore, both the relative and the absolute orientation of the 3D model can be affected. By improving the precision of image tie point measurement, one can enhance the quality of image orientation. The quality of image tie points is influenced by several factors, such as their multiplicity, measurement precision and distribution in the 2D images as well as in the 3D scene. In complex acquisition scenarios such as indoor applications and oblique aerial images, tie point extraction is limited when only image information can be exploited. Hence, we propose a method which improves the precision of pose estimation in complex scenarios by adding a second iteration to the classical processing pipeline. The result of a first iteration is used as a priori information to guide the extraction of new tie points with better quality. Evaluated with multiple case studies, the proposed method shows its validity and its high potential for precision improvement.

  16. Wave directional spreading from point field measurements.

    PubMed

    McAllister, M L; Venugopal, V; Borthwick, A G L

    2017-04-01

    Ocean waves have multidirectional components. Most wave measurements are taken at a single point, and so fail to capture information about the relative directions of the wave components directly. Conventional means of directional estimation require a minimum of three concurrent time series of measurements at different spatial locations in order to derive information on local directional wave spreading. Here, the relationship between wave nonlinearity and directionality is utilized to estimate local spreading without the need for multiple concurrent measurements, following Adcock & Taylor (Adcock & Taylor 2009 Proc. R. Soc. A 465, 3361-3381. (doi:10.1098/rspa.2009.0031)), with the assumption that directional spreading is frequency independent. The method is applied to measurements recorded at the North Alwyn platform in the northern North Sea, and the results compared against estimates of wave spreading by conventional measurement methods and hindcast data. Records containing freak waves were excluded. It is found that the method provides accurate estimates of wave spreading over a range of conditions experienced at North Alwyn, despite the noisy chaotic signals that characterize such ocean wave data. The results provide further confirmation that Adcock and Taylor's method is applicable to metocean data and has considerable future promise as a technique to recover estimates of wave spreading from single point wave measurement devices.

  18. Full scattering profile of tissues with elliptical cross sections

    NASA Astrophysics Data System (ADS)

    Duadi, H.; Feder, I.; Fixler, D.

    2018-02-01

    Light reflectance and transmission from soft tissue have been utilized in noninvasive clinical measurement devices such as the photoplethysmograph (PPG) and the reflectance pulse oximeter. Most methods of near infrared (NIR) spectroscopy focus on the volume reflectance from a semi-infinite sample, while very few measure transmission. However, since PPG and pulse oximetry are usually measured on tissue such as the earlobe, fingertip, lip or pinched tissue, we propose examining the full scattering profile (FSP), the angular distribution of exiting photons, which provides more comprehensive information when measuring from a cylindrical tissue. In our work we discovered a unique point, which we named the iso-pathlength (IPL) point, that does not depend on changes in the reduced scattering coefficient (µs'). This IPL point was observed both in Monte Carlo (MC) simulation and in experimental tissue-mimicking phantoms. The angle corresponding to the IPL point depends only on the tissue geometry; in the case of cylindrical tissues it depends linearly on the tissue diameter. Since the target tissues for clinical physiological measurement are not perfect cylinders, in this work we examine how changes in the tissue cross-section geometry influence the FSP and the IPL point, using an MC simulation to compare circular and elliptic tissue cross sections. The IPL point can serve as a self-calibration point for optical tissue measurements such as NIR spectroscopy, PPG and pulse oximetry.

  19. Improved pointing information for SCIAMACHY from in-flight measurements of the viewing directions towards sun and moon

    NASA Astrophysics Data System (ADS)

    Bramstedt, Klaus; Stone, Thomas C.; Gottwald, Manfred; Noël, Stefan; Bovensmann, Heinrich; Burrows, John P.

    2017-07-01

    The SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) on Envisat (2002-2012) performed nadir, limb, solar/lunar occultation and various monitoring measurements. The pointing information of the instrument is determined by the attitude information of the Envisat platform with its star trackers, together with the encoder readouts of both the azimuth and the elevation scanner of SCIAMACHY. In this work, we present additional sources of attitude information from the SCIAMACHY measurements themselves. The basic principle is the same as that used by the star trackers: we measure the viewing direction towards celestial objects, i.e. the sun and the moon, to detect possible mispointings. In sun over limb port observations, we utilise the vertical scans over the solar disk. In the horizontal direction, SCIAMACHY's sun follower device (SFD) is used to adjust the viewing direction. Moon over limb port measurements use the SFD adjustment in both the vertical and the horizontal direction. The viewing direction is steered towards the intensity centroid of the illuminated part of the lunar disk. We use reference images from the USGS Robotic Lunar Observatory (ROLO) to take into account the inhomogeneous surface and the variations due to lunar libration and phase, in order to parameterise the location of the intensity centroid from the observation geometry. Solar observations through SCIAMACHY's so-called sub-solar port (with a viewing direction close to zenith) also use the SFD in the vertical direction; in the horizontal direction the geometry of the port defines the viewing direction. Using these three types of measurements, we fit improved mispointing parameters by minimising the pointing offsets in elevation and azimuth. The geolocation of all retrieved products will benefit from this; the tangent heights are especially improved. 
The altitudes assigned to SCIAMACHY's solar occultation measurements are changed in the range of -130 to -330 m, the lunar occultation measurements are changed in the range of 0 to +130 m and the limb measurements are changed in the range of -50 to +60 m (depending on season, altitude and azimuth angle). The horizontal location of the tangent point is changed by about 5 km for all measurements. These updates are implemented in version 9 of the SCIAMACHY Level 1b products and Level 2 version 7 (based on L1b version 9).

  20. Populating a Control Point Database: A cooperative effort between the USGS, Grand Canyon Monitoring and Research Center and the Grand Canyon Youth Organization

    NASA Astrophysics Data System (ADS)

    Brown, K. M.; Fritzinger, C.; Wharton, E.

    2004-12-01

    The Grand Canyon Monitoring and Research Center measures the effects of Glen Canyon Dam operations on the resources along the Colorado River from Glen Canyon Dam to Lake Mead in support of the Grand Canyon Adaptive Management Program. Control points are integral for geo-referencing the myriad of data collected in the Grand Canyon, including aerial photography and topographic and bathymetric data used for classification and change-detection analysis of physical, biologic and cultural resources. The survey department has compiled a list of 870 control points installed by various organizations needing to establish a consistent reference for data collected at field sites along the 240 mile stretch of Colorado River in the Grand Canyon. This list is the foundation for the Control Point Database, established primarily for researchers to locate control points and independently geo-reference collected field data. The database has the potential to be a valuable mapping tool, assisting researchers to easily locate a control point and reducing the occurrence of unknowingly installing new control points in close proximity to an existing control point. The database is missing photographs and accurate site description information. Current site descriptions do not accurately define the location of the point but refer to the project that used the point, or some other fact associated with the point. The Grand Canyon Monitoring and Research Center (GCMRC) resolved this problem by turning the data collection effort into an educational exercise for the participants of the Grand Canyon Youth organization. Grand Canyon Youth is a non-profit organization providing experiential education for middle and high school aged youth. GCMRC and the Grand Canyon Youth formed a partnership whereby GCMRC provided the logistical support, equipment, and training to conduct the field work, and the Grand Canyon Youth provided the time and personnel to complete it. 
Two data collection efforts were conducted during the 2004 summer, allowing 40 youth the opportunity to contribute valuable information to the Control Point Database. This information included: verification of point existence, photographs, and accurate site descriptions concisely describing the location of the point, how to reach it, the specific point location, and detailed bearings to visible and obvious landmarks. The youth learned to locate themselves and find the points using 1:1000 airphotos, write detailed site descriptions, take bearings with a compass, measure vertical and horizontal distances, and use a digital camera. The youth found information for 252 control points (29% of the total points).

  1. Observing Bridge Dynamic Deflection in Green Time by Information Technology

    NASA Astrophysics Data System (ADS)

    Yu, Chengxin; Zhang, Guojian; Zhao, Yongqian; Chen, Mingzhi

    2018-01-01

    Because traditional surveying methods are limited in observing bridge dynamic deflection, information technology is adopted to observe bridge dynamic deflection in green time. Information technology in this study means that digital cameras photograph the bridge in red time to obtain a zero image; a series of successive images is then photographed in green time. Deformation point targets are identified and located by the Hough transform. With reference to the control points, the deformation values of these deformation points are obtained by differencing each successive image with the zero image. Results show that the average measurement accuracies of C0 are 0.46 pixels, 0.51 pixels and 0.74 pixels in the X, Z and comprehensive directions, and those of C1 are 0.43 pixels, 0.43 pixels and 0.67 pixels in these tests. The maximal bridge deflection is 44.16 mm, which is less than 75 mm (the bridge deflection tolerance). The information technology presented here can monitor bridge dynamic deflection and depict deflection trend curves of the bridge in real time, providing data support for on-site decisions on bridge structural safety.
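    The core of such image-based deflection measurement is locating the target in each frame and differencing its position against the zero image. The sketch below replaces the paper's Hough-transform detector with simple threshold-and-centroid localisation on synthetic frames; the pixel-to-millimetre scale factor is an assumed calibration constant:

```python
import numpy as np

def target_centroid(image, threshold=0.5):
    """Centroid (x, z) in pixels of the bright target region in one frame."""
    zs, xs = np.nonzero(image > threshold)
    return xs.mean(), zs.mean()

# Synthetic frames: a 5x5 bright target that sags 3 pixels between the
# red-time zero image and a green-time image.
zero = np.zeros((100, 100));  zero[40:45, 50:55] = 1.0
green = np.zeros((100, 100)); green[43:48, 50:55] = 1.0

x0, z0 = target_centroid(zero)
x1, z1 = target_centroid(green)
scale_mm_per_px = 2.0                    # assumed camera calibration factor
dx_mm = float((x1 - x0) * scale_mm_per_px)
dz_mm = float((z1 - z0) * scale_mm_per_px)
print(dx_mm, dz_mm)  # 0.0 6.0
```

    Repeating this per green-time frame against the same zero image yields the deflection trend curve described above.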

  2. Measuring organizational effectiveness in information and communication technology companies using item response theory.

    PubMed

    Trierweiller, Andréa Cristina; Peixe, Blênio César Severo; Tezza, Rafael; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar; Bornia, Antonio Cezar; de Andrade, Dalton Francisco

    2012-01-01

    The aim of this paper is to measure the effectiveness of Information and Communication Technology (ICT) organizations from the point of view of the manager, using Item Response Theory (IRT). There is a need to verify the effectiveness of these organizations, which are normally associated with complex, dynamic, and competitive environments. In the academic literature, there is disagreement surrounding the concept of organizational effectiveness and its measurement. A construct was elaborated based on dimensions of effectiveness, and the resulting questionnaire items were submitted to specialists for evaluation. The approach proved viable for measuring the organizational effectiveness of ICT companies from a manager's point of view using the Two-Parameter Logistic Model (2PLM) of IRT. This modeling permits the quality and properties of each item to be evaluated and places items and respondents on a single scale, which is not possible with other similar tools.
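    For reference, the Two-Parameter Logistic Model gives the probability of endorsing an item as a logistic function of the latent trait; the discrimination and difficulty values in the sketch below are arbitrary illustrations, not estimates from the study:

```python
import math

def p_endorse(theta, a, b):
    """2PLM: probability that a respondent with latent effectiveness `theta`
    endorses an item with discrimination `a` and difficulty `b`."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the item splits respondents evenly; higher theta raises
# the endorsement probability.
print(p_endorse(0.0, 1.2, 0.0))  # 0.5
```

    The discrimination parameter a controls how sharply the item separates organizations just below b from those just above it, which is what lets the model place items and respondents on one scale.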

  3. Study protocol: identifying and delivering point-of-care information to improve care coordination.

    PubMed

    Hysong, Sylvia J; Che, Xinxuan; Weaver, Sallie J; Petersen, Laura A

    2015-10-19

    The need for deliberately coordinated care is noted by many national-level organizations. The Department of Veterans Affairs (VA) recently transitioned primary care clinics nationwide into Patient Aligned Care Teams (PACTs) to provide more accessible, coordinated, comprehensive, and patient-centered care. To better serve this purpose, PACTs must be able to successfully sequence and route interdependent tasks to appropriate team members while also maintaining collective situational awareness (coordination). Although conceptual frameworks of care coordination exist, few explicitly articulate core behavioral markers of coordination or the related information needs of team members attempting to synchronize complex care processes across time for a shared patient population. Given this gap, we partnered with a group of frontline primary care personnel at ambulatory care sites to identify the specific information needs of PACT members that will enable them to coordinate their efforts to provide effective, coordinated care. The study has three objectives: (1) development of measurable, prioritized point-of-care criteria for effective PACT coordination; (2) identifying the specific information needed at the point of care to optimize coordination; and (3) assessing the effect of adopting the aforementioned coordination standards on PACT clinicians' coordination behaviors. The study consists of three phases. In phase 1, we will employ the Productivity Measurement and Enhancement System (ProMES), a structured approach to performance measure creation from industrial/organizational psychology, to develop coordination measures with a design team of 6-10 primary care personnel; in phase 2, we will conduct focus groups with the phase 1 design team to identify point-of-care information needs. 
Phase 3 is a two-arm field experiment (n = 28 PACTs per arm); intervention-arm PACTs will receive monthly feedback reports using the measures developed in phase 1 and attend brief monthly feedback sessions. Control-arm PACTs will receive no intervention. PACTs will be followed prospectively for up to 1 year. This project combines action research and implementation science methods to address important gaps in the existing care coordination literature using a partnership-based research design. It will provide an evidence-based framework for care coordination by employing a structured methodology for a systematic approach to care coordination in PACT settings and identifying the information needs that produce the most successful coordination of care. ISRCTN15412521.

  4. A Study on Amino Acids: Synthesis of Alpha-Aminophenylacetic Acid (Phenylglycine) and Determination of its Isoelectric Point.

    ERIC Educational Resources Information Center

    Barrelle, M.; And Others

    1983-01-01

    Background information and procedures are provided for an experimental study on aminophenylacetic acid (phenylglycine). These include physical chemistry (determination of isoelectric point by pH measurement) and organic chemistry (synthesis of an amino acid in racemic form) experiments. (JN)

  5. A Bionic Camera-Based Polarization Navigation Sensor

    PubMed Central

    Wang, Daobin; Liang, Huawei; Zhu, Hui; Zhang, Shuai

    2014-01-01

    Navigation and positioning technology is closely related to our routine life activities, from travel to aerospace. Recently it has been found that Cataglyphis (a kind of desert ant) is able to detect the polarization direction of skylight and navigate according to this information. This paper presents a real-time bionic camera-based polarization navigation sensor. The sensor has two working modes: a single-point measurement mode and a multi-point measurement mode. An indoor calibration experiment was performed under a beam of standard polarized light; the results show that after noise reduction the accuracy of the sensor reaches 0.3256°. The sensor was also compared with GPS and INS (Inertial Navigation System) in single-point measurement mode through an outdoor experiment. With time compensation and location compensation, the sensor can be a useful alternative to GPS and INS. In addition, the sensor can also measure the polarization distribution pattern when working in multi-point measurement mode. PMID:25051029
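    The polarization direction such a sensor recovers can be computed from intensities measured behind linear polarizers at three orientations, via the linear Stokes parameters. A self-contained sketch (ideal Malus's-law intensities; the 30° input angle is an arbitrary example and real sensor calibration is ignored):

```python
import numpy as np

def polarization_angle(i0, i45, i90):
    """Angle of polarization (degrees) from intensities behind linear
    polarizers at 0, 45 and 90 degrees, via Stokes parameters s1, s2."""
    s1 = i0 - i90
    s2 = 2.0 * i45 - i0 - i90
    return float(0.5 * np.degrees(np.arctan2(s2, s1)))

# Ideal Malus's-law intensities, I ~ cos^2(theta - phi), for skylight
# polarized at 30 degrees.
phi = 30.0
intensity = lambda theta: np.cos(np.radians(theta - phi)) ** 2
angle = polarization_angle(intensity(0), intensity(45), intensity(90))
print(round(angle, 6))  # 30.0
```

    A camera-based sensor evaluates this per pixel region, which is what the multi-point mode's polarization distribution pattern amounts to.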

  6. On the impact of a refined stochastic model for airborne LiDAR measurements

    NASA Astrophysics Data System (ADS)

    Bolkas, Dimitrios; Fotopoulos, Georgia; Glennie, Craig

    2016-09-01

Accurate topographic information is critical for a number of applications in science and engineering. In recent years, airborne light detection and ranging (LiDAR) has become a standard tool for acquiring high quality topographic information. The assessment of airborne LiDAR-derived digital elevation models (DEMs) is typically based on (i) independent ground control points and (ii) forward error propagation utilizing the LiDAR geo-referencing equation. The latter approach is dependent on the stochastic model information of the LiDAR observation components. In this paper, the well-known statistical tool of variance component estimation (VCE) is implemented for a dataset in Houston, Texas, in order to refine the initial stochastic information. Simulations demonstrate the impact of stochastic-model refinement for two practical applications, namely coastal inundation mapping and surface displacement estimation. Results highlight scenarios where erroneous stochastic information is detrimental. Furthermore, the refined stochastic information provides insights on the effect of each LiDAR measurement in the airborne LiDAR error budget. The latter is important for targeting future advancements in order to improve point cloud accuracy.

  7. Sales effects of product health information at points of purchase: a systematic review.

    PubMed

    van 't Riet, Jonathan

    2013-03-01

Information about healthy and unhealthy nutrients is increasingly conveyed at the point of purchase. Many studies have investigated the effects of product health information on attitudes and intentions, but the empirical evidence becomes sketchier when the focus of research is actual purchase behaviour. The present paper provides an overview of empirical evidence on the effectiveness of product health information for food products at the point of purchase. A systematic literature review was conducted. Only studies that assessed the effect of product health information at the point of purchase on actual purchase behaviour were included, with data from stores' sales records or customer receipts as the primary outcome measure. The included studies' target group comprised supermarket clientele. Several studies found no significant effects of product health information on actual purchase behaviour. Interventions were more likely to be effective when they lasted for a longer time, when they included additional intervention components, and when they targeted the absence of unhealthy nutrients instead of or in addition to the presence of healthy nutrients. No strong evidence for the effectiveness of product health information was found. The effect of intervention duration, additional promotional activities and targeting of healthy v. unhealthy nutrients should be closely examined in future studies.

  8. Combining fibre optic Raman spectroscopy and tactile resonance measurement for tissue characterization

    NASA Astrophysics Data System (ADS)

    Candefjord, Stefan; Nyberg, Morgan; Jalkanen, Ville; Ramser, Kerstin; Lindahl, Olof A.

    2010-12-01

Tissue characterization is fundamental for identification of pathological conditions. Raman spectroscopy (RS) and tactile resonance measurement (TRM) are two promising techniques that measure biochemical content and stiffness, respectively. They have potential to complement the gold standard, histological analysis. By combining RS and TRM, complementary information about tissue content can be obtained and specific drawbacks can be avoided. The aim of this study was to develop a multivariate approach to compare RS and TRM information. The approach was evaluated on measurements at the same points on porcine abdominal tissue. The measurement points were divided into five groups by multivariate analysis of the RS data. A regression analysis was performed and receiver operating characteristic (ROC) curves were used to compare the RS and TRM data. TRM identified one group efficiently (area under ROC curve 0.99). The RS data showed that the proportion of saturated fat was high in this group. The regression analysis showed that stiffness was mainly determined by the amount of fat and its composition. We concluded that RS provided additional, important information for tissue identification that was not provided by TRM alone. The results are promising for development of a method combining RS and TRM for intraoperative tissue characterization.
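The area under a ROC curve, used above to quantify how well TRM separates a tissue group, can be computed directly as a Mann-Whitney statistic. A small sketch with made-up stiffness values, not the study's porcine data:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random positive scores above a random negative
    (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Made-up stiffness values that separate one tissue group perfectly
stiff = [1.0, 1.2, 1.1, 3.0, 3.2, 2.9, 1.3]
group = [0, 0, 0, 1, 1, 1, 0]
print(auc(stiff, group))  # → 1.0
```

An AUC of 0.99, as reported for TRM, means a randomly chosen point from the identified group is stiffer than a randomly chosen point outside it 99% of the time.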

  9. Remarks on the pion-nucleon σ-term

    NASA Astrophysics Data System (ADS)

    Hoferichter, Martin; Ruiz de Elvira, Jacobo; Kubis, Bastian; Meißner, Ulf-G.

    2016-09-01

    The pion-nucleon σ-term can be stringently constrained by the combination of analyticity, unitarity, and crossing symmetry with phenomenological information on the pion-nucleon scattering lengths. Recently, lattice calculations at the physical point have been reported that find lower values by about 3σ with respect to the phenomenological determination. We point out that a lattice measurement of the pion-nucleon scattering lengths could help resolve the situation by testing the values extracted from spectroscopy measurements in pionic atoms.

  10. An Investigation on the Crustal Deformations in Istanbul after Eastern Marmara Earthquakes in 1999

    NASA Astrophysics Data System (ADS)

    Ozludemir, M.; Ozyasar, M.

    2008-12-01

Since the introduction of the GPS technique in the mid-1970s there have been great advances in positioning activities. Today such Global Navigation Satellite System (GNSS) based positioning techniques are widely used in daily geodetic applications. High-order geodetic network measurements are one such application. Such networks are established to provide reliable infrastructure for all kinds of geodetic work, from the production of cadastral plans to the surveying processes during the construction of engineering structures. In fact, the positional information obtained in such engineering surveys can be useful for other studies as well. One such field is geodynamics, where this positional information is valuable for understanding the characteristics of tectonic movements. In Turkey, which is located in a tectonically active zone and experiences major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. In this paper an example of such engineering surveys is discussed: the Istanbul GPS (Global Positioning System) Network, first established in 1997 and remeasured in 2005. Between these two measurement campaigns, two major earthquakes took place, on August 17 and November 12, 1999, with magnitudes of 7.4 and 7.2, respectively. In the first campaign in 1997, a network of about 700 points was measured, while in the second campaign in 2005 more than 1800 points were positioned; the two campaigns share common points. The network covers the whole Istanbul area of about 6000 km2. All network points are located on the Eurasian plate to the north of the North Anatolian Fault Zone. In this study, the horizontal and vertical movements are presented and compared with the results obtained in geodynamic studies.
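The reported movements come from differencing the coordinates of common points between the 1997 and 2005 campaigns. A minimal sketch of that differencing step, assuming local east/north/up coordinates and hypothetical values:

```python
import math

def displacement(e1, n1, u1, e2, n2, u2):
    """Horizontal and vertical movement of a network point between two
    campaigns, from local east/north/up coordinates in metres."""
    horiz = math.hypot(e2 - e1, n2 - n1)   # horizontal displacement
    return horiz, u2 - u1                  # (horizontal, vertical)

# Hypothetical coordinates of one common point in 1997 and 2005 (metres)
h, v = displacement(1000.000, 2000.000, 50.000,
                    1000.012, 2000.005, 49.990)
print(round(h * 1000, 1), round(v * 1000, 1))  # → 13.0 -10.0  (mm)
```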

  11. Localization Using Visual Odometry and a Single Downward-Pointing Camera

    NASA Technical Reports Server (NTRS)

    Swank, Aaron J.

    2012-01-01

Stereo imaging is a technique commonly employed for vision-based navigation. For such applications, two images are acquired from different vantage points and then compared using transformations to extract depth information. The technique is commonly used in robotics for obstacle avoidance or for Simultaneous Localization and Mapping (SLAM). Yet, the process requires a number of image processing steps and therefore tends to be CPU-intensive, which limits the real-time data rate and use in power-limited applications. Evaluated here is a technique where a monocular camera is used for vision-based odometry. In this work, an optical flow technique with feature recognition is performed to generate odometry measurements. The visual odometry sensor measurements are intended to be used as control inputs or measurements in a sensor fusion algorithm using low-cost MEMS-based inertial sensors to provide improved localization information. Presented here are visual odometry results which demonstrate the challenges associated with using ground-pointing cameras for visual odometry. The focus is for rover-based robotic applications for localization within GPS-denied environments.

  12. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
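The proposed decomposition implies an overall detection probability p = p(vocalize) × p(detected | vocalize), which, together with a fixed-radius sampled area, corrects a raw count into a density estimate. A sketch with hypothetical probabilities (the specific numbers and function are illustrative, not from the paper):

```python
import math

def density_per_hectare(count, p_vocalize, p_detect_given_vocal, radius_m):
    """Birds per hectare from a fixed-radius point count, corrected for
    the two detection components: p = p(vocalize) * p(detect | vocalize)."""
    p = p_vocalize * p_detect_given_vocal      # overall detection probability
    area_ha = math.pi * radius_m ** 2 / 10_000 # sampled area in hectares
    return count / (p * area_ha)

# 6 birds detected in a 100 m fixed-radius count; probabilities assumed
d = density_per_hectare(6, 0.8, 0.75, 100.0)
print(round(d, 3))  # → 3.183 birds per hectare
```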

  13. Information Fusion from the Point of View of Communication Theory; Fusing Information to Trade-Off the Resolution of Assessments Against the Probability of Mis-Assessment

    DTIC Science & Technology

    2013-08-19

excellence in linear models, 2010. She successfully defended her dissertation, Linear System Design for Fusion and Compression, on Aug 13, 2013. Her work was...measurements into canonical coordinates, scaling, and rotation; there is a water-filling interpretation; (3) the optimum design of a linear secondary channel of...measurements to fuse with a primary linear channel of measurements maximizes a generalized Rayleigh quotient; (4) the asymptotically optimum

  14. Apparatus and method for mapping an area of interest

    DOEpatents

Staab, Torsten A.; Cohen, Daniel L.; Feller, Samuel [Fairfax, VA]

    2009-12-01

An apparatus and method are provided for mapping an area of interest using polar coordinates or Cartesian coordinates. The apparatus includes a range finder, an azimuth angle measuring device to provide a heading and an inclinometer to provide an angle of inclination of the range finder as it relates to primary reference points and points of interest. A computer is provided to receive signals from the range finder, inclinometer and azimuth angle measuring device to record location data and calculate relative locations between one or more points of interest and one or more primary reference points. The method includes mapping of an area of interest to locate points of interest relative to one or more primary reference points and to store the information in the desired manner. The device may optionally also include an illuminator which can be utilized to paint the area of interest to indicate both points of interest and primary points of reference during and/or after data acquisition.
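Mapping a point of interest from the range finder, azimuth and inclination readings is a polar-to-Cartesian conversion. A sketch under one common surveying convention (x east, y north, z up, azimuth clockwise from north); the patent does not prescribe a specific convention, so this is an illustrative assumption:

```python
import math

def to_cartesian(range_m, azimuth_deg, inclination_deg):
    """Map a (range, azimuth, inclination) reading to local Cartesian
    coordinates: x east, y north, z up; azimuth clockwise from north."""
    az = math.radians(azimuth_deg)
    inc = math.radians(inclination_deg)
    horiz = range_m * math.cos(inc)          # horizontal distance
    return (horiz * math.sin(az),            # east
            horiz * math.cos(az),            # north
            range_m * math.sin(inc))         # height above instrument

# 100 m slope range, due east (azimuth 90 deg), inclined 30 deg upward
x, y, z = to_cartesian(100.0, 90.0, 30.0)
print(round(x, 3), round(y, 3), round(z, 3))  # → 86.603 0.0 50.0
```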

  15. Sedentary Behaviour Profiling of Office Workers: A Sensitivity Analysis of Sedentary Cut-Points

    PubMed Central

    Boerema, Simone T.; Essink, Gerard B.; Tönis, Thijs M.; van Velsen, Lex; Hermens, Hermie J.

    2015-01-01

    Measuring sedentary behaviour and physical activity with wearable sensors provides detailed information on activity patterns and can serve health interventions. At the basis of activity analysis stands the ability to distinguish sedentary from active time. As there is no consensus regarding the optimal cut-point for classifying sedentary behaviour, we studied the consequences of using different cut-points for this type of analysis. We conducted a battery of sitting and walking activities with 14 office workers, wearing the Promove 3D activity sensor to determine the optimal cut-point (in counts per minute (m·s−2)) for classifying sedentary behaviour. Then, 27 office workers wore the sensor for five days. We evaluated the sensitivity of five sedentary pattern measures for various sedentary cut-points and found an optimal cut-point for sedentary behaviour of 1660 × 10−3 m·s−2. Total sedentary time was not sensitive to cut-point changes within ±10% of this optimal cut-point; other sedentary pattern measures were not sensitive to changes within the ±20% interval. The results from studies analyzing sedentary patterns, using different cut-points, can be compared within these boundaries. Furthermore, commercial, hip-worn activity trackers can implement feedback and interventions on sedentary behaviour patterns, using these cut-points. PMID:26712758
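The sensitivity analysis amounts to reclassifying the same per-minute activity values under shifted cut-points and comparing the resulting sedentary-pattern measures. A sketch with synthetic data, not the study's Promove 3D recordings, illustrating a total sedentary time that is stable within ±10% of the 1660 × 10−3 m·s−2 cut-point:

```python
def sedentary_minutes(counts, cut_point):
    """Minutes classified sedentary: per-minute activity below the cut-point."""
    return sum(c < cut_point for c in counts)

# One hour of synthetic per-minute activity values (x 10^-3 m/s^2)
counts = [200, 300, 1400, 2000, 2500, 900] * 10
for factor in (0.9, 1.0, 1.1):        # cut-point shifted by +/- 10%
    print(factor, sedentary_minutes(counts, 1660 * factor))  # → 40 each time
```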

  16. Development and Validation of the POSITIVES Scale (Postsecondary Information Technology Initiative Scale)

    ERIC Educational Resources Information Center

    Fichten, Catherine S.; Asuncion, Jennison V.; Nguyen, Mai N.; Wolforth, Joan; Budd, Jillian; Barile, Maria; Gaulin, Chris; Martiniello, Natalie; Tibbs, Anthony; Ferraro, Vittoria; Amsel, Rhonda

    2009-01-01

    Data on how well information and communication technology (ICT) needs of 1354 Canadian college and university students with disabilities are met on and off campus were collected using the newly developed Positives Scale (Postsecondary Information Technology Initiative Scale). The measure contains 26 items which use a 6-point Likert scale (1 =…

  17. Displacement monitoring and modelling of a high-speed railway bridge using C-band Sentinel-1 data

    NASA Astrophysics Data System (ADS)

    Huang, Qihuan; Crosetto, Michele; Monserrat, Oriol; Crippa, Bruno

    2017-06-01

    Bridge displacement monitoring is one of the key components of bridge structural health monitoring. Traditional methods, usually based on limited sets of sensors mounted on a given bridge, collect point-like deformation information and have the disadvantage of providing incomplete displacement information. In this paper, a Persistent Scatterer Interferometry (PSI) approach is used to monitor the displacements of the Nanjing Dashengguan Yangtze River high-speed railway bridge. Twenty-nine (29) European Space Agency Sentinel-1A images, acquired from April 25, 2015 to August 5, 2016, were used in the PSI analysis. A total of 1828 measurement points were selected on the bridge. The results show a maximum longitudinal displacement of about 150 mm on each side of the bridge. The measured displacements showed a strong correlation with the environmental temperature at the time the images used were acquired, indicating that they were due to thermal expansion of the bridge. At each pier, a regression model based on the PSI-measured displacements was compared with a model based on in-situ measurements. The good agreement of these models demonstrates the capability of the PSI technique to monitor long-span railway bridge displacements. By comparing the modelled displacements and dozens of PSI measurements, we show how the performance of movable bearings can be evaluated. The high density of the PSI measurement points is advantageous for the health monitoring of the entire bridge.
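The per-pier regression models relate PSI-measured displacement to environmental temperature. An ordinary least-squares sketch with hypothetical displacement-temperature pairs; the paper's actual model form and data are not reproduced here:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx, my - sxy / sxx * mx

# Hypothetical longitudinal displacements (mm) against temperature (deg C)
temp = [5, 10, 15, 20, 25, 30]
disp = [10, 35, 58, 85, 110, 133]
slope, intercept = fit_line(temp, disp)
print(round(slope, 2))  # → 4.95 (mm of displacement per deg C here)
```

A strong linear fit of displacement against temperature, as found in the study, is the signature of thermal expansion.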

  18. Making Quality Health Websites a National Public Health Priority: Toward Quality Standards.

    PubMed

    Devine, Theresa; Broderick, Jordan; Harris, Linda M; Wu, Huijuan; Hilfiker, Sandra Williams

    2016-08-02

    Most US adults have limited health literacy skills. They struggle to understand complex health information and services and to make informed health decisions. The Internet has quickly become one of the most popular places for people to search for information about their health, thereby making access to quality information on the Web a priority. However, there are no standardized criteria for evaluating Web-based health information. Every 10 years, the US Department of Health and Human Services' Office of Disease Prevention and Health Promotion (ODPHP) develops a set of measurable objectives for improving the health of the nation over the coming decade, known as Healthy People. There are two objectives in Healthy People 2020 related to website quality. The first is objective Health Communication and Health Information Technology (HC/HIT) 8.1: increase the proportion of health-related websites that meet 3 or more evaluation criteria for disclosing information that can be used to assess information reliability. The second is objective HC/HIT-8.2: increase the proportion of health-related websites that follow established usability principles. The ODPHP conducted a nationwide assessment of the quality of Web-based health information using the Healthy People 2020 objectives. The ODPHP aimed to establish (1) a standardized approach to defining and measuring the quality of health websites; (2) benchmarks for measurement; (3) baseline data points to capture the current status of website quality; and (4) targets to drive improvement. The ODPHP developed the National Quality Health Website Survey instrument to assess the quality of health-related websites. The ODPHP used this survey to review 100 top-ranked health-related websites in order to set baseline data points for these two objectives. The ODPHP then set targets to drive improvement by 2020. This study reviewed 100 health-related websites. 
For objective HC/HIT-8.1, a total of 58 out of 100 (58.0%) websites met 3 or more out of 6 reliability criteria. For objective HC/HIT-8.2, a total of 42 out of 100 (42.0%) websites followed 10 or more out of 19 established usability principles. On the basis of these baseline data points, ODPHP set targets for the year 2020 that meet minimal statistical significance: increasing the objective HC/HIT-8.1 data point to 70.5% and the objective HC/HIT-8.2 data point to 55.7%. This research is a critical first step in evaluating the quality of Web-based health information. The criteria proposed by ODPHP provide methods to assess website quality for professionals designing, developing, and managing health-related websites. The criteria, baseline data, and targets are valuable tools for driving quality improvement.

  19. Making Quality Health Websites a National Public Health Priority: Toward Quality Standards

    PubMed Central

    2016-01-01

    Background Most US adults have limited health literacy skills. They struggle to understand complex health information and services and to make informed health decisions. The Internet has quickly become one of the most popular places for people to search for information about their health, thereby making access to quality information on the Web a priority. However, there are no standardized criteria for evaluating Web-based health information. Every 10 years, the US Department of Health and Human Services' Office of Disease Prevention and Health Promotion (ODPHP) develops a set of measurable objectives for improving the health of the nation over the coming decade, known as Healthy People. There are two objectives in Healthy People 2020 related to website quality. The first is objective Health Communication and Health Information Technology (HC/HIT) 8.1: increase the proportion of health-related websites that meet 3 or more evaluation criteria for disclosing information that can be used to assess information reliability. The second is objective HC/HIT-8.2: increase the proportion of health-related websites that follow established usability principles. Objective The ODPHP conducted a nationwide assessment of the quality of Web-based health information using the Healthy People 2020 objectives. The ODPHP aimed to establish (1) a standardized approach to defining and measuring the quality of health websites; (2) benchmarks for measurement; (3) baseline data points to capture the current status of website quality; and (4) targets to drive improvement. Methods The ODPHP developed the National Quality Health Website Survey instrument to assess the quality of health-related websites. The ODPHP used this survey to review 100 top-ranked health-related websites in order to set baseline data points for these two objectives. The ODPHP then set targets to drive improvement by 2020. Results This study reviewed 100 health-related websites. 
For objective HC/HIT-8.1, a total of 58 out of 100 (58.0%) websites met 3 or more out of 6 reliability criteria. For objective HC/HIT-8.2, a total of 42 out of 100 (42.0%) websites followed 10 or more out of 19 established usability principles. On the basis of these baseline data points, ODPHP set targets for the year 2020 that meet the minimal statistical significance—increasing objective HC/HIT-8.1 data point to 70.5% and objective HC/HIT-8.2 data point to 55.7%. Conclusions This research is a critical first step in evaluating the quality of Web-based health information. The criteria proposed by ODPHP provide methods to assess website quality for professionals designing, developing, and managing health-related websites. The criteria, baseline data, and targets are valuable tools for driving quality improvement. PMID:27485512

  20. Instantaneous Transfer Entropy for the Study of Cardiovascular and Cardiorespiratory Nonstationary Dynamics.

    PubMed

    Valenza, Gaetano; Faes, Luca; Citi, Luca; Orini, Michele; Barbieri, Riccardo

    2018-05-01

Measures of transfer entropy (TE) quantify the direction and strength of coupling between two complex systems. Standard approaches assume stationarity of the observations, and therefore are unable to track time-varying changes in nonlinear information transfer with high temporal resolution. In this study, we aim to define and validate novel instantaneous measures of TE to provide an improved assessment of complex nonstationary cardiorespiratory interactions. We here propose a novel instantaneous point-process TE (ipTE) and validate its assessment as applied to cardiovascular and cardiorespiratory dynamics. In particular, heartbeat and respiratory dynamics are characterized through discrete time series, and modeled with probability density functions predicting the time of the next physiological event as a function of the past history. Likewise, nonstationary interactions between heartbeat and blood pressure dynamics are characterized as well. Furthermore, we propose a new measure of information transfer, the instantaneous point-process information transfer (ipInfTr), which is directly derived from point-process-based definitions of the Kolmogorov-Smirnov distance. Analysis on synthetic data, as well as on experimental data gathered from healthy subjects undergoing postural changes confirms that ipTE, as well as ipInfTr measures are able to dynamically track changes in physiological systems coupling. This novel approach opens new avenues in the study of hidden, transient, nonstationary physiological states involving multivariate autonomic dynamics in cardiovascular health and disease. The proposed method can also be tailored for the study of complex multisystem physiology (e.g., brain-heart or, more generally, brain-body interactions).
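For discretized series, transfer entropy with history length 1 can be estimated with plug-in probabilities. A sketch of that stationary, non-point-process simplification of the idea (not the authors' ipTE algorithm):

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE from x to y (bits) for discrete series, history length 1:
    sum over states of p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# y copies x with a one-step lag, so x fully determines y's next value
x = [0, 0, 1, 1] * 50
y = [0] + x[:-1]
print(round(transfer_entropy(x, y), 3))  # → 1.0 (one bit per step)
```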

  1. A systematic approach to sound decision making starts with financial reporting.

    PubMed

    Taylor, R B

    1989-11-01

    Managers and supervisors need information to measure departmental performance. Designing a reporting system requires managers to obtain needed information without being flooded by extraneous data. A reporting framework designed to examine five control points is a necessary tool, and a good place to start.

  2. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio

    DOE PAGES

    Polcari, J.

    2013-08-16

The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
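The SNR-LLR connection is easiest to see in the Gaussian mean-shift case, where the LLR is linear in the observation and its expected value under the signal hypothesis equals half the classical SNR. A sketch of that textbook case, as an illustration rather than the report's GSNR construction:

```python
def gaussian_llr(x, mu, sigma):
    """Log likelihood ratio of H1 ~ N(mu, sigma^2) vs H0 ~ N(0, sigma^2)
    for one observation x; linear in x in the Gaussian case."""
    return (mu * x - mu ** 2 / 2) / sigma ** 2

# Expected LLR under H1 is mu^2 / (2 sigma^2): half the classical SNR
print(gaussian_llr(2.0, 2.0, 1.0))  # → 2.0
```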

  3. Choosing the Optimal Number of B-spline Control Points (Part 1: Methodology and Approximation of Curves)

    NASA Astrophysics Data System (ADS)

    Harmening, Corinna; Neuner, Hans

    2016-09-01

Due to the establishment of the terrestrial laser scanner, the analysis strategies in engineering geodesy are changing from pointwise approaches to areal ones. These areal analysis strategies are commonly built on the modelling of the acquired point clouds. Freeform curves and surfaces like B-spline curves/surfaces are one possible approach to obtain space-continuous information. A variety of parameters determines the B-spline's appearance; the B-spline's complexity is mostly determined by the number of control points. Usually, this number of control points is chosen quite arbitrarily by intuitive trial-and-error procedures. In this paper, the Akaike Information Criterion and the Bayesian Information Criterion are investigated with regard to a justified and reproducible choice of the optimal number of control points of B-spline curves. Additionally, we develop a method which is based on the structural risk minimization of statistical learning theory. Unlike the Akaike and the Bayesian Information Criteria, this method does not use the number of parameters as the complexity measure of the approximating functions but their Vapnik-Chervonenkis dimension. Furthermore, it is also valid for non-linear models. Thus, the three methods differ in their target function to be minimized and consequently in their definition of optimality. The present paper will be continued by a second paper dealing with the choice of the optimal number of control points of B-spline surfaces.
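Both criteria trade goodness of fit against model complexity, e.g. BIC(k) = n ln(RSS/n) + k ln n, while AIC replaces the penalty with 2k. A sketch of the selection idea using polynomial coefficients as a stand-in for B-spline control points, on synthetic data rather than the paper's point clouds:

```python
import math

def fit_rss(x, y, k):
    """Residual sum of squares of the least-squares polynomial with k
    coefficients (degree k-1), via the normal equations with pivoting."""
    A = [[sum(xi ** (p + q) for xi in x) for q in range(k)] for p in range(k)]
    b = [sum(xi ** p * yi for xi, yi in zip(x, y)) for p in range(k)]
    for c in range(k):                          # Gaussian elimination
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv], b[c], b[piv] = A[piv], A[c], b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [arq - f * acq for arq, acq in zip(A[r], A[c])]
            b[r] -= f * b[c]
    a = [0.0] * k
    for c in reversed(range(k)):                # back substitution
        a[c] = (b[c] - sum(A[c][q] * a[q] for q in range(c + 1, k))) / A[c][c]
    return sum((yi - sum(a[j] * xi ** j for j in range(k))) ** 2
               for xi, yi in zip(x, y))

def bic(n, k, rss):
    """Bayesian Information Criterion; AIC is the same with penalty 2*k."""
    return n * math.log(rss / n) + k * math.log(n)

# Quadratic profile plus a small deterministic perturbation (synthetic)
x = [i / 50 for i in range(100)]
y = [2 + 3 * xi - 1.5 * xi ** 2 + 0.05 * (-1) ** i for i, xi in enumerate(x)]
best_k = min(range(1, 6), key=lambda k: bic(len(x), k, fit_rss(x, y, k)))
print(best_k)  # → 3 (the three true coefficients of the quadratic)
```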

  4. Predicting Bird Response to Alternative Management Scenarios on a Ranch in Campeche, México

    Treesearch

    Paul A. Wood; Deanna K. Dawson; John R. Sauer; Marcia H. Wilson

    2005-01-01

    We developed models to predict the potential response of wintering Neotropical migrant and resident bird species to alternative management scenarios, using data from point counts of birds along with habitat variables measured or estimated from remotely sensed data in a Geographic Information System. Expected numbers of occurrences at points were calculated for 100...

  5. LiDAR and Image Point Cloud Comparison

    DTIC Science & Technology

    2014-09-01

Acronyms from the thesis front matter: UAV (unmanned aerial vehicle), USGS (United States Geological Survey), UTM (Universal Transverse Mercator), WGS 84 (World Geodetic System 1984), GPS (Global Positioning System), IMU (inertial measurement unit), LiDAR (light detection and ranging), MI (mutual information), MVS, WSI. Table-of-contents fragments: Physics of LiDAR Systems; Data and Software.

  6. Pilot points method for conditioning multiple-point statistical facies simulation on flow data

    NASA Astrophysics Data System (ADS)

    Ma, Wei; Jafarpour, Behnam

    2018-05-01

We propose a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, their calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and then are used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) is adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at selected locations away from the wells and the latter to ensure consistent facies structure and connectivity away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.
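The score map can be sketched as a weighted combination of the three normalized information sources; the equal weights and min-max normalization here are assumptions, since the abstract does not specify how the sources are combined:

```python
def score_map(uncertainty, sensitivity, data_mismatch, weights=(1/3, 1/3, 1/3)):
    """Combine the three information sources into one placement score
    per grid cell (each map first scaled to [0, 1])."""
    def normalize(m):
        lo, hi = min(m), max(m)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in m]
    maps = [normalize(m) for m in (uncertainty, sensitivity, data_mismatch)]
    return [sum(w * m[i] for w, m in zip(weights, maps))
            for i in range(len(uncertainty))]

# Five grid cells; the next pilot point goes to the highest-scoring cell
unc = [0.2, 0.9, 0.5, 0.1, 0.7]
sens = [1.0, 4.0, 2.0, 0.5, 3.5]
mis = [0.1, 0.8, 0.3, 0.0, 0.9]
scores = score_map(unc, sens, mis)
print(max(range(5), key=scores.__getitem__))  # → 1
```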

  7. Assimilating Flow Data into Complex Multiple-Point Statistical Facies Models Using Pilot Points Method

    NASA Astrophysics Data System (ADS)

    Ma, W.; Jafarpour, B.

    2017-12-01

We develop a new pilot points method for conditioning discrete multiple-point statistical (MPS) facies simulation on dynamic flow data. While conditioning MPS simulation on static hard data is straightforward, their calibration against nonlinear flow data is nontrivial. The proposed method generates conditional models from a conceptual model of geologic connectivity, known as a training image (TI), by strategically placing and estimating pilot points. To place pilot points, a score map is generated based on three sources of information: (i) the uncertainty in facies distribution, (ii) the model response sensitivity information, and (iii) the observed flow data. Once the pilot points are placed, the facies values at these points are inferred from production data and are used, along with available hard data at well locations, to simulate a new set of conditional facies realizations. While facies estimation at the pilot points can be performed using different inversion algorithms, in this study the ensemble smoother (ES) and its multiple data assimilation variant (ES-MDA) are adopted to update permeability maps from production data, which are then used to statistically infer facies types at the pilot point locations. The developed method combines the information in the flow data and the TI by using the former to infer facies values at select locations away from the wells and the latter to ensure consistent facies structure and connectivity away from measurement locations. Several numerical experiments are used to evaluate the performance of the developed method and to discuss its important properties.

  8. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    PubMed

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, which utilises more information and reduces the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
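The multivariate random-effects model sketched in this abstract can be written in a standard two-level form (the notation here is assumed, not taken from the paper):

```latex
\begin{aligned}
\mathbf{y}_i &\sim N(\boldsymbol{\theta}_i,\, \mathbf{S}_i) && \text{(within-study: effect estimates at multiple cut-points/methods)} \\
\boldsymbol{\theta}_i &\sim N(\boldsymbol{\theta},\, \boldsymbol{\Sigma}) && \text{(between-study heterogeneity and correlation)}
\end{aligned}
```

where y_i collects study i's reported prognostic effect estimates, S_i is its within-study covariance matrix, theta holds the summary effects per cut-point/method, and Sigma the between-study variances and correlations. Borrowing strength across correlated outcomes is what reduces the impact of cut-points or measurement methods that some studies leave unreported.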

  9. Inertial Pointing and Positioning System

    NASA Technical Reports Server (NTRS)

    Yee, Robert (Inventor); Robbins, Fred (Inventor)

    1998-01-01

    An inertial pointing and control system and method for pointing to a designated target with known coordinates from a platform to provide accurate position, steering, and command information. The system continuously receives GPS signals and corrects Inertial Navigation System (INS) dead reckoning or drift errors. An INS is mounted directly on a pointing instrument, rather than in a remote location on the platform, for monitoring the terrestrial position and instrument attitude and for pointing the instrument at designated celestial targets or ground-based landmarks. As a result, the pointing instrument and the INS move independently in inertial space from the platform, since the INS is decoupled from the platform. Another important characteristic of the present system is that selected INS measurements are combined with predefined coordinate transformation equations and control logic algorithms under computer control in order to generate inertial pointing commands for the pointing instrument. More specifically, the computer calculates the desired instrument angles (Phi, Theta, Psi), which are then compared to the Euler angles measured by the instrument-mounted INS, and forms the pointing command error angles from the compared difference.

  10. Analysis of Uncertainty in a Middle-Cost Device for 3D Measurements in BIM Perspective

    PubMed Central

    Sánchez, Alonso; Naranjo, José-Manuel; Jiménez, Antonio; González, Alfonso

    2016-01-01

    Medium-cost devices equipped with sensors are being developed to obtain 3D measurements. Some allow for generating geometric models and point clouds. Nevertheless, the accuracy of these measurements should be evaluated, taking into account the requirements of the Building Information Model (BIM). This paper analyzes the uncertainty in outdoor/indoor three-dimensional coordinate measurements and point clouds (using Spherical Accuracy Standard (SAS) methods) for Eyes Map, a medium-cost tablet manufactured by e-Capture Research & Development Company, Mérida, Spain. To achieve this, in outdoor tests the coordinates of targets at distances from 1 to 6 m were measured with the device and point clouds were obtained. Subsequently, these were compared to the coordinates of the same targets measured by a Total Station. The Euclidean average distance error was 0.005–0.027 m for measurements by Photogrammetry and 0.013–0.021 m for the point clouds. All of them satisfy the tolerance for point cloud acquisition (0.051 m) according to the BIM Guide for 3D Imaging (General Services Administration); similar results are obtained in the indoor tests, with values of 0.022 m. In this paper, we establish the optimal distances for the observations in both Photogrammetry and 3D Photomodeling modes (outdoor) and point out some working conditions to avoid in indoor environments. Finally, the authors discuss some recommendations for improving the performance and working methods of the device. PMID:27669245
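The accuracy figures quoted above amount to an average Euclidean distance between device coordinates and Total Station reference coordinates for the same targets. A minimal sketch of that comparison (function and constant names are illustrative):

```python
import math

def mean_euclidean_error(measured, reference):
    """Average 3-D Euclidean distance (metres) between device-measured
    coordinates and reference coordinates for the same targets."""
    assert len(measured) == len(reference)
    total = 0.0
    for p, q in zip(measured, reference):
        total += math.dist(p, q)  # Euclidean distance between (x, y, z) tuples
    return total / len(measured)

# GSA BIM Guide for 3D Imaging tolerance for point cloud acquisition, as cited above
BIM_TOLERANCE_M = 0.051
```

A result such as 0.02 m would then be checked directly against `BIM_TOLERANCE_M`.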

  11. Motion data classification on the basis of dynamic time warping with a cloud point distance measure

    NASA Astrophysics Data System (ADS)

    Switonski, Adam; Josinski, Henryk; Zghidi, Hafedh; Wojciechowski, Konrad

    2016-06-01

    The paper deals with the problem of classification of model-free motion data. A nearest-neighbor classifier is proposed, based on comparison by the Dynamic Time Warping transform with a cloud point distance measure. The classification utilizes both specific gait features, reflected by the movements of successive skeleton joints, and anthropometric data. To validate the proposed approach, the human gait identification challenge problem is taken into consideration. A motion capture database containing data of 30 different humans, collected in the Human Motion Laboratory of the Polish-Japanese Academy of Information Technology, is used. The achieved results are satisfactory: the obtained accuracy of human recognition exceeds 90%. What is more, the applied cloud point distance measure does not depend on the calibration process of the motion capture system, which makes the validation reliable.
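The comparison described above can be sketched as classic dynamic time warping over sequences of skeleton poses, with a per-frame distance computed on the joint point cloud. The mean per-joint Euclidean distance used here is an assumption; the paper's exact cloud point measure may differ:

```python
import numpy as np

def cloud_distance(frame_a, frame_b):
    """Distance between two poses, each an (n_joints, 3) array of joint
    positions: the mean per-joint Euclidean distance (assumed measure)."""
    return float(np.mean(np.linalg.norm(frame_a - frame_b, axis=1)))

def dtw(seq_a, seq_b, dist=cloud_distance):
    """Classic dynamic time warping cost between two motion sequences."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(seq_a[i - 1], seq_b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

A nearest-neighbor classifier then assigns a query gait the label of the training sequence with the smallest DTW cost.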

  12. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and physical and chemical interactions between groundwater and the porous medium make solute transport in the medium more complicated still. An appropriate method to describe this complexity is essential when studying solute transport and conversion in porous media. Information entropy can measure uncertainty and disorder; we therefore used information entropy theory to investigate the connection between information entropy and the complexity of solute transport in heterogeneous porous media. Based on Markov theory, two-dimensional stochastic fields of hydraulic conductivity (K) were generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that the entropy increases with the complexity of the solute transport process. For the point source, the one-dimensional entropy of solute concentration first increased and then decreased along the X and Y directions. As time increased, the entropy peak value remained essentially unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of solute concentration increase, which results in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of line sources was higher than that of point sources, and the solute entropy obtained from continuous input was higher than that from instantaneous input. As the average lithofacies length increased, media continuity increased, flow and solute transport complexity weakened, and the corresponding information entropy decreased. Longitudinal macrodispersivity declined slightly at early time and then rose. The solute spatial and temporal distribution had significant impacts on the information entropy, and information entropy could reflect changes in the solute distribution. Information entropy thus appears to be a tool for characterizing the spatial and temporal complexity of solute migration and provides a reference for future research.
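One common way to attach an entropy to a plume, consistent with the behaviour described above, is to treat the normalized concentration field as a probability distribution over grid cells and take its Shannon entropy. This is a generic sketch; the paper's exact discretization may differ:

```python
import numpy as np

def concentration_entropy(conc):
    """Shannon entropy of a solute plume: normalize the concentration
    field to a probability distribution over grid cells and compute
    -sum(p * ln p). Zero cells carry no entropy and are skipped."""
    c = np.asarray(conc, dtype=float).ravel()
    c = c[c > 0]
    p = c / c.sum()
    return float(-np.sum(p * np.log(p)))
```

With this definition a point source (all mass in one cell) has entropy 0, and a plume spread uniformly over N cells attains the maximum ln(N), matching the intuition that entropy grows as the plume spreads and mixes.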

  13. Using a plenoptic camera to measure distortions in wavefronts affected by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Eslami, Mohammed; Wu, Chensheng; Rzasa, John; Davis, Christopher C.

    2012-10-01

    Ideally, as planar wave fronts travel through an imaging system, all rays, or vectors pointing in the direction of the propagation of energy, are parallel, and thus the wave front is focused to a particular point. If the wave front arrives at an imaging system with energy vectors that point in different directions, each part of the wave front will be focused at a slightly different point on the sensor plane and result in a distorted image. The Hartmann test, which involves the insertion of a series of pinholes between the imaging system and the sensor plane, was developed to sample the wavefront at different locations and measure the distortion angles at different points in the wave front. An adaptive optic system, such as a deformable mirror, is then used to correct for these distortions and allow the planar wave front to focus at the desired point on the sensor plane, thereby correcting the distorted image. The apertures of a pinhole array limit the amount of light that reaches the sensor plane; by replacing the pinholes with a microlens array, each bundle of rays is focused, which brightens the image. Microlens arrays are making their way into newer imaging technologies, such as "light field" or "plenoptic" cameras. In these cameras, the microlens array is used to recover the ray information of the incoming light by using post-processing techniques to focus on objects at different depths. The goal of this paper is to demonstrate the use of these plenoptic cameras to recover the distortions in wavefronts. CODE-V simulations show that, by taking advantage of the microlens array within the plenoptic camera, its performance can provide more information than a Shack-Hartmann sensor. Using the microlens array to retrieve the ray information and then backstepping through the imaging system provides information about distortions in the arriving wavefront.
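The slope measurement underlying both the Shack-Hartmann sensor and the microlens-based approach can be sketched as the displacement of a focal spot's centroid divided by the lenslet focal length. The geometry below is a simplified assumption (one lenslet, reference spot at the subimage centre):

```python
import numpy as np

def subaperture_tilt(spot_image, f_lenslet, pixel_pitch):
    """Local wavefront tilt (radians, y and x components) behind one
    lenslet, estimated from the centroid displacement of its focal spot
    relative to the subimage centre: theta ~ displacement / f."""
    img = np.asarray(spot_image, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    cy = (ys * img).sum() / total  # intensity-weighted centroid (rows)
    cx = (xs * img).sum() / total  # intensity-weighted centroid (cols)
    ref_y = (img.shape[0] - 1) / 2.0
    ref_x = (img.shape[1] - 1) / 2.0
    dy = (cy - ref_y) * pixel_pitch
    dx = (cx - ref_x) * pixel_pitch
    return dy / f_lenslet, dx / f_lenslet
```

Repeating this for every lenslet yields the slope map from which the wavefront is reconstructed.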

  14. Investigating Temporal and Spatial Variations in Near Surface Water Content using GPR

    NASA Astrophysics Data System (ADS)

    Hubbard, S. S.; Grote, K.; Kowalsky, M. B.; Rubin, Y.

    2001-12-01

    Using only conventional point or well logging measurements, it is difficult to obtain information about water content with sufficient spatial resolution and coverage to be useful for near surface applications such as for input to vadose zone predictive models or for assisting with precision crop management. Prompted by successful results of a controlled ground penetrating radar (GPR) pilot study, we are investigating the applicability of GPR methods to estimate near surface water content at a study site within the Robert Mondavi vineyards in Napa County, California. Detailed information about soil variability and water content within vineyards could assist in estimation of plantable acreage, in the design of vineyard layout and in the design of an efficient irrigation/agrochemical application procedure. Our research at the winery study site involves investigation of optimal GPR acquisition and processing techniques, modeling of GPR attributes, and inversion of the attributes for water content information over space and time. A secondary goal of our project is to compare water content information obtained from the GPR data with information available from other types of measurements that are being used to assist in precision crop management. This talk will focus on point and spatial correlation estimation of water content obtained using GPR groundwave information only, and comparison of those estimates with information obtained from analysis of soils, TDR, neutron probe and remote sensing data sets. This comparison will enable us to 1) understand the potential of GPR for providing water content information in the very shallow subsurface, and to 2) investigate the interrelationships between the different types of measurements (and associated measurement scales) that are being utilized to characterize the shallow subsurface water content over space and time.

  15. Drainage areas of the Twelvepole Creek basin, West Virginia; Big Sandy River basin, West Virginia; Tug Fork basin, Virginia, Kentucky, West Virginia

    USGS Publications Warehouse

    Wilson, M.W.

    1979-01-01

    Drainage areas were determined for 61 basins in the Twelvepole Creek basin, West Virginia; 11 basins of the Big Sandy River Basin, West Virginia; and 210 basins in the Tug Fork basin of Virginia, Kentucky, and West Virginia. Most basins with areas greater than 5 square miles were included. Drainage areas were measured with electronic digitizing equipment, and supplementary measurements were made with a hand planimeter. Stream mileages were determined by measuring, with a graduated plastic strip, distances from the mouth of each stream to the measuring point on that stream. Mileages were reported to the nearest one-hundredth of a mile in all cases. The latitude and longitude of each measuring point were determined with electronic digitizing equipment and are reported to the nearest second. The information is listed in tabular form in downstream order. Measuring points for the basins are located in the tables by intersecting tributaries, by counties, by map quadrangles, or by latitude and longitude. (Woodard-USGS)

  16. Method and apparatus for automatically detecting patterns in digital point-ordered signals

    DOEpatents

    Brudnoy, David M.

    1998-01-01

    The present invention is a method and system for detecting a physical feature of a test piece by detecting a pattern in a signal representing data from inspection of the test piece. The pattern is detected by automated additive decomposition of a digital point-ordered signal which represents the data. The present invention can properly handle a non-periodic signal. A physical parameter of the test piece is measured. A digital point-ordered signal representative of the measured physical parameter is generated. The digital point-ordered signal is decomposed into a baseline signal, a background noise signal, and a peaks/troughs signal. The peaks/troughs from the peaks/troughs signal are located and peaks/troughs information indicating the physical feature of the test piece is output.
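The additive decomposition described above can be sketched as follows. The patent does not specify the algorithm, so a running-median baseline and a k-sigma threshold stand in for it here; by construction the three components sum exactly back to the input signal:

```python
import numpy as np

def decompose(signal, window=11, k=3.0):
    """Additive decomposition of a point-ordered signal into a baseline,
    background noise, and a peaks/troughs component (assumed method:
    running-median baseline, k-sigma peak threshold)."""
    s = np.asarray(signal, dtype=float)
    half = window // 2
    padded = np.pad(s, half, mode="edge")
    # running median tracks the slowly varying baseline, even if non-periodic
    baseline = np.array([np.median(padded[i:i + window]) for i in range(len(s))])
    residual = s - baseline
    sigma = np.std(residual)
    # excursions beyond k standard deviations are classified as peaks/troughs
    peaks = np.where(np.abs(residual) > k * sigma, residual, 0.0)
    noise = residual - peaks
    return baseline, noise, peaks
```

Locating the physical feature then reduces to reading off the nonzero entries of the peaks/troughs component.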

  17. Method and apparatus for automatically detecting patterns in digital point-ordered signals

    DOEpatents

    Brudnoy, D.M.

    1998-10-20

    The present invention is a method and system for detecting a physical feature of a test piece by detecting a pattern in a signal representing data from inspection of the test piece. The pattern is detected by automated additive decomposition of a digital point-ordered signal which represents the data. The present invention can properly handle a non-periodic signal. A physical parameter of the test piece is measured. A digital point-ordered signal representative of the measured physical parameter is generated. The digital point-ordered signal is decomposed into a baseline signal, a background noise signal, and a peaks/troughs signal. The peaks/troughs from the peaks/troughs signal are located and peaks/troughs information indicating the physical feature of the test piece is output. 14 figs.

  18. Using Eddy Covariance to Quantify Methane Emissions from a Dynamic Heterogeneous Area

    EPA Science Inventory

    Measuring emissions of CH4, CO2, H2O, and other greenhouse gases from heterogeneous land area sources is challenging. Dynamic changes within the source area as well as changing environmental conditions make individual point measurements less informative than desired, especially w...

  19. Using Eddy Covariance to Quantify Methane Emission from a Dynamic Heterogeneous Area

    EPA Science Inventory

    Measuring emissions of CH4, CO2, H2O, and other greenhouse gases from heterogeneous land area sources is challenging. Dynamic changes within the source area as well as changing environmental conditions make individual point measurements less informative than desired, especially w...

  20. Understanding and using quality information for quality improvement: The effect of information presentation.

    PubMed

    Zwijnenberg, Nicolien C; Hendriks, Michelle; Delnoij, Diana M J; de Veer, Anke J E; Spreeuwenberg, Peter; Wagner, Cordula

    2016-12-01

    To examine how information presentation affects the understanding and use of information for quality improvement. An experimental design, testing 22 formats, and showing information on patient safety culture. Formats differed in visualization, outcomes and benchmark information. Respondents viewed three randomly selected presentation formats in an online survey, completing several tasks per format. The hospital sector in the Netherlands. A volunteer sample of healthcare professionals, mainly nurses, working in hospitals. Main Outcome Measure(s): The degree to which information is understandable and usable (accurate choice for quality improvement, sense of urgency to change and appraisal of one's own performance). About 115 healthcare professionals participated (response rate 25%), resulting in 345 reviews. Information in tables (P = 0.007) and bar charts (P < 0.0001) was better understood than information in radar charts. Presenting outcomes on a 5-point scale (P < 0.001) or as '% positive responders' (P < 0.001) was better understood than '% negative responders'. Formats without benchmarks were better understood than formats with benchmarks. Use: Bar charts resulted in more accurate choices than tables (P = 0.003) and radar charts (P < 0.001). Outcomes on a 5-point scale resulted in more accurate choices than '% negative responders' (P = 0.007). Presenting '% positive responders' resulted in a higher sense of urgency to change than outcomes on a 5-point scale (P = 0.002). Benchmark information had inconsistent effects on the appraisal of one's own performances. Information presentation affects healthcare professionals' understanding and use of quality information. Our findings add to the understanding of how quality information can best be communicated to healthcare professionals for realizing quality improvements. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. 
For permissions, please e-mail: journals.permissions@oup.com

  1. Closure measures for coarse-graining of the tent map.

    PubMed

    Pfante, Oliver; Olbrich, Eckehard; Bertschinger, Nils; Ay, Nihat; Jost, Jürgen

    2014-03-01

    We quantify the relationship between the dynamics of a time-discrete dynamical system, the tent map T and its iterations T(m), and the induced dynamics at a symbolic level, in information-theoretical terms. The symbol dynamics, given by a binary string s of length m, is obtained by choosing a partition point [Formula: see text] and lumping together the points [Formula: see text] such that T(i)(x) concurs with the (i-1)th digit of s, i.e., we apply a so-called threshold crossing technique. Interpreting the original dynamics and the symbolic one as different levels, this allows us to quantitatively evaluate and compare various closure measures that have been proposed for identifying emergent macro-levels of a dynamical system. In particular, we can see how these measures depend on the choice of the partition point α. As a main benefit of this new information-theoretical approach, we obtain all Markov partitions with full support of the time-discrete dynamical system induced by the tent map. Furthermore, we derive an example of a Markovian symbol dynamics whose underlying partition is not Markovian at all, and even a whole hierarchy of Markovian symbol dynamics.
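The threshold-crossing symbolization described above can be sketched in a few lines: iterate the tent map and record, at each step, on which side of the partition point the orbit lies (alpha = 1/2 gives the standard generating partition):

```python
def tent(x, mu=2.0):
    """Full tent map on [0, 1]."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def symbol_string(x0, m, alpha=0.5):
    """Threshold-crossing symbolization: the i-th binary digit records
    whether the i-th iterate of x0 falls left (0) or right (1) of the
    partition point alpha."""
    s, x = "", x0
    for _ in range(m):
        s += "0" if x < alpha else "1"
        x = tent(x)
    return s
```

Varying `alpha` away from 1/2 is exactly the dependence on the partition point that the closure measures in the paper are used to evaluate.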

  2. Automatic Lumbar Spondylolisthesis Measurement in CT Images.

    PubMed

    Liao, Shu; Zhan, Yiqiang; Dong, Zhongxing; Yan, Ruyi; Gong, Liyan; Zhou, Xiang Sean; Salganicoff, Marcos; Fei, Jun

    2016-07-01

    Lumbar spondylolisthesis is one of the most common spinal diseases. It is caused by the anterior shift of a lumbar vertebra relative to the subjacent vertebra. In current clinical practice, staging of spondylolisthesis is often conducted in a qualitative way. Although Meyerding grading opens the door to staging spondylolisthesis in a more quantitative way, it relies on manual measurement, which is time consuming and irreproducible. Thus, an automatic measurement algorithm becomes desirable for spondylolisthesis diagnosis and staging. However, there are two challenges. 1) Accurate detection of the most anterior and posterior points on the superior and inferior surfaces of each lumbar vertebra. Due to the small size of the vertebrae, slight detection errors may lead to significant measurement errors and, hence, wrong disease stages. 2) Automatically localizing and labeling each lumbar vertebra is required to provide the semantic meaning of the measurement. This is difficult since different lumbar vertebrae are highly similar in both shape and image appearance. To resolve these challenges, a new automatic measurement framework is proposed with two major contributions: First, a learning-based spine labeling method that integrates both image appearance and spine geometry information is designed to detect lumbar vertebrae. Second, a hierarchical method using both population information from atlases and domain-specific information in the target image is proposed for positioning the most anterior and posterior points. Validated on 258 CT spondylolisthesis patients, our method shows very similar results to manual measurements by radiologists and significantly increases measurement efficiency.
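Once the anterior/posterior landmark points are detected, Meyerding grading itself is a simple quartile rule on the slip percentage. A minimal sketch (the standard Meyerding cut-offs; this is not the paper's code):

```python
def meyerding_grade(slip_mm, endplate_width_mm):
    """Meyerding grade from the anterior slip distance and the
    anteroposterior width of the subjacent vertebra's superior endplate.
    Returns (grade, slip percentage); quartile thresholds are the
    standard Meyerding cut-offs."""
    pct = 100.0 * slip_mm / endplate_width_mm
    if pct < 25:
        return 1, pct
    if pct < 50:
        return 2, pct
    if pct < 75:
        return 3, pct
    if pct <= 100:
        return 4, pct
    return 5, pct  # grade V: spondyloptosis (slip beyond 100%)
```

This makes concrete why landmark accuracy matters: a few millimetres of detection error shift the percentage across a grade boundary.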

  3. Acoustic systems for the measurement of streamflow

    USGS Publications Warehouse

    Laenen, Antonius; Smith, Winchell

    1983-01-01

    The acoustic velocity meter (AVM), also referred to as an ultrasonic flowmeter, has been an operational tool for the measurement of streamflow since 1965. Very little information is available concerning AVM operation, performance, and limitations. The purpose of this report is to consolidate information in such a manner as to provide a better understanding about the application of this instrumentation to streamflow measurement. AVM instrumentation is highly accurate and nonmechanical. Most commercial AVM systems that measure streamflow use the time-of-travel method to determine a velocity between two points. The systems operate on the principle that point-to-point upstream travel-time of sound is longer than the downstream travel-time, and this difference can be monitored and measured accurately by electronics. AVM equipment has no practical upper limit of measurable velocity if sonic transducers are securely placed and adequately protected. AVM systems used in streamflow measurement generally operate with a resolution of ±0.01 meter per second, but this is dependent on system frequency, path length, and signal attenuation. In some applications the performance of AVM equipment may be degraded by multipath interference, signal bending, signal attenuation, and variable streamline orientation. Presently used minicomputer systems, although expensive to purchase and maintain, perform well. Increased use of AVM systems probably will be realized as smaller, less expensive, and more conveniently operable microprocessor-based systems become readily available. Available AVM equipment should be capable of flow measurement in a wide variety of situations heretofore untried. New signal-detection techniques and communication linkages can provide additional flexibility to the systems so that operation is possible in more river and estuary situations.
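The time-of-travel principle described above reduces to a short formula: for a path of length L, the velocity component along the path is (L/2)(1/t_down - 1/t_up), and dividing by cos(theta) of the path's angle to the flow gives the streamwise velocity. A minimal sketch:

```python
import math

def path_velocity(L, t_down, t_up):
    """Water velocity component along the acoustic path (m/s), from the
    downstream and upstream travel times of a reciprocal transmission:
    v_path = (L/2) * (1/t_down - 1/t_up)."""
    return (L / 2.0) * (1.0 / t_down - 1.0 / t_up)

def line_velocity(L, t_down, t_up, theta_deg):
    """Streamwise velocity for an acoustic path crossing the channel at
    angle theta (degrees) to the flow direction."""
    return path_velocity(L, t_down, t_up) / math.cos(math.radians(theta_deg))
```

A convenient property of the reciprocal formula is that the speed of sound cancels exactly, so the result does not depend on water temperature or salinity.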

  4. 40 CFR 141.75 - Reporting and recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Filtration and Disinfection § 141.75... water to the public. (2) Disinfection information specified in § 141.74(b) must be reported to the State... disinfection. (v) The daily measurement(s) of water temperature in °C following each point of disinfection. (vi...

  5. Study of annoyance due to urban automobile traffic. Annex 4: A catalog of the characteristics of noise at different measuring points

    NASA Technical Reports Server (NTRS)

    Aubree, D.; Auzou, S.; Rapin, J. M.

    1984-01-01

    The characteristics of urban traffic were studied. Data synthesis of and data specifically for the city of Paris concerning noise due to automobile traffic were examined. Information on noise characteristics at different measuring locations is presented.

  6. Improvement of the Accuracy of InSAR Image Co-Registration Based On Tie Points - A Review.

    PubMed

    Zou, Weibao; Li, Yan; Li, Zhilin; Ding, Xiaoli

    2009-01-01

    Interferometric Synthetic Aperture Radar (InSAR) is a new measurement technology, making use of the phase information contained in the Synthetic Aperture Radar (SAR) images. InSAR has been recognized as a potential tool for the generation of digital elevation models (DEMs) and the measurement of ground surface deformations. However, many critical factors affect the quality of InSAR data and limit its applications. One of these factors is InSAR data processing, which consists of image co-registration, interferogram generation, phase unwrapping and geocoding. The co-registration of InSAR images is the first step and dramatically influences the accuracy of InSAR products. In this paper, the principle and processing procedures of InSAR techniques are reviewed. One important factor in improving the accuracy of InSAR image co-registration, tie points, is reviewed in detail, covering the interval of tie points, the extraction of feature points, the window size for tie point matching and the measurement of interferogram quality.

  7. Improvement of the Accuracy of InSAR Image Co-Registration Based On Tie Points – A Review

    PubMed Central

    Zou, Weibao; Li, Yan; Li, Zhilin; Ding, Xiaoli

    2009-01-01

    Interferometric Synthetic Aperture Radar (InSAR) is a new measurement technology, making use of the phase information contained in the Synthetic Aperture Radar (SAR) images. InSAR has been recognized as a potential tool for the generation of digital elevation models (DEMs) and the measurement of ground surface deformations. However, many critical factors affect the quality of InSAR data and limit its applications. One of these factors is InSAR data processing, which consists of image co-registration, interferogram generation, phase unwrapping and geocoding. The co-registration of InSAR images is the first step and dramatically influences the accuracy of InSAR products. In this paper, the principle and processing procedures of InSAR techniques are reviewed. One important factor in improving the accuracy of InSAR image co-registration, tie points, is reviewed in detail, covering the interval of tie points, the extraction of feature points, the window size for tie point matching and the measurement of interferogram quality. PMID:22399966
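Tie point matching of the kind reviewed above is commonly done by normalized cross-correlation of a window around each tie point over a small search range. The sketch below is a generic pixel-level matcher (real InSAR co-registration refines this to sub-pixel accuracy on the complex data); window and search sizes are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size image patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_tie_point(master, slave, row, col, win=5, search=3):
    """Find the (row, col) offset in the slave image that best matches a
    win x win tie-point window in the master image, by exhaustive NCC
    over a (2*search+1)^2 search range. Returns (offset, score)."""
    h = win // 2
    ref = master[row - h:row + h + 1, col - h:col + h + 1]
    best, best_off = -2.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = row + dr, col + dc
            cand = slave[r - h:r + h + 1, c - h:c + h + 1]
            score = ncc(ref, cand)
            if score > best:
                best, best_off = score, (dr, dc)
    return best_off, best
```

The offsets found at many tie points are then fitted with a polynomial to resample the slave image onto the master geometry.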

  8. Brute Force Matching Between Camera Shots and Synthetic Images from Point Clouds

    NASA Astrophysics Data System (ADS)

    Boerner, R.; Kröhnert, M.

    2016-06-01

    3D point clouds, acquired by state-of-the-art terrestrial laser scanning techniques (TLS), provide spatial information with accuracies of up to several millimetres. Unfortunately, common TLS data has no spectral information about the covered scene. However, the matching of TLS data with images is important for monoplotting purposes and point cloud colouration. Well-established methods solve this issue by matching close-range images with point cloud data, by fitting optical camera systems on top of laser scanners, or by using ground control points. The approach addressed in this paper aims at matching 2D image and 3D point cloud data from a freely moving camera within an environment covered by a large 3D point cloud, e.g. a 3D city model. The key advantage of the free movement benefits augmented reality applications and real-time measurements. Therefore, a so-called real image, captured by a smartphone camera, has to be matched with a so-called synthetic image, which consists of 3D point cloud data reverse-projected to a synthetic projection centre whose exterior orientation parameters match those of the real image, assuming an ideal distortion-free camera.
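The synthetic-image generation described above amounts to projecting the 3D points through an ideal distortion-free pinhole camera at the assumed exterior orientation. A minimal sketch with a z-buffer so that only the nearest point colours each pixel (parameter names are generic, not the authors'):

```python
import numpy as np

def project_points(points_xyz, colors, R, t, f, cx, cy, width, height):
    """Reverse-project 3-D points into a synthetic image with an ideal
    pinhole camera: u = f*X/Z + cx, v = f*Y/Z + cy, after applying the
    world-to-camera exterior orientation (R, t)."""
    img = np.zeros((height, width, 3))
    depth = np.full((height, width), np.inf)  # z-buffer: keep nearest point
    cam = (R @ points_xyz.T).T + t
    for (X, Y, Z), col in zip(cam, colors):
        if Z <= 0:
            continue  # point lies behind the camera
        u = int(round(f * X / Z + cx))
        v = int(round(f * Y / Z + cy))
        if 0 <= u < width and 0 <= v < height and Z < depth[v, u]:
            depth[v, u] = Z
            img[v, u] = col
    return img
```

The resulting image is then compared against the real smartphone shot by brute-force feature matching, as the title indicates.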

  9. Depth perception camera for autonomous vehicle applications

    NASA Astrophysics Data System (ADS)

    Kornreich, Philipp

    2013-05-01

    An imager that can measure the distance from each pixel to the point on the object that is in focus at that pixel is described. Since it provides numeric information on the distance from the camera to all points in its field of view, it is ideally suited for autonomous vehicle navigation and robotic vision, and eliminates the LIDAR conventionally used for range measurements. The light arriving at a pixel through a convex lens adds constructively only if it comes from the object point in focus at this pixel; the light from all other object points cancels. Thus, the lens selects the point on the object whose range is to be determined. The range measurement is accomplished by short light guides at each pixel. The light guides contain a p-n junction and a pair of contacts along their length, as well as light sensing elements. The device uses ambient light that is only coherent within spherical-shell-shaped light packets of thickness equal to one coherence length. Each of the frequency components of the broadband light arriving at a pixel has a phase proportional to the distance from an object point to its image pixel.

  10. Phase space gradient of dissipated work and information: A role of relative Fisher information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamano, Takuya, E-mail: yamano@amy.hi-ho.ne.jp

    2013-11-15

    We show that an information theoretic distance measured by the relative Fisher information between canonical equilibrium phase densities corresponding to forward and backward processes is intimately related to the gradient of the dissipated work in phase space. We present a universal constraint on it via the logarithmic Sobolev inequality. Furthermore, we point out that a possible expression of the lower bound indicates a deep connection in terms of the relative entropy and the Fisher information of the canonical distributions.

  11. Technology, Incentives, or Both? Factors Related to Level of Hospital Health Information Exchange.

    PubMed

    Lin, Sunny C; Everson, Jordan; Adler-Milstein, Julia

    2018-02-28

    To assess whether the level of health information exchange (HIE) in U.S. hospitals is related to technology capabilities, incentives to exchange, or both. A total of 1,812 hospitals attesting to stage 2 of Medicare's Meaningful Use Incentive Program through April 2016. Hospital-level, multivariate OLS regression with state fixed effects was used to analyze the relationship between technology capability and incentives measures, and percent of care transitions with summary of care records (SCRs) sent electronically to subsequent providers. Stage 2 hospitals reported sending SCRs electronically for an average of 41 percent (median = 33 percent) of transitions. HIE level is related to four capability measures, one incentive measure, and one measure that is related to both capability and incentive. Percent of transitions with SCRs sent electronically was 3 percentage points higher (95 percent CI: 0.1-5.1) for hospitals with a third-party HIE vendor, 3 percentage points higher (95 percent CI: 0.5-5.4) for hospitals with an EHR vendor as their HIE vendor, and 3 percentage points higher (95 percent CI: 0.4-5.4) for hospitals that automatically alert primary care providers. The direction and statistical significance of the relationships between specific EHR vendor and electronic SCR transmission level varied by vendor. Nonprofits and government hospitals performed 5 percentage points higher (95 percent CI: 1.5-9.1) and 8 percentage points higher (95 percent CI: 3.4-12.3) than for-profits. Hospitals in systems performed 3 percentage points higher (95 percent CI: 0.8-6.1). The overall level of HIE is low, with hospitals sending an SCR electronically for less than half of patient transitions. Specific hospital characteristics related to both technology capabilities and incentives were associated with higher levels of HIE. © Health Research and Educational Trust.

  12. Evidence-Based Practice Point-of-Care Resources: A Quantitative Evaluation of Quality, Rigor, and Content.

    PubMed

    Campbell, Jared M; Umapathysivam, Kandiah; Xue, Yifan; Lockwood, Craig

    2015-12-01

    Clinicians and other healthcare professionals need access to summaries of evidence-based information in order to provide effective care to their patients at the point-of-care. Evidence-based practice (EBP) point-of-care resources have been developed and are available online to meet this need. This study aimed to develop a comprehensive list of available EBP point-of-care resources and evaluate their processes and policies for the development of content, in order to provide a critical analysis based upon rigor, transparency and measures of editorial quality to inform healthcare providers and promote quality improvement amongst publishers of EBP resources. A comprehensive and systematic search (Pubmed, CINAHL, and Cochrane Central) was undertaken to identify available EBP point-of-care resources, defined as "web-based medical compendia specifically designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, and evidence-based information (and possibly also guidance) to clinicians." A pair of investigators independently extracted information on general characteristics, content presentation, editorial quality, evidence-based methodology, and breadth and volume. Twenty-seven summary resources were identified, of which 22 met the predefined inclusion criteria for EBP point-of-care resources, and 20 could be accessed for description and assessment. Overall, the upper quartile of EBP point-of-care providers was assessed to be UpToDate, Nursing Reference Centre, Mosby's Nursing Consult, BMJ Best Practice, and JBI COnNECT+. The choice of which EBP point-of-care resources are suitable for an organization is a decision that depends heavily on the unique requirements of that organization and the resources it has available. However, the results presented in this study should enable healthcare providers to make that assessment in a clear, evidence-based manner, and provide a comprehensive list of the available options. 
© 2015 Sigma Theta Tau International.

  13. Correlation of Spatially Filtered Dynamic Speckles in Distance Measurement Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Semenov, Dmitry V.; Nippolainen, Ervin; Kamshilin, Alexei A.

    2008-04-15

    In this paper, the statistical properties of spatially filtered dynamic speckles are considered. This phenomenon has not yet been studied sufficiently, although spatial filtering is an important instrument for speckle-velocity measurements. With spatial filtering, speckle-velocity information is derived from the modulation frequency of the filtered light power, which is measured by a photodetector. The typical photodetector output is a narrow-band random noise signal that includes non-informative intervals. A reasonably precise frequency measurement therefore requires averaging, and averaging in turn implies uncorrelated samples. However, in the course of this research we found that correlation is a typical property not only of dynamic speckle patterns but also of spatially filtered speckles. With spatial filtering, the correlation is observed as a response of measurements made on the same part of the object surface, or when several adjacent photodetectors are used simultaneously. The correlations found cannot be explained by the properties of unfiltered dynamic speckles alone. As we demonstrate, the subject of this paper is important not only from a purely theoretical point of view but also from the point of view of applied speckle metrology; for example, using a single spatial filter with an array of photodetectors can greatly improve the accuracy of speckle-velocity measurements.

  14. Determination of welding residual stresses by inverse approach with eigenstrain formulations of boundary integral equation

    NASA Astrophysics Data System (ADS)

    Ma, Hang; Wang, Ying; Qin, Qing-Hua

    2011-04-01

    Based on the concept of eigenstrain, a straightforward computational model of the inverse approach is proposed for determining the residual stress field induced by welding, using the eigenstrain formulations of boundary integral equations. The eigenstrains are approximately expressed in terms of low-order polynomials in the local area around the welded zones. The domain integrals with polynomial eigenstrains are transformed into boundary integrals to preserve the favourable features of boundary-only discretization in the numerical solution. The sensitivity matrices in the inverse approach for evaluating the eigenstrain fields are constructed from the measured deformations (displacements) on the boundary, from the measured stresses in the domain after welding over a number of selected measuring points, or from both. The numerical examples show that the residual-stress results obtained from deformation measurements are consistently better than those from stress measurements, although they are sensitive to experimental noise. The results from stress measurements can be improved by introducing a few deformation measuring points while reducing the number of stress measuring points to lower the cost, since deformation is easier to measure than stress in practice.
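The inverse step described above amounts to a linear least-squares problem: a sensitivity matrix maps the unknown polynomial eigenstrain coefficients to the measured deformations or stresses. The sketch below uses a randomly generated sensitivity matrix and synthetic noisy measurements in place of the paper's boundary-integral quantities; it shows only the generic inversion, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivity matrix: rows = measuring points, columns =
# low-order polynomial eigenstrain coefficients (all values illustrative)
S = rng.standard_normal((20, 4))
c_true = np.array([1.0, -0.5, 0.2, 0.05])        # "true" coefficients
d = S @ c_true + 1e-3 * rng.standard_normal(20)  # noisy measured deformations

# Least-squares inversion for the eigenstrain coefficients
c_est, *_ = np.linalg.lstsq(S, d, rcond=None)
```

With low measurement noise the recovered coefficients track the true ones closely; as the abstract notes, the quality of the inversion degrades as the noise on the measured quantities grows.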

  15. The plenoptic camera as a wavefront sensor for the European Solar Telescope (EST)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, Luis F.; Martín, Yolanda; Díaz, José J.; Piqueras, J.; Rodríguez-Ramos, J. M.

    2009-08-01

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information from different points of view simultaneously, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. After the working principle is described, a laboratory setup is used to verify the capability of measuring the pupil-plane wavefront. A comparative discussion with respect to other wavefront sensors is also included.

  16. Molecular Rayleigh Scattering Techniques Developed for Measuring Gas Flow Velocity, Density, Temperature, and Turbulence

    NASA Technical Reports Server (NTRS)

    Mielke, Amy F.; Seasholtz, Richard G.; Elam, Kristie A.; Panda, Jayanta

    2005-01-01

    Nonintrusive optical point-wise measurement techniques utilizing the principles of molecular Rayleigh scattering have been developed at the NASA Glenn Research Center to obtain time-averaged information about gas velocity, density, temperature, and turbulence, or dynamic information about gas velocity and density in unseeded flows. These techniques enable measurements that are necessary for validating computational fluid dynamics (CFD) and computational aeroacoustic (CAA) codes. Dynamic measurements allow the calculation of power spectra for the various flow properties. This type of information is currently being used in jet noise studies, correlating sound pressure fluctuations with velocity and density fluctuations to determine noise sources in jets. These nonintrusive techniques are particularly useful in supersonic flows, where seeding the flow with particles is not an option, and where the environment is too harsh for hot-wire measurements.

  17. Motivational Differences in Seeking Out Evaluative Categorization Information.

    PubMed

    Smallman, Rachel; Becker, Brittney

    2017-07-01

    Previous research shows that people draw finer evaluative distinctions when rating liked versus disliked objects (e.g., wanting a 5-point scale to evaluate liked cuisines and a 3-point scale to rate disliked cuisines). Known as the preference-categorization effect, this pattern may exist not only in how individuals form evaluative distinctions but also in how individuals seek out evaluative information. The current research presents three experiments that examine motivational differences in evaluative information seeking (rating scales and attributes). Experiment 1 found that freedom of choice (the ability to avoid undesirable stimuli) and sensitivity to punishment (as measured by the Behavior Inhibition System/Behavioral Approach System [BIS/BAS] scale) influenced preferences for desirable and undesirable evaluative information in a health-related decision. Experiment 2 examined choice optimization, finding that maximizers prefer finer evaluative information for both liked and disliked options in a consumer task. Experiment 3 found that this pattern generalizes to another type of evaluative categorization, attributes.

  18. Memory for general and specific value information in younger and older adults: measuring the limits of strategic control.

    PubMed

    Castel, Alan D; Farb, Norman A S; Craik, Fergus I M

    2007-06-01

    The ability to selectively remember important information is a critical function of memory. Although previous research has suggested that older adults are impaired in a variety of episodic memory tasks, recent work has demonstrated that older adults can selectively remember high-value information. In the present research, we examined how younger and older adults selectively remembered words with various assigned numeric point values, to see whether younger adults could remember more specific value information than could older adults. Both groups were equally good at recalling point values when recalling the range of high-value words, but younger adults outperformed older adults when recalling specific values. Although older adults were more likely to recognize negative value words, both groups exhibited control by not recalling negative value information. The findings suggest that although both groups retain high-value information, older adults rely more on gist-based encoding and retrieval operations, whereas younger adults are able to remember specific numeric value information.

  19. Detecting temporal change in land-surface altitude using robotic land-surveying techniques and geographic information system applications at an earthen dam site in Southern Westchester County, New York

    USGS Publications Warehouse

    Noll, Michael L.; Chu, Anthony

    2017-08-14

    In 2005, the U.S. Geological Survey began a cooperative study with New York City Department of Environmental Protection to characterize the local groundwater-flow system and identify potential sources of seeps on the southern embankment at the Hillview Reservoir in southern Westchester County, New York. Monthly site inspections at the reservoir indicated an approximately 90-square-foot depression in the land surface directly upslope from a seep that has episodically flowed since 2007. In July 2008, the U.S. Geological Survey surveyed the topography of land surface in this depression area by collecting high-accuracy (resolution less than 1 inch) measurements. A point of origin was established for the topographic survey by using differentially corrected positional data collected by a global navigation satellite system. Eleven points were surveyed along the edge of the depression area and at arbitrary locations within the depression area by using robotic land-surveying techniques. The points were surveyed again in March 2012 to evaluate temporal changes in land-surface altitude. Survey measurements of the depression area indicated that the land-surface altitude at 8 of the 11 points decreased beyond the accepted measurement uncertainty during the 44 months from July 2008 to March 2012. Two additional control points were established at stable locations along Hillview Avenue, which runs parallel to the embankment. These points were measured during the July 2008 survey and measured again during the March 2012 survey to evaluate the relative accuracy of the altitude measurements. The relative horizontal and vertical (altitude) accuracies of the 11 topographic measurements collected in March 2012 were ±0.098 and ±0.060 feet (ft), respectively. 
Changes in topography at 8 of the 11 points ranged from 0.09 to 0.63 ft, and topography remained constant, or within the measurement uncertainty, for 3 of the 11 points. Two cross sections were constructed through the depression area by using land-surface altitude data that were interpolated from positional data collected during the two topographic surveys. Cross section A–A′ was approximately 8.5 ft long and consisted of three surveyed points that trended north to south across the depression. Land-surface altitude change decreased along the entire north-south trending cross section during the 44 months, and ranged from 0.2 to more than 0.6 ft. In general, greater land-surface altitude change was measured north of the midpoint as compared to south of the midpoint of the cross section. Cross section B–B′ was 18 ft long and consisted of six surveyed points that trended east to west across the depression. Land-surface altitude change generally decreased or remained constant along the east-west trending cross section during the 44 months and ranged from 0.0 to 0.3 ft. Volume change of the depression area was calculated by using a three-dimensional geographic information system utility that subtracts interpolated surfaces. The results indicated a net volume loss of approximately 38 ±5 cubic feet of material from the depression area during the 44 months.

  20. 3D measurement using circular gratings

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2013-09-01

    3D measurement using methods of structured light is well known in the industry. Most such systems use some variation of straight lines, either as simple lines or with some form of encoding. This geometry assumes the lines will be projected from one side and viewed from another to generate the profile information. But what about applications where a wide triangulation angle may not be practical, particularly at longer standoff distances? This paper explores the use of circular grating patterns projected from a center point to achieve 3D information. Originally suggested by John Caulfield around 1990, the method has some interesting potential, particularly if combined with means of measurement alternative to traditional triangulation, including depth-from-focus methods. The possible advantage of a central reference point in the projected pattern may offer capabilities not as easily attained with a linear grating pattern. This paper will explore the pros and cons of the method and present some examples of possible applications.

  1. Downdating a time-varying square root information filter

    NASA Technical Reports Server (NTRS)

    Muellerschoen, Ronald J.

    1990-01-01

    A new method to efficiently downdate an estimate and covariance generated by a discrete time Square Root Information Filter (SRIF) is presented. The method combines the QR factor downdating algorithm of Gill and the decentralized SRIF algorithm of Bierman. Efficient removal of either measurements or a priori information is possible without loss of numerical integrity. Moreover, the method includes features for detecting potential numerical degradation. Performance on a 300 parameter system with 5800 data points shows that the method can be used in real time and hence is a promising tool for interactive data analysis. Additionally, updating a time-varying SRIF filter with either additional measurements or a priori information proceeds analogously.
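The paper's downdating algorithm itself is not reproduced here, but the standard SRIF measurement update it builds on is easy to sketch: the prior square-root information array and the new (whitened) measurements are stacked and re-triangularized by a QR factorization. The dimensions, prior, and noise level below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3                                    # state dimension (illustrative)
x_true = np.array([1.0, -2.0, 0.5])

# Prior square-root information pair (R, z): here a very weak prior at zero
R = 0.01 * np.eye(n)
z = np.zeros(n)

# Whitened linear measurements y = H x + noise
H = rng.standard_normal((6, n))
y = H @ x_true + 1e-3 * rng.standard_normal(6)

# SRIF measurement update: stack prior and measurements, re-triangularize
A = np.vstack([np.column_stack([R, z]),
               np.column_stack([H, y])])
T = np.linalg.qr(A, mode="r")            # upper-triangular factor only
R_new, z_new = T[:n, :n], T[:n, n]

x_est = np.linalg.solve(R_new, z_new)    # updated state estimate
```

Downdating runs this process in reverse, removing a measurement's contribution from (R, z) without refactoring from scratch; Gill's QR downdating supplies that step in the method described above.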

  2. Theoretical repeatability assessment without repetitive measurements in gradient high-performance liquid chromatography.

    PubMed

    Kotani, Akira; Tsutsumi, Risa; Shoji, Asaki; Hayashi, Yuzuru; Kusu, Fumiyo; Yamamoto, Kazuhiro; Hakamata, Hideki

    2016-07-08

    This paper puts forward a time- and material-saving method for evaluating the repeatability of area measurements in gradient HPLC with UV detection (HPLC-UV), based on the function of mutual information (FUMI) theory, which can theoretically provide the measurement standard deviation (SD) and detection limits from the stochastic properties of baseline noise with no recourse to repetitive measurements of real samples. The chromatographic determination of terbinafine hydrochloride and enalapril maleate is taken as an example. The best choice of the number of noise data points, inevitable for the theoretical evaluation, is shown to be 512 data points (10.24 s at the 50 points/s sampling rate of an A/D converter). Coupled with the relative SD (RSD) of sample-injection variability in the instrument used, the theoretical evaluation is proved to give area-measurement RSDs identical to those estimated by the usual repetitive method (n=6) over a wide concentration range of the analytes, within the 95% confidence intervals of the latter RSDs. The FUMI theory is not a statistical one, but the "statistical" reliability of its SD estimates (n=1) is observed to be as high as that attained by thirty-one measurements of the same samples (n=31). Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Detection of image structures using the Fisher information and the Rao metric.

    PubMed

    Maybank, Stephen J

    2004-12-01

    In many detection problems, the structures to be detected are parameterized by the points of a parameter space. If the conditional probability density function for the measurements is known, then detection can be achieved by sampling the parameter space at a finite number of points and checking each point to see if the corresponding structure is supported by the data. The number of samples and the distances between neighboring samples are calculated using the Rao metric on the parameter space. The Rao metric is obtained from the Fisher information which is, in turn, obtained from the conditional probability density function. An upper bound is obtained for the probability of a false detection. The calculations are simplified in the low noise case by making an asymptotic approximation to the Fisher information. An application to line detection is described. Expressions are obtained for the asymptotic approximation to the Fisher information, the volume of the parameter space, and the number of samples. The time complexity for line detection is estimated. An experimental comparison is made with a Hough transform-based method for detecting lines.
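For a Gaussian measurement model the Fisher information reduces to JᵀJ/σ², where J is the Jacobian of the model mean with respect to the parameters, and the Rao distance between nearby parameter points is locally the quadratic form this matrix induces. A minimal numeric sketch for the line case, with illustrative sample positions and noise level (not values from the paper):

```python
import numpy as np

# Line model y = a*x + b observed at fixed x positions with Gaussian noise
# of standard deviation sigma (values illustrative)
x = np.linspace(0.0, 1.0, 11)
sigma = 0.1

# Jacobian of the mean vector with respect to theta = (a, b)
J = np.stack([x, np.ones_like(x)], axis=1)

# Fisher information matrix for a Gaussian model: I(theta) = J^T J / sigma^2
I_fisher = J.T @ J / sigma**2

# Local Rao distance between theta and a nearby point theta + d
d = np.array([0.01, -0.02])
rao_dist = float(np.sqrt(d @ I_fisher @ d))
```

Spacing the samples of the parameter space at roughly unit Rao distance is what gives the sample counts and false-detection bounds discussed in the abstract.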

  4. Multi-Dimensional Pattern Discovery of Trajectories Using Contextual Information

    NASA Astrophysics Data System (ADS)

    Sharif, M.; Alesheikh, A. A.

    2017-10-01

    The movement of point objects is highly sensitive to the underlying situations and conditions during the movement, which are known as contexts. Analyzing movement patterns while accounting for contextual information helps to better understand how point objects behave in various contexts and how contexts affect their trajectories. One potential solution for discovering moving-object patterns is analyzing the similarities of their trajectories. This article therefore contextualizes the similarity measure of trajectories using not only their spatial footprints but also a notion of internal and external contexts. The dynamic time warping (DTW) method is employed to assess the multi-dimensional similarities of trajectories. The results of the similarity searches are then utilized in discovering the relative movement patterns of the moving point objects. Several experiments are conducted on real datasets obtained from commercial airplanes, together with the weather information recorded during the flights. The results show the robustness of the DTW method in quantifying the commonalities of trajectories and discovering movement patterns with 80 % accuracy. Moreover, the results reveal the importance of exploiting contextual information, because it can both enhance and restrict movements.
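The abstract's core tool, dynamic time warping, can be sketched in a few lines. The classic O(nm) recurrence below uses Euclidean point distances on toy 2-D trajectories; the paper's multi-dimensional version adds contextual dimensions on top of the spatial ones.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two trajectories
    (rows = time steps, columns = dimensions)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy 2-D trajectories: same path, sampled differently
t1 = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
t2 = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 1.0], [2.0, 2.0]])

d_same = dtw_distance(t1, t2)                            # warping absorbs resampling
d_offset = dtw_distance(t1, t1 + np.array([1.0, 0.0]))   # shifted copy
```

Because the warping path may repeat indices, two samplings of the same path score zero, while a genuinely displaced trajectory accumulates the per-point offsets.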

  5. Sonic-boom ground-pressure measurements from Apollo 15

    NASA Technical Reports Server (NTRS)

    Hilton, D. A.; Henderson, H. R.; Mckinney, R.

    1972-01-01

    Sonic boom pressure signatures recorded during the launch and reentry phases of the Apollo 15 mission are presented. The measurements were obtained along the vehicle ground track at 87 km and 970 km downrange from the launch site during ascent; and at 500 km, 55.6 km, and 12.9 km from the splashdown point during reentry. Tracings of the measured signatures are included along with values of the overpressure, impulse, time duration, and rise times. Also included are brief descriptions of the launch and recovery test areas in which the measurements were obtained, the sonic boom instrumentation deployment, flight profiles and operating conditions for the launch vehicle and spacecraft, surface weather information at the measuring sites, and high altitude weather information for the general measurement areas.

  6. No scanning depth imaging system based on TOF

    NASA Astrophysics Data System (ADS)

    Sun, Rongchun; Piao, Yan; Wang, Yu; Liu, Shuo

    2016-03-01

    To quickly obtain a 3D model of real-world objects, multi-point ranging is very important. However, traditional measuring methods usually adopt the principle of point-by-point or line-by-line measurement, which is slow and inefficient. In this paper, a non-scanning depth-imaging system based on TOF (time of flight) is proposed. The system is composed of a light-source circuit, a special infrared image-sensor module, an image-data processor and controller, a data-cache circuit, a communication circuit, and so on. According to the working principle of TOF measurement, an image sequence is collected by the high-speed CMOS sensor, the distance information is obtained by measuring the phase difference, and the amplitude image is also calculated. Experiments were conducted, and the results show that the system achieves non-scanning depth imaging with good performance.
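In phase-based TOF ranging of the kind the abstract describes, distance follows from the phase difference between emitted and received modulation: d = c·Δφ/(4πf) for modulation frequency f, with an unambiguous range of c/(2f). The 20 MHz frequency and π/2 phase below are illustrative values, not the paper's parameters.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_rad, f_mod_hz):
    """Distance from the phase shift of an amplitude-modulated TOF signal."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz):
    """Maximum distance before the measured phase wraps around."""
    return C / (2.0 * f_mod_hz)

d = tof_distance(math.pi / 2, 20e6)   # ~1.87 m
r = unambiguous_range(20e6)           # ~7.49 m
```

Evaluating this per pixel over the whole sensor array is what turns the phase images of the CMOS sequence into a depth image without scanning.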

  7. Ground-state fidelity and bipartite entanglement in the Bose-Hubbard model.

    PubMed

    Buonsante, P; Vezzani, A

    2007-03-16

    We analyze the quantum phase transition in the Bose-Hubbard model borrowing two tools from quantum-information theory, i.e., the ground-state fidelity and entanglement measures. We consider systems at unit filling comprising up to 50 sites and show for the first time that a finite-size scaling analysis of these quantities provides excellent estimates for the quantum critical point. We conclude that fidelity is particularly suited for revealing a quantum phase transition and pinning down the critical point thereof, while the success of entanglement measures depends on the mechanisms governing the transition.

  8. Section-Based Tree Species Identification Using Airborne LIDAR Point Cloud

    NASA Astrophysics Data System (ADS)

    Yao, C.; Zhang, X.; Liu, H.

    2017-09-01

    The application of LiDAR data in forestry initially focused on mapping forest communities, primarily for large-scale forest management and planning. With smaller-footprint, higher-sampling-density LiDAR data available, detecting individual overstory trees, estimating crown parameters, and identifying tree species have been demonstrated to be practicable. This paper proposes a section-based protocol for tree species identification, taking the palm tree as an example. The section-based method detects objects through profiles along different directions, basically along the X-axis or Y-axis, and improves the utilization of spatial information to generate accurate results. First, tree points are separated from man-made-object points by decision-tree-based rules, and a Crown Height Model (CHM) is created by subtracting the Digital Terrain Model (DTM) from the Digital Surface Model (DSM). Then key points are calculated and extracted to locate individual trees, and specific tree parameters related to species information are estimated, such as crown height, crown radius, and cross point. Finally, these parameters are used to identify the tree species. Compared with species information measured on the ground, the proportion of correctly identified trees across all plots reaches up to 90.65 %. The identification results demonstrate the ability to distinguish palm trees using LiDAR point clouds. Furthermore, with more prior knowledge, the section-based method enables trees to be classified into different classes.
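The CHM step in the pipeline above is a per-pixel subtraction of the terrain model from the surface model. A toy numpy version, with made-up 2×2 rasters and a hypothetical 2 m crown threshold:

```python
import numpy as np

# Toy rasters in metres (values illustrative)
dsm = np.array([[10.0, 12.0],
                [11.0, 15.0]])   # digital surface model: top of canopy
dtm = np.array([[ 9.0,  9.0],
                [10.0, 10.0]])   # digital terrain model: bare ground

# Crown Height Model: height of objects above the ground
chm = dsm - dtm

# Simple rule: pixels more than 2 m above ground are candidate tree crowns
tree_mask = chm > 2.0
```

Locating local maxima of the CHM within the masked regions is a common way to seed the individual-tree key points the abstract mentions.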

  9. Imaging of Al/Fe ratios in synthetic Al-goethite revealed by nanoscale secondary ion mass spectrometry.

    PubMed

    Pohl, Lydia; Kölbl, Angelika; Werner, Florian; Mueller, Carsten W; Höschen, Carmen; Häusler, Werner; Kögel-Knabner, Ingrid

    2018-04-30

    Aluminium (Al)-substituted goethite is ubiquitous in soils and sediments. The extent of Al-substitution affects the physicochemical properties of the mineral and influences its macroscale properties. Bulk analysis only provides total Al/Fe ratios without providing information with respect to the Al-substitution of single minerals. Here, we demonstrate that nanoscale secondary ion mass spectrometry (NanoSIMS) enables the precise determination of Al-content in single minerals, while simultaneously visualising the variation of the Al/Fe ratio. Al-substituted goethite samples were synthesized with increasing Al concentrations of 0.1, 3, and 7 % and analysed by NanoSIMS in combination with established bulk spectroscopic methods (XRD, FTIR, Mössbauer spectroscopy). The high spatial resolution (50-150 nm) of NanoSIMS is accompanied by a high number of single-point measurements. We statistically evaluated the Al/Fe ratios derived from NanoSIMS, while maintaining the spatial information and reassigning it to its original localization. XRD analyses confirmed increasing concentration of incorporated Al within the goethite structure. Mössbauer spectroscopy revealed 11 % of the goethite samples generated at high Al concentrations consisted of hematite. The NanoSIMS data show that the Al/Fe ratios are in agreement with bulk data derived from total digestion and demonstrated small spatial variability between single-point measurements. More advantageously, statistical analysis and reassignment of single-point measurements allowed us to identify distinct spots with significantly higher or lower Al/Fe ratios. NanoSIMS measurements confirmed the capacity to produce images, which indicated the uniform increase in Al-concentrations in goethite. Using a combination of statistical analysis with information from complementary spectroscopic techniques (XRD, FTIR and Mössbauer spectroscopy) we were further able to reveal spots with lower Al/Fe ratios as hematite. 
Copyright © 2018 John Wiley & Sons, Ltd.

  10. Analysis of ground-measured and passive-microwave-derived snow depth variations in midwinter across the Northern Great Plains

    USGS Publications Warehouse

    Chang, A.T.C.; Kelly, R.E.J.; Josberger, E.G.; Armstrong, R.L.; Foster, J.L.; Mognard, N.M.

    2005-01-01

    Accurate estimation of snow mass is important for the characterization of the hydrological cycle at different space and time scales. For effective water resources management, accurate estimation of snow storage is needed. Conventionally, snow depth is measured at a point, and in order to monitor snow depth in a temporally and spatially comprehensive manner, optimum interpolation of the points is undertaken. Yet the spatial representation of point measurements at a basin or on a larger distance scale is uncertain. Spaceborne scanning sensors, which cover a wide swath and can provide rapid repeat global coverage, are ideally suited to augment the global snow information. Satellite-borne passive microwave sensors have been used to derive snow depth (SD) with some success. The uncertainties in point SD and areal SD of natural snowpacks need to be understood if comparisons are to be made between a point SD measurement and satellite SD. In this paper three issues are addressed relating satellite derivation of SD and ground measurements of SD in the northern Great Plains of the United States from 1988 to 1997. First, it is shown that in comparing samples of ground-measured point SD data with satellite-derived 25 × 25 km² pixels of SD from the Defense Meteorological Satellite Program Special Sensor Microwave Imager, there are significant differences in yearly SD values even though the accumulated datasets showed similarities. Second, from variogram analysis, the spatial variability of SD from each dataset was comparable. Third, for a sampling grid cell domain of 1° × 1° in the study terrain, 10 distributed snow depth measurements per cell are required to produce a sampling error of 5 cm or better. This study has important implications for validating SD derivations from satellite microwave observations. © 2005 American Meteorological Society.
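The paper's roughly-10-points-per-cell figure comes from variogram analysis, but the order of magnitude can be recovered from the usual standard-error-of-the-mean formula for independent samples, n ≥ (σ/SE_target)². The 15 cm within-cell standard deviation below is an assumed value for illustration, not a number from the study.

```python
import math

def points_needed(sigma_cm, target_se_cm):
    """Independent point measurements needed so the standard error of the
    cell-mean snow depth meets the target: n >= (sigma / SE)^2."""
    return math.ceil((sigma_cm / target_se_cm) ** 2)

# Illustrative: ~15 cm within-cell standard deviation, 5 cm target error
n = points_needed(15.0, 5.0)
```

Spatial correlation between nearby measurements reduces the effective sample size, which is why the variogram-based estimate, rather than this independence assumption, is the right tool in practice.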

  11. The contribution of situational probability information to anticipatory skill.

    PubMed

    Farrow, Damian; Reid, Machar

    2012-07-01

    To determine the contribution of situational probability information to the anticipatory responses of skilled tennis players representative of two different stages of development. Participants were required to predict the location of tennis serves presented to them on a plasma touchscreen from the perspective of the receiver. Serves were sequenced into a series of games and sets with a score presented before each point, typical of a game of tennis. The game score was manipulated to provide advance probability information. The location of the serve for the first point of each game was always directed to the same location. A total of 12 service games consisting of 96 points were presented, with interest in whether players would detect the relationship between the game score and resultant serve location. A 2×12 (age×service game) ANOVA with repeated measures on the second factor revealed a significant age by service game interaction for response time (F₁₁,₂₉₇=3.86, p<0.05, η(p)²=.12). The older players picked up the occurrence of the first-point service pattern after the ninth service game whereas the younger players did not. There were no significant response accuracy differences between the groups in relation to the first point. The findings highlight the important role of situational probability information, in addition to movement kinematics, for successful anticipatory performance and suggest that the pick-up of such information is not utilised by younger players. Copyright © 2012 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  12. Acquisition of 3d Information for Vanished Structure by Using Only AN Ancient Picture

    NASA Astrophysics Data System (ADS)

    Kunii, Y.; Sakamoto, R.

    2016-06-01

    In order to acquire 3D information for the reconstruction of a vanished historical structure, the 3D shape of the structure was estimated from a single ancient picture. Generally, 3D information about a structure is acquired by photogrammetric theory, which requires two or more pictures. This paper shows that the geometrical information of the structure can be obtained from only one ancient picture, from which 3D information was then acquired. The method was applied to an ancient picture of the Old Imperial Theatre. The Old Imperial Theatre in the picture is rendered in two-point perspective. Therefore, estimates of the focal length of the camera, the distance from the camera to the Old Imperial Theatre, and some other parameters were calculated by estimating the field angle, using body height as a length reference together with some geometrical information. Consequently, the 3D coordinates of 120 measurement points on the surface of the Old Imperial Theatre were calculated, and 3D CG modeling of the Old Imperial Theatre was realized.
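    As a hedged illustration of the two-point-perspective geometry: if the two horizontal vanishing points of the building can be located on the scanned picture, and the principal point is assumed to lie at the image centre, the orthogonality of the two vanishing directions yields the focal length. This is a standard single-view-metrology relation, not necessarily the authors' exact procedure, and all pixel coordinates below are hypothetical:

```python
import math

def focal_from_vanishing_points(v1, v2, c):
    """Focal length (pixels) from two vanishing points of orthogonal
    horizontal directions. With principal point c, orthogonality gives
    (v1 - c) . (v2 - c) + f^2 = 0, hence f = sqrt(-(v1 - c).(v2 - c))."""
    dot = (v1[0] - c[0]) * (v2[0] - c[0]) + (v1[1] - c[1]) * (v2[1] - c[1])
    if dot >= 0:
        raise ValueError("vanishing points inconsistent with orthogonality")
    return math.sqrt(-dot)

# hypothetical vanishing points measured on a 1024 x 1024 scan of the picture
f = focal_from_vanishing_points((1200.0, 520.0), (-400.0, 540.0), (512.0, 512.0))
```

    Given f, known reference lengths in the scene (such as body height) can then be used to scale camera-to-object distance and point coordinates.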

  13. Multiobjective sampling design for parameter estimation and model discrimination in groundwater solute transport

    USGS Publications Warehouse

    Knopman, Debra S.; Voss, Clifford I.

    1989-01-01

    Sampling design for site characterization studies of solute transport in porous media is formulated as a multiobjective problem. Optimal design of a sampling network is a sequential process in which the next phase of sampling is designed on the basis of all available physical knowledge of the system. Three objectives are considered: model discrimination, parameter estimation, and cost minimization. For the first two objectives, physically based measures of the value of information obtained from a set of observations are specified. In model discrimination, value of information of an observation point is measured in terms of the difference in solute concentration predicted by hypothesized models of transport. Points of greatest difference in predictions can contribute the most information to the discriminatory power of a sampling design. Sensitivity of solute concentration to a change in a parameter contributes information on the relative variance of a parameter estimate. Inclusion of points in a sampling design with high sensitivities to parameters tends to reduce variance in parameter estimates. Cost minimization accounts for both the capital cost of well installation and the operating costs of collection and analysis of field samples. Sensitivities, discrimination information, and well installation and sampling costs are used to form coefficients in the multiobjective problem in which the decision variables are binary (zero/one), each corresponding to the selection of an observation point in time and space. The solution to the multiobjective problem is a noninferior set of designs. To gain insight into effective design strategies, a one-dimensional solute transport problem is hypothesized. Then, an approximation of the noninferior set is found by enumerating 120 designs and evaluating objective functions for each of the designs. Trade-offs between pairs of objectives are demonstrated among the models. 
The value of an objective function for a given design is shown to correspond to the ability of a design to actually meet an objective.
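    The enumeration step described above can be sketched as follows: score each candidate design on the objectives (all to be minimized) and keep the noninferior set, i.e. the designs not dominated in every objective by any other design. The 120 random two-objective scores below are illustrative only:

```python
import numpy as np

def noninferior_set(objectives):
    """Return indices of noninferior (Pareto-optimal) designs.

    objectives: (n_designs, n_objectives) array, all objectives minimized.
    A design is noninferior if no other design is at least as good in every
    objective and strictly better in at least one."""
    n = objectives.shape[0]
    keep = []
    for i in range(n):
        dominated = False
        for j in range(n):
            if j != i and np.all(objectives[j] <= objectives[i]) \
                    and np.any(objectives[j] < objectives[i]):
                dominated = True
                break
        if not dominated:
            keep.append(i)
    return keep

rng = np.random.default_rng(1)
# hypothetical: 120 candidate designs scored on (parameter variance, cost)
objs = rng.uniform(0, 1, size=(120, 2))
pareto = noninferior_set(objs)
```

    The resulting set exposes the trade-off curve between pairs of objectives, which is what the enumeration of the 120 designs demonstrates.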

  14. MEASURING ECONOMIC GROWTH FROM OUTER SPACE.

    PubMed

    Henderson, J Vernon; Storeygard, Adam; Weil, David N

    2012-04-01

    GDP growth is often measured poorly for countries and rarely measured at all for cities or subnational regions. We propose a readily available proxy: satellite data on lights at night. We develop a statistical framework that uses lights growth to augment existing income growth measures, under the assumption that measurement error in using observed light as an indicator of income is uncorrelated with measurement error in national income accounts. For countries with good national income accounts data, information on growth of lights is of marginal value in estimating the true growth rate of income, while for countries with the worst national income accounts, the optimal estimate of true income growth is a composite with roughly equal weights. Among poor-data countries, our new estimate of average annual growth differs by as much as 3 percentage points from official data. Lights data also allow for measurement of income growth in sub- and supranational regions. As an application, we examine growth in Sub-Saharan African regions over the last 17 years. We find that real incomes in non-coastal areas have grown faster than in coastal areas by 1/3 of an annual percentage point; non-malarial areas have grown faster than malarial ones by 1/3 to 2/3 of an annual percentage point; and primate city regions have grown no faster than hinterland areas. Such applications point toward a research program in which "empirical growth" need no longer be synonymous with "national income accounts."
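    The composite estimator can be illustrated with standard inverse-variance weighting of two independent noisy growth measurements, which reproduces the qualitative behaviour described above (lights growth gets little weight when national accounts are good, roughly equal weight when the error variances are comparable); the variances below are made up:

```python
def composite_growth(g_accounts, var_accounts, g_lights, var_lights):
    """Minimum-variance combination of two independent, unbiased noisy
    estimates of the same true growth rate (inverse-variance weighting)."""
    w = (1.0 / var_accounts) / (1.0 / var_accounts + 1.0 / var_lights)
    return w * g_accounts + (1.0 - w) * g_lights

# good national accounts: lights growth gets very little weight
good = composite_growth(0.04, 0.0001, 0.06, 0.01)   # stays close to 0.04
# poor national accounts with comparable error variance: roughly equal weights
poor = composite_growth(0.04, 0.01, 0.06, 0.01)     # close to the midpoint 0.05
```

    The weight w depends only on the relative error variances, which is why the quality of a country's national accounts determines how much the lights proxy contributes.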

  15. Oil point and mechanical behaviour of oil palm kernels in linear compression

    NASA Astrophysics Data System (ADS)

    Kabutey, Abraham; Herak, David; Choteborsky, Rostislav; Mizera, Čestmír; Sigalingging, Riswanti; Akangbe, Olaosebikan Layi

    2017-07-01

    The study described the oil point and mechanical properties of roasted and unroasted bulk oil palm kernels under compression loading. Very little information on this topic is available in the literature. A universal compression testing machine and a pressing vessel of 60 mm diameter with a plunger were used, applying a maximum force of 100 kN and speeds ranging from 5 to 25 mm min-1. The initial pressing height of the bulk kernels was measured at 40 mm. The oil point was determined by a litmus test for each deformation level of 5, 10, 15, 20, and 25 mm at the minimum speed of 5 mm min-1. The measured parameters were the deformation, deformation energy, oil yield, oil point strain and oil point pressure. Clearly, the roasted bulk kernels required less deformation energy than the unroasted kernels for recovering the kernel oil. However, neither type of kernel was permanently deformed. The average oil point strain was determined at 0.57. The study is an essential contribution to pursuing innovative methods for processing palm kernel oil in rural areas of developing countries.

  16. Multivariate random regression analysis for body weight and main morphological traits in genetically improved farmed tilapia (Oreochromis niloticus).

    PubMed

    He, Jie; Zhao, Yunfeng; Zhao, Jingli; Gao, Jin; Han, Dandan; Xu, Pao; Yang, Runqing

    2017-11-02

    Because of their high economic importance, growth traits in fish are under continuous improvement. For growth traits that are recorded at multiple time-points in life, the use of univariate and multivariate animal models is limited because of the variable and irregular timing of these measures. Thus, the univariate random regression model (RRM) was introduced for the genetic analysis of dynamic growth traits in fish breeding. We used a multivariate random regression model (MRRM) to analyze genetic changes in growth traits recorded at multiple time-points in genetically-improved farmed tilapia. Legendre polynomials of different orders were applied to characterize the influences of fixed and random effects on growth trajectories. The final MRRM was determined by optimizing the univariate RRM for the analyzed traits separately via adaptively penalizing the likelihood statistical criterion, which is superior to both the Akaike information criterion and the Bayesian information criterion. In the selected MRRM, the additive genetic effects were modeled by Legendre polynomials of three orders for body weight (BWE) and body length (BL) and of two orders for body depth (BD). By using the covariance functions of the MRRM, estimated heritabilities were between 0.086 and 0.628 for BWE, 0.155 and 0.556 for BL, and 0.056 and 0.607 for BD. Only heritabilities for BD measured from 60 to 140 days of age were consistently higher than those estimated by the univariate RRM. All genetic correlations between growth time-points exceeded 0.5 for either single or pairwise time-points. Moreover, correlations between early and late growth time-points were lower. Thus, for phenotypes that are measured repeatedly in aquaculture, an MRRM can enhance the efficiency of comprehensive selection for BWE and the main morphological traits.
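    A minimal sketch of the Legendre-polynomial covariates that random regression models use to describe growth trajectories, with ages standardized to the polynomials' natural domain [-1, 1] (the measurement ages below are hypothetical):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariates(ages, order):
    """Covariate matrix for a random regression model: columns are the
    Legendre polynomials P_0..P_order evaluated at ages standardized
    to [-1, 1], the polynomials' natural domain."""
    t = 2.0 * (ages - ages.min()) / (ages.max() - ages.min()) - 1.0
    # legvander returns [P_0(t), ..., P_order(t)] column-wise
    return legendre.legvander(t, order)

ages = np.array([60.0, 80.0, 100.0, 120.0, 140.0])  # days of age, hypothetical
Phi = legendre_covariates(ages, order=3)            # order used for BWE and BL
```

    In the RRM, random regression coefficients on these columns (per animal) induce a covariance function over ages, from which age-specific heritabilities and genetic correlations are derived.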

  17. Galaxy clustering with photometric surveys using PDF redshift information

    DOE PAGES

    Asorey, J.; Carrasco Kind, M.; Sevilla-Noarbe, I.; ...

    2016-03-28

    Here, photometric surveys produce large-area maps of the galaxy distribution, but with less accurate redshift information than is obtained from spectroscopic methods. Modern photometric redshift (photo-z) algorithms use galaxy magnitudes, or colors, that are obtained through multi-band imaging to produce a probability density function (PDF) for each galaxy in the map. We used simulated data to study the effect of using different photo-z estimators to assign galaxies to redshift bins in order to compare their effects on angular clustering and galaxy bias measurements. We found that if we use the entire PDF, rather than a single-point (mean or mode) estimate, the deviations are less biased, especially when using narrow redshift bins. When the redshift bin widths are Δz = 0.1, the use of the entire PDF reduces the typical measurement bias from 5%, when using single-point estimates, to 3%.
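    The difference between the two estimators can be sketched for a single galaxy: a point estimate drops all of its weight into one bin, while the full PDF contributes fractional weight to every bin it overlaps. The Gaussian PDF and bin edges below are illustrative, not the paper's simulation setup:

```python
import numpy as np

z_grid = np.linspace(0.0, 1.0, 201)
dz = z_grid[1] - z_grid[0]
bin_edges = np.array([0.2, 0.3, 0.4])     # two adjacent bins of width 0.1

def bin_weights_from_pdf(pdf, z_grid, bin_edges):
    """Fractional weight of one galaxy in each redshift bin: the integral
    of its normalized photo-z PDF over that bin."""
    pdf = pdf / (pdf.sum() * dz)          # normalize on the grid
    weights = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (z_grid >= lo) & (z_grid < hi)
        weights.append(pdf[in_bin].sum() * dz)
    return np.array(weights)

# a Gaussian PDF whose peak sits just above the shared bin edge at z = 0.3
pdf = np.exp(-0.5 * ((z_grid - 0.31) / 0.05) ** 2)
w = bin_weights_from_pdf(pdf, z_grid, bin_edges)
# a mode estimate would put the whole galaxy in the second bin,
# whereas the PDF splits its weight across both
```

    Summing such fractional weights over all galaxies yields the effective redshift distribution of each bin, which is what enters the angular clustering prediction.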

  18. Human Movement Recognition Based on the Stochastic Characterisation of Acceleration Data

    PubMed Central

    Munoz-Organero, Mario; Lotfi, Ahmad

    2016-01-01

    Human activity recognition algorithms based on information obtained from wearable sensors are successfully applied in detecting many basic activities. Identified activities with time-stationary features are characterised inside a predefined temporal window by using different machine learning algorithms on extracted features from the measured data. Better accuracy, precision and recall levels could be achieved by combining the information from different sensors. However, detecting short and sporadic human movements, gestures and actions is still a challenging task. In this paper, a novel algorithm to detect human basic movements from wearable measured data is proposed and evaluated. The proposed algorithm is designed to minimise computational requirements while achieving acceptable accuracy levels based on characterising some particular points in the temporal series obtained from a single sensor. The underlying idea is that this algorithm would be implemented in the sensor device in order to pre-process the sensed data stream before sending the information to a central point combining the information from different sensors to improve accuracy levels. Intra- and inter-person validation is used for two particular cases: single step detection and fall detection and classification using a single tri-axial accelerometer. Relevant results for the above cases and pertinent conclusions are also presented. PMID:27618063
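    The idea of characterising particular points in the temporal series can be sketched as a minimal peak-threshold step counter on the acceleration magnitude. This is not the authors' algorithm; the threshold, refractory gap, and synthetic walking signal below are assumptions:

```python
import numpy as np

def count_steps(acc_magnitude, fs, threshold=11.5, min_gap_s=0.3):
    """Count steps as local maxima of the acceleration magnitude (m/s^2)
    that exceed a threshold and are separated by a refractory gap."""
    min_gap = int(min_gap_s * fs)
    steps, last = 0, -min_gap
    for i in range(1, len(acc_magnitude) - 1):
        if (acc_magnitude[i] > threshold
                and acc_magnitude[i] >= acc_magnitude[i - 1]
                and acc_magnitude[i] >= acc_magnitude[i + 1]
                and i - last >= min_gap):
            steps += 1
            last = i
    return steps

# synthetic 2 Hz walking signal sampled at 50 Hz for 5 s: 10 impact peaks
fs = 50
t = np.arange(0, 5, 1 / fs)
acc = 9.81 + 3.0 * np.maximum(np.sin(2 * np.pi * 2 * t), 0)
```

    Only comparisons and a counter are needed per sample, which is the kind of low-cost processing that can run on the sensor device itself before transmission.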

  19. Patterns and Emerging Trends in Global Ocean Health

    PubMed Central

    Halpern, Benjamin S.; Longo, Catherine; Lowndes, Julia S. Stewart; Best, Benjamin D.; Frazier, Melanie; Katona, Steven K.; Kleisner, Kristin M.; Rosenberg, Andrew A.; Scarborough, Courtney; Selig, Elizabeth R.

    2015-01-01

    International and regional policies aimed at managing ocean ecosystem health need quantitative and comprehensive indices to synthesize information from a variety of sources, consistently measure progress, and communicate with key constituencies and the public. Here we present the second annual global assessment of the Ocean Health Index, reporting current scores and annual changes since 2012, recalculated using updated methods and data based on the best available science, for 221 coastal countries and territories. The Index measures performance of ten societal goals for healthy oceans on a quantitative scale of increasing health from 0 to 100, and combines these scores into a single Index score, for each country and globally. The global Index score improved one point (from 67 to 68), while many country-level Index and goal scores had larger changes. Per-country Index scores ranged from 41–95 and, on average, improved by 0.06 points (range -8 to +12). Globally, average scores increased for individual goals by as much as 6.5 points (coastal economies) and decreased by as much as 1.2 points (natural products). Annual updates of the Index, even when not all input data have been updated, provide valuable information to scientists, policy makers, and resource managers because patterns and trends can emerge from the data that have been updated. Changes of even a few points indicate potential successes (when scores increase) that merit recognition, or concerns (when scores decrease) that may require mitigative action, with changes of more than 10–20 points representing large shifts that deserve greater attention. Goal scores showed remarkably little covariance across regions, indicating low redundancy in the Index, such that each goal delivers information about a different facet of ocean health. Together these scores provide a snapshot of global ocean health and suggest where countries have made progress and where a need for further improvement exists. PMID:25774678

  20. Internet Self-Exclusion: Characteristics of Self-Excluded Gamblers and Preliminary Evidence for Its Effectiveness

    ERIC Educational Resources Information Center

    Hayer, Tobias; Meyer, Gerhard

    2011-01-01

    Preliminary scientific evidence indicates that online gamblers are more likely to be problem gamblers and thus point to the need for effective protection measures. This study focuses on an online self-exclusion program and seeks to comprehensively examine the benefits of this measure. It was intended to collect detailed information on the…

  1. Taiwan's Travel and Border Health Measures in Response to Zika.

    PubMed

    Ho, Li-Li; Tsai, Yu-Hui; Lee, Wang-Ping; Liao, Szu-Tsai; Wu, Li-Gin; Wu, Yi-Chun

    Zika virus has recently emerged as a worldwide public health concern. Travel and border health measures stand as one of the main strategies and frontline defenses in responding to international epidemics. As of October 31, 2016, Taiwan has reported 13 imported cases, 5 of which were detected through routine entry screening and active monitoring at international airports. This article shares Taiwan's disease surveillance activities at designated points of entry and travel and border health measures in response to Zika. The Taiwan government collaborates with its tourism industry to disseminate information about precautionary measures and encourages tour guides to report suspected individuals or events to activate early response measures. Taiwan also engages in vector control activities at points of entry, including targeting aircraft from countries where vector-borne diseases are endemic, implementing mosquito sweep measures, and collecting vector surveillance data. In future emerging and reemerging disease events, entry surveillance at designated points of entry may enable early detection of diseases of international origin and more rapid activation of public health preparedness activities and international collaboration. Taiwan will continue to maximize border and travel health measures in compliance with IHR (2005) requirements, which rely on continued risk assessment, practical implementation activities, and engagement with all stakeholders.

  2. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

    Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicitate the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002

  3. Application of backpack Lidar to geological cross-section measurement

    NASA Astrophysics Data System (ADS)

    Lin, Jingyu; Wang, Ran; Xiao, Zhouxuan; Li, Lu; Yao, Weihua; Han, Wei; Zhao, Baolin

    2017-11-01

    In traditional geological cross-section measurement, the artificial traverse method has recently been replaced by methods based on point-coordinate data. However, how to acquire high-precision point-coordinate data quickly and economically remains the crux of the matter. Backpack Lidar is therefore presented here as a means of obtaining such point coordinates. Lidar, one of the most active remote sensing techniques internationally, is a powerful tool for obtaining precise topographic information and high-precision 3-D coordinates and for building real 3-D models. With field practice and indoor data processing, geological cross-section maps can be generated simply, accurately, and automatically with the support of relevant software such as ArcGIS and LiDAR360.

  4. Hysteresis of Soil Point Water Retention Functions Determined by Neutron Radiography

    NASA Astrophysics Data System (ADS)

    Perfect, E.; Kang, M.; Bilheux, H.; Willis, K. J.; Horita, J.; Warren, J.; Cheng, C.

    2010-12-01

    Soil point water retention functions are needed for modeling flow and transport in partially-saturated porous media. Such functions are usually determined by inverse modeling of average water retention data measured experimentally on columns of finite length. However, the resulting functions are subject to the appropriateness of the chosen model, as well as the initial and boundary condition assumptions employed. Soil point water retention functions are rarely measured directly and when they are the focus is invariably on the main drying branch. Previous direct measurement methods include time domain reflectometry and gamma beam attenuation. Here we report direct measurements of the main wetting and drying branches of the point water retention function using neutron radiography. The measurements were performed on a coarse sand (Flint #13) packed into 2.6 cm diameter x 4 cm long aluminum cylinders at the NIST BT-2 (50 μm resolution) and ORNL-HFIR CG1D (70 μm resolution) imaging beamlines. The sand columns were saturated with water and then drained and rewetted under quasi-equilibrium conditions using a hanging water column setup. 2048 x 2048 pixel images of the transmitted flux of neutrons through the column were acquired at each imposed suction (~10-15 suction values per experiment). Volumetric water contents were calculated on a pixel by pixel basis using Beer-Lambert’s law in conjunction with beam hardening and geometric corrections. The pixel rows were averaged and combined with information on the known distribution of suctions within the column to give 2048 point drying and wetting functions for each experiment. The point functions exhibited pronounced hysteresis and varied with column height, possibly due to differences in porosity caused by the packing procedure employed. 
Predicted point functions, extracted from the hanging water column volumetric data using the TrueCell inverse modeling procedure, showed very good agreement with the range of point functions measured within the column using neutron radiography. Extension of these experiments to 3-dimensions using neutron tomography is planned.
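    The per-pixel conversion from transmitted neutron flux to volumetric water content via the Beer-Lambert law can be sketched as follows, ignoring the beam-hardening and geometric corrections the authors apply; the attenuation coefficient and path length below are hypothetical:

```python
import numpy as np

def water_content(flux_wet, flux_dry, mu_w, path_len):
    """Volumetric water content theta from the Beer-Lambert law:
    flux_wet = flux_dry * exp(-mu_w * theta * path_len), inverted as
    theta = -ln(flux_wet / flux_dry) / (mu_w * path_len)."""
    return -np.log(flux_wet / flux_dry) / (mu_w * path_len)

mu_w = 3.5          # effective attenuation coefficient of water (cm^-1), assumed
path = 2.6          # beam path through the column (cm), the cylinder diameter
flux_dry = 1000.0   # transmitted flux through the dry column (counts), assumed
# simulate the transmitted flux for a pixel with theta = 0.25, then invert it
flux_wet = flux_dry * np.exp(-mu_w * 0.25 * path)
theta = water_content(flux_wet, flux_dry, mu_w, path)
```

    Applied to every pixel of the radiograph and averaged by row, this yields the water-content profile that is paired with the known suction distribution to build the point retention functions.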

  5. Closing the evidence gap in infectious disease: point-of-care randomization and informed consent.

    PubMed

    Huttner, A; Leibovici, L; Theuretzbacher, U; Huttner, B; Paul, M

    2017-02-01

    The informed consent document is intended to provide basic rights to patients but often fails to do so. Patients' autonomy may be diminished by virtue of their illness; evidence shows that even patients who appear to be ideal candidates for understanding and granting informed consent rarely are, particularly those with acute infections. We argue that for low-risk trials whose purpose is to evaluate nonexperimental therapies or other measures towards which the medical community is in a state of equipoise, ethics committees should play a more active role in a more standardized fashion. Patients in the clinic are continually subject to spontaneous 'pseudo-randomizations' based on local dogma and the anecdotal experience of their physicians. Stronger ethics oversight would allow point-of-care trials to structure these spontaneous randomizations, using widely available informatics tools, in combination with opt-out informed consent where deemed appropriate. Copyright © 2016. Published by Elsevier Ltd.

  6. Attitude control system of the Delfi-n3Xt satellite

    NASA Astrophysics Data System (ADS)

    Reijneveld, J.; Choukroun, D.

    2013-12-01

    This work is concerned with the development of the attitude control algorithms that will be implemented on board the Delfi-n3Xt nanosatellite, which is to be launched in 2013. One of the mission objectives is to demonstrate Sun pointing and three-axis stabilization. The attitude control modes and the associated algorithms are described. The control authority is shared between three body-mounted magnetorquers (MTQ) and three orthogonal reaction wheels. The attitude information is retrieved from Sun vector measurements, Earth magnetic field measurements, and gyro measurements. The design of the control is a trade-off between simplicity and performance. Stabilization and Sun pointing are achieved via the successive application of the classical B-dot control law and a quaternion feedback control. For the purpose of Sun pointing, a simple quaternion estimation scheme is implemented based on geometric arguments, where the need for a costly optimal filtering algorithm is alleviated and a single line-of-sight (LoS) measurement is required - here the Sun vector. Beyond the three-axis Sun pointing mode, spinning Sun pointing modes are also described and used as demonstration modes. The three-axis Sun pointing mode requires reaction wheels and magnetic control, while the spinning control modes are implemented with magnetic control only. In addition, a simple scheme for angular rate estimation using Sun vector and Earth magnetic field measurements is tested for the case of gyro failures. The performance of the various control modes is illustrated via extensive simulations over time spans of several orbits. The simulated models of the dynamical space environment, the attitude hardware, and the onboard controller logic use realistic assumptions. All control modes satisfy the minimal Sun pointing requirements needed for power generation.

  7. Measurement and analysis of electromagnetic pollution generated by GSM-900 mobile phone networks in Erciyes University, Turkey.

    PubMed

    Sorgucu, Ugur; Develi, Ibrahim

    2012-12-01

    Mobile phones are becoming increasingly important in our everyday lives. The rising number of mobile phones is reflected in a similar increase in the number of base stations. Because of this rapid evolution, the establishment and planning of new base stations has become mandatory. However, the rise in the number of base stations is potentially very harmful in terms of human health. It is important to analyze the radiation levels of base stations until it can be confirmed that they are definitely not harmful in the long term. Mapping of the electromagnetic field (EMF) is also important from a medical point of view because it provides useful information, for example, on the detection of diseases caused by EMF. With the help of this information the distribution of diseases over different regions can be obtained. In this article, the electromagnetic radiation levels of base stations were measured at 80 different points in Erciyes University (ERU), Turkey, and detailed information about the measurement tools and measurement method is given. It was observed that no area in ERU exceeded the national and international limits. It was also observed that the effects of base stations vary according to duration and degree of exposure. Therefore, if people are exposed to even a very low-intensity electromagnetic field for a very long time, serious health problems can occur.

  8. IR Window Studies

    DTIC Science & Technology

    1974-09-15

    molten gallium but still have a low resistivity. Stabilized zirconia was used to remove and monitor oxygen. KCl crystals with … information that GaAs grown from Ga solutions at low temperatures can be made with higher purities than that grown at the melting point. The initial goals were to grow thick films below the melting point which would be semi-insulating and to measure their absorption coefficients. This goal was to

  9. Development and validation of a prognostic index for 4-year mortality in older adults.

    PubMed

    Lee, Sei J; Lindquist, Karla; Segal, Mark R; Covinsky, Kenneth E

    2006-02-15

    Both comorbid conditions and functional measures predict mortality in older adults, but few prognostic indexes combine both classes of predictors. Combining easily obtained measures into an accurate predictive model could be useful to clinicians advising patients, as well as to policy makers and epidemiologists interested in risk adjustment. To develop and validate a prognostic index for 4-year mortality using information that can be obtained from patient report. Using the 1998 wave of the Health and Retirement Study (HRS), a population-based study of community-dwelling US adults older than 50 years, we developed the prognostic index from 11,701 individuals and validated the index with 8,009. Individuals were asked about their demographic characteristics, whether they had specific diseases, and whether they had difficulty with a series of functional measures. We identified variables independently associated with mortality and weighted the variables to create a risk index. Death by December 31, 2002. The overall response rate was 81%. During the 4-year follow-up, there were 1361 deaths (12%) in the development cohort and 1072 deaths (13%) in the validation cohort. Twelve independent predictors of mortality were identified: 2 demographic variables (age: 60-64 years, 1 point; 65-69 years, 2 points; 70-74 years, 3 points; 75-79 years, 4 points; 80-84 years, 5 points; >85 years, 7 points; and male sex, 2 points), 6 comorbid conditions (diabetes, 1 point; cancer, 2 points; lung disease, 2 points; heart failure, 2 points; current tobacco use, 2 points; and body mass index <25, 1 point), and difficulty with 4 functional variables (bathing, 2 points; walking several blocks, 2 points; managing money, 2 points; and pushing large objects, 1 point).
Scores on the risk index were strongly associated with 4-year mortality in the validation cohort, with 0 to 5 points predicting a less than 4% risk, 6 to 9 points predicting a 15% risk, 10 to 13 points predicting a 42% risk, and 14 or more points predicting a 64% risk. The risk index showed excellent discrimination, with a c-statistic of 0.84 in the development cohort and 0.82 in the validation cohort. This prognostic index, incorporating age, sex, self-reported comorbid conditions, and functional measures, accurately stratifies community-dwelling older adults into groups at varying risk of mortality.
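    The point assignments and risk bands reported in the abstract translate directly into a scoring sketch. The handling of age exactly 85 is an assumption here, since the abstract lists ">85 years" without stating which band 85 itself falls in:

```python
def index_points(age, male, diabetes, cancer, lung_disease, heart_failure,
                 smoker, bmi_lt_25, diff_bathing, diff_walking,
                 diff_money, diff_pushing):
    """Sum the point values listed in the abstract (4-year mortality index)."""
    if age < 60:
        pts = 0
    elif age < 65:
        pts = 1
    elif age < 70:
        pts = 2
    elif age < 75:
        pts = 3
    elif age < 80:
        pts = 4
    elif age < 85:   # assumption: 85 itself goes to the top band
        pts = 5
    else:
        pts = 7
    pts += 2 * male
    pts += (1 * diabetes + 2 * cancer + 2 * lung_disease
            + 2 * heart_failure + 2 * smoker + 1 * bmi_lt_25)
    pts += (2 * diff_bathing + 2 * diff_walking
            + 2 * diff_money + 1 * diff_pushing)
    return pts

def risk_band(points):
    """4-year mortality bands from the validation cohort, per the abstract."""
    if points <= 5:
        return "<4%"
    if points <= 9:
        return "15%"
    if points <= 13:
        return "42%"
    return "64%"

# a hypothetical 72-year-old man with diabetes who smokes: 3 + 2 + 1 + 2 = 8 points
band = risk_band(index_points(72, True, True, False, False, False,
                              True, False, False, False, False, False))  # "15%"
```

    All inputs come from patient report, which is what makes the index usable at the bedside without laboratory data.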

  10. Relationship Between Optimal Gain and Coherence Zone in Flight Simulation

    NASA Technical Reports Server (NTRS)

    Gracio, Bruno Jorge Correia; Pais, Ana Rita Valente; vanPaassen, M. M.; Mulder, Max; Kely, Lon C.; Houck, Jacob A.

    2011-01-01

    In motion simulation the inertial information generated by the motion platform is most of the time different from the visual information in the simulator displays. This occurs due to the physical limits of the motion platform. However, for small motions that are within the physical limits of the motion platform, one-to-one motion, i.e. visual information equal to inertial information, is possible. It has been shown in previous studies that one-to-one motion is often judged as too strong, causing researchers to lower the inertial amplitude. When trying to measure the optimal inertial gain for a visual amplitude, we found a zone of optimal gains instead of a single value. Such a result seems related to the coherence zones that have been measured in flight simulation studies. However, the optimal gain results were never directly related to the coherence zones. In this study we investigated whether the optimal gain measurements are the same as the coherence zone measurements. We also tried to infer whether the results obtained from the two measurements can be used to differentiate between simulators with different configurations. An experiment was conducted at the NASA Langley Research Center which used both the Cockpit Motion Facility and the Visual Motion Simulator. The results show that the inertial gains obtained with the optimal gain measurements are different from the ones obtained with the coherence zone measurements. The optimal gain is within the coherence zone. The point of mean optimal gain was lower and further away from the one-to-one line than the point of mean coherence. The zone width obtained for the coherence zone measurements was dependent on the visual amplitude and frequency. For the optimal gain, the zone width remained constant when the visual amplitude and frequency were varied. We found no effect of the simulator configuration in either the coherence zone or the optimal gain measurements.

  11. Comparative analysis of hydroacoustic lakebed classification in three different Brazilian reservoirs

    NASA Astrophysics Data System (ADS)

    Hilgert, Stephan; Sotiri, Klajdi; Fuchs, Stephan

    2017-04-01

    To date, the surface of artificial water bodies around the world has reached an area of around 500,000 km2, equaling one third of the surface of natural water bodies. Most of the constructed water bodies are reservoirs with a variety of usage purposes, ranging from drinking water supply, electricity production and flood protection to recreation. All reservoirs have in common that they disrupt riverine systems and their biochemical cycles and promote the accumulation of sediments upstream of the dam. The accumulated sediments contain organic matter, nutrients and/or pollutants which have a direct influence on the water quality within the impoundment. Consequently, detailed knowledge about the amount and quality of accumulated sediments is essential information for reservoir management. In many cases the extensive areas covered by the impoundments make it difficult and expensive to assess sediment characteristics with a high spatial resolution. Spatial extrapolations and mass balances based on point information may suffer from strong deviations. We combined sediment point measurements (core and grab sampling) with hydroacoustic sediment classification in order to precisely map sediment parameters. Three different reservoirs (Vossoroca, Capivari, Passauna) in the south-east of Brazil were investigated between 2011 and 2015. A single-beam echosounder (EA 400, Kongsberg) with two frequencies (200 & 38 kHz) was used for the hydroacoustic classification. Over 50 core samples and 30 grab samples were taken for physical and chemical analysis to serve as ground truthing of the hydroacoustic measurements. All three reservoirs were covered with dense measurement transects, allowing for a lakebed classification of the entire sediment surface. Significant correlations with a selection of hydroacoustic parameters were obtained for physical parameters such as grain size distribution and density, as well as for chemical parameters such as organic carbon content and total phosphorus.
They enabled the derivation of empiric models used for the extrapolation of the sediment point information to the entire reservoir surface. With the obtained spatial information carbon and phosphorous budgets were calculated. Former stock calculations, which were based solely on point sampling, could be improved The results show that the method is transferable to different reservoirs with varying characteristics in regard of their catchments, morphology and trophic state.
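    The extrapolation step described above can be illustrated with a minimal sketch: fit an empirical linear model relating one hydroacoustic parameter to a measured sediment property at the sampled points, apply it over the classified lakebed grid, and integrate to a stock. All numbers (hardness values, TOC, areal sediment mass) are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical calibration data: a hydroacoustic hardness parameter (dimensionless)
# at the core-sampling locations, and the measured total organic carbon (TOC, %).
hardness = np.array([0.12, 0.18, 0.25, 0.31, 0.40, 0.47, 0.55])
toc_pct = np.array([6.1, 5.4, 4.6, 4.0, 3.1, 2.6, 1.9])

# Fit a simple empirical linear model: TOC = a * hardness + b.
a, b = np.polyfit(hardness, toc_pct, deg=1)

# Extrapolate to the full classified lakebed: one hardness value per grid cell.
grid_hardness = np.array([[0.15, 0.20], [0.35, 0.50]])
grid_toc = a * grid_hardness + b

# Integrate to a carbon stock: cell area (m2) * areal dry-sediment mass * TOC fraction.
cell_area_m2 = 25.0 * 25.0
sediment_kg_per_m2 = 80.0  # assumed areal dry-sediment mass
carbon_stock_kg = np.sum(grid_toc / 100.0 * sediment_kg_per_m2 * cell_area_m2)
print(round(float(carbon_stock_kg), 1))
```

In the study, separate models of this kind would be fitted per reservoir and per parameter (e.g. TOC, total phosphorus) before summing the budgets.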

  12. Quantized Synchronization of Chaotic Neural Networks With Scheduled Output Feedback Control.

    PubMed

    Wan, Ying; Cao, Jinde; Wen, Guanghui

    In this paper, the synchronization problem of master-slave chaotic neural networks with remote sensors, quantization, and communication time delays is investigated. The communication channel between the master and slave chaotic neural networks consists of several remote sensors, each able to access only partial knowledge of the master neural network's output. At each sampling instant, each sensor updates its own measurement, but only one sensor is scheduled to transmit its latest information to the controller's side in order to update the control inputs for the slave neural network. Such a communication process and control strategy is thus much more energy-saving compared with the traditional point-to-point scheme. Sufficient conditions for the output feedback control gain matrix, the allowable length of sampling intervals, and the upper bound of network-induced delays are derived to ensure the quantized synchronization of the master-slave chaotic neural networks. Lastly, Chua's circuit system and a 4-D Hopfield neural network are simulated to validate the effectiveness of the main results.
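    The scheduled, quantized transmission scheme described above can be sketched in a few lines. This toy uses a uniform quantizer and round-robin scheduling, which are illustrative assumptions rather than the paper's exact quantizer or scheduling protocol.

```python
def quantize(x, step=0.05):
    """Uniform mid-tread quantizer: maps x to the nearest multiple of `step`."""
    return step * round(x / step)

def scheduled_transmission(sensor_outputs, k):
    """At sampling instant k, only one sensor (round-robin here) transmits its
    quantized measurement; the others stay silent, saving communication energy."""
    i = k % len(sensor_outputs)
    return i, quantize(sensor_outputs[i])

outputs = [0.231, -0.118, 0.502]   # partial output measurements seen by three sensors
for k in range(3):
    idx, q = scheduled_transmission(outputs, k)
    print(k, idx, q)
```

The controller would then update the slave network's input from the single quantized value received at each instant, which is what makes the derived bounds on sampling intervals and delays necessary.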

  13. Individual Test Point Fluctuations of Macular Sensitivity in Healthy Eyes and Eyes With Age-Related Macular Degeneration Measured With Microperimetry.

    PubMed

    Barboni, Mirella Telles Salgueiro; Szepessy, Zsuzsanna; Ventura, Dora Fix; Németh, János

    2018-04-01

    To establish fluctuation limits, it was considered that not only overall macular sensitivity but also fluctuations of individual test points in the macula might have clinical value. Three repeated microperimetry measurements were performed using the Standard Expert test of the Macular Integrity Assessment (MAIA) device in healthy subjects (N = 12, age = 23.8 ± 1.5 years) and in patients with age-related macular degeneration (AMD) (N = 11, age = 68.5 ± 7.4 years). A total of 37 macular points arranged in four concentric rings and four quadrants were analyzed individually and in groups. The data show low fluctuation of macular sensitivity of individual test points in healthy subjects (average = 1.38 ± 0.28 dB) and AMD patients (average = 2.12 ± 0.60 dB). Lower-sensitivity points are more strongly related to higher fluctuation than to distance from the central point. Fixation stability showed no effect on sensitivity fluctuation. The 95th percentile of the standard deviations of healthy subjects was, on average, 2.7 dB, ranging from 1.2 to 4 dB depending on the point tested. Point analysis and regional analysis might be considered prior to evaluating macular sensitivity fluctuation in order to distinguish between normal variation and a clinical change. Statistical methods were used to compare repeated microperimetry measurements and to establish fluctuation limits of macular sensitivity. This analysis could add information regarding the integrity of different macular areas and provide new insights into fixation points prior to biofeedback fixation training.
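    The per-point fluctuation statistic used here (standard deviation across repeated sessions, summarized by a 95th-percentile limit) can be sketched as follows; the sensitivity values are invented, not the study's measurements.

```python
import numpy as np

# Hypothetical sensitivities (dB) for 5 macular test points over 3 repeated
# microperimetry sessions (rows = sessions, columns = test points).
sessions = np.array([
    [27.0, 25.0, 24.0, 26.0, 22.0],
    [26.0, 24.0, 25.0, 26.0, 20.0],
    [27.0, 26.0, 23.0, 25.0, 23.0],
])

# Fluctuation of each individual test point = SD across the repeated sessions.
point_sd = sessions.std(axis=0, ddof=1)

# A normative fluctuation limit, e.g. the 95th percentile of the per-point SDs;
# values above the limit would suggest a change beyond normal variation.
limit_95 = np.percentile(point_sd, 95)
print(point_sd.round(2), round(float(limit_95), 2))
```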

  14. Static telescope aberration measurement using lucky imaging techniques

    NASA Astrophysics Data System (ADS)

    López-Marrero, Marcos; Rodríguez-Ramos, Luis Fernando; Marichal-Hernández, José Gil; Rodríguez-Ramos, José Manuel

    2012-07-01

    A procedure has been developed to compute static aberrations once the telescope PSF has been measured with the lucky imaging technique, using a star close to the object of interest as the point source to probe the optical system. This PSF is iteratively turned into a phase map at the pupil using the Gerchberg-Saxton algorithm and then converted into the appropriate actuation information for a deformable mirror with a low actuator count but large stroke capability. The main advantage of this procedure is that it can correct the static aberration in the specific pointing direction without the need for a wavefront sensor.
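    The Gerchberg-Saxton step, turning a measured PSF into a pupil phase map, can be sketched as below: alternate between enforcing the measured focal-plane amplitude and the known pupil-plane amplitude, keeping only the phase each time. The aperture shape, aberration strength, and iteration count are illustrative assumptions, not the procedure's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
# Known pupil support: a circular aperture mask.
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
aperture = (x**2 + y**2 < (n // 4) ** 2).astype(float)

# Simulate a "measured" PSF amplitude produced by an unknown static aberration.
true_phase = 0.5 * rng.standard_normal((n, n))
psf_amplitude = np.abs(np.fft.fft2(aperture * np.exp(1j * true_phase)))

def focal_error(ph):
    """Relative mismatch between the focal amplitude implied by phase `ph`
    and the measured PSF amplitude."""
    f = np.abs(np.fft.fft2(aperture * np.exp(1j * ph)))
    return np.linalg.norm(f - psf_amplitude) / np.linalg.norm(psf_amplitude)

phase = np.zeros((n, n))
err_start = focal_error(phase)
for _ in range(200):
    focal = np.fft.fft2(aperture * np.exp(1j * phase))
    focal = psf_amplitude * np.exp(1j * np.angle(focal))  # impose the PSF modulus
    pupil = np.fft.ifft2(focal)
    phase = np.angle(pupil)                               # keep only the pupil phase
err_end = focal_error(phase)
print(err_start > err_end)
```

The recovered pupil phase map would then be projected onto the deformable mirror's actuator basis to produce the actuation commands.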

  15. 46 CFR 153.908 - Cargo viscosity and melting point information; measuring cargo temperature during discharge...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Operations Documents and Cargo Information § 153.908... sensor or thermometer required by § 153.440(a)(3) or (c). If a portable thermometer is used, it must be located as prescribed for the temperature sensor in § 153.440(a)(3). (2) A total of 2 readings must be...

  16. 46 CFR 153.908 - Cargo viscosity and melting point information; measuring cargo temperature during discharge...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., LIQUEFIED GAS, OR COMPRESSED GAS HAZARDOUS MATERIALS Operations Documents and Cargo Information § 153.908... sensor or thermometer required by § 153.440(a)(3) or (c). If a portable thermometer is used, it must be located as prescribed for the temperature sensor in § 153.440(a)(3). (2) A total of 2 readings must be...

  17. Toward automated consumer question answering: automatically separating consumer questions from professional questions in the healthcare domain.

    PubMed

    Liu, Feifan; Antieau, Lamont D; Yu, Hong

    2011-12-01

    Both healthcare professionals and healthcare consumers have information needs that can be met through the use of computers, specifically via medical question answering systems. However, the information needs of both groups are different in terms of literacy levels and technical expertise, and an effective question answering system must be able to account for these differences if it is to formulate the most relevant responses for users from each group. In this paper, we propose that a first step toward answering the queries of different users is automatically classifying questions according to whether they were asked by healthcare professionals or consumers. We obtained two sets of consumer questions (~10,000 questions in total) from Yahoo answers. The professional questions consist of two question collections: 4654 point-of-care questions (denoted as PointCare) obtained from interviews of a group of family doctors following patient visits and 5378 questions from physician practices through professional online services (denoted as OnlinePractice). With more than 20,000 questions combined, we developed supervised machine-learning models for automatic classification between consumer questions and professional questions. To evaluate the robustness of our models, we tested the model that was trained on the Consumer-PointCare dataset on the Consumer-OnlinePractice dataset. We evaluated both linguistic features and statistical features and examined how the characteristics in two different types of professional questions (PointCare vs. OnlinePractice) may affect the classification performance. We explored information gain for feature reduction and the back-off linguistic category features. The 10-fold cross-validation results showed the best F1-measure of 0.936 and 0.946 on Consumer-PointCare and Consumer-OnlinePractice respectively, and the best F1-measure of 0.891 when testing the Consumer-PointCare model on the Consumer-OnlinePractice dataset. 
Healthcare consumer questions posted in Yahoo! Answers communities can be reliably distinguished from professional questions posted by point-of-care clinicians and online physicians, and the supervised machine-learning models are robust for this task. Our study will significantly benefit further development of automated consumer question answering. Copyright © 2011 Elsevier Inc. All rights reserved.
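    The classification task can be illustrated with a toy supervised model. This sketch uses a multinomial naive Bayes classifier over unigram features, a deliberately simpler stand-in for the models and feature sets evaluated in the study, and the training questions are invented.

```python
import math
from collections import Counter

# Tiny illustrative training set (invented examples, not the study's data).
train = [
    ("what does my blood test result mean", "consumer"),
    ("is this rash something to worry about", "consumer"),
    ("can I take ibuprofen with my cold medicine", "consumer"),
    ("optimal anticoagulation strategy in atrial fibrillation", "professional"),
    ("first line therapy for community acquired pneumonia", "professional"),
    ("differential diagnosis of acute abdominal pain", "professional"),
]

# Multinomial naive Bayes with add-one smoothing over unigram features.
counts = {"consumer": Counter(), "professional": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())
vocab = set(w for c in counts.values() for w in c)

def classify(text):
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        score = math.log(docs[label] / sum(docs.values()))  # class prior
        for w in text.split():
            score += math.log((c[w] + 1) / (total + len(vocab)))  # smoothed likelihood
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("should I worry about my test result"))
```

The study's actual models additionally used statistical features, information-gain feature reduction, and back-off linguistic category features, which this sketch omits.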

  18. London Measure of Unplanned Pregnancy: guidance for its use as an outcome measure

    PubMed Central

    Hall, Jennifer A; Barrett, Geraldine; Copas, Andrew; Stephenson, Judith

    2017-01-01

    Background The London Measure of Unplanned Pregnancy (LMUP) is a psychometrically validated measure of the degree of intention of a current or recent pregnancy. The LMUP is increasingly being used worldwide and can be used to evaluate family planning or preconception care programs. However, beyond recommending the use of the full LMUP scale, there is no published guidance on how to use the LMUP as an outcome measure. Ordinal logistic regression has been recommended informally, but studies published to date have all used binary logistic regression and dichotomized the scale at different cut points. There is thus a need for evidence-based guidance to provide a standardized methodology for multivariate analysis and to enable comparison of results. This paper makes recommendations for the regression method for analysis of the LMUP as an outcome measure. Materials and methods Data collected from 4,244 pregnant women in Malawi were used to compare five regression methods: linear, logistic with two cut points, and ordinal logistic with either the full or grouped LMUP score. The recommendations were then tested on the original UK LMUP data. Results There were small but unimportant differences in the findings across the regression models. Logistic regression resulted in the largest loss of information, and assumptions were violated for the linear and ordinal logistic regressions. Consequently, robust standard errors were used for linear regression, and a partial proportional odds ordinal logistic regression model was attempted. The latter could only be fitted for the grouped LMUP score. Conclusion We recommend the linear regression model with robust standard errors to make full use of the LMUP score when analyzed as an outcome measure. Ordinal logistic regression could be considered, but a partial proportional odds model with grouped LMUP score may be required. Logistic regression is the least-favored option, due to the loss of information.
For logistic regression, the cut point for un/planned pregnancy should be between nine and ten. These recommendations will standardize the analysis of LMUP data and enhance comparability of results across studies. PMID:28435343
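    The recommended analysis, linear regression on the full LMUP score with robust standard errors, can be sketched as follows. The data are simulated, and the HC1 estimator is one common robust-variance choice, assumed here for illustration rather than prescribed by the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Simulated data: LMUP score (0-12 scale) regressed on a binary exposure and age.
exposure = rng.integers(0, 2, n)
age = rng.uniform(18, 45, n)
lmup = 8.0 + 1.5 * exposure - 0.05 * age + rng.normal(0, 2.0, n)

X = np.column_stack([np.ones(n), exposure, age])
beta, *_ = np.linalg.lstsq(X, lmup, rcond=None)   # OLS coefficients
resid = lmup - X @ beta

# Heteroscedasticity-robust (HC1) covariance:
# (X'X)^-1 X' diag(e_i^2) X (X'X)^-1, scaled by n/(n-k).
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
cov_hc1 = n / (n - X.shape[1]) * XtX_inv @ meat @ XtX_inv
robust_se = np.sqrt(np.diag(cov_hc1))
print(beta.round(2), robust_se.round(3))
```

In practice a statistical package (e.g. Stata's `vce(robust)` or an equivalent) would be used; the point of the sketch is that the outcome enters as the full 0-12 score rather than a dichotomized version.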

  19. Operating manual for the RRL 8 channel data logger

    NASA Technical Reports Server (NTRS)

    Paluch, E. J.; Shelton, J. D.; Gardner, C. S.

    1979-01-01

    A data collection device which takes measurements from external sensors at user specified time intervals is described. Three sensor ports are dedicated to temperature, air pressure, and dew point. Five general purpose sensor ports are provided. The user specifies when the measurements are recorded as well as when the information is read or stored in a minicomputer or a paper tape.

  20. Testing and validation of multi-lidar scanning strategies for wind energy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Bonin, Timothy A.; Klein, Petra M.

    Several factors cause lidars to measure different values of turbulence than an anemometer on a tower, including volume averaging, instrument noise, and the use of a scanning circle to estimate the wind field. One way to avoid the use of a scanning circle is to deploy multiple scanning lidars and point them toward the same volume in space to collect velocity measurements and extract high-resolution turbulence information. This paper explores the use of two multi-lidar scanning strategies, the tri-Doppler technique and the virtual tower technique, for measuring 3-D turbulence. In summer 2013, a vertically profiling Leosphere WindCube lidar and three Halo Photonics Streamline lidars were operated at the Southern Great Plains Atmospheric Radiation Measurement site to test these multi-lidar scanning strategies. During the first half of the field campaign, all three scanning lidars were pointed at approximately the same point in space and a tri-Doppler analysis was completed to calculate the three-dimensional wind vector every second. Next, all three scanning lidars were used to build a “virtual tower” above the WindCube lidar. Results indicate that the tri-Doppler technique measures higher values of horizontal turbulence than the WindCube lidar under stable atmospheric conditions, reduces variance contamination under unstable conditions, and can measure high-resolution profiles of mean wind speed and direction. The virtual tower technique provides adequate turbulence information under stable conditions but cannot capture the full temporal variability of turbulence experienced under unstable conditions because of the time needed to readjust the scans.
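    The tri-Doppler retrieval reduces to solving a small linear system: each lidar measures the projection of the wind vector onto its beam direction, so three non-coplanar beams determine the full 3-D vector. A minimal sketch with an invented beam geometry:

```python
import numpy as np

# Unit pointing vectors of three lidar beams intersecting at the same point
# (hypothetical geometry), one row per lidar.
beams = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.6, 0.0, 0.8],
])

true_wind = np.array([5.0, -2.0, 0.5])   # u, v, w components (m/s)
radial = beams @ true_wind               # each lidar observes v_r = b . u

# Tri-Doppler retrieval: solve the 3x3 linear system for the wind vector.
wind = np.linalg.solve(beams, radial)
print(wind)
```

Repeating this solve every second yields the high-rate 3-D wind time series from which the turbulence statistics are computed; ill-conditioned (nearly coplanar) beam geometries would amplify the radial-velocity noise.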

  1. Accuracy of reported flash point values on material safety data sheets and the impact on product classification.

    PubMed

    Radnoff, Diane

    2013-01-01

    Material Safety Data Sheets (MSDSs) are the foundation of worker right-to-know legislation for chemical hazards. Suppliers can use product test data to determine a product's classification. Alternatively, they may use evaluation and professional judgment based on test results for the product or a product, material, or substance with similar properties. While the criteria for classifying products under the new Globally Harmonized System of Classification and Labeling of Chemicals (GHS) are different, a similar process is followed. Neither the current Workplace Hazardous Materials Information System (WHMIS) nor GHS require suppliers to test their products to classify them. In this project 83 samples of products classified as flammable or combustible, representing a variety of industry sectors and product types, were collected. Flash points were measured and compared to the reported values on the MSDSs. The classifications of the products were then compared using the WHMIS and GHS criteria. The results of the study indicated that there were significant variations between the disclosed and measured flash point values. Overall, more than one-third of the products had flash points lower than that disclosed on the MSDS. In some cases, the measured values were more than 20°C lower than the disclosed values. This could potentially result in an underestimation regarding the flammability of the product so it is important for employers to understand the limitations in the information provided on MSDSs when developing safe work procedures and training programs in the workplace. Nearly one-fifth of the products were misclassified under the WHMIS system as combustible when the measured flash point indicated that they should be classified as flammable when laboratory measurement error was taken into account. 
While a similar number of products were misclassified using the GHS criteria, the tendency was to "over-classify" (assign a hazard class that was more conservative). Thus, the transition to GHS may decrease the possibility of "under-classifying" flammable and combustible products where no test data on the product are available.

  2. Political efficacy in adolescence: Development, gender differences, and outcome relations.

    PubMed

    Arens, A Katrin; Watermann, Rainer

    2017-05-01

    The present study focuses on political efficacy in terms of students' competence self-perceptions related to the domain of politics. The investigation addresses the mean level development and longitudinal relations to outcome variables including gender differences. Drawing on a sample of N = 2,504 German students, political efficacy, along with meaningful outcome variables (i.e., political information behavior, political knowledge, and interest in politics), was measured at 2 measurement points, once in Grade 7 and once in Grade 10. Students' mean levels of political efficacy increased from the first to the second measurement point, and boys consistently displayed higher levels. Political efficacy demonstrated reciprocal relations to political information behavior and political knowledge, and showed a unidirectional relation to interest in politics across time. The pattern of outcome relations was invariant across gender. This study contributes to research and theory on political socialization in adolescence as it outlines temporal relations among, and gender differences in, facets of political socialization. Therefore, this study also offers new practical insights into effectively facilitating political education in adolescent students. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Conceptual recurrence plots: revealing patterns in human discourse.

    PubMed

    Angus, Daniel; Smith, Andrew; Wiles, Janet

    2012-06-01

    Human discourse contains a rich mixture of conceptual information. Visualization of the global and local patterns within this data stream is a complex and challenging problem. Recurrence plots are an information visualization technique that can reveal trends and features in complex time series data. The recurrence plot technique works by measuring the similarity of points in a time series to all other points in the same time series and plotting the results in two dimensions. Previous studies have applied recurrence plotting techniques to textual data; however, these approaches plot recurrence using term-based similarity rather than conceptual similarity of the text. We introduce conceptual recurrence plots, which use a model of language to measure similarity between pairs of text utterances, and the similarity of all utterances is measured and displayed. In this paper, we explore how the descriptive power of the recurrence plotting technique can be used to discover patterns of interaction across a series of conversation transcripts. The results suggest that the conceptual recurrence plotting technique is a useful tool for exploring the structure of human discourse.
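    The core computation behind a conceptual recurrence plot, the similarity of every utterance pair, can be sketched with cosine similarity over toy concept vectors. The concept weights below are invented for illustration; the paper derives them from a model of language rather than assigning them by hand.

```python
import numpy as np

# Toy concept model: each utterance is a weight vector over shared concepts.
concepts = ["health", "exercise", "diet"]
utterances = np.array([
    [0.9, 0.1, 0.0],   # u1: mostly about health
    [0.2, 0.8, 0.0],   # u2: mostly about exercise
    [0.8, 0.0, 0.2],   # u3: returns to health
    [0.1, 0.1, 0.8],   # u4: shifts to diet
])

# Conceptual recurrence matrix: cosine similarity of every utterance pair.
# Plotting this matrix (utterance index vs. utterance index) gives the
# two-dimensional recurrence plot described in the abstract.
norms = np.linalg.norm(utterances, axis=1, keepdims=True)
unit = utterances / norms
recurrence = unit @ unit.T
print(recurrence.round(2))
```

High off-diagonal values (here, u1 vs. u3) mark the return to an earlier concept, which is exactly the conversational structure the plots are designed to reveal.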

  4. Practical aspects of handling data protection and data security.

    PubMed

    Louwerse, C P

    1991-01-01

    Looking at practical applications of health care information systems, we must conclude that in the field of data protection there is still too large a gap between what is feasible and necessary on the one hand, and what is achieved in actual implementations on the other. To illustrate this point, we sketch the actual data protection measures in a large hospital information system and describe the effects of changes affecting the system, such as the increasing use of personal computers and the growing intensity of use of the system. Trends in the development of new and additional systems are indicated, a summary of possible weak points and gaps in the security is given, and some suggestions for improvement are made.

  5. 2D first break tomographic processing of data measured for celebration profiles: CEL01, CEL04, CEL05, CEL06, CEL09, CEL11

    NASA Astrophysics Data System (ADS)

    Bielik, M.; Vozar, J.; Hegedus, E.; Celebration Working Group

    2003-04-01

    This contribution reports preliminary results of first-arrival P-wave seismic tomographic processing of data measured along the profiles CEL01, CEL04, CEL05, CEL06, CEL09 and CEL11. These profiles were measured in the framework of the seismic project CELEBRATION 2000. The data acquisition and geometric parameters of the processed profiles, the principle of the tomographic processing, and the particular processing steps and program parameters are described. Characteristic data of the observation profiles (shot points, geophone points, total profile lengths, sampling, sensors and record lengths) are given. The fast program package developed by C. Zelt was applied for the tomographic velocity inversion. This process consists of several steps. The first step is the creation of a starting velocity field, for which the arrival times are modelled by the method of finite differences. The next step is minimization of the differences between the measured and modelled arrival times until the deviation is small. The equivalence problem was reduced by including a priori information in the starting velocity field, such as the depth to the pre-Tertiary basement and estimates of the overlying sedimentary velocities from well logging or other seismic velocity data. After checking the reciprocal times, the picks were corrected. The final result of the processing is a reliable travel-time curve set consistent with the reciprocal times. Picking of the travel-time curves and enhancement of the signal-to-noise ratio on the seismograms were carried out using the PROMAX program system. The tomographic inversion used a so-called 3D/2D procedure that takes 3D wave propagation into account: a corridor along the profile containing the outlying shot points and geophone points was defined, and 3D processing was carried out within this corridor.
The preliminary results indicate seismically anomalous zones within the crust and the uppermost part of the upper mantle in an area comprising the Western Carpathians, the North European Platform, the Pannonian Basin and the Bohemian Massif.

  6. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex designed to prevent accidents related to the aerological situation in underground workings is considered; it covers the accounting of issued and returned individual devices, the transmission and display of measurement data, and the formation of pre-emptive decisions. Examples of the automated workplace of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. These statistical studies confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas-control points. An adaptive (multivariant) algorithm for processing measurement information on continuous multidimensional quantities and influencing factors has been developed.

  7. Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan

    2017-06-01

    Obtaining accurate information on rock mass discontinuities for deformation analysis and the evaluation of rock mass stability is important, but obtaining measurements for high and steep zones with the traditional compass method is difficult. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have gradually become mainstream. In this study, a method based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The next step is to segment the point cloud into different point subsets. Various parameters, such as the normal vector, dip direction and dip, can be calculated for each point subset after obtaining the equation of the best-fit plane for that subset. A cluster analysis (a point subset that satisfies some conditions thus forms a cluster) is performed on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system is developed based on this method to extract the points of the rock discontinuity from a 3D point cloud. A comparison with existing software shows that this method is feasible. This method can provide a reference for rock mechanics, 3D geological modelling and other related fields.
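    The plane-fitting and orientation step can be sketched as follows: fit the best plane to a point subset via SVD and convert its normal vector to dip and dip direction. The surface geometry and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical point subset sampled from a single discontinuity surface.
u, v = rng.uniform(-1, 1, (2, 200))
normal_true = np.array([0.3, 0.4, 0.866])        # unit normal of the synthetic plane
a = np.cross(normal_true, [0.0, 0.0, 1.0]); a /= np.linalg.norm(a)
b = np.cross(normal_true, a)                     # two in-plane axes
pts = u[:, None] * a + v[:, None] * b + 0.005 * rng.standard_normal((200, 3))

# Best-fit plane: the normal is the right singular vector associated with the
# smallest singular value of the centered point coordinates.
centered = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centered)
normal = vt[-1]
if normal[2] < 0:                                # orient the normal upward
    normal = -normal

dip = np.degrees(np.arccos(normal[2]))                          # angle from vertical
dip_direction = np.degrees(np.arctan2(normal[0], normal[1])) % 360.0  # azimuth, N = 0°
print(round(float(dip), 1), round(float(dip_direction), 1))
```

Clustering the per-subset normals (the paper's FA + FCM step) then groups subsets with similar orientation into discontinuity sets.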

  8. A U.S. Geological Survey Data Standard (Specifications for representation of geographic point locations for information interchange)

    USGS Publications Warehouse

    ,

    1983-01-01

    This standard establishes uniform formats for geographic point location data. Geographic point location refers to the use of a coordinate system to define the position of a point that may be on, above, or below the Earth's surface. It provides a means for representing these data in digital form for the purpose of interchanging information among data systems and improving clarity and accuracy of interpersonal communications. This document is an expansion and clarification of National Bureau of Standards FIPS PUB 70, issued October 24, 1980. There are minor editorial changes, plus the following additions and modifications: (1) The representation of latitude and longitude using radian measure was added. (2) Alternate 2 for Representation of Hemispheric Information was deleted. (3) Use of the maximum precision for all numerical values was emphasized. The Alternate Representation of Precision was deleted. (4) The length of the zone representation for the State Plane Coordinate System was standardized. (5) The term altitude was substituted for elevation throughout to conform with international usage. (6) Section 3, Specifications for Altitude Data, was expanded and upgraded significantly to the same level of detail as for the horizontal values. (7) A table delineating the coverage of Universal Transverse Mercator zones and the longitudes of the Central Meridians was added and the other tables renumbered. (8) The total length of the representation of point location data at maximum precision was standardized.

  9. Cosmological constraints from the convergence 1-point probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, Kenneth; Blazek, Jonathan; Honscheid, Klaus

    2017-06-29

    Here, we examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the Ωm–σ8 plane from the convergence PDF with 188 arcmin2 pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ < 886). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2–3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  10. Cosmological constraints from the convergence 1-point probability distribution

    NASA Astrophysics Data System (ADS)

    Patton, Kenneth; Blazek, Jonathan; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J.; Suchyta, Eric

    2017-11-01

    We examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the Ωm-σ8 plane from the convergence PDF with 188 arcmin2 pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ < 886). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2-3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  11. MEASURING ECONOMIC GROWTH FROM OUTER SPACE

    PubMed Central

    Henderson, J. Vernon; Storeygard, Adam; Weil, David N.

    2013-01-01

    GDP growth is often measured poorly for countries and rarely measured at all for cities or subnational regions. We propose a readily available proxy: satellite data on lights at night. We develop a statistical framework that uses lights growth to augment existing income growth measures, under the assumption that measurement error in using observed light as an indicator of income is uncorrelated with measurement error in national income accounts. For countries with good national income accounts data, information on growth of lights is of marginal value in estimating the true growth rate of income, while for countries with the worst national income accounts, the optimal estimate of true income growth is a composite with roughly equal weights. Among poor-data countries, our new estimate of average annual growth differs by as much as 3 percentage points from official data. Lights data also allow for measurement of income growth in sub- and supranational regions. As an application, we examine growth in Sub-Saharan African regions over the last 17 years. We find that real incomes in non-coastal areas have grown faster than coastal areas by 1/3 of an annual percentage point; non-malarial areas have grown faster than malarial ones by 1/3 to 2/3 of an annual percentage point; and primate-city regions have grown no faster than hinterland areas. Such applications point toward a research program in which “empirical growth” need no longer be synonymous with “national income accounts.” PMID:25067841
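    The composite estimate described above can be sketched as an inverse-variance weighted average of accounts-based and lights-based growth. The error variances below are invented placeholders, not the paper's estimated weights.

```python
# Combine official income growth and lights-implied growth, weighting each
# by the inverse of its (assumed) measurement-error variance.
def composite_growth(g_accounts, var_accounts, g_lights, var_lights):
    w = (1.0 / var_accounts) / (1.0 / var_accounts + 1.0 / var_lights)
    return w * g_accounts + (1.0 - w) * g_lights

# Good-quality national accounts: the composite stays close to official growth.
print(composite_growth(0.040, 0.0001, 0.060, 0.0100))
# Poor-quality accounts: the two sources receive roughly equal weight.
print(composite_growth(0.040, 0.0100, 0.060, 0.0100))
```

This reproduces the qualitative behavior in the abstract: lights data matter little for good-data countries and carry roughly half the weight for the worst-data countries.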

  12. Three optical methods for remotely measuring aerosol size distributions.

    NASA Technical Reports Server (NTRS)

    Reagan, J. A.; Herman, B. M.

    1971-01-01

    Three optical probing methods for remotely measuring atmospheric aerosol size distributions are discussed and contrasted. The particular detection methods which are considered make use of monostatic lidar (laser radar), bistatic lidar, and solar radiometer sensing techniques. The theory of each of these measurement techniques is discussed briefly, and the necessary constraints which must be applied to obtain aerosol size distribution information from such measurements are pointed out. Theoretical and/or experimental results are also presented which demonstrate the utility of the three proposed probing methods.

  13. Combining eddy-covariance and chamber measurements to determine the methane budget from a small, heterogeneous urban floodplain wetland park

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morin, T. H.; Bohrer, G.; Stefanik, K. C.

    Methane (CH4) emissions and carbon uptake in temperate freshwater wetlands act in opposing directions in the context of global radiative forcing. Large uncertainties exist for the rates of CH4 emissions, making it difficult to determine the extent to which CH4 emissions counteract the carbon sequestration of wetlands. Urban temperate wetlands are typically small and feature highly heterogeneous land cover, posing an additional challenge to determining their CH4 budget. The data analysis approach we introduce here combines two different CH4 flux measurement techniques to overcome scale and heterogeneity problems and determine the overall CH4 budget of a small, heterogeneous, urban wetland landscape. Temporally intermittent point measurements from non-steady-state chambers provided information about patch-level heterogeneity of fluxes, while continuous, high temporal resolution flux measurements using the eddy-covariance (EC) technique provided information about the temporal dynamics of the fluxes. Patch-level scaling parameterization was developed from the chamber data to scale eddy covariance data to a ‘fixed-frame’, which corrects for variability in the spatial coverage of the eddy covariance observation footprint at any single point in time. Finally, by combining two measurement techniques at different scales, we addressed shortcomings of both techniques with respect to heterogeneous wetland sites.

  14. Combining eddy-covariance and chamber measurements to determine the methane budget from a small, heterogeneous urban floodplain wetland park

    DOE PAGES

    Morin, T. H.; Bohrer, G.; Stefanik, K. C.; ...

    2017-02-17

    Methane (CH4) emissions and carbon uptake in temperate freshwater wetlands act in opposing directions in the context of global radiative forcing. Large uncertainties exist for the rates of CH4 emissions, making it difficult to determine the extent to which CH4 emissions counteract the carbon sequestration of wetlands. Urban temperate wetlands are typically small and feature highly heterogeneous land cover, posing an additional challenge to determining their CH4 budget. The data analysis approach we introduce here combines two different CH4 flux measurement techniques to overcome scale and heterogeneity problems and determine the overall CH4 budget of a small, heterogeneous, urban wetland landscape. Temporally intermittent point measurements from non-steady-state chambers provided information about patch-level heterogeneity of fluxes, while continuous, high temporal resolution flux measurements using the eddy-covariance (EC) technique provided information about the temporal dynamics of the fluxes. Patch-level scaling parameterization was developed from the chamber data to scale eddy covariance data to a ‘fixed-frame’, which corrects for variability in the spatial coverage of the eddy covariance observation footprint at any single point in time. Finally, by combining two measurement techniques at different scales, we addressed shortcomings of both techniques with respect to heterogeneous wetland sites.

  15. Completely optical orientation determination for an unstabilized aerial three-line camera

    NASA Astrophysics Data System (ADS)

    Wohlfeil, Jürgen

    2010-10-01

    Aerial line cameras allow the fast acquisition of high-resolution images at low cost. Unfortunately, measuring the camera's orientation at the necessary rate and precision takes considerable effort unless extensive camera stabilization is used, and stabilization in turn entails high costs, weight, and power consumption. This contribution shows that it is possible to completely derive the absolute exterior orientation of an unstabilized line camera from its images and global position measurements. The presented approach is based on previous work on the determination of the relative orientation of subsequent lines using optical information from the remote sensing system. The relative orientation is used to pre-correct the line images, in which homologous points can reliably be determined using the SURF operator. Together with the position measurements these points are used to determine the absolute orientation from the relative orientations via bundle adjustment of a block of overlapping line images. The approach was tested on a flight with the DLR's RGB three-line camera MFC. To evaluate the precision of the resulting orientation, measurements from a high-end navigation system and ground control points are used.

  16. Coping capacities for improving adaptation pathways for flood protection in Can Tho, Vietnam

    NASA Astrophysics Data System (ADS)

    Pathirana, A.; Radhakrishnan, M.; Quan, N. H.; Gersonius, B.; Ashley, R.; Zevenbergen, C.

    2016-12-01

    Studying the evolution of coping and adaptation capacities is a prerequisite for preparing an effective flood management plan for the future, especially in the dynamic and fast-changing cities of developing countries. The objectives, requirements, targets, design and performance of flood protection measures will have to be determined after taking into account, or in conjunction with, the coping capacities. A methodology is presented based on adaptation pathways to account for coping capacities and to assess the effect on flood protection measures. The adaptation pathways method determines the point of failure of a particular strategy based on the change in an external driver: a point in time or a socio-economic situation at which the strategy can no longer meet its objective. Pathways arrived at based on this methodology reflect future reality by considering changing engineering standards along with future uncertainties, risk-taking abilities and adaptation capacities. This pathways-based methodology determines the adaptation tipping points (ATP) and 'time of occurrence of ATP' of flood protection measures after accounting for coping capacities, evaluates the measures and then provides the means to determine the adaptation pathways. Application of this methodology to flood protection measures in Can Tho city in the Mekong delta reveals the effect of coping capacity on the usefulness of flood protection measures and the delay in occurrence of tipping points. Consideration of coping capacity in the system owing to elevated property floor levels led to the postponement of tipping points and improved the adaptation pathways comprising flood protection measures such as dikes. This information is useful to decision makers for planning and phasing of investments in flood protection.

  17. 3D Modeling of Components of a Garden by Using Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Kumazakia, R.; Kunii, Y.

    2016-06-01

    Laser measurement is currently applied to several tasks such as plumbing management, road investigation through mobile mapping systems, and elevation model utilization through airborne LiDAR. Effective laser measurement methods have been well-documented in civil engineering, but few attempts have been made to establish equally effective methods in landscape engineering. By using point cloud data acquired through laser measurement, the aesthetic landscaping of Japanese gardens can be enhanced. This study focuses on simple landscape simulations for pruning and rearranging trees as well as rearranging rocks, lanterns, and other garden features by using point cloud data. However, such simulations lack concreteness. Therefore, this study considers the construction of a library of garden features extracted from point cloud data. The library would serve as a resource for creating new gardens and simulating gardens prior to conducting repairs. Extracted garden features are imported as 3ds Max objects, and realistic 3D models are generated by using a material editor system. As further work toward the publication of a 3D model library, file formats for tree crowns and trunks should be adjusted. Moreover, reducing the size of created models is necessary. Models created using point cloud data are informative because simply shaped garden features such as trees are often seen in the 3D industry.

  18. Developing a Web-Based Nursing Practice and Research Information Management System: A Pilot Study.

    PubMed

    Choi, Jeeyae; Lapp, Cathi; Hagle, Mary E

    2015-09-01

    Many hospital information systems have been developed and implemented to collect clinical data from the bedside and have used the information to improve patient care. Because of a growing awareness that the use of clinical information improves quality of care and patient outcomes, measuring tools (electronic and paper based) have been developed, but most of them require multiple steps of data collection and analysis. This necessitated the development of a Web-based Nursing Practice and Research Information Management System that processes clinical nursing data to measure nurses' delivery of care and its impact on patient outcomes and provides useful information to clinicians, administrators, researchers, and policy makers at the point of care. This pilot study developed a computer algorithm based on a falls prevention protocol and programmed the prototype Web-based Nursing Practice and Research Information Management System. It successfully measured the performance of nursing care delivered and its impact on patient outcomes using clinical nursing data from the study site. Although the Nursing Practice and Research Information Management System was tested with small data sets, the results of the study revealed that it has the potential to measure nurses' delivery of care and its impact on patient outcomes, while pinpointing components of the nursing process in need of improvement.

  19. An information theory framework for dynamic functional domain connectivity.

    PubMed

    Vergara, Victor M; Miller, Robyn; Calhoun, Vince

    2017-06-01

    Dynamic functional network connectivity (dFNC) analyzes time evolution of coherent activity in the brain. In this technique dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating bits of information contained and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. Mutual information measurement is then obtained from probabilities across domains. Thus, we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensorial input, motor control and cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole brain dFNC matrices. In the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensorial input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity.
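At its core, the CDMI computation described above is a mutual-information estimate over observed joint state sequences. A minimal sketch, assuming discrete state labels and plain plug-in probability estimates (the paper's actual estimator is not reproduced here):

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from (x, y) joint state observations."""
    n = len(pairs)
    pxy = Counter(pairs)               # joint state counts
    px = Counter(x for x, _ in pairs)  # marginal counts, domain X
    py = Counter(y for _, y in pairs)  # marginal counts, domain Y
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly coupled two-state domains share exactly one bit of information.
print(mutual_information([(0, 0), (1, 1)] * 50))  # → 1.0
```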

  20. Context-aware pattern discovery for moving object trajectories

    NASA Astrophysics Data System (ADS)

    Sharif, Mohammad; Asghar Alesheikh, Ali; Kaffash Charandabi, Neda

    2018-05-01

    Movements of point objects are highly sensitive to the underlying situations and conditions during the movement, which are known as contexts. Analyzing movement patterns, while accounting for contextual information, helps to better understand how point objects behave in various contexts and how contexts affect their trajectories. One potential solution for discovering moving objects' patterns is analyzing the similarities of their trajectories. This article, therefore, contextualizes the similarity measure of trajectories by not only their spatial footprints but also a notion of internal and external contexts. The dynamic time warping (DTW) method is employed to assess the multi-dimensional similarities of trajectories. Then, the results of similarity searches are utilized in discovering the relative movement patterns of the moving point objects. Several experiments are conducted on real datasets that were obtained from commercial airplanes and the weather information during the flights. The results yielded the robustness of the DTW method in quantifying the commonalities of trajectories and discovering movement patterns with 80% accuracy. Moreover, the results revealed the importance of exploiting contextual information because it can both enhance and restrict movements.
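For reference, the dynamic time warping step named in the abstract can be sketched as follows. This is the textbook algorithm on 1-D sequences, not the authors' multi-dimensional, context-aware variant:

```python
def dtw_distance(a, b):
    """Classic DTW with absolute-difference local cost, O(len(a) * len(b))."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A trajectory and its time-stretched copy match perfectly under warping.
print(dtw_distance([0, 1, 2], [0, 0, 1, 1, 2, 2]))  # → 0.0
```

Because the warp can pair one point of one trajectory with several points of the other, trajectories sampled at different rates can still be recognized as similar.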

  1. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    NASA Astrophysics Data System (ADS)

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-06-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3-25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension.

  2. Explaining Achievement in Higher Education

    ERIC Educational Resources Information Center

    Jansen, Ellen P. W. A.; Bruinsma, Marjon

    2005-01-01

    This research project investigated the relationship between students' pre-entry characteristics, perceptions of the learning environment, reported work discipline, the use of deep information processing strategies, and academic achievement. Ability measured by grade-point average in pre-university education was the most important predictor of…

  3. Inferring Small Scale Dynamics from Aircraft Measurements of Tracers

    NASA Technical Reports Server (NTRS)

    Sparling, L. C.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The millions of ER-2 and DC-8 aircraft measurements of long-lived tracers in the Upper Troposphere/Lower Stratosphere (UT/LS) hold enormous potential as a source of statistical information about subgrid scale dynamics. Extracting this information however can be extremely difficult because the measurements are made along a 1-D transect through fields that are highly anisotropic in all three dimensions. Some of the challenges and limitations posed by both the instrumentation and platform are illustrated within the context of the problem of using the data to obtain an estimate of the dissipation scale. This presentation will also include some tutorial remarks about the conditional and two-point statistics used in the analysis.

  4. Gaussian random bridges and a geometric model for information equilibrium

    NASA Astrophysics Data System (ADS)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T,0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.

  5. Taiwan's Travel and Border Health Measures in Response to Zika

    PubMed Central

    Ho, Li-Li; Tsai, Yu-Hui; Lee, Wang-Ping; Liao, Szu-Tsai; Wu, Li-Gin

    2017-01-01

    Zika virus has recently emerged as a worldwide public health concern. Travel and border health measures stand as one of the main strategies and frontline defenses in responding to international epidemics. As of October 31, 2016, Taiwan has reported 13 imported cases, 5 of which were detected through routine entry screening and active monitoring at international airports. This article shares Taiwan's disease surveillance activities at designated points of entry and travel and border health measures in response to Zika. The Taiwan government collaborates with its tourism industry to disseminate information about precautionary measures and encourages tour guides to report suspected individuals or events to activate early response measures. Taiwan also engages in vector control activities at points of entry, including targeting aircraft from countries where vector-borne diseases are endemic, implementing mosquito sweep measures, and collecting vector surveillance data. In future emerging and reemerging disease events, entry surveillance at designated points of entry may enable early detection of diseases of international origin and more rapid activation of public health preparedness activities and international collaboration. Taiwan will continue to maximize border and travel health measures in compliance with IHR (2005) requirements, which rely on continued risk assessment, practical implementation activities, and engagement with all stakeholders. PMID:28418744

  6. Magnetism of Minor Bodies in the Solar System: From 433 Eros, passing Braille, Steins, and Lutetia towards Churyumov-Gerasimenko and 1999 JU3.

    NASA Astrophysics Data System (ADS)

    Hercik, David; Auster, Hans-Ulrich; Heinisch, Philip; Richter, Ingo; Glassmeier, Karl-Heinz

    2015-04-01

    Minor bodies in the solar system, such as asteroids and comets, are important sources of information for our knowledge of the solar system formation. Besides other aspects, estimation of a magnetization state of such bodies might prove important in understanding the early aggregation phases of the protoplanetary disk, showing the level of importance of the magnetic forces in the processes involved. Meteorites' magnetization measurements suggest that primitive bodies consist of magnetized material. However, space observations from various flybys give to date diverse results for a global magnetization estimation. The flybys at Braille and Gaspra indicate possible higher magnetization (~10^-3 Am^2/kg), while flybys at Steins and Lutetia show no significant values in the global field change, illustrating low global magnetization. Furthermore, the interpretation of remote (during flybys) measurements is very difficult. For correct estimates on the local magnetization one needs (in the best case) multi-point surface measurements. Single point observation has been done by NEAR-Shoemaker on 433 Eros asteroid, revealing no signature in magnetic field that could have origin in asteroid magnetization. Similar results, no magnetization observed, have been provided by evaluation of recent data from ROMAP (Philae lander) and RPC-MAG (Rosetta orbiter) instruments from comet 67P/Churyumov-Gerasimenko. The ROMAP instrument provided measurements from multiple points of the cometary surface as well as data along the ballistic path between multiple touchdowns, which support the conclusion of no global magnetization. However, even in case of the in-situ on-surface observations the magnetization estimate has a limiting spatial resolution that is dependent on the distance from the surface (~50 cm in case of ROMAP). To get information about possible smaller magnetized grains distribution and magnetization strength, the sensor shall be placed as close as possible to the surface.
For such observations the next ideal candidate mission is Hayabusa-II with its Mascot lander equipped with a fluxgate magnetometer. The small-sized lander shall deliver the magnetometer to within centimeters of the surface, providing measurements at multiple points thanks to a hopping ability. The mission was recently launched (December 2014) and is aiming for the C-type asteroid 1999 JU3, which it is to reach in 2018. The results will hopefully add a piece of information to the still unclear question of the magnetization of minor solar system bodies.

  7. Effective Measurement of Reliability of Repairable USAF Systems

    DTIC Science & Technology

    2012-09-01

    Hansen presented a course, Concepts and Models for Repairable Systems Reliability, at the 2009 Centro de Investigacion en Mathematicas (CIMAT). The...recurrent event by calculating the mean quantity of recurrent events of the population of systems at risk at that point in time. The number of systems at... risk is the number of systems that are operating and providing information. [9] Information can be obscured by data censoring and truncation. One

  8. An Approach of Estimating Individual Growth Curves for Young Thoroughbred Horses Based on Their Birthdays

    PubMed Central

    ONODA, Tomoaki; YAMAMOTO, Ryuta; SAWAMURA, Kyohei; MURASE, Harutaka; NAMBO, Yasuo; INOUE, Yoshinobu; MATSUI, Akira; MIYAKE, Takeshi; HIRAI, Nobuhiro

    2014-01-01

    We propose an approach for estimating individual growth curves based on the birthday information of Japanese Thoroughbred horses, with consideration of the seasonal compensatory growth that is a typical characteristic of seasonally breeding animals. The compensatory growth patterns appear only during the winter and spring seasons in the life of growing horses, and the meeting point between winter and spring depends on the birthday of each horse. We previously developed new growth curve equations for Japanese Thoroughbreds adjusting for compensatory growth. Based on the equations, a parameter denoting the birthday information was added for the modeling of the individual growth curves for each horse by shifting the meeting points in the compensatory growth periods. A total of 5,594 and 5,680 body weight and age measurements of Thoroughbred colts and fillies, respectively, and 3,770 withers height and age measurements of both sexes were used in the analyses. The results of predicted error difference and the Akaike Information Criterion showed that the individual growth curves using birthday information fit the body weight and withers height data better than those without. The individual growth curve for each horse would be a useful tool for the feeding management of young Japanese Thoroughbreds in compensatory growth periods. PMID:25013356

  9. Thermodynamics of quantum information scrambling

    NASA Astrophysics Data System (ADS)

    Campisi, Michele; Goold, John

    2017-06-01

    Scrambling of quantum information can conveniently be quantified by so-called out-of-time-order correlators (OTOCs), i.e., correlators of the type ⟨[Wτ,V]†[Wτ,V]⟩, whose measurements present a formidable experimental challenge. Here we report on a method for the measurement of OTOCs based on the so-called two-point measurement scheme developed in the field of nonequilibrium quantum thermodynamics. The scheme is of broader applicability than methods employed in current experiments and provides a clear-cut interpretation of quantum information scrambling in terms of nonequilibrium fluctuations of thermodynamic quantities, such as work and heat. Furthermore, we provide a numerical example on a spin chain which highlights the utility of our thermodynamic approach when understanding the differences between integrable and ergodic behaviors. We also discuss how the method can be used to extend the reach of current experiments.

  10. The quality of information on the internet relating to top-selling dietary supplements in the Czech Republic.

    PubMed

    Baudischova, L; Straznicka, J; Pokladnikova, J; Jahodar, L

    2018-02-01

    Background The purchase of dietary supplements (DS) via the Internet is increasing worldwide as well as in the Czech Republic. Objective The aim of the study is to evaluate the quality of information on DS available on the Internet. Setting Czech websites related to dietary supplements. Methods A cross-sectional study was carried out involving the analysis of information placed on the websites related to the 100 top-selling DS in the Czech Republic in 2014, according to IMS Health data. Main outcome measure The following criteria were evaluated: contact for the manufacturer, recommended dosage, information on active substances as well as overall composition, permitted health claims, % of the daily reference intake value (DRIV) for vitamins and minerals, link for online counseling, pregnancy/breastfeeding, allergy information, contraindications, adverse reactions, and supplement-drug interactions (some criteria were evaluated from both points of view). Results A total of 199 web domains and 850 websites were evaluated. From the regulatory point of view, all the criteria were fulfilled by 11.3% of websites. Almost 9% of the websites reported information referring to the treatment, cure, or prevention of a disease. From the clinical point of view, all the criteria were met by only one website. Conclusions The quality of information related to DS available on the Internet in the Czech Republic is quite low. Consumers should consult a specialist when using DS purchased online.

  11. Handheld Synthetic Array Final Report, Part A

    DTIC Science & Technology

    2014-12-01

    Measurement Unit 4/143 IEEE Institute of Electrical and Electronics Engineers KF Kalman Filter KL Kullback-Leibler LAMBDA Least-squares... testing the algorithms for the LOS AN wireless beamforming. Given a good set of feature points, the ego-motion is sufficiently accurate to... of little value to the overall SLAM and the RSS observables are used instead. While individual RSS measurements are low in information value, the

  12. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    NASA Astrophysics Data System (ADS)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.

  13. A microfluidic thermometer: Precise temperature measurements in microliter- and nanoliter-scale volumes

    PubMed Central

    McKenzie, Brittney A.

    2017-01-01

    Measuring the temperature of a sample is a fundamental need in many biological and chemical processes. When the volume of the sample is on the microliter or nanoliter scale (e.g., cells, microorganisms, precious samples, or samples in microfluidic devices), accurate measurement of the sample temperature becomes challenging. In this work, we demonstrate a technique for accurately determining the temperature of microliter volumes using a simple 3D-printed microfluidic chip. We accomplish this by first filling “microfluidic thermometer” channels on the chip with substances with precisely known freezing/melting points. We then use a thermoelectric cooler to create a stable and linear temperature gradient along these channels within a measurement region on the chip. A custom software tool (available as online Supporting Information) is then used to find the locations of solid-liquid interfaces in the thermometer channels; these locations have known temperatures equal to the freezing/melting points of the substances in the channels. The software then uses the locations of these interfaces to calculate the temperature at any desired point within the measurement region. Using this approach, the temperature of any microliter-scale on-chip sample can be measured with an uncertainty of about a quarter of a degree Celsius. As a proof-of-concept, we use this technique to measure the unknown freezing point of a 50 microliter volume of solution and demonstrate its feasibility on a 400 nanoliter sample. Additionally, this technique can be used to measure the temperature of any on-chip sample, not just near-zero-Celsius freezing points. We demonstrate this by using an oil that solidifies near room temperature (coconut oil) in a microfluidic thermometer to measure on-chip temperatures well above zero Celsius. 
By providing a low-cost and simple way to accurately measure temperatures in small volumes, this technique should find applications in both research and educational laboratories. PMID:29284028
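The last step in the abstract, recovering the temperature at any point from the interface locations, is ordinary linear interpolation along the gradient. A minimal sketch; the positions and the n-dodecane reference substance below are illustrative assumptions, not values from the paper:

```python
def temp_at(x, x1, t1, x2, t2):
    """Temperature at position x, assuming a linear gradient pinned at two
    solid-liquid interfaces with known melting points t1 and t2."""
    return t1 + (t2 - t1) * (x - x1) / (x2 - x1)

# e.g. a water interface (0.0 degC) observed at 2.0 mm along the channel and
# an n-dodecane interface (-9.6 degC, its melting point) at 6.0 mm:
print(temp_at(4.0, 2.0, 0.0, 6.0, -9.6))  # midpoint → -4.8
```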

  14. Concept for an off-line gain stabilisation method.

    PubMed

    Pommé, S; Sibbens, G

    2004-01-01

    Conceptual ideas are presented for an off-line gain stabilisation method for spectrometry, in particular for alpha-particle spectrometry at low count rate. The method involves list mode storage of individual energy and time stamp data pairs. The 'Stieltjes integral' of measured spectra with respect to a reference spectrum is proposed as an indicator for gain instability. 'Exponentially moving averages' of the latter show the gain shift as a function of time. With this information, the data are relocated stochastically on a point-by-point basis.
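The smoothing-and-relocation idea can be sketched as follows. This is a simplified illustration only: the Stieltjes-integral indicator is not implemented here, the gain series is synthetic, and the paper's stochastic point-by-point relocation is replaced by deterministic division.

```python
def ema_series(values, alpha=0.05):
    """Exponentially moving average of a per-event gain-shift indicator."""
    out, m = [], values[0]
    for v in values:
        m = alpha * v + (1 - alpha) * m
        out.append(m)
    return out

def relocate(energies, gains, reference=1.0):
    """Relocate each event's energy using the smoothed instantaneous gain
    estimate (deterministic here; the method relocates stochastically)."""
    return [e * reference / g for e, g in zip(energies, gains)]

# Hypothetical drift: gain creeps from 1.000 to 1.020 over 1000 list-mode events
gains_raw = [1.0 + 0.02 * i / 999 for i in range(1000)]
smoothed = ema_series(gains_raw)
energies = [100.0 * g for g in gains_raw]   # a 100-unit peak dragged by drift
corrected = relocate(energies, smoothed)    # peak restored near 100
```

The list-mode (energy, time-stamp) storage is what makes this possible off-line: the gain estimate at each event's time stamp is applied to that event individually.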

  15. Upper Atmosphere Research Report Number 21. Summary of Upper Atmosphere Rocket Research Firings

    DTIC Science & Technology

    1954-02-01

computer. The sky screens are essentially theodolites which view the rocket through a pair of crossed rods which are driven closed by an electric motor...positions are electrically measured and fed into a computer. The computer continuously predicts the point of impact of the rocket...were its thrust...Without such equipment it is necessary to rely on optical 'fixes', sound ranging, or the Impact Point Computer to provide such information. In the early

  16. Tools and data acquisition of borehole geophysical logging for the Florida Power and Light Company Turkey Point Power Plant in support of a groundwater, surface-water, and ecological monitoring plan, Miami-Dade County, Florida

    USGS Publications Warehouse

    Wacker, Michael A.

    2010-01-01

    Borehole geophysical logs were obtained from selected exploratory coreholes in the vicinity of the Florida Power and Light Company Turkey Point Power Plant. The geophysical logging tools used and logging sequences performed during this project are summarized herein to include borehole logging methods, descriptions of the properties measured, types of data obtained, and calibration information.

  17. Reflections on Earth--Remote-Sensing Research from Your Classroom.

    ERIC Educational Resources Information Center

    Campbell, Bruce A.

    2001-01-01

    Points out the uses of remote sensing in different areas, and introduces the program "Reflections on Earth" which provides access to basic and instructional information on remote sensing to students and teachers. Introduces students to concepts related to remote sensing and measuring distances. (YDS)

  18. Accuracy improvement in a calibration test bench for accelerometers by a vision system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D’Emilia, Giulio, E-mail: giulio.demilia@univaq.it; Di Gasbarro, David, E-mail: david.digasbarro@graduate.univaq.it; Gaspari, Antonella, E-mail: antonella.gaspari@graduate.univaq.it

    2016-06-28

A procedure is described in this paper for improving the accuracy of calibration of low-cost accelerometers in a prototype rotary test bench, driven by a brushless servo-motor and operating in a low frequency range of vibrations (0 to 5 Hz). Vibration measurements by a vision system based on a low frequency camera have been carried out in order to reduce the uncertainty of the real acceleration evaluation at the installation point of the sensor to be calibrated. A preliminary test device has been realized and operated in order to evaluate the metrological performance of the vision system, showing satisfactory behavior when the measurement uncertainty is taken into account. A combination of suitable settings of the control parameters of the motion control system and of the information gained by the vision system made it possible to fit the information about the reference acceleration at the installation point to the needs of the procedure for static and dynamic calibration of three-axis accelerometers.

  19. Prenatal drug use and the production of infant health.

    PubMed

    Noonan, Kelly; Reichman, Nancy E; Corman, Hope; Dave, Dhaval

    2007-04-01

    We estimate the effect of illicit drug use during pregnancy on two measures of poor infant health: low birth weight and abnormal infant health conditions. We use data from a national longitudinal study of urban parents that includes postpartum interviews with mothers, hospital medical record data on the mothers and their newborns, and information about the neighborhood in which the mother resides. We address the potential endogeneity of prenatal drug use. Depending on how prenatal drug use is measured, we find that it increases low birth weight by 4-6 percentage points and that it increases the likelihood of an abnormal infant health condition by 7-12 percentage points. Copyright (c) 2006 John Wiley & Sons, Ltd.

  20. Measurement Challenges in International Agreements

    NASA Astrophysics Data System (ADS)

    Luke, John

    2006-10-01

Making measurements in support of international agreements can pose many challenges from both a policy and a science point of view. Policy issues may arise because physics measurements made in the area of arms control or disarmament may be deemed too intrusive, since they could reveal sensitive information about the material being interrogated. Therefore, agreements must include a framework for safeguarding against the potential release of this information. Most of the scientific issues center on the fact that it is desirable to make high-quality measurements without any operator interaction. This leads to the development of instrumentation and software that are very stable and robust. Because of these different concerns, policy and science priorities may be at odds with one another. It is therefore the scientist's challenge in this field to keep policy makers informed by conveying what is technically possible and what is not, in a manner that is easily understood and negotiable. In this paper we discuss some of the technology that has been developed to address these challenges in various international and model agreements. We discuss the principle of the information barrier used in these measurement technologies to safeguard against the release of sensitive information. We also discuss some of the pitfalls that may arise when policy is ill informed about the physical constraints on measurements of nuclear materials.

  1. Measurement of Dam Deformations: Case Study of Obruk Dam (Turkey)

    NASA Astrophysics Data System (ADS)

    Gulal, V. Engin; Alkan, R. Metin; Alkan, M. Nurullah; İlci, Veli; Ozulu, I. Murat; Tombus, F. Engin; Kose, Zafer; Aladogan, Kayhan; Sahin, Murat; Yavasoglu, Hakan; Oku, Guldane

    2016-04-01

In the literature, there is information regarding the first deformation and displacement measurements in dams, conducted in Switzerland in the 1920s. Today, deformation measurements in dams have gained very different functions with improvements in both measurement equipment and the evaluation of measurements. Deformation measurements and analysis are among the main topics studied by scientists who take interest in the engineering measurement sciences. The Working Group on Deformation Measurements and Analysis, which was established under the International Federation of Surveyors (FIG), carries out its studies and activities with regard to this subject. At the end of the 1970s, the determination of fixed points in the deformation monitoring network was one of the main subjects extensively studied. Many theories arose from this inquiry, as different institutes came to differing conclusions. In 1978, a special commission with representatives of universities was established within the FIG 6.1 working group; this commission worked on determining a general approach to geometric deformation analysis. The results gleaned from the commission were discussed at symposia organized by the FIG. In accordance with these studies, scientists interested in the subject began to work on models that investigate the cause-and-effect relations between the effects that cause deformation and the deformation itself. As scientists interested in the issue focused on different deformation methods, another special commission was established within the FIG engineering measurements commission in order to classify deformation models and study terminology. After studying this material for a long time, the official commission report was published in 2001. 
In the present study, work was carried out by considering the FIG Engineering Surveying Commission's report entitled 'Models and Terminology for the Analysis of Geodetic Monitoring Observations'. In October 2015, geodetic deformation measurements were conducted in the Çorum province of Turkey, following the FIG reports on deformation measurements and the German DIN 18710 Engineering Measurements norms. The main purpose of the study is to determine the optimum measurement and evaluation methods for specifying movements in the horizontal and vertical directions for the fill dam. For this purpose: • In a reference network consisting of 8 points, measurements were performed using dual-frequency GNSS receivers over 8-hour sessions. • GNSS measurements were conducted for between 30 and 120 minutes at the 44 object points on the body of the dam. • Two repeated real-time kinematic (RTK) GNSS measurements were conducted at the object points on the dam. • Geometric leveling measurements were performed between reference and object points. • Trigonometric leveling measurements were performed between reference and object points. • Polar measurements were performed between reference and object points. The 8-hour GNSS measurements at the reference points of the monitoring network were evaluated with the GAMIT software, tied to the IGS points in the region. In this manner, regional and local movements in the network can be determined. By evaluating the GNSS measurements performed on the body of the dam, the aim is to determine the measurement period that will provide the 1-2 mm accuracy expected in the local GNSS network. Results will be compared by cross-checking the GNSS and terrestrial measurements. This study will also investigate whether increased accuracy is provided by GNSS measurements carried out among reference points without line of sight.

  2. Atmospheric neutral points outside of the principal plane. [points of vanished skylight polarization

    NASA Technical Reports Server (NTRS)

    Fraser, R. S.

    1981-01-01

It is noted that the positions in the sky where the skylight is unpolarized, that is, the neutral points, are in most cases located in the vertical plane through the sun (the principal plane). Points have been observed outside the principal plane (Soret, 1888) when the plane intersected a lake or sea. Here, the neutral points were located at an azimuth of about 15 deg from the sun and near the almucantar through the sun. In order to investigate the effects of the water surface and aerosols on the neutral point positions, the positions are computed for models of the earth-atmosphere system that simulate the observational conditions. The computed and measured positions are found to agree well. While previous observations provided only qualitative information on the degree of polarization, it is noted that the computations provide details concerning the polarization parameters.

  3. Information System and Geographic Information System Tools in the Data Analyses of the Control Program for Visceral Leishmaniases from 2006 to 2010 in the Sanitary District of Venda Nova, Belo Horizonte, Minas Gerais, Brazil

    PubMed Central

    Saraiva, Lara; Leite, Camila Gonçalves; de Carvalho, Luiz Otávio Alves; Andrade Filho, José Dilermando; de Menezes, Fernanda Carvalho; Fiúza, Vanessa de Oliveira Pires

    2012-01-01

The aim of this paper is to report a brief history of control actions for Visceral Leishmaniasis (VL) from 2006 to 2010 in the Sanitary District (DS) of Venda Nova, Belo Horizonte, Minas Gerais, Brazil, focusing on the use of information systems and Geographic Information System (GIS) tools. The analyses showed that the use of an automated database allied with geoprocessing tools may favor control measures for VL, especially with regard to the evaluation of the control actions carried out. Descriptive analyses of control measures showed that the information system and GIS tools promoted greater efficiency in decision making and activity planning. These analyses also pointed to the necessity of new approaches to the control of VL in large urban centers. PMID:22518168

  4. Measles elimination – review of event notifications sent to National IHR Focal Point between 2010 and 2016

    PubMed

    Izdebski, Radosław; Henszel, Łukasz; Janiec, Janusz; Radziszewski, Franciszek

The Member States of the World Health Organization (WHO), in accordance with the International Health Regulations (2005), were obliged to appoint National IHR Focal Points (N IHR FP), whose tasks include obtaining information concerning public health emergencies of international concern occurring abroad or within the country. The aim of this work is to review the notifications related to measles received by the National IHR Focal Point in Poland from 2010 to 2016 from the WHO, the ECDC, National IHR Focal Points of other WHO Member States, and the State Sanitary Inspection. During this period the N IHR FP was informed about 79 events related to measles: 36 related to outbreaks in different countries, 27 concerning individual cases, 14 related to exposure to a measles case during air travel, and two concerning the implementation of MMR vaccination programs. Despite the progress in implementing the measures included in measles elimination programs in Europe, there was a significant increase in the number of measles cases and outbreaks, particularly in the years 2010-2011.

  5. Semi-physical parameter identification for an iron-loss formula allowing loss-separation

    NASA Astrophysics Data System (ADS)

    Steentjes, S.; Leßmann, M.; Hameyer, K.

    2013-05-01

    This paper presents a semi-physical parameter identification for a recently proposed enhanced iron-loss formula, the IEM-Formula. Measurements are performed on a standardized Epstein frame by the conventional field-metric method under sinusoidal magnetic flux densities up to high magnitudes and frequencies. Quasi-static losses are identified on the one hand by point-by-point dc-measurements using a flux-meter and on the other hand by extrapolating higher frequency measurements to dc magnetization using the statistical loss-separation theory (Jacobs et al., "Magnetic material optimization for hybrid vehicle PMSM drives," in Inductica Conference, CD-Rom, Chicago/USA, 2009). Utilizing this material information, possibilities to identify the parameter of the IEM-Formula are analyzed. Along with this, the importance of excess losses in present-day non-grain oriented Fe-Si laminations is investigated. In conclusion, the calculated losses are compared to the measured losses.
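The extrapolation-to-dc route to the quasi-static loss can be sketched as a linear fit of per-cycle loss versus frequency. This is an illustrative simplification with synthetic data: it uses only the hysteresis and classical terms of the loss separation and neglects the excess-loss (sqrt(f)) term discussed in the abstract.

```python
def quasi_static_loss(freqs_hz, loss_per_cycle):
    """Least-squares line W(f) = W_h + c*f through per-cycle loss data;
    the intercept W_h approximates the quasi-static (hysteresis) loss.
    Excess losses are neglected in this simplified sketch."""
    n = len(freqs_hz)
    mx = sum(freqs_hz) / n
    my = sum(loss_per_cycle) / n
    sxy = sum((f - mx) * (w - my) for f, w in zip(freqs_hz, loss_per_cycle))
    sxx = sum((f - mx) ** 2 for f in freqs_hz)
    c = sxy / sxx
    return my - c * mx  # intercept at f -> 0

# Synthetic Epstein-frame data: W_h = 0.020 J/kg, classical term 1e-4 J/kg per Hz
data_f = [50.0, 100.0, 200.0, 400.0]
data_w = [0.020 + 1e-4 * f for f in data_f]
print(round(quasi_static_loss(data_f, data_w), 4))  # → 0.02
```

In the paper this extrapolated value is cross-checked against direct point-by-point dc measurements with a flux-meter; here the synthetic data make the two routes trivially agree.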

  6. [Comparative quality measurements part 3: funnel plots].

    PubMed

    Kottner, Jan; Lahmann, Nils

    2014-02-01

Comparative quality measurements between organisations or institutions are common. Quality measures need to be standardised and risk-adjusted, and random error must also be taken adequately into account. Rankings that ignore precision lead to flawed interpretations and encourage "gaming". Applying confidence intervals is one way to take chance variation into account. Funnel plots are modified control charts based on Statistical Process Control (SPC) theory. The quality measures are plotted against their sample size, and warning and control limits at 2 or 3 standard deviations from the centre line are added. With increasing group size the precision increases, so the control limits form a funnel. Data points within the control limits are considered to show common-cause variation; data points outside them indicate special-cause variation, without the focus on spurious rankings. Funnel plots offer data-based information for evaluating institutional performance within quality management contexts.
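The limit construction described above can be sketched for a proportion-type quality indicator. This uses the common normal approximation for a binomial proportion; exact binomial limits are also used in practice, and the indicator and numbers below are hypothetical.

```python
import math

def funnel_limits(p, n, z=3.0):
    """Lower/upper funnel-plot limits for a proportion indicator with
    overall rate p and group size n (z = 2 for warning, z = 3 for control)."""
    se = math.sqrt(p * (1.0 - p) / n)  # limits narrow as n grows: the funnel
    return max(0.0, p - z * se), min(1.0, p + z * se)

def special_cause(p_overall, events, n, z=3.0):
    """True if an institution's observed rate falls outside the funnel."""
    lo, hi = funnel_limits(p_overall, n, z)
    return not (lo <= events / n <= hi)

# Overall rate 10%: a unit with 25/100 is flagged, one with 12/100 is not
print(special_cause(0.10, 25, 100), special_cause(0.10, 12, 100))  # → True False
```

Plotting each institution's rate against its n, with these curves overlaid, reproduces the funnel shape: small units get wide limits, large units narrow ones, so no one is flagged merely for being small.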

  7. Comparison of Scientific Calipers and Computer-Enabled CT Review for the Measurement of Skull Base and Craniomaxillofacial Dimensions

    PubMed Central

    Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.

    2001-01-01

Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. PMID:17167599

  8. Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration*

    PubMed Central

    Wang, Fei; Syeda-Mahmood, Tanveer; Vemuri, Baba C.; Beymer, David; Rangarajan, Anand

    2010-01-01

    In this paper, we propose a generalized group-wise non-rigid registration strategy for multiple unlabeled point-sets of unequal cardinality, with no bias toward any of the given point-sets. To quantify the divergence between the probability distributions – specifically Mixture of Gaussians – estimated from the given point sets, we use a recently developed information-theoretic measure called Jensen-Renyi (JR) divergence. We evaluate a closed-form JR divergence between multiple probabilistic representations for the general case where the mixture models differ in variance and the number of components. We derive the analytic gradient of the divergence measure with respect to the non-rigid registration parameters, and apply it to numerical optimization of the group-wise registration, leading to a computationally efficient and accurate algorithm. We validate our approach on synthetic data, and evaluate it on 3D cardiac shapes. PMID:20426043
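The closed forms used by the JR divergence rest on the Gaussian product integral, int N(x; mi, vi) N(x; mj, vj) dx = N(mi; mj, vi + vj). As an illustration of that ingredient only (1-D, not the paper's full multi-set divergence or its gradient), the quadratic (alpha = 2) Renyi entropy of a Gaussian mixture can be evaluated exactly:

```python
import math

def norm_pdf(x, mu, var):
    """Univariate normal density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def renyi2_entropy_gmm(weights, means, variances):
    """Closed-form quadratic Renyi entropy H2 = -log int p(x)^2 dx of a 1-D
    Gaussian mixture, via the Gaussian product integral
    int N(x; mi, vi) N(x; mj, vj) dx = N(mi; mj, vi + vj)."""
    s = sum(wi * wj * norm_pdf(mi, mj, vi + vj)
            for wi, mi, vi in zip(weights, means, variances)
            for wj, mj, vj in zip(weights, means, variances))
    return -math.log(s)

# Sanity check: a single unit-variance Gaussian gives H2 = 0.5*log(4*pi)
print(renyi2_entropy_gmm([1.0], [0.0], [1.0]))
```

Because every pairwise term is analytic, no sampling or numerical integration is needed; this is the property that makes the JR divergence and its gradient tractable for registration.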

  9. Closed-form Jensen-Renyi divergence for mixture of Gaussians and applications to group-wise shape registration.

    PubMed

    Wang, Fei; Syeda-Mahmood, Tanveer; Vemuri, Baba C; Beymer, David; Rangarajan, Anand

    2009-01-01

    In this paper, we propose a generalized group-wise non-rigid registration strategy for multiple unlabeled point-sets of unequal cardinality, with no bias toward any of the given point-sets. To quantify the divergence between the probability distributions--specifically Mixture of Gaussians--estimated from the given point sets, we use a recently developed information-theoretic measure called Jensen-Renyi (JR) divergence. We evaluate a closed-form JR divergence between multiple probabilistic representations for the general case where the mixture models differ in variance and the number of components. We derive the analytic gradient of the divergence measure with respect to the non-rigid registration parameters, and apply it to numerical optimization of the group-wise registration, leading to a computationally efficient and accurate algorithm. We validate our approach on synthetic data, and evaluate it on 3D cardiac shapes.

  10. Finding online health-related information: usability issues of health portals.

    PubMed

    Gurel Koybasi, Nergis A; Cagiltay, Kursat

    2012-01-01

As the Internet and computers become widespread, health portals offering online health-related information become more popular. The most important point for health portals is presenting reliable and valid information. In addition, a portal needs to be usable in order to serve information to users effectively. This study aims to determine the usability issues that emerge when health-related information is searched for on a health portal. User-based usability tests were conducted, and eye movement analyses were used in addition to traditional performance measures. Results revealed that users prefer systematic, simple and consistent designs offering interactive tools. Moreover, content and partitions need to be shaped according to the medical knowledge of the target users.

  11. High Precision Edge Detection Algorithm for Mechanical Parts

    NASA Astrophysics Data System (ADS)

    Duan, Zhenyun; Wang, Ning; Fu, Jingshun; Zhao, Wenhui; Duan, Boqiang; Zhao, Jungui

    2018-04-01

High-precision and high-efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on the Gaussian integral model is proposed. For this purpose, the step-edge normal-section-line Gaussian integral model of the backlight image is constructed, combining the point spread function and the single step model. Then the gray values of discrete points on the normal section line of the pixel edge are calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted in accordance with the Gaussian integral model. A precise subpixel edge location is then determined by searching for the mean point. Finally, a gear tooth was measured with an M&M3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and the subpixel edge location accuracy and computation speed are improved. The maximum error of gear tooth profile total deviation is 1.9 μm compared with the measurement result from the gear measurement center, indicating that the method is reliable enough to meet the requirement of high-precision measurement.
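The Gaussian integral model of a step edge is an ideal step convolved with a Gaussian point spread function, i.e. an error-function profile. The sketch below is a much-simplified 1-D stand-in for the paper's method: it generates such a profile and locates the edge at the half-amplitude ("mean point") crossing by linear interpolation, without the surface interpolation or noise fitting of the actual algorithm. All parameter values are hypothetical.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def edge_profile(xs, x0, sigma, lo=10.0, hi=200.0):
    """Ideal step edge at x0 blurred by a Gaussian PSF of width sigma
    (the Gaussian-integral model of a backlight image edge)."""
    return [lo + (hi - lo) * phi((x - x0) / sigma) for x in xs]

def subpixel_edge(xs, ys):
    """Locate the edge as the half-amplitude (mean-point) crossing,
    linearly interpolated between neighboring pixel samples."""
    half = (min(ys) + max(ys)) / 2.0
    for i in range(len(ys) - 1):
        if (ys[i] - half) * (ys[i + 1] - half) <= 0:
            t = (half - ys[i]) / (ys[i + 1] - ys[i])
            return xs[i] + t * (xs[i + 1] - xs[i])

xs = list(range(20))
ys = edge_profile(xs, x0=7.3, sigma=1.2)
print(round(subpixel_edge(xs, ys), 2))  # close to the true edge at 7.3
```

Even this crude half-crossing already recovers the edge to a few hundredths of a pixel on noiseless data; the paper's fitting of the full Gaussian integral model is what keeps that accuracy under noise.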

  12. Far and Wide - Microbial Bebop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peter Larsen

    2012-10-01

This musical composition was created from data of microbes (bacteria, algae and other microorganisms) sampled in the English Channel. Argonne National Laboratory biologist Peter Larsen created the songs as a unique way to present and comprehend large datasets. Microbial species of the Order Rickettsiales, such as the highly abundant, free-living planktonic species Pelagibacter ubique, are typically highly abundant taxa in L4 Station data. Their relative abundance in the microbial community at L4 Station follows a distinctive seasonal pattern. In this composition, there are two chords per measure, generated from photosynthetically active radiation measurements and temperature. The melody of each measure is six notes that describe the relative abundance of the Order Rickettsiales. The first note of each measure is from the relative abundance at a time point. The next five notes of a measure follow one of the following patterns: a continuous rise in pitch, a continuous drop in pitch, a rise then drop in pitch, or a drop then rise in pitch. These patterns are matched to the relative abundance of Rickettsiales at the given time point, relative to the previous and subsequent time points. The pattern of notes in a measure is mapped to the relative abundance of Rickettsiales, with fewer rests per measure indicating higher abundance. For time points at which Rickettsiales was the most abundant microbial taxon, the corresponding measure is highlighted with a cymbal crash. More information at http://www.anl.gov/articles/songs-key... Image: Diatoms under a microscope: These tiny phytoplankton are encased within a silicate cell wall. Credit: Prof. Gordon T. Taylor, Stony Brook University
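The abundance-to-melody mapping can be sketched in miniature. This is not Larsen's actual composition rules: the pitch set is hypothetical and the contour choice is a deliberately simple reading of "rise / drop / rise-drop / drop-rise relative to the neighboring time points".

```python
SCALE = ["C4", "D4", "E4", "G4", "A4", "C5"]  # hypothetical six-pitch set

def note_for(value, vmin, vmax):
    """Map a relative-abundance value onto a pitch: higher abundance, higher note."""
    frac = (value - vmin) / (vmax - vmin)
    return SCALE[min(len(SCALE) - 1, int(frac * len(SCALE)))]

def contour_for(prev, cur, nxt):
    """Choose the within-measure contour from the neighboring time points."""
    if prev <= cur <= nxt:
        return "rise"
    if prev >= cur >= nxt:
        return "drop"
    return "rise-drop" if cur >= max(prev, nxt) else "drop-rise"
```

Iterating these two functions over a time series of relative abundances yields one opening note and one five-note contour per measure, which is the essence of the sonification described above.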

  13. Experiments with Cholesteric Liquid Crystals

    ERIC Educational Resources Information Center

    Fergason, James L.

    1970-01-01

    Describes laboratory experiments designed to demonstrate (1) the properties of cholesteric liquid crystals, (2) thermal mapping, (3) thermal diffusivity, (4) adiabatic expansion of rubber, and (5) measurement of radiated energy by a point source. Contains all of the information on materials and apparatus needed to perform the experiments.…

  14. Metrication, American Style. Fastback 41.

    ERIC Educational Resources Information Center

    Izzi, John

    The purpose of this pamphlet is to provide a starting point of information on the metric system for any concerned or interested reader. The material is organized into five brief chapters: Man and Measurement; Learning the Metric System; Progress Report: Education; Recommended Sources; and Metrication, American Style. Appendixes include an…

  15. 18 CFR 5.18 - Application content.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... other reference point; describe the topography and climate; and discuss major land uses and economic... development of project works or changes in project operation. This analysis must be based on the information... environmental measures, including, but not limited to, changes in the project design or operations, to address...

  16. Cultivation of personality awareness - The starting point of thinking and politics in colleges and universities in the current network information

    NASA Astrophysics Data System (ADS)

    Shen, Jie

    2018-03-01

With the continuous development of network technology, the network information age has promoted the orderly development of ideological and political education in colleges and universities. It can effectively improve students' political accomplishment, continuously broaden the approaches to ideological and political education in colleges and universities, and provide more information platforms for ideological and political work. This article elaborates on the cultivation of personality awareness in college ideological and political work in the network age and puts forward corresponding measures.

  17. Information-reality complementarity: The role of measurements and quantum reference frames

    NASA Astrophysics Data System (ADS)

    Dieguez, P. R.; Angelo, R. M.

    2018-02-01

    Recently, a measure has been put forward which allows for the quantification of the degree of reality of an observable for a given preparation [Bilobran and Angelo, Europhys. Lett. 112, 40005 (2015), 10.1209/0295-5075/112/40005]. Here we employ this quantifier to establish, on formal grounds, relations among the concepts of measurement, information, and physical reality. After introducing mathematical objects that unify weak and projective measurements, we study scenarios showing that an arbitrary-intensity unrevealed measurement of a given observable generally leads to an increase of its reality and also of its incompatible observables. We derive a complementarity relation connecting an amount of information associated with the apparatus with the degree of irreality of the monitored observable. Specifically for pure states, we show that the entanglement with the apparatus precisely determines the amount by which the reality of the monitored observable increases. We also point out some mechanisms whereby the irreality of an observable can be generated. Finally, using the aforementioned tools, we construct a consistent picture to address the measurement problem.

  18. Measuring Concentrations of Particulate 140La in the Air

    DOE PAGES

    Okada, Colin E.; Kernan, Warnick J.; Keillor, Martin E.; ...

    2016-05-01

    Air sampling systems were deployed to measure the concentration of radioactive material in the air during the Full-Scale Radiological Dispersal Device experiments. The air samplers were positioned 100-600 meters downwind of the release point. The filters were collected immediately and analyzed in the field. Quantities for total activity collected on the air filters are reported along with additional information to compute the average or integrated air concentrations.
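The conversion from filter activity to air concentration mentioned above is a simple volume normalization. The sketch below is an illustration with hypothetical numbers; it ignores collection efficiency and decay corrections, which a real analysis of 140La would apply.

```python
def average_air_concentration(activity_bq, flow_l_per_min, minutes):
    """Average airborne concentration (Bq/m^3) from the total activity on a
    filter, the sampler flow rate (L/min), and the collection time.
    Collection efficiency and decay corrections are ignored in this sketch."""
    sampled_volume_m3 = flow_l_per_min * minutes / 1000.0  # litres -> m^3
    return activity_bq / sampled_volume_m3

# Hypothetical: 50 Bq collected at 60 L/min for 10 min (0.6 m^3 of air)
print(round(average_air_concentration(50.0, 60.0, 10.0), 2))  # → 83.33
```

The integrated concentration (Bq·s/m^3) follows from the same measured quantities by multiplying the average concentration by the sampling duration.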

  19. Internet Searching About Disease Elicits a Positive Perception of Own Health When Severity of Illness Is High: A Longitudinal Questionnaire Study.

    PubMed

    Sassenberg, Kai; Greving, Hannah

    2016-03-04

The Internet is one of the primary sources of health information. However, the effects of Internet use on the perception of one's own health have not received much research attention so far. This study tested how Internet use for acquiring health information and severity of illness influence patients with a chronic disease with regard to the perception of their own health. Negative psychological states are known to lead to preferential processing of positive information, and the self-directed nature of Internet use in particular provides room for such biases. Therefore, we predicted that patients experiencing negative health states more frequently, due to more frequent episodes of a chronic illness, would gain a more positive perception of their health if they used the Internet frequently to obtain health information, but not if they used the Internet rarely. This effect was not expected for other sources of information. A longitudinal questionnaire study with two measurement points, separated by a 7-month time lag, tested the hypothesis in a sample of patients with chronic inflammatory bowel disease (n=208). The study assessed patients' frequency of Internet use, their participation in online social support groups, their use of other sources of health information, and several indicators of the participants' perceptions of their own health. A structural equation model (SEM) was used to test the predictions separately for Internet searches and other sources of information. Data analysis supported the prediction: the interaction between frequency of health-related information searches and frequency of episodes at the first measurement point (T1) was related to participants' positive perceptions of their own health at the second measurement point (T2) (B=.10, SE=.04, P=.02), above and beyond the perceptions of their own health at T1.
When participants used the Internet relatively rarely (-1 SD), there was no relationship between frequency of episodes and positive perceptions of their own health (B=-.11, SE=.14, t203=-0.82, P=.41). In contrast, when participants used the Internet relatively often (+1 SD), the more frequently they had those episodes, the more positive were their perceptions of their own health (B=.36, SE=.15, t203=2.43, P=.02). Additional SEM analyses revealed that this effect occurs exclusively when information is searched for on the Internet, but not when other sources of information are consulted, nor when online social support groups are joined. The results of this study suggest that patients might process information from the Internet selectively, in an unbalanced, biased fashion, forming a self-serving (ie, positive) perception of their own health. At the same time, this bias contributes to patients' ability to cope psychologically with their disease.
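The ±1 SD probing reported above follows the standard simple-slopes decomposition of a moderated regression. The sketch below is a reconstruction, not the authors' analysis: with the reported interaction coefficient (B = .10) and the two reported slopes (-.11 and .36), it assumes a mean-centered moderator with b_focal = .125 and SD ≈ 2.35, values inferred here for illustration.

```python
def simple_slope(b_focal, b_interaction, moderator_value):
    """Slope of the focal predictor at a fixed (centered) moderator value in
    y = b0 + b_focal*x + b_mod*z + b_interaction*x*z."""
    return b_focal + b_interaction * moderator_value

# Hypothetical reconstruction of the reported -1 SD and +1 SD slopes
slope_low = simple_slope(0.125, 0.10, -2.35)   # ≈ -.11
slope_high = simple_slope(0.125, 0.10, +2.35)  # ≈ .36
```

The interaction term thus says nothing about either slope alone; it only fixes how fast the focal slope changes per unit of the moderator, which is why both conditional slopes must be probed explicitly.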

  20. Breadth of Coverage, Ease of Use, and Quality of Mobile Point-of-Care Tool Information Summaries: An Evaluation

    PubMed Central

    Ren, Jinma

    2016-01-01

Background With advances in mobile technology, the accessibility of clinical resources at the point of care has increased. Objective The objective of this research was to identify whether six selected mobile point-of-care tools meet the needs of clinicians in internal medicine. Point-of-care tools were evaluated for breadth of coverage, ease of use, and quality. Methods Six point-of-care tools were evaluated on four different devices (two smartphones and two tablets). Breadth of coverage was measured using select International Classification of Diseases, Ninth Revision, codes, checking whether information on summary, etiology, pathophysiology, clinical manifestations, diagnosis, treatment, and prognosis was provided. Quality measures included treatment and diagnostic inline references and individual and application time stamping. Ease of use covered search within topic, table of contents, scrolling, affordance, connectivity, and personal accounts. Analysis of variance based on the ranks of scores was used. Results Breadth of coverage was similar among Medscape (mean 6.88), UpToDate (mean 6.51), DynaMedPlus (mean 6.46), and Essential Evidence Plus (mean 6.41) (P>.05), with DynaMed (mean 5.53) and Epocrates (mean 6.12) scoring significantly lower (P<.05). For ease of use, DynaMedPlus had the highest score and Essential Evidence Plus the lowest (6.0 vs 4.0, respectively, P<.05). For quality, reviewers gave the same score (4.00) to all tools except Medscape, which was rated lower (P<.05). Conclusions For breadth of coverage, most point-of-care tools were similar, with the exception of DynaMed. For ease of use, only UpToDate and DynaMedPlus allow search within a topic. All point-of-care tools have remote access, with the exception of UpToDate and Essential Evidence Plus. All tools except Medscape covered the criteria for quality evaluation. Overall, there was no significant difference between the point-of-care tools with regard to coverage of common topics used by internal medicine clinicians.
Selection of point-of-care tools is highly dependent on individual preference based on ease of use and cost of the application. PMID:27733328

  1. Reduced vision selectively impairs spatial updating in fall-prone older adults.

    PubMed

    Barrett, Maeve M; Doheny, Emer P; Setti, Annalisa; Maguinness, Corrina; Foran, Timothy G; Kenny, Rose Anne; Newell, Fiona N

    2013-01-01

    The current study examined the role of vision in spatial updating and its potential contribution to an increased risk of falls in older adults. Spatial updating was assessed using a path integration task in fall-prone and healthy older adults. Specifically, participants conducted a triangle completion task in which they were guided along two sides of a triangular route and were then required to return, unguided, to the starting point. During the task, participants could either clearly view their surroundings (full vision) or visuo-spatial information was reduced by means of translucent goggles (reduced vision). Path integration performance was measured by calculating the distance and angular deviation from the participant's return point relative to the starting point. Gait parameters for the unguided walk were also recorded. We found equivalent performance across groups on all measures in the full vision condition. In contrast, in the reduced vision condition, where participants had to rely on interoceptive cues to spatially update their position, fall-prone older adults made significantly larger distance errors relative to healthy older adults. However, there were no other performance differences between fall-prone and healthy older adults. These findings suggest that fall-prone older adults, compared to healthy older adults, have greater difficulty in reweighting other sensory cues for spatial updating when visual information is unreliable.

  2. Open-loop measurement of data sampling point for SPM

    NASA Astrophysics Data System (ADS)

    Wang, Yueyu; Zhao, Xuezeng

    2006-03-01

SPM (Scanning Probe Microscope) provides "three-dimensional images" with nanometer-level resolution, and some SPMs can be used as metrology tools. However, SPM images are commonly distorted by non-ideal properties of the SPM's piezoelectric scanner, which reduces metrological accuracy and data repeatability. To eliminate this limitation, an "open-loop sampling" method is presented. In this method, the positions of the sampling points in all three directions on the sample surface are measured by the position sensor and recorded in the SPM image file, which replaces the image file from a conventional SPM. Because the positions in the X and Y directions are measured at the same time as the height information in the Z direction is sampled, the image distortion caused by scanner locating error can be reduced by a proper image processing algorithm.
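The correction step described above amounts to scattered-data regridding: the recorded (x, y, z) position samples are interpolated onto a regular grid, so the image no longer inherits the scanner's positioning error. A minimal sketch (function name and grid sizes are ours, not from the paper), assuming SciPy is available:

```python
import numpy as np
from scipy.interpolate import griddata

def regrid_open_loop(x, y, z, nx=256, ny=256):
    """Resample open-loop SPM samples (measured x, y positions and
    heights z) onto a regular grid, removing scanner-position distortion."""
    xi = np.linspace(x.min(), x.max(), nx)
    yi = np.linspace(y.min(), y.max(), ny)
    XI, YI = np.meshgrid(xi, yi)
    ZI = griddata((x, y), z, (XI, YI), method="linear")  # NaN outside hull
    return XI, YI, ZI
```

Linear interpolation reproduces any planar surface exactly, so residual error on a regridded image comes only from surface curvature between sample points.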

  3. Optimal Compression of Floating-Point Astronomical Images Without Significant Loss of Information

    NASA Technical Reports Server (NTRS)

    Pence, William D.; White, R. L.; Seaman, R.

    2010-01-01

    We describe a compression method for floating-point astronomical images that gives compression ratios of 6 - 10 while still preserving the scientifically important information in the image. The pixel values are first preprocessed by quantizing them into scaled integer intensity levels, which removes some of the uncompressible noise in the image. The integers are then losslessly compressed using the fast and efficient Rice algorithm and stored in a portable FITS format file. Quantizing an image more coarsely gives greater image compression, but it also increases the noise and degrades the precision of the photometric and astrometric measurements in the quantized image. Dithering the pixel values during the quantization process greatly improves the precision of measurements in the more coarsely quantized images. We perform a series of experiments on both synthetic and real astronomical CCD images to quantitatively demonstrate that the magnitudes and positions of stars in the quantized images can be measured with the predicted amount of precision. In order to encourage wider use of these image compression methods, we have made available a pair of general-purpose image compression programs, called fpack and funpack, which can be used to compress any FITS format image.
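The dithered quantization step can be illustrated in a few lines. The sketch below is our own simplification (fpack's actual scheme regenerates its dither from a seed recorded in the FITS header): a per-pixel uniform offset is added before rounding and the same offset is subtracted on restoration, which keeps the quantization error zero-mean and bounded by half a level.

```python
import numpy as np

def quantize_dither(pixels, delta, seed=0):
    """Subtractive-dither quantization: each pixel gets its own uniform
    offset in [0, 1), is rounded to an integer level, and the same offset
    is subtracted on restoration (the seed regenerates the offsets)."""
    d = np.random.default_rng(seed).random(pixels.shape)
    q = np.round(pixels / delta + d).astype(np.int64)   # integers to compress
    restored = (q - d) * delta                          # decompression side
    return q, restored
```

With subtractive dither the error `restored - pixels` is uniformly distributed in (-delta/2, delta/2) regardless of the pixel values, which is why coarsely quantized, dithered images still support unbiased photometry.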

  4. Catching a glimpse of working memory: top-down capture as a tool for measuring the content of the mind.

    PubMed

    Lange, Nicholas D; Thomas, Rick P; Buttaccio, Daniel R; Davelaar, Eddy J

    2012-11-01

    This article outlines a methodology for probing working memory (WM) content in high-level cognitive tasks (e.g., decision making, problem solving, and memory retrieval) by capitalizing on attentional and oculomotor biases evidenced in top-down capture paradigms. This method would be of great use, as it could measure the information resident in WM at any point in a task and, hence, track information use over time as tasks dynamically evolve. Above and beyond providing a measure of information occupancy in WM, such a method would benefit from sensitivity to the specific activation levels of individual items in WM. This article additionally forwards a novel fusion of standard free recall and visual search paradigms in an effort to assess the sensitivity of eye movements in top-down capture, on which this new measurement technique relies, to item-specific memory activation (ISMA). The results demonstrate eye movement sensitivity to ISMA in some, but not all, cases.

  5. Frame Shift/warp Compensation for the ARID Robot System

    NASA Technical Reports Server (NTRS)

    Latino, Carl D.

    1991-01-01

The Automatic Radiator Inspection Device (ARID) is a system aimed at automating the tedious task of inspecting orbiter radiator panels. The ARID must have the ability to aim a camera accurately at the desired inspection points, of which there are on the order of 13,000. The ideal inspection points are known; however, the panel may be relocated due to inaccurate parking and warpage. A method of determining the mathematical description of a translated as well as a warped surface by accurate measurement of only a few points on this surface is developed here. The method uses a linear warp model whose effect is superimposed on the rigid body translation. Due to the angles involved, small angle approximations are possible, which greatly reduces the computational complexity. Given an accurate linear warp model, all the desired translation and warp parameters can be obtained from the ideal locations of four fiducial points and the corresponding measurements of these points on the actual radiator surface. The method uses three of the fiducials to define a plane and the fourth to define the warp. Given this information, it is possible to determine a transformation that will enable the ARID system to translate any desired inspection point on the ideal surface to its corresponding value on the actual surface.
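The four-fiducial idea can be sketched as follows. This toy model uses a bilinear x*y warp term, which is our own illustrative choice and not necessarily ARID's warp model: a plane is fitted through the out-of-plane deviations measured at three fiducials, and the fourth fiducial's residual fixes a single warp coefficient.

```python
import numpy as np

def fit_plane_and_warp(ideal_xy, measured_dz):
    """From the (x, y) locations of four fiducial points and the measured
    out-of-plane deviations dz, recover a plane dz = a + b*x + c*y from
    the first three fiducials and one warp coefficient w (bilinear term
    w*x*y) from the fourth. Requires x4*y4 != 0."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = ideal_xy
    A = np.array([[1, x1, y1], [1, x2, y2], [1, x3, y3]], float)
    a, b, c = np.linalg.solve(A, np.asarray(measured_dz[:3], float))
    w = (measured_dz[3] - (a + b * x4 + c * y4)) / (x4 * y4)
    return a, b, c, w

def correct_point(x, y, a, b, c, w):
    """Deviation of an ideal inspection point on the actual surface."""
    return a + b * x + c * y + w * x * y
```

Once the four parameters are known, every one of the ~13,000 ideal inspection points can be mapped onto the actual surface without further measurement.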

  6. The Building America Indoor Temperature and Humidity Measurement Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metzger, C.; Norton, Paul

    2014-02-01

When modeling homes using simulation tools, the heating and cooling set points can have a significant impact on home energy use. Every four years, the Energy Information Administration (EIA) Residential Energy Consumption Survey (RECS) asks homeowners about their heating and cooling set points. Unfortunately, no temperature data is measured, and most of the time, the homeowner may be guessing at this number. Even one degree Fahrenheit difference in heating set point can make a 5% difference in heating energy use! So, the survey-based RECS data cannot be used as the definitive reference for the set point for the "average occupant" in simulations. The purpose of this document is to develop a protocol for collecting consistent data for heating/cooling set points and relative humidity so that an average set point can be determined for asset energy models in residential buildings. This document covers the decision making process for researchers to determine how many sensors should be placed in each home, where to put those sensors, and what kind of asset data should be taken while they are in the home. The authors attempted to design the protocols to maximize the value of this study and minimize the resources required to achieve that value.

  7. Building America Indoor Temperature and Humidity Measurement Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engebrecht-Metzger, Cheryn; Norton, Paul

    2014-02-01

When modeling homes using simulation tools, the heating and cooling set points can have a significant impact on home energy use. Every 4 years the Energy Information Administration (EIA) Residential Energy Consumption Survey (RECS) asks homeowners about their heating and cooling set points. Unfortunately, no temperature data is measured, and most of the time, the homeowner may be guessing at this number. Even one degree Fahrenheit difference in heating set point can make a 5% difference in heating energy use! So, the survey-based RECS data cannot be used as the definitive reference for the set point for the 'average occupant' in simulations. The purpose of this document is to develop a protocol for collecting consistent data for heating/cooling set points and relative humidity so that an average set point can be determined for asset energy models in residential buildings. This document covers the decision making process for researchers to determine how many sensors should be placed in each home, where to put those sensors, and what kind of asset data should be taken while they are in the home. The authors attempted to design the protocols to maximize the value of this study and minimize the resources required to achieve that value.

  8. An improved arterial pulsation measurement system based on optical triangulation and its application in the traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Wu, Jih-Huah; Lee, Wen-Li; Lee, Yun-Parn; Lin, Ching-Huang; Chiou, Ji-Yi; Tai, Chuan-Fu; Jiang, Joe-Air

    2011-08-01

An improved arterial pulsation measurement (APM) system that uses three LED light sources and a CCD image sensor to measure arterial pulse waveforms is presented. The relative variations of the pulses at three measurement points near the wrist joint can be determined by the APM system simultaneously. The height of the arterial pulsations measured by the APM system is resolved to better than 2 μm. These pulsations contain useful information that can serve as diagnostic references in traditional Chinese medicine (TCM) in the future.
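The abstract does not give the system's geometry, but micrometer-scale height resolution from a CCD follows from the generic optical-triangulation relation: a surface height change dz shifts the imaged laser spot by dx = M * dz * sin(theta) for illumination angle theta and imaging magnification M. The parameter names below are ours; this is textbook triangulation geometry, not the specific APM layout.

```python
import math

def height_from_spot_shift(dx_sensor_um, magnification, theta_deg):
    """Convert a laser-spot displacement on the image sensor (micrometers)
    into a surface height change via dz = dx / (M * sin(theta))."""
    return dx_sensor_um / (magnification * math.sin(math.radians(theta_deg)))
```

For example, at 2x magnification and a 30-degree illumination angle, a 10 um spot shift on the sensor corresponds to a 10 um height change on the skin.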

  9. Objectively-Measured Physical Activity and Cognitive Functioning in Breast Cancer Survivors

    PubMed Central

    Marinac, Catherine R.; Godbole, Suneeta; Kerr, Jacqueline; Natarajan, Loki; Patterson, Ruth E.; Hartman, Sheri J.

    2015-01-01

    Purpose To explore the relationship between objectively measured physical activity and cognitive functioning in breast cancer survivors. Methods Participants were 136 postmenopausal breast cancer survivors. Cognitive functioning was assessed using a comprehensive computerized neuropsychological test. 7-day physical activity was assessed using hip-worn accelerometers. Linear regression models examined associations of minutes per day of physical activity at various intensities on individual cognitive functioning domains. The partially adjusted model controlled for primary confounders (model 1), and subsequent adjustments were made for chemotherapy history (model 2), and BMI (model 3). Interaction and stratified models examined BMI as an effect modifier. Results Moderate-to-vigorous physical activity (MVPA) was associated with Information Processing Speed. Specifically, ten minutes of MVPA was associated with a 1.35-point higher score (out of 100) on the Information Processing Speed domain in the partially adjusted model, and a 1.29-point higher score when chemotherapy was added to the model (both p<.05). There was a significant BMI x MVPA interaction (p=.051). In models stratified by BMI (<25 vs. ≥25 kg/m2), the favorable association between MVPA and Information Processing Speed was stronger in the subsample of overweight and obese women (p<.05), but not statistically significant in the leaner subsample. Light-intensity physical activity was not significantly associated with any of the measured domains of cognitive function. Conclusions MVPA may have favorable effects on Information Processing Speed in breast cancer survivors, particularly among overweight or obese women. Implications for Cancer Survivors Interventions targeting increased physical activity may enhance aspects of cognitive function among breast cancer survivors. PMID:25304986
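The effect-modification analysis described above can be sketched with ordinary least squares: a product term MVPA x overweight whose coefficient is the difference in MVPA slope between BMI groups. The variable names and synthetic data below are ours, not the study's.

```python
import numpy as np

def fit_interaction(y, mvpa, overweight):
    """OLS of outcome y on MVPA, an overweight indicator (0/1), and their
    product; the product's coefficient tests whether the MVPA slope
    differs by BMI group (effect modification)."""
    X = np.column_stack([np.ones_like(mvpa), mvpa, overweight,
                         mvpa * overweight])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, MVPA slope, group shift, slope difference]
```

A nonzero slope-difference coefficient corresponds to the stratified finding that the MVPA association was stronger in the overweight/obese subsample.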

  10. Models of formation and some algorithms of hyperspectral image processing

    NASA Astrophysics Data System (ADS)

    Achmetov, R. N.; Stratilatov, N. R.; Yudakov, A. A.; Vezenov, V. I.; Eremeev, V. V.

    2014-12-01

Algorithms and information technologies for processing Earth hyperspectral imagery are presented. Several new approaches are discussed. Peculiar properties of hyperspectral image processing, such as multifold signal-to-noise reduction, atmospheric distortions, access to the spectral characteristics of every image point, and high dimensionality of data, were studied. Different measures of similarity between individual hyperspectral image points and the effect of additive uncorrelated noise on these measures were analyzed. It was shown that these measures are substantially affected by noise, and a new measure free of this disadvantage was proposed. The problem of detecting the observed scene object boundaries, based on comparing the spectral characteristics of image points, is considered. It was shown that contours are processed much better when spectral characteristics are used instead of energy brightness. A statistical approach to the correction of atmospheric distortions, which makes it possible to solve the stated problem based on analysis of a distorted image in contrast to analytical multiparametric models, was proposed. Several algorithms used to integrate spectral zonal images with data from other survey systems, which make it possible to image observed scene objects with higher quality, are considered. Quality characteristics of hyperspectral data processing were proposed and studied.
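One standard similarity measure between pixel spectra, of the kind the abstract compares, is the spectral angle, which is invariant to multiplicative brightness scaling (the paper's improved noise-robust measure is not reproduced here):

```python
import numpy as np

def spectral_angle(s1, s2):
    """Angle (radians) between two pixel spectra; unchanged when either
    spectrum is multiplied by a positive illumination factor."""
    c = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(c, -1.0, 1.0))
```

Additive uncorrelated noise still biases the angle toward larger values, which is the shortcoming the paper's proposed measure addresses.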

  11. Influence of multiple scattering and absorption on the full scattering profile and the isobaric point in tissue

    NASA Astrophysics Data System (ADS)

    Duadi, Hamootal; Fixler, Dror

    2015-05-01

Light reflectance and transmission from soft tissue has been utilized in noninvasive clinical measurement devices such as the photoplethysmograph (PPG) and reflectance pulse oximeter. Incident light on the skin travels into the underlying layers and is in part reflected back to the surface, in part transferred and in part absorbed. Most methods of near infrared (NIR) spectroscopy focus on the volume reflectance from a semi-infinite sample, while very few measure transmission. We have previously shown that examining the full scattering profile (angular distribution of exiting photons) provides more comprehensive information when measuring from a cylindrical tissue. Furthermore, an isobaric point was found which is not dependent on changes in the reduced scattering coefficient. The angle corresponding to this isobaric point depends on the tissue diameter. We investigated the role of multiple scattering and absorption on the full scattering profile of a cylindrical tissue. First, we define the range in which multiple scattering occurs for different tissue diameters. Next, we examine the role of the absorption coefficient in the attenuation of the full scattering profile. We demonstrate that the absorption linearly influences the intensity at each angle of the full scattering profile and, more importantly, the absorption does not change the position of the isobaric point. The findings of this work demonstrate a realistic model for optical tissue measurements such as NIR spectroscopy, PPG, and pulse oximetry.

  12. Effect of Impurities on the Triple Point of Water: Experiments with Doped Cells at Different Liquid Fractions

    NASA Astrophysics Data System (ADS)

    Dobre, M.; Peruzzi, A.; Kalemci, M.; Van Geel, J.; Maeck, M.; Uytun, A.

    2018-05-01

Recent international comparisons showed that there is still room for improvement in the uncertainty of triple point of water (TPW) realization. Large groups of cells manufactured, maintained and measured in similar conditions still show a spread in the realized TPW temperature that is larger than the best measurement uncertainties (25 µK). One cause is the time-dependent concentration of impurities dissolved in the water, which originate from dissolution of the glass/quartz envelope over a cell's lifetime. The effect is a difference in the triple-point temperature proportional to the impurity concentration. In order to measure this temperature difference and to investigate the effect of different types of impurities, we manufactured doped cells with different concentrations of silicon (Si), boron (B), sodium (Na) and potassium (K), the main chemical components of the glass. To identify any influence of the filling process, two completely independent manufacturing procedures were followed in two different laboratories, both national metrology institutes (VSL, Netherlands and UME, Turkey). The cells' glass and filling water also differed, while the doping materials were identical. Measuring the temperature difference as a function of the liquid fraction is a method to obtain information about impurity concentrations in TPW cells. Only cells doped with 1 µmol·mol⁻¹ B, Na and K proved to be suitable for measurements at different liquid fractions. We present here the results with related uncertainties and discuss the critical points in this experimental approach.
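The proportionality between impurity concentration and temperature depression, and why it depends on the liquid fraction, follows from Raoult's law in the idealized case of an impurity fully rejected into the liquid phase: the liquid-phase mole fraction is x/F, and the depression is approximately A*x/F with A of about 103 K per unit mole fraction for water (the first cryoscopic constant). The sketch below is this idealized textbook model only; real cells deviate from it.

```python
def tpw_depression_uK(x_impurity, liquid_fraction, A=103.0):
    """Idealized depression of the realized TPW temperature (microkelvin)
    for an overall impurity mole fraction x_impurity fully rejected into
    a molten fraction F of the cell: dT = -A * x / F."""
    return -A * (x_impurity / liquid_fraction) * 1e6
```

Under this model a 1 µmol/mol impurity depresses the fully melted cell by roughly 100 µK, and the depression grows as the liquid fraction shrinks, which is what makes measurements at different liquid fractions informative about concentration.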

  13. A framework for automatic feature extraction from airborne light detection and ranging data

    NASA Astrophysics Data System (ADS)

    Yan, Jianhua

Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings, and trees. In the past decade, LIDAR has attracted growing interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, voluminous data pose a new challenge for automated extraction of geometrical information from LIDAR measurements, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to automatically extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data. These objects are essential to numerous applications such as flood modeling, landslide prediction, and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from non-ground measurements using a region-growing algorithm based on the plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise caused by irregularly spaced LIDAR measurements. To reconstruct 3D building models, the raw 2D topology of each building is first extracted and then further adjusted. Since the adjusting operations for simple building models do not work well on 2D topology, a 2D snake algorithm is proposed to adjust the topology. The 2D snake algorithm consists of newly defined energy functions for topology adjusting and a linear algorithm to find the minimal energy value of 2D snake problems. Data sets from urbanized areas including large institutional, commercial, and small residential buildings were employed to test the proposed framework. The results demonstrated that the proposed framework achieves very good performance.
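The progressive morphological filter can be sketched on a gridded surface model: at each stage a grey opening with a growing window estimates the ground, and cells rising above that estimate by more than the stage's elevation threshold are flagged as non-ground. This is a simplified illustration; the window sizes and thresholds are placeholders, and the dissertation's filter operates on irregularly spaced points rather than a raster.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(dsm, windows=(3, 5, 9, 17),
                                     dh_thresholds=(0.3, 0.5, 1.0, 2.0)):
    """Flag non-ground cells of a gridded surface model. Each stage opens
    the surface with a larger window (removing objects smaller than the
    window) and marks cells exceeding the stage's elevation-difference
    threshold as non-ground (vehicles, vegetation, buildings)."""
    surface = dsm.astype(float).copy()
    nonground = np.zeros(dsm.shape, dtype=bool)
    for w, dh in zip(windows, dh_thresholds):
        opened = grey_opening(surface, size=(w, w))
        nonground |= (surface - opened) > dh
        surface = opened  # the next, larger window filters this estimate
    return nonground
```

Small windows with small thresholds catch cars and shrubs without clipping terrain slopes; large windows with large thresholds catch buildings.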

  14. Study on the high-frequency laser measurement of slot surface difference

    NASA Astrophysics Data System (ADS)

    Bing, Jia; Lv, Qiongying; Cao, Guohua

    2017-10-01

To measure slot surface differences in large-scale mechanical assembly, a double-galvanometer pulsed laser scanning system is designed based on high-frequency laser scanning technology and the laser detection imaging principle. The laser probe scanning system consists of three parts: laser ranging, mechanical scanning, and data acquisition and processing. The laser ranging part uses a high-frequency laser range finder to measure distance information of the target shape, producing a large amount of point cloud data. The mechanical scanning part includes a high-speed rotary table, a high-speed transit, and the related structural design, so that the whole system can perform three-dimensional laser scanning of the target along the designed scanning path. The data processing part is designed around an FPGA hardware core with LabVIEW software; it processes the point cloud data collected by the laser range finder at high speed and fits the point cloud data to establish a three-dimensional model of the target, thus realizing laser scanning imaging.

  15. Estimating minimally important difference (MID) in PROMIS pediatric measures using the scale-judgment method.

    PubMed

    Thissen, David; Liu, Yang; Magnus, Brooke; Quinn, Hally; Gipson, Debbie S; Dampier, Carlton; Huang, I-Chan; Hinds, Pamela S; Selewski, David T; Reeve, Bryce B; Gross, Heather E; DeWalt, Darren A

    2016-01-01

To assess minimally important differences (MIDs) for several pediatric self-report item banks from the National Institutes of Health Patient-Reported Outcomes Measurement Information System® (PROMIS®). We presented vignettes comprising sets of two completed PROMIS questionnaires and asked judges to declare whether the individual completing those questionnaires had an important change or not. We enrolled judges (including adolescents, parents, and clinicians) who responded to 24 vignettes (six for each domain of depression, pain interference, fatigue, and mobility). We used item response theory to model responses to the vignettes across different judges and estimated MID as the point at which 50 % of the judges would declare an important change. We enrolled 246 judges (78 adolescents, 85 parents, and 83 clinicians). The MID estimated with clinician data was about 2 points on the PROMIS T-score scale, and the MID estimated with adolescent and parent data was about 3 points on that same scale. The MIDs enhance the value of PROMIS pediatric measures in clinical research studies to identify meaningful changes in health status over time.
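The MID-estimation idea, finding the score difference at which 50 % of judges declare an important change, can be sketched with a plain logistic model in place of the paper's full IRT treatment (a deliberate simplification, with our own function and variable names):

```python
import numpy as np

def mid_from_judgments(deltas, judged_important, iters=20000, lr=0.05):
    """Fit P(important) = logistic(a*delta + b) to judges' binary calls by
    gradient ascent on the log-likelihood, then return the 50% crossover
    point -b/a, the simplified analogue of the paper's MID."""
    d = np.asarray(deltas, float)
    y = np.asarray(judged_important, float)
    a, b = 0.0, 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a * d + b)))
        a += lr * np.mean((y - p) * d)   # gradient w.r.t. slope
        b += lr * np.mean(y - p)         # gradient w.r.t. intercept
    return -b / a
```

On the T-score metric this crossover would land near the reported 2-3 points, depending on the judge group supplying the responses.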

  16. Effect of two doses of ginkgo biloba extract (EGb 761) on the dual-coding test in elderly subjects.

    PubMed

    Allain, H; Raoul, P; Lieury, A; LeCoz, F; Gandon, J M; d'Arbigny, P

    1993-01-01

    The subjects of this double-blind study were 18 elderly men and women (mean age, 69.3 years) with slight age-related memory impairment. In a crossover-study design, each subject received placebo or an extract of Ginkgo biloba (EGb 761) (320 mg or 600 mg) 1 hour before performing a dual-coding test that measures the speed of information processing; the test consists of several coding series of drawings and words presented at decreasing times of 1920, 960, 480, 240, and 120 ms. The dual-coding phenomenon (a break point between coding verbal material and images) was demonstrated in all the tests. After placebo, the break point was observed at 960 ms and dual coding beginning at 1920 ms. After each dose of the ginkgo extract, the break point (at 480 ms) and dual coding (at 960 ms) were significantly shifted toward a shorter presentation time, indicating an improvement in the speed of information processing.

  17. Inhomogeneous Point-Processes to Instantaneously Assess Affective Haptic Perception through Heartbeat Dynamics Information

    PubMed Central

    Valenza, G.; Greco, A.; Citi, L.; Bianchi, M.; Barbieri, R.; Scilingo, E. P.

    2016-01-01

    This study proposes the application of a comprehensive signal processing framework, based on inhomogeneous point-process models of heartbeat dynamics, to instantaneously assess affective haptic perception using electrocardiogram-derived information exclusively. The framework relies on inverse-Gaussian point-processes with Laguerre expansion of the nonlinear Wiener-Volterra kernels, accounting for the long-term information given by the past heartbeat events. Up to cubic-order nonlinearities allow for an instantaneous estimation of the dynamic spectrum and bispectrum of the considered cardiovascular dynamics, as well as for instantaneous measures of complexity, through Lyapunov exponents and entropy. Short-term caress-like stimuli were administered for 4.3–25 seconds on the forearms of 32 healthy volunteers (16 females) through a wearable haptic device, by selectively superimposing two levels of force, 2 N and 6 N, and two levels of velocity, 9.4 mm/s and 65 mm/s. Results demonstrated that our instantaneous linear and nonlinear features were able to finely characterize the affective haptic perception, with a recognition accuracy of 69.79% along the force dimension, and 81.25% along the velocity dimension. PMID:27357966
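The building block of these models is the inverse-Gaussian density over waiting times between heartbeats. Below is a minimal sketch of its log-likelihood; the full framework adds Laguerre-expanded Volterra kernels, time-varying parameters, and instantaneous spectral and complexity measures, none of which is reproduced here.

```python
import math

def invgauss_loglik(waits, mu, lam):
    """Log-likelihood of inter-heartbeat waiting times t under the
    inverse-Gaussian density
    f(t) = sqrt(lam / (2*pi*t^3)) * exp(-lam*(t - mu)^2 / (2*mu^2*t)),
    with mean mu and shape parameter lam."""
    ll = 0.0
    for t in waits:
        ll += 0.5 * math.log(lam / (2 * math.pi * t ** 3))
        ll -= lam * (t - mu) ** 2 / (2 * mu ** 2 * t)
    return ll
```

In the point-process framework, mu is not a constant but is driven by the past heartbeat history, which is what makes the estimates instantaneous.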

  18. Assessment of generalizability, applicability and predictability (GAP) for evaluating external validity in studies of universal family-based prevention of alcohol misuse in young people: systematic methodological review of randomized controlled trials.

    PubMed

    Fernandez-Hermida, Jose Ramon; Calafat, Amador; Becoña, Elisardo; Tsertsvadze, Alexander; Foxcroft, David R

    2012-09-01

    To assess external validity characteristics of studies from two Cochrane Systematic Reviews of the effectiveness of universal family-based prevention of alcohol misuse in young people. Two reviewers used an a priori developed external validity rating form and independently assessed three external validity dimensions of generalizability, applicability and predictability (GAP) in randomized controlled trials. The majority (69%) of the included 29 studies were rated 'unclear' on the reporting of sufficient information for judging generalizability from sample to study population. Ten studies (35%) were rated 'unclear' on the reporting of sufficient information for judging applicability to other populations and settings. No study provided an assessment of the validity of the trial end-point measures for subsequent mortality, morbidity, quality of life or other economic or social outcomes. Similarly, no study reported on the validity of surrogate measures using established criteria for assessing surrogate end-points. Studies evaluating the benefits of family-based prevention of alcohol misuse in young people are generally inadequate at reporting information relevant to generalizability of the findings or implications for health or social outcomes. Researchers, study authors, peer reviewers, journal editors and scientific societies should take steps to improve the reporting of information relevant to external validity in prevention trials. © 2012 The Authors. Addiction © 2012 Society for the Study of Addiction.

  19. The SF-8 Spanish Version for Health-Related Quality of Life Assessment: Psychometric Study with IRT and CFA Models.

    PubMed

    Tomás, José M; Galiana, Laura; Fernández, Irene

    2018-03-22

The aim of the current research is to analyze the psychometric properties of the Spanish version of the SF-8, overcoming previous shortcomings. A double line of analyses was used: competitive structural equation models to establish factorial validity, and item response theory to analyze item psychometric characteristics and information. 593 people aged 60 years or older, attending lifelong learning programs at the University, were surveyed. Their age ranged from 60 to 92 years, and 67.6% were women. The survey included scales on personality dimensions, attitudes, perceptions, and behaviors related to aging. Competitive confirmatory models pointed to two factors (physical and mental health) as the best representation of the data: χ2(13) = 72.37 (p < .01); CFI = .99; TLI = .98; RMSEA = .08 (.06, .10). Item 5 was removed because of unreliability and cross-loading. Graded response models showed appropriate fit of the two-parameter logistic model for both the physical and the mental dimensions. Item information curves and test information functions pointed out that the SF-8 was more informative for low levels of health. The Spanish SF-8 has adequate psychometric properties and is better represented by two dimensions once Item 5 is removed. Gathering evidence on patient-reported outcome measures is of crucial importance, as this type of measurement instrument is increasingly used in the clinical arena.
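The finding that the instrument is most informative at low health levels can be made concrete with the two-parameter logistic item information function, which peaks where the trait level equals the item difficulty (so items with low difficulty parameters concentrate information at low health). A minimal sketch:

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at trait level theta:
    I(theta) = a^2 * P * (1 - P), where P = logistic(a * (theta - b)),
    a is the discrimination and b the difficulty parameter."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)
```

A test information function is just the sum of item informations, which is how the SF-8's concentration of precision at low health shows up.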

  20. Developing a monitoring protocol for visitor-created informal trails in Yosemite National Park, USA.

    PubMed

    Leung, Yu-Fai; Newburger, Todd; Jones, Marci; Kuhn, Bill; Woiderski, Brittany

    2011-01-01

Informal trails created or perpetuated by visitors are a management challenge in many protected natural areas such as Yosemite National Park. This is a significant issue, as informal trail networks penetrate and proliferate into protected landscapes and habitats, threatening ecological integrity, aesthetics, and visitor experiences. In order to develop effective strategies for addressing this problem under an adaptive management framework, indicators must be developed and a monitoring protocol must be established to gather timely and relevant data about the condition, extent, and distribution of these undesired trail segments. This article illustrates a process of developing and evaluating informal trail indicators for meadows in Yosemite Valley. Indicator measures developed in past research were reviewed to identify their appropriateness for the current application. Information gaps in existing indicator measures were addressed by creating two new indices to quantify the degree of informal trailing based on its land fragmentation effects. The selected indicator measures were applied to monitoring data collected between 2006 and 2008. The selected measures and indices were evaluated for their ability to characterize informal trail impacts at site and landscape scales. Results demonstrate the utility of indicator measures in capturing different characteristics of the informal trail problem, though several metrics are strongly related to each other. The two fragmentation indices were able to depict fragmentation without being too sensitive to changes in one constituent parameter. This study points to the need for a multiparameter approach to informal trail monitoring and integration with other monitoring data. Implications for monitoring programs and research are discussed.

  1. Developing a Monitoring Protocol for Visitor-Created Informal Trails in Yosemite National Park, USA

    NASA Astrophysics Data System (ADS)

    Leung, Yu-Fai; Newburger, Todd; Jones, Marci; Kuhn, Bill; Woiderski, Brittany

    2011-01-01

    Informal trails created or perpetuated by visitors are a management challenge in many protected natural areas such as Yosemite National Park. This is a significant issue as informal trail networks penetrate and proliferate into protected landscapes and habitats, threatening ecological integrity, aesthetics, and visitor experiences. In order to develop effective strategies for addressing this problem under an adaptive management framework, indicators must be developed and a monitoring protocol must be established to gather timely and relevant data about the condition, extent, and distribution of these undesired trail segments. This article illustrates a process of developing and evaluating informal trail indicators for meadows in Yosemite Valley. Indicator measures developed in past research were reviewed to identify their appropriateness for the current application. Information gaps in existing indicator measures were addressed by creating two new indices to quantify the degree of informal trailing based on its land fragmentation effects. The selected indicator measures were applied to monitoring data collected between 2006 and 2008. The selected measures and indices were evaluated for their ability to characterize informal trail impacts at site and landscape scales. Results demonstrate the utility of indicator measures in capturing different characteristics of the informal trail problem, though several metrics are strongly related to each other. The two fragmentation indices were able to depict fragmentation without being too sensitive to changes in one constituent parameter. This study points to the need for a multiparameter approach to informal trail monitoring and integration with other monitoring data. Implications for monitoring programs and research are discussed.

  2. Information quality measurement of medical encoding support based on usability.

    PubMed

    Puentes, John; Montagner, Julien; Lecornu, Laurent; Cauvin, Jean-Michel

    2013-12-01

    Medical encoding support systems for diagnoses and medical procedures are an emerging technology that begins to play a key role in billing, reimbursement, and health policies decisions. A significant problem to exploit these systems is how to measure the appropriateness of any automatically generated list of codes, in terms of fitness for use, i.e. their quality. Until now, only information retrieval performance measurements have been applied to estimate the accuracy of codes lists as a quality indicator. Such measurements do not give the value of codes lists for practical medical encoding, and cannot be used to globally compare the quality of multiple codes lists. This paper defines and validates a new encoding information quality measure that addresses the problem of measuring medical codes lists quality. It is based on a usability study of how expert coders and physicians apply computer-assisted medical encoding. The proposed measure, named ADN, evaluates codes Accuracy, Dispersion and Noise, and is adapted to the variable length and content of generated codes lists, coping with limitations of previous measures. According to the ADN measure, the information quality of a codes list is fully represented by a single point, within a suitably constrained feature space. Using one scheme, our approach reliably measures and compares the information quality of hundreds of codes lists, showing their practical value for medical encoding. Its pertinence is demonstrated by simulation and application to real data corresponding to 502 inpatient stays in four clinic departments. Results are compared to the consensus of three expert coders who also coded this anonymized database of discharge summaries, and to five information retrieval measures. Information quality assessment applying the ADN measure showed the degree of encoding-support system variability from one clinic department to another, providing a global evaluation of quality measurement trends.
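
    The feature-space idea can be illustrated with a toy function. This is a minimal sketch with hypothetical definitions of the three components; the paper's exact ADN formulas are not reproduced here, and `adn_point`, `generated`, and `reference` are illustrative names.

```python
def adn_point(generated, reference):
    """Map a generated codes list to an (accuracy, dispersion, noise) point.

    Hypothetical definitions for illustration only:
      accuracy:   fraction of reference codes present in the list
      dispersion: mean normalized rank of the matched codes (0 = all at top)
      noise:      fraction of listed codes absent from the reference
    """
    matched_ranks = [i for i, c in enumerate(generated) if c in reference]
    accuracy = len(matched_ranks) / len(reference) if reference else 0.0
    if matched_ranks and len(generated) > 1:
        dispersion = sum(matched_ranks) / (len(matched_ranks) * (len(generated) - 1))
    else:
        dispersion = 0.0
    noise = (sum(1 for c in generated if c not in reference) / len(generated)
             if generated else 0.0)
    return accuracy, dispersion, noise
```

    Because each list maps to one point, many lists can be compared in a single feature space, which is the property the abstract emphasizes.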

  3. Calorie labeling in a rural middle school influences food selection: findings from community-based participatory research.

    PubMed

    Hunsberger, Monica; McGinnis, Paul; Smith, Jamie; Beamer, Beth Ann; O'Malley, Jean

    2015-01-01

    Calorie labeling at the point-of-purchase in chain restaurants has been shown to reduce energy intake. We investigated the impact of point-of-purchase calorie information at one rural middle school. Within a community-based participatory research framework, a mixed-method approach was used to evaluate the impact of point-of-purchase calorie information. Students in grades 6-8 dining at the school cafeteria in January and February 2010 participated for 17 school days each month; in January a menu was offered in the usual manner without calorie labels, and the same menu was prepared in February with the addition of calorie labels at the point of purchase. Gross calories served per student were measured each day, allowing for matched comparison by menu. In March and April 2010, 32 students who ate in the cafeteria 3 or more times per week were interviewed regarding their views on menu labeling. Calorie consumption decreased by an average of 47 calories/day, and fat intake was reduced by 2.1 grams/day. Five main themes were consistent throughout the interviews. Point-of-purchase calorie labels can play a role in reducing the number of calories consumed by middle-school-age children at lunch. The majority of students interviewed found the calorie labels helped them choose healthier food.

  4. Lightning arrestor connector lead magnesium niobate qualification pellet test procedures.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuohig, W.; Mahoney, Patrick A.; Tuttle, Bruce Andrew

    2009-02-01

    Enhanced knowledge preservation for DOE DP technical component activities has recently received much attention. As part of this recent knowledge preservation effort, improved documentation of the sample preparation and electrical testing procedures for lead magnesium niobate--lead titanate (PMN/PT) qualification pellets was completed. The qualification pellets are fabricated from the same parent powders used to produce PMN/PT lightning arrestor connector (LAC) granules at HWF&T. In our report, the procedures for fired pellet surface preparation, electrode deposition, electrical testing and data recording are described. The dielectric measurements described in our report are an information-only test. Technical reasons for selecting the electrode material, electrode size and geometry are presented. The electrical testing is based on measuring the dielectric constant and dissipation factor of the pellet during cooling from 280 °C to 220 °C. The most important data are the temperature for which the peak dielectric constant occurs (Curie Point temperature) and the peak dielectric constant magnitude. We determined that the peak dielectric constant for our procedure would be that measured at 1 kHz at the Curie Point. Both the peak dielectric constant and the Curie point parameters provide semi-quantitative information concerning the chemical and microstructural homogeneity of the parent material used for the production of PMN/PT granules for LACs. Finally, we have proposed flag limits for the dielectric data for the pellets. Specifically, if the temperature of the peak dielectric constant falls outside the range of 250 °C ± 30 °C we propose that a flag limit be imposed that will initiate communication between production agency and design agency personnel. If the peak dielectric constant measured falls outside the range 25,000 ± 10,000 we also propose that a flag limit be imposed.
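
    The proposed flag limits lend themselves to a direct check. A minimal sketch using only the ranges stated above (the function name is illustrative):

```python
def needs_flag(curie_temp_c, peak_dielectric_constant):
    """Return True if either qualification parameter falls outside the
    proposed flag limits from the report: Curie point 250 +/- 30 degrees C,
    peak dielectric constant (measured at 1 kHz) 25,000 +/- 10,000."""
    return not (220.0 <= curie_temp_c <= 280.0
                and 15000.0 <= peak_dielectric_constant <= 35000.0)
```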

  5. A randomised study of three different informational aids prior to coronary angiography, measuring patient recall, satisfaction and anxiety.

    PubMed

    Astley, Carolyn M; Chew, Derek P; Aylward, Philip E; Molloy, Danielle A; De Pasquale, Carmine G

    2008-02-01

    Informed consent is a basic standard of care for all patients undergoing medical procedures, but recall of information has been shown to be poor. We sought to compare verbal, written and animated audiovisual information delivery, during consent for coronary angiography, by measuring improvement in recall. A sample population of 99 cardiac patients at Flinders Medical Centre was randomised (1:1:1) to receive one of three information delivery methods. The information content was standardised by a risk proforma, which explained the procedure and defined 12 specific risks. Recall, satisfaction and anxiety were assessed by a questionnaire administered at three different time points: post-consent, post-procedure and at 30 days. Effect of delivery method on satisfaction and anxiety was rated on a self-reported scale from 1-5, with 5 representing very satisfied or very anxious. Groups were compared by non-parametric testing and a p-value of <0.05 was considered statistically significant. Patients were a median age of 64 (i.q.r. 56, 72) years. Information delivery method had no effect on recall of risks at any time-point (p=0.2, 0.7, 0.5, respectively) and the average recall score across the population was 3-4 out of 12. There was no significant effect on median satisfaction scores: verbal; 5 (i.q.r.4, 5) versus written/audiovisual; 4 (i.q.r.4, 5) (p=ns), or on median anxiety scores: verbal; 3 (i.q.r.2, 4) versus written/audiovisual; 3 (i.q.r.2, 4) (p=ns). Despite careful design of an innovative audiovisual delivery technique aimed at optimising comprehension and aiding memory, recall of information was poor and informational aids showed no improvement. Modes of information delivery are not the key to patient assimilation of complex medical information.

  6. The Lichen-GIS Project, Teaching Students How to Use Bioindicator Species to Assess Environmental Quality.

    PubMed

    Wagner, Stephen C; McDonald, Darrel; Watson, Trey; Taylor, Josephine; Sowards, Alan B

    2009-01-01

    A content-driven biology course for preservice K-8 teachers has been developed. This course uses the constructivist approach, where instructors engage students by organizing information around concept-based problems. To this end, a semester-long, inquiry-based project was introduced where students studied lichen populations on trees located on their campus to monitor air quality. Data were incorporated into a geographical information systems (GIS) database to demonstrate how it can be used to map communities. Student teams counted the number of each lichen type within a grid placed on each tree trunk sampled and entered this information into a GIS database. The students constructed maps of lichen populations at each sample site and wrote abstracts about their research. Student performance was assessed by the preparation of these abstracts as well as by scores on pre- and posttests of key content measures. Students also completed a survey to determine whether the project aided in their comprehension as well as their interest in incorporating this activity into their own curricula. The students' pre- and posttest results showed an eightfold improvement in the total score after the semester project. Additionally, correct responses to each individual content measure increased by at least 35%. Total scores for the abstract ranged from 12 to 20 points out of 20 total points possible (60% to 100%), with a mean score of 15.8 points (78%). These results indicate that this exercise provided an excellent vehicle to teach students about lichens and their use as bioindicators and the application of geospatial technologies to map environmental data.

  7. Quantification of cerebral ventricle volume change of preterm neonates using 3D ultrasound images

    NASA Astrophysics Data System (ADS)

    Chen, Yimin; Kishimoto, Jessica; Qiu, Wu; de Ribaupierre, Sandrine; Fenster, Aaron; Chiu, Bernard

    2015-03-01

    Intraventricular hemorrhage (IVH) is a major cause of brain injury in preterm neonates. Quantitative measurement of ventricular dilation or shrinkage is important for monitoring patients and in evaluation of treatment options. 3D ultrasound (US) has been used to monitor the ventricle volume as a biomarker for ventricular dilation. However, volumetric quantification does not provide information as to where dilation occurs, and the location of dilation may be related to specific neurological problems later in life. For example, posterior horn enlargement, with thinning of the corpus callosum and parietal white matter fibres, could be linked to poor visuo-spatial abilities seen in hydrocephalic children. In this work, we report on the development and application of a method used to analyze local surface change of the ventricles of preterm neonates with IVH from 3D US images. The technique is evaluated using manual segmentations from 3D US images acquired in two imaging sessions. The surfaces from baseline and follow-up were registered and then matched on a point-by-point basis. The distance between each pair of corresponding points served as an estimate of local surface change of the brain ventricle at each vertex. The measurements of local surface change were then superimposed on the ventricle surface to produce the 3D local surface change map that provides information on the spatio-temporal dilation pattern of brain ventricles following IVH. This tool can be used to monitor responses to different treatment options, and may provide important information for elucidating the deficiencies a patient will have later in life.
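
    The per-vertex distance step can be sketched as follows, assuming the surfaces are already registered and the point-by-point correspondence has been computed upstream (names are illustrative):

```python
import numpy as np

def local_surface_change(baseline_pts, followup_pts):
    """Per-vertex Euclidean distance between point-matched surfaces.

    baseline_pts, followup_pts: (N, 3) arrays of corresponding vertices on
    the registered baseline and follow-up ventricle surfaces. The returned
    (N,) array is the local surface-change estimate that would be
    superimposed on the surface as a colour map.
    """
    baseline_pts = np.asarray(baseline_pts, dtype=float)
    followup_pts = np.asarray(followup_pts, dtype=float)
    return np.linalg.norm(followup_pts - baseline_pts, axis=1)
```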

  8. Use of Repeated Blood Pressure and Cholesterol Measurements to Improve Cardiovascular Disease Risk Prediction: An Individual-Participant-Data Meta-Analysis

    PubMed Central

    Barrett, Jessica; Pennells, Lisa; Sweeting, Michael; Willeit, Peter; Di Angelantonio, Emanuele; Gudnason, Vilmundur; Nordestgaard, Børge G.; Psaty, Bruce M; Goldbourt, Uri; Best, Lyle G; Assmann, Gerd; Salonen, Jukka T; Nietert, Paul J; Verschuren, W. M. Monique; Brunner, Eric J; Kronmal, Richard A; Salomaa, Veikko; Bakker, Stephan J L; Dagenais, Gilles R; Sato, Shinichi; Jansson, Jan-Håkan; Willeit, Johann; Onat, Altan; de la Cámara, Agustin Gómez; Roussel, Ronan; Völzke, Henry; Dankner, Rachel; Tipping, Robert W; Meade, Tom W; Donfrancesco, Chiara; Kuller, Lewis H; Peters, Annette; Gallacher, John; Kromhout, Daan; Iso, Hiroyasu; Knuiman, Matthew; Casiglia, Edoardo; Kavousi, Maryam; Palmieri, Luigi; Sundström, Johan; Davis, Barry R; Njølstad, Inger; Couper, David; Danesh, John; Thompson, Simon G; Wood, Angela

    2017-01-01

    The added value of incorporating information from repeated blood pressure and cholesterol measurements to predict cardiovascular disease (CVD) risk has not been rigorously assessed. We used data on 191,445 adults from the Emerging Risk Factors Collaboration (38 cohorts from 17 countries with data encompassing 1962–2014) with more than 1 million measurements of systolic blood pressure, total cholesterol, and high-density lipoprotein cholesterol. Over a median 12 years of follow-up, 21,170 CVD events occurred. Risk prediction models using cumulative mean values of repeated measurements and summary measures from longitudinal modeling of the repeated measurements were compared with models using measurements from a single time point. Risk discrimination (C-index) and net reclassification were calculated, and changes in C-indices were meta-analyzed across studies. Compared with the single-time-point model, the cumulative means and longitudinal models increased the C-index by 0.0040 (95% confidence interval (CI): 0.0023, 0.0057) and 0.0023 (95% CI: 0.0005, 0.0042), respectively. Reclassification was also improved in both models; compared with the single-time-point model, overall net reclassification improvements were 0.0369 (95% CI: 0.0303, 0.0436) for the cumulative-means model and 0.0177 (95% CI: 0.0110, 0.0243) for the longitudinal model. In conclusion, incorporating repeated measurements of blood pressure and cholesterol into CVD risk prediction models slightly improves risk prediction. PMID:28549073
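
    The cumulative-means model's input can be illustrated with a short sketch, under the assumption that the value entered at visit k is the mean of a participant's measurements at visits 1 through k:

```python
def cumulative_means(measurements):
    """Cumulative mean of one participant's repeated risk-factor
    measurements (e.g. systolic blood pressure): the value used at
    visit k is the mean of visits 1..k."""
    out, total = [], 0.0
    for k, x in enumerate(measurements, start=1):
        total += x
        out.append(total / k)
    return out
```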

  9. [Evaluation and improvement of the comprehension of informed consent documents].

    PubMed

    López-Picazo Ferrer, Julio José; Tomás Garcia, Nuria

    2016-04-01

    The information contained in a good informed consent form (ICF) must be understood by patients. The aim of this study was to assess and improve the readability of the ICFs submitted for accreditation in a tertiary hospital. A quality assessment and improvement study of 132 ICFs from 2 departments of a public tertiary hospital, divided into 3 phases: initial assessment, intervention, and reassessment. Both length and readability were assessed. Length was measured in words (adequate up to 470, excessive over 940), and readability in INFLESZ points (suitable if over 55). The ICF contents initially proposed by the departments were adapted by trained lay persons, whose doubts about medical terms were resolved by the authors. To compare results between evaluations, relative improvement (in both length and INFLESZ score) and statistical significance were calculated. 78.8% of the ICFs showed the desired length (95% CI: 71.1-86.5), with a mean of 44.1 INFLESZ points (3.8% >55 points; 95% CI: 1.6-6.0). After the intervention, the INFLESZ score rose to 61.9 points (improvement 40.3%, P<.001), with all ICFs showing >55 points. The resulting ICFs had a longer description of the nature of the procedure (P<.0001) and a shorter description of its consequences, risks (P<.0001), and alternatives (P<.05). The introduction of improvement dynamics into the design of ICFs is possible and necessary, because it produces more effective and easily readable ICFs.
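
    The length and readability thresholds reported above can be expressed directly. A minimal sketch; the INFLESZ score itself comes from a Spanish readability formula that is not computed here, and the function name is illustrative:

```python
def assess_icf(word_count, inflesz_score):
    """Classify an informed consent form using the study's thresholds:
    length adequate up to 470 words, excessive above 940; readability
    suitable when the INFLESZ score exceeds 55 points."""
    if word_count <= 470:
        length = "adequate"
    elif word_count <= 940:
        length = "intermediate"
    else:
        length = "excessive"
    return length, inflesz_score > 55
```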

  10. Up by upwest: Is slope like north?

    PubMed

    Weisberg, Steven M; Nardi, Daniele; Newcombe, Nora S; Shipley, Thomas F

    2014-10-01

    Terrain slope can be used to encode the location of a goal. However, this directional information may be encoded using a conceptual north (i.e., invariantly with respect to the environment), or in an observer-relative fashion (i.e., varying depending on the direction one faces when learning the goal). This study examines which representation is used, whether the sensory modality in which slope is encoded (visual, kinaesthetic, or both) influences representations, and whether use of slope varies for men and women. In a square room, with a sloped floor explicitly pointed out as the only useful cue, participants encoded the corner in which a goal was hidden. Without direct sensory access to slope cues, participants used a dial to point to the goal. For each trial, the goal was hidden uphill or downhill, and the participants were informed whether they faced uphill or downhill when pointing. In support of observer-relative representations, participants pointed more accurately and quickly when facing concordantly with the hiding position. There was no effect of sensory modality, providing support for functional equivalence. Sex did not interact with the findings on modality or reference frame, but spatial measures correlated with success on the slope task differently for each sex.

  11. The Influence of Nutrition Labeling and Point-of-Purchase Information on Food Behaviours.

    PubMed

    Volkova, Ekaterina; Ni Mhurchu, Cliona

    2015-03-01

    Point-of-purchase information on packaged food has been a highly debated topic. Various types of nutrition labels and point-of-purchase information have been studied to determine their ability to attract consumers' attention, be well understood and promote healthy food choices. Country-specific regulatory and monitoring frameworks have been implemented to ensure reliability and accuracy of such information. However, the impact of such information on consumers' behaviour remains contentious. This review summarizes recent evidence on the real-world effectiveness of nutrition labels and point-of-purchase information.

  12. Ground-based measurements of ionospheric dynamics

    NASA Astrophysics Data System (ADS)

    Kouba, Daniel; Chum, Jaroslav

    2018-05-01

    Different methods are used to research and monitor ionospheric dynamics from the ground: Digisonde Drift Measurements (DDM) and Continuous Doppler Sounding (CDS). For the first time, we present a comparison between the two methods on specific examples. Both methods provide information about the vertical drift velocity component. The DDM provides more information about the drift velocity vector and the detected reflection points; however, it is limited by a relatively low time resolution. In contrast, the strength of CDS is its high time resolution. The discussed methods can be used for real-time monitoring of medium-scale travelling ionospheric disturbances. We conclude that it is advantageous to use both methods simultaneously if possible, with CDS applied for disturbance detection and analysis and DDM for reflection height control.
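
    For the vertical component, Doppler-sounding analysis commonly relies on the mirror-like reflection approximation relating the Doppler shift to the motion of the reflection point, f_D = -(2 f0 / c) v. A hedged sketch; the sign convention and the neglect of oblique propagation and sub-reflection refraction are simplifying assumptions:

```python
# Speed of light in vacuum, m/s
C = 299_792_458.0

def vertical_drift(doppler_shift_hz, sounding_freq_hz):
    """Vertical velocity of the reflection point inferred from a continuous
    Doppler sounder under the mirror-like reflection approximation
    f_D = -(2 f0 / c) * v. Positive result = reflection point moving up,
    away from the ground transmitter, which lowers the received frequency."""
    return -doppler_shift_hz * C / (2.0 * sounding_freq_hz)
```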

  13. Entropy of Movement Outcome in Space-Time.

    PubMed

    Lai, Shih-Chiung; Hsieh, Tsung-Yu; Newell, Karl M

    2015-07-01

    Information entropy of the joint spatial and temporal (space-time) probability of discrete movement outcome was investigated in two experiments as a function of different movement strategies (space-time, space, and time instructional emphases), task goals (point-aiming and target-aiming) and movement speed-accuracy constraints. The variance of the movement spatial and temporal errors was reduced by instructional emphasis on the respective spatial or temporal dimension, but increased on the other dimension. The space-time entropy was lower in the target-aiming task than the point-aiming task but did not differ between instructional emphases. However, the joint probabilistic measure of spatial and temporal entropy showed that spatial error is traded for timing error in tasks with space-time criteria and that the pattern of movement error depends on the dimension of the measurement process. The unified entropy measure of movement outcome in space-time reveals a new relation for the speed-accuracy trade-off.
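
    A joint space-time outcome entropy in the spirit of the measure described above can be sketched from a discretized joint distribution of the two error dimensions. The bin count and binning scheme here are illustrative choices, not the paper's:

```python
import numpy as np

def space_time_entropy(spatial_errors, temporal_errors, bins=10):
    """Shannon entropy (bits) of the joint discretized distribution of
    spatial and temporal movement errors across repeated trials."""
    counts, _, _ = np.histogram2d(spatial_errors, temporal_errors, bins=bins)
    p = counts.ravel() / counts.sum()
    p = p[p > 0]                       # drop empty cells (0 log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```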

  14. A sequence of physical processes quantified in LAOS by continuous local measures

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Wei; Rogers, Simon A.

    2017-11-01

    The response to large amplitude oscillatory shear of a soft colloidal glass formed by a suspension of multiarm star polymers is investigated by means of well-defined continuous local measures. The local measures provide information regarding the transient elastic and viscous response of the material, as well as elastic extension via a shifting equilibrium position. It is shown that even when the amplitude of the strain is very large, cages reform and break twice per period and exhibit maximum elasticity around the point of zero stress. It is also shown that around the point of zero stress, the cages are extended by a nearly constant amount of approximately 5% at 1 rad/s and 7% at 10 rad/s, even when the total strain is as large as 420%. The results of this study provide a blueprint for a generic approach to elucidating the complex dynamics exhibited by soft materials under flow.

  15. A Preliminary Validation Analysis of Orbiting Carbon Observatory-2 (OCO-2) Measurements Using TCCON Data

    NASA Astrophysics Data System (ADS)

    Osterman, G. B.; Fisher, B.; Roehl, C. M.; Wunch, D.; Wennberg, P. O.; Eldering, A.; Naylor, B. J.; Crisp, D.; Pollock, H. R.; Gunson, M. R.

    2014-12-01

    The NASA Orbiting Carbon Observatory-2 (OCO-2) successfully launched from Vandenberg Air Force Base in California on July 2, 2014. The OCO-2 mission is designed to provide remotely sensed measurements of the column-averaged dry air mole fraction of carbon dioxide from space. OCO-2 is capable of making measurements in three observation modes: nadir, glint, and target. The standard operational mode for OCO-2 alternates between nadir and glint mode every 16 days, but target mode observations are possible by commanding the spacecraft to point to a specific surface location. In this presentation we provide information on the preliminary observations and plans for OCO-2 in 2015. In particular, we will provide an update on the pointing capabilities and accuracy for OCO-2. We provide updates on OCO-2 target mode, including possible target mode locations. We will show calendars for the different viewing geometries and target mode possibilities.

  16. Rating the strength of coal mine roof rocks. Information circular/1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molinda, G.M.; Mark, C.

    1996-05-01

    The Ferm pictorial classification of coal measure rocks is widely utilized in coalfield exploration. Although extremely useful as an alternative to conventional geologic description, no material properties are provided that would be suitable for engineering solutions. To remedy this problem, the USBM has tested over 30 common coal measure roof rock types for axial and bedding strength. More than 1,300 individual point load tests have been conducted on core from 8 different coal mines representing the full range of common coal measure rocks. The USBM core and roof exposure properties database has been merged with the picture classification to provide, for the first time, a simple, clear guide from field identification of core to the associated mechanical strength of the rock. For 33 of the most common roof rocks, the axial and diametral point load strength, as well as the ultimate unit rating, is overprinted onto the photograph.
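
    The quantity behind such point load ratings is conventionally the point load strength index. A minimal sketch of the standard uncorrected form Is = P / De^2 (the ISRM size correction to Is(50) is omitted for brevity, and the function name is illustrative):

```python
def point_load_index(failure_load_n, core_diameter_mm):
    """Uncorrected point load strength index Is = P / De^2 in MPa, the
    standard quantity reported for diametral point load tests on core:
    failure load P in newtons, equivalent core diameter De in mm."""
    de_m = core_diameter_mm / 1000.0   # mm -> m
    return failure_load_n / (de_m ** 2) / 1e6  # Pa -> MPa
```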

  17. MSL: A Measure to Evaluate Three-dimensional Patterns in Gene Expression Data

    PubMed Central

    Gutiérrez-Avilés, David; Rubio-Escudero, Cristina

    2015-01-01

    Microarray technology is highly used in biological research environments due to its ability to monitor RNA concentration levels. The analysis of the data generated represents a computational challenge due to the characteristics of these data. Clustering techniques are widely applied to create groups of genes that exhibit a similar behavior. Biclustering relaxes the constraints for grouping, allowing genes to be evaluated only under a subset of the conditions. Triclustering appears for the analysis of longitudinal experiments in which the genes are evaluated under certain conditions at several time points. These triclusters provide hidden information in the form of behavior patterns from temporal experiments with microarrays relating subsets of genes, experimental conditions, and time points. We present an evaluation measure for triclusters called Multi Slope Measure, based on the similarity among the angles of the slopes of each profile formed across the genes, conditions, and time points of the tricluster. PMID:26124630
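
    An angle-based profile comparison in the spirit of MSL can be sketched as follows. The published measure aggregates such angle comparisons over all genes, conditions, and time points of a tricluster; this pairwise form and its names are illustrative:

```python
import math

def profile_angles(profile):
    """Angles (radians) of the slopes between consecutive time points of
    one expression profile, assuming unit spacing between time points."""
    return [math.atan(b - a) for a, b in zip(profile, profile[1:])]

def mean_angle_difference(profile_a, profile_b):
    """Mean absolute difference between the slope angles of two equal-length
    profiles: 0.0 means the profiles follow identical behavior patterns."""
    angles_a = profile_angles(profile_a)
    angles_b = profile_angles(profile_b)
    return sum(abs(x - y) for x, y in zip(angles_a, angles_b)) / len(angles_a)
```

    Note that two parallel profiles shifted by a constant offset score 0.0, which matches the intuition that triclusters group genes by pattern rather than by absolute expression level.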

  18. Correlation of predicted and measured thermal stresses on an advanced aircraft structure with dissimilar materials. [hypersonic heating simulation]

    NASA Technical Reports Server (NTRS)

    Jenkins, J. M.

    1979-01-01

    Additional information was added to a growing data base from which estimates of finite element model complexities can be made with respect to thermal stress analysis. The manner in which temperatures were smeared to the finite element grid points was examined from the point of view of the impact on thermal stress calculations. The general comparison of calculated and measured thermal stresses is quite good, and there is little doubt that the finite element approach provided by NASTRAN results in correct thermal stress calculations. Discrepancies did exist between measured and calculated values in the skin and the skin/frame junctures. The problems with predicting skin thermal stress were attributed to inadequate temperature inputs to the structural model rather than modeling insufficiencies. The discrepancies occurring at the skin/frame juncture were most likely due to insufficient modeling elements rather than temperature problems.

  19. Signatures of bifurcation on quantum correlations: Case of the quantum kicked top

    NASA Astrophysics Data System (ADS)

    Bhosale, Udaysinh T.; Santhanam, M. S.

    2017-01-01

    Quantum correlations reflect the quantumness of a system and are useful resources for quantum information and computational processes. Measures of quantum correlations do not have a classical analog and yet are influenced by classical dynamics. In this work, by modeling the quantum kicked top as a multiqubit system, the effect of classical bifurcations on measures of quantum correlations such as the quantum discord, geometric discord, and Meyer and Wallach Q measure is studied. The quantum correlation measures change rapidly in the vicinity of a classical bifurcation point. If the classical system is largely chaotic, time averages of the correlation measures are in good agreement with the values obtained by considering the appropriate random matrix ensembles. The quantum correlations scale with the total spin of the system, representing its semiclassical limit. In the vicinity of trivial fixed points of the kicked top, the scaling function decays as a power law. In the chaotic limit, for large total spin, quantum correlations saturate to a constant, which we obtain analytically, based on random matrix theory, for the Q measure. We also suggest that it can have experimental consequences.
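
    Of the measures named above, the Meyer and Wallach Q measure has a compact closed form for an n-qubit pure state, Q = 2(1 - (1/n) Σ_k Tr ρ_k²), where ρ_k is the reduced density matrix of qubit k. A minimal sketch:

```python
import numpy as np

def meyer_wallach_q(state):
    """Meyer-Wallach Q measure of multipartite entanglement for an n-qubit
    pure state vector of length 2**n: Q = 2 * (1 - mean_k Tr(rho_k^2))."""
    psi = np.asarray(state, dtype=complex).ravel()
    n = int(np.log2(psi.size))
    purities = []
    for k in range(n):
        # Move qubit k's axis first, flatten the rest; rows of m index
        # qubit k, so m @ m^dagger is the reduced density matrix rho_k.
        m = np.moveaxis(psi.reshape((2,) * n), k, 0).reshape(2, -1)
        rho_k = m @ m.conj().T
        purities.append(np.trace(rho_k @ rho_k).real)
    return 2.0 * (1.0 - sum(purities) / n)
```

    A product state gives Q = 0, while a two-qubit Bell state gives Q = 1, the maximal value.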

  20. Element-resolved Kikuchi pattern measurements of non-centrosymmetric materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vos, Maarten, E-mail: maarten.vos@anu.edu.au

    2017-01-15

    Angle-resolved electron Rutherford backscattering (ERBS) measurements using an electrostatic electron energy analyser can provide unique access to element-resolved crystallographic information. We present Kikuchi pattern measurements of the non-centrosymmetric crystal GaP, separately resolving the contributions of electrons backscattered from Ga and P. In comparison to element-integrated measurements like in the method of electron backscatter diffraction (EBSD), the effect of the absence of a proper 4-fold rotation axis in the point group of GaP can be sensed with a much higher visibility via the element-resolved Ga to P intensity ratio. These element-resolved measurements make it possible to experimentally attribute the previously observed point-group dependent effect in element-integrated EBSD measurements to the larger contribution of electrons scattered from Ga compared to P.

    Highlights:
    - Element-specific Kikuchi patterns are presented for GaP.
    - Absence of a proper four-fold rotation axis is demonstrated.
    - Ga and P intensity variations after 90 degree rotation have opposite phase.
    - The asymmetry in the total intensity distribution resembles that of Ga.

  1. Recent Observational Efforts Using the DOE ARM Observatory at Oliktok Point, Alaska

    NASA Astrophysics Data System (ADS)

    de Boer, G.; Shupe, M.; McComiskey, A. C.; Creamean, J.; Williams, C. R.; Matrosov, S. Y.; Solomon, A.; Turner, D. D.; Norgren, M.; Maahn, M.; Lawrence, D.; Argrow, B. M.; Palo, S. E.; Weibel, D.; Curry, N.; Nichols, T.; D'Amore, P.; Finamore, W.; Ivey, M.; Bendure, A.; Schmid, B.; Biraud, S.

    2016-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program has deployed its third mobile facility (AMF-3) to Oliktok Point, Alaska for an extended measurement campaign. This facility includes a variety of instruments to measure clouds, aerosols, surface meteorology, and surface energy exchange (including radiation). Additionally, this site features two areas of controlled airspace in which additional measurements can be made using manned and unmanned aircraft and tethered balloons. Over the past two years, several field campaigns have taken place to make measurements complementary to those collected by the AMF-3. These include several unmanned aircraft and tethered balloon campaigns (Coordinated Observations of the Lower Arctic Atmosphere, COALA; Evaluation of Routine Atmospheric Sounding Measurements using Unmanned Systems, ERASMUS; Inaugural Campaigns for ARM Research using Unmanned Systems, ICARUS), as well as a manned aircraft campaign during the summer of 2015 (ARM Carbon Measurement Experiment, ACME-5). In addition to these field campaigns, DOE has formed a site science team to conduct research using AMF-3 measurements. In this presentation, we will provide an overview of these measurement campaigns. Additionally, we will provide an overview of scientific results from these campaigns and from AMF-3 research that help inform numerical modeling efforts.

  2. A comparison of the Scottish Index of Multiple Deprivation (SIMD) 2004 with the 2009 + 1 SIMD: does choice of measure affect the interpretation of inequality in mortality?

    PubMed

    Ralston, Kevin; Dundas, Ruth; Leyland, Alastair H

    2014-07-08

    There is a growing international literature assessing inequalities in health and mortality by area based measures. However, there are few works comparing measures available to inform research design. The analysis here seeks to begin to address this issue by assessing whether there are important differences in the relationship between deprivation and inequalities in mortality when measures that have been constructed at different time points are compared. We contrast whether the interpretation of inequalities in all-cause mortality between the years 2008-10 changes in Scotland if we apply the earliest (2004) and the 2009 + 1 releases of the Scottish Index of Multiple Deprivation (SIMD) to make this comparison. The 2004 release is based on data from 2001/2 and the 2009 + 1 release is based on data from 2008/9. The slope index of inequality (SII) and 1:10 ratio are used to summarise inequalities standardised by age/sex using population and mortality records. The 1:10 ratio suggests some differences in the magnitude of inequalities measured using SIMD at different time points. However, the SII shows much closer correspondence. Overall the findings show that substantive conclusions in relation to inequalities in all-cause mortality are little changed by the updated measure. This information is beneficial to researchers as the most recent measures are not always available. This adds to the body of literature showing stability in inequalities in health and mortality by geographical deprivation over time.
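
    The two summary measures named above can be sketched numerically. This is a minimal illustration only: the decile rates and population shares are invented, and published SII calculations additionally standardise by age/sex as described in the abstract.

```python
import numpy as np

# Invented deciles: population share and standardised mortality rate
# per 100,000, ordered from least to most deprived.
pop_share = np.full(10, 0.1)
rate = np.array([420., 450., 480., 500., 540., 575., 610., 660., 720., 800.])

# Relative rank of each decile = midpoint of its cumulative population range
cum = np.cumsum(pop_share)
rank = cum - pop_share / 2          # 0.05, 0.15, ..., 0.95

# SII: slope of a population-weighted regression of rate on rank, i.e. the
# absolute rate gap between the two extremes of the deprivation scale
w = pop_share
xbar = np.sum(w * rank) / np.sum(w)
ybar = np.sum(w * rate) / np.sum(w)
sii = np.sum(w * (rank - xbar) * (rate - ybar)) / np.sum(w * (rank - xbar) ** 2)

# 1:10 ratio: most deprived decile rate over least deprived decile rate
ratio = rate[-1] / rate[0]
```

    The SII responds to the whole gradient across deciles, while the 1:10 ratio only compares the extremes, which is one reason the two measures can disagree about the magnitude of inequality.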

  3. Characterization of irradiation induced deep and shallow impurities

    NASA Astrophysics Data System (ADS)

    Treberspurg, Wolfgang; Bergauer, Thomas; Dragicevic, Marko; Krammer, Manfred; Valentan, Manfred

    2013-12-01

    Silicon detectors close to the interaction point of the High Luminosity Large Hadron Collider (HL-LHC) have to withstand a harsh irradiation environment. In order to evaluate the behaviour of shallow and deep defects induced by neutron irradiation, spreading resistance resistivity measurements and capacitance-voltage measurements have been performed. These measurements deliver information about the profile of shallow impurities after irradiation as well as indications of deep defects in the Space Charge Region (SCR) and the Electrically Neutral Bulk (ENB). By considering the theoretical background of the measurement, both kinds of defects can be investigated independently of each other.

  4. Control Model for Dampening Hand Vibrations Using Information of Internal and External Coordinates

    PubMed Central

    Togo, Shunta; Kagawa, Takahiro; Uno, Yoji

    2015-01-01

    In the present study, we investigate a control mechanism that dampens hand vibrations. Here, we propose a control method with two components to suppress hand vibrations. The first is a passive suppression method that lowers the joint stiffness to passively dampen the hand vibrations. The second is an active suppression method that adjusts an equilibrium point based on skyhook control to actively dampen the hand vibrations. In a simulation experiment, we applied these two methods to dampen hand vibrations during the shoulder’s horizontal oscillation. We also conducted a measurement experiment wherein a subject’s shoulder was sinusoidally oscillated by a platform that generated horizontal oscillations. The results of the measurement experiments showed that the jerk of each part of the arm in a task using a cup filled with water was smaller than the shoulder jerk and that in a task with a cup filled with stones was larger than the shoulder jerk. Moreover, the amplitude of the hand trajectory in both horizontal and vertical directions was smaller in a task using a cup filled with water than in a task using a cup filled with stones. The results of the measurement experiments were accurately reproduced by the active suppression method based on skyhook control. These results suggest that humans dampen hand vibrations by controlling the equilibrium point through the information of the external workspace and the internal body state rather than by lowering joint stiffness only by using internal information. PMID:25876037
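
    As a rough illustration of why skyhook control suppresses vibration better than passive joint mechanics alone, here is a minimal one-degree-of-freedom sketch. All masses, stiffnesses, and damping values are invented, and the paper's actual arm model is multi-joint; the point is only that a force proportional to the mass's absolute velocity (the skyhook term) reduces the steady-state amplitude.

```python
import math

def simulate(c_sky, t_end=10.0, dt=0.001):
    """1-DOF hand/arm mass on a spring-damper attached to an oscillating
    base (the shoulder), with an optional skyhook force proportional to
    the mass's ABSOLUTE velocity. All parameter values are invented."""
    m, k, c = 1.0, 100.0, 2.0   # mass (kg), stiffness (N/m), passive damping
    amp, w = 0.01, 10.0         # base amplitude (m) and frequency (rad/s)
    x, v, t, peak = 0.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        u = amp * math.sin(w * t)            # base (shoulder) position
        du = amp * w * math.cos(w * t)       # base velocity
        a = (-k * (x - u) - c * (v - du) - c_sky * v) / m
        v += a * dt                          # semi-implicit Euler step
        x += v * dt
        t += dt
        if t > t_end - 2.0:                  # track steady-state amplitude
            peak = max(peak, abs(x))
    return peak

passive_peak = simulate(c_sky=0.0)    # passive dynamics only
skyhook_peak = simulate(c_sky=20.0)   # active skyhook damping added
```

    With the base driven near the system's natural frequency, the passive response is amplified, while the skyhook term damps the absolute motion of the mass rather than the motion relative to the base.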

  5. Measuring flow velocity and flow direction by spatial and temporal analysis of flow fluctuations.

    PubMed

    Chagnaud, Boris P; Brücker, Christoph; Hofmann, Michael H; Bleckmann, Horst

    2008-04-23

    If exposed to bulk water flow, fish lateral line afferents respond only to flow fluctuations (AC) and not to the steady (DC) component of the flow. Consequently, a single lateral line afferent can encode neither bulk flow direction nor velocity. It is possible, however, for a fish to obtain bulk flow information using multiple afferents that respond only to flow fluctuations. We show by means of particle image velocimetry that, if a flow contains fluctuations, these fluctuations propagate with the flow. A cross-correlation of water motion measured at an upstream point with that at a downstream point can then provide information about flow velocity and flow direction. In this study, we recorded from pairs of primary lateral line afferents while a fish was exposed to either bulk water flow, or to the water motion caused by a moving object. We confirm that lateral line afferents responded to the flow fluctuations and not to the DC component of the flow, and that responses of many fiber pairs were highly correlated, if they were time-shifted to correct for gross flow velocity and gross flow direction. To prove that a cross-correlation mechanism can be used to retrieve the information about gross flow velocity and direction, we measured the flow-induced bending motions of two flexible micropillars separated in a downstream direction. A cross-correlation of the bending motions of these micropillars did indeed produce an accurate estimate of the velocity vector along the direction of the micropillars.
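
    The cross-correlation mechanism described above can be sketched in a few lines. The sampling rate, sensor separation, and flow speed below are invented for illustration: a downstream sensor sees a delayed copy of the upstream fluctuation signal, and the lag of the cross-correlation peak recovers the transit time and hence the flow velocity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two sensors separated by 5 cm sample the same
# advected flow fluctuation.
fs = 1000.0           # sampling rate, Hz
sep = 0.05            # sensor separation, m
true_velocity = 0.25  # bulk flow speed, m/s
delay = int(round(sep / true_velocity * fs))  # transit time in samples

n = 4096
sig = rng.standard_normal(n + delay)
upstream = sig[delay:]    # upstream sensor sees the fluctuation first
downstream = sig[:n]      # downstream signal = upstream delayed by `delay`

# Cross-correlate; the lag of the peak gives the transit time, and its
# sign gives the flow direction (positive: downstream lags upstream)
xc = np.correlate(downstream - downstream.mean(),
                  upstream - upstream.mean(), mode="full")
lag = int(np.argmax(xc)) - (n - 1)
est_velocity = sep / (lag / fs)
```

    A negative peak lag would indicate flow in the opposite direction, which is how the mechanism encodes direction as well as speed.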

  6. A point prevalence cross-sectional study of healthcare-associated urinary tract infections in six Australian hospitals

    PubMed Central

    Gardner, Anne; Mitchell, Brett; Beckingham, Wendy; Fasugba, Oyebola

    2014-01-01

    Objectives Urinary tract infections (UTIs) account for over 30% of healthcare-associated infections. The aim of this study was to determine healthcare-associated UTI (HAUTI) and catheter-associated UTI (CAUTI) point prevalence in six Australian hospitals to inform a national point prevalence process and compare two internationally accepted HAUTI definitions. We also described the level and comprehensiveness of clinical record documentation, microbiology laboratory and coding data at identifying HAUTIs and CAUTIs. Setting Data were collected from three public and three private Australian hospitals over the first 6 months of 2013. Participants A total of 1109 patients were surveyed. Records of patients of all ages, hospitalised on the day of the point prevalence at the study sites, were eligible for inclusion. Outpatients, patients in adult mental health units, patients categorised as maintenance care type (ie, patients waiting to be transferred to a long-term care facility) and those in the emergency department during the duration of the survey were excluded. Outcome measures The primary outcome measures were the HAUTI and CAUTI point prevalence. Results Overall HAUTI and CAUTI prevalence was 1.4% (15/1109) and 0.9% (10/1109), respectively. Staphylococcus aureus and Candida species were the most common pathogens. One-quarter (26.3%) of patients had a urinary catheter and fewer than half had appropriate documentation. Eight of the 15 patients ascertained to have a HAUTI based on clinical records (6 being CAUTI) were coded by the medical records department with an International Classification of Diseases (ICD)-10 code for UTI diagnosis. The Health Protection Agency Surveillance definition had a positive predictive value of 91.67% (CI 64.61 to 98.51) compared against the Centers for Disease Control and Prevention definition. 
Conclusions These study results provide a foundation for a national Australian point prevalence study and inform the development and implementation of targeted healthcare-associated infection surveillance more broadly. PMID:25079929

  7. Vocal Tract Representation in the Recognition of Cerebral Palsied Speech

    ERIC Educational Resources Information Center

    Rudzicz, Frank; Hirst, Graeme; van Lieshout, Pascal

    2012-01-01

    Purpose: In this study, the authors explored articulatory information as a means of improving the recognition of dysarthric speech by machine. Method: Data were derived chiefly from the TORGO database of dysarthric articulation (Rudzicz, Namasivayam, & Wolff, 2011) in which motions of various points in the vocal tract are measured during speech.…

  8. Issues in the Enumeration of Handicapping Conditions in the United States.

    ERIC Educational Resources Information Center

    Martini, Linda; MacTurk, Robert H.

    1985-01-01

    The article identifies sources of prevalence and incidence rates for handicaps and disabilities, points out problems regarding obtaining this information, and examines reasons for the problems. Two measures are suggested: first, to set up a national directory of those health statistics already being collected; and second, to develop a nationwide…

  9. Game Theory, Decision Theory, and Social Choice Theory in the Context of a New Theory of Equity

    DTIC Science & Technology

    1978-12-01

    Press of Harvard University. Sen, Amartya, 1977, "On Weights and Measures: Informational Constraints in Social Welfare Analysis," Econometrica, Vol...this requirement is a natural consequence of a setup that rules out cardinal utility, as Arrow (1978) and Sen (1977) have pointed out in different

  10. Real-time scheduling faces operational challenges.

    PubMed

    2005-01-01

    Online real-time patient scheduling presents a number of challenges. But a few advanced organizations are rolling out systems slowly, meeting those challenges as they go. And while this application is still too new to provide measurable benefits, anecdotal information seems to point to improvements in efficiency, patient satisfaction, and possibly quality of care.

  11. Site quality relationships for shortleaf pine

    Treesearch

    David L. Graney

    1986-01-01

    Existing information about site quality relationships for shortleaf pine (Pinus echinata Mill.) in the southeastern United States is reviewed in this paper. Estimates of site quality, whether from direct tree measurements or indirect estimates based on soil and site features, are only local observations for many points on the landscape. To be of value to the land...

  12. 76 FR 33275 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ... storage, conveyor transfer points, bagging and bulk loading and unloading systems. These standards rely on... part shall maintain a file of these measurements, and retain the file for at least two years following... total time, effort, or financial resources expended by persons to generate, maintain, retain, or...

  13. Wave Information Studies of US Coastlines: Hindcast Wave Information for the Great Lakes: Lake Erie

    DTIC Science & Technology

    1991-10-01

    total ice cover) for individual grid cells measuring 5 km square. The GLERL analyzed each half-month data set to provide the maximum, minimum...average, median, and modal ice concentrations for each 5-km cell. The median value, which represents an estimate of the 50-percent point of the ice...incorporating the progression and decay of the time-dependent ice cover was complicated by the fact that different grid cell sizes were used for mapping the ice

  14. Development of Rhizo-Columns for Nondestructive Root System Architecture Laboratory Measurements

    NASA Astrophysics Data System (ADS)

    Oostrom, M.; Johnson, T. J.; Varga, T.; Hess, N. J.; Wietsma, T. W.

    2016-12-01

    Numerical models for root water uptake in plant-soil systems have been developing rapidly, increasing the demand for laboratory experimental data to test and verify these models. Most of the increasingly detailed models are either compared to long-term field crop data or do not involve comparisons at all. Ideally, experiments would provide information on dynamic root system architecture (RSA) in combination with soil-plant hydraulics such as water pressures and volumetric water contents. Data obtained from emerging methods such as Spectral Induced Polarization (SIP) and X-ray computed tomography (X-ray CT) may be used to provide laboratory RSA data needed for model comparisons. Point measurements such as polymer tensiometers (PT) may provide soil moisture information over a large range of water pressures, from field capacity to the wilting point under drought conditions. In the presentation, we demonstrate a novel laboratory capability allowing for detailed RSA studies in large columns under controlled conditions using automated SIP, X-ray CT, and PT methods. Examples are shown for pea and corn root development under various moisture regimes.

  15. Genetic thinking in the study of social relationships: Five points of entry

    PubMed Central

    Reiss, David

    2014-01-01

    For nearly a generation, researchers studying human behavioral development have combined genetically informed research designs with careful measures of social relationships: parenting, sibling relationships, peer relationships, marital processes, social class stratifications and patterns of social engagement in the elderly. In what way have these genetically informed studies altered the construction and testing of social theories of human development? We consider five points where genetic thinking is taking hold. First, genetic findings suggest an alternative scenario for explaining social data. Associations between measures of the social environment and human development may be due to genes that influence both. Second, genetic studies add to other prompts to study the early developmental origins of current social phenomena in mid-life and beyond. Third, genetic analyses promise to bring to the surface understudied social systems, such as sibling relationships, that have an impact on human development independent of genotype. Fourth, genetic analyses anchor in neurobiology individual differences in resilience and sensitivity to both adverse and favorable social environments. Finally, genetic analyses increase the utility of laboratory simulations of human social processes and of animal models. PMID:25419225

  16. The Development of Memory Efficiency and Value-Directed Remembering Across the Lifespan: A Cross-Sectional Study of Memory and Selectivity

    PubMed Central

    Castel, Alan D.; Humphreys, Kathryn L.; Lee, Steve S.; Galván, Adriana; Balota, David A.; McCabe, David P.

    2012-01-01

    Although attentional control and memory change considerably across the lifespan, no research has examined how the ability to strategically remember important information (i.e., value-directed remembering) changes from childhood to old age. The present study examined this in different age groups across the lifespan (N=320, 5 to 96 years old). We employed a selectivity task where participants were asked to study and recall items worth different point values in order to maximize their point score. This procedure allowed for measures of memory quantity/capacity (number of words recalled) and memory efficiency/selectivity (the recall of high-value items relative to low-value items). Age-related differences were found for memory capacity, as young adults recalled more words than the other groups. However, in terms of selectivity, younger and older adults were more selective than adolescents and children. The dissociation between these measures across the lifespan illustrates important age-related differences in terms of memory capacity and the ability to selectively remember high-value information. PMID:21942664

  17. The development of memory efficiency and value-directed remembering across the life span: a cross-sectional study of memory and selectivity.

    PubMed

    Castel, Alan D; Humphreys, Kathryn L; Lee, Steve S; Galván, Adriana; Balota, David A; McCabe, David P

    2011-11-01

    Although attentional control and memory change considerably across the life span, no research has examined how the ability to strategically remember important information (i.e., value-directed remembering) changes from childhood to old age. The present study examined this in different age groups across the life span (N = 320, 5-96 years old). A selectivity task was used in which participants were asked to study and recall items worth different point values in order to maximize their point score. This procedure allowed for measures of memory quantity/capacity (number of words recalled) and memory efficiency/selectivity (the recall of high-value items relative to low-value items). Age-related differences were found for memory capacity, as young adults recalled more words than the other groups. However, in terms of selectivity, younger and older adults were more selective than adolescents and children. The dissociation between these measures across the life span illustrates important age-related differences in terms of memory capacity and the ability to selectively remember high-value information.
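
    A commonly used selectivity index in value-directed remembering studies compares the points actually earned with the chance and ideal scores for the same number of recalled items. The item values and recalled set below are invented for illustration; an index of 1 means perfectly value-selective recall, 0 means recall unrelated to value.

```python
# SI = (actual - chance) / (ideal - chance), where for r recalled items
# "ideal" is the sum of the r highest values and "chance" is r times the
# mean item value.
values = list(range(1, 13))        # 12 studied words, worth 1..12 points
recalled = [12, 11, 9, 4]          # values of the words this subject recalled

r = len(recalled)
actual = sum(recalled)                              # points earned
ideal = sum(sorted(values, reverse=True)[:r])       # best possible for r items
chance = r * sum(values) / len(values)              # value-blind recall of r items
selectivity = (actual - chance) / (ideal - chance)
```

    Because the index conditions on the number of items recalled, it separates memory efficiency/selectivity from memory quantity/capacity, which is exactly the dissociation the study examines across age groups.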

  18. A review and a framework of handheld computer adoption in healthcare.

    PubMed

    Lu, Yen-Chiao; Xiao, Yan; Sears, Andrew; Jacko, Julie A

    2005-06-01

    Wide adoption of mobile computing technology can potentially improve information access, enhance workflow, and promote evidence-based practice to make informed and effective decisions at the point of care. Handheld computers or personal digital assistants (PDAs) offer portable and unobtrusive access to clinical data and relevant information at the point of care. This article reviews the literature on issues related to adoption of PDAs in health care and barriers to PDA adoption. Studies showed that PDAs were used widely in health care providers' practice, and the level of use is expected to rise rapidly. Most care providers found PDAs to be functional and useful in areas of documentation, medical reference, and access to patient data. Major barriers to adoption were identified as usability, security concerns, and lack of technical and organizational support. PDAs offer health care practitioners advantages to enhance their clinical practice. However, better designed PDA hardware and software applications, more institutional support, seamless integration of PDA technology with hospital information systems, and satisfactory security measures are necessary to increase acceptance and wide use of PDAs in healthcare.

  19. Information trade-offs for optical quantum communication.

    PubMed

    Wilde, Mark M; Hayden, Patrick; Guha, Saikat

    2012-04-06

    Recent work has precisely characterized the achievable trade-offs between three key information processing tasks: classical communication (generation or consumption), quantum communication (generation or consumption), and shared entanglement (distribution or consumption), measured in bits, qubits, and ebits per channel use, respectively. Slices and corner points of this three-dimensional region reduce to well-known protocols for quantum channels. A trade-off coding technique can attain any point in the region and can outperform time sharing between the best-known protocols for accomplishing each information processing task by itself. Previously, the benefits of trade-off coding that had been found were too small to be of practical value (viz., for the dephasing and the universal cloning machine channels). In this Letter, we demonstrate that the associated performance gains are in fact remarkably high for several physically relevant bosonic channels that model free-space or fiber-optic links, thermal-noise channels, and amplifiers. We show that significant performance gains from trade-off coding also apply when trading photon-number resources between transmitting public and private classical information simultaneously over secret-key-assisted bosonic channels. © 2012 American Physical Society

  20. Stream Flow Prediction by Remote Sensing and Genetic Programming

    NASA Technical Reports Server (NTRS)

    Chang, Ni-Bin

    2009-01-01

    A genetic programming (GP)-based, nonlinear modeling structure relates soil moisture with synthetic-aperture-radar (SAR) images to present representative soil moisture estimates at the watershed scale. Surface soil moisture measurement is difficult to obtain over a large area due to a variety of soil permeability values and soil textures. Point measurements can be used on a small-scale area, but it is impossible to acquire such information effectively in large-scale watersheds. This model exhibits the capacity to assimilate SAR images and relevant geoenvironmental parameters to measure soil moisture.

  1. A portable battery for objective, non-obtrusive measures of human performances

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.

    1984-01-01

    The need for a standardized battery of human performance tests to measure the effects of various treatments is pointed out, and progress in such a program is reported. Three batteries are available, differing in length and in the number of tests they contain. All tests are implemented on a portable, lap-held, briefcase-sized microprocessor. Performances measured include information processing, memory, visual perception, reasoning, and motor skills. Supporting programs determine norms, reliabilities, stabilities, the factor structure of the tests, comparisons with marker tests, and apparatus suitability. A rationale for the battery is provided.

  2. Retrieval of profile information from airborne multiaxis UV-visible skylight absorption measurements.

    PubMed

    Bruns, Marco; Buehler, Stefan A; Burrows, John P; Heue, Klaus-Peter; Platt, Ulrich; Pundt, Irene; Richter, Andreas; Rozanov, Alexej; Wagner, Thomas; Wang, Ping

    2004-08-01

    A recent development in ground-based remote sensing of atmospheric constituents by UV-visible absorption measurements of scattered light is the simultaneous use of several horizon viewing directions in addition to the traditional zenith-sky pointing. The different light paths through the atmosphere enable the vertical distribution of some atmospheric absorbers, such as NO2, BrO, or O3, to be retrieved. This approach has recently been implemented on an airborne platform. This novel instrument, the airborne multiaxis differential optical absorption spectrometer (AMAXDOAS), has been flown for the first time. In this study, the amount of profile information that can be retrieved from such measurements is investigated for the trace gas NO2. Sensitivity studies on synthetic data are performed for a variety of representative measurement conditions including two wavelengths, one in the UV and one in the visible, two different surface spectral reflectances, various lines of sight (LOSs), and for two different flight altitudes. The results demonstrate that the AMAXDOAS measurements contain useful profile information, mainly at flight altitude and below the aircraft. Depending on wavelength and LOS used, the vertical resolution of the retrieved profiles is as good as 2 km near flight altitude. Above 14 km the profile information content of AMAXDOAS measurements is sparse. Airborne multiaxis measurements are thus a promising tool for atmospheric studies in the troposphere and the upper troposphere and lower stratosphere region.

  3. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    NASA Astrophysics Data System (ADS)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaiced LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is essential for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. Primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble Realworks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. 
To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point cloud, we scan from multiple locations an object of known geometry (a cylinder mounted above a square box). Preliminary results show that even in a controlled experimental scan of an object of known dimensions, there is significant variability in the precision of the registered point cloud. For example, when 3 scans of the central object are registered using 4 known points (maximum time, maximum equipment), the point clouds align to within ~1 cm (normal to the object surface). However, when the same point clouds are registered with only 1 known point (minimum time, minimum equipment), misalignment of the point clouds can range from 2.5 to 5 cm, depending on target type. The greater misalignment of the 3 point clouds when registered with fewer known points stems from the field method employed in acquiring the dataset and demonstrates the impact of field workflow on LiDAR dataset precision. By quantifying the degree of scan mismatch in results such as this, we can provide users with the information needed to maximize efficiency in remote field surveys.
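
    Registration error of the kind quantified above (the same control points resolved in two registered scans) is commonly summarised as a per-point offset and its RMS. The coordinates below are invented for illustration and are not the experiment's actual data.

```python
import numpy as np

# Hypothetical coordinates (metres) of four control points as resolved in
# two registered scans; misalignment appears as small per-point offsets.
scan_a = np.array([[0.00, 0.00, 0.00],
                   [5.00, 0.00, 0.10],
                   [5.00, 5.00, 0.05],
                   [0.00, 5.00, 0.02]])
scan_b = scan_a + np.array([[ 0.004, -0.003,  0.008],
                            [ 0.011,  0.002, -0.006],
                            [-0.007,  0.009,  0.003],
                            [ 0.002, -0.012,  0.005]])

# Per-point registration error and its RMS, one simple precision metric
per_point = np.linalg.norm(scan_a - scan_b, axis=1)
rms = np.sqrt(np.mean(per_point ** 2))
```

    A metric like this, computed for each workflow variant (number of known points, target type, setup method), is one way to express how precision degrades as survey effort is reduced.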

  4. On the Determination of Uncertainty and Limit of Detection in Label-Free Biosensors.

    PubMed

    Lavín, Álvaro; Vicente, Jesús de; Holgado, Miguel; Laguna, María F; Casquel, Rafael; Santamaría, Beatriz; Maigler, María Victoria; Hernández, Ana L; Ramírez, Yolanda

    2018-06-26

    A significant number of noteworthy articles reviewing different label-free biosensors have been published in recent years. Most of the time, comparison among different biosensors is limited by the procedures used to calculate the limit of detection and the measurement uncertainty. This article clarifies and establishes a simple procedure to determine the calibration function and the uncertainty of the measured concentration at any point of the measuring interval of a generic label-free biosensor. The value of the limit of detection arises naturally from this model as the limit to which the uncertainty tends as the concentration tends to zero. The need to report additional information on the analytical system and biosensor beyond the detection limit, such as the measurement interval and its linearity, is pointed out. Finally, the model is applied to curves typically obtained in immunoassays, and the validity and limitations of the model are discussed.
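
    A minimal sketch of a calibration-and-LOD computation of the kind the article discusses, using an ordinary least-squares line and the common 3.3·σ/slope approximation. The calibration data are invented, and the article's own uncertainty model is more general than this shortcut.

```python
import numpy as np

# Hypothetical calibration data: sensor response vs. analyte concentration
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])     # e.g. ng/mL
resp = np.array([0.02, 0.13, 0.24, 0.55, 1.08, 2.11])  # sensor signal, a.u.

# Least-squares calibration line: resp ≈ intercept + slope * conc
slope, intercept = np.polyfit(conc, resp, 1)

# Residual standard deviation of the fit (a proxy for blank noise)
resid = resp - (intercept + slope * conc)
s_y = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))

# Common approximation: LOD is the concentration whose signal exceeds
# the blank by 3.3 standard deviations of the fit residuals
lod = 3.3 * s_y / slope
```

    The article's framing, in which the LOD is the limit of the concentration uncertainty as concentration tends to zero, reduces to a formula of this shape in the linear, homoscedastic case.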

  5. Feasibility analysis on integration of luminous environment measuring and design based on exposure curve calibration

    NASA Astrophysics Data System (ADS)

    Zou, Yuan; Shen, Tianxing

    2013-03-01

    Beyond illuminance calculation during architectural and luminous environment design, and in order to provide a wider variety of photometric data, this paper presents the combination of luminous environment design with the SM light environment measuring system, which comprises a set of experimental devices, including light-information collection and processing modules, and can supply various types of photometric data. During the research, we introduced a simulation method for calibration, which mainly involves rebuilding experimental scenes in 3ds Max Design, calibrating this computer-aided design software in the simulated environment under various typical light sources, and fitting the exposure curves of the rendered images. As the analysis proceeded, the operating sequence and points of attention for simulated calibration were established, and connections between the Mental Ray renderer and the SM light environment measuring system were made. The paper thus offers a valuable reference for coordinating luminous environment design with the SM light environment measuring system.

  6. A new approach for measuring the work and quality of histopathology reporting.

    PubMed

    Sharma, Vijay; Davey, Jonathan G N; Humphreys, Catherine; Johnston, Peter W

    2013-07-01

    Cancer datasets drive report quality, but require more work to inform compliant reports. The aim of this study was to correlate the number of words with measures of quality, to examine the impact of the drive for improved quality on the workload of histopathology reporting over time. We examined the first 10 reports of colon, breast, renal, lung and ovarian carcinoma, melanoma resection, nodal lymphoma appendicitis and seborrhoeic keratosis (SK) issued in 1991, 2001 and 2011. Correlations were analysed using Pearson's partial correlation coefficients. Word count increased significantly over time for most specimen types examined. Word count almost always correlated with units of information, indicating that the word count was a good measure of the amount of information contained within the reports; this correlation was preserved following correction for the effect of time. A good correlation with compliance with cancer datasets was also observed, but was weakened or lost following correction for the increase in word count and units of information that occurred between time points. These data indicate that word count could potentially be used as a measure of information content if its integrity and usefulness are continuously validated. Further prospective studies are required to assess and validate this approach. © 2013 John Wiley & Sons Ltd.
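
    Pearson partial correlation, as used in the study to correct for the effect of time, can be computed by correlating the residuals after regressing out the control variable. The toy report data below (year, word count, units of information) are invented for illustration.

```python
import numpy as np

def partial_corr(x, y, z):
    """Pearson correlation of x and y after regressing out control z."""
    z = np.column_stack([np.ones_like(z, dtype=float), z])
    rx = x - z @ np.linalg.lstsq(z, x, rcond=None)[0]
    ry = y - z @ np.linalg.lstsq(z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# Invented toy data: report year, word count, units of information
year  = np.array([1991, 1991, 2001, 2001, 2011, 2011], dtype=float)
words = np.array([ 80.,  95., 140., 150., 230., 260.])
units = np.array([ 10.,  12.,  18.,  19.,  30.,  34.])

r_partial = partial_corr(words, units, year)
```

    A positive partial correlation here would mean that word count tracks information content even after removing the shared upward trend over time, which is the sense in which the study argues word count can proxy for information content.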

  7. An inventory of bispectrum estimators for redshift space distortions

    NASA Astrophysics Data System (ADS)

    Regan, Donough

    2017-12-01

    In order to best improve constraints on cosmological parameters and on models of modified gravity using current and future galaxy surveys it is necessary to maximally exploit the available data. As redshift-space distortions mean that statistical translation invariance is broken for galaxy observations, this will require measurement of the monopole, quadrupole and hexadecapole of not just the galaxy power spectrum, but also the galaxy bispectrum. A recent (2015) paper by Scoccimarro demonstrated how the standard bispectrum estimator may be expressed in terms of Fast Fourier Transforms (FFTs) to afford an extremely efficient algorithm, allowing the bispectrum multipoles on all scales and triangle shapes to be measured in comparable time to those of the power spectrum. In this paper we present a suite of alternative proxies to measure the three-point correlation multipoles. In particular, we describe a modal (or plane wave) decomposition to capture the information in each multipole in a series of basis coefficients, and also describe three compressed estimators formed using the skew-spectrum, the line correlation function and the integrated bispectrum, respectively. As well as each of the estimators offering a different measurement channel, and thereby a robustness check, it is expected that some (especially the modal estimator) will offer a vast data compression, and so a much reduced covariance matrix. This compression may be vital to reduce the computational load involved in extracting the available three-point information.
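
    The FFT identity behind such estimators can be illustrated on a 1-D periodic field: the bispectrum at (k1, k2) is built from the product d(k1) d(k2) d*(k1+k2) of Fourier modes. This naive single-configuration sketch is far simpler than the binned, multipole-resolved Scoccimarro estimator, but it shows why a phase-coherent triad of modes gives a nonzero three-point signal while an incomplete triad does not. The field and normalization are invented for illustration.

```python
import numpy as np

def bispectrum_point(field, k1, k2):
    """Naive single-configuration bispectrum estimate B(k1, k2) for a
    1-D periodic field, via the FFT identity B ∝ d[k1] d[k2] d*[k1+k2].
    The 1/n^3 normalization is an arbitrary illustrative choice."""
    d = np.fft.fft(field)
    n = len(field)
    return (d[k1] * d[k2] * np.conj(d[k1 + k2])).real / n ** 3

n = 256
x = 2 * np.pi * np.arange(n) / n
# A phase-coherent triad (modes 3, 5 and 3+5=8) has a nonzero bispectrum;
# dropping the k=8 mode destroys the triangle closure and the estimate vanishes.
coupled = np.cos(3 * x) + np.cos(5 * x) + np.cos(8 * x)
uncoupled = np.cos(3 * x) + np.cos(5 * x)

b_coupled = bispectrum_point(coupled, 3, 5)
b_uncoupled = bispectrum_point(uncoupled, 3, 5)
```

    Production estimators average such products over shells of triangle configurations and subtract shot noise; the compressed proxies discussed in the paper (skew-spectrum, line correlation, integrated bispectrum) can be viewed as different weighted sums over these same triads.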

  8. Boron in Calcium Sulfate Vein at Catabola, Mars

    NASA Image and Video Library

    2016-12-13

    The highest concentration of boron measured on Mars, as of late 2016, is in this mineral vein, called "Catabola," examined with the Chemistry and Camera (ChemCam) instrument on NASA's Curiosity rover on Aug. 25, 2016, during Sol 1441 of the mission. This two-part illustration shows the context of the erosion-resistant, raised vein, in an image from Curiosity's Mast Camera (Mastcam), and a detailed inset image from ChemCam's remote micro-imager. The inset includes indicators of the boron content measured at 10 points along the vein that were analyzed with ChemCam's laser-firing spectrometer. The vein's main component is calcium sulfate. The highest boron content identified is less than one-tenth of one percent. The heights of the orange bars at each point indicate relative abundance of boron, compared with boron content at other points. The scale bar for the inset is 9.2 millimeters, or about 0.36 inch. The ChemCam image is enhanced with color information from Mastcam. http://photojournal.jpl.nasa.gov/catalog/PIA21251

  9. Risk management and measuring productivity with POAS--point of act system.

    PubMed

    Akiyama, Masanori; Kondo, Tatsuya

    2007-01-01

    The concept of our system is not only to manage material flows, but also to provide an integrated management resource, a means of correcting errors in medical treatment, and applications to EBM through the data mining of medical records. Prior to the development of this system, electronic processing systems in hospitals did a poor job of accurately grasping medical practice and medical material flows. With POAS (Point of Act System), hospital managers can solve the so-called "man, money, material, and information" issues inherent in the costs of healthcare. The POAS system synchronizes with each department system, from finance and accounting, to pharmacy, to imaging, and allows information exchange; it thereby provides complete management of man, material, money and information. Our analysis has shown that this system has a remarkable investment effect - saving over four million dollars per year - through cost savings in logistics and business process efficiencies. In addition, the quality of care has been improved dramatically while error rates have been reduced - nearly to zero in some cases.

  10. Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging

    NASA Astrophysics Data System (ADS)

    Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian

    The role that mechanical forces play in biological processes such as cell movement and death is becoming of significant interest to further develop our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity, which maps a sample's birefringence, or its direction-dependent refractive indices, using polarized light. However, this method only provides qualitative data, and for stress information to be useful quantitative data are required. Ptychography is a method for quantitatively determining the phase of a sample's complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids in the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University, Melbourne, on achieving quantitative birefringence mapping using polarized light ptychography with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.

  11. Water Triple-Point Comparisons: Plateau Averaging or Peak Value?

    NASA Astrophysics Data System (ADS)

    Steur, P. P. M.; Dematteis, R.

    2014-04-01

    With a certain regularity, national metrology institutes conduct comparisons of water triple-point (WTP) cells. The WTP is the most important fixed point for the International Temperature Scale of 1990 (ITS-90). In such comparisons, it is common practice to simply average all the single measured temperature points obtained on a single ice mantle. This practice is quite reasonable whenever the measurements show no time dependence in the results. Ever since the first Supplementary Information for the International Temperature Scale of 1990, published by the Bureau International des Poids et Mesures in Sèvres, it was strongly suggested to wait at least 1 day before taking measurements (now up to 10 days), in order for a newly created ice mantle to stabilize. This stabilization is accompanied by a change in temperature with time. A recent improvement in the sensitivity of resistance measurement enabled the Istituto Nazionale di Ricerca Metrologica to detect more clearly the (possible) change in temperature with time of the WTP on a single ice mantle, as for old borosilicate cells. A limited investigation was performed where the temperature of two cells was monitored day-by-day, from the moment of mantle creation, where it was found that with (old) borosilicate cells it may be counterproductive to wait the usual week before starting measurements. The results are presented and discussed, and it is suggested to adapt the standard procedure for comparisons of WTP cells allowing for a different data treatment with (old) borosilicate cells, because taking the temperature dependence into account will surely reduce the reported differences between cells.
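
    The difference between plateau averaging and a drift-aware treatment can be sketched numerically. The daily readings below are invented for illustration (a linear mK-per-day drift plus measurement noise, as might be seen on an old borosilicate cell); the point is only that averaging a drifting plateau biases the reported value, while fitting the time dependence separates the drift from the cell value.

```python
import numpy as np

# Hypothetical daily WTP readings (mK, arbitrary zero) on one ice mantle:
# a slow linear drift plus noise.  Values are assumed, for illustration.
days = np.arange(1, 11, dtype=float)
rng = np.random.default_rng(1)
true_drift = -0.02                      # mK per day (assumed)
readings = 0.10 + true_drift * days + rng.normal(0.0, 0.002, days.size)

# Common practice: average every point measured on the mantle.
plateau_mean = readings.mean()

# Drift-aware alternative: fit the time dependence and report the
# value extrapolated back to mantle creation (day 0).
slope, intercept = np.polyfit(days, readings, 1)
day0_value = intercept
```

With a negative drift, the plain average sits below the extrapolated day-0 value, so the two data treatments report measurably different cell temperatures.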

  12. LIDAR Metrology for Prescription Characterization and Alignment of Large Mirrors

    NASA Technical Reports Server (NTRS)

    Eegholm, B.; Eichhorn, W.; von Handorf, R.; Hayden, J.; Ohl, R.; Wenzel, G.

    2011-01-01

    We describe the use of LIDAR, or "laser radar," (LR) as a fast, accurate, and non-contact tool for the measurement of the radius of curvature (RoC) of large mirrors. We report the results of a demonstration of this concept using a commercial laser radar system. We measured the RoC of a 1.4m x 1m spherical mirror with a nominal RoC of 4.6 m with a manufacturing tolerance of 4600mm +/- 6mm. The prescription of the mirror is related to its role as ground support equipment used in the test of part of the James Webb Space Telescope (JWST). The RoC of such a large mirror is not easily measured without contacting the surface. From a position near the center of curvature of the mirror, the LIDAR scanned the mirror surface, sampling it with 1 point per 3.5 sq cm. The measurement consisted of 3983 points and lasted only a few minutes. The laser radar uses the LIDAR signal to provide range, and encoder information from angular azimuth and elevation rotation stages provide the spherical coordinates of a given point. A best-fit to a sphere of the measured points was performed. The resulting RoC was within 20 ppm of the nominal RoC, also showing good agreement with the results of a laser tracker-based, contact metrology. This paper also discusses parameters such as test alignment, scan density, and optical surface type, as well as future possible application for full prescription characterization of aspherical mirrors, including radius, conic, off-axis distance, and aperture.
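
    The best-fit-to-a-sphere step lends itself to a compact linear least-squares formulation. The sketch below is a generic algebraic sphere fit, not NASA's actual processing pipeline; it recovers the centre and radius from points sampled on a spherical cap like the mirror surface scanned here.

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit (center, radius) to an N x 3 cloud.

    Uses the algebraic form x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d,
    which is linear in (a, b, c, d); radius follows from the solution.
    """
    p = np.asarray(points, float)
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```

For a mirror of this aperture at a nominal RoC of 4600 mm, the scanned points cover only a shallow cap, yet the linear fit still recovers centre and radius directly, with no initial guess required.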

  13. An Innovative Metric to Evaluate Satellite Precipitation's Spatial Distribution

    NASA Astrophysics Data System (ADS)

    Liu, H.; Chu, W.; Gao, X.; Sorooshian, S.

    2011-12-01

    Thanks to its capability to cover the mountains, where ground measurement instruments cannot reach, satellites provide a good means of estimating precipitation over mountainous regions. In regions with complex terrain, accurate information on the high-resolution spatial distribution of precipitation is critical for many important issues, such as flood/landslide warning, reservoir operation, water system planning, etc. Therefore, in order to be useful in many practical applications, satellite precipitation products should possess high quality in characterizing spatial distribution. However, most existing validation metrics, which are based on point/grid comparison using simple statistics, cannot effectively measure a satellite's skill at capturing the spatial patterns of precipitation fields. This deficiency results from the fact that point/grid-wise comparison does not take into account the spatial coherence of precipitation fields. Furthermore, another weakness of many metrics is that they can barely provide information on why satellite products perform well or poorly. Motivated by our recent findings of the consistent spatial patterns of the precipitation field over the western U.S., we developed a new metric utilizing EOF analysis and Shannon entropy. The metric can be derived through two steps: 1) capture the dominant spatial patterns of precipitation fields from both satellite products and reference data through EOF analysis, and 2) compute the similarities between the corresponding dominant patterns using a mutual information measurement defined with Shannon entropy. Instead of individual points/grids, the new metric treats the entire precipitation field simultaneously, naturally taking advantage of spatial dependence. Since the dominant spatial patterns are shaped by physical processes, the new metric can shed light on why a satellite product can or cannot capture the spatial patterns. For demonstration, an experiment was carried out to evaluate a satellite precipitation product, CMORPH, against the U.S. daily precipitation analysis of the Climate Prediction Center (CPC) at a daily and 0.25° scale over the western U.S.
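
    The two-step metric described above can be prototyped in a few lines: EOFs from an SVD of the space-time anomaly matrix, and a histogram estimate of Shannon mutual information between corresponding patterns. Grid sizes, bin counts and the synthetic data below are assumptions for illustration only, not the study's configuration.

```python
import numpy as np

def leading_eofs(field, n_modes=3):
    """Leading EOFs (spatial patterns) of a (time, space) anomaly matrix via SVD."""
    anom = field - field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    return vt[:n_modes]                  # rows are spatial patterns

def mutual_information(x, y, bins=16):
    """Histogram estimate of Shannon mutual information between two patterns."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

A satellite pattern that matches the reference pattern yields high mutual information; an unrelated pattern yields a value near zero, giving a spatially coherent score rather than a point-wise one.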

  14. Removing respiratory artefacts from transthoracic bioimpedance spectroscopy measurements

    NASA Astrophysics Data System (ADS)

    Cuba-Gyllensten, I.; Abtahi, F.; Bonomi, A. G.; Lindecrantz, K.; Seoane, F.; Amft, O.

    2013-04-01

    Transthoracic impedance spectroscopy (TIS) measurements from wearable textile electrodes provide a tool to remotely and non-invasively monitor patient health. However, breathing and cardiac processes inevitably affect TIS measurements, since they are sensitive to changes in geometry and air or fluid volumes in the thorax. This study aimed at investigating the effect of respiration on Cole parameters extracted from TIS measurements and developing a method to suppress artifacts. TIS data were collected from 10 participants at 16 frequencies (range: 10 kHz - 1 MHz) using a textile electrode system (Philips Technologie GmbH). Simultaneously, breathing volumes and frequency were logged using an electronic spirometer augmented with data from a breathing belt. The effect of respiration on TIS measurements was studied at paced (10 and 16 bpm) deep and shallow breathing. These measurements were repeated for each subject in three different postures (lying down, reclining and sitting). Cole parameter estimation was improved by assessing the tidal expiration point, thus removing breathing artifacts. This leads to lower intra-subject variability between sessions and a need for fewer measurement points to accurately assess the spectra. Future work should explore algorithmic artifact compensation models using breathing and posture or patient contextual information to improve ambulatory transthoracic impedance measurements.
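
    Cole parameter extraction from spectra like these (16 frequencies, 10 kHz to 1 MHz) can be sketched by fitting the Cole impedance model Z(ω) = R∞ + (R0 − R∞)/(1 + (jωτ)^α). The grid-search fitter below is a minimal stand-in for whatever estimator the authors used; for each candidate (τ, α) the two resistances are solved linearly. All parameter values are illustrative.

```python
import numpy as np

def cole_model(freq, r0, rinf, tau, alpha):
    """Cole impedance Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)^alpha)."""
    w = 2j * np.pi * freq
    return rinf + (r0 - rinf) / (1.0 + (w * tau) ** alpha)

def fit_cole(freq, z, taus, alphas):
    """Grid search over (tau, alpha); R0 and Rinf solved linearly at each node."""
    y = np.concatenate([z.real, z.imag])
    best = (np.inf, None)
    for tau in taus:
        for alpha in alphas:
            h = 1.0 / (1.0 + (2j * np.pi * freq * tau) ** alpha)
            # Z = rinf + dr*h  ->  real part: rinf + dr*Re(h); imag: dr*Im(h)
            A = np.vstack([np.c_[np.ones(freq.size), h.real],
                           np.c_[np.zeros(freq.size), h.imag]])
            sol, *_ = np.linalg.lstsq(A, y, rcond=None)
            err = float(((A @ sol - y) ** 2).sum())
            if err < best[0]:
                best = (err, (sol[0] + sol[1], sol[0], tau, alpha))
    return best[1]          # (r0, rinf, tau, alpha)
```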

  15. Is Anyone Paying Attention to Physician Report Cards? The Impact of Increased Availability on Consumers' Awareness and Use of Physician Quality Information.

    PubMed

    Shi, Yunfeng; Scanlon, Dennis P; Bhandari, Neeraj; Christianson, Jon B

    2017-08-01

    To determine if the release of health care report cards focused on physician practice quality measures leads to changes in consumers' awareness and use of this information. Data from two rounds of a survey of the chronically ill adult population conducted in 14 regions across the United States, combined with longitudinal information from a public reporting tracking database. Both datasets were collected as part of the evaluation for Aligning Forces for Quality, a nationwide quality improvement initiative funded by the Robert Wood Johnson Foundation. Using a longitudinal design and an individual-level fixed effects modeling approach, we estimated the impact of community public reporting efforts, measured by the availability and applicability of physician quality reports, on consumers' awareness and use of physician quality information (PQI). The baseline level of awareness was 12.6 percent in our study sample, drawn from the general population of chronically ill adults. Among those who were not aware of PQI at the baseline, when PQI became available in their communities for the first time, along with quality measures that are applicable to their specific chronic conditions, the likelihood of PQI awareness increased by 3.8 percentage points. For the same group, we also find similar increases in the use of PQI linked to newly available physician report cards, although the magnitudes are smaller, between 2 and 3 percentage points. Specific contents of physician report cards can be an important factor in consumers' awareness and use of PQI. Policies to improve awareness and use of PQI may consider how to customize quality report cards and target specific groups of consumers in dissemination. © Health Research and Educational Trust.

  16. Sources of uncertainty as a basis to fill the information gap in a response to flood

    NASA Astrophysics Data System (ADS)

    Kekez, Toni; Knezic, Snjezana

    2016-04-01

    Taking into account uncertainties in flood risk management remains a challenge due to difficulties in choosing adequate structural and/or non-structural risk management options. Despite such measures, wrong decisions are often made when a flood occurs. Parameter and structural uncertainties, which include model and observation errors as well as lack of knowledge about system characteristics, are the main considerations. Real-time flood risk assessment methods are predominantly based on measured water level values and the vulnerability, as well as other relevant characteristics, of the flood-affected area. The goal of this research is to identify sources of uncertainty and to minimize the information gap between the point where the water level is measured and the affected area, taking into consideration the main uncertainties that can affect the risk value at the observed point or section of the river. Sources of uncertainty are identified and determined using a system analysis approach, and the relevant uncertainties are included in the risk assessment model. With such a methodological approach it is possible to increase the available response time with a more effective risk assessment that includes an uncertainty propagation model. The response phase could be better planned with adequate early warning systems, resulting in more time and lower costs to help affected areas and save human lives. Reliable and precise information is necessary to raise the emergency operability level in order to enhance the safety of citizens and reduce possible damage. The results of the EPISECC (EU funded FP7) project are used to validate potential benefits of this research in order to improve flood risk management and response methods. EPISECC aims at developing a concept of a common European Information Space for disaster response which, among other disasters, considers floods.

  17. Criterion Validity of Measures of Perceived Relative Harm of E-Cigarettes and Smokeless Tobacco Compared to Cigarettes

    PubMed Central

    Persoskie, Alexander; Nguyen, Anh B.; Kaufman, Annette R.; Tworek, Cindy

    2017-01-01

    Beliefs about the relative harmfulness of one product compared to another (perceived relative harm) are central to research and regulation concerning tobacco and nicotine-containing products, but techniques for measuring such beliefs vary widely. We compared the validity of direct and indirect measures of perceived harm of e-cigarettes and smokeless tobacco (SLT) compared to cigarettes. On direct measures, participants explicitly compare the harmfulness of each product. On indirect measures, participants rate the harmfulness of each product separately, and ratings are compared. The U.S. Health Information National Trends Survey (HINTS-FDA-2015; N=3738) included direct measures of perceived harm of e-cigarettes and SLT compared to cigarettes. Indirect measures were created by comparing ratings of harm from e-cigarettes, SLT, and cigarettes on 3-point scales. Logistic regressions tested validity by assessing whether direct and indirect measures were associated with criterion variables including: ever-trying e-cigarettes, ever-trying snus, and SLT use status. Compared to the indirect measures, the direct measures of harm were more consistently associated with criterion variables. On direct measures, 26% of adults rated e-cigarettes as less harmful than cigarettes, and 11% rated SLT as less harmful than cigarettes. Direct measures appear to provide valid information about individuals’ harm beliefs, which may be used to inform research and tobacco control policy. Further validation research is encouraged. PMID:28073035

  18. Accuracy Assessment of Underwater Photogrammetric Three Dimensional Modelling for Coral Reefs

    NASA Astrophysics Data System (ADS)

    Guo, T.; Capra, A.; Troyer, M.; Gruen, A.; Brooks, A. J.; Hench, J. L.; Schmitt, R. J.; Holbrook, S. J.; Dubbini, M.

    2016-06-01

    Recent advances in automation of photogrammetric 3D modelling software packages have stimulated interest in reconstructing highly accurate 3D object geometry in unconventional environments such as underwater utilizing simple and low-cost camera systems. The accuracy of underwater 3D modelling is affected by more parameters than in single media cases. This study is part of a larger project on 3D measurements of temporal change of coral cover in tropical waters. It compares the accuracies of 3D point clouds generated by using images acquired from a system camera mounted in an underwater housing and the popular GoPro cameras respectively. A precisely measured calibration frame was placed in the target scene in order to provide accurate control information and also quantify the errors of the modelling procedure. In addition, several objects (cinder blocks) with various shapes were arranged in the air and underwater and 3D point clouds were generated by automated image matching. These were further used to examine the relative accuracy of the point cloud generation by comparing the point clouds of the individual objects with the objects measured by the system camera in air (the best possible values). Given a working distance of about 1.5 m, the GoPro camera can achieve a relative accuracy of 1.3 mm in air and 2.0 mm in water. The system camera achieved an accuracy of 1.8 mm in water, which meets our requirements for coral measurement in this system.

  19. Type T reference function suitability for low temperature applications

    NASA Astrophysics Data System (ADS)

    Dowell, D.

    2013-09-01

    Type T thermocouples are commonly used in industrial measurement applications due to their accuracy relative to other thermocouple types, low cost, and the ready availability of measurement equipment. Type T thermocouples are very effective when used in differential measurements, as there is no cold junction compensation necessary for the connections to the measurement equipment. Type T's published accuracy specifications result in its frequent use in low temperature applications. An examination of over 328 samples from a number of manufacturers has been completed for this investigation. Samples were compared to a Standard Platinum Resistance Thermometer (SPRT) at the LN2 boiling point along with four other standardized measurement points using a characterized ice point reference, low-thermal EMF scanner and an 8.5 digit multimeter, and the data were compiled and analyzed. The test points were approximately -196 °C, -75 °C, 0 °C, +100 °C, and +200 °C. These data show an anomaly in the conformance to the reference functions where the reference functions meet at 0 °C. Additionally, in the temperature region between -100 °C and -200 °C, a positive offset of up to 5.4 °C exists between the reference function equations published in ASTM E230-06 for the nitrogen point and the measured response of the actual wire. This paper will examine the historical and technological reasons for this anomaly in both the ASTM and IEC reference functions. At the request of the author and the Proceedings Editor the above article has been replaced with a corrected version. The original PDF file supplied to AIP Publishing contained several figures with missing information/characters—caused by processes used to generate the PDF file. All figures were affected by this error. The article has been replaced and these figures now display correctly. The corrected article was published on 7 November 2013.

  20. Quantum origin of quantum jumps: Breaking of unitary symmetry induced by information transfer in the transition from quantum to classical

    NASA Astrophysics Data System (ADS)

    Zurek, Wojciech Hubert

    2007-11-01

    Measurements transfer information about a system to the apparatus and then, further on, to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide a framework for “wave-packet collapse,” designating terminal points of quantum jumps and defining the measured observable by specifying its eigenstates. In quantum Darwinism, they are the progenitors of multiple copies spread throughout the environment—the fittest quantum states that not only survive decoherence, but subvert the environment into carrying information about them—into becoming a witness.

  1. Comparison of specificity and information for fuzzy domains

    NASA Technical Reports Server (NTRS)

    Ramer, Arthur

    1992-01-01

    This paper demonstrates how an integrated theory can be built on the foundation of possibility theory. Information and uncertainty have been considered in the 'fuzzy' literature since 1982. Our point of departure is the model proposed by Klir for the discrete case. It was elaborated axiomatically by Ramer, who also introduced the continuous model. Specificity as a numerical function was considered mostly within Dempster-Shafer evidence theory. An explicit definition was given first by Yager, who also introduced it in the context of possibility theory. The axiomatic approach and the continuous model have been developed very recently by Ramer and Yager, who also establish a close analytical correspondence between specificity and information. In the literature to date, specificity and uncertainty are defined only for discrete finite domains, with a sole exception. Our presentation removes these limitations: we define specificity measures for arbitrary measurable domains.
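
    For the discrete case referred to above, a specificity measure in Yager's style can be written down directly. The sketch below uses the common form Sp(π) = Σ_j (π_(j) − π_(j+1))/j over the sorted possibility values, which amounts to integrating 1/|α-cut| over the α levels of a normal distribution. This is one standard formulation, not necessarily the exact measure developed by Ramer and Yager.

```python
import numpy as np

def specificity(pi):
    """Specificity of a finite (normal) possibility distribution.

    With values sorted pi_1 >= ... >= pi_n and pi_{n+1} = 0, computes
    sum_j (pi_j - pi_{j+1}) / j: 1 for a crisp singleton, 1/n for
    total ignorance (all memberships equal to 1).
    """
    p = np.sort(np.asarray(pi, float))[::-1]
    drops = p - np.append(p[1:], 0.0)          # level decrements per alpha-cut
    return float((drops / np.arange(1, p.size + 1)).sum())
```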

  2. An empirical model for the complex dielectric permittivity of soils as a function of water content

    NASA Technical Reports Server (NTRS)

    Wang, J. R.; Chmugge, T. J.

    1978-01-01

    Recent measurements of the dielectric properties of soils show that the variation of dielectric constant with moisture content depends on soil type. The observed dielectric constant increases only slowly with moisture content up to a transition point; beyond the transition it increases rapidly with moisture content. The moisture value of the transition region was found to be higher for high clay content soils than for sandy soils. Many mixing formulas were compared with, and were found incompatible with, the measured dielectric variations of soil-water mixtures. A simple empirical model was proposed to describe the dielectric behavior of the soil-water mixtures. The relationship between transition moisture and wilting point provides a means of estimating soil dielectric properties on the basis of texture information.
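
    The qualitative behaviour described (slow rise below a transition moisture, rapid rise above it) can be captured by a toy two-regime model. The coefficients below are invented for illustration only; they are not the published empirical values.

```python
import numpy as np

def dielectric_constant(wc, wt, eps_dry=3.0, slope_low=5.0, slope_high=30.0):
    """Toy two-regime dielectric model.

    Below the transition moisture wt the constant rises slowly
    (slope_low); above wt an additional steep term (slope_high) takes
    over.  Continuous at wt by construction.  Illustrative values only.
    """
    wc = np.asarray(wc, float)
    low = eps_dry + slope_low * np.minimum(wc, wt)        # slow regime
    high = slope_high * np.clip(wc - wt, 0.0, None)       # rapid regime
    return low + high
```

For a sandy soil the transition moisture wt would be set lower than for a clay-rich soil, reproducing the texture dependence noted in the abstract.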

  3. Comparability of item quality indices from sparse data matrices with random and non-random missing data patterns.

    PubMed

    Wolfe, Edward W; McGill, Michael T

    2011-01-01

    This article summarizes a simulation study of the performance of five item quality indicators (the weighted and unweighted versions of the mean square and standardized mean square fit indices and the point-measure correlation) under conditions of relatively high and low amounts of missing data under both random and conditional patterns of missing data for testing contexts such as those encountered in operational administrations of a computerized adaptive certification or licensure examination. The results suggest that weighted fit indices, particularly the standardized mean square index, and the point-measure correlation provide the most consistent information between random and conditional missing data patterns and that these indices perform more comparably for items near the passing score than for items with extreme difficulty values.
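
    The indices compared in this study can be sketched for a single dichotomous item under a Rasch model: the unweighted (outfit) and information-weighted (infit) mean squares of the residuals, and the point-measure correlation. This is a textbook-style formulation, a sketch of the quantities rather than the study's exact computation.

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model probability of a correct response (ability theta, difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def item_fit(x, theta, b):
    """Fit indicators for one dichotomous item scored in x (0/1)."""
    p = rasch_prob(theta, b)
    w = p * (1.0 - p)                              # binomial variance per response
    resid = x - p
    outfit = float(np.mean(resid ** 2 / w))        # unweighted mean square
    infit = float((resid ** 2).sum() / w.sum())    # information-weighted mean square
    ptmeas = float(np.corrcoef(x, theta)[0, 1])    # point-measure correlation
    return outfit, infit, ptmeas
```

For data generated from the model itself, both mean squares have expectation 1, and the point-measure correlation is positive for any informative item.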

  4. Implementation and Initial Testing of Advanced Processing and Analysis Algorithms for Correlated Neutron Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea

    In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.
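
    Given a measured multiplicity histogram, the moments in question (doubles through pents) are reduced factorial moments and are straightforward to compute. The sketch below shows only that computation; it does not attempt the dead-time-corrected pulse-train processing the report describes.

```python
import numpy as np
from math import comb

def factorial_moments(pmf, order=5):
    """Reduced factorial moments sum_n C(n, k) P(n), k = 1..order.

    For a neutron multiplicity distribution P(n), k = 2..5 correspond
    to the doubles, triples, quads and pents discussed above.
    """
    p = np.asarray(pmf, float)
    p = p / p.sum()                      # normalise the histogram
    return [float(sum(comb(n, k) * p[n] for n in range(p.size)))
            for k in range(1, order + 1)]
```

A convenient sanity check: for a Poisson multiplicity distribution with mean λ, the k-th reduced factorial moment is λ^k / k!.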

  5. Imaging laser radar for high-speed monitoring of the environment

    NASA Astrophysics Data System (ADS)

    Froehlich, Christoph; Mettenleiter, M.; Haertl, F.

    1998-01-01

    In order to establish mobile robot operations and to realize survey and inspection tasks, robust and precise measurement of the geometry of the 3D environment is the basic sensor technology. For visual inspection, surface classification, and documentation purposes, however, additional information concerning the reflectance of measured objects is necessary. High-speed acquisition of both geometric and visual information is achieved by means of an active laser radar, supporting consistent range and reflectance images. The laser radar developed at Zoller + Froehlich (ZF) is an optical-wavelength system measuring the range between sensor and target surface as well as the reflectance of the target surface, which corresponds to the magnitude of the backscattered laser energy. In contrast to other range sensing devices, the ZF system is designed for high-speed and high-performance operation in real indoor and outdoor environments, emitting a minimum of near-IR laser energy. It integrates a single-point laser measurement system and a mechanical deflection system for 3D environmental measurements. This paper reports details of the laser radar, which is designed to cover requirements of medium-range applications. It outlines the performance requirements and introduces the two-frequency phase-shift measurement principle. The hardware design of the single-point laser measurement system, including the main modules, such as the laser head, the high frequency unit and the signal processing unit, is discussed in detail. The paper focuses on performance data of the laser radar, including noise, drift over time, precision, and accuracy of measurements. It discusses the influences of ambient light, surface material of the target, and ambient temperature on range accuracy and range precision. Furthermore, experimental results from inspection of tunnels, buildings, monuments and industrial environments are presented. The paper concludes by summarizing results and gives a short outlook on future work.
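
    The two-frequency phase-shift principle mentioned above can be illustrated concretely: a low modulation frequency yields a coarse but unambiguous range, a high frequency yields a fine but ambiguous one, and the coarse value selects the correct ambiguity interval for the fine value. The frequencies below are assumed values for illustration, not ZF's actual modulation scheme.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def phase_for_range(r, f):
    """Phase (mod 2*pi) of the returned modulation for a target at range r."""
    return (4 * np.pi * f * r / C) % (2 * np.pi)

def range_two_freq(phi_coarse, phi_fine, f_coarse, f_fine):
    """Resolve the fine measurement's ambiguity with the coarse one."""
    amb = C / (2 * f_fine)                        # fine ambiguity interval
    r_coarse = C * phi_coarse / (4 * np.pi * f_coarse)
    r_fine = C * phi_fine / (4 * np.pi * f_fine)
    k = np.round((r_coarse - r_fine) / amb)       # whole ambiguity intervals
    return r_fine + k * amb
```

The fine frequency sets the range resolution, while the coarse frequency sets the maximum unambiguous range, which is why two (or more) modulation frequencies are combined.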

  6. Objective evaluation of female feet and leg joint conformation at time of selection and post first parity in swine.

    PubMed

    Stock, J D; Calderón Díaz, J A; Rothschild, M F; Mote, B E; Stalder, K J

    2018-06-09

    Feet and legs of replacement females were objectively evaluated at selection, i.e. approximately 150 days of age (n=319), and post first parity, i.e. any time after weaning of the first litter and before the 2nd parturition (n=277), to 1) compare feet and leg joint angle ranges between selection and post first parity; 2) identify feet and leg joint angle differences between selection and the first three weeks of the second gestation; 3) identify feet and leg joint angle differences between farms and gestation days during the second gestation; and 4) obtain genetic variance components for conformation angles for the two time points measured. Angles for the carpal joint (knee), metacarpophalangeal joint (front pastern), metatarsophalangeal joint (rear pastern), tarsal joint (hock), and rear stance were measured using image analysis software. Between selection and post first parity, significant differences were observed for all joints measured (P < 0.05). Knee, front and rear pastern angles were less (more flexion), and hock angles were greater (less flexion), as age progressed (P < 0.05), while the rear stance pattern was less (feet further under center) at selection than post first parity (only including measures during the first three weeks of the second gestation). Using only post first parity leg conformation information, farm was a significant source of variation for front and rear pastern and rear stance angle measurements (P < 0.05). Knee angle was less (more flexion) (P < 0.05) as gestation age progressed. Heritability estimates were low to moderate (0.04 - 0.35) for all traits measured across time points. Genetic correlations between the same joints at different time points were high (> 0.8) between the front leg joints and low (< 0.2) between the rear leg joints. High genetic correlations between time points indicate that the trait can be considered the same at either time point, and low genetic correlations indicate that the trait at different time points should be considered as two separate traits. Minimal change in the front leg suggests conformation traits that remain stable between selection and post first parity, while larger changes in the rear leg indicate that rear leg conformation traits should be evaluated at multiple time periods.

  7. A Low-Cost Approach to Automatically Obtain Accurate 3D Models of Woody Crops.

    PubMed

    Bengochea-Guevara, José M; Andújar, Dionisio; Sanchez-Sardana, Francisco L; Cantuña, Karla; Ribeiro, Angela

    2017-12-24

    Crop monitoring is an essential practice within the field of precision agriculture since it is based on observing, measuring and properly responding to inter- and intra-field variability. In particular, "on ground crop inspection" potentially allows early detection of certain crop problems or precision treatment to be carried out simultaneously with pest detection. "On ground monitoring" is also of great interest for woody crops. This paper explores the development of a low-cost crop monitoring system that can automatically create accurate 3D models (clouds of coloured points) of woody crop rows. The system consists of a mobile platform that allows the easy acquisition of information in the field at an average speed of 3 km/h. The platform, among others, integrates an RGB-D sensor that provides RGB information as well as an array with the distances to the objects closest to the sensor. The RGB-D information plus the geographical positions of relevant points, such as the starting and the ending points of the row, allow the generation of a 3D reconstruction of a woody crop row in which all the points of the cloud have a geographical location as well as the RGB colour values. The proposed approach for the automatic 3D reconstruction is not limited by the size of the sampled space and includes a method for the removal of the drift that appears in the reconstruction of large crop rows.
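
    The core step of turning RGB-D frames into a coloured point cloud is a pinhole back-projection of the depth image. The sketch below assumes given camera intrinsics (fx, fy, cx, cy are hypothetical values); the registration, georeferencing and drift removal described in the paper are separate steps on top of this.

```python
import numpy as np

def depth_to_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth image (metres) to an N x 6 coloured cloud.

    Returns rows [X, Y, Z, R, G, B] for every pixel with valid depth,
    using the pinhole model X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    cols = rgb.reshape(-1, 3)
    valid = pts[:, 2] > 0                 # drop pixels with no depth return
    return np.hstack([pts[valid], cols[valid]])
```

Concatenating the clouds of successive frames, each transformed by the platform's estimated pose, yields the row-scale reconstruction; the colour channels carry straight through from the RGB image.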

  8. A Low-Cost Approach to Automatically Obtain Accurate 3D Models of Woody Crops

    PubMed Central

    Andújar, Dionisio; Sanchez-Sardana, Francisco L.; Cantuña, Karla

    2017-01-01

    Crop monitoring is an essential practice within the field of precision agriculture since it is based on observing, measuring and properly responding to inter- and intra-field variability. In particular, “on ground crop inspection” potentially allows early detection of certain crop problems or precision treatment to be carried out simultaneously with pest detection. “On ground monitoring” is also of great interest for woody crops. This paper explores the development of a low-cost crop monitoring system that can automatically create accurate 3D models (clouds of coloured points) of woody crop rows. The system consists of a mobile platform that allows the easy acquisition of information in the field at an average speed of 3 km/h. The platform, among others, integrates an RGB-D sensor that provides RGB information as well as an array with the distances to the objects closest to the sensor. The RGB-D information plus the geographical positions of relevant points, such as the starting and the ending points of the row, allow the generation of a 3D reconstruction of a woody crop row in which all the points of the cloud have a geographical location as well as the RGB colour values. The proposed approach for the automatic 3D reconstruction is not limited by the size of the sampled space and includes a method for the removal of the drift that appears in the reconstruction of large crop rows. PMID:29295536

  9. Topography reconstruction of specular surfaces

    NASA Astrophysics Data System (ADS)

    Kammel, Soren; Horbach, Jan

    2005-01-01

    Specular surfaces are used in a wide variety of industrial and consumer products like varnished or chrome plated parts of car bodies, dies, molds or optical components. Shape deviations of these products usually reduce their quality regarding visual appearance and/or technical performance. One reliable method to inspect such surfaces is deflectometry. It can be employed to obtain highly accurate values representing the local curvature of the surfaces. In a deflectometric measuring system, a series of illumination patterns is reflected at the specular surface and is observed by a camera. The distortions of the patterns in the acquired images contain information about the shape of the surface. This information is suited for the detection and measurement of surface defects like bumps, dents and waviness with depths in the range of a few microns. However, without additional information about the distances between the camera and each observed surface point, a shape reconstruction is only possible in some special cases. Therefore, the reconstruction approach described in this paper uses data observed from at least two different camera positions. The data obtained is used separately to estimate the local surface curvature for each camera position. From the curvature values, the epipolar geometry for the different camera positions is recovered. Matching the curvature values along the epipolar lines yields an estimate of the 3d position of the corresponding surface points. With this additional information, the deflectometric gradient data can be integrated to represent the surface topography.
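Once the stereo matching fixes the 3D positions (and hence the integration constants), turning the deflectometric gradient data into a topography is an integration problem. Below is a minimal sketch of naive path integration of a gradient field; practical deflectometry uses least-squares or Fourier-domain integration to suppress noise, and this is not the authors' specific method:

```python
def integrate_gradients(p, q, dx=1.0):
    """Recover a height map z from gradient fields p = dz/dx, q = dz/dy
    by simple path integration: the first row is integrated along p,
    then each column is integrated along q.  Minimal illustration."""
    rows, cols = len(q), len(q[0])
    z = [[0.0] * cols for _ in range(rows)]
    for j in range(1, cols):              # integrate along the first row
        z[0][j] = z[0][j - 1] + p[0][j - 1] * dx
    for i in range(1, rows):              # integrate down each column
        for j in range(cols):
            z[i][j] = z[i - 1][j] + q[i - 1][j] * dx
    return z
```

For the plane z = x + 2y (p ≡ 1, q ≡ 2) the sketch reproduces the exact heights.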

  10. Long-term observations of aerosol and cloud condensation nuclei concentrations in Barbados

    NASA Astrophysics Data System (ADS)

    Pöhlker, Mira L.; Klimach, Thomas; Krüger, Ovid O.; Hrabe de Angelis, Isabella; Ditas, Florian; Praß, Maria; Holanda, Bruna; Su, Hang; Weber, Bettina; Pöhlker, Christopher; Farrell, David A.; Stevens, Bjorn; Prospero, Joseph M.; Andreae, Meinrat O.; Pöschl, Ulrich

    2017-04-01

Long-term observation of atmospheric aerosol and cloud condensation nuclei (CCN) concentrations has been conducted at the Ragged Point site in Barbados since August 2016. Ragged Point is a well-established station to monitor the transatlantic transport of Saharan dust outbreaks [1]. In the absence of dust plumes, it represents an ideal site to analyze the maritime boundary layer aerosol that is transported with the trade winds over the Atlantic towards Barbados [2,3]. Broad aerosol size distribution measurements (10 nm to 10 µm) as well as size-resolved CCN measurements at 10 different supersaturations from 0.05 % to 0.84 % have been conducted. The continuous online analyses are supplemented by intensive sampling periods to probe specific aerosol properties with various offline techniques (i.e., microscopy and spectroscopy). Key aerosol properties from our measurements are compared with the continuous and in-depth observation of cloud properties at Deebles Point, which is in the close vicinity of the Ragged Point site [2]. Moreover, our activities have been synchronized with the HALO-NARVAL-2 aircraft campaign in August 2016, which added further detailed information on shallow cumulus clouds, which are characteristic of the Atlantic trade winds and represent a crucial factor in the Earth's climate system. Our measurements have the following two focal points: (i) We aim to obtain a detailed CCN climatology for the alternation of maritime and dust-impacted episodes at this unique coastal location. This study will complement our recent in-depth analysis of the long-term CCN variability at a remote rain forest location [4]. (ii) Furthermore, we aim to collect detailed information on the role of different aerosol populations in the properties of the climatically important shallow cumulus clouds. References: [1] Prospero, J. M., Collard, F. X., Molinie, J., Jeannot, A. (2014), Global Biogeochemical Cycles, 28, 757-773. [2] Stevens, B., et al. (2016), Bulletin of the American Meteorological Society, 97, 787-801. [3] Wex, H., et al. (2016), Atmos. Chem. Phys., 16, 14107-14130. [4] Pöhlker, M. L., et al. (2016), Atmos. Chem. Phys., 16, 15709-15740.

  11. Design and development of LED-based irregular leather area measuring machine

    NASA Astrophysics Data System (ADS)

    Adil, Rehan; Khan, Sarah Jamal

    2012-01-01

Using an optical sensor array, a precision motion control system in a conveyer follows an irregularly shaped leather sheet to measure its surface area. In operation, the irregularly shaped leather sheet passes along the conveyer belt and the optical sensor array detects the sheet's edge. In this way the outside curvature of the leather sheet is detected and then fed to the controller to estimate its area. Such a system can measure irregular shapes, neglecting rounded corners, ellipses, etc. To minimize the error in the calculated surface area of an irregular curve, the motion control system only requires the footprint of each optical sensor to be small and the distance between sensors to be minimized. In the proposed technique, the surface area of the irregularly shaped leather sheet is measured by defining the velocity and detecting the position of the move. The motion controller takes this information and creates the necessary edge profile on a point-to-point basis. As a result, the irregular shape of the leather sheet is mapped and then fed to the controller to calculate the surface area.
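The area computation the abstract describes — accumulating the covered sensor width over the distance the belt travels per sample — can be sketched as follows (function name, units and parameters are illustrative):

```python
def sheet_area(scan_rows, sensor_pitch_m, belt_speed_mps, sample_dt_s):
    """Approximate the area of an irregular sheet from an optical
    sensor array.  Each scan row is a list of booleans (sensor covered
    or not); area = covered width x distance travelled per sample."""
    strip_length = belt_speed_mps * sample_dt_s   # metres per scan
    area = 0.0
    for row in scan_rows:
        covered_width = sum(row) * sensor_pitch_m
        area += covered_width * strip_length
    return area
```

Error shrinks with a finer sensor pitch and a faster sampling rate, which is exactly the requirement stated in the abstract.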

  12. Characteristics of Hydrogen Monitoring Systems for Severe Accident Management at a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Petrosyan, V. G.; Yeghoyan, E. A.; Grigoryan, A. D.; Petrosyan, A. P.; Movsisyan, M. R.

    2018-02-01

One of the main objectives of severe accident management at a nuclear power plant is to protect the integrity of the containment, for which the most serious threat is possible ignition of the generated hydrogen. A monitoring system should therefore provide information support to NPP personnel, supplying data on the current state of the containment gaseous environment and trends in its composition. Issues in defining the requisite characteristics of such monitoring systems are considered using the example of a particular power unit. The major characteristics important for proper information support are discussed. Some features of the progression of severe accident scenarios at the considered power unit are described, and the possible influence of the performance of the hydrogen concentration monitoring system on the reliability of information support in a severe accident is analyzed. The analysis results show that the following technical characteristics of combustible gas monitoring systems are important for proper information support of NPP personnel in the event of a severe accident: measured parameters, measuring ranges and errors, update rate, minimum detectable concentration of combustible gas, monitoring reference points, and environmental qualification parameters of the system components. For NPP power units with WWER-440/270 (230) type reactors, which have a relatively small containment volume, the update period for measurement results is a critical characteristic of the containment combustible gas monitoring system, and the choice of monitoring reference points should be focused not so much on identifying places of possible hydrogen pockets as on identifying places of possible combustible mixture formation. For these power units it may be necessary to include in the emergency operating procedures measures aimed at a timely reduction of heat removal from the containment environment when there are signs of an approaching severe accident phase, in order to prevent the formation of a combustible mixture in the containment.

  13. Distribution of basic sediments (bedload transport) on changes in coastal coastline Donggala, Central Sulawesi Province, Indonesia

    NASA Astrophysics Data System (ADS)

    Amiruddin

    2018-03-01

This study, entitled "Distribution of Bedload Transport Against Coastline Changes in Donggala Coast", addresses three questions: (1) how large is the estimated bedload transport in Donggala waters; (2) where are the locations of strong erosion and sedimentation based on the estimated bedload transport; (3) to what extent can shoreline change be predicted from the rate of sediment transport in the Donggala coastal area. Accordingly, the study aims to: (1) calculate the estimated bedload transport in Donggala waters; (2) determine the locations of strong erosion and sedimentation on the basis of the estimated bedload transport; (3) predict shoreline change from the rate of sediment transport in the Donggala coastal area. The survey method used to collect primary data included: (1) recording the waypoint coordinates of each measurement location; (2) measuring the height, period and direction of the waves; (3) measuring the magnitude of sediment transport; (4) measuring the shoreline angle and the angle and direction of wave attack; secondary data included: (1) information from the public; (2) data on the physical condition of the field. The results showed that: (1) in general, the estimated bedload transport varies between data-collection locations, owing to differences in shoreline orientation and in the angle of attack of breaking waves in Donggala waters; (2) strong abrasion at the study site occurs at points Ts4 (622.75 m3/yr) and Ts11 (755.25 m3/yr), located in the village of Tosale, and at points Tw7 and Tw17 (649.25 m3/yr) in the village of Towale, while strong sedimentation occurs at point Ts3 (450.50 m3/yr), located in the village of Tosale, and at point Tg3 (357.75 m3/yr), located in the village of Tolonggano; (3) the predicted shoreline changes, based on the estimated sediment transport and the beach and wave parameters, show that the shoreline profile at the research location tends to move landward, i.e. to undergo abrasion.
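The bedload estimates above are derived from measured wave height, period and angle of attack. A common first-order model with this dependence is the CERC-type relation Q ∝ H_b^{5/2} sin(2α_b); the sketch below uses a purely illustrative coefficient and is not the formula used in this study:

```python
import math

def longshore_transport(h_break_m, alpha_break_deg, k=0.2):
    """Hypothetical first-order longshore sediment transport estimate
    with the CERC-type dependence Q ~ K * H_b**2.5 * sin(2*alpha_b).
    K lumps density, porosity and breaking constants; the default
    value here is purely illustrative."""
    return k * h_break_m ** 2.5 * math.sin(2 * math.radians(alpha_break_deg))
```

The model captures the qualitative behaviour in the abstract: transport vanishes for waves arriving shore-normal (α = 0) and grows strongly with breaking wave height.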

  14. Philosophy and Sociology of Science Evolution and History

    NASA Astrophysics Data System (ADS)

    Rosen, Joe

    The following sections are included: * Concrete Versus Abstract Theoretical Models * Introduction: concrete and abstract in kepler's contribution * Einstein's theory of gravitation and mach's principle * Unitary symmetry and the structure of hadrons * Conclusion * Dedication * Symmetry, Entropy and Complexity * Introduction * Symmetry Implies Abstraction and Loss of Information * Broken Symmetries - Imposed or Spontaneous * Symmetry, Order and Information * References * Cosmological Surrealism: More Than "Eternal Reality" Is Needed * Pythagoreanism in atomic, nuclear and particle physics * Introduction: Pythagoreanism as part of the Greek scientific world view — and the three questions I will tackle * Point 1: the impact of Gersonides and Crescas, two scientific anti-Aristotelian rebels * Point 2: Kepler's spheres to Bohr's orbits — Pythagoreanisms at last! * Point 3: Aristotle to Maupertuis, Emmy Noether, Schwinger * References * Paradigm Completion For Generalized Evolutionary Theory With Application To Epistemology * Evolution Fully Generalized * Entropy: Gravity as Model * Evolution and Entropy: Measures of Complexity * Extinctions and a Balanced Evolutionary Paradigm * The Evolution of Human Society - the Age of Information as example * High-Energy Physics and the World Wide Web * Twentieth Century Epistemology has Strong (de facto) Evolutionary Elements * The discoveries towards the beginning of the XXth Century * Summary and Conclusions * References * Evolutionary Epistemology and Invalidation * Introduction * Extinctions and A New Evolutionary Paradigm * Evolutionary Epistemology - Active Mutations * Evolutionary Epistemology: Invalidation as An Extinction * References

  15. Three-dimensional reconstruction of indoor whole elements based on mobile LiDAR point cloud data

    NASA Astrophysics Data System (ADS)

    Gong, Yuejian; Mao, Wenbo; Bi, Jiantao; Ji, Wei; He, Zhanjun

    2014-11-01

Ground-based LiDAR is one of the most effective city modeling tools at present and has been widely used for three-dimensional reconstruction of outdoor objects. For indoor objects, however, there are technical bottlenecks due to the lack of a GPS signal. In this paper, based on high-precision indoor point cloud data obtained with LiDAR by an advanced indoor mobile measuring system, high-precision models were built for all indoor ancillary facilities. The point cloud data we employed also carry a colour attribute, extracted by fusion with CCD images; each point thus has both spatial geometric features and spectral information, which can be used to construct object surfaces and to restore the colour and texture of the geometric model. Three-dimensional reconstruction of all indoor elements was realized on the Autodesk CAD platform with the help of the PointSense plug-in. Specifically, Pointools Edit Pro was adopted to edit the point cloud, and different types of indoor point cloud data were processed, including data format conversion, outline extraction and texture mapping of the point cloud model. Finally, three-dimensional visualization of the real-world indoor scene was completed. Experimental results showed that high-precision 3D point cloud data obtained by the indoor mobile measuring equipment can be used for 3D reconstruction of all indoor elements, and that the methods proposed in this paper realize this reconstruction efficiently. Moreover, the modeling precision could be kept within 5 cm, which is a satisfactory result.
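The colour-fusion step — attaching CCD colour to LiDAR points — can be illustrated with a pinhole-camera projection. This is a generic sketch under assumed intrinsics (fx, fy, cx, cy), not the registration pipeline of the paper's equipment:

```python
def colorize_points(points_xyz, image, fx, fy, cx, cy):
    """Attach RGB from a CCD image to 3D points via a pinhole camera
    model.  Points are assumed to be in the camera frame; points
    behind the camera or projecting outside the image are skipped."""
    colored = []
    h, w = len(image), len(image[0])
    for x, y, z in points_xyz:
        if z <= 0:
            continue                       # behind the camera
        u = int(fx * x / z + cx)           # pixel column
        v = int(fy * y / z + cy)           # pixel row
        if 0 <= u < w and 0 <= v < h:
            colored.append((x, y, z, image[v][u]))
    return colored
```

Each surviving point then carries both geometry and an RGB value, the combination the abstract uses for texture mapping.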

  16. Analysis of Deformations of the Skylight Construction at the Main Hall of the Warsaw University of Technology

    NASA Astrophysics Data System (ADS)

    Odziemczyk, Waldemar

    2015-02-01

    The paper presents technology and results of measurements of the steel construction of the skylight of the Main Hall of the Warsaw University of Technology. The new version of the automated measuring system has been used for measurements. This system is based on Leica TCRP1201+ total station and the TCcalc1200 software application, developed by the author, which operates on a laptop computer connected with the total station by the wire. Two test measurements were performed. Each of them consisted of cyclic measurement using the polar method, from one station; points located on the skylight construction, as well as control points located on concrete, bearing poles, were successively measured. Besides geometrical values (such as Hz, V angles and the slope distance D), the changes of temperature and atmospheric pressure, were also recorded. Processed results of measurements contained information concerning the behaviour of the skylight; asymmetry of horizontal displacements with respect to the X axis have been proved. Changes of parameters of the instrument telescope and changes of the instrument orientation were also stated; they were connected with changes of the temperature. The most important results of works have been presented in the form of diagrams.
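The polar method reduces each observation (horizontal direction Hz, zenith angle V, slope distance D) to local coordinates. A sketch of that conversion, assuming V is measured from the zenith as on Leica instruments, with atmospheric and calibration corrections omitted:

```python
import math

def polar_to_local(hz_deg, v_deg, slope_dist_m):
    """Convert a total-station polar observation (horizontal angle Hz,
    zenith angle V, slope distance D) to local x, y, z coordinates
    relative to the station.  Minimal illustration."""
    hz = math.radians(hz_deg)
    v = math.radians(v_deg)
    horiz = slope_dist_m * math.sin(v)    # horizontal distance
    x = horiz * math.sin(hz)              # east-ish component
    y = horiz * math.cos(hz)              # north-ish component
    z = slope_dist_m * math.cos(v)        # height difference
    return x, y, z
```

Displacements between measurement cycles are then differences of these coordinates for each monitored point.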

  17. Study of DNA binding sites using the Rényi parametric entropy measure.

    PubMed

    Krishnamachari, A; moy Mandal, Vijnan; Karmeshu

    2004-04-07

Shannon's definition of uncertainty, or surprisal, has been applied extensively to measure the information content of aligned DNA sequences and to characterize DNA binding sites. In contrast to Shannon's uncertainty, this study investigates the applicability and suitability of a parametric uncertainty measure due to Rényi. This measure is observed to provide results in agreement with Shannon's measure, pointing to its utility in analysing DNA binding site regions. To facilitate comparison between these uncertainty measures, a dimensionless quantity called "redundancy" has been employed. It is found that Rényi's measure at low parameter values possesses a better delineating feature of binding sites (of binding regions) than Shannon's measure. The critical value of the parameter is chosen with an outlier criterion.
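The two uncertainty measures compared in the study, and the dimensionless redundancy used to compare them, can be computed per alignment column as follows. This is a generic sketch; the paper's specific parameter choice and outlier criterion are not reproduced:

```python
import math
from collections import Counter

def shannon(column):
    """Shannon entropy (bits) of one alignment column, e.g. 'AACT'."""
    n = len(column)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(column).values())

def renyi(column, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1);
    it reduces to the Shannon entropy in the limit alpha -> 1."""
    n = len(column)
    s = sum((c / n) ** alpha for c in Counter(column).values())
    return math.log2(s) / (1 - alpha)

def redundancy(entropy_bits, alphabet_size=4):
    """Dimensionless redundancy: 1 - H / H_max, H_max = log2(alphabet)."""
    return 1.0 - entropy_bits / math.log2(alphabet_size)
```

A fully conserved column has redundancy 1; a uniformly random column has redundancy 0 under either measure.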

  18. Proof of concept demonstration for coherent beam pattern measurements of KID detectors

    NASA Astrophysics Data System (ADS)

    Davis, Kristina K.; Baryshev, Andrey M.; Jellema, Willem; Yates, Stephen J. C.; Ferrari, Lorenza; Baselmans, Jochem J. A.

    2016-07-01

Here we summarize the initial results from a complex-field radiation pattern measurement of a kinetic inductance detector instrument. These detectors are phase insensitive and have thus been limited to scalar, or amplitude-only, beam measurements. Vector beam scans, of both amplitude and phase, double the information received in comparison to scalar beam scans. Scalar beam measurements require multiple scans at varying distances along the optical path of the receiver to fully constrain the divergence angle of the optical system and locate the primary focus. Vector scans provide this information with a single scan, reducing the total measurement time required for new systems and also limiting the influence of system instabilities. The vector scan can be taken at any point along the optical axis of the system, including the near field, which makes beam measurements possible for large systems at high frequencies where in-situ testing may be infeasible. Therefore, the methodology presented here should enable common heterodyne analysis for direct detector instruments. In principle, this coherent measurement strategy allows phase-dependent analysis to be performed on any direct-detect receiver instrument.

  19. Gender Bias in Initial Perceptions and Subsequent Hiring Decisions.

    ERIC Educational Resources Information Center

    Morris, Scott B.; And Others

    Much of the research on sex bias looks at impressions at a single point in time. However, impressions are often changed as information is accumulated. This study attempted to look at the dynamic nature of impression formation. Impressions of both male and female job applicants were measured before and after subjects had an opportunity to view…

  20. Motivation for Staying in College: Differences Between LEP (Limited English Proficiency) and Non-LEP Hispanic Community College Students

    ERIC Educational Resources Information Center

    Fong, Carlton J.; Krause, Jaimie M.; Acee, Taylor W.; Weinstein, Claire Ellen

    2016-01-01

    The study investigated motivational differences and higher education outcomes between limited English proficiency (LEP) Hispanic students compared with non-LEP Hispanic students. With a sample of 668 Hispanic community college students, we measured various forms of achievement motivation informed by self-determination theory, grade point average…

  1. Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-06-01

    Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) technique for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF against single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points; we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data; so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective which needs to be aligned with objective machine metrics.

  2. Effectiveness of provider incentives for anaemia reduction in rural China: a cluster randomised trial

    PubMed Central

    Luo, Renfu; Zhang, Linxiu; Sylvia, Sean; Shi, Yaojiang; Foo, Patricia; Zhao, Qiran; Martorell, Reynaldo; Medina, Alexis; Rozelle, Scott

    2012-01-01

    Objectives To test the impact of provider performance pay for anaemia reduction in rural China. Design A cluster randomised trial of information, subsidies, and incentives for school principals to reduce anaemia among their students. Enumerators and study participants were not informed of study arm assignment. Setting 72 randomly selected rural primary schools across northwest China. Participants 3553 fourth and fifth grade students aged 9-11 years. All fourth and fifth grade students in sample schools participated in the study. Interventions Sample schools were randomly assigned to a control group, with no intervention, or one of three treatment arms: (a) an information arm, in which principals received information about anaemia; (b) a subsidy arm, in which principals received information and unconditional subsidies; and (c) an incentive arm, in which principals received information, subsidies, and financial incentives for reducing anaemia among students. Twenty seven schools were assigned to the control arm (1816 students at baseline, 1623 at end point), 15 were assigned to the information arm (659 students at baseline, 596 at end point), 15 to the subsidy arm (726 students at baseline, 667 at end point), and 15 to the incentive arm (743 students at baseline, 667 at end point). Main outcome measures Student haemoglobin concentrations. Results Mean student haemoglobin concentration rose by 1.5 g/L (95% CI –1.1 to 4.1) in information schools, 0.8 g/L (–1.8 to 3.3) in subsidy schools, and 2.4 g/L (0 to 4.9) in incentive schools compared with the control group. This increase in haemoglobin corresponded to a reduction in prevalence of anaemia (Hb <115 g/L) of 24% in incentive schools. 
Interactions with pre-existing incentives for principals to achieve good academic performance led to substantially larger gains in the information and incentive arms: when combined with incentives for good academic performance, associated effects on student haemoglobin concentration were 9.8 g/L (4.1 to 15.5) larger in information schools and 8.6 g/L (2.1 to 15.1) larger in incentive schools. Conclusions Financial incentives for health improvement were modestly effective. Understanding interactions with other motives and pre-existing incentives is critical. Trial registration number ISRCTN76158086. PMID:22842354

  3. Rényi entropies and topological quantum numbers in 2D gapped Dirac materials

    NASA Astrophysics Data System (ADS)

    Bolívar, Juan Carlos; Romera, Elvira

    2017-05-01

New topological quantum numbers are introduced by analyzing complexity measures and relative Rényi entropies in silicene in the presence of perpendicular electric and magnetic fields. These topological quantum numbers characterize the topological insulator and band insulator phases in silicene. In addition, we have found that these information measures reach extremum values at the charge neutrality points. These results are valid for other 2D gapped Dirac materials analogous to silicene with a buckled honeycomb structure and a significant spin-orbit coupling.
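For reference, the order-α Rényi entropy underlying these measures is, for a density matrix ρ,

```latex
R_\alpha(\rho) \;=\; \frac{1}{1-\alpha}\,\ln \operatorname{Tr}\rho^{\alpha},
\qquad \alpha > 0,\; \alpha \neq 1,
```

which reduces to the von Neumann entropy \(-\operatorname{Tr}\rho\ln\rho\) in the limit \(\alpha \to 1\); the relative Rényi entropies used in the paper are a two-state generalization of this quantity.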

  4. Design and Production of Color Calibration Targets for Digital Input Devices

    DTIC Science & Technology

    2000-07-01

gamuts. Fourth, the color transform from CIELCH to sRGB will be described. Fifth, the relevant target mockups will be created. Sixth, the quality will be... Implement statistical process controls... Print, process, measure and reject... Transfer the measured CIEXYZ of the target patches to sRGB... Generate... Kodak Royal VII paper and sRGB. This plot shows all points on the a*-b* plane without information about the L*. The sRGB color gamut is obtained from

  5. Coastal Research Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Lucey, Paul G.; Williams, Timothy; Horton, Keith A.

    2002-01-01

    The Coastal Research Imaging Spectrometer (CRIS) is an airborne remote-sensing system designed specifically for research on the physical, chemical, and biological characteristics of coastal waters. The CRIS includes a visible-light hyperspectral imaging subsystem for measuring the color of water, which contains information on the biota, sediment, and nutrient contents of the water. The CRIS also includes an infrared imaging subsystem, which provides information on the temperature of the water. The combination of measurements enables investigation of biological effects of both natural and artificial flows of water from land into the ocean, including diffuse and point-source flows that may contain biological and/or chemical pollutants. Temperature is an important element of such measurements because temperature contrasts can often be used to distinguish among flows from different sources: for example, a sewage outflow could manifest itself in spectral images as a local high-temperature anomaly.

  6. Inference of relativistic electron spectra from measurements of inverse Compton radiation

    NASA Astrophysics Data System (ADS)

    Craig, I. J. D.; Brown, J. C.

    1980-07-01

    The inference of relativistic electron spectra from spectral measurement of inverse Compton radiation is discussed for the case where the background photon spectrum is a Planck function. The problem is formulated in terms of an integral transform that relates the measured spectrum to the unknown electron distribution. A general inversion formula is used to provide a quantitative assessment of the information content of the spectral data. It is shown that the observations must generally be augmented by additional information if anything other than a rudimentary two or three parameter model of the source function is to be derived. It is also pointed out that since a similar equation governs the continuum spectra emitted by a distribution of black-body radiators, the analysis is relevant to the problem of stellar population synthesis from galactic spectra.

  7. Metric adjusted skew information

    PubMed Central

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner–Yanase–Dyson skew information to something we call “metric adjusted skew information” (of a state with respect to a conserved observable). This “skew information” is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova–Chentsov functions describing the possible quantum statistics is a Bauer simplex and determine its extreme points. We determine a particularly simple skew information, the “λ-skew information,” parametrized by a λ ∈ (0, 1], and show that the convex cone this family generates coincides with the set of all metric adjusted skew informations. PMID:18635683
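For reference, the Wigner–Yanase–Dyson skew information that the paper generalizes is

```latex
I_p(\rho, A) \;=\; -\tfrac{1}{2}\,\operatorname{Tr}\!\bigl([\rho^{p}, A]\,[\rho^{1-p}, A]\bigr),
\qquad 0 < p < 1,
```

with the original Wigner–Yanase case at \(p = \tfrac12\), \(I(\rho,A) = -\tfrac{1}{2}\operatorname{Tr}\bigl([\sqrt{\rho},A]^{2}\bigr)\). The metric adjusted skew informations of the paper replace this one-parameter family with the full set generated by normalized Morozova–Chentsov functions.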

  8. Evaluating hospital information systems from the point of view of the medical records section users in Medical-Educational Hospitals of Kermanshah 2014.

    PubMed

    Rostami, S; Sarmad, A; Mohammadi, M; Cheleie, M; Amiri, S; Zardoei Golanbary, S H

    2015-01-01

Evaluating hospital information systems leads to improvements tailored to users' needs, especially for users of the medical records section in hospitals, who are in contact with the system from the moment a patient enters the hospital until discharge and beyond. The present research aimed to evaluate hospital information systems from the point of view of medical records section employees. Materials and method: This applied, descriptive-analytical study covered 70 users of the medical records section in the educational-medical centers of Kermanshah city. The data-gathering tool was part 10 of the ISO 9241/10 (IsoMetrics) standard questionnaire for evaluating hospital information systems, with 75 specific questions across 7 dimensions on a five-point Likert scale; its construct validity had been confirmed in previous research. Data were analyzed with SPSS version 22, and reliability was confirmed by Cronbach's alpha, which equaled 0.89. Findings: Based on median scores, the employees were most satisfied with conformity with user expectations (3.55), self-descriptiveness (3.54) and controllability (3.51), giving an overall average of 3.39; the lowest satisfaction related to suitability for learning, at 3.19. Discussion and conclusion: Users of hospital information systems consider it desirable that existing systems be built around these measures. Given the considerable distance between the existing information systems and the desired performance, it is essential that these systems pay more attention to a fuller and deeper recognition of users' opinions and requirements, in order to increase their chance of succeeding and achieving their goals, where the goal is to improve patient care and the health of society with the help of information technology.

  9. "I like the sound of that!" Wine descriptions influence consumers' expectations, liking, emotions and willingness to pay for Australian white wines.

    PubMed

    Danner, Lukas; Johnson, Trent E; Ristic, Renata; Meiselman, Herbert L; Bastian, Susan E P

    2017-09-01

    This study investigated how information, typically presented on wine back-labels or wine company websites, influences consumers' expected liking, informed liking, wine-evoked emotions and willingness to pay for Australian white wines. Regular white wine consumers (n=126) evaluated the same set of three commercially available white wines (mono-varietal Chardonnay, Riesling, Sauvignon Blanc) under three information levels: Session 1, a blind tasting (no information provided), and Session 2, an informed tasting (held at least 1 week later) with both basic (sensory description of the wines) and elaborate (sensory plus high wine quality and favourable winery information) descriptions, each followed by liking, wine-evoked emotion (measured with the Australian Wine Evoked Emotions Lexicon (AWEEL)) and willingness-to-pay evaluations. Before tasting the wines in Session 2, consumers also rated expected liking. Results showed that information level had a significant effect on all investigated variables. The elaborate information level evoked higher expectations before tasting the wines, and resulted in higher liking ratings, elicitation of more intense positive emotions (e.g. contented, happy and warm-hearted) and less intense negative emotions (e.g. embarrassed and unfulfilled), and a substantial increase in willingness to pay after tasting the wines compared to the blind condition, with the basic condition ranging in between. These results were consistent across the three wine samples. Furthermore, if the liking rating after tasting the wines matched the expected liking, or exceeded expectations by 1 point on a 9-point hedonic scale, participants felt the most intense positive emotions and the least intense negative emotions. Conversely, if expectations were not met, or actual liking exceeded expectations by more than 2 points, participants felt less intense positive and more intense negative emotions. 
This highlights not only the importance of well written and accurate wine descriptions, but also that information can influence consumers' wine drinking experience and behaviour. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Research on distributed optical fiber sensing data processing method based on LabVIEW

    NASA Astrophysics Data System (ADS)

    Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing

    2018-01-01

    Pipeline leak detection and leak location have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline, and the data processing method for distributed optical fiber sensing based on LabVIEW is studied in detail. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software system, developed in LabVIEW, applies wavelet denoising to the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and querying. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
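    The wavelet-denoising step described above can be sketched with a single-level Haar transform and soft thresholding. This is a minimal illustration, not the paper's LabVIEW implementation: the wavelet choice, decomposition depth, threshold rule, and the synthetic temperature trace are all assumptions.

```python
import numpy as np

# One-level Haar wavelet denoising of a synthetic fiber temperature trace.
# Wavelet, level, and threshold rule are illustrative assumptions.
def haar_denoise(x, thresh):
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass band (trend)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass band (mostly noise)
    # Soft-threshold the detail coefficients, then invert the transform.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    y = np.empty_like(x)
    y[0::2] = (approx + detail) / np.sqrt(2)
    y[1::2] = (approx - detail) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
n = 1024
clean = 20.0 + 5.0 * (np.arange(n) >= n // 2)   # leak-like temperature step
noisy = clean + rng.normal(0.0, 0.5, n)
sigma = 0.5
thresh = sigma * np.sqrt(2.0 * np.log(n))       # universal threshold
denoised = haar_denoise(noisy, thresh)
```

    Zeroing sub-threshold detail coefficients suppresses sensor noise, while the step carried by the approximation band survives, which is what makes the leak-induced temperature change easier to locate.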

  11. Study on the measuring distance for blood glucose infrared spectral measuring by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Li, Xiang

    2016-10-01

    Blood glucose monitoring is of great importance for managing diabetes and preventing its complications. At present, clinical blood glucose concentration measurement is invasive; it could be replaced by noninvasive spectroscopic analytical techniques. Among the parameters of the optical fiber probe used in spectral measurement, the measurement distance is the key one. The Monte Carlo technique is a flexible method for simulating light propagation in tissue. The simulation is based on the random walks that photons make as they travel through tissue, chosen by statistically sampling the probability distributions for step size and angular deflection per scattering event. The traditional way to determine the optimal distance between the transmitting fiber and the detector is to use Monte Carlo simulation to find the point where most photons exit. But there is a problem: the epidermal layer contains no arteries, veins or capillary vessels, so when photons propagate and interact with tissue in the epidermal layer, they acquire no glucose information. A new criterion is therefore proposed to determine the optimal distance, named the effective path length in this paper: the path length each photon travels in the dermis is recorded while running the Monte Carlo simulation, and the sum of the effective path lengths of the photons exiting at each point is calculated. The detector should be placed at the point with the greatest total effective path length, which determines the optimal measuring distance between the transmitting fiber and the detector.
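    The effective-path-length criterion can be illustrated with a toy 2D Monte Carlo walk. This is only a sketch of the idea: the layer depths, the attenuation coefficient, the isotropic 2D scattering model, and the bin sizes are placeholders, not real tissue optics or the paper's simulation.

```python
import numpy as np

# Toy 2D Monte Carlo: only the portion of each photon step travelled inside
# the dermis (where the glucose signal originates) is credited to the photon's
# surface exit point. All optical parameters are illustrative placeholders.
rng = np.random.default_rng(1)
MU_T = 10.0        # attenuation coefficient -> mean free path 0.1 mm (placeholder)
EPIDERMIS = 0.1    # epidermis/dermis boundary depth, mm (placeholder)
DERMIS = 1.0       # lower dermis boundary depth, mm (placeholder)

def trace_photon(max_steps=200):
    """Return (surface exit offset, dermis path length) or None if lost."""
    x, z, ux, uz = 0.0, 0.0, 0.0, 1.0            # launched straight down
    effective = 0.0
    for _ in range(max_steps):
        step = rng.exponential(1.0 / MU_T)
        z_new = z + uz * step
        lo, hi = sorted((z, z_new))
        overlap = max(0.0, min(hi, DERMIS) - max(lo, EPIDERMIS))
        if hi > lo:                               # credit dermis fraction of step
            effective += step * overlap / (hi - lo)
        x += ux * step
        z = z_new
        if z < 0.0:                               # photon re-emerges at surface
            return abs(x), effective
        phi = rng.uniform(0.0, 2.0 * np.pi)       # isotropic 2D scattering
        ux, uz = np.sin(phi), np.cos(phi)
    return None                                   # wandered off / absorbed

# Sum the effective path length per 0.1 mm radial bin at the surface;
# the detector goes where this sum (not the raw photon count) peaks.
bins = np.zeros(20)
for _ in range(3000):
    result = trace_photon()
    if result is not None:
        r, eff = result
        if r < 2.0:
            bins[int(r / 0.1)] += eff
best_distance = (np.argmax(bins) + 0.5) * 0.1     # bin centre, mm
```

    The key difference from the traditional criterion is the weighting: a photon that exits nearby but never entered the dermis contributes nothing to its bin.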

  12. How to use the Sun-Earth Lagrange points for fundamental physics and navigation

    NASA Astrophysics Data System (ADS)

    Tartaglia, A.; Lorenzini, E. C.; Lucchesi, D.; Pucacco, G.; Ruggiero, M. L.; Valko, P.

    2018-01-01

    We illustrate the proposal, nicknamed LAGRANGE, to use spacecraft, located at the Sun-Earth Lagrange points, as a physical reference frame. Performing time of flight measurements of electromagnetic signals traveling on closed paths between the points, we show that it would be possible: (a) to refine gravitational time delay knowledge due both to the Sun and the Earth; (b) to detect the gravito-magnetic frame dragging of the Sun, so deducing information about the interior of the star; (c) to check the possible existence of a galactic gravitomagnetic field, which would imply a revision of the properties of a dark matter halo; (d) to set up a relativistic positioning and navigation system at the scale of the inner solar system. The paper presents estimated values for the relevant quantities and discusses the feasibility of the project analyzing the behavior of the space devices close to the Lagrange points.

  13. Comparison of Fixed Point Realisations between Inmetro and PTB

    NASA Astrophysics Data System (ADS)

    Santiago, J. F. N.; Petkovic, S. G.; Teixeira, R. N.; Noatsch, U.; Thiele-Krivoj, B.

    2003-09-01

    An interlaboratory comparison in the temperature range between -190 °C and 420 °C was organised between the National Institute of Metrology, Standardisation and Industrial Quality (Inmetro), Brazil, and the Physikalisch-Technische Bundesanstalt (PTB), Germany. This comparison followed the same protocol as the EUROMET project 552 comparison and was carried out in the years 2001-2002. A 25 Ω standard platinum resistance thermometer (SPRT) was calibrated at the temperature fixed points of Ar, Hg, the triple point of water (TPW), Ga, In, Sn and Zn, with at least three realisations of each fixed point at both institutes. The uncertainty evaluation given by Inmetro and some differences in the calibration procedures and measuring instruments used are described. The agreement between the laboratories' results was not in all cases within the combined uncertainties. Results of other comparisons are also presented, which give additional information on the equivalence of the realised temperature scales.

  14. Quality of internet-based decision aids for shoulder arthritis: what are patients reading?

    PubMed

    Somerson, Jeremy S; Bois, Aaron J; Jeng, Jeffrey; Bohsali, Kamal I; Hinchey, John W; Wirth, Michael A

    2018-04-11

    The objective of this study was to assess the source, quality, accuracy, and completeness of Internet-based information for shoulder arthritis. A web search was performed using three common Internet search engines and the top 50 sites from each search were analyzed. Information sources were categorized into academic, commercial, non-profit, and physician sites. Information quality was measured using the Health On the Net (HON) Foundation principles, content accuracy by counting factual errors and completeness using a custom template. After removal of duplicates and sites that did not provide an overview of shoulder arthritis, 49 websites remained for analysis. The majority of sites were from commercial (n = 16, 33%) and physician (n = 16, 33%) sources. An additional 12 sites (24%) were from an academic institution and five sites (10%) were from a non-profit organization. Commercial sites had the highest number of errors, with a five-fold likelihood of containing an error compared to an academic site. Non-profit sites had the highest HON scores, with an average of 9.6 points on a 16-point scale. The completeness score was highest for academic sites, with an average score of 19.2 ± 6.7 (maximum score of 49 points); other information sources had lower scores (commercial, 15.2 ± 2.9; non-profit, 18.7 ± 6.8; physician, 16.6 ± 6.3). Patient information on the Internet regarding shoulder arthritis is of mixed accuracy, quality, and completeness. Surgeons should actively direct patients to higher-quality Internet sources.

  15. Evaluation of methods for rapid determination of freezing point of aviation fuels

    NASA Technical Reports Server (NTRS)

    Mathiprakasam, B.

    1982-01-01

    Methods for identification of the more promising concepts for the development of a portable instrument to rapidly determine the freezing point of aviation fuels are described. The evaluation process consisted of: (1) collection of information on techniques previously used for the determination of the freezing point, (2) screening and selection of these techniques for further evaluation of their suitability in a portable unit for rapid measurement, and (3) an extensive experimental evaluation of the selected techniques and a final selection of the most promising technique. Test apparatuses employing differential thermal analysis and the change in optical transparency during phase change were evaluated and tested. A technique similar to differential thermal analysis using no reference fuel was investigated. In this method, the freezing point was obtained by digitizing the data and locating the point of inflection. Results obtained using this technique compare well with those obtained elsewhere using different techniques. A conceptual design of a portable instrument incorporating this technique is presented.
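    The "digitize the data and locate the point of inflection" step can be sketched numerically. The cooling curve below is synthetic (a linear ramp arrested by latent heat at -40 °C), and the smoothing window and knee-detection rule are assumptions of this sketch, not the report's instrument design.

```python
import numpy as np

# Synthetic cooling curve: steady ramp, then a latent-heat arrest plateau.
t = np.linspace(0.0, 10.0, 200)
T_FREEZE = -40.0
temperature = np.maximum(20.0 - 7.0 * t, T_FREEZE)
rng = np.random.default_rng(2)
temperature = temperature + rng.normal(0.0, 0.05, t.size)  # sensor noise

def freezing_point(time, temp, win=15):
    """Find the knee via the peak of |d2T/dt2|, then read the arrest plateau."""
    smooth = np.convolve(temp, np.ones(win) / win, mode="same")
    d2 = np.gradient(np.gradient(smooth, time), time)
    k = np.argmax(np.abs(d2[win:-win])) + win   # skip zero-padding edge artefacts
    return float(np.median(temp[k:]))           # plateau = freezing temperature

estimate = freezing_point(t, temperature)
```

    The second derivative of the smoothed trace peaks where the slope collapses from the cooling rate to zero, which is exactly the inflection the report digitizes for.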

  16. Terrestrial laser scanning point clouds time series for the monitoring of slope movements: displacement measurement using image correlation and 3D feature tracking

    NASA Astrophysics Data System (ADS)

    Bornemann, Pierrick; Malet, Jean-Philippe; Stumpf, André; Puissant, Anne; Travelletti, Julien

    2016-04-01

    Dense multi-temporal point clouds acquired with terrestrial laser scanning (TLS) have proved useful for studying the structure and kinematics of slope movements. Most existing deformation analysis methods rely on interpolated data. Approaches using multiscale image correlation provide a precise and robust estimation of the observed movements; however, for non-rigid motion patterns, these methods tend to underestimate all components of the movement. Further, for rugged surface topography, interpolated data introduce a bias and a loss of information in local places where the point cloud is not sufficiently dense. These limits can be overcome by deformation analyses that work directly on the original 3D point clouds, under some hypotheses on the deformation (e.g. the classic ICP algorithm requires an initial guess of the expected displacement patterns from the user). The objective of this work is therefore to propose a deformation analysis method applied to a series of 20 3D point clouds covering the period October 2007 - October 2015 at the Super-Sauze landslide (South East French Alps). The dense point clouds were acquired with a terrestrial long-range Optech ILRIS-3D laser scanner from the same base station. The time series are analyzed using two approaches: 1) correlation of gradient images, and 2) feature tracking in the raw 3D point clouds. The estimated surface displacements are then compared with GNSS surveys on reference targets. Preliminary results tend to show that the image correlation method provides a good first-order estimation of the displacement fields, but shows limitations such as the inability to track some deformation patterns, and the use of a perspective projection that does not preserve the original angles and distances in the correlated images. 
Results obtained with 3D point cloud comparison algorithms (C2C, ICP, M3C2) bring additional information on the displacement fields. Displacement fields derived from both approaches are then combined to provide a better understanding of the landslide kinematics.
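    The cloud-to-cloud (C2C) comparison mentioned above reduces to a nearest-neighbour search between epochs, as a minimal sketch on synthetic data shows. Brute force is used here for clarity; real TLS clouds need a spatial index such as a k-d tree, and the shift applied below is invented.

```python
import numpy as np

def c2c_distances(ref, moved):
    """Distance from each point of `moved` to its nearest neighbour in `ref`."""
    diff = moved[:, None, :] - ref[None, :, :]         # (M, N, 3) pairwise offsets
    return np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)

rng = np.random.default_rng(3)
ref = rng.uniform(0.0, 10.0, (500, 3))                 # epoch 1 (synthetic cloud)
moved = ref + np.array([0.2, 0.0, 0.0])                # epoch 2: rigid 0.2 m shift
d = c2c_distances(ref, moved)
```

    Note that C2C reports closest-point distance rather than a true displacement vector, so it can underestimate motion along the surface; this limitation is one reason for combining several comparison algorithms and 3D feature tracking.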

  17. Investigation of Service Quality of Measurement Reference Points for the Internet Services on Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lipenbergs, E.; Bobrovs, Vj.; Ivanovs, G.

    2016-10-01

    To ensure that end-users and consumers have access to comprehensive, comparable and user-friendly information regarding Internet access service quality, it is necessary to implement and regularly renew a set of legislative regulatory acts and to monitor the quality of Internet access services in line with the current European Regulatory Framework. The actual situation regarding quality-of-service monitoring solutions in different European countries depends on national regulatory initiatives and public awareness, and the monitoring solutions are implemented using different measurement methodologies and tools. The paper investigates practical implementations toward a harmonised approach to quality monitoring, in order to obtain objective information on the quality of Internet access services on mobile networks.

  18. Scatter metrology of photovoltaic textured surfaces

    NASA Astrophysics Data System (ADS)

    Stover, John C.; Hegstrom, Eric L.

    2010-09-01

    In recent years it has become common practice to texture many of the layered surfaces making up photovoltaic cells in order to increase light absorption and efficiency. Profilometry has been used to characterize the texture, but this is not satisfactory for in-line production systems which move surfaces too fast for that measurement. Scatterometry has been used successfully to measure roughness for many years. Its advantages include low cost, non-contact measurement and insensitivity to vibration; however, it also has some limitations. This paper presents scatter measurements made on a number of photovoltaic samples using two different scatterometers. It becomes clear that in many cases the surface roughness exceeds the optical smoothness limit (required to calculate surface statistics from scatter), but it is also clear that scatter measurement is a fast, sensitive indicator of texture and can be used to monitor whether design specifications are being met. A third key point is that there is a lot of surface dependent information available in the angular variations of the measured scatter. When the surface is inspected by integrating the scatter signal (often called a "Haze" measurement) this information is lost.

  19. Imprecise (fuzzy) information in geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

    A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing broader use of geostatistics has been an insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  20. Health indicators 1991.

    PubMed

    Dawson, N

    1991-01-01

    This is the second edition of a database developed by the Canadian Centre for Health Information (CCHI). It features 49 health indicators, under one cover containing the most recent data available from a variety of national surveys. This information may be used to establish health goals for the population and to offer objective measures of their success. The database can be accessed through CANSIM, Statistics Canada's socio-economic electronic database and retrieval system, or through a personal computer package which enables the user to retrieve and analyze the 1.2 million data points in the system.

  1. Reply to ``Comment on `Performance of different synchronization measures in real data: A case study on electroencephalographic signals' ''

    NASA Astrophysics Data System (ADS)

    Quian Quiroga, R.; Kraskov, A.; Kreuz, T.; Grassberger, P.

    2003-06-01

    We agree with the Comment by Duckrow and Albano [Phys. Rev. E 67, 063901 (2003)] that mutual information, estimated with an optimized algorithm, can be a useful tool for studying synchronization in real data. However, we point out that the improvement they found is mainly due to an interesting but nonstandard embedding technique used, and not so much due to the algorithm used for the estimation of mutual information itself. We also address the issue of stationarity of electroencephalographic (EEG) data.

  2. Loose fusion based on SLAM and IMU for indoor environment

    NASA Astrophysics Data System (ADS)

    Zhu, Haijiang; Wang, Zhicheng; Zhou, Jinglin; Wang, Xuejing

    2018-04-01

    The simultaneous localization and mapping (SLAM) method based on the RGB-D sensor has been widely researched in recent years. However, the accuracy of RGB-D SLAM relies heavily on corresponding feature points, and the position can be lost in scenes with sparse textures. Therefore, many fusion methods using RGB-D information and inertial measurement unit (IMU) data have been investigated to improve the accuracy of SLAM systems. However, these fusion methods usually do not take into account the number of matched feature points, and the pose estimated from RGB-D information may be inaccurate when the number of correct matches is too small. Thus, considering the impact of matches on the SLAM system and the problem of losing position in scenes with few textures, a loose fusion method combining RGB-D with IMU is proposed in this paper. We design a loose fusion strategy based on RGB-D camera information and IMU data, which uses the IMU data for position estimation when corresponding point matches are too few; when there are many matches, the RGB-D information is still used to estimate position. The final pose is optimized by the General Graph Optimization (g2o) framework to reduce error. The experimental results show that the proposed method outperforms the RGB-D-only method and works stably in indoor environments with sparse textures.
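    The switching rule at the heart of such a loose fusion can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the match threshold, the planar (x, y, yaw) pose representation, and the function names are assumptions, and the g2o refinement step is omitted.

```python
from dataclasses import dataclass

MIN_MATCHES = 30  # assumed threshold below which visual pose is untrusted

@dataclass
class Pose:
    x: float
    y: float
    yaw: float

def fuse_step(prev, rgbd_pose, num_matches, imu_delta):
    """Pick the pose source for one frame of a loose RGB-D/IMU fusion."""
    if num_matches >= MIN_MATCHES and rgbd_pose is not None:
        return rgbd_pose, "rgbd"          # enough matches: trust vision
    # Texture-poor frame: dead-reckon by integrating the IMU increment.
    dx, dy, dyaw = imu_delta
    return Pose(prev.x + dx, prev.y + dy, prev.yaw + dyaw), "imu"

p0 = Pose(0.0, 0.0, 0.0)
# Well-textured frame: the RGB-D estimate wins.
p1, src1 = fuse_step(p0, Pose(0.5, 0.1, 0.02), num_matches=120,
                     imu_delta=(0.4, 0.1, 0.0))
# Sparse-texture frame (visual tracking failed): fall back to the IMU.
p2, src2 = fuse_step(p1, None, num_matches=4, imu_delta=(0.3, 0.0, 0.01))
```

    In a full pipeline each selected pose would then enter the g2o graph as a node, with the chosen measurement as its constraint.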

  3. Trail resource impacts and an examination of alternative assessment techniques

    USGS Publications Warehouse

    Marion, J.L.; Leung, Y.-F.

    2001-01-01

    Trails are a primary recreation resource facility on which recreation activities are performed. They provide safe access to non-roaded areas, support recreational opportunities such as hiking, biking, and wildlife observation, and protect natural resources by concentrating visitor traffic on resistant treads. However, increasing recreational use, coupled with poorly designed and/or maintained trails, has led to a variety of resource impacts. Trail managers require objective information on trails and their conditions to monitor trends, direct trail maintenance efforts, and evaluate the need for visitor management and resource protection actions. This paper reviews trail impacts and different types of trail assessments, including inventory, maintenance, and condition assessment approaches. Two assessment methods, point sampling and problem assessment, are compared empirically from separate assessments of a 15-mile segment of the Appalachian Trail in Great Smoky Mountains National Park. Results indicate that point sampling and problem assessment methods yield distinctly different types of quantitative information. The point sampling method provides more accurate and precise measures of trail characteristics that are continuous or frequent (e.g., tread width or exposed soil). The problem assessment method is a preferred approach for monitoring trail characteristics that can be easily predefined or are infrequent (e.g., excessive width or secondary treads), particularly when information on the location of specific trail impact problems is needed. The advantages and limitations of these two assessment methods are examined in relation to various management and research information needs. The choice and utility of these assessment methods are also discussed.

  4. Molecular interaction networks in the analyses of sequence variation and proteomics data.

    PubMed

    Stelzl, Ulrich

    2013-12-01

    Protein-protein interaction networks are typically generated in standard cell lines or model organisms as it is prohibitively difficult to record large interaction datasets from specific tissues or disease models at a reasonable pace. Although the interaction data are of high confidence, they thus do not reflect in vivo relationships as such. A wealth of physiologically relevant protein information, obtained under different conditions and from different systems, is available including information on genetic variation, protein levels, and PTMs. However, these data are difficult to assess comprehensively because the relationships between the entities remain elusive from the measurements. Here, we exemplarily highlight recent studies that gained deeper insight from genetic variation, protein, and PTM measurements using interaction information pointing toward the importance and potential of interaction networks for the interpretation of sequencing and proteomics data. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Influencing Food Selection with Point-of-Choice Nutrition Information.

    ERIC Educational Resources Information Center

    Davis-Chervin, Doryn; And Others

    1985-01-01

    Evaluated the effectiveness of a point-of-choice nutrition information program that used a comprehensive set of communication functions in its design. Results indicate that point-of-choice information without direct tangible rewards can (to a moderate degree) modify food-selection behavior of cafeteria patrons. (JN)

  6. Determination of point of zero charge of natural organic materials.

    PubMed

    Bakatula, Elisee Nsimba; Richard, Dominique; Neculita, Carmen Mihaela; Zagury, Gerald J

    2018-03-01

    This study evaluates different methods to determine points of zero charge (PZCs) on five organic materials, namely maple sawdust, wood ash, peat moss, compost, and brown algae, used for the passive treatment of contaminated neutral drainage effluents. The PZC provides important information about metal sorption mechanisms. Three methods were used: (1) the salt addition method, measuring the PZC; (2) the zeta potential method, measuring the isoelectric point (IEP); (3) the ion adsorption method, measuring the point of zero net charge (PZNC). Natural kaolinite and synthetic goethite were also tested with both the salt addition and the ion adsorption methods in order to validate experimental protocols. Results obtained from the salt addition method in 0.05 M NaNO3 were the following: 4.72 ± 0.06 (maple sawdust), 9.50 ± 0.07 (wood ash), 3.42 ± 0.03 (peat moss), 7.68 ± 0.01 (green compost), and 6.06 ± 0.11 (brown algae). Both the ion adsorption and the zeta potential methods failed to give points of zero charge for these substrates. The PZC of kaolinite (3.01 ± 0.03) was similar to the PZNC (2.9-3.4) and fell within the range of values reported in the literature (2.7-4.1). As for the goethite, the PZC (10.9 ± 0.05) was slightly higher than the PZNC (9.0-9.4). The salt addition method has been found appropriate and convenient to determine the PZC of natural organic substrates.
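    The data reduction behind the salt addition method can be sketched briefly: the PZC is the initial pH at which adding salt leaves the pH unchanged, i.e. the zero crossing of ΔpH (final minus initial) versus initial pH. The data pairs below are invented for illustration and are not taken from the study.

```python
import numpy as np

# Invented salt-addition series: initial pH of each suspension and the pH
# change measured after adding the background electrolyte.
ph_initial = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
delta_ph   = np.array([1.1, 0.9, 0.4, -0.2, -0.7, -1.0, -1.2])

def pzc_from_salt_addition(ph0, dph):
    """Linearly interpolate the zero crossing of delta-pH vs initial pH."""
    order = np.argsort(ph0)
    ph0, dph = ph0[order], dph[order]
    i = np.where(np.diff(np.sign(dph)) != 0)[0][0]   # bracketing interval
    frac = dph[i] / (dph[i] - dph[i + 1])
    return ph0[i] + frac * (ph0[i + 1] - ph0[i])

pzc = pzc_from_salt_addition(ph_initial, delta_ph)
```

    Below the PZC the surface takes up protons (ΔpH > 0); above it, it releases them (ΔpH < 0), which is why the crossing identifies the point of zero charge.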

  7. Impaired Patient-Reported Outcomes Predict Poor School Functioning and Daytime Sleepiness: The PROMIS Pediatric Asthma Study.

    PubMed

    Jones, Conor M; DeWalt, Darren A; Huang, I-Chan

    Poor asthma control in children is related to impaired patient-reported outcomes (PROs; eg, fatigue, depressive symptoms, anxiety), but less well studied is the effect of PROs on children's school performance and sleep outcomes. In this study we investigated whether the consistency status of PROs over time affected school functioning and daytime sleepiness in children with asthma. Of the 238 children with asthma enrolled in the Patient-Reported Outcomes Measurement Information System (PROMIS) Pediatric Asthma Study, 169 children who provided survey data for all 4 time points were used in the analysis. The child's PROs, school functioning, and daytime sleepiness were measured 4 times within a 15-month period. PRO domains included asthma impact, pain interference, fatigue, depressive symptoms, anxiety, and mobility. Each child was classified as having poor/fair versus good PROs per meaningful cut points. The consistency status of each domain was classified as consistently poor/fair if poor/fair status was present for at least 3 time points; otherwise, the status was classified as consistently good. Seemingly unrelated regression was performed to test if consistently poor/fair PROs predicted impaired school functioning and daytime sleepiness at the fourth time point. Consistently poor/fair in all PRO domains was significantly associated with impaired school functioning and excessive daytime sleepiness (Ps < .01) after controlling for the influence of the child's age, sex, and race/ethnicity. Children with asthma with consistently poor/fair PROs are at risk of poor school functioning and daytime sleepiness. Developing child-friendly PRO assessment systems to track PROs can inform potential problems in the school setting. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  8. The fuzzy polynucleotide space: basic properties.

    PubMed

    Torres, Angela; Nieto, Juan J

    2003-03-22

    Any triplet codon may be regarded as a 12-dimensional fuzzy code. Sufficient information about a particular sequence may not be available in certain situations; the investigator will be confronted with imprecise sequences, yet want to compare them. Fuzzy polynucleotides can be compared by using the geometrical interpretation of fuzzy sets as points in a hypercube. We introduce the space of fuzzy polynucleotides and a means of measuring dissimilarities between them. We establish mathematical principles to measure dissimilarities between fuzzy polynucleotides and present several examples in this metric space. We calculate the frequencies of the nucleotides at the three base sites of a codon in the coding sequences of Escherichia coli K-12 and Mycobacterium tuberculosis H37Rv, and consider them as points in that fuzzy space. We compute the distance between the genomes of E. coli and M. tuberculosis.
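    The hypercube view can be made concrete: each of the three codon positions contributes four membership values (A, C, G, U), so a triplet is a point in [0,1]^12. The dissimilarity below is a standard fuzzy difference measure (sum of absolute differences normalised by the sum of pointwise maxima); the paper's exact metric may differ, and the example memberships are invented.

```python
import numpy as np

def codon_to_point(freqs):
    """Flatten three per-position nucleotide membership dicts into R^12."""
    order = "ACGU"
    return np.array([pos[b] for pos in freqs for b in order])

def fuzzy_dissimilarity(p, q):
    """sum |p_i - q_i| / sum max(p_i, q_i); equals 0 for identical points."""
    return np.abs(p - q).sum() / np.maximum(p, q).sum()

# A crisp codon (AUG) versus an imprecise, fuzzy codon.
aug = codon_to_point([{"A": 1, "C": 0, "G": 0, "U": 0},
                      {"A": 0, "C": 0, "G": 0, "U": 1},
                      {"A": 0, "C": 0, "G": 1, "U": 0}])
fuzzy = codon_to_point([{"A": 0.7, "C": 0.1, "G": 0.1, "U": 0.1},
                        {"A": 0.0, "C": 0.0, "G": 0.2, "U": 0.8},
                        {"A": 0.1, "C": 0.1, "G": 0.7, "U": 0.1}])
d = fuzzy_dissimilarity(aug, fuzzy)
```

    The same function applies unchanged to whole-genome codon-position frequency vectors, which is how a single number comparing two coding sequences can be obtained.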

  9. Electrical resistivity imaging in transmission between surface and underground tunnel for fault characterization

    NASA Astrophysics Data System (ADS)

    Lesparre, N.; Boyle, A.; Grychtol, B.; Cabrera, J.; Marteau, J.; Adler, A.

    2016-05-01

    Electrical resistivity images supply information on sub-surface structures and are classically performed to characterize faults geometry. Here we use the presence of a tunnel intersecting a regional fault to inject electrical currents between surface and the tunnel to improve the image resolution at depth. We apply an original methodology for defining the inversion parametrization based on pilot points to better deal with the heterogeneous sounding of the medium. An increased region of high spatial resolution is shown by analysis of point spread functions as well as inversion of synthetics. Such evaluations highlight the advantages of using transmission measurements by transferring a few electrodes from the main profile to increase the sounding depth. Based on the resulting image we propose a revised structure for the medium surrounding the Cernon fault supported by geological observations and muon flux measurements.

  10. Meeting Report: Batch-to-Batch Variability in Estrogenic Activity in Commercial Animal Diets—Importance and Approaches for Laboratory Animal Research

    PubMed Central

    Heindel, Jerrold J.; vom Saal, Frederick S.

    2008-01-01

    We report information from two workshops sponsored by the National Institutes of Health that were held to a) assess whether dietary estrogens could significantly impact end points in experimental animals, and b) involve program participants and feed manufacturers to address the problems associated with measuring and eliminating batch-to-batch variability in rodent diets that may lead to conflicting findings in animal experiments within and between laboratories. Data were presented at the workshops showing that there is significant batch-to-batch variability in estrogenic content of commercial animal diets, and that this variability results in differences in experimental outcomes. A combination of methods were proposed to determine levels of total estrogenic activity and levels of specific estrogenic constituents in soy-containing, casein-containing, and other soy-free rodent diets. Workshop participants recommended that researchers pay greater attention to the type of diet being used in animal studies and choose a diet whose estrogenic activity (or lack thereof) is appropriate for the experimental model and end points of interest. Information about levels of specific phytoestrogens, as well as estrogenic activity caused by other contaminants and measured by bioassay, should be disclosed in scientific publications. This will require laboratory animal diet manufacturers to provide investigators with information regarding the phytoestrogen content and other estrogenic compounds in commercial diets used in animal research. PMID:18335108

  11. Meeting report: batch-to-batch variability in estrogenic activity in commercial animal diets--importance and approaches for laboratory animal research.

    PubMed

    Heindel, Jerrold J; vom Saal, Frederick S

    2008-03-01

    We report information from two workshops sponsored by the National Institutes of Health that were held to a) assess whether dietary estrogens could significantly impact end points in experimental animals, and b) involve program participants and feed manufacturers to address the problems associated with measuring and eliminating batch-to-batch variability in rodent diets that may lead to conflicting findings in animal experiments within and between laboratories. Data were presented at the workshops showing that there is significant batch-to-batch variability in estrogenic content of commercial animal diets, and that this variability results in differences in experimental outcomes. A combination of methods was proposed to determine levels of total estrogenic activity and levels of specific estrogenic constituents in soy-containing, casein-containing, and other soy-free rodent diets. Workshop participants recommended that researchers pay greater attention to the type of diet being used in animal studies and choose a diet whose estrogenic activity (or lack thereof) is appropriate for the experimental model and end points of interest. Information about levels of specific phytoestrogens, as well as estrogenic activity caused by other contaminants and measured by bioassay, should be disclosed in scientific publications. This will require laboratory animal diet manufacturers to provide investigators with information regarding the phytoestrogen content and other estrogenic compounds in commercial diets used in animal research.

  12. Evaluation of potential kidney donors with the personality assessment inventory: normative data for a unique population.

    PubMed

    Hurst, Duane F; Locke, Dona E C; Osborne, David

    2010-09-01

    Many transplant centers require personality assessment and/or psychiatric clearance prior to allowing an individual to donate a kidney. This is a unique cohort for personality assessment, and there is no normative information available for this population on standardized self-report measures such as the Personality Assessment Inventory (PAI). We evaluated a prospective sample of 434 kidney donor candidates with development of normative T-scores relevant to this specific comparison group. Compared to the original normative group from the PAI manual, potential kidney donors are 5-7 T-score points above the mean on PIM, RXR, DOM, and WRM and 4-6 points below the mean on the majority of the remaining scales. Raw score/T score conversion tables are provided. The normative data provided here is meant to supplement the original normative information and aid psychologists in evaluation of this unique medical population.

  13. Identifying musical pieces from fMRI data using encoding and decoding models.

    PubMed

    Hoefle, Sebastian; Engel, Annerose; Basilio, Rodrigo; Alluri, Vinoo; Toiviainen, Petri; Cagy, Maurício; Moll, Jorge

    2018-02-02

    Encoding models can reveal and decode neural representations in the visual and semantic domains. However, a thorough understanding of how distributed information in auditory cortices and the temporal evolution of music contribute to model performance is still lacking in the musical domain. We measured fMRI responses during naturalistic music listening and constructed a two-stage approach that first mapped musical features in auditory cortices and then decoded novel musical pieces. We then probed the influence of stimulus duration (number of time points) and spatial extent (number of voxels) on decoding accuracy. Our approach revealed a linear increase in accuracy with duration and a point of optimal model performance for the spatial extent. We further showed that Shannon entropy is a driving factor, boosting accuracy up to 95% for music with the highest information content. These findings provide key insights for future decoding and reconstruction algorithms and open new avenues for possible clinical applications.
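The entropy result lends itself to a small illustration. Below is a minimal sketch (not the authors' pipeline) of Shannon entropy computed from a histogram of a signal's amplitudes; the function name and the 16-bin choice are assumptions for the example.

```python
import numpy as np

def shannon_entropy(signal, bins=16):
    """Shannon entropy (bits) of a signal's amplitude histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins; 0 * log2(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
print(shannon_entropy(np.zeros(1000)))         # constant signal: 0 bits
print(shannon_entropy(rng.normal(size=1000)))  # broadband signal: higher entropy
```

Ranking stimuli by such an entropy estimate is one way to probe the information-content effect the authors report.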

  14. Undergraduate Consent Form Reading in Relation to Conscientiousness, Procrastination, and the Point-of-Time Effect.

    PubMed

    Theiss, Justin D; Hobbs, William B; Giordano, Peter J; Brunson, Olivia M

    2014-07-01

    Informed consent is central to conducting ethical research with human participants. The present study investigated differences in consent form reading in relation to conscientiousness, procrastination, and the point-of-time (PT) effect among undergraduate participants at a U.S. university. As hypothesized, conscientious participants and those who signed up to participate in a research study more days in advance and for earlier sessions (PT effect) read the consent form more thoroughly. However, procrastination was not related to consent form reading. Most importantly, consent form reading in general was poor, with 80% of participants demonstrating that they had not read the consent form. Conscientious participants were more likely to self-report reading the consent form, irrespective of their measured consent form reading. The article closes with suggestions to improve the process of obtaining informed consent with undergraduate participants. © The Author(s) 2014.

  15. Enhancing user acceptance of mandated mobile health information systems: the ePOC (electronic point-of-care project) experience.

    PubMed

    Burgess, Lois; Sargent, Jason

    2007-01-01

    From a clinical perspective, the use of mobile technologies, such as Personal Digital Assistants (PDAs), within hospital environments is not new. A paradigm shift, however, is underway towards the acceptance and utility of these systems within mobile-based healthcare environments. Introducing new technologies and associated work practices carries intrinsic risks which must be addressed. This paper contends that intervening to address user concerns as they arise throughout the system development lifecycle will lead to greater levels of user acceptance, while ultimately enhancing the deliverability of a system that provides a best fit with end-user needs. It is envisaged this research will lead to the development of a formalised user acceptance framework based on an agile approach to user acceptance measurement. The results of an ongoing study of user perceptions towards a mandated electronic point-of-care information system in the Northern Illawarra Ambulatory Care Team (TACT) are presented.

  16. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices.

    PubMed

    Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang

    2016-01-01

    Newly manufactured electronic devices exhibit different levels of potential defects, which are reflected in the devices' initial parameter information. In this study, electromagnetic relays operating at optimal performance, with appropriate and steady parameter values, were characterized to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the value and stability of the initial parameter information were quantified to measure device performance. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined from the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small samples was proposed on the basis of both measures. Finally, a model relating initial contact resistance and stability to the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted from their initial parameter information.
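The probability-weighted average step can be sketched generically: weight each histogram bin's midpoint by its empirical probability. This is one plausible reading of the method, not the paper's implementation; the function name and the example resistance values are assumptions.

```python
import numpy as np

def probability_weighted_average(measurements, bins=10):
    """Average of histogram-bin midpoints weighted by empirical probability."""
    counts, edges = np.histogram(measurements, bins=bins)
    probs = counts / counts.sum()              # empirical bin probabilities
    midpoints = (edges[:-1] + edges[1:]) / 2   # representative bin values
    return float((probs * midpoints).sum())

# hypothetical repeated contact-resistance readings (mOhm), one outlier
readings = [50.1, 50.3, 49.9, 50.2, 50.0, 55.0]
print(probability_weighted_average(readings))
```

The weighting damps the influence of sparsely populated bins relative to a plain mean of raw readings.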

  17. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices

    PubMed Central

    Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang

    2016-01-01

    Newly manufactured electronic devices exhibit different levels of potential defects, which are reflected in the devices' initial parameter information. In this study, electromagnetic relays operating at optimal performance, with appropriate and steady parameter values, were characterized to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the value and stability of the initial parameter information were quantified to measure device performance. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined from the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small samples was proposed on the basis of both measures. Finally, a model relating initial contact resistance and stability to the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted from their initial parameter information. PMID:27907188

  18. Exploring the potential for using 210Pbex measurements within a re-sampling approach to document recent changes in soil redistribution rates within a small catchment in southern Italy.

    PubMed

    Porto, Paolo; Walling, Desmond E; Cogliandro, Vanessa; Callegari, Giovanni

    2016-11-01

    In recent years, the fallout radionuclides caesium-137 (137Cs) and unsupported lead-210 (210Pbex) have been successfully used to document rates of soil erosion in many areas of the world, as an alternative to conventional measurements. By virtue of their different half-lives, these two radionuclides provide information related to different time windows: 137Cs measurements are commonly used to generate information on mean annual erosion rates over the past ca. 50-60 years, whereas 210Pbex measurements provide information relating to a longer period of up to ca. 100 years. However, the time-integrated nature of the estimates of soil redistribution provided by 137Cs and 210Pbex measurements can be seen as a limitation, particularly in the context of global change and interest in the response of soil redistribution rates to contemporary climate change and land use change. Re-sampling techniques used with these two fallout radionuclides potentially provide a basis for information on recent changes in soil redistribution rates. By virtue of the effectively continuous fallout input of 210Pb, the response of the 210Pbex inventory of a soil profile to changing soil redistribution rates, and thus its potential for use with the re-sampling approach, differs from that of 137Cs. Its greater sensitivity to recent changes in soil redistribution rates suggests that 210Pbex may have advantages over 137Cs in the re-sampling approach. The potential for using 210Pbex measurements in re-sampling studies is explored further in this contribution. Attention focuses on a small (1.38 ha) forested catchment in southern Italy. The catchment was originally sampled for 210Pbex measurements in 2001, and equivalent samples were collected from points very close to the original sampling points again in 2013. This made it possible to compare estimates of mean annual erosion related to two different time windows. This comparison suggests that mean annual rates of net soil loss increased during the period between the two sampling campaigns and that this increase was associated with a shift to an increased sediment delivery ratio. This change was consistent with independent information on likely changes in the sediment response of the study catchment provided by the available records of annual sediment yield and changes in annual rainfall documented for the local area. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Spatially Enabling the Health Sector

    PubMed Central

    Weeramanthri, Tarun Stephen; Woodgate, Peter

    2016-01-01

    Spatial information describes the physical location of either people or objects, and the measured relationships between them. In this article, we offer the view that greater utilization of spatial information and its related technology, as part of a broader redesign of the architecture of health information at local and national levels, could assist and speed up the process of health reform, which is taking place across the globe in richer and poorer countries alike. In making this point, we describe the impetus for health sector reform, recent developments in spatial information and analytics, and current Australasian spatial health research. We highlight examples of uptake of spatial information by the health sector, as well as missed opportunities. Our recommendations to spatially enable the health sector are applicable to high- and low-resource settings. PMID:27867933

  20. Spatially Enabling the Health Sector.

    PubMed

    Weeramanthri, Tarun Stephen; Woodgate, Peter

    2016-01-01

    Spatial information describes the physical location of either people or objects, and the measured relationships between them. In this article, we offer the view that greater utilization of spatial information and its related technology, as part of a broader redesign of the architecture of health information at local and national levels, could assist and speed up the process of health reform, which is taking place across the globe in richer and poorer countries alike. In making this point, we describe the impetus for health sector reform, recent developments in spatial information and analytics, and current Australasian spatial health research. We highlight examples of uptake of spatial information by the health sector, as well as missed opportunities. Our recommendations to spatially enable the health sector are applicable to high- and low-resource settings.

  1. Sensing with Superconducting Point Contacts

    PubMed Central

    Nurbawono, Argo; Zhang, Chun

    2012-01-01

    Superconducting point contacts have been used for measuring magnetic polarizations and for identifying magnetic impurities, electronic structures, and even the vibrational modes of small molecules. Owing to the intrinsically small energy scale of the subgap structures of the supercurrent, set by the size of the superconducting energy gap, superconductors provide ultrahigh sensitivity for high-resolution spectroscopies. The so-called Andreev reflection process between a normal metal and a superconductor carries complex and rich information which, when fully exploited, can serve as a powerful sensor. In this review, we discuss recent experimental and theoretical developments in supercurrent transport through superconducting point contacts and their relevance to sensing applications, and we highlight current issues and potentials. A true utilization of the method based on Andreev reflection analysis opens up possibilities for a new class of ultrasensitive sensors. PMID:22778630

  2. 75 FR 78269 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Telephone...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-15

    ... for OMB Review; Comment Request; Telephone Point of Purchase Survey ACTION: Notice. SUMMARY: The... information collection request (ICR) titled, ``Telephone Point of Purchase Survey,'' to the Office [email protected] . SUPPLEMENTARY INFORMATION: The purpose of the Telephone Point of Purchase Survey is to...

  3. Stable Isotope Measurements of Carbon Dioxide, Methane, and Hydrogen Sulfide Gas Using Frequency Modulation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Nowak-Lovato, K.

    2014-12-01

    Seepage from enhanced oil recovery, carbon storage, and natural gas sites can emit trace gases such as carbon dioxide, methane, and hydrogen sulfide. Trace gas emissions at these locations exhibit unique light stable isotope signatures that enable source identification of the material. Light stable isotope detection through surface monitoring offers the ability to distinguish between trace gases emitted from biological sources (fertilizers and wastes), mineral sources (coal or seams), or liquid organic systems (oil and gas reservoirs). To make light stable isotope measurements, we employ the ultra-sensitive technique of frequency modulation spectroscopy (FMS). FMS is an absorption technique with sensitivity approximately 100-1000x greater than standard absorption spectroscopy, with the advantage of providing stable isotope signature information. We have developed an integrated in situ (point source) system that measures carbon dioxide, methane, and hydrogen sulfide with isotopic resolution and enhanced sensitivity. The in situ instrument continuously collects air and records the stable isotope ratio of the gas being detected. We have included in-line flask collection points to obtain gas samples for validation of isotopic concentrations using our in-house isotope ratio mass spectrometry (IRMS). We present calibration curves for each species addressed above to demonstrate the sensitivity and accuracy of the system. We also show field deployment data demonstrating the capabilities of the system in making live dynamic measurements from an active source.

  4. Risk management and measuring productivity with POAS--Point of Act System--a medical information system as ERP (Enterprise Resource Planning) for hospital management.

    PubMed

    Akiyama, M

    2007-01-01

    The concept of our system is not only to manage material flows, but also to provide an integrated management resource, a means of correcting errors in medical treatment, and applications to EBM (evidence-based medicine) through the data mining of medical records. Prior to the development of this system, electronic processing systems in hospitals did a poor job of accurately grasping medical practice and medical material flows. With POAS (Point of Act System), hospital managers can solve the so-called, "man, money, material, and information" issues inherent in the costs of healthcare. The POAS system synchronizes with each department system, from finance and accounting, to pharmacy, to imaging, and allows information exchange. We can manage Man (Business Process), Material (Medical Materials and Medicine), Money (Expenditure for purchase and Receipt), and Information (Medical Records) completely by this system. Our analysis has shown that this system has a remarkable investment effect - saving over four million dollars per year - through cost savings in logistics and business process efficiencies. In addition, the quality of care has been improved dramatically while error rates have been reduced - nearly to zero in some cases.

  5. Ground based mobile isotopic methane measurements in the Front Range, Colorado

    NASA Astrophysics Data System (ADS)

    Vaughn, B. H.; Rella, C.; Petron, G.; Sherwood, O.; Mielke-Maday, I.; Schwietzke, S.

    2014-12-01

    Increased development of unconventional oil and gas resources in North America has given rise to attempts to monitor and quantify fugitive emissions of methane from the industry. Emission estimates of methane from oil and gas basins can vary significantly from one study to another, as well as from EPA or state estimates. New efforts are aimed at reconciling bottom-up, or inventory-based, emission estimates of methane with top-down estimates based on atmospheric measurements from aircraft, towers, mobile ground-based vehicles, and atmospheric models. Attributing airborne measurements of regional methane fluxes to specific sources is informed by ground-based measurements of methane. Stable isotopic measurements (δ13C) of methane help distinguish between emissions from the O&G industry, Confined Animal Feed Operations (CAFO), and landfills, but analytical challenges typically limit meaningful isotopic measurements to individual point sampling. We are developing a toolbox that uses δ13CH4 measurements to assess the partitioning of methane emissions in regions with multiple methane sources. The method was applied to the Denver-Julesburg Basin. Here we present data from continuous isotopic measurements obtained over a wide geographic area using MegaCore, a 1500 ft tube that is constantly filled with sample air while driving and subsequently analyzed at slower rates using cavity ring-down spectroscopy (CRDS). Pressure, flow, and calibration are tightly controlled, allowing precise attribution of methane enhancements to their point of collection. Comparisons with point measurements are needed to confirm regional values and further constrain flux estimates and models. This effort was made in conjunction with several major field campaigns in the Colorado Front Range in July-August 2014, including FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment), DISCOVER-AQ, and the Air Water Gas NSF Sustainability Research Network at the University of Colorado.

  6. Making Decisions about Adult Learners Based on Performances on Functional Competency Measures.

    ERIC Educational Resources Information Center

    Bunch, Michael B.

    The validity and dependability of functional competency tests for adults are examined as they relate to the information needs of instructional decision makers. Test data from the Adult Performance Level (APL) Program (funded by the U.S. Office of Education at the University of Texas at Austin) is used to illustrate key points. In the discussion of…

  7. New techniques to measure cliff change from historical oblique aerial photographs and structure-from-motion photogrammetry

    USGS Publications Warehouse

    Warrick, Jonathan; Ritchie, Andy; Adelman, Gabrielle; Adelman, Ken; Limber, Patrick W.

    2017-01-01

    Oblique aerial photograph surveys are commonly used to document coastal landscapes. Here it is shown that adequate overlap may exist in these photographic records to develop topographic models with Structure-from-Motion (SfM) photogrammetric techniques. Using photographs of Fort Funston, California, from the California Coastal Records Project, the imagery was combined with ground control points in a four-dimensional analysis that produced topographic point clouds of the study area's cliffs for 5 years spanning 2002 to 2010. Uncertainty was assessed by comparing the point clouds with airborne LIDAR data, and these uncertainties were related to the number and spatial distribution of ground control points used in the SfM analyses. With six or more ground control points, the root mean squared errors between the SfM and LIDAR data were less than 0.30 m (minimum = 0.18 m), and the mean systematic error was less than 0.10 m. The SfM results had several benefits over traditional airborne LIDAR in that they included point coverage on vertical-to-overhanging sections of the cliff and resulted in 10-100 times greater point densities. Time series of the SfM results revealed topographic changes, including landslides, rock falls, and the erosion of landslide talus along the Fort Funston beach. Thus, it was concluded that SfM photogrammetric techniques with historical oblique photographs allow the extraction of useful quantitative information for mapping coastal topography and measuring coastal change. The new techniques presented here are likely applicable to many photograph collections and problems in the earth sciences.
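The accuracy figures quoted in this record come from comparing co-located SfM and lidar elevations. A hedged sketch of such a comparison (the function name and sample values are illustrative, not the study's code):

```python
import numpy as np

def rmse_and_bias(sfm_z, lidar_z):
    """Root mean squared error and mean systematic error (bias)
    between co-located SfM and lidar elevations, in metres."""
    diff = np.asarray(sfm_z, dtype=float) - np.asarray(lidar_z, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2))), float(diff.mean())

sfm_z = [10.2, 11.1, 9.8, 12.0]     # hypothetical SfM elevations
lidar_z = [10.0, 11.0, 10.0, 11.9]  # co-located lidar elevations
rmse, bias = rmse_and_bias(sfm_z, lidar_z)
print(rmse, bias)
```

RMSE captures total scatter, while the bias term isolates any systematic vertical offset between the two surfaces, mirroring the two error statistics reported above.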

  8. Deriving Temporal Height Information for Maize Breeding

    NASA Astrophysics Data System (ADS)

    Malambo, L.; Popescu, S. C.; Murray, S.; Sheridan, R.; Richardson, G.; Putman, E.

    2016-12-01

    Phenotypic data such as height provide useful information to crop breeders to better understand their field experiments and the associated field variability. However, the measurement of crop height in many breeding programs is done manually, which demands significant effort and time and does not scale well when large field experiments are involved. Through structure-from-motion (SfM) techniques, small unmanned aerial vehicles (sUAVs), or drones, offer tremendous potential for generating crop height data and other morphological data, such as canopy area and biomass, in a cost-effective and efficient way. We present results of an ongoing UAV application project aimed at generating temporal height metrics for maize breeding at the Texas A&M AgriLife Research farm in Burleson County, Texas. We outline the activities involved, from the drone aerial surveys to image processing and the generation of crop height metrics. The experimental period ran from April (planting) through August (harvest) 2016 and involved 36 maize hybrids replicated over 288 plots (1.7 ha). During this time, crop heights were manually measured per plot at weekly intervals. Corresponding aerial flights were carried out using a DJI Phantom 3 Professional UAV at each interval, and the captured images were processed into point clouds and image mosaics using Pix4D (Pix4D SA; Lausanne, Switzerland) software. LiDAR data were also captured at two intervals (05/06 and 07/29) to provide another source of height information. To obtain height data per plot from the SfM point clouds and LiDAR data, percentile height metrics were generated using FUSION software. Comparison between SfM and field-measured heights shows high correlation (R2 > 0.7), indicating that the use of sUAVs can replace laborious manual height measurement and enhance plant breeding programs. Similar results were obtained from the comparison of SfM and LiDAR heights. Outputs of this project are helping plant breeders at Texas A&M automate routine height measurements in maize, quickly make actionable decisions, and discover new hybrids.
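The per-plot percentile height metrics can be sketched as follows. This is a simplified stand-in for what FUSION produces (crude minimum-based ground normalization; the function name and sample values are assumptions):

```python
import numpy as np

def height_percentiles(z, percentiles=(50, 90, 99)):
    """Percentile height metrics for one plot's point-cloud z values."""
    z = np.asarray(z, dtype=float)
    heights = z - z.min()  # crude ground normalization for the sketch
    return {p: float(np.percentile(heights, p)) for p in percentiles}

# hypothetical plot: canopy returns about 2 m above flat ground at 100 m
z_values = [100.0, 100.1, 101.8, 102.0, 102.1, 102.2, 101.9]
print(height_percentiles(z_values))
```

Upper percentiles (e.g. the 90th or 99th) are commonly used as canopy-height proxies because they are robust to stray ground returns.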

  9. Satellite Based Soil Moisture Product Validation Using NOAA-CREST Ground and L-Band Observations

    NASA Astrophysics Data System (ADS)

    Norouzi, H.; Campo, C.; Temimi, M.; Lakhankar, T.; Khanbilvardi, R.

    2015-12-01

    Soil moisture content is among the most important physical parameters in hydrology, climate, and environmental studies. Many microwave-based satellite observations have been utilized to estimate this parameter. The Advanced Microwave Scanning Radiometer 2 (AMSR2) is one of many remote sensors that collect daily information on land surface soil moisture. However, many factors, such as ancillary data and vegetation scattering, can affect the signal and the estimation. Therefore, this information needs to be validated against "ground-truth" observations. The NOAA Cooperative Remote Sensing Science and Technology (CREST) center at the City University of New York has a site located at Millbrook, NY, with several in situ soil moisture probes and an L-band radiometer similar to the one on the Soil Moisture Active Passive (SMAP) mission. This site is among the SMAP Cal/Val sites. Soil moisture was measured at seven different locations from 2012 to 2015; Hydra probes were used at six of these locations. This study utilizes the observations from in situ data and the L-band radiometer close to the ground (at 3 m height) to validate and compare soil moisture estimates from AMSR2. Analysis of the measurements and AMSR2 indicated a weak correlation with the Hydra probes and a moderate correlation with the Cosmic-ray Soil Moisture Observing System (COSMOS) probes. Several differences, including those between pixel-scale and point measurements, can cause these discrepancies. Some interpolation techniques are used to expand the point measurements from six locations to the AMSR2 footprint. Finally, the effect of penetration depth on the microwave signal and inconsistencies with other ancillary data, such as skin temperature, are investigated to provide a better understanding of the analysis. The results show that the retrieval algorithm of AMSR2 is appropriate under certain circumstances. A similar validation study will be conducted for the SMAP mission.
    Keywords: Remote Sensing, Soil Moisture, AMSR2, SMAP, L-Band.

  10. Quantitative assessment of distance to collection point and improved sorting information on source separation of household waste.

    PubMed

    Rousta, Kamran; Bolton, Kim; Lundin, Magnus; Dahlén, Lisa

    2015-06-01

    The present study measures the participation of households in a source separation scheme and, in particular, whether households' application of the scheme improved after two interventions: (a) a shorter distance to the drop-off point and (b) easy access to correct sorting information. The effect of these interventions was quantified and, as far as possible, isolated from other factors that can influence recycling behaviour. The study was based on households located in an urban residential area in Sweden, where waste composition studies were performed before and after the interventions by manual sorting (pick analysis). Statistical analyses of the results indicated a significant decrease (28%) of packaging and newsprint in the residual waste after establishing a property-close collection system (intervention (a)), as well as a significant decrease (70%) of the mis-sorted fraction in bags intended for food waste after new information stickers were introduced (intervention (b)). Providing a property-close collection system that collects more waste fractions, as well as finding new communication channels for information about sorting, can be used as tools to increase the source separation ratio. This contribution also highlights the need to evaluate the effects of different types of information and communication concerning sorting instructions in a property-close collection system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Cost-effective ways of delivering enquiry services: a rapid review.

    PubMed

    Sutton, Anthea; Grant, Maria J

    2011-12-01

    In recent times of recession and budget cuts, it is more important than ever for library and information services to deliver cost-effective services. This rapid review aims to examine the evidence for the most cost-effective ways of delivering enquiry services. A literature search was conducted on LISA (Library and Information Science Abstracts) and MEDLINE. Searches were limited to 2007 onwards. Eight studies met the inclusion criteria. The studies covered hospital and academic libraries in the USA and Canada. Services analysed were 'point-of-care' librarian consultations, staffing models for reference desks, and virtual/digital reference services. Transferable lessons, relevant to health library and information services generally, can be drawn from this rapid review. These suggest that 'point-of-care' librarians for primary care practitioners are a cost-effective way of answering questions. Reference desks can be cost-effectively staffed by student employees or general reference staff, although librarian referral must be provided for more complex and subject-specific enquiries. However, it is not possible to draw any conclusions on virtual/digital reference services because of the limited literature available. Further case analysis studies measuring specific services, particularly enquiry services within a health library and information context, are required. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.

  12. A new similarity index for nonlinear signal analysis based on local extrema patterns

    NASA Astrophysics Data System (ADS)

    Niknazar, Hamid; Motie Nasrabadi, Ali; Shamsollahi, Mohammad Bagher

    2018-02-01

Common similarity measures of time domain signals such as cross-correlation and Symbolic Aggregate approximation (SAX) are not appropriate for nonlinear signal analysis. This is because of the high sensitivity of nonlinear systems to initial points. Therefore, a similarity measure for nonlinear signal analysis must be invariant to initial points and quantify the similarity by considering the main dynamics of signals. The statistical behavior of local extrema (SBLE) method was previously proposed to address this problem. The SBLE similarity index uses quantized amplitudes of local extrema to quantify the dynamical similarity of signals by considering patterns of sequential local extrema. By adding time information of local extrema and fuzzifying the quantized values, this work proposes a new similarity index for nonlinear and long-term signal analysis, which extends the SBLE method. These new features provide more information about signals, and fuzzification reduces noise sensitivity. A number of practical tests were performed to demonstrate the ability of the method in nonlinear signal clustering and classification on synthetic data. In addition, epileptic seizure detection based on electroencephalography (EEG) signal processing was performed with the proposed similarity index to demonstrate the method's potential as a real-world application tool.

  13. Genetic Thinking in the Study of Social Relationships: Five Points of Entry.

    PubMed

    Reiss, David

    2010-09-01

    For nearly a generation, researchers studying human behavioral development have combined genetically informed research designs with careful measures of social relationships such as parenting, sibling relationships, peer relationships, marital processes, social class stratifications, and patterns of social engagement in the elderly. In what way have these genetically informed studies altered the construction and testing of social theories of human development? We consider five points of entry where genetic thinking is taking hold. First, genetic findings suggest an alternative scenario for explaining social data. Associations between measures of the social environment and human development may be due to genes that influence both. Second, genetic studies add to other prompts to study the early developmental origins of current social phenomena in midlife and beyond. Third, genetic analyses promise to shed light on understudied social systems, such as sibling relationships, that have an impact on human development independent of genotype. Fourth, genetic analyses anchor in neurobiology individual differences in resilience and sensitivity to both adverse and favorable social environments. Finally, genetic analyses increase the utility of laboratory simulations of human social processes and of animal models. © The Author(s) 2010.

  14. Statistical Attitude Determination

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis

    2010-01-01

All spacecraft require attitude determination at some level of accuracy. This can be a very coarse requirement of tens of degrees, in order to point solar arrays at the sun, or a very fine requirement in the milliarcsecond range, as required by the Hubble Space Telescope. A toolbox of attitude determination methods, applicable across this wide range, has been developed over the years. There have been many advances in the thirty years since the publication of Reference, but the fundamentals remain the same. One significant change is that onboard attitude determination has largely superseded ground-based attitude determination, due to the greatly increased power of onboard computers. The availability of relatively inexpensive radiation-hardened microprocessors has led to the development of "smart" sensors, with autonomous star trackers being the first spacecraft application. Another new development is attitude determination using interferometry of radio signals from the Global Positioning System (GPS) constellation. This article reviews both the classic material and these newer developments, with emphasis on methods suitable for use onboard a spacecraft. We discuss both "single frame" methods that are based on measurements taken at a single point in time, and sequential methods that use information about spacecraft dynamics to combine the information from a time series of measurements.

  15. Using multi-stakeholder alliances to accelerate the adoption of health information technology by physician practices.

    PubMed

    McHugh, Megan; Shi, Yunfeng; McClellan, Sean R; Shortell, Stephen M; Fareed, Naleef; Harvey, Jillian; Ramsay, Patricia; Casalino, Lawrence P

    2016-06-01

Multi-stakeholder alliances - groups of payers, purchasers, providers, and consumers that work together to address local health goals - are frequently used to improve health care quality within communities. Under the Aligning Forces for Quality (AF4Q) initiative, multi-stakeholder alliances were given funding and technical assistance to encourage the use of health information technology (HIT) to improve quality. We investigated whether HIT adoption was greater in AF4Q communities than in other communities. Drawing upon survey data from 782 small and medium-sized physician practices collected as part of the National Study of Physician Organizations during July 2007-March 2009 and January 2012-November 2013, we used weighted fixed effects models to detect relative changes in four measures representing three domains: use of electronic health records (EHRs), receipt of electronic information from hospitals, and patients' online access to their medical records. Improvement on a composite EHR adoption measure was 7.6 percentage points greater in AF4Q communities than in non-AF4Q communities, and the increase in the probability of adopting all five EHR capabilities was 23.9 percentage points greater in AF4Q communities. There was no significant difference in improvement in receipt of electronic information from hospitals or patients' online access to medical records between AF4Q and non-AF4Q communities. By linking HIT to quality improvement efforts, AF4Q alliances may have facilitated greater adoption of EHRs in small and medium-sized physician practices, but not receipt of electronic information from hospitals or patients' online access to medical records. Multi-stakeholder alliances charged with promoting HIT to advance quality improvement may accelerate adoption of EHRs. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Applications of pilot scanning behavior to integrated display research

    NASA Technical Reports Server (NTRS)

    Waller, M. C.

    1977-01-01

    The oculometer is an electrooptical device designed to measure pilot scanning behavior during instrument approaches and landing operations. An overview of some results from a simulation study is presented to illustrate how information from the oculometer installed in a visual motion simulator, combined with measures of performance and control input data, can provide insight into the behavior and tactics of individual pilots during instrument approaches. Differences in measured behavior of the pilot subjects are pointed out; these differences become apparent in the way the pilots distribute their visual attention, in the amount of control activity, and in selected performance measures. Some of these measured differences have diagnostic implications, suggesting the use of the oculometer along with performance measures as a pilot training tool.

  17. Terahertz radar cross section measurements.

    PubMed

    Iwaszczuk, Krzysztof; Heiselberg, Henning; Jepsen, Peter Uhd

    2010-12-06

    We perform angle- and frequency-resolved radar cross section (RCS) measurements on objects at terahertz frequencies. Our RCS measurements are performed on a scale model aircraft of size 5-10 cm in polar and azimuthal configurations, and correspond closely to RCS measurements with conventional radar on full-size objects. The measurements are performed in a terahertz time-domain system with freely propagating terahertz pulses generated by tilted pulse front excitation of lithium niobate crystals and measured with sub-picosecond time resolution. The application of a time domain system provides ranging information and also allows for identification of scattering points such as weaponry attached to the aircraft. The shapes of the models and positions of reflecting parts are retrieved by the filtered back projection algorithm.

  18. High-NA optical CD metrology on small in-cell targets enabling improved higher order dose control and process control for logic

    NASA Astrophysics Data System (ADS)

    Cramer, Hugo; Mc Namara, Elliott; van Laarhoven, Rik; Jaganatharaja, Ram; de la Fuente, Isabel; Hsu, Sharon; Belletti, Filippo; Popadic, Milos; Tu, Ward; Huang, Wade

    2017-03-01

The logic manufacturing process requires small in-device metrology targets to exploit the full dose correction potential of modern scanners and process tools. A high-NA angular resolved scatterometer (YieldStar S-1250D) was modified to demonstrate the possibility of OCD measurements on 5×5 µm² targets. The results obtained on test wafers in a logic manufacturing environment, measured after litho and after core etch, showed good correlation to larger reference targets and good AEI-to-ADI intra-field CDU correlation, thereby demonstrating the feasibility of OCD on such small targets. The data was used to determine a reduction potential of 55% for the intra-field CD variation, using 145 points per field on a few inner fields, and of 33% for the process-induced across-wafer CD variation, using 16 points per field across the full wafer. In addition, the OCD measurements reveal valuable information on wafer-to-wafer layer height variations within a lot.

  19. Design of an Improved Heater Array to Measure Microscale Wall Heat Transfer

    NASA Technical Reports Server (NTRS)

    Kim, Jungho; Chng, Choon Ping; Kalkur, T. S.

    1996-01-01

    An improved array of microscale heaters is being developed to measure the heat transfer coefficient at many points underneath individual bubbles during boiling as a function of space and time. This heater array enables the local heat transfer from a surface during the bubble growth and departure process to be measured with very high temporal and spatial resolution, and should allow better understanding of the boiling heat transfer mechanisms by pin-pointing when and where in the bubble departure cycle large amounts of wall heat transfer occur. Such information can provide much needed data regarding the important heat transfer mechanisms during the bubble departure cycle, and can serve as benchmarks to validate many of the analytical and numerical models used to simulate boiling. The improvements to the heater array include using a silicon-on-quartz substrate to reduce thermal cross-talk between the heaters, decreased space between the heaters, increased pad sizes on the heaters, and progressive heater sizes. Some results using the present heater array are discussed.

  20. Near-Infrared Spatially Resolved Spectroscopy for Tablet Quality Determination.

    PubMed

    Igne, Benoît; Talwar, Sameer; Feng, Hanzhou; Drennen, James K; Anderson, Carl A

    2015-12-01

Near-infrared (NIR) spectroscopy has become a well-established tool for the characterization of solid oral dosage forms manufacturing processes and finished products. In this work, the utility of a traditional single-point NIR measurement was compared with that of a spatially resolved spectroscopic (SRS) measurement for the determination of tablet assay. Experimental designs were used to create samples that allowed for calibration models to be developed and tested on both instruments. Samples possessing a poor distribution of ingredients (highly heterogeneous) were prepared by under-blending constituents prior to compaction to compare the analytical capabilities of the two NIR methods. The results indicate that SRS can provide spatial information that is usually obtainable only through imaging experiments for the determination of local heterogeneity and detection of abnormal tablets that would not be detected with single-point spectroscopy, thus complementing traditional NIR measurement systems for in-line, real-time tablet analysis. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  1. Point model equations for neutron correlation counting: Extension of Böhnel's equations to any order

    DOE PAGES

    Favalli, Andrea; Croft, Stephen; Santi, Peter

    2015-06-15

Various methods of autocorrelation neutron analysis may be used to extract information about a measurement item containing spontaneously fissioning material. The two predominant approaches are the time correlation analysis methods (which make use of a coincidence gate) of multiplicity shift register logic and Feynman sampling. The common feature is that the correlated nature of the pulse train can be described by a vector of reduced factorial multiplet rates. We call these singlets, doublets, triplets, etc. Within the point reactor model the multiplet rates may be related to the properties of the item, the parameters of the detector, and basic nuclear data constants by a series of coupled algebraic equations, the so-called point model equations. Solving, or inverting, the point model equations using experimental calibration model parameters is how assays of unknown items are performed. Currently only the first three multiplets are routinely used. In this work we develop the point model equations to higher order multiplets using the probability generating functions approach combined with the general derivative chain rule, the so-called Faà di Bruno formula. Explicit expressions up to 5th order are provided, as well as the general iterative formula to calculate any order. This study represents the first necessary step towards determining whether higher order multiplets can add value to nondestructive measurement practice for nuclear materials control and accountancy.
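The singlet, doublet, and triplet rates described above are built from reduced factorial moments of a neutron multiplicity distribution. As a minimal sketch (the distribution below is illustrative, not evaluated nuclear data), the moments ν_k = Σ_n C(n, k) p(n) can be computed directly:

```python
from math import comb

def reduced_factorial_moments(pmf, max_order=5):
    """Reduced factorial moments nu_k = sum_n C(n, k) * p(n) of a
    multiplicity distribution pmf (pmf[n] = probability of n neutrons).
    These moments are the building blocks of the point-model multiplet
    rates (singlets, doublets, triplets, ...)."""
    return [sum(comb(n, k) * p for n, p in enumerate(pmf))
            for k in range(1, max_order + 1)]

# Illustrative multiplicity distribution (sums to 1; made up for the demo)
pmf = [0.02, 0.10, 0.25, 0.32, 0.21, 0.08, 0.02]
nu = reduced_factorial_moments(pmf)  # nu[0] is the mean multiplicity
```

The first moment reduces to the mean of the distribution, and higher moments weight the tail increasingly heavily, which is why high-order multiplets are statistically demanding to measure.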

  2. I feel good! Gender differences and reporting heterogeneity in self-assessed health.

    PubMed

    Schneider, Udo; Pfarr, Christian; Schneider, Brit S; Ulrich, Volker

    2012-06-01

    For empirical analysis and policy-oriented recommendations, the precise measurement of individual health or well-being is essential. The difficulty is that the answer may depend on individual reporting behaviour. Moreover, if an individual's health perception varies with certain attitudes of the respondent, reporting heterogeneity may lead to index or cut-point shifts of the health distribution, causing estimation problems. An index shift is a parallel shift in the thresholds of the underlying distribution of health categories. In contrast, a cut-point shift means that the relative position of the thresholds changes, implying different response behaviour. Our paper aims to detect how socioeconomic determinants and health experiences influence the individual valuation of health. We analyse the reporting behaviour of individuals on their self-assessed health status, a five-point categorical variable. Using German panel data, we control for observed heterogeneity in the categorical health variable as well as unobserved individual heterogeneity in the panel estimation. In the empirical analysis, we find strong evidence for cut-point shifts. Our estimation results show different impacts of socioeconomic and health-related variables on the five categories of self-assessed health. Moreover, the answering behaviour varies between female and male respondents, pointing to gender-specific perception and assessment of health. Hence, in case of reporting heterogeneity, using self-assessed measures in empirical studies may be misleading and the information needs to be handled with care.

  3. An analysis of neural receptive field plasticity by point process adaptive filtering

    PubMed Central

    Brown, Emery N.; Nguyen, David P.; Frank, Loren M.; Wilson, Matthew A.; Solo, Victor

    2001-01-01

    Neural receptive fields are plastic: with experience, neurons in many brain regions change their spiking responses to relevant stimuli. Analysis of receptive field plasticity from experimental measurements is crucial for understanding how neural systems adapt their representations of relevant biological information. Current analysis methods using histogram estimates of spike rate functions in nonoverlapping temporal windows do not track the evolution of receptive field plasticity on a fine time scale. Adaptive signal processing is an established engineering paradigm for estimating time-varying system parameters from experimental measurements. We present an adaptive filter algorithm for tracking neural receptive field plasticity based on point process models of spike train activity. We derive an instantaneous steepest descent algorithm by using as the criterion function the instantaneous log likelihood of a point process spike train model. We apply the point process adaptive filter algorithm in a study of spatial (place) receptive field properties of simulated and actual spike train data from rat CA1 hippocampal neurons. A stability analysis of the algorithm is sketched in the Appendix. The adaptive algorithm can update the place field parameter estimates on a millisecond time scale. It reliably tracked the migration, changes in scale, and changes in maximum firing rate characteristic of hippocampal place fields in a rat running on a linear track. Point process adaptive filtering offers an analytic method for studying the dynamics of neural receptive fields. PMID:11593043
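The instantaneous steepest-descent idea in this abstract can be sketched in a few lines. This is a deliberately simplified illustration, not the authors' algorithm: it adapts a single log-rate parameter of a history-independent Poisson model, whereas the paper adapts full place-field parameters on real spike trains.

```python
import math
import random

def adaptive_rate_tracker(spikes, dt=0.001, eps=0.1):
    """Instantaneous steepest-descent filter for one log-rate parameter
    theta, with lambda = exp(theta).  Each step ascends the gradient of
    the instantaneous point-process log likelihood
    dN * log(lambda) - lambda * dt, whose gradient in theta is
    dN - lambda * dt."""
    theta = 0.0
    estimates = []
    for dN in spikes:          # dN is 0 or 1 in each small time bin
        lam = math.exp(theta)
        theta += eps * (dN - lam * dt)
        estimates.append(math.exp(theta))
    return estimates

# Simulate a spike train from a stationary (hypothetical) 20 spikes/s rate
random.seed(0)
dt = 0.001
true_rate = 20.0
spikes = [1 if random.random() < true_rate * dt else 0 for _ in range(20000)]
est = adaptive_rate_tracker(spikes, dt=dt)
```

After an initial transient the estimate fluctuates around the true rate; a time-varying rate could be tracked the same way, which is the point of the adaptive formulation.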

  4. Using spectral methods to obtain particle size information from optical data: applications to measurements from CARES 2010

    NASA Astrophysics Data System (ADS)

    Atkinson, Dean B.; Pekour, Mikhail; Chand, Duli; Radney, James G.; Kolesar, Katheryn R.; Zhang, Qi; Setyan, Ari; O'Neill, Norman T.; Cappa, Christopher D.

    2018-04-01

    Multi-wavelength in situ aerosol extinction, absorption and scattering measurements made at two ground sites during the 2010 Carbonaceous Aerosols and Radiative Effects Study (CARES) are analyzed using a spectral deconvolution method that allows extraction of particle-size-related information, including the fraction of extinction produced by the fine-mode particles and the effective radius of the fine mode. The spectral deconvolution method is typically applied to analysis of remote sensing measurements. Here, its application to in situ measurements allows for comparison with more direct measurement methods and validation of the retrieval approach. Overall, the retrieved fine-mode fraction and effective radius compare well with other in situ measurements, including size distribution measurements and scattering and absorption measurements made separately for PM1 and PM10, although there were some periods during which the different methods yielded different results. One key contributor to differences between the results obtained is the alternative, spectrally based definitions of fine and coarse modes from the optical methods, relative to instruments that use a physically defined cut point. These results indicate that for campaigns where size, composition and multi-wavelength optical property measurements are made, comparison of the results can result in closure or can identify unusual circumstances. The comparison here also demonstrates that in situ multi-wavelength optical property measurements can be used to determine information about particle size distributions in situations where direct size distribution measurements are not available.
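The core of a two-component spectral deconvolution can be sketched as follows. This is a minimal illustration of the idea only: the fine- and coarse-mode Ångström exponents below are assumed constants for the demo, not the retrieval parameters used in the study.

```python
import math

def angstrom_exponent(ext1, ext2, wl1, wl2):
    """Extinction Angstrom exponent alpha from extinction measured at
    two wavelengths (same length units for wl1, wl2), assuming
    ext ~ wavelength**(-alpha)."""
    return -math.log(ext1 / ext2) / math.log(wl1 / wl2)

def fine_mode_fraction(alpha, alpha_fine=2.0, alpha_coarse=0.0):
    """Simplest two-component deconvolution: the measured exponent is a
    fraction-weighted mix of an assumed fine-mode exponent and an
    assumed coarse-mode exponent (values here are illustrative)."""
    return (alpha - alpha_coarse) / (alpha_fine - alpha_coarse)

# Example: extinction of 70 Mm^-1 at 450 nm and 45 Mm^-1 at 700 nm
alpha = angstrom_exponent(70.0, 45.0, 450.0, 700.0)
fmf = fine_mode_fraction(alpha)
```

A steeper spectral slope (larger α) maps to a larger fine-mode fraction; the published method additionally uses the spectral curvature of α to avoid fixing the mode exponents a priori.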

  5. Using spectral methods to obtain particle size information from optical data: applications to measurements from CARES 2010

    DOE PAGES

    Atkinson, Dean B.; Pekour, Mikhail; Chand, Duli; ...

    2018-04-23

Here, multi-wavelength in situ aerosol extinction, absorption and scattering measurements made at two ground sites during the 2010 Carbonaceous Aerosols and Radiative Effects Study (CARES) are analyzed using a spectral deconvolution method that allows extraction of particle-size-related information, including the fraction of extinction produced by the fine-mode particles and the effective radius of the fine mode. The spectral deconvolution method is typically applied to analysis of remote sensing measurements. Here, its application to in situ measurements allows for comparison with more direct measurement methods and validation of the retrieval approach. Overall, the retrieved fine-mode fraction and effective radius compare well with other in situ measurements, including size distribution measurements and scattering and absorption measurements made separately for PM1 and PM10, although there were some periods during which the different methods yielded different results. One key contributor to differences between the results obtained is the alternative, spectrally based definitions of fine and coarse modes from the optical methods, relative to instruments that use a physically defined cut point. These results indicate that for campaigns where size, composition and multi-wavelength optical property measurements are made, comparison of the results can result in closure or can identify unusual circumstances. The comparison here also demonstrates that in situ multi-wavelength optical property measurements can be used to determine information about particle size distributions in situations where direct size distribution measurements are not available.

  6. Using spectral methods to obtain particle size information from optical data: applications to measurements from CARES 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, Dean B.; Pekour, Mikhail; Chand, Duli

Here, multi-wavelength in situ aerosol extinction, absorption and scattering measurements made at two ground sites during the 2010 Carbonaceous Aerosols and Radiative Effects Study (CARES) are analyzed using a spectral deconvolution method that allows extraction of particle-size-related information, including the fraction of extinction produced by the fine-mode particles and the effective radius of the fine mode. The spectral deconvolution method is typically applied to analysis of remote sensing measurements. Here, its application to in situ measurements allows for comparison with more direct measurement methods and validation of the retrieval approach. Overall, the retrieved fine-mode fraction and effective radius compare well with other in situ measurements, including size distribution measurements and scattering and absorption measurements made separately for PM1 and PM10, although there were some periods during which the different methods yielded different results. One key contributor to differences between the results obtained is the alternative, spectrally based definitions of fine and coarse modes from the optical methods, relative to instruments that use a physically defined cut point. These results indicate that for campaigns where size, composition and multi-wavelength optical property measurements are made, comparison of the results can result in closure or can identify unusual circumstances. The comparison here also demonstrates that in situ multi-wavelength optical property measurements can be used to determine information about particle size distributions in situations where direct size distribution measurements are not available.

  7. Using spectral methods to obtain particle size information from optical data: applications to measurements from CARES 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, Dean B.; Pekour, Mikhail; Chand, Duli

Multi-wavelength in situ aerosol extinction, absorption and scattering measurements made at two ground sites during the 2010 Carbonaceous Aerosols and Radiative Effects Study (CARES) are analyzed using a spectral deconvolution method that allows extraction of particle-size-related information, including the fraction of extinction produced by the fine-mode particles and the effective radius of the fine mode. The spectral deconvolution method is typically applied to analysis of remote sensing measurements. Here, its application to in situ measurements allows for comparison with more direct measurement methods and validation of the retrieval approach. Overall, the retrieved fine-mode fraction and effective radius compare well with other in situ measurements, including size distribution measurements and scattering and absorption measurements made separately for PM1 and PM10, although there were some periods during which the different methods yielded different results. One key contributor to differences between the results obtained is the alternative, spectrally based definitions of fine and coarse modes from the optical methods, relative to instruments that use a physically defined cut point. These results indicate that for campaigns where size, composition and multi-wavelength optical property measurements are made, comparison of the results can result in closure or can identify unusual circumstances. The comparison here also demonstrates that in situ multi-wavelength optical property measurements can be used to determine information about particle size distributions in situations where direct size distribution measurements are not available.

  8. Measurement of luminance and color uniformity of displays using the large-format scanner

    NASA Astrophysics Data System (ADS)

    Mazikowski, Adam

    2017-08-01

Uniformity of display luminance and color is important for comfort and good perception of the information presented on the display. Although display technology has developed and improved considerably over the past years, different types of displays still present a challenge in selected applications, e.g. in medical use or in the case of multi-screen installations. A simplified 9-point method of determining uniformity does not always produce satisfactory results, so a different solution is proposed in the paper. The developed system consists of a large-format X-Y-Z ISEL scanner (isel Germany AG), a Konica Minolta high-sensitivity spot photometer-colorimeter (e.g. CS-200, Konica Minolta, Inc.) and a PC. Dedicated software in the LabView environment was also prepared to control the scanner, transfer the measured data to the computer, and visualize the measurement results. Based on the developed setup, measurements of a plasma display and an LCD-LED display were performed. A heavily worn-out plasma TV unit with several visible artifacts was selected. These tests show the advantages and drawbacks of the described scanning method in comparison with the simplified 9-point uniformity method.

  9. Predicting Functional Independence Measure Scores During Rehabilitation with Wearable Inertial Sensors

    PubMed Central

    Sprint, Gina; Cook, Diane J.; Weeks, Douglas L.; Borisov, Vladimir

    2016-01-01

    Evaluating patient progress and making discharge decisions regarding inpatient medical rehabilitation rely upon standard clinical assessments administered by trained clinicians. Wearable inertial sensors can offer more objective measures of patient movement and progress. We undertook a study to investigate the contribution of wearable sensor data to predict discharge functional independence measure (FIM) scores for 20 patients at an inpatient rehabilitation facility. The FIM utilizes a 7-point ordinal scale to measure patient independence while performing several activities of daily living, such as walking, grooming, and bathing. Wearable inertial sensor data were collected from ecological ambulatory tasks at two time points mid-stay during inpatient rehabilitation. Machine learning algorithms were trained with sensor-derived features and clinical information obtained from medical records at admission to the inpatient facility. While models trained only with clinical features predicted discharge scores well, we were able to achieve an even higher level of prediction accuracy when also including the wearable sensor-derived features. Correlations as high as 0.97 for leave-one-out cross validation predicting discharge FIM motor scores are reported. PMID:27054054
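The leave-one-out cross-validation reported above can be sketched with a deliberately simplified single-feature least-squares model; the study itself combined many clinical and sensor-derived features with machine learning algorithms, so this is only an illustration of the validation scheme.

```python
def loocv_predictions(xs, ys):
    """Leave-one-out cross-validation for a simple least-squares fit
    y = a*x + b: each point is predicted by a model trained on all the
    other points, mimicking held-out evaluation of discharge scores."""
    preds = []
    for i in range(len(xs)):
        tx = [x for j, x in enumerate(xs) if j != i]
        ty = [y for j, y in enumerate(ys) if j != i]
        n = len(tx)
        mx, my = sum(tx) / n, sum(ty) / n
        a = (sum((x - mx) * (y - my) for x, y in zip(tx, ty))
             / sum((x - mx) ** 2 for x in tx))
        b = my - a * mx
        preds.append(a * xs[i] + b)
    return preds

# Toy, exactly linear data (hypothetical feature vs. score), so the
# held-out predictions recover the targets exactly
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0, 5.0, 7.0, 9.0, 11.0]
preds = loocv_predictions(xs, ys)
```

Correlating `preds` against the held-out targets is what produces the cross-validated correlation figures quoted in the abstract.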

  10. Application of Gaussian Elimination to Determine Field Components within Unmeasured Regions in the UCN τ Trap

    NASA Astrophysics Data System (ADS)

    Felkins, Joseph; Holley, Adam

    2017-09-01

Determining the average lifetime of a neutron gives information about the fundamental parameters of interactions resulting from the charged weak current. It is also an input for calculations of the abundance of light elements in the early cosmos, which are also directly measured. Experimentalists have devised two major approaches to measuring the neutron lifetime: the beam experiment and the bottle experiment. For the bottle experiment, I have designed a computational algorithm based on a numerical technique that interpolates magnetic field values in between measured points. This algorithm produces interpolated fields that satisfy the Maxwell-Heaviside equations for use in a simulation that will investigate the rate of depolarization in magnetic traps used for bottle experiments, such as the UCN τ experiment at Los Alamos National Lab. I will present how UCN depolarization can cause a systematic error in experiments like UCN τ. I will then describe the technique that I use for the interpolation, and will discuss how the accuracy of the interpolation changes with the number of measured points and the volume of the interpolated region. Supported by NSF Grant 1553861.

  11. Earth-Affecting Solar Causes Observatory (EASCO): A Potential International Living with a Star Mission from Sun-Earth L5

    NASA Technical Reports Server (NTRS)

Gopalswamy, N.; Davila, J. M.; St Cyr, O. C.; Sittler, E. C.; Auchere, F.; Duvall, Jr. T. L.; Hoeksema, J. T.; Maksimovic, M.; MacDowall, R. J.; Szabo, A.; et al.

    2011-01-01

This paper describes the scientific rationale for an L5 mission and a partial list of key scientific instruments the mission should carry. The L5 vantage point provides an unprecedented view of solar disturbances and their solar sources that can greatly advance the science behind space weather. A coronagraph and a heliospheric imager at L5 will be able to view CMEs broadside, so the space speed of Earth-directed CMEs can be measured accurately and their radial structure discerned. In addition, an inner coronal imager and a magnetograph at L5 can give advance information on active regions and coronal holes that will soon rotate onto the solar disk. Radio remote sensing at low frequencies can provide information on shock-driving CMEs, the most dangerous of all CMEs. Coordinated helioseismic measurements from the Sun-Earth line and L5 provide information on the physical conditions at the base of the convection zone, where solar magnetism originates. Finally, in situ measurements at L5 can provide information on the large-scale solar wind structures (corotating interaction regions (CIRs)) heading towards Earth that potentially result in adverse space weather.

  12. Georeferenced LiDAR 3D vine plantation map generation.

    PubMed

    Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell

    2011-01-01

The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. Information recorded by a LiDAR sensor while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology to obtain a georeferenced canopy map by combining the information obtained with LiDAR with that generated by a GPS receiver installed on top of the tractor. Data regarding the velocity of the LiDAR measurements and the UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows the generated canopy density map to be overlaid on an image of the measured area in Google Earth(®), providing accurate information about the canopy distribution and/or the location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors appears to be an interesting tool with the potential to improve crop management processes.
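The georeferencing step can be sketched as a rotation by the tractor heading plus a translation by the GPS position. The function below is a simplified 2D illustration under stated assumptions (no sensor mounting offset, heading measured clockwise from grid north); it is not the paper's exact transformation.

```python
import math

def lidar_point_to_utm(r, beam_angle, tractor_easting, tractor_northing,
                       heading):
    """Georeference a single LiDAR return (2D sketch).
    r: range to the canopy point (m); beam_angle: beam direction in the
    sensor frame (rad, 0 = straight ahead, positive to the right);
    heading: tractor heading from the GPS track (rad, 0 = grid north,
    clockwise).  Sensor offsets from the GPS antenna are ignored here."""
    x = r * math.sin(beam_angle)   # lateral offset in the vehicle frame
    y = r * math.cos(beam_angle)   # forward offset in the vehicle frame
    easting = tractor_easting + x * math.cos(heading) + y * math.sin(heading)
    northing = tractor_northing - x * math.sin(heading) + y * math.cos(heading)
    return easting, northing

# A return 5 m straight ahead, first heading north, then heading east
e1, n1 = lidar_point_to_utm(5.0, 0.0, 500000.0, 4640000.0, 0.0)
e2, n2 = lidar_point_to_utm(5.0, 0.0, 500000.0, 4640000.0, math.pi / 2)
```

Accumulating such points per canopy cell, weighted by return frequency, yields the georeferenced density map described in the abstract.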

  13. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    PubMed Central

    Pereira, N F; Sitek, A

    2011-01-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies yields superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can outperform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated. PMID:20736496
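    The maximum likelihood expectation maximization (MLEM) update used in both the voxel and mesh reconstructions can be sketched for a generic system matrix (a minimal dense-matrix illustration under Poisson-count assumptions; real implementations use sparse projectors and the basis-specific system matrix described in the paper):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM for y ~ Poisson(A @ x) on any basis (voxel or mesh point):
    multiplicative update x <- x * A^T(y / Ax) / A^T 1."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])            # sensitivity per basis function
    for _ in range(n_iter):
        proj = A @ x                            # forward projection
        ratio = np.where(proj > 0, y / np.maximum(proj, 1e-12), 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

The update preserves non-negativity, which is why the activity estimates stay physical regardless of whether the basis functions are voxels or tetrahedral mesh elements.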

  14. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    NASA Astrophysics Data System (ADS)

    Pereira, N. F.; Sitek, A.

    2010-09-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies yields superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can outperform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  15. Enhancing clinical teaching with information technologies: what can we do right now?

    PubMed

    Sandroni, S

    1997-09-01

    Effective small-group clinical teaching requires recognizing the challenges posed by clinical settings, mastering certain teaching skills, and responding to the needs of what is often a diverse group of learners. Information technologies can enhance clinical teaching by increasing the amount of relevant clinical information available to learners, allowing for the rapid integration of needed information into the teaching encounter, facilitating information processing within small groups, and helping to compensate for the many discontinuities inherent in today's clinical teaching environment. However, as many clinical teachers look toward future implementations of advanced, totally integrated medical information systems, they often overlook information technologies they have at hand right now--e.g., CD-ROM textbooks--that can measurably enhance their teaching. The author describes the "real-world" use of several available technologies (for example, "bookmarking" MEDLINE access points) and offers suggestions for how they might be used by faculty in clinical settings.

  16. Strengthening health information systems to address health equity challenges.

    PubMed Central

    Nolen, Lexi Bambas; Braveman, Paula; Dachs, J. Norberto W.; Delgado, Iris; Gakidou, Emmanuela; Moser, Kath; Rolfe, Liz; Vega, Jeanette; Zarowsky, Christina

    2005-01-01

    Special studies and isolated initiatives over the past several decades in low-, middle- and high-income countries have consistently shown inequalities in health among socioeconomic groups and by gender, race or ethnicity, geographical area and other measures associated with social advantage. Significant health inequalities linked to social (dis)advantage rather than to inherent biological differences are generally considered unfair or inequitable. Such health inequities are the main object of health development efforts, including global targets such as the Millennium Development Goals, which require monitoring to evaluate progress. However, most national health information systems (HIS) lack key information needed to assess and address health inequities, namely, reliable, longitudinal and representative data linking measures of health with measures of social status or advantage at the individual or small-area level. Without empirical documentation and monitoring of such inequities, as well as country-level capacity to use this information for effective planning and monitoring of progress in response to interventions, movement towards equity is unlikely to occur. This paper reviews core information requirements and potential databases and proposes short-term and longer term strategies for strengthening the capabilities of HIS for the analysis of health equity and discusses HIS-related entry points for supporting a culture of equity-oriented decision-making and policy development. PMID:16184279

  17. First Steps to Automated Interior Reconstruction from Semantically Enriched Point Clouds and Imagery

    NASA Astrophysics Data System (ADS)

    Obrock, L. S.; Gülch, E.

    2018-05-01

    The automated generation of a BIM model from sensor data is a huge challenge for the modeling of existing buildings. Currently the measurements and analyses are time-consuming, allow little automation and require expensive equipment. There is as yet no automated acquisition of the semantic information of objects in a building. We present first results of our approach, based on imagery and derived products, aiming at a more automated modeling of interiors for a BIM building model. We examine the building parts and objects visible in the collected images using deep learning methods based on convolutional neural networks. For localization and classification of building parts we apply the FCN8s model for pixel-wise semantic segmentation. So far, we reach a pixel accuracy of 77.2 % and a mean intersection over union of 44.2 %. We then use the network for further reasoning on the images of the interior room. We combine the segmented images with the original images and use photogrammetric methods to produce a three-dimensional point cloud. We code the extracted object types as colours of the 3D points. We are thus able to uniquely classify the points in three-dimensional space. We also preliminarily investigate a simple extraction method for the colour and material of building parts. It is shown that the combined images are very well suited to extracting further semantic information for the BIM model. With the presented methods we see a sound basis for further automation of the acquisition and modeling of semantic and geometric information of interior rooms for a BIM model.
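    The two reported segmentation scores, pixel accuracy and mean intersection over union (mIoU), are both derived from a class confusion matrix; a minimal sketch (function name and layout are illustrative, not the authors' evaluation code) is:

```python
import numpy as np

def segmentation_metrics(pred, truth, n_classes):
    """Pixel accuracy and mean IoU from predicted and ground-truth label maps."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(truth.ravel(), pred.ravel()):
        cm[t, p] += 1                          # rows: ground truth, cols: prediction
    pixel_acc = np.diag(cm).sum() / cm.sum()
    union = cm.sum(axis=0) + cm.sum(axis=1) - np.diag(cm)
    iou = np.diag(cm) / np.maximum(union, 1)   # per-class intersection over union
    return pixel_acc, iou.mean()
```

mIoU is the stricter of the two measures: a class that is rarely predicted correctly drags the mean down even when it covers few pixels, which is why the reported mIoU (44.2 %) is well below the pixel accuracy (77.2 %).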

  18. Satisfaction with the local service point for care: results of an evaluation study

    PubMed Central

    Esslinger, Adelheid Susanne; Macco, Katrin; Schmidt, Katharina

    2009-01-01

    Purpose The market for care is growing and is characterized by complexity. Therefore, service points, such as the ‘Zentrale Anlaufstelle Pflege (ZAPf)’ in Nuremberg, are helpful for clients to get orientation. The purpose of the presentation is to show the results of an evaluation study of clients' satisfaction with the offers of the ZAPf. Study Satisfaction with a service may be measured with the SERVQUAL concept introduced by Parasuraman et al. (1988), who identified five dimensions of quality (tangibles, reliability, responsiveness, assurance and empathy). We adopted these dimensions in our study. The study focuses on the quality of service and the benefits recognized by clients. In spring 2007, we conducted 67 interviews by phone, based on a semi-standardized questionnaire. Statistical analysis was conducted using SPSS. Results The clients want to get information about care in general, financial and legal aspects, alternative care arrangements (e.g. ambulant or long-term care) and typical age-related diseases. They show high satisfaction with the service provided. Their benefits are to get information and advice, to strengthen the ability of decision-making, to cope with changing situations in life, and to develop solutions. Conclusions The results show that the quality of service is at a high level. Critical success factors are the interdisciplinary cooperation at the service point, based on a regular and open exchange of information. Every member focuses on an optimal individual solution for the client. Local professional service points act as networkers and brokers. They serve not only the clients' needs but also support the effective and efficient provision of optimized care.

  19. Black holes and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathur, Samir D., E-mail: mathur.16@osu.edu

    The black hole information paradox forces us into a strange situation: we must find a way to break the semiclassical approximation in a domain where no quantum gravity effects would normally be expected. Traditional quantizations of gravity do not exhibit any such breakdown, and this forces us into a difficult corner: either we must give up quantum mechanics or we must accept the existence of troublesome 'remnants'. In string theory, however, the fundamental quanta are extended objects, and it turns out that the bound states of such objects acquire a size that grows with the number of quanta in the bound state. The interior of the black hole gets completely altered to a 'fuzzball' structure, and information is able to escape in radiation from the hole. The semiclassical approximation can break at macroscopic scales due to the large entropy of the hole: the measure in the path integral competes with the classical action, instead of giving a subleading correction. Putting this picture of black hole microstates together with ideas about entangled states leads to a natural set of conjectures on many long-standing questions in gravity: the significance of Rindler and de Sitter entropies, the notion of black hole complementarity, and the fate of an observer falling into a black hole. Highlights: the information paradox is a serious problem; to solve it we need to find 'hair' on black holes; in string theory we find 'hair' via the fuzzball construction; fuzzballs help to resolve many other issues in gravity.

  20. Quantum-Classical Hybrid for Information Processing

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Based upon quantum-inspired entanglement in quantum-classical hybrids, a simple algorithm for instantaneous transmission of non-intentional messages (chosen at random) to remote distances is proposed. The idea is to implement instantaneous transmission of conditional information over remote distances via a quantum-classical hybrid that preserves superposition of random solutions, while allowing one to measure its state variables using classical methods. Such a hybrid system reinforces the advantages, and minimizes the limitations, of both quantum and classical characteristics. Consider n observers, and assume that each of them gets a copy of the system and runs it separately. Although they run identical systems, the outcomes of even synchronized runs may be different because the solutions of these systems are random. However, the global constraint must be satisfied. Therefore, if observer #1 (the sender) made a measurement of the acceleration v(sub 1) at t = T, then the receiver, by measuring the corresponding acceleration v(sub 1) at t = T, may get a wrong value because the accelerations are random, and only their ratios are deterministic. Obviously, the transmission of this knowledge is instantaneous as soon as the measurements have been performed. In addition, the distance between the observers is irrelevant because the x-coordinate does not enter the governing equations. However, the Shannon information transmitted is zero. None of the senders can control the outcomes of their measurements because they are random. The senders cannot transmit intentional messages. Nevertheless, based on the transmitted knowledge, they can coordinate their actions based on conditional information. If observer #1 knows his own measurements, the measurements of the others can be fully determined. It is important to emphasize that the origin of entanglement of all the observers is the joint probability density that couples their actions.
There is no centralized source, or a sender of the signal, because each receiver can become a sender as well. An observer receives a signal by performing certain measurements synchronized with the measurements of the others. This means that the signal is uniformly and simultaneously distributed over the observers in a decentralized way. The signals transmit no intentional information that would favor one agent over another. All the sequences of signals received by different observers are not only statistically equivalent, but are also point-by-point identical. It is important to assume that each agent knows that the other agents simultaneously receive the identical signals. The sequences of signals are truly random, so that no agent could predict the next step with a probability different from that described by the density. Under these quite general assumptions, the entangled observer-agents can perform non-trivial tasks that include transmission of conditional information from one agent to another, a simple paradigm of cooperation, etc. The problem of the behavior of intelligent agents correlated by identical random messages in a decentralized way has its own significance: it simulates the evolutionary behavior of biological and social systems correlated only via simultaneously sensing sequences of unexpected events.
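The correlation structure described above, identical random sequences whose individual values are unpredictable but whose ratios are fixed by a global constraint, can be illustrated with a toy model (purely hypothetical: the hybrid's actual dynamics are governed by coupled differential equations, not by a shared pseudo-random stream, and all names here are illustrative):

```python
import random

def broadcast_signals(seed, n_observers, n_steps, ratio=2.0):
    """Toy model: every observer holds a point-by-point identical copy of a
    random-looking stream.  Each value v1 is unpredictable, but the paired
    value v2 satisfies the deterministic global constraint v2 = ratio * v1."""
    rng = random.Random(seed)
    stream = [rng.uniform(-1.0, 1.0) for _ in range(n_steps)]
    # every observer receives the same sequence, in a decentralized way
    return [[(v, ratio * v) for v in stream] for _ in range(n_observers)]
```

No observer can predict the next v1, yet once any observer measures it, the corresponding v2 of every other copy is fully determined, which is the coordination-without-Shannon-information idea of the abstract.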

  1. Assimilation of concentration measurements for retrieving multiple point releases in atmosphere: A least-squares approach to inverse modelling

    NASA Astrophysics Data System (ADS)

    Singh, Sarvesh Kumar; Rani, Raj

    2015-10-01

    The study addresses the identification of multiple point sources, emitting the same tracer, from a limited set of merged concentration measurements. The identification here refers to the estimation of the locations and strengths of a known number of simultaneous point releases. The source-receptor relationship is described in the framework of adjoint modelling by using an analytical Gaussian dispersion model. A least-squares minimization framework, free from any initialization of the release parameters (locations and strengths), is presented to estimate the release parameters. This utilizes the distributed source information observable from the given monitoring design and number of measurements. The technique leads to an exact retrieval of the true release parameters when measurements are noise-free and exactly described by the dispersion model. The inversion algorithm is evaluated using real data from multiple (two, three and four) releases conducted during the Fusion Field Trials in September 2007 at Dugway Proving Ground, Utah. The release locations are retrieved, on average, within 25-45 m of the true sources, with the distance from retrieved to true source ranging from 0 to 130 m. The release strengths are also estimated within a factor of three of the true release rates. The average deviations in the retrieval of source locations are relatively large in the two-release trials in comparison to the three- and four-release trials.
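    The flavour of an initialization-free least-squares retrieval can be illustrated for a single release (a simplified sketch: the toy plume model, receptor layout and function names are assumptions, not the paper's multi-release formulation):

```python
import numpy as np

def gaussian_plume(q, xs, ys, xr, yr, u=2.0, k=0.3):
    """Toy steady-state 2-D Gaussian dispersion: source of strength q at
    (xs, ys), receptor at (xr, yr), wind along +x.  A stand-in for the
    analytical dispersion model, not the authors' exact formulation."""
    dx, dy = xr - xs, yr - ys
    if dx <= 0:
        return 0.0
    s = k * dx                                  # plume spread grows downwind
    return q / (2 * np.pi * u * s**2) * np.exp(-dy**2 / (2 * s**2))

def invert_source(receptors, measured, grid):
    """For each candidate location the optimal strength is a closed-form
    least-squares ratio; keep the candidate with the smallest residual,
    so no initial guess of the release parameters is needed."""
    best = None
    for xs, ys in grid:
        g = np.array([gaussian_plume(1.0, xs, ys, xr, yr) for xr, yr in receptors])
        gg = g @ g
        if gg == 0.0:
            continue
        q = max((g @ measured) / gg, 0.0)       # release rates are non-negative
        resid = float(np.sum((measured - q * g) ** 2))
        if best is None or resid < best[0]:
            best = (resid, xs, ys, q)
    return best[1], best[2], best[3]
```

With noise-free synthetic data and the true location on the candidate grid, the residual vanishes at the true source and the strength is recovered exactly, mirroring the exact-retrieval property stated in the abstract.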

  2. Algorithm for pose estimation based on objective function with uncertainty-weighted measuring error of feature point cling to the curved surface.

    PubMed

    Huo, Ju; Zhang, Guiyang; Yang, Ming

    2018-04-20

    This paper is concerned with the anisotropic and non-identical gray-level distribution of feature points clinging to a curved surface, for which a high-precision, uncertainty-resistant algorithm for pose estimation is proposed. The weighted contribution of uncertainty to the objective function of the feature-point measuring error is analyzed. A novel error objective function based on the spatial collinearity error is then constructed by transforming the uncertainty into a covariance-weighted matrix, which is suitable for practical applications. Further, the optimized generalized orthogonal iterative (GOI) algorithm is utilized for the iterative solution, such that it avoids poor convergence and significantly resists the uncertainty. Hence, the optimized GOI algorithm extends field-of-view applications and improves the accuracy and robustness of the measuring results through redundant information. Finally, simulation and practical experiments show that the maximum error of the re-projected image coordinates of the target is less than 0.110 pixels. Within a space of 3000 mm×3000 mm×4000 mm, the maximum estimation errors of static and dynamic measurement for rocket nozzle motion are better than 0.065° and 0.128°, respectively. The results verify the high accuracy and uncertainty-attenuation performance of the proposed approach, which should therefore have potential for engineering applications.
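    A covariance-weighted objective of the kind described has the general form of a sum of Mahalanobis-weighted reprojection residuals; a minimal sketch (the pinhole camera model and all names are illustrative assumptions, not the authors' GOI implementation) is:

```python
import numpy as np

def weighted_reprojection_error(points_3d, obs_2d, R, t, K, covs):
    """Sum of Mahalanobis-weighted reprojection errors r^T C^{-1} r:
    residuals of uncertain feature points are down-weighted by their
    per-point measurement covariance C."""
    total = 0.0
    for X, x_obs, C in zip(points_3d, obs_2d, covs):
        p = K @ (R @ X + t)            # project 3-D point with pose (R, t)
        x_proj = p[:2] / p[2]          # perspective division
        r = x_obs - x_proj             # image-plane residual
        total += r @ np.linalg.solve(C, r)
    return total
```

Minimizing this objective over (R, t), as an iterative scheme such as GOI does, lets precisely measured points dominate the pose estimate while noisy ones contribute less.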

  3. Moving Controlled Vocabularies into the Semantic Web

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Lowry, R. K.; Kokkinaki, A.

    2015-12-01

    One of the issues with legacy oceanographic data formats is that the only tool available for describing what a measurement is and how it was made is a single metadata tag known as the parameter code. The British Oceanographic Data Centre (BODC) has been helping the international oceanographic community gain maximum benefit from this through a controlled vocabulary known as the BODC Parameter Usage Vocabulary (PUV). Over time this has grown to over 34,000 entries, some of which have preferred labels with over 400 bytes of descriptive information detailing what was measured and how. A decade ago BODC pioneered making this information available in a more useful form with the implementation of a prototype vocabulary server (NVS) that referenced each 'parameter code' as a URL. This developed into the current server (NVS V2), in which the parameter URL resolves into an RDF document based on the SKOS data model that includes a list of resource URLs mapped to the 'parameter'. For example, the parameter code for a contaminant in biota, such as 'cadmium in Mytilus edulis', carries RDF triples leading to the entry for Mytilus edulis in the WoRMS ontology and for cadmium in the ChEBI ontology. By providing links into these external ontologies, the information captured in a 1980s parameter code now conforms to the Linked Data paradigm of the Semantic Web, vastly increasing the descriptive information accessible to a user. This presentation will describe the next steps along the road to the Semantic Web with the development of a SPARQL end point [1] to expose the PUV plus the 190 other controlled vocabularies held in NVS. Whilst this is ideal for those fluent in SPARQL, most users require something a little more user-friendly, and so the NVS browser [2] was developed over the end point to allow less technical users to query the vocabularies and navigate the NVS ontology.
This tool integrates into an editor that allows vocabulary content to be manipulated by authorised users outside BODC. Having placed Linked Data tooling over a single SPARQL end point, the obvious future development for this system is to support semantic interoperability outside NVS through the incorporation of federated SPARQL end points in the USA and Australia during the ODIP II project. [1] https://vocab.nerc.ac.uk/sparql [2] https://www.bodc.ac.uk/data/codes_and_formats/vocabulary_search/
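    A SKOS vocabulary exposed this way can be queried with a few lines of SPARQL; the sketch below only builds the query string (the concept URI is illustrative, and submitting the query to an end point is left to any SPARQL client):

```python
def puv_narrower_query(concept_uri):
    """Build a SPARQL query listing the skos:narrower concepts of a
    vocabulary entry together with their skos:prefLabel values."""
    return f"""
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?narrower ?label WHERE {{
  <{concept_uri}> skos:narrower ?narrower .
  ?narrower skos:prefLabel ?label .
}}
""".strip()
```

Because the vocabularies follow the standard SKOS data model, the same query works unchanged against any SKOS-publishing end point, which is what makes the federated-end-point plan mentioned above feasible.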

  4. Does a quality management system improve quality in primary care practices in Switzerland? A longitudinal study.

    PubMed

    Goetz, Katja; Hess, Sigrid; Jossen, Marianne; Huber, Felix; Rosemann, Thomas; Brodowski, Marc; Künzi, Beat; Szecsenyi, Joachim

    2015-04-21

    To examine the effectiveness of the quality management programme European Practice Assessment in primary care in Switzerland. Longitudinal study with three points of measurement. Primary care practices in Switzerland. In total, 45 of 91 primary care practices completed the European Practice Assessment three times. The interval between each assessment was around 36 months. Variance analyses for repeated measurements were performed for all 129 quality indicators from the domains 'infrastructure', 'information', 'finance', and 'quality and safety' to examine changes over time. Significant improvements were found in three of four domains: 'quality and safety' (F=22.81, p<0.01), 'information' (F=27.901, p<0.01) and 'finance' (F=4.073, p<0.02). The 129 quality indicators showed a significant improvement across the three points of measurement (F=33.864, p<0.01). The European Practice Assessment for primary care practices thus provides a functioning quality management programme, focusing on the sustainable improvement of structural and organisational aspects to promote high quality of primary care. The implementation of a quality management system that also includes a continuous improvement process would add further value in providing good care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. The Graphical Cadastre Problem in Turkey: The Case of Trabzon Province.

    PubMed

    Demir, Osman; Çoruhlu, Yakup Emre

    2008-09-11

    Cadastral projects in Turkey have been accelerated in recent years by the involvement of the private sector. These projects aim at completing the country's cadastre, along with producing bases in standards that could be a foundation for the Land Registry and Cadastre Information System (LRCIS). It is possible to produce cadastral data with today's technological means. In this context, three-dimensional cadastre data can be properly produced, especially in digital cadastre projects with the required point accuracy. Nevertheless, this is not enough for LRCIS. The cadastre bases that have been produced so far by different methods with different scales and bases, with or without coordinates, should also be converted into digital form based on the National Basic GPS Network of Turkey (NBGN) with the required point-location accuracy. As the result of a joint evaluation of graphical cadastre bases produced without coordinates, actual land measurements, and information obtained from sheets and field book data, it was found that there are significant base problems in the graphical maps. These bases, comprising 20% of Turkey's cadastre, constitute the most important bottleneck in completing the country's cadastre. In the scope of this paper, the possibilities of converting the field book measurement values of graphical cadastre bases into digital form in the national coordinate system by comparing them with actual land measurements are investigated, along with Turkey's cadastre and its problems.
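    A common building block for carrying local graphical-cadastre coordinates into a national system is a 2-D similarity (Helmert) transformation estimated from control points. The sketch below is an illustrative two-control-point version; a production workflow, such as the one the study envisages, would fit the four parameters to many control points by least squares:

```python
def helmert_2d(local_pts, control_local, control_national):
    """Solve E = a*x - b*y + tE, N = b*x + a*y + tN (a = s*cos(theta),
    b = s*sin(theta)) from two control-point pairs, then apply the
    transform to a list of local (x, y) points."""
    (x1, y1), (x2, y2) = control_local
    (E1, N1), (E2, N2) = control_national
    dx, dy = x2 - x1, y2 - y1
    dE, dN = E2 - E1, N2 - N1
    d2 = dx * dx + dy * dy
    a = (dx * dE + dy * dN) / d2
    b = (dx * dN - dy * dE) / d2
    tE = E1 - a * x1 + b * y1
    tN = N1 - b * x1 - a * y1
    return [(a * x - b * y + tE, b * x + a * y + tN) for x, y in local_pts]
```

Comparing transformed field-book points against actual land measurements, as the paper does, then quantifies the base problems in the graphical maps.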

  6. The Graphical Cadastre Problem in Turkey: The Case of Trabzon Province

    PubMed Central

    Demir, Osman; Çoruhlu, Yakup Emre

    2008-01-01

    Cadastral projects in Turkey have been accelerated in recent years by the involvement of the private sector. These projects aim at completing the country's cadastre, along with producing bases in standards that could be a foundation for the Land Registry and Cadastre Information System (LRCIS). It is possible to produce cadastral data with today's technological means. In this context, three-dimensional cadastre data can be properly produced, especially in digital cadastre projects with the required point accuracy. Nevertheless, this is not enough for LRCIS. The cadastre bases that have been produced so far by different methods with different scales and bases, with or without coordinates, should also be converted into digital form based on the National Basic GPS Network of Turkey (NBGN) with the required point-location accuracy. As the result of a joint evaluation of graphical cadastre bases produced without coordinates, actual land measurements, and information obtained from sheets and field book data, it was found that there are significant base problems in the graphical maps. These bases, comprising 20% of Turkey's cadastre, constitute the most important bottleneck in completing the country's cadastre. In the scope of this paper, the possibilities of converting the field book measurement values of graphical cadastre bases into digital form in the national coordinate system by comparing them with actual land measurements are investigated, along with Turkey's cadastre and its problems. PMID:27873830

  7. Detection of Baryon Acoustic Oscillation features in the large-scale 3-point correlation function of SDSS BOSS DR12 CMASS galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.

    We present the large-scale 3-point correlation function (3PCF) of the SDSS DR12 CMASS sample of 777,202 Luminous Red Galaxies, the largest-ever sample used for a 3PCF or bispectrum measurement. We make the first high-significance (4.5σ) detection of Baryon Acoustic Oscillations (BAO) in the 3PCF. Using these acoustic features in the 3PCF as a standard ruler, we measure the distance to z=0.57 to 1.7% precision (statistical plus systematic). We find D_V = 2024 ± 29 Mpc (stat) ± 20 Mpc (sys) for our fiducial cosmology (consistent with Planck 2015) and bias model. This measurement extends the use of the BAO technique from the 2-point correlation function (2PCF) and power spectrum to the 3PCF and opens an avenue for deriving additional cosmological distance information from future large-scale structure redshift surveys such as DESI. Our measured distance scale from the 3PCF is fairly independent from that derived from the pre-reconstruction 2PCF and is equivalent to increasing the length of BOSS by roughly 10%; reconstruction appears to lower the independence of the distance measurements. In conclusion, fitting a model including tidal tensor bias yields a moderate-significance (2.6σ) detection of this bias with a value in agreement with the prediction from local Lagrangian biasing.

  8. Detection of baryon acoustic oscillation features in the large-scale three-point correlation function of SDSS BOSS DR12 CMASS galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.; Brownstein, Joel R.; Chuang, Chia-Hsun; Gil-Marín, Héctor; Ho, Shirley; Kitaura, Francisco-Shu; Percival, Will J.; Ross, Ashley J.; Rossi, Graziano; Seo, Hee-Jong; Slosar, Anže; Vargas-Magaña, Mariana

    2017-08-01

    We present the large-scale three-point correlation function (3PCF) of the Sloan Digital Sky Survey DR12 Constant stellar Mass (CMASS) sample of 777 202 Luminous Red Galaxies, the largest-ever sample used for a 3PCF or bispectrum measurement. We make the first high-significance (4.5σ) detection of baryon acoustic oscillations (BAO) in the 3PCF. Using these acoustic features in the 3PCF as a standard ruler, we measure the distance to z = 0.57 to 1.7 per cent precision (statistical plus systematic). We find DV = 2024 ± 29 Mpc (stat) ± 20 Mpc (sys) for our fiducial cosmology (consistent with Planck 2015) and bias model. This measurement extends the use of the BAO technique from the two-point correlation function (2PCF) and power spectrum to the 3PCF and opens an avenue for deriving additional cosmological distance information from future large-scale structure redshift surveys such as DESI. Our measured distance scale from the 3PCF is fairly independent from that derived from the pre-reconstruction 2PCF and is equivalent to increasing the length of BOSS by roughly 10 per cent; reconstruction appears to lower the independence of the distance measurements. Fitting a model including tidal tensor bias yields a moderate-significance (2.6σ) detection of this bias with a value in agreement with the prediction from local Lagrangian biasing.
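    The quoted quantity DV is the standard volume-averaged BAO distance, conventionally defined from the angular diameter distance D_A(z) and the Hubble rate H(z) as

```latex
D_V(z) \;=\; \left[\, (1+z)^2\, D_A^2(z)\, \frac{c\,z}{H(z)} \,\right]^{1/3}
```

so the 1.7 per cent precision on DV mixes transverse information (through D_A) and line-of-sight information (through 1/H) in a single distance scale.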

  9. Detection of Baryon Acoustic Oscillation features in the large-scale 3-point correlation function of SDSS BOSS DR12 CMASS galaxies

    DOE PAGES

    Slepian, Zachary; Slosar, Anze; Eisenstein, Daniel J.; ...

    2017-03-01

    We present the large-scale 3-point correlation function (3PCF) of the SDSS DR12 CMASS sample of 777,202 Luminous Red Galaxies, the largest-ever sample used for a 3PCF or bispectrum measurement. We make the first high-significance (4.5σ) detection of Baryon Acoustic Oscillations (BAO) in the 3PCF. Using these acoustic features in the 3PCF as a standard ruler, we measure the distance to z=0.57 to 1.7% precision (statistical plus systematic). We find D_V = 2024 ± 29 Mpc (stat) ± 20 Mpc (sys) for our fiducial cosmology (consistent with Planck 2015) and bias model. This measurement extends the use of the BAO technique from the 2-point correlation function (2PCF) and power spectrum to the 3PCF and opens an avenue for deriving additional cosmological distance information from future large-scale structure redshift surveys such as DESI. Our measured distance scale from the 3PCF is fairly independent from that derived from the pre-reconstruction 2PCF and is equivalent to increasing the length of BOSS by roughly 10%; reconstruction appears to lower the independence of the distance measurements. In conclusion, fitting a model including tidal tensor bias yields a moderate-significance (2.6σ) detection of this bias with a value in agreement with the prediction from local Lagrangian biasing.

  10. The exploration of the exhibition informatization

    NASA Astrophysics Data System (ADS)

    Zhang, Jiankang

    2017-06-01

    The construction and management of exhibition informatization is the main task, and the main bottleneck, in the transformation and upgrading of the Chinese exhibition industry. Three key points are expected to yield a breakthrough in the construction of Chinese exhibition informatization: adopting service outsourcing to build and maintain the database, adopting advanced chest card technology to collect various kinds of information, and applying statistical analysis to maintain good customer relations. The success of Chinese exhibition informatization mainly calls for mature suppliers who can provide database construction and maintenance, proven technology, a sense of data security, advanced chest card technology, the ability to mine and analyze data, and the ability to improve exhibition services based on the commercial information obtained from data analysis. Several data security measures are expected to be applied during system development, covering terminal data security, internet data security, media data security, storage data security and application data security. The informatization of this process is based on the chest card design. At present, there are several types of chest card technology: bar code chest cards, two-dimensional code cards, magnetic stripe chest cards and smart-chip chest cards. The information obtained from exhibition data will help organizers to formulate relevant service strategies, quantify accumulated customer indexes, and improve customer satisfaction and loyalty; moreover, the information can also support additional services such as commercial trips and VIP ceremonial receptions.

  11. Comparison of scalar measures used in magnetic resonance diffusion tensor imaging.

    PubMed

    Bahn, M M

    1999-07-01

    The tensors derived from diffusion tensor imaging describe complex diffusion in tissues. However, it is difficult to compare tensors directly or to produce images that contain all of the information of the tensor. Therefore, it is convenient to produce scalar measures that extract desired aspects of the tensor. These measures map the three-dimensional eigenvalues of the diffusion tensor into scalar values. The measures impose an order on eigenvalue space. Many invariant scalar measures have been introduced in the literature. In the present manuscript, a general approach for producing invariant scalar measures is introduced. Because it is often difficult to determine in clinical practice which of the many measures is best to apply to a given situation, two formalisms are introduced for the presentation, definition, and comparison of measures applied to eigenvalues: (1) normalized eigenvalue space, and (2) parametric eigenvalue transformation plots. All of the anisotropy information contained in the three eigenvalues can be retained and displayed in a two-dimensional plot, the normalized eigenvalue plot. An example is given of how to determine the best measure to use for a given situation by superimposing isometric contour lines from various anisotropy measures on plots of actual measured eigenvalue data points. Parametric eigenvalue transformation plots allow comparison of how different measures impose order on normalized eigenvalue space to determine whether the measures are equivalent and how the measures differ. These formalisms facilitate the comparison of scalar invariant measures for diffusion tensor imaging. Normalized eigenvalue space allows presentation of eigenvalue anisotropy information. Copyright 1999 Academic Press.
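    As a concrete instance of such a scalar invariant measure, fractional anisotropy (FA) maps the three eigenvalues to a single rotationally invariant value, and the sum-to-one normalization mirrors the normalized eigenvalue space described above. A minimal sketch (function names are illustrative, not from the paper):

```python
import math

def normalized_eigenvalues(l1, l2, l3):
    """Map diffusion-tensor eigenvalues into normalized (sum-to-one) space."""
    s = l1 + l2 + l3
    return (l1 / s, l2 / s, l3 / s)

def fractional_anisotropy(l1, l2, l3):
    """Rotationally invariant anisotropy: 0 for isotropic diffusion,
    approaching 1 when a single eigenvalue dominates."""
    m = (l1 + l2 + l3) / 3.0
    num = (l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)
```

Isotropic eigenvalues (1, 1, 1) give FA = 0, while a single nonzero eigenvalue gives FA = 1; other invariant measures impose a different order on the same normalized eigenvalue space.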

  12. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem, and a controller structure is defined on this basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, determine what changes to the process inputs or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. If no tradeoff has to be made to move to a new operating point, then the process is not operating at an optimal point in any sense. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point.
The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
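    The central test of the interactive algorithm, whether every quality criterion can be improved simultaneously from the current operating point, can be sketched as a search over small candidate input changes. This is a toy one-dimensional illustration of the idea, not Seaman's implementation:

```python
def simultaneous_improvement(objectives, x, step=0.1):
    """Return a perturbed input that lowers every objective, or None if
    no candidate move is a Pareto improvement (a tradeoff is then needed).

    objectives: list of callables to minimize; x: current scalar input.
    """
    current = [f(x) for f in objectives]
    for dx in (-step, step):
        candidate = x + dx
        values = [f(candidate) for f in objectives]
        # A Pareto improvement must strictly improve every criterion.
        if all(v < c for v, c in zip(values, current)):
            return candidate
    return None
```

With f1 = (x-1)² and f2 = (x-2)², a simultaneous improvement exists at x = 0, but not at x = 1.5, where any move trades one criterion against the other, exactly the situation in which the algorithm asks the operator to choose a tradeoff.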

  13. Biomarkers and biometric measures of adherence to use of ARV-based vaginal rings.

    PubMed

    Stalter, Randy M; Moench, Thomas R; MacQueen, Kathleen M; Tolley, Elizabeth E; Owen, Derek H

    2016-01-01

    Poor adherence to product use has been observed in recent trials of antiretroviral (ARV)-based oral and vaginal gel HIV prevention products, resulting in an inability to determine product efficacy. The delivery of microbicides through vaginal rings is widely perceived as a way to achieve better adherence but vaginal rings do not eliminate the adherence challenges exhibited in clinical trials. Improved objective measures of adherence are needed as new ARV-based vaginal ring products enter the clinical trial stage. To identify technologies that have potential future application for vaginal ring adherence measurement, a comprehensive literature search was conducted that covered a number of biomedical and public health databases, including PubMed, Embase, POPLINE and the Web of Science. Published patents and patent applications were also searched. Technical experts were also consulted to gather more information and help evaluate identified technologies. Approaches were evaluated as to feasibility of development and clinical trial implementation, cost and technical strength. Numerous approaches were identified through our landscape analysis and classified as either point measures or cumulative measures of vaginal ring adherence. Point measurements are those that give a measure of adherence at a particular point in time. Cumulative measures attempt to measure ring adherence over a period of time. Approaches that require modifications to an existing ring product are at a significant disadvantage, as this will likely introduce additional regulatory barriers to the development process and increase manufacturing costs. From the point of view of clinical trial implementation, desirable attributes would be high acceptance by trial participants, and little or no additional time or training requirements on the part of participants or clinic staff. 
We have identified four promising approaches as being high priority for further development based on the following measurements: intracellular drug levels, drug levels in hair, the accumulation of a vaginal analyte that diffuses into the ring, and the depletion of an intrinsic ring constituent. While some approaches show significant promise over others, it is recommended that a strategy of using complementary biometric and behavioural approaches be adopted to best understand participants' adherence to ARV-based ring products in clinical trials.

  14. Assessing the information desire of patients with advanced cancer by providing information with a decision aid, which is evaluated in a randomized trial: a study protocol.

    PubMed

    Oostendorp, Linda J M; Ottevanger, Petronella B; van der Graaf, Winette T A; Stalmeier, Peep F M

    2011-02-14

    There is a continuing debate on the desirability of informing patients with cancer and thereby involving them in treatment decisions. On the one hand, information uptake may be hampered, and additional stress could be inflicted by involving these patients. On the other hand, even patients with advanced cancer desire information on risks and prognosis. To settle the debate, a decision aid will be developed and presented to patients with advanced disease at the point of decision making. The aid is used to assess the amount of information desired. Factors related to information desire are explored, as well as the ability of the medical oncologist to judge the patient's information desire. The effects of the information on patient well-being are assessed by comparing the decision aid group with a usual care group. This study is a randomized controlled trial of patients with advanced colorectal, breast, or ovarian cancer who have started treatment with first-line palliative chemotherapy. The trial will consist of 100 patients in the decision aid group and 70 patients in the usual care group. To collect complete data of 170 patients, 246 patients will be approached for the study. Patients will complete a baseline questionnaire on sociodemographic data, well-being measures, and psychological measures, believed to predict information desire. The medical oncologist will judge the patient's information desire. After disease progression is diagnosed, the medical oncologist offers the choice between second-line palliative chemotherapy plus best supportive care (BSC) and BSC alone. Randomization will take place to determine whether patients will receive usual care (n = 70) or usual care and the decision aid (n = 100). The aid offers information about the potential risks and benefits of both treatment options, in terms of adverse events, tumour response, and survival. Patients decide for each item whether they desire the information or not. 
Two follow-up questionnaires will evaluate the effect of the decision aid. This study attempts to settle the debate on the desirability of informing patients with cancer. In contrast to several earlier studies, we will actually deliver information on treatment options to patients at the point of decision making.

  15. Object-Based Coregistration of Terrestrial Photogrammetric and ALS Point Clouds in Forested Areas

    NASA Astrophysics Data System (ADS)

    Polewski, P.; Erickson, A.; Yao, W.; Coops, N.; Krzystek, P.; Stilla, U.

    2016-06-01

    Airborne Laser Scanning (ALS) and terrestrial photogrammetry are methods applicable for mapping forested environments. While ground-based techniques provide valuable information about the forest understory, the measured point clouds are normally expressed in a local coordinate system, whose transformation into a georeferenced system requires additional effort. In contrast, ALS point clouds are usually georeferenced, yet the point density near the ground may be poor under dense overstory conditions. In this work, we propose to combine the strengths of the two data sources by co-registering the respective point clouds, thus enriching the georeferenced ALS point cloud with detailed understory information in a fully automatic manner. Due to markedly different sensor characteristics, coregistration methods which expect a high geometric similarity between keypoints are not suitable in this setting. Instead, our method focuses on the object (tree stem) level. We first calculate approximate stem positions in the terrestrial and ALS point clouds and construct, for each stem, a descriptor which quantifies the 2D and vertical distances to other stem centers (at ground height). Then, the similarities between all descriptor pairs from the two point clouds are calculated, and standard graph maximum matching techniques are employed to compute corresponding stem pairs (tiepoints). Finally, the tiepoint subset yielding the optimal rigid transformation between the terrestrial and ALS coordinate systems is determined. We test our method on simulated tree positions and a plot situated in the northern interior of the Coast Range in western Oregon, USA, using ALS data (76 x 121 m2) and a photogrammetric point cloud (33 x 35 m2) derived from terrestrial photographs taken with a handheld camera. Results on both simulated and real data show that the proposed stem descriptors are discriminative enough to derive good correspondences. 
Specifically, for the real plot data, 24 corresponding stems were coregistered with an average 2D position deviation of 66 cm.
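    Once corresponding stem pairs (tiepoints) are established, the rigid transformation between the terrestrial and ALS coordinate systems follows in closed form from the matched 2D stem centers. A minimal least-squares sketch of the standard estimator (our own illustration of the general method, not the authors' code):

```python
import math

def rigid_transform_2d(src, dst):
    """Least-squares rotation angle and translation mapping matched
    2D points src -> dst (both lists of (x, y) tuples)."""
    n = len(src)
    cx_s = sum(p[0] for p in src) / n; cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n; cy_d = sum(p[1] for p in dst) / n
    # Accumulate cross- and dot-products of the centered coordinates.
    num = den = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cx_s; ys -= cy_s; xd -= cx_d; yd -= cy_d
        num += xs * yd - ys * xd
        den += xs * xd + ys * yd
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)
    return theta, tx, ty
```

In practice the tiepoint subset is chosen to minimize the residual of exactly this kind of fit.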

  16. Measurement Uncertainty Relations for Discrete Observables: Relative Entropy Formulation

    NASA Astrophysics Data System (ADS)

    Barchielli, Alberto; Gregoratti, Matteo; Toigo, Alessandro

    2018-02-01

    We introduce a new information-theoretic formulation of quantum measurement uncertainty relations, based on the notion of relative entropy between measurement probabilities. In the case of a finite-dimensional system and for any approximate joint measurement of two target discrete observables, we define the entropic divergence as the maximal total loss of information occurring in the approximation at hand. For fixed target observables, we study the joint measurements minimizing the entropic divergence, and we prove the general properties of its minimum value. Such a minimum is our uncertainty lower bound: the total information lost by replacing the target observables with their optimal approximations, evaluated at the worst possible state. The bound turns out to be also an entropic incompatibility degree, that is, a good information-theoretic measure of incompatibility: indeed, it vanishes if and only if the target observables are compatible, it is state-independent, and it enjoys all the invariance properties which are desirable for such a measure. In this context, we point out the difference between general approximate joint measurements and sequential approximate joint measurements; to do this, we introduce a separate index for the tradeoff between the error of the first measurement and the disturbance of the second one. By exploiting the symmetry properties of the target observables, exact values, lower bounds and optimal approximations are evaluated in two different concrete examples: (1) a couple of spin-1/2 components (not necessarily orthogonal); (2) two Fourier conjugate mutually unbiased bases in prime power dimension. Finally, the entropic incompatibility degree straightforwardly generalizes to the case of many observables, still maintaining all its relevant properties; we explicitly compute it for three orthogonal spin-1/2 components.
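    The elementary building block of this formulation, the relative entropy between two discrete measurement probability distributions, is directly computable. A minimal sketch (the entropic divergence itself further requires maximizing over states and minimizing over joint measurements, which is not attempted here):

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete probability
    distributions, with the convention 0 * log(0/q) = 0."""
    d = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            d += pi * math.log(pi / qi)
    return d
```

The divergence vanishes exactly when the two distributions coincide, which is what lets the minimum over approximations serve as an incompatibility degree: it is zero if and only if the target observables admit an exact joint measurement.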

  17. I See Your Point: Infants under 12 Months Understand that Pointing Is Communicative

    ERIC Educational Resources Information Center

    Krehm, Madelaine; Onishi, Kristine H.; Vouloumanos, Athena

    2014-01-01

    Do young infants understand that pointing gestures allow the pointer to change the information state of a recipient? We used a third-party experimental scenario to examine whether 9- and 11-month-olds understand that a pointer's pointing gesture can inform a recipient about a target object. When the pointer pointed to a target, infants…

  18. Do domestic dogs interpret pointing as a command?

    PubMed

    Scheider, Linda; Kaminski, Juliane; Call, Josep; Tomasello, Michael

    2013-05-01

    Domestic dogs comprehend human gestural communication flexibly, particularly the pointing gesture. Here, we examine whether dogs interpret pointing informatively, that is, as simply providing information, or rather as a command, for example, ordering them to move to a particular location. In the first study a human pointed toward an empty cup. In one manipulation, the dog either knew or did not know that the designated cup was empty (and that the other cup actually contained the food). In another manipulation, the human (as authority) either did or did not remain in the room after pointing. Dogs ignored the human's gesture if they had better information, irrespective of the authority's presence. In the second study, we varied the level of authority of the person pointing. Sometimes this person was an adult, and sometimes a young child. Dogs followed children's pointing just as frequently as they followed adults' pointing (and ignored the dishonest pointing of both), suggesting that the level of authority did not affect their behavior. Taken together these studies suggest that dogs do not see pointing as an imperative command ordering them to a particular location. It is still not totally clear, however, if they interpret it as informative or in some other way.

  19. A PLM-based automated inspection planning system for coordinate measuring machine

    NASA Astrophysics Data System (ADS)

    Zhao, Haibin; Wang, Junying; Wang, Boxiong; Wang, Jianmei; Chen, Huacheng

    2006-11-01

    With the rapid progress of Product Lifecycle Management (PLM) in the manufacturing industry, the automatic generation of product inspection plans and their integration with other activities in the product lifecycle play important roles in quality control. However, the techniques for these purposes lag behind CAD/CAM techniques. Therefore, an automatic inspection planning system for the Coordinate Measuring Machine (CMM) was developed to improve the automation of measurement, based on the integration of the inspection system in PLM. Feature information representation is achieved using a PLM central database; the measuring strategy is optimized through the integration of multiple sensors; a reasonable number and distribution of inspection points are calculated and designed under the guidance of statistical theory and a synthesis distribution algorithm; and a collision avoidance method is proposed to generate collision-free inspection paths with high efficiency. Information mapping is performed between Neutral Interchange Files (NIFs), such as STEP, DML, DMIS and XML, to realize information integration with other activities in the product lifecycle, such as design, manufacturing and inspection execution. Simulation was carried out to demonstrate the feasibility of the proposed system. As a result, the inspection process becomes simpler, and good results can be obtained from the integration in PLM.

  20. Graz kHz SLR LIDAR: first results

    NASA Astrophysics Data System (ADS)

    Kirchner, Georg; Koidl, Franz; Kucharski, Daniel; Pachler, Walther; Seiss, Matthias; Leitgeb, Erich

    2009-05-01

    The Satellite Laser Ranging (SLR) Station Graz routinely measures distances to satellites with a 2 kHz laser, achieving an accuracy of 2-3 mm. Using this available equipment, we developed - and added as a byproduct - a kHz SLR LIDAR for the Graz station: photons of each transmitted laser pulse are backscattered from clouds, atmospheric layers, aircraft vapor trails etc. An additional 10 cm diameter telescope - installed on our main telescope mount - and a Single-Photon Counting Module (SPCM) detect these photons. Using an ISA-bus based FPGA card - developed in Graz for kHz SLR operation - these detection times are stored with 100 ns resolution (15 m slots in distance). Event times of any number of laser shots can be accumulated in up to 4096 counters (corresponding to > 60 km distance). The LIDAR distances are stored together with epoch time and telescope pointing information; any reflection point is therefore determined with 3D coordinates, with 15 m resolution in distance and the angular precision of the laser telescope pointing. First test results on clouds in full daylight conditions - accumulating up to several hundred laser shots per measurement - yielded high LIDAR data rates (> 100 points per second) and excellent detection of clouds (up to 10 km distance at the moment). Our ultimate goal is to operate the LIDAR automatically and in parallel with the standard SLR measurements, during day and night, collecting LIDAR data as a byproduct and without any additional expense.
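    The 100 ns event-time resolution corresponds to 15 m range slots because light covers roughly 30 m of two-way path in 100 ns. A sketch of accumulating detection times into range counters (our illustration of the bookkeeping, not the station's FPGA firmware):

```python
SPEED_OF_LIGHT_M_PER_NS = 0.299792458
BIN_NS = 100  # 100 ns of two-way travel time -> 15 m of one-way range

def range_from_time(t_ns):
    """One-way distance to the reflection point for a photon detected
    t_ns after the laser fire (factor 0.5 for the round trip)."""
    return 0.5 * SPEED_OF_LIGHT_M_PER_NS * t_ns

def accumulate(times_ns, n_bins=4096):
    """Histogram photon detection times into 100 ns (15 m) range slots."""
    counters = [0] * n_bins
    for t in times_ns:
        idx = int(t // BIN_NS)
        if 0 <= idx < n_bins:
            counters[idx] += 1
    return counters
```

With 4096 counters the histogram spans 4096 × 15 m ≈ 61 km, consistent with the > 60 km coverage quoted above.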

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cornwell, Paris A; Bunn, Jeffrey R; Schmidlin, Joshua E

    The December 2010 version of the guide, ORNL/TM-2008/159, by Jeff Bunn, Josh Schmidlin, Camden Hubbard, and Paris Cornwell, has been further revised due to a major change in the GeoMagic Studio software for constructing a surface model. The Studio software update also includes a plug-in module to operate the FARO Scan Arm. Other revisions for clarity were also made. The purpose of this revision document is to guide the reader through the process of laser alignment used by NRSF2 at HFIR and VULCAN at SNS. This system was created to increase the spatial accuracy of the measurement points in a sample, reduce the neutron time used for alignment, improve experiment planning, and reduce operator error. The need for spatial resolution has been driven by the reduction in gauge volumes to the sub-millimeter level, steep strain gradients in some samples, and requests to mount multiple samples within a few days for relating data from each sample to a common sample coordinate system. The first step in this process involves mounting the sample on an indexer table in a laboratory set up for offline sample mounting and alignment in the same manner it would be mounted at either instrument. In the shared laboratory, a FARO ScanArm is used to measure the coordinates of points on the sample surface ('point cloud'), specific features and fiducial points. A Sample Coordinate System (SCS) needs to be established first. This is an advantage of the technique because the SCS can be defined in such a way as to facilitate simple definition of measurement points within the sample. Next, samples are typically mounted to a frame of 80/20 and fiducial points are attached to the sample or frame, then measured in the established sample coordinate system. The laser scan probe on the ScanArm can then be used to scan in an 'as-is' model of the sample as well as mounting hardware.
GeoMagic Studio 12 is the software package used to construct the model from the point cloud the scan arm creates. Once model, fiducial, and measurement files are created, a special program called SScanSS combines the information and, by simulating the sample on the diffractometer, can help plan the experiment before using neutron time. Finally, the sample is mounted on the relevant stress measurement instrument and the fiducial points are measured again. In the HFIR beam room, a laser tracker is used in conjunction with a program called CAM2 to measure the fiducial points in the NRSF2 instrument's sample positioner coordinate system. SScanSS is then used again to perform a coordinate system transformation of the measurement file locations to the sample positioner coordinate system. A procedure file is then written with the coordinates in the sample positioner coordinate system for the desired measurement locations. This file is often called a script or command file and can be further modified using Excel. It is very important to note that this process is not a linear one; rather, it is often iterative. Many of the steps in this guide are interdependent on one another. It is very important to discuss the process as it pertains to the specific sample being measured. What works with one sample may not necessarily work for another. This guide attempts to provide a typical work flow that has been successful in most cases.

  2. Reliable clinical serum analysis with reusable electrochemical sensor: Toward point-of-care measurement of the antipsychotic medication clozapine.

    PubMed

    Kang, Mijeong; Kim, Eunkyoung; Winkler, Thomas E; Banis, George; Liu, Yi; Kitchen, Christopher A; Kelly, Deanna L; Ghodssi, Reza; Payne, Gregory F

    2017-09-15

    Clozapine is one of the most promising medications for managing schizophrenia but it is under-utilized because of the challenges of maintaining serum levels in a safe therapeutic range (1-3 μM). Timely measurement of serum clozapine levels has been identified as a barrier to the broader use of clozapine, and such measurement is challenging due to the complexity of serum samples. We demonstrate a robust and reusable electrochemical sensor with a graphene-chitosan composite for rapidly measuring serum levels of clozapine. Our electrochemical measurements in clinical serum from clozapine-treated and clozapine-untreated schizophrenia groups correlate well with centralized laboratory analysis, both for the readily detected uric acid and for clozapine, which is present at 100-fold lower concentration. The benefits of our electrochemical measurement approach for serum clozapine monitoring are: (i) rapid measurement (≈20 min) without serum pretreatment; (ii) appropriate selectivity and sensitivity (limit of detection 0.7 μM); (iii) reusability of an electrode over several weeks; and (iv) rapid reliability testing to detect common error-causing problems. This simple and rapid electrochemical approach for serum clozapine measurements should provide clinicians with the timely point-of-care information required to adjust dosages and personalize the management of schizophrenia. Copyright © 2017 Elsevier B.V. All rights reserved.
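    A limit of detection such as the 0.7 μM quoted above is typically estimated from a calibration line and the noise of blank measurements. A generic sketch of the common 3·σ(blank)/slope convention (illustrative only; the abstract does not state which convention the authors used):

```python
import statistics

def calibration_slope(concentrations, signals):
    """Least-squares slope of sensor signal versus analyte concentration."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(signals) / n
    num = sum((x - mx) * (y - my) for x, y in zip(concentrations, signals))
    den = sum((x - mx) ** 2 for x in concentrations)
    return num / den

def limit_of_detection(blank_signals, concentrations, signals):
    """LOD = 3 * (standard deviation of blank signal) / calibration slope."""
    return 3.0 * statistics.stdev(blank_signals) / calibration_slope(
        concentrations, signals)
```

A steeper calibration slope or quieter blanks both lower the detection limit, which is why electrode reusability over weeks matters: it allows recalibration against the same sensing surface.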

  3. Determination of Jet Noise Radiation Source Locations using a Dual Sideline Cross-Correlation/Spectrum Technique

    NASA Technical Reports Server (NTRS)

    Allen, C. S.; Jaeger, S. M.

    1999-01-01

    The goal of our efforts is to extrapolate near-field jet noise measurements to the geometric far field, where the jet noise sources appear to radiate from a single point. To accomplish this, information about the location of noise sources in the jet plume, the radiation patterns of the noise sources, and the sound pressure level distribution of the radiated field must be obtained. Since source locations and radiation patterns cannot be found with simple single-microphone measurements, a more complicated method must be used.

  4. Method for measuring multiple scattering corrections between liquid scintillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.
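    Time-of-flight methods convert a neutron's flight time over a known path into kinetic energy; for fission-spectrum neutrons a classical (non-relativistic) conversion suffices. A minimal sketch (our illustration of the kinematics, not the authors' analysis code):

```python
NEUTRON_MASS_KG = 1.67492749804e-27
EV_PER_JOULE = 1.0 / 1.602176634e-19

def neutron_energy_ev(path_m, tof_s):
    """Classical kinetic energy in eV of a neutron that covers path_m
    meters in tof_s seconds: E = (1/2) m v^2 with v = d / t."""
    v = path_m / tof_s
    return 0.5 * NEUTRON_MASS_KG * v * v * EV_PER_JOULE
```

A thermal neutron at 2200 m/s comes out near the textbook 0.0253 eV, and later arrivals map to lower energies, which is how an energy threshold is turned into a time window when counting scattered neutrons between detectors.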

  5. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    NASA Astrophysics Data System (ADS)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAV) have seen widespread civil application, including survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS data) for improved ground-truth validation. For example, to obtain an accurate 3D and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on the Global Positioning System (GPS). However, GPS surveys are very time-consuming, and longer monitoring time series in particular require repeated GPS surveys. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technology, built on single nodes interlinked in a Wireless Sensor Network (WSN) star topology for ground-based information collection and later integration with surveying data obtained by UAV. Nodes are designed to be visible from the air and to resist extreme weather conditions with low power consumption. In addition, nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of environmental monitoring applications. For our purpose, the WSN transmits the environmental data via 3G/GPRS to a database on a regular time basis. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.

  6. Development of Field Information Monitoring System Based on the Internet of Things

    NASA Astrophysics Data System (ADS)

    Cai, Ken; Liang, Xiaoying; Wang, Keqiang

    With the rapid development and wide application of electronics, communication and embedded system technologies, global agriculture is changing from traditional agriculture, which improved production mainly by increasing labor and agricultural inputs, to a new stage of modern agriculture characterized by low input, high efficiency, and real-time, accurate operation. At the same time, research and development of the Internet of Things, an information network that connects objects, perceives them fully, and provides reliable transmission and intelligent processing of information, allows us to obtain real-time information about almost anything. Applying the Internet of Things to online field information monitoring is an effective alternative to present wired sensor monitoring systems, which have considerable disadvantages such as high cost and the difficulty of laying cables. In this paper, a novel field information monitoring system based on the Internet of Things is proposed. It satisfies the requirements of multi-point measurement, mobility and convenience in the field information monitoring process. The overall structure of the system is given, and the key hardware and software designs are described. These studies expand current field information measurement methods and strengthen the application of the Internet of Things.

  7. Application of Semantic Tagging to Generate Superimposed Information on a Digital Encyclopedia

    NASA Astrophysics Data System (ADS)

    Garrido, Piedad; Tramullas, Jesus; Martinez, Francisco J.

    Several works in the literature address the automatic or semi-automatic processing of textual documents with historic information using free software technologies. However, more research is needed to integrate analysis of the context and to cover the peculiarities of the Spanish language from a semantic point of view. This research work proposes a novel knowledge-based strategy combining subject-centric computing, a topic-oriented approach, and superimposed information. Its subsequent combination with artificial intelligence techniques led to an automatic analysis after implementing a made-to-measure interpreted algorithm which, in turn, produced a good number of associations and events with 90% reliability.

  8. Inhomogeneous point-process entropy: An instantaneous measure of complexity in discrete systems

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-01

    Measures of entropy have been widely used to characterize complexity, particularly in physiological dynamical systems modeled in discrete time. Current approaches associate these measures with finite single values within an observation window, and thus cannot characterize the system evolution at each moment in time. Here, we propose a new definition of approximate and sample entropy based on the inhomogeneous point-process theory. The discrete time series is modeled through probability density functions, which characterize and predict the time until the next event occurs as a function of the past history. Laguerre expansions of the Wiener-Volterra autoregressive terms account for the long-term nonlinear information. As the proposed measures of entropy are instantaneously defined through probability functions, the novel indices are able to provide instantaneous tracking of the system complexity. The new measures are tested on synthetic data, as well as on real data gathered from heartbeat dynamics of healthy subjects and patients with congestive heart failure, and gait recordings from short walks of young and elderly subjects. Results show that instantaneous complexity is able to effectively track the system dynamics and is not affected by statistical noise properties.
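    The instantaneous indices described above generalize the conventional window-based sample entropy. As a point of reference, here is a minimal pure-Python sketch of that conventional measure (standard SampEn, not the paper's point-process estimator; the embedding dimension and tolerance fraction are illustrative defaults):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Conventional sample entropy of a 1-D series: the window-based
    measure that the instantaneous point-process indices generalize.
    r is a tolerance expressed as a fraction of the series' standard deviation."""
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    tol = r * sd

    def matches(length):
        # Count template pairs within tolerance (Chebyshev distance),
        # using the same number of templates for both lengths.
        t = [x[i:i + length] for i in range(n - m)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(a - b) for a, b in zip(t[i], t[j])) <= tol
        )

    b, a = matches(m), matches(m + 1)
    return math.log(b / a) if a > 0 and b > 0 else float("inf")
```

    A perfectly regular series yields SampEn of zero, while irregular series yield larger values; the paper's contribution is replacing the single windowed value with an index defined at every instant.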

  9. Imaging samples in silica aerogel using an experimental point spread function.

    PubMed

    White, Amanda J; Ebel, Denton S

    2015-02-01

    Light microscopy is a powerful tool that allows for many types of samples to be examined in a rapid, easy, and nondestructive manner. Subsequent image analysis, however, is compromised by distortion of signal by instrument optics. Deconvolution of images prior to analysis allows for the recovery of lost information by procedures that utilize either a theoretically or experimentally calculated point spread function (PSF). Using a laser scanning confocal microscope (LSCM), we have imaged whole impact tracks of comet particles captured in silica aerogel, a low density, porous SiO2 solid, by the NASA Stardust mission. In order to understand the dynamical interactions between the particles and the aerogel, precise grain location and track volume measurement are required. We report a method for measuring an experimental PSF suitable for three-dimensional deconvolution of imaged particles in aerogel. Using fluorescent beads manufactured into Stardust flight-grade aerogel, we have applied a deconvolution technique standard in the biological sciences to confocal images of whole Stardust tracks. The incorporation of an experimentally measured PSF allows for better quantitative measurements of the size and location of single grains in aerogel and more accurate measurements of track morphology.

  10. Measurement Uncertainty of Dew-Point Temperature in a Two-Pressure Humidity Generator

    NASA Astrophysics Data System (ADS)

    Martins, L. Lages; Ribeiro, A. Silva; Alves e Sousa, J.; Forbes, Alistair B.

    2012-09-01

    This article describes the measurement uncertainty evaluation of the dew-point temperature when using a two-pressure humidity generator as a reference standard. The estimation of the dew-point temperature involves the solution of a non-linear equation for which iterative solution techniques, such as the Newton-Raphson method, are required. Previous studies have already been carried out using the GUM method and the Monte Carlo method but have not discussed the impact of the approximate numerical method used to provide the temperature estimation. One of the aims of this article is to take this approximation into account. Following the guidelines presented in the GUM Supplement 1, two alternative approaches can be developed: the forward measurement uncertainty propagation by the Monte Carlo method when using the Newton-Raphson numerical procedure; and the inverse measurement uncertainty propagation by Bayesian inference, based on prior available information regarding the usual dispersion of values obtained by the calibration process. The measurement uncertainties obtained using these two methods can be compared with previous results. Other relevant issues concerning this research are the broad application to measurements that require hygrometric conditions obtained from two-pressure humidity generators and, also, the ability to provide a solution that can be applied to similar iterative models. The research also studied the factors influencing both the use of the Monte Carlo method (such as the seed value and the convergence parameter) and the inverse uncertainty propagation using Bayesian inference (such as the pre-assigned tolerance, prior estimate, and standard deviation) in terms of their accuracy and adequacy.
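    The two numerical ingredients described above, a Newton-Raphson solve of a non-linear vapour-pressure equation and forward Monte Carlo propagation of input uncertainty through that solve, can be sketched together. This is a toy illustration only: the Magnus saturation-vapour-pressure formula stands in for the two-pressure generator model, and the input values and uncertainty are hypothetical.

```python
import math
import random

def svp(t):
    """Magnus approximation of saturation vapour pressure over water (hPa);
    an illustrative stand-in for the two-pressure generator model."""
    return 6.112 * math.exp(17.62 * t / (243.12 + t))

def dew_point(e, t0=10.0, tol=1e-10):
    """Newton-Raphson solve of svp(Td) = e for the dew-point temperature Td (degC)."""
    t = t0
    for _ in range(100):
        f = svp(t) - e
        df = svp(t) * 17.62 * 243.12 / (243.12 + t) ** 2  # analytic derivative
        step = f / df
        t -= step
        if abs(step) < tol:
            break
    return t

def mc_dew_point(e0, u_e, n=5000, seed=1):
    """Forward Monte Carlo propagation (GUM Supplement 1 style): sample the
    vapour pressure, push each draw through the Newton solve, and summarize
    the resulting dew-point distribution."""
    rng = random.Random(seed)
    td = [dew_point(rng.gauss(e0, u_e)) for _ in range(n)]
    mean = sum(td) / n
    std = (sum((s - mean) ** 2 for s in td) / (n - 1)) ** 0.5
    return mean, std
```

    Because each Monte Carlo draw runs the full iterative solver, the approximation error of the numerical method is automatically included in the propagated uncertainty, which is the point the article raises about earlier GUM-based treatments.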

  11. Many worlds in perspective

    NASA Astrophysics Data System (ADS)

    Päs, Heinrich

    2017-08-01

    A minimal approach to the measurement problem and the quantum-to-classical transition assumes a universally valid quantum formalism, i.e. unitary time evolution governed by a Schrödinger-type equation. As was pointed out long ago, in this view the measurement process can be described by decoherence, which results in a "Many-Worlds" or "Many-Minds" scenario according to Everett and Zeh. A tacit assumption for decoherence to proceed, however, is that there exists incomplete information about the environment our object system gets entangled with in the measurement process. This paper addresses the question of where this information is traced out and, by adopting recent approaches to modeling consciousness in neuroscience, argues that a rigorous interpretation results in a perspectival notion of the quantum-to-classical transition. The information that is or is not available in the consciousness of the observer is crucial for the definition of the environment (i.e. the unknown degrees of freedom in the remainder of the Universe). As such the Many-Worlds interpretation, while being difficult or impossible to probe in physics, may become testable in psychology.

  12. Compound-Specific Isotopic Analysis of Meteoritic Amino Acids as a Tool for Evaluating Potential Formation Pathways

    NASA Technical Reports Server (NTRS)

    Elsila, Jamie E.; Burton, Aaron S.; Callahan, Michael C.; Charnley, Steven B.; Glavin, Daniel P.; Dworkin, Jason P.

    2012-01-01

    Measurements of stable hydrogen, carbon, and nitrogen isotopic ratios (delta D, delta C-13, delta N-15) of organic compounds can reveal information about their origin and formation pathways. Several formation mechanisms and environments have been postulated for the amino acids detected in carbonaceous chondrites. As each proposed mechanism utilizes different precursor molecules, the isotopic signatures of the resulting amino acids may point towards the most likely of these proposed pathways. The technique of gas chromatography coupled with mass spectrometry and isotope ratio mass spectrometry provides compound-specific structural and isotopic information from a single splitless injection, enhancing the amount of information gained from small amounts of precious samples such as carbonaceous chondrites. We have applied this technique to measure the compound-specific C, N, and H isotopic ratios of amino acids from seven CM and CR carbonaceous chondrites. We are using these measurements to evaluate predictions of expected isotopic enrichments from potential formation pathways and environments, leading to a better understanding of the origin of these compounds.

  13. Efficient measurement of point-to-set correlations and overlap fluctuations in glass-forming liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berthier, Ludovic; Charbonneau, Patrick; Department of Physics, Duke University, Durham, North Carolina 27708

    Cavity point-to-set correlations are real-space tools to detect the roughening of the free-energy landscape that accompanies the dynamical slowdown of glass-forming liquids. Measuring these correlations in model glass formers remains, however, a major computational challenge. Here, we develop a general parallel-tempering method that provides orders-of-magnitude improvement for sampling and equilibrating configurations within cavities. We apply this improved scheme to the canonical Kob-Andersen binary Lennard-Jones model for temperatures down to the mode-coupling theory crossover. Most significant improvements are noted for small cavities, which have thus far been the most difficult to study. This methodological advance also enables us to study a broader range of physical observables associated with thermodynamic fluctuations. We measure the probability distribution of overlap fluctuations in cavities, which displays a non-trivial temperature evolution. The corresponding overlap susceptibility is found to provide a robust quantitative estimate of the point-to-set length scale requiring no fitting. By resolving spatial fluctuations of the overlap in the cavity, we also obtain quantitative information about the geometry of overlap fluctuations. We can thus examine in detail how the penetration length as well as its fluctuations evolve with temperature and cavity size.

  14. Stereovision-based integrated system for point cloud reconstruction and simulated brain shift validation.

    PubMed

    Yang, Xiaochen; Clements, Logan W; Luo, Ma; Narasimhan, Saramati; Thompson, Reid C; Dawant, Benoit M; Miga, Michael I

    2017-07-01

    Intraoperative soft tissue deformation, referred to as brain shift, compromises the application of current image-guided surgery navigation systems in neurosurgery. A computational model driven by sparse data has been proposed as a cost-effective method to compensate for cortical surface and volumetric displacements. We present a mock environment developed to acquire stereo images from a tracked operating microscope and to reconstruct three-dimensional point clouds from these images. A reconstruction error of 1 mm is estimated by using a phantom with a known geometry and independently measured deformation extent. The microscope is tracked via an attached tracking rigid body that facilitates the recording of the position of the microscope via a commercial optical tracking system as it moves during the procedure. Point clouds, reconstructed under different microscope positions, are registered into the same space to compute the feature displacements. Using our mock craniotomy device, realistic cortical deformations are generated. When comparing our tracked microscope stereo-pair measure of mock vessel displacements to that of the measurement determined by the independent optically tracked stylus marking, the displacement error was [Formula: see text] on average. These results demonstrate the practicality of using a tracked stereoscopic microscope as an alternative to laser range scanners to collect sufficient intraoperative information for brain shift correction.

  15. Three-dimensional sensing methodology combining stereo vision and phase-measuring profilometry based on dynamic programming

    NASA Astrophysics Data System (ADS)

    Lee, Hyunki; Kim, Min Young; Moon, Jeon Il

    2017-12-01

    Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence, or 2π-ambiguity, problem. Although a sensing method that combines well-known stereo vision and the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters on measurement performance and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.

  16. A Study of Cognitive Load for Enhancing Student’s Quantitative Literacy in Inquiry Lab Learning

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahman, T.; Alifiani, D. P.; Khoerunnisa, R. S.

    2017-09-01

    Students often find it difficult to appreciate the relevance of quantitative analysis and concept attainment in the science class. This study measured student cognitive load during an inquiry lab on the respiratory system aimed at improving quantitative literacy. Participants were 40 11th graders from a senior high school in Indonesia. After the lesson, students' perceived mental effort in completing the learning tasks was measured by 28 self-report items on a 4-point Likert scale. A Task Complexity Worksheet was used to assess the processing of quantitative information, and a paper-based test was applied to assess participants' concept achievement. The results showed that the inquiry instruction induced relatively low mental effort, high information processing, and high concept achievement.

  17. String Theory - Using Kites for Introducing Remote Sensing and In-Situ Measurement Concepts

    NASA Astrophysics Data System (ADS)

    Bland, G.; Bydlowski, D.; Henry, A.

    2016-12-01

    Kites are often overlooked as a practical and accessible tool for gaining an aerial perspective. This perspective can be used as a proxy for the vantage points of spacecraft and aircraft, particularly when introducing the concepts of remote sensing and in-situ measurements that form the foundation of much of NASA's Earth science research. Kites, combined with miniature cameras and instrumentation, can easily and affordably be used in formal and informal learning environments to demonstrate techniques and develop skills related to gathering information from above. Additionally, collaborative team work can play an important role, particularly in the form of synthesizing flight operations. Hands-on technology exploration can be a component as well, as there are numerous possibilities for creating sensor systems, line-handling techniques, and understanding kite flight itself.

  18. In Situ Multi-Species (O2, N2, Fuel, Other) Fiber Optic Sensor for Fuel Tank Ullage

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet

    2007-01-01

    A rugged and compact fiber optic sensor system for in situ real-time measurement of nitrogen (N2), oxygen (O2), hydrocarbon (HC) fuel vapors, and other gases has been developed over the past several years at Glenn Research Center. The intrinsically-safe, solid-state fiber optic sensor system provides a 1% precision measurement (by volume) of multiple gases in a 5-sec time window. The sensor has no consumable parts to wear out and requires less than 25 W of electrical power to operate. The sensor head is rugged and compact and is ideal for use in harsh environments such as inside an aircraft fuel tank, or as a feedback sensor in the vent-box of an on-board inert gas generation system (OBIGGS). Multiple sensor heads can be monitored with a single optical detection unit for a cost-effective multi-point sensor system. The present sensor technology is unique in its ability to measure N2 concentration directly, and in its ability to differentiate different types of HC fuels. The present sensor system provides value-added aircraft safety information by simultaneously and directly measuring the nitrogen-oxygen-fuel triplet, which provides the following advantages: (1) information regarding the extent of inerting by N2, (2) information regarding the chemical equivalence ratio, (3) information regarding the composition of the aircraft fuel, and (4) a self-consistent calibration utilizing a single sensor for all species. Using the extra information made available by this sensor permits the ignitability of a fuel-oxidizer mixture to be more accurately characterized, which may permit a reduction in the amount of inerting required on a real-time basis while still maintaining a fire-safe fuel tank. This translates to an increase in fuel tank fire-safety through a better understanding of the physics of fuel ignition and, at the same time, a reduction in compressed bleed air usage and concomitant aircraft operational costs over the long run.
The present fiber optic sensor can also be used as a false-alarm-free engine/hidden/cargo space fire detector (by measuring increased CO2 and CO, and decreased O2), a multi-point in situ measurement and certification system for halogenated-compound fire protection systems, and for the testing and certification of other aircraft safety sensor systems. The technology (LEW-17826-1) developed in the present sensor system is patent pending.

  19. Continuous section extraction and over-underbreak detection of tunnel based on 3D laser technology and image analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Wang, Zhiwei; Han, Ya; Li, Shuang; Zhang, Xin

    2015-03-01

    In order to ensure safety, long-term stability, and quality control in modern tunneling operations, the acquisition of geotechnical information about encountered rock conditions and detailed installed-support information is required. The limited space and time in an operational tunnel environment make acquiring data challenging. Laser scanning in a tunneling environment, however, shows great potential. The surveying and mapping of tunnels are crucial for optimal use after construction and in routine inspections. Most of these applications focus on the geometric information of the tunnels extracted from the laser scanning data. Two kinds of applications are widely discussed: deformation measurement and feature extraction. Traditional deformation measurement in an underground environment is performed with a series of permanent control points installed around the profile of an excavation, which is unsuitable for a global consideration of the investigated area. Using laser scanning for deformation analysis provides many benefits compared to traditional monitoring techniques. The change in profile can be fully characterized, and areas of anomalous movement can easily be separated from overall trends due to the high density of the point cloud data. Furthermore, monitoring with a laser scanner does not require the permanent installation of control points, so monitoring can begin more quickly after excavation; the scanning is also non-contact, so no damage is done by installing temporary control points. The main drawback of using laser scanning for deformation monitoring is that the point accuracy of the original data is generally of the same magnitude as the smallest deformations to be measured. To overcome this, statistical techniques and three-dimensional image processing techniques for the point clouds must be developed.
To safely, effectively, and easily perform over/underbreak detection of roadways and to solve roadway data-collection difficulties, this paper presents a new method of continuous section extraction and over/underbreak detection based on 3D laser scanning technology and image processing. The method is divided into three steps: Canny edge detection, local axis fitting, and continuous section extraction with over/underbreak detection. First, after Canny edge detection, a least-squares curve-fitting method achieves local fitting of the axis. Then the attitude of the local roadway is adjusted so that the roadway axis is consistent with the extraction reference direction, and sections are extracted along that direction. Finally, the actual cross-section is compared with the design cross-section to complete overbreak detection. Experimental results show that the proposed method has a great advantage in computational cost and ensures orthogonal cross-section intercepts compared with traditional detection methods.
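    The local axis fitting step relies on ordinary least-squares curve fitting. A self-contained sketch via the normal equations follows; the polynomial model, degree, and sample points are illustrative assumptions, since the paper does not specify its basis functions:

```python
def polyfit_ls(xs, ys, deg=2):
    """Least-squares polynomial fit via the normal equations A^T A c = A^T y
    (A is the Vandermonde matrix), solved with Gaussian elimination.
    Pure-Python sketch of the kind of local axis fitting described above."""
    m = deg + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    # Forward elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    # Back substitution.
    coeffs = [0.0] * m
    for r in range(m - 1, -1, -1):
        s = aty[r] - sum(ata[r][c] * coeffs[c] for c in range(r + 1, m))
        coeffs[r] = s / ata[r][r]
    return coeffs  # [c0, c1, c2] for c0 + c1*x + c2*x^2
```

    In practice the axis would be fit segment by segment along the tunnel, with each local fit supplying the reference direction for extracting the corresponding cross-sections.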

  20. Equations for obtaining melting points for the ternary system ethylene glycol/sodium chloride/water and their application to cryopreservation.

    PubMed

    Woods, E J; Zieger, M A; Gao, D Y; Critser, J K

    1999-06-01

    The present study describes the H(2)O-NaCl-ethylene glycol ternary system by using a differential scanning calorimeter to measure melting points (T(m)) of four different ratios (R) of ethylene glycol to NaCl and then devising equations to fit the experimental measurements. Ultimately an equation is derived which characterizes the liquidus surface above the eutectic for any R value in the system. This study focuses on ethylene glycol in part because of recent evidence indicating it may be less toxic to pancreatic islets than Me(2)SO, which is currently used routinely for islet cryopreservation. The resulting physical data and previously determined information regarding the osmotic characteristics of canine pancreatic islets are combined in a mathematical model to describe the volumetric response to equilibrium-rate freezing in varying initial concentrations of ethylene glycol. Copyright 1999 Academic Press.

  1. Wavefront measurements of phase plates combining a point-diffraction interferometer and a Hartmann-Shack sensor.

    PubMed

    Bueno, Juan M; Acosta, Eva; Schwarz, Christina; Artal, Pablo

    2010-01-20

    A dual setup composed of a point diffraction interferometer (PDI) and a Hartmann-Shack (HS) wavefront sensor was built to compare the estimates of wavefront aberrations provided by the two different and complementary techniques when applied to different phase plates. Results show that under the same experimental and fitting conditions both techniques provide similar information concerning the wavefront aberration map. When taking into account all Zernike terms up to 6th order, the maximum difference in root-mean-square wavefront error was 0.08 microm, and this reduced to 0.03 microm when lower-order terms were excluded. The effects of the pupil size and the order of the Zernike expansion used to reconstruct the wavefront were evaluated. The combination of the two techniques can accurately measure complicated phase profiles, combining the robustness of the HS and the higher resolution and dynamic range of the PDI.
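    The root-mean-square comparison between the two sensors' wavefront maps follows directly from the Zernike coefficients: for an orthonormal (Noll-normalized) expansion, the RMS of a wavefront, or of the difference between two wavefronts, is the root-sum-square of the coefficients, because cross terms integrate to zero over the pupil. A sketch (the coefficient values and the dict-of-(n, m) layout are hypothetical):

```python
import math

def rms_difference(a, b, exclude_up_to_order=0):
    """RMS wavefront-error difference from two sets of Noll-normalized
    Zernike coefficients keyed by (radial order n, azimuthal frequency m).
    Modes with n <= exclude_up_to_order are dropped, mirroring the
    comparison with and without lower-order terms."""
    keys = set(a) | set(b)
    return math.sqrt(sum((a.get(k, 0.0) - b.get(k, 0.0)) ** 2
                         for k in keys if k[0] > exclude_up_to_order))
```

    With two hypothetical coefficient sets from the PDI and the HS sensor, calling the function once with and once without the lower orders reproduces the style of the 0.08 microm vs. 0.03 microm comparison reported above.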

  2. Recurrence plots of discrete-time Gaussian stochastic processes

    NASA Astrophysics Data System (ADS)

    Ramdani, Sofiane; Bouchara, Frédéric; Lagarde, Julien; Lesne, Annick

    2016-09-01

    We investigate the statistical properties of recurrence plots (RPs) of data generated by discrete-time stationary Gaussian random processes. We analytically derive the theoretical values of the probabilities of occurrence of recurrence points and consecutive recurrence points forming diagonals in the RP, with an embedding dimension equal to 1. These results allow us to obtain theoretical values of three measures: (i) the recurrence rate (REC) (ii) the percent determinism (DET) and (iii) RP-based estimation of the ε-entropy κ(ε) in the sense of correlation entropy. We apply these results to two Gaussian processes, namely first order autoregressive processes and fractional Gaussian noise. For these processes, we simulate a number of realizations and compare the RP-based estimations of the three selected measures to their theoretical values. These comparisons provide useful information on the quality of the estimations, such as the minimum required data length and threshold radius used to construct the RP.
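    For embedding dimension 1, as in the analytic derivation above, the recurrence matrix and the two measures REC and DET reduce to a few lines. A minimal sketch (the threshold value and the absolute-difference norm are illustrative choices):

```python
def recurrence_plot(x, eps):
    """Binary recurrence matrix R[i][j] = 1 if |x_i - x_j| <= eps
    (embedding dimension 1)."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(rp):
    """REC: fraction of recurrent points in the matrix."""
    n = len(rp)
    return sum(map(sum, rp)) / (n * n)

def determinism(rp, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines
    of length >= lmin."""
    n = len(rp)
    total = sum(map(sum, rp))
    on_lines = 0
    for d in range(-(n - 1), n):
        diag = [rp[i][i + d] for i in range(max(0, -d), min(n, n - d))]
        run = 0
        for v in diag + [0]:  # trailing 0 flushes the final run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0
```

    A strictly periodic series puts every recurrent point on a long diagonal (DET = 1), whereas noise scatters isolated points and drives DET down; comparing such estimates against the paper's theoretical values is exactly the kind of check it proposes.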

  3. Raman Scattering Study of the Soft Phonon Mode in the Hexagonal Ferroelectric Crystal KNiCl3

    NASA Astrophysics Data System (ADS)

    Machida, Ken-ichi; Kato, Tetsuya; Chao, Peng; Iio, Katsunori

    1997-10-01

    Raman spectra of some phonon modes of the hexagonal ferroelectric crystal KNiCl3 are obtained in the temperature range between 290 K and 590 K, which includes the structural phase transition point T2 (=561 K) at which previous measurements of the dielectric constant and spontaneous polarization as a function of temperature had shown that KNiCl3 undergoes a transition between polar phases II and III. An optical birefringence measurement carried out as a complement to the present Raman scattering revealed that this transition is of second order. Towards this transition point, the totally symmetric phonon mode with the lowest frequency observed in the room-temperature phase was found to soften with increasing temperature. The present results provide new information on the phase-transition mechanism and the space groups of the higher (II)- and lower (III)-symmetric phases around T2.

  4. Two-Year Outcomes from a Randomized Controlled Trial of Minimally Invasive Sacroiliac Joint Fusion vs. Non-Surgical Management for Sacroiliac Joint Dysfunction.

    PubMed

    Polly, David W; Swofford, John; Whang, Peter G; Frank, Clay J; Glaser, John A; Limoni, Robert P; Cher, Daniel J; Wine, Kathryn D; Sembrano, Jonathan N

    2016-01-01

    Sacroiliac joint (SIJ) dysfunction is an important and underappreciated cause of chronic low back pain. The objective was to prospectively and concurrently compare outcomes after surgical and non-surgical treatment for chronic SIJ dysfunction. One hundred and forty-eight subjects with SIJ dysfunction were randomly assigned to minimally invasive SIJ fusion with triangular titanium implants (SIJF, n = 102) or non-surgical management (NSM, n = 46). SIJ pain (measured with a 100-point visual analog scale, VAS), disability (measured with the Oswestry Disability Index, ODI), and quality-of-life scores were collected at baseline and at scheduled visits to 24 months. Crossover from non-surgical to surgical care was allowed after the 6-month study visit was complete. Improvements in continuous measures were compared using repeated-measures analysis of variance. The proportions of subjects with clinical improvement (SIJ pain improvement ≥20 points, ODI ≥15 points) and substantial clinical benefit (SIJ pain improvement ≥25 points or SIJ pain rating ≤35, ODI ≥18.8 points) were compared. In the SIJF group, mean SIJ pain improved rapidly and the improvement was sustained (mean improvement of 55.4 points) at month 24. The 6-month mean change in the NSM group (12.2 points) was substantially smaller than that in the SIJF group (by 38.3 points, p<.0001 for superiority). By month 24, 83.1% and 82.0% of SIJF subjects achieved clinical improvement or substantial clinical benefit, respectively, in VAS SIJ pain score. Similarly, 68.2% and 65.9% had achieved clinical improvement or substantial clinical benefit in ODI score at month 24. In the NSM group, these proportions were <10% with non-surgical treatment only. Parallel changes were seen for EQ-5D and SF-36, with larger changes in the surgery group at 6 months compared to NSM. The rate of adverse events related to SIJF was low, and only 3 subjects assigned to SIJF underwent revision surgery within the 24-month follow-up period. 
In this Level 1 multicenter prospective randomized controlled trial, minimally invasive SIJF with triangular titanium implants provided larger improvements in pain, disability and quality of life compared to NSM. Improvements after SIJF persisted to 24 months. This study was approved by a local or central IRB before any subjects were enrolled. All patients provided study-specific informed consent prior to participation.

  5. Using Different Measures, Informants, and Clinical Cut-Off Points to Estimate Prevalence of Emotional or Behavioral Disorders in Preschoolers: Effects on Age, Gender, and Ethnicity

    ERIC Educational Resources Information Center

    Feil, Edward G.; Small, Jason W.; Forness, Steven R.; Serna, Loretta R.; Kaiser, Ann P.; Hancock, Terry B.; Brooks-Gunn, Jeanne; Bryant, Donna; Kuperschmidt, Janis; Burchinal, Margaret R.; Boyce, Cheryl A.; Lopez, Michael L.

    2005-01-01

    The early identification and remediation of emotional or behavior disorders are high priorities for early-childhood researchers and are based on the assumption that problems such as school failure can be averted with early screening, prevention, and intervention. Presently, prevalence, severity, and topography of mental health needs among…

  6. Information Retrieval from SAGE II and MFRSR Multi-Spectral Extinction Measurements

    NASA Technical Reports Server (NTRS)

    Lacis, Andrew A.; Hansen, James E. (Technical Monitor)

    2001-01-01

    Direct beam spectral extinction measurements of solar radiation contain important information on atmospheric composition in a form that is essentially free from multiple scattering contributions that otherwise tend to complicate the data analysis and information retrieval. Such direct beam extinction measurements are available from the solar occultation satellite-based measurements made by the Stratospheric and Aerosol Gas Experiment (SAGE II) instrument and by ground-based Multi-Filter Shadowband Radiometers (MFRSRs). The SAGE II data provide cross-sectional slices of the atmosphere twice per orbit at seven wavelengths between 385 and 1020 nm with approximately 1 km vertical resolution, while the MFRSR data provide atmospheric column measurements at six wavelengths between 415 and 940 nm but at one minute time intervals. We apply the same retrieval technique of simultaneous least-squares fit to the observed spectral extinctions to retrieve aerosol optical depth, effective radius and variance, and ozone, nitrogen dioxide, and water vapor amounts from the SAGE II and MFRSR measurements. The retrieval technique utilizes a physical model approach based on laboratory measurements of ozone and nitrogen dioxide extinction, line-by-line and numerical k-distribution calculations for water vapor absorption, and Mie scattering constraints on aerosol spectral extinction properties. The SAGE II measurements have the advantage of being self-calibrating in that deep space provides an effective zero point for the relative spectral extinctions. The MFRSR measurements require periodic clear-day Langley regression calibration events to maintain accurate knowledge of instrument calibration.

  7. One-dimensional barcode reading: an information theoretic approach

    NASA Astrophysics Data System (ADS)

    Houni, Karim; Sawaya, Wadih; Delignon, Yves

    2008-03-01

    In the convergence context of identification technology and information-data transmission, the barcode found its place as the simplest and the most pervasive solution for new uses, especially within mobile commerce, bringing youth to this long-lived technology. From a communication theory point of view, a barcode is a singular coding based on a graphical representation of the information to be transmitted. We present an information theoretic approach for 1D image-based barcode reading analysis. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information quantity. On the basis of this theoretical criterion for a reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.
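    To make the average-mutual-information criterion concrete, here is a toy version in which the blur-and-noise acquisition channel is collapsed to a binary symmetric channel with crossover probability p_err; the actual analysis models the optical distortions explicitly, so this only illustrates the criterion itself:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def average_mutual_information(p_err, p_one=0.5):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel, a toy
    stand-in for the barcode-to-camera acquisition channel. As blur or
    noise drives p_err toward 0.5, the information per barcode module
    drops to zero, which is the spirit of the theoretical depth-of-field
    and resolution limits."""
    p_y1 = p_one * (1 - p_err) + (1 - p_one) * p_err
    return h2(p_y1) - h2(p_err)
```

    Sweeping p_err as a function of defocus distance or module size would trace out exactly the kind of information-theoretic performance curve the paper uses to define its two new measures.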

  8. One-dimensional barcode reading: an information theoretic approach.

    PubMed

    Houni, Karim; Sawaya, Wadih; Delignon, Yves

    2008-03-10

    In the convergence context of identification technology and information-data transmission, the barcode found its place as the simplest and the most pervasive solution for new uses, especially within mobile commerce, bringing youth to this long-lived technology. From a communication theory point of view, a barcode is a singular coding based on a graphical representation of the information to be transmitted. We present an information theoretic approach for 1D image-based barcode reading analysis. With a barcode facing the camera, distortions and acquisition are modeled as a communication channel. The performance of the system is evaluated by means of the average mutual information quantity. On the basis of this theoretical criterion for a reliable transmission, we introduce two new measures: the theoretical depth of field and the theoretical resolution. Simulations illustrate the gain of this approach.

  9. Four-dimensional characterization of a sheet-forming web

    DOEpatents

    Sari-Sarraf, Hamed; Goddard, James S.

    2003-04-22

    A method and apparatus are provided by which a sheet-forming web may be characterized in four dimensions. Light images of the web are recorded at a point adjacent the initial stage of the web, for example, near the headbox in a paper-forming operation. The images are digitized, and the resulting data is processed by novel algorithms to provide a four-dimensional measurement of the web. The measurements include two-dimensional spatial information, the intensity profile of the web, and the depth profile of the web. These measurements can be used to characterize the web, predict its properties and monitor production events, and to analyze and quantify headbox flow dynamics.

  10. Selective corneal optical aberration (SCOA) for customized ablation

    NASA Astrophysics Data System (ADS)

    Jean, Benedikt J.; Bende, Thomas

    2001-06-01

    Wavefront analysis still has some technical problems, which may be solved within the next few years. There are limitations to using wavefront analysis alone as a diagnostic tool for customized ablation. An ideal combination would be wavefront analysis and topography. Meanwhile, Selective Corneal Aberration is a method to visualize the optical quality of a measured corneal surface. It is based on true measured 3D elevation information from a video topometer. The values can be interpreted either using Zernike polynomials or visualized as a so-called color-coded surface quality map. This map gives a quality factor (corneal aberration) for each measured point of the cornea.

  11. Importance of Grid Center Arrangement

    NASA Astrophysics Data System (ADS)

    Pasaogullari, O.; Usul, N.

    2012-12-01

    In Digital Elevation Modeling, grid size is accepted to be the most important parameter. Regardless of the point density and/or scale of the source data, it is freely decided by the user. Most of the time, the arrangement of the grid centers is ignored; most GIS packages even omit the choice of grid center coordinates. In our study, the importance of the arrangement of grid centers is investigated. Using the analogy between "Raster Grid DEM" and "Bitmap Image", the importance of the placement of grid centers in DEMs is measured. The study has been conducted on four different grid DEMs obtained from a half ellipsoid. These grid DEMs are obtained in such a way that they are half a grid size apart from each other. The resulting grid DEMs are investigated through similarity measures. Image processing scientists use different measures to investigate the dis/similarity between images and the amount of different information they carry. The grid DEMs are projected to a finer grid in order to co-center them. Similarity measures are then applied to each grid DEM pair. These similarity measures are adapted to DEMs with band reduction and real number operations. One of the measures gives a function graph and the others give measure matrices. Application of the similarity measures to the six grid DEM pairs shows interesting results. Although these four grid DEMs are created with the same method for the same area, surprisingly, 13 out of 14 measures state that the half-grid-size-apart grid DEMs are different from each other. The results indicate that although the grid DEMs carry mutual information, they also have additional individual information. In other words, grid DEMs constructed half a grid size apart have non-redundant information.
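    Mutual information, one of the image-similarity measures such studies apply, can be sketched for two co-registered grid DEMs as follows; the half-ellipsoid surface echoes the study's test case, while the grid size and histogram bin count are arbitrary choices of this sketch:

```python
import numpy as np

def mutual_information(a, b, bins=16):
    """Mutual information (bits) between two co-registered grids,
    estimated from a joint histogram of their cell values."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of grid a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of grid b
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# A half ellipsoid sampled on a grid, and on the same grid shifted by
# half a cell -- mimicking the study's half-grid-size-apart DEMs.
u = np.linspace(-1.0, 1.0, 64)
x, y = np.meshgrid(u, u)
half_cell = (u[1] - u[0]) / 2.0
dem_a = np.sqrt(np.clip(1 - x**2 - y**2, 0.0, None))
dem_b = np.sqrt(np.clip(1 - (x + half_cell)**2 - y**2, 0.0, None))

mi_self = mutual_information(dem_a, dem_a)    # upper bound: H(dem_a)
mi_shift = mutual_information(dem_a, dem_b)   # mutual but not identical
```

    The gap between `mi_self` and `mi_shift` is exactly the "non-redundant information" the abstract refers to: the shifted DEM shares much information with the original but does not fully determine it.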

  12. Fine Grained Chaos in AdS2 Gravity

    NASA Astrophysics Data System (ADS)

    Haehl, Felix M.; Rozali, Moshe

    2018-03-01

    Quantum chaos can be characterized by an exponential growth of the thermal out-of-time-order four-point function up to a scrambling time u^*. We discuss generalizations of this statement for certain higher-point correlation functions. For concreteness, we study the Schwarzian theory of a one-dimensional time reparametrization mode, which describes two-dimensional anti-de Sitter space (AdS2) gravity and the low-energy dynamics of the Sachdev-Ye-Kitaev model. We identify a particular set of 2k-point functions, characterized as being both "maximally braided" and "k-out of time order," which exhibit exponential growth until progressively longer time scales u^{*(k)} ∼ (k-1) u^*. We suggest an interpretation as scrambling of increasingly fine grained measures of quantum information, which correspondingly take progressively longer time to reach their thermal values.

  13. Fine Grained Chaos in AdS2 Gravity.

    PubMed

    Haehl, Felix M; Rozali, Moshe

    2018-03-23

    Quantum chaos can be characterized by an exponential growth of the thermal out-of-time-order four-point function up to a scrambling time u^*. We discuss generalizations of this statement for certain higher-point correlation functions. For concreteness, we study the Schwarzian theory of a one-dimensional time reparametrization mode, which describes two-dimensional anti-de Sitter space (AdS2) gravity and the low-energy dynamics of the Sachdev-Ye-Kitaev model. We identify a particular set of 2k-point functions, characterized as being both "maximally braided" and "k-out of time order," which exhibit exponential growth until progressively longer time scales u^{*(k)} ∼ (k-1) u^*. We suggest an interpretation as scrambling of increasingly fine grained measures of quantum information, which correspondingly take progressively longer time to reach their thermal values.

  14. Small scale characterization of vine plant root zone via 3D electrical resistivity tomography and Mise-à-la-Masse method: a case study in a Bordeaux Vineyard

    NASA Astrophysics Data System (ADS)

    Mary, Benjamin; Peruzzo, Luca; Boaga, Jacopo; Schmutz, Myriam; Wu, Yuxin; Hubbard, Susan S.; Cassiani, Giorgio

    2017-04-01

    Nowadays, best viticulture practices require the joint interpretation of climate and soil data. However, information about the soil structure and subsoil processes is often lacking, as point measurements, albeit precise, cannot ensure sufficient spatial coverage and resolution. Non-invasive methods can provide spatially extensive, high-resolution information that, supported by traditional point-like data, helps complete the complex picture of subsoil static and dynamic reality. So far, very little emphasis has been given to investigating the role of soil properties, and even less the role of root activity, on winegrapes. Vine plants' root systems play an important role in providing minerals to the plants, but they also control water uptake and thus the water status of the vines, which is a key factor determining the grape quality potential. In this contribution we report on the measurements conducted since June 2016 in a vineyard near Bordeaux (France, Pessac Leognan Chateau). Two neighboring plants of different sizes were selected. In order to resolve small-scale soil variations and the physical structure of the root zone in the vicinity of the vine plants, we applied a methodology combining longitudinal 2D tomography, 3D borehole-based electrical resistivity tomography (ERT), and a variation of the mise-à-la-masse method (MALM) to assess the effect of plant roots on current injection into the ground. Time-lapse measurements are particularly informative about plant dynamics, and we focus particularly on this approach. The time-lapse 3D ERT and MALM results are presented, and the potential to assimilate these data into a hydrological model that accounts for root water uptake as a function of atmospheric conditions is discussed.

  15. Creating a Web-accessible, point-of-care, team-based information system (PoinTIS): the librarian as publisher.

    PubMed

    Burrows, S C; Moore, K M; Lemkau, H L

    2001-04-01

    The Internet has created new opportunities for librarians to develop information systems that are readily accessible at the point of care. This paper describes the multiyear process used to justify, fund, design, develop, promote, and evaluate a rehabilitation prototype of a point-of-care, team-based information system (PoinTIS) and train health care providers to use this prototype for their spinal cord injury and traumatic brain injury patient care and education activities. PoinTIS is a successful model for librarians in the twenty-first century to serve as publishers of information created or used by their parent organizations and to respond to the opportunities for information dissemination provided by recent technological advances.

  16. Negativity Bias in Dangerous Drivers.

    PubMed

    Chai, Jing; Qu, Weina; Sun, Xianghong; Zhang, Kan; Ge, Yan

    2016-01-01

    The behavioral and cognitive characteristics of dangerous drivers differ significantly from those of safe drivers. However, differences in emotional information processing have seldom been investigated. Previous studies have revealed that drivers with higher anger/anxiety trait scores are more likely to be involved in crashes and that individuals with higher anger traits exhibit stronger negativity biases when processing emotions compared with control groups. However, researchers have not explored the relationship between emotional information processing and driving behavior. In this study, we examined the emotional information processing differences between dangerous drivers and safe drivers. Thirty-eight non-professional drivers were divided into two groups according to the penalty points that they had accrued for traffic violations: 15 drivers with 6 or more points were included in the dangerous driver group, and 23 drivers with 3 or fewer points were included in the safe driver group. The emotional Stroop task was used to measure negativity biases, and both behavioral and electroencephalograph data were recorded. The behavioral results revealed stronger negativity biases in the dangerous drivers than in the safe drivers. The bias score was correlated with self-reported dangerous driving behavior. Drivers with strong negativity biases reported having been involved in more crashes compared with the less-biased drivers. The event-related potentials (ERPs) revealed that the dangerous drivers exhibited reduced P3 components when responding to negative stimuli, suggesting decreased inhibitory control of information that is task-irrelevant but emotionally salient. The influence of negativity bias provides one possible explanation of the effects of individual differences on dangerous driving behavior and traffic crashes.

  17. Negativity Bias in Dangerous Drivers

    PubMed Central

    Chai, Jing; Qu, Weina; Sun, Xianghong; Zhang, Kan; Ge, Yan

    2016-01-01

    The behavioral and cognitive characteristics of dangerous drivers differ significantly from those of safe drivers. However, differences in emotional information processing have seldom been investigated. Previous studies have revealed that drivers with higher anger/anxiety trait scores are more likely to be involved in crashes and that individuals with higher anger traits exhibit stronger negativity biases when processing emotions compared with control groups. However, researchers have not explored the relationship between emotional information processing and driving behavior. In this study, we examined the emotional information processing differences between dangerous drivers and safe drivers. Thirty-eight non-professional drivers were divided into two groups according to the penalty points that they had accrued for traffic violations: 15 drivers with 6 or more points were included in the dangerous driver group, and 23 drivers with 3 or fewer points were included in the safe driver group. The emotional Stroop task was used to measure negativity biases, and both behavioral and electroencephalograph data were recorded. The behavioral results revealed stronger negativity biases in the dangerous drivers than in the safe drivers. The bias score was correlated with self-reported dangerous driving behavior. Drivers with strong negativity biases reported having been involved in more crashes compared with the less-biased drivers. The event-related potentials (ERPs) revealed that the dangerous drivers exhibited reduced P3 components when responding to negative stimuli, suggesting decreased inhibitory control of information that is task-irrelevant but emotionally salient. The influence of negativity bias provides one possible explanation of the effects of individual differences on dangerous driving behavior and traffic crashes. PMID:26765225

  18. Perception of local three-dimensional shape.

    PubMed

    Phillips, F; Todd, J T

    1996-08-01

    The authors present a series of 4 experiments designed to test the ability to perceive local shape information. Observers were presented with various smoothly varying 3-dimensional surfaces and reported the shape index and sign of Gaussian curvature at several probe locations. The results show that observers are poor at making judgments based on these local measures, especially when the region surrounding the local point is restricted or manipulated to make it noncoherent. Shape index judgments required at least 2 degrees of context surrounding the probe location, and performance on sign-of-Gaussian-curvature judgments likewise deteriorated as the contextual information was restricted.
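    The two local measures the observers judged are standard functions of the principal curvatures; a minimal sketch, under one common sign convention (signs vary across papers) and assuming k1 >= k2:

```python
import math

def shape_index(k1, k2):
    """Shape index in [-1, 1] from principal curvatures, assuming
    k1 >= k2 and this sign convention: +1 dome, -1 cup, 0 symmetric
    saddle. Undefined (NaN) on planar points."""
    if k1 == k2:
        if k1 == 0.0:
            return float("nan")
        return 1.0 if k1 > 0 else -1.0
    return (2.0 / math.pi) * math.atan((k1 + k2) / (k1 - k2))

def gaussian_sign(k1, k2):
    """Sign of Gaussian curvature K = k1*k2: +1 elliptic (dome or cup),
    -1 hyperbolic (saddle), 0 parabolic."""
    k = k1 * k2
    return (k > 0) - (k < 0)
```

    Note that the shape index captures the qualitative type of the local surface patch while discarding its overall scale of curvedness, which is why it suits judgments of "what kind of shape is here".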

  19. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    PubMed Central

    Rutzinger, Martin; Höfle, Bernhard; Hollaus, Markus; Pfeifer, Norbert

    2008-01-01

    Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms, the proposed 3D point classification works directly on the original measurements, i.e. the acquired points. Gridding of the data, a process inherently coupled with loss of data and precision, is not necessary. The 3D properties provide particularly good separability of building and terrain points, respectively, even when these are occluded by vegetation. PMID:27873771
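    The segmentation step above can be sketched as follows; the roughness proxy, neighborhood radius, and homogeneity tolerance are illustrative stand-ins for the paper's actual parameters:

```python
import numpy as np

def region_grow(points, echo_width, radius=1.5, tol=0.5):
    """Seeded region growing over a 3D point cloud. Seeds are visited
    in order of decreasing 'roughness' (here proxied by echo width, an
    assumption of this sketch); a neighbour within `radius` joins the
    segment if its echo width stays within `tol` of the segment mean."""
    n = len(points)
    labels = np.full(n, -1)
    order = np.argsort(-echo_width)        # descending roughness proxy
    current = 0
    for seed in order:
        if labels[seed] != -1:
            continue
        labels[seed] = current
        members, stack = [seed], [seed]
        while stack:
            i = stack.pop()
            dist = np.linalg.norm(points - points[i], axis=1)
            mean_w = echo_width[members].mean()
            for j in np.where((dist <= radius) & (labels == -1))[0]:
                if abs(echo_width[j] - mean_w) <= tol:
                    labels[j] = current
                    members.append(j)
                    stack.append(j)
        current += 1
    return labels

# Two spatially separated clusters with distinct echo widths: broad
# echoes (vegetation-like) near the origin, narrow ones further away.
pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                [10, 0, 0], [11, 0, 0]], dtype=float)
widths = np.array([2.0, 2.1, 1.9, 0.2, 0.3])
labels = region_grow(pts, widths)
```

    The real procedure computes segment statistics afterwards and feeds them to a classification tree; here the sketch stops at the segment labels.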

  20. Classification by Using Multispectral Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Liao, C. T.; Huang, H. H.

    2012-07-01

    Remote sensing images are generally recorded in a two-dimensional format containing multispectral information. The semantic information is clearly visualized, so ground features can easily be recognized and classified via supervised or unsupervised classification methods. Nevertheless, multispectral images have shortcomings: they depend strongly on light conditions, and the classification results lack three-dimensional semantic information. On the other hand, LiDAR has become a main technology for acquiring high-accuracy point cloud data. The advantages of LiDAR are a high data acquisition rate, independence from light conditions, and the ability to directly produce three-dimensional coordinates. However, compared with multispectral images, its disadvantage is a shortage of multispectral information, which remains a challenge for ground feature classification from massive point cloud data. Consequently, by combining the advantages of both LiDAR and multispectral images, point cloud data with three-dimensional coordinates and multispectral information provide an integrated solution for point cloud classification. Therefore, this research acquires visible light and near-infrared images via close-range photogrammetry and matches the images automatically through a free online service to generate a multispectral point cloud. A three-dimensional affine coordinate transformation is then used to compare the data increment. Finally, given thresholds on height and color information are used for classification.
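    The final thresholding step can be sketched as follows; the NDVI formula is standard, but the class names and threshold values are illustrative assumptions, not the paper's actual rules:

```python
import numpy as np

def classify_points(xyz, red, nir, height_thresh=2.0, ndvi_thresh=0.3):
    """Threshold height and color information: vegetated points
    (high NDVI) are split by height into 'tree' vs 'grass'; the
    rest become 'non-vegetation'. Thresholds are illustrative."""
    ndvi = (nir - red) / (nir + red + 1e-9)  # normalized difference vegetation index
    z = xyz[:, 2]
    return np.where(ndvi < ndvi_thresh, "non-vegetation",
                    np.where(z >= height_thresh, "tree", "grass"))

# Three toy points: tall vegetated, low vegetated, weak NDVI.
xyz = np.array([[0.0, 0.0, 5.0],
                [0.0, 0.0, 0.2],
                [0.0, 0.0, 1.0]])
red = np.array([0.10, 0.10, 0.40])
nir = np.array([0.50, 0.50, 0.45])
labels = classify_points(xyz, red, nir)
```

    The appeal of the combined data is visible even in this sketch: neither the spectral bands nor the height alone could separate all three classes.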

  1. Compliance of Austrian tourists with prophylactic measures.

    PubMed

    Kollaritsch, H; Wiedermann, G

    1992-03-01

    Physicians dealing with prophylactic measures for tourists going to developing countries will often not be able to foresee the outcome of their recommendations. Therefore an open study of 2,627 Austrian tourists on their flight home from a tropical destination was carried out to evaluate the behaviour of typical short-term travellers with respect to different kinds of precautionary measures. 94.1% of all tourists informed themselves before travelling abroad, but a high proportion of travellers tend to contact only their travel agency or their personal friends, leading to inadequate information. Regarding the individual performance of precautionary measures, the results indicate a few principal conclusions. Among the recommended inoculations, the vaccinations against typhoid fever, poliomyelitis and tetanus are widely underestimated, the latter two in particular for adults, while compliance with the passive immunization against Hepatitis A is generally good (more than 80% of all travellers receive Hepatitis A immunoglobulins prophylactically). The most crucial point seems to be chemoprophylaxis against malaria, inasmuch as a) there seems to be a considerable lack of information about malaria-endemic areas among physicians, b) tourists tend to use the most simply applicable drug, unaware of epidemiological considerations, and c) regular intake of chemoprophylaxis declines significantly with the complexity of the intake procedure. In addition, tourists are in general well informed about nutritional risks, but only half of them receive adequate information on the risk of sexually transmitted diseases and a basic medical travel kit.

  2. Measuring Spatial Dependence for Infectious Disease Epidemiology

    PubMed Central

    Grabowski, M. Kate; Cummings, Derek A. T.

    2016-01-01

    Global spatial clustering is the tendency of points, here cases of infectious disease, to occur closer together than expected by chance. The extent of global clustering can provide a window into the spatial scale of disease transmission, thereby providing insights into the mechanism of spread, and informing optimal surveillance and control. Here the authors present an interpretable measure of spatial clustering, τ, which can be understood as a measure of relative risk. When biological or temporal information can be used to identify sets of potentially linked and likely unlinked cases, this measure can be estimated without knowledge of the underlying population distribution. The greater our ability to distinguish closely related (i.e., separated by few generations of transmission) from more distantly related cases, the more closely τ will track the true scale of transmission. The authors illustrate this approach using examples from the analyses of HIV, dengue and measles, and provide an R package implementing the methods described. The statistic presented, and measures of global clustering in general, can be powerful tools for analysis of spatially resolved data on infectious diseases. PMID:27196422
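    The τ measure can be sketched as a relative-risk-style ratio over case pairs. This is a simplified version of the estimator the authors describe (a boolean "related" matrix and a plain probability ratio), with made-up village data standing in for real surveillance records:

```python
import numpy as np

def tau_statistic(coords, related, d_max):
    """Sketch of the tau clustering statistic: the probability that a
    case pair within distance d_max is 'related' (likely linked by few
    transmission generations), relative to that probability over all
    pairs. tau > 1 indicates clustering of related cases at that scale."""
    n = len(coords)
    i, j = np.triu_indices(n, k=1)               # all unordered case pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)
    rel = related[i, j]
    return rel[d <= d_max].mean() / rel.mean()

# Two 'villages' of four cases each; pairs are related only within a
# village, so relatedness clusters at short distances.
village = np.array([0, 0, 0, 0, 1, 1, 1, 1])
coords = np.array([[0, 0], [1, 0], [0, 1], [1, 1],
                   [100, 0], [101, 0], [100, 1], [101, 1]], dtype=float)
related = village[:, None] == village[None, :]
tau = tau_statistic(coords, related, d_max=5.0)
```

    Note that, as the abstract stresses, no underlying population distribution enters the computation: only the cases and a notion of which pairs are potentially linked.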

  3. Radiation Measurements Performed with Active Detectors Relevant for Human Space Exploration

    PubMed Central

    Narici, Livio; Berger, Thomas; Matthiä, Daniel; Reitz, Günther

    2015-01-01

    A reliable radiation risk assessment in space is a mandatory step for the development of countermeasures and long-duration mission planning in human spaceflight. Research in radiobiology provides information about possible risks linked to radiation. In addition, for a meaningful risk evaluation, the radiation exposure has to be assessed to a sufficient level of accuracy. Consequently, both the radiation models predicting the risks and the measurements used to validate such models must have an equivalent precision. The corresponding measurements can be performed with both passive and active devices. The former are easier to handle, cheaper, lighter, and smaller, but they measure neither the time dependence of the radiation environment nor some of the details useful for a comprehensive radiation risk assessment. Active detectors provide most of these details and have been used extensively on the International Space Station. To easily access such an amount of data, a single point of access is becoming essential. This review presents ongoing work on the development of a tool for obtaining information about all relevant measurements performed with active detectors, providing reliable inputs for radiation model validation. PMID:26697408

  4. Radiation Measurements Performed with Active Detectors Relevant for Human Space Exploration.

    PubMed

    Narici, Livio; Berger, Thomas; Matthiä, Daniel; Reitz, Günther

    2015-01-01

    A reliable radiation risk assessment in space is a mandatory step for the development of countermeasures and long-duration mission planning in human spaceflight. Research in radiobiology provides information about possible risks linked to radiation. In addition, for a meaningful risk evaluation, the radiation exposure has to be assessed to a sufficient level of accuracy. Consequently, both the radiation models predicting the risks and the measurements used to validate such models must have an equivalent precision. The corresponding measurements can be performed with both passive and active devices. The former are easier to handle, cheaper, lighter, and smaller, but they measure neither the time dependence of the radiation environment nor some of the details useful for a comprehensive radiation risk assessment. Active detectors provide most of these details and have been used extensively on the International Space Station. To easily access such an amount of data, a single point of access is becoming essential. This review presents ongoing work on the development of a tool for obtaining information about all relevant measurements performed with active detectors, providing reliable inputs for radiation model validation.

  5. Presymptomatic atrophy in autosomal dominant Alzheimer's disease: A serial magnetic resonance imaging study.

    PubMed

    Kinnunen, Kirsi M; Cash, David M; Poole, Teresa; Frost, Chris; Benzinger, Tammie L S; Ahsan, R Laila; Leung, Kelvin K; Cardoso, M Jorge; Modat, Marc; Malone, Ian B; Morris, John C; Bateman, Randall J; Marcus, Daniel S; Goate, Alison; Salloway, Stephen P; Correia, Stephen; Sperling, Reisa A; Chhatwal, Jasmeer P; Mayeux, Richard P; Brickman, Adam M; Martins, Ralph N; Farlow, Martin R; Ghetti, Bernardino; Saykin, Andrew J; Jack, Clifford R; Schofield, Peter R; McDade, Eric; Weiner, Michael W; Ringman, John M; Thompson, Paul M; Masters, Colin L; Rowe, Christopher C; Rossor, Martin N; Ourselin, Sebastien; Fox, Nick C

    2018-01-01

    Identifying at what point atrophy rates first change in Alzheimer's disease is important for informing the design of presymptomatic trials. Serial T1-weighted magnetic resonance imaging scans of 94 participants (28 noncarriers, 66 carriers) from the Dominantly Inherited Alzheimer Network were used to measure brain, ventricular, and hippocampal atrophy rates. For each structure, nonlinear mixed-effects models estimated the change-points when atrophy rates deviate from normal and the rates of change before and after this point. Atrophy increased after the change-point, which occurred 1-1.5 years (assuming a single step change in atrophy rate) or 3-8 years (assuming gradual acceleration of atrophy) before expected symptom onset. At expected symptom onset, estimated atrophy rates were at least 3.6 times those before the change-point. Atrophy rates are pathologically increased up to seven years before "expected onset". During this period, atrophy rates may be useful for inclusion and tracking of disease progression. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
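    The single-step variant of the change-point model can be sketched as follows; the synthetic trajectory and grid-search estimation are illustrative simplifications of the paper's nonlinear mixed-effects models:

```python
import numpy as np

def fit_changepoint(t, y, candidates):
    """Grid-search fit of a single-step change-point model: one atrophy
    rate before the change-point, a higher one after. Slopes come from
    ordinary least squares at each candidate change-point."""
    best = None
    for tc in candidates:
        # Columns: intercept, baseline slope, extra slope after tc.
        X = np.column_stack([np.ones_like(t), t, np.clip(t - tc, 0.0, None)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ beta) ** 2))
        if best is None or sse < best[0]:
            best = (sse, tc, beta)
    _, tc, beta = best
    return tc, beta[1], beta[1] + beta[2]  # change-point, rate before, rate after

# Synthetic trajectory (time in years relative to expected onset):
# rate 0.2 before t = -5, rate 1.0 after, echoing the >3.6-fold increase.
t = np.linspace(-10.0, 0.0, 21)
y = 0.2 * t + 0.8 * np.clip(t + 5.0, 0.0, None)
tc, r0, r1 = fit_changepoint(t, y, candidates=np.linspace(-9.0, -1.0, 17))
```

    The study fits this structure jointly across participants with random effects and measurement noise; the sketch shows only the piecewise-linear core.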

  6. Optical spectroscopy and microscopy of radiation-induced light-emitting point defects in lithium fluoride crystals and films

    NASA Astrophysics Data System (ADS)

    Montereali, R. M.; Bonfigli, F.; Menchini, F.; Vincenti, M. A.

    2012-08-01

    Broad-band light-emitting radiation-induced F2 and F3+ electronic point defects, which are stable and laser-active at room temperature in lithium fluoride crystals and films, are used in dosimeters, tuneable color-center lasers, broad-band miniaturized light sources and novel radiation imaging detectors. A brief review of their photoemission properties is presented, and their behavior at liquid nitrogen temperatures is discussed. Some experimental data from optical spectroscopy and fluorescence microscopy of these radiation-induced point defects in LiF crystals and thin films are used to obtain information about the coloration curves, the efficiency of point defect formation, the effects of photo-bleaching processes, etc. Control of the local formation, stabilization, and transformation of radiation-induced light-emitting defect centers is crucial for the development of optically active micro-components and nanostructures. Some of the advantages of low temperature measurements for novel confocal laser scanning fluorescence microscopy techniques, widely used for spatial mapping of these point defects through the optical reading of their visible photoluminescence, are highlighted.

  7. A comparison of measured versus self-reported anthropometrics for assessing obesity in adults: a literature review.

    PubMed

    Maukonen, Mirkka; Männistö, Satu; Tolonen, Hanna

    2018-03-01

    Up-to-date information on the accuracy between different anthropometric data collection methods is vital for the reliability of anthropometric data. A previous review on this matter was conducted a decade ago. Our aim was to conduct a literature review on the accuracy of self-reported height, weight, and body mass index (BMI) against measured values for assessing obesity in adults. To obtain an overview of the present situation, we included studies published after the previous review. Differences according to sex, BMI groups, and continents were also assessed. Studies published between January 2006 and April 2017 were identified from a literature search on PubMed. Our search retrieved 62 publications on adult populations that showed a tendency for self-reported height to be overestimated and weight to be underestimated when compared with measured values. The findings were similar for both sexes. BMI derived from self-reported height and weight was underestimated; there was a clear tendency for underestimation of overweight (from 1.8%-points to 9.8%-points) and obesity (from 0.7%-points to 13.4%-points) prevalence by self-report. The bias was greater in overweight and obese participants than those of normal weight. Studies conducted in North America showed a greater bias, whereas the bias in Asian studies seemed to be lower than those from other continents. With globally rising obesity rates, accurate estimation of obesity is essential for effective public health policies to support obesity prevention. As self-report bias tends to be higher among overweight and obese individuals, measured anthropometrics provide a more reliable tool for assessing the prevalence of obesity.

  8. Point of care testing of phospholipase A2 group IIA for serological diagnosis of rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Liu, Nathan J.; Chapman, Robert; Lin, Yiyang; Mmesi, Jonas; Bentham, Andrew; Tyreman, Matthew; Abraham, Sonya; Stevens, Molly M.

    2016-02-01

    Secretory phospholipase A2 group IIA (sPLA2-IIA) was examined as a point of care marker for determining disease activity in rheumatoid (RA) and psoriatic (PsA) arthritis. Serum concentration and activity of sPLA2-IIA were measured using in-house antibodies and a novel point of care lateral flow device assay in patients diagnosed with varying severities of RA (n = 30) and PsA (n = 25) and found to correlate strongly with C-reactive protein (CRP). Levels of all markers were elevated in patients with active RA over those with inactive RA as well as both active and inactive PsA, indicating that sPLA2-IIA can be used as an analogue to CRP for RA diagnosis at point of care. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr08423g

  9. Abdominal fat thickness measurement using Focused Impedance Method (FIM) - phantom study

    NASA Astrophysics Data System (ADS)

    Haowlader, Salahuddin; Baig, Tanveer Noor; Siddique-e Rabbani, K.

    2010-04-01

    Abdominal fat thickness is a risk indicator of heart disease, diabetes, etc., and its measurement is therefore important from the point of view of preventive care. Tetrapolar electrical impedance measurement (TPIM) could offer a simple and low-cost alternative to conventional techniques using CT scans and MRI, and has been tried by different groups. The Focused Impedance Method (FIM) appears attractive as it can give localised information. An intuitive physical model was developed and experimental work was performed on a phantom designed to simulate the abdominal subcutaneous fat layer (SFL) in a body. TPIM measurements were performed with varying electrode separations. For small separations of the current and potential electrodes, the measured impedance changed little, but it started to decrease sharply beyond a certain separation, eventually diminishing gradually to negligible values. The finding could be explained using the intuitive physical model and gives important practical information: TPIM and FIM may be useful for measurement of SFL thickness only if the electrode separations are within a certain specific range, and will fail to give reliable results beyond this range. Further work, both analytical and experimental, is needed to establish this technique on a sound footing.

  10. 47 CFR 73.154 - AM directional antenna partial proof of performance measurements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... available to the FCC upon request. Maps showing new measurement points, i.e., points not measured in the...) Measurement points shall be selected from the points measured in latest full proof of performance provided..., the licensee shall measure directional field strength for comparison to either the directional or the...

  11. Integration of Libration Point Orbit Dynamics into a Universal 3-D Autonomous Formation Flying Algorithm

    NASA Technical Reports Server (NTRS)

    Folta, David; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The autonomous formation flying control algorithm developed by the Goddard Space Flight Center (GSFC) for the New Millennium Program (NMP) Earth Observing-1 (EO-1) mission is investigated for applicability to libration point orbit formations. In the EO-1 formation-flying algorithm, control is accomplished via linearization about a reference transfer orbit with a state transition matrix (STM) computed from state inputs. The effect of libration point orbit dynamics on this algorithm architecture is explored via computation of STMs using the flight-proven code, a monodromy matrix developed from an N-body model of a libration orbit, and a standard STM developed from the gravitational and Coriolis effects as measured at the libration point. A comparison of formation flying Delta-Vs calculated from these methods is made to a standard linear quadratic regulator (LQR) method. The universal 3-D approach is optimal in the sense that it can be accommodated as an open-loop or closed-loop control using only state information.

  12. Deep-Focusing Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.; Jensen, J. M.; Kosovichev, A. G.; Birch, A. C.; Fisher, Richard R. (Technical Monitor)

    2001-01-01

    Much progress has been made by measuring the travel times of solar acoustic waves from a central surface location to points at equal arc distance away. Depth information is obtained from the range of arc distances examined, with the larger distances revealing the deeper layers. This method we will call surface-focusing, as the common point, or focus, is at the surface. To obtain a clearer picture of the subsurface region, it would, no doubt, be better to focus on points below the surface. Our first attempt to do this used the ray theory to pick surface location pairs that would focus on a particular subsurface point. This is not the ideal procedure, as Born approximation kernels suggest that this focus should have zero sensitivity to sound speed inhomogeneities. However, the sensitivity is concentrated below the surface in a much better way than the old surface-focusing method, and so we expect the deep-focusing method to be more sensitive. A large sunspot group was studied by both methods. Inversions based on both methods will be compared.

  13. Approximate registration of point clouds with large scale differences

    NASA Astrophysics Data System (ADS)

    Novak, D.; Schindler, K.

    2013-10-01

    3D reconstruction of objects is a basic task in many fields, including surveying, engineering, entertainment and cultural heritage. The task is nowadays often accomplished with a laser scanner, which produces dense point clouds, but lacks accurate colour information, and lacks per-point accuracy measures. An obvious solution is to combine laser scanning with photogrammetric recording. In that context, the problem arises to register the two datasets, which feature large scale, translation and rotation differences. The absence of approximate registration parameters (3D translation, 3D rotation and scale) precludes the use of fine-registration methods such as ICP. Here, we present a method to register realistic photogrammetric and laser point clouds in a fully automated fashion. The proposed method decomposes the registration into a sequence of simpler steps: first, two rotation angles are determined by finding dominant surface normal directions, then the remaining parameters are found with RANSAC followed by ICP and scale refinement. These two steps are carried out at low resolution, before computing a precise final registration at higher resolution.
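The final scale-refinement step can be illustrated with a closed-form least-squares fit. The sketch below is an illustration under simplified assumptions (2D points with known correspondences, represented as complex numbers), not the authors' implementation: it recovers scale, rotation and translation jointly.

```python
import cmath

def fit_similarity(src, dst):
    """Least-squares 2D similarity transform (scale, rotation, translation)
    mapping src -> dst, with points encoded as complex numbers."""
    n = len(src)
    mp = sum(src) / n
    mq = sum(dst) / n
    # a = scale * exp(i*angle), solved in closed form about the centroids
    num = sum((q - mq) * (p - mp).conjugate() for p, q in zip(src, dst))
    den = sum(abs(p - mp) ** 2 for p in src)
    a = num / den
    t = mq - a * mp
    return abs(a), cmath.phase(a), t

# Example: dst is src scaled by 2, rotated 90 degrees, shifted by (1, 1)
src = [0 + 0j, 1 + 0j, 0 + 1j, 1 + 1j]
dst = [2j * p + (1 + 1j) for p in src]
scale, angle, t = fit_similarity(src, dst)
```

In a full pipeline such as the one described above, a fit like this would run on RANSAC-filtered correspondences and feed the subsequent ICP refinement.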

  14. Expert opinions on good practice in evaluation of health promotion and primary prevention measures related to children and adolescents in Germany.

    PubMed

    Korber, Katharina; Becker, Christian

    2017-10-02

    Determining what constitutes "good practice" in the measurement of the costs and effects of health promotion and disease prevention measures is of particular importance. The aim of this paper was to gather expert knowledge on (economic) evaluations of health promotion and prevention measures for children and adolescents, especially on the practical importance, the determinants of project success, meaningful parameters for evaluations, and supporting factors, but also on problems in their implementation. This information is targeted at people responsible for the development of primary prevention or health promotion programs. Partially structured open interviews were conducted by two interviewers and transcribed, paraphrased, and summarized for further use. Eight experts took part in the interviews. The interviewed experts saw evaluation as a useful tool to establish the effects of prevention programs, to inform program improvement and further development, and to provide arguments for decision making. The respondents thought that the determinants of a program's success were effectiveness with evidence of causality, cost-benefit relation, target-group reach and sustainability. It was considered important that hard and soft factors were included in an evaluation; costs were mentioned by only one expert. According to the experts, obstacles to evaluation were a lack of resources, additional labor requirements, and the evaluators' unfamiliarity with a program's contents. It was recommended to consider the evaluation design before a program is launched, to co-operate with people involved in a program, and to make use of existing structures. While this study represents only a partial view of expert knowledge, it highlights important points to consider when developing evaluations of prevention programs. By considering these points, researchers could advance towards a more comprehensive approach to evaluating measures targeting children and adolescents.

  15. Information Theoretic Approaches to Rapid Discovery of Relationships in Large Climate Data Sets

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Rossow, William B.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Mutual information, as the asymptotic Bayesian measure of independence, is an excellent starting point for investigating the existence of possible relationships among climate-relevant variables in large data sets. As mutual information is a nonlinear function of its arguments, it is not beholden to the assumption of a linear relationship between the variables in question and can reveal features missed in linear correlation analyses. However, as mutual information is symmetric in its arguments, it only has the ability to reveal the probability that two variables are related; it provides no information as to how they are related; specifically, causal interactions or a relation based on a common cause cannot be detected. For this reason we also investigate the utility of a related quantity called the transfer entropy. The transfer entropy can be written as a difference between mutual informations and has the capability to reveal whether and how the variables are causally related. The application of these information theoretic measures is tested on some familiar examples using data from the International Satellite Cloud Climatology Project (ISCCP) to identify relations between global cloud cover and other variables, including equatorial Pacific sea surface temperature (SST), over seasonal and El Niño Southern Oscillation (ENSO) cycles.
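The starting quantity described above can be made concrete: for discrete series, mutual information is estimated directly from joint and marginal counts. This is a minimal sketch, not the authors' code.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical mutual information (bits) between two discrete series:
    MI = sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        mi += (c / n) * log2(c * n / (px[x] * py[y]))
    return mi

# Perfectly dependent series: MI equals the 1-bit entropy of the variable
print(mutual_information([0, 1, 0, 1], [0, 1, 0, 1]))   # 1.0
# Independent pairing (all four combinations equally often): 0 bits
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))   # 0.0
```

Note the symmetry the abstract mentions: swapping the two arguments leaves the result unchanged, which is why mutual information alone cannot indicate the direction of a relationship.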

  16. Using digital image correlation and three dimensional point tracking in conjunction with real time operating data expansion techniques to predict full-field dynamic strain

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; Baqersad, Javad; Niezrecki, Christopher

    2014-05-01

    Large structures pose unique difficulties in the acquisition of measured dynamic data with conventional techniques, difficulties that are compounded when the structure also has rotating members such as wind turbine blades and helicopter blades. Optical techniques (digital image correlation and dynamic point tracking) are used to measure line-of-sight data without the need to contact the structure, eliminating cumbersome cabling issues. The data acquired from these optical approaches are used in conjunction with a unique real-time operating data expansion process to obtain full-field dynamic displacement and dynamic strain. The measurement approaches are described in this paper along with the expansion procedures. Data are collected for a single blade from a wind turbine and also for a three-bladed assembled wind turbine configuration. Measured strains are compared with full-field strain results expanded from a limited set of optical measurements, including locations that are not available from the acquired line-of-sight measurements. The success of the approach shows that there are real possibilities for providing much-needed full-field displacement and strain information that can be used to help assess structural health.
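The core of any expansion process of this kind is a least-squares fit of modal coordinates to the few measured locations, followed by reconstruction at every location. The sketch below uses two hypothetical mode-shape columns and made-up values; it illustrates the idea only, not the authors' algorithm.

```python
def expand_modal(phi_meas, phi_full, u_meas):
    """Estimate 2 modal coordinates q from measured DOFs by least squares
    (normal equations), then reconstruct the response at all DOFs."""
    a11 = sum(r[0] * r[0] for r in phi_meas)
    a12 = sum(r[0] * r[1] for r in phi_meas)
    a22 = sum(r[1] * r[1] for r in phi_meas)
    b1 = sum(r[0] * u for r, u in zip(phi_meas, u_meas))
    b2 = sum(r[1] * u for r, u in zip(phi_meas, u_meas))
    det = a11 * a22 - a12 * a12
    q1 = (a22 * b1 - a12 * b2) / det      # solve the 2x2 system A q = b
    q2 = (a11 * b2 - a12 * b1) / det
    return [r[0] * q1 + r[1] * q2 for r in phi_full]

phi_full = [[1, 1], [2, -1], [3, 2], [4, -2]]   # 4 DOFs x 2 modes (made up)
phi_meas = phi_full[:3]                          # only 3 DOFs are measured
u_meas = [2.5, -1.0, 5.5]                        # response at measured DOFs
u_full = expand_modal(phi_meas, phi_full, u_meas)
# u_full = [2.5, -1.0, 5.5, -2.0]: DOF 4 is reconstructed without a sensor
```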

  17. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    PubMed

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.
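A minimal individuals-chart sketch shows how statistical process control separates real change from common-cause variation. The scores and limits below are made up for illustration; the paper's actual chart construction may differ.

```python
def control_limits(series):
    """Individuals control chart limits: centre line +/- 3 sigma, with sigma
    estimated from the average moving range (MR-bar / 1.128)."""
    centre = sum(series) / len(series)
    moving_ranges = [abs(b - a) for a, b in zip(series, series[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(series, lcl, ucl):
    """Indices of points outside the control limits, i.e. signals of change."""
    return [i for i, x in enumerate(series) if x < lcl or x > ucl]

baseline = [20, 22, 21, 23, 20, 22, 21, 23]   # hypothetical test scores
lcl, centre, ucl = control_limits(baseline)
# a later score outside the limits signals real change, not noise
print(out_of_control(baseline + [35], lcl, ucl))   # [8]
```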

  18. 3D local feature BKD to extract road information from mobile laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Liu, Yuan; Dong, Zhen; Liang, Fuxun; Li, Bijun; Peng, Xiangyang

    2017-08-01

    Extracting road information from point clouds obtained through mobile laser scanning (MLS) is essential for autonomous vehicle navigation, and has hence garnered a growing amount of research interest in recent years. However, the performance of such systems is seriously affected due to varying point density and noise. This paper proposes a novel three-dimensional (3D) local feature called the binary kernel descriptor (BKD) to extract road information from MLS point clouds. The BKD consists of Gaussian kernel density estimation and binarization components to encode the shape and intensity information of the 3D point clouds that are fed to a random forest classifier to extract curbs and markings on the road. These are then used to derive road information, such as the number of lanes, the lane width, and intersections. In experiments, the precision and recall of the proposed feature for the detection of curbs and road markings on an urban dataset and a highway dataset were as high as 90%, thus showing that the BKD is accurate and robust against varying point density and noise.

  19. Measuring coverage in MNCH: total survey error and the interpretation of intervention coverage estimates from household surveys.

    PubMed

    Eisele, Thomas P; Rhoda, Dale A; Cutts, Felicity T; Keating, Joseph; Ren, Ruilin; Barros, Aluisio J D; Arnold, Fred

    2013-01-01

    Nationally representative household surveys are increasingly relied upon to measure maternal, newborn, and child health (MNCH) intervention coverage at the population level in low- and middle-income countries. Surveys are the best tool we have for this purpose and are central to national and global decision making. However, all survey point estimates have a certain level of error (total survey error) comprising sampling and non-sampling error, both of which must be considered when interpreting survey results for decision making. In this review, we discuss the importance of considering these errors when interpreting MNCH intervention coverage estimates derived from household surveys, using relevant examples from national surveys to provide context. Sampling error is usually thought of as the precision of a point estimate and is represented by 95% confidence intervals, which are measurable. Confidence intervals can inform judgments about whether estimated parameters are likely to be different from the real value of a parameter. We recommend, therefore, that confidence intervals for key coverage indicators should always be provided in survey reports. By contrast, the direction and magnitude of non-sampling error is almost always unmeasurable, and therefore unknown. Information error and bias are the most common sources of non-sampling error in household survey estimates and we recommend that they should always be carefully considered when interpreting MNCH intervention coverage based on survey data. Overall, we recommend that future research on measuring MNCH intervention coverage should focus on refining and improving survey-based coverage estimates to develop a better understanding of how results should be interpreted and used.
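The sampling-error point can be made concrete with the standard 95% confidence interval for an estimated proportion, widened by a survey design effect. This is a textbook formula used as illustration, not a calculation from the review.

```python
from math import sqrt

def coverage_ci(covered, n, deff=1.0):
    """Point estimate and 95% CI for a coverage proportion, with the
    standard error inflated by a design effect (deff) for cluster samples."""
    p = covered / n
    se = sqrt(deff * p * (1 - p) / n)
    return p, max(0.0, p - 1.96 * se), min(1.0, p + 1.96 * se)

# Hypothetical survey: 600 of 1000 children covered, design effect 2.0
p, lo, hi = coverage_ci(600, 1000, deff=2.0)
# roughly (0.600, 0.557, 0.643): the interval, not the point estimate,
# is what should inform judgments about differences between surveys
```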

  20. Measuring Coverage in MNCH: Total Survey Error and the Interpretation of Intervention Coverage Estimates from Household Surveys

    PubMed Central

    Eisele, Thomas P.; Rhoda, Dale A.; Cutts, Felicity T.; Keating, Joseph; Ren, Ruilin; Barros, Aluisio J. D.; Arnold, Fred

    2013-01-01

    Nationally representative household surveys are increasingly relied upon to measure maternal, newborn, and child health (MNCH) intervention coverage at the population level in low- and middle-income countries. Surveys are the best tool we have for this purpose and are central to national and global decision making. However, all survey point estimates have a certain level of error (total survey error) comprising sampling and non-sampling error, both of which must be considered when interpreting survey results for decision making. In this review, we discuss the importance of considering these errors when interpreting MNCH intervention coverage estimates derived from household surveys, using relevant examples from national surveys to provide context. Sampling error is usually thought of as the precision of a point estimate and is represented by 95% confidence intervals, which are measurable. Confidence intervals can inform judgments about whether estimated parameters are likely to be different from the real value of a parameter. We recommend, therefore, that confidence intervals for key coverage indicators should always be provided in survey reports. By contrast, the direction and magnitude of non-sampling error is almost always unmeasurable, and therefore unknown. Information error and bias are the most common sources of non-sampling error in household survey estimates and we recommend that they should always be carefully considered when interpreting MNCH intervention coverage based on survey data. Overall, we recommend that future research on measuring MNCH intervention coverage should focus on refining and improving survey-based coverage estimates to develop a better understanding of how results should be interpreted and used. PMID:23667331

  1. Investment feasibility tracking: the importance of measuring and tracking the success level of the project during commercialization phase

    NASA Astrophysics Data System (ADS)

    Saputra, Y. A.; Setyaningtyas, V. E. D.; Latiffianti, E.; Wijaya, S. H.; Ladamay, O. S. A.

    2018-04-01

    Measuring project success level is a challenging activity. This area of work has attracted many researchers to look deeper into methods of measurement, success factor identification, risk management, and many other relevant topics. However, the project management scope is limited to the project handover stage. After a project handover, control of the project passes from the Project Management Team to the project owner/commercialization team. From an investor’s point of view, the success of a project delivery needs to be followed by the success of the commercialization phase. This paper presents an approach for tracking and measuring the progress and success level of a project investment in the commercialization phase, a topic that is often forgotten in practice. Our proposed concept modifies the Freeman and Beale concept by estimating the variance between the planned Net Present Value / Annual Worth (as stated in the Feasibility Study document) and the actual Net Present Value / Annual Worth (up to the point of evaluation). The gap leads to the next analysis and gives some important information, especially exposing whether the project investment performs better than planned or has underperformed. Some corrective actions can be suggested based on the provided information. Practical cases exercising the concept are also provided and discussed: one case in the property sector in the middle of its commercialization phase, and another case in a power plant investment approaching the end of its commercialization phase.
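The planned-versus-actual NPV comparison can be sketched numerically. All cashflows and the discount rate below are illustrative, not figures from the paper.

```python
def npv(rate, cashflows):
    """Net present value of cashflows, one per period starting at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical figures: planned vs realised cashflows up to the review point
planned = [-1000, 400, 400, 400]
actual = [-1000, 350, 380, 390]
rate = 0.10
gap = npv(rate, actual) - npv(rate, planned)
print(f"NPV gap at review: {gap:.2f}")   # NPV gap at review: -69.50
# a negative gap flags underperformance relative to the feasibility study
```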

  2. Design of a small laser ceilometer and visibility measuring device for helicopter landing sites

    NASA Astrophysics Data System (ADS)

    Streicher, Jurgen; Werner, Christian; Dittel, Walter

    2004-01-01

    Hardware development for remote sensing costs a lot of time and money. A virtual instrument based on software modules was developed to optimise a small visibility and cloud base height sensor. Visibility is the parameter describing the turbidity of the atmosphere. It can be measured either as a mean value over a path, using a transmissometer, or for each point of the atmosphere, as with the backscattered intensity of a range-resolved lidar measurement. A standard ceilometer detects the altitude of clouds by using the runtime of the laser pulse and the increasing intensity of the backscattered light when hitting the boundary of a cloud. This corresponds to hard target range finding, but with more sensitive detection. The output of a standard ceilometer is, in the case of cloud coverage, the altitude of one or more layers. Commercial cloud sensors are specified to track cloud altitude at rather large distances (100 m up to 10 km) and are therefore big and expensive. A virtual instrument was used to calculate the system parameters for a small system for heliports at hospitals and landing platforms under visual flight rules (VFR). Helicopter pilots need information about cloud altitude (base not below 500 feet) and/or the visibility conditions (visual range not lower than 600 m) at the designated landing point. Private pilots need this information too when approaching a non-commercial airport. Both values can be measured automatically with the developed small and compact prototype, the size of a shoebox, at a reasonable price.

  3. Comparison of connectivity analyses for resting state EEG data

    NASA Astrophysics Data System (ADS)

    Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo

    2017-06-01

    Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. the directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by the phase locking value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections than the bivariate estimates. The multivariate TE differentiated better between eyes-closed and eyes-open conditions than DTF. Moreover, the multivariate TE revealed non-linear phenomena in information transfer that are not captured by DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods and indicated a relationship between the flow of information and the level of synchronization of the brain.
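The quantity at the heart of this comparison can be illustrated with a minimal transfer-entropy estimator for discrete series (history length 1). This is a sketch of the bivariate quantity only, not the multivariate estimator used in the paper.

```python
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """Transfer entropy src -> dst (bits), history length 1:
    TE = sum p(y_t, y_{t-1}, x_{t-1}) *
         log2[ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ]."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))   # (y_t, y_{t-1}, x_{t-1})
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((y, yp) for y, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (y, yp, xp), c in c_yyx.items():
        te += (c / n) * log2((c * c_y[yp]) / (c_yx[(yp, xp)] * c_yy[(y, yp)]))
    return te

src = [0, 1, 0, 1, 1, 0, 0, 1] * 20
dst = [0] + src[:-1]      # dst copies src with a one-step lag
```

A lagged copy of the driver gives a clearly positive TE from src to dst, while a constant driver contributes exactly zero; unlike mutual information, the measure is directional.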

  4. Operationalizing hippocampal volume as an enrichment biomarker for amnestic MCI trials: effect of algorithm, test-retest variability and cut-point on trial cost, duration and sample size

    PubMed Central

    Yu, P.; Sun, J.; Wolz, R.; Stephenson, D.; Brewer, J.; Fox, N.C.; Cole, P.E.; Jack, C.R.; Hill, D.L.G.; Schwarz, A.J.

    2014-01-01

    Objective To evaluate the effect of computational algorithm, measurement variability and cut-point on hippocampal volume (HCV)-based patient selection for clinical trials in mild cognitive impairment (MCI). Methods We used normal control and amnestic MCI subjects from ADNI-1 as normative reference and screening cohorts. We evaluated the enrichment performance of four widely-used hippocampal segmentation algorithms (FreeSurfer, HMAPS, LEAP and NeuroQuant) in terms of two-year changes in MMSE, ADAS-Cog and CDR-SB. We modeled the effect of algorithm, test-retest variability and cut-point on sample size, screen fail rates and trial cost and duration. Results HCV-based patient selection yielded not only reduced sample sizes (by ~40–60%) but also lower trial costs (by ~30–40%) across a wide range of cut-points. Overall, the dependence on the cut-point value was similar for the three clinical instruments considered. Conclusion These results provide a guide to the choice of HCV cut-point for aMCI clinical trials, allowing an informed trade-off between statistical and practical considerations. PMID:24211008
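The trade-off the authors model can be illustrated with back-of-the-envelope arithmetic: a stricter cut-point lowers the required sample size but raises the screen-fail rate. All counts, pass rates and unit costs below are made up.

```python
import math

def screening_plan(n_required, pass_rate, cost_screen, cost_enrol):
    """Patients to screen, and total cost, when only a fraction of screened
    patients pass the biomarker cut-point (hypothetical unit costs)."""
    n_screened = math.ceil(n_required / pass_rate)
    total_cost = n_screened * cost_screen + n_required * cost_enrol
    return n_screened, total_cost

# Loose cut-point: big trial, few screen failures
loose = screening_plan(400, 0.8, 500, 20000)    # (500, 8250000)
# Strict cut-point: enriched, smaller trial, more screen failures
strict = screening_plan(240, 0.5, 500, 20000)   # (480, 5040000)
```

Here the enriched design screens a similar number of patients but still costs less overall, because the per-patient trial cost dominates the screening cost; with different unit costs the balance can flip, which is the trade-off the paper quantifies.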

  5. Lunar Reconnaissance Orbiter (LRO) Guidance, Navigation and Control (GN&C) Overview

    NASA Technical Reports Server (NTRS)

    Garrick, Joseph; Simpson, James; Shah, Neerav

    2010-01-01

    The National Aeronautics and Space Administration's (NASA) Lunar Reconnaissance Orbiter (LRO) launched on June 18, 2009 from the Cape Canaveral Air Force Station aboard an Atlas V launch vehicle and into a direct insertion trajectory to the Moon. LRO, which was designed, built, and operated by the NASA Goddard Space Flight Center in Greenbelt, MD, is gathering crucial data on the lunar environment that will help astronauts prepare for long-duration lunar expeditions. The mission has a nominal life of 1 year as its seven instruments find safe landing sites, locate potential resources, characterize the radiation environment, and test new technology. To date, LRO has been operating well within the bounds of its requirements and has been collecting excellent science data; images taken by the LRO Camera Narrow Angle Camera of the Apollo landing sites have appeared on cable news networks. A significant amount of information on LRO's science instruments is provided at the LRO mission webpage. LRO's Guidance, Navigation and Control (GN&C) subsystem is made up of an onboard attitude control system (ACS) and a hardware suite of sensors and actuators. The LRO onboard ACS is a collection of algorithms based on high-level and derived requirements, and reflects the science and operational events throughout the mission lifetime. The primary control mode is the Observing mode, which maintains the lunar pointing orientation and any offset pointing from this baseline. It is within this mode that all science instrument calibrations, slews and science data collection take place. Because of a high accuracy requirement for knowledge and pointing, the Observing mode makes use of star tracker (ST) measurement data to determine an instantaneous attitude pointing. But even the star trackers alone do not meet the tight requirements, so a six-state Kalman filter is employed to improve the noisy measurement data.
The Observing mode obtains its rate information from an inertial reference unit (IRU); in the event of an IRU failure, the rate data is derived from the star tracker, but with degraded pointing performance. The Delta-V control mode's responsibility is to maintain attitude pointing during the cruise trajectory, insertion burns and lunar orbit maintenance by adjustments made to the spacecraft's velocity magnitude and vector direction. The ACS also provides a thruster-based system momentum management algorithm (known as Delta-H) to maintain the system and wheel momentum within acceptable levels. In the event an anomaly causes the LRO spacecraft to lose the ability to maintain its current attitude pointing, a Sun Safe mode is included in the ACS for the purpose of providing a known power- and thermally-safe coarse inertial sun attitude for an indefinite period of time, within the manageable limits of the reaction wheels. The Sun Safe mode is also the initial spacecraft control mode off of the launch vehicle and provides a means to null tip-off rates immediately after separation. The nominal configuration is to use the IRU for rate information in the controller. In the event of a gyro failure, a gyroless control mode was developed that computes rate information from the CSS data.
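The gyro-propagate / star-tracker-correct cycle described above can be illustrated with a one-state filter. This is a didactic sketch only (LRO's actual filter is six-state, and all values below are invented).

```python
def kalman_1d(z_meas, rates, dt, q, r, x0=0.0, p0=1.0):
    """One-state Kalman filter: propagate an angle with the gyro rate,
    then correct it with a noisy star-tracker angle measurement."""
    x, p = x0, p0
    estimates = []
    for z, w in zip(z_meas, rates):
        x += w * dt              # predict: integrate gyro rate
        p += q                   # grow uncertainty by process noise
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update: blend in tracker measurement
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

# Constant 0.1 rad/s slew; here the tracker measures the true angle exactly,
# so the estimate tracks the truth
dt, n = 1.0, 50
true = [0.1 * dt * (t + 1) for t in range(n)]
est = kalman_1d(true, [0.1] * n, dt, q=1e-4, r=1e-2)
```

Losing the gyro corresponds to replacing `rates` with values differenced from the tracker measurements, which is why the abstract notes degraded performance in that mode.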

  6. RGB-D SLAM Based on Extended Bundle Adjustment with 2D and 3D Information

    PubMed Central

    Di, Kaichang; Zhao, Qiang; Wan, Wenhui; Wang, Yexin; Gao, Yunjun

    2016-01-01

    In the study of SLAM problem using an RGB-D camera, depth information and visual information as two types of primary measurement data are rarely tightly coupled during refinement of camera pose estimation. In this paper, a new method of RGB-D camera SLAM is proposed based on extended bundle adjustment with integrated 2D and 3D information on the basis of a new projection model. First, the geometric relationship between the image plane coordinates and the depth values is constructed through RGB-D camera calibration. Then, 2D and 3D feature points are automatically extracted and matched between consecutive frames to build a continuous image network. Finally, extended bundle adjustment based on the new projection model, which takes both image and depth measurements into consideration, is applied to the image network for high-precision pose estimation. Field experiments show that the proposed method has a notably better performance than the traditional method, and the experimental results demonstrate the effectiveness of the proposed method in improving localization accuracy. PMID:27529256

  7. The theory behind the full scattering profile

    NASA Astrophysics Data System (ADS)

    Feder, Idit; Duadi, Hamootal; Fixler, Dror

    2018-02-01

    Optical methods for extracting properties of tissues are commonly used. These methods are non-invasive, cause no harm to the patient and are characterized by high speed. Human tissue is a turbid medium, hence it poses a challenge for different optical methods. In addition, the analysis of the emitted light requires calibration to achieve accurate information. Most methods analyze the reflected light based on its phase and amplitude, or the transmitted light. We suggest a new optical method for extracting optical properties of cylindrical tissues based on their full scattering profile (FSP), i.e. the angular distribution of the reemitted light. The FSP of cylindrical tissues is relevant for biomedical measurement of fingers, earlobes or pinched tissues. We found the iso-pathlength (IPL) point, a point on the surface of the cylindrical medium where the light intensity remains constant and does not depend on the reduced scattering coefficient of the medium, but rather depends on the spatial structure and the cylindrical geometry. A similar behavior was previously reported in reflection from a semi-infinite medium. Moreover, we present a linear dependency between the radius of the tissue and the point's location. This point can be used as a self-calibration point and thus improve the accuracy of optical tissue measurements. This natural phenomenon has not been investigated before. We show this phenomenon theoretically, based on the diffusion theory, supported by results from Monte Carlo simulation.

  8. MLS data segmentation using Point Cloud Library procedures. (Polish Title: Segmentacja danych MLS z użyciem procedur Point Cloud Library)

    NASA Astrophysics Data System (ADS)

    Grochocka, M.

    2013-12-01

    Mobile laser scanning (MLS) is a dynamically developing measurement technology that is becoming increasingly widespread for acquiring three-dimensional spatial information. Continuous technical progress, based on new tools, technological development, and thus better use of existing resources, reveals new horizons for the extensive use of MLS technology. MLS systems are typically used for mapping linear objects, in particular for inventories of roads, railways, bridges, shorelines, shafts, and tunnels, and even of geometrically complex urban spaces. The measurement is made from the perspective of the object's users, yet does not interfere with movement or ongoing work. This paper presents initial results of segmenting data acquired by MLS. The data used in this work were obtained as part of an inventory measurement of railway line infrastructure. The point clouds were measured using profile scanners installed on a railway platform. The data were processed with the tools of the open-source Point Cloud Library (PCL), which provides templated programming libraries. PCL is an open, independent, large-scale project for processing 2D/3D images and point clouds. PCL is released under the terms of the BSD (Berkeley Software Distribution) license, which means it is free for commercial and research use. The article presents a number of issues related to the use of this software and its capabilities. Segmentation is based on the pcl_segmentation template library, which contains segmentation algorithms for separating point clusters. These algorithms are best suited to processing point clouds consisting of a number of spatially isolated regions. The template library performs cluster extraction based on model fitting with the sample-consensus method for various parametric models (planes, cylinders, spheres, lines, etc.). Most of the mathematical operations are carried out with the Eigen library, a set of templates for linear algebra.
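    The Euclidean cluster extraction performed by pcl_segmentation can be illustrated with a short sketch. PCL itself is C++ and uses a k-d tree for the neighbour search; the brute-force pure-Python version below only demonstrates the idea of growing clusters out of spatially isolated regions, and the tolerance and point values are illustrative, not taken from the paper.

```python
from collections import deque

def euclidean_clusters(points, tol=0.5, min_size=2):
    """Group 3D points into clusters whose members are chained together
    by neighbours closer than `tol` (the idea behind PCL's Euclidean
    cluster extraction; this O(n^2) sketch skips the k-d tree)."""
    def close(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) <= tol ** 2

    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            # Grow the cluster by all not-yet-claimed neighbours of i.
            for j in [j for j in unvisited if close(points[i], points[j])]:
                unvisited.discard(j)
                queue.append(j)
                cluster.append(j)
        if len(cluster) >= min_size:
            clusters.append(sorted(cluster))
    return clusters

# Two spatially isolated groups of points separate into two clusters.
pts = [(0, 0, 0), (0.3, 0, 0), (0.1, 0.2, 0), (5, 5, 0), (5.2, 5.1, 0)]
print(sorted(euclidean_clusters(pts, tol=0.5)))  # -> [[0, 1, 2], [3, 4]]
```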

  9. Nonlinear electric reaction arising in dry bone subjected to 4-point bending

    NASA Astrophysics Data System (ADS)

    Murasawa, Go; Cho, Hideo; Ogawa, Kazuma

    2007-04-01

    Bone is a smart, self-adaptive, and partly self-repairing tissue. In recent years, many researchers have sought effective ways to apply mechanical stimulation to bone, because loading is the predominant factor determining bone shape and macroscopic structure. However, attempts to regenerate bone are still under way. It has long been known that mechanical stimulation generates an electrical potential in bone (Yasuda, 1977; Williams, 1982; Starkebaum, 1979; Cochran, 1968; Lanyon, 1977; Salzstein, 1987a,b; Friedenberg, 1966), the so-called "stress-generated potential" (SGP). The process of information transfer between strain and cells is still unclear, but SGP may be involved in it. If the electrical potential under various mechanical loadings were better understood, it might become possible to regenerate bone artificially; it is therefore important to investigate SGP in detail. The aim of the present study is to investigate the electric reaction arising in dry bone subjected to mechanical loading at high strain amplitude and low frequency. First, specimens were fabricated from cow femur. Next, wave-propagation speeds in the bone, which are related to bone density, were measured by a laser-ultrasonic technique and wavelet transform. Then, a 4-point bending test was conducted up to fracture while the electric reaction arising in the bone was measured during loading. Finally, cyclic 4-point bending tests were conducted to investigate the electric reaction arising in bone at low-frequency strain.

  10. KiDS-450: cosmological constraints from weak-lensing peak statistics - II: Inference from shear peaks using N-body simulations

    NASA Astrophysics Data System (ADS)

    Martinet, Nicolas; Schneider, Peter; Hildebrandt, Hendrik; Shan, HuanYuan; Asgari, Marika; Dietrich, Jörg P.; Harnois-Déraps, Joachim; Erben, Thomas; Grado, Aniello; Heymans, Catherine; Hoekstra, Henk; Klaes, Dominik; Kuijken, Konrad; Merten, Julian; Nakajima, Reiko

    2018-02-01

    We study the statistics of peaks in a weak-lensing reconstructed mass map of the first 450 deg2 of the Kilo Degree Survey (KiDS-450). The map is computed with aperture masses directly applied to the shear field with an NFW-like compensated filter. We compare the peak statistics in the observations with those of simulations for various cosmologies to constrain the cosmological parameter S_8 = σ_8 √(Ω_m/0.3), which probes the (Ωm, σ8) plane perpendicularly to its main degeneracy. We estimate S8 = 0.750 ± 0.059, using peaks in the signal-to-noise range 0 ≤ S/N ≤ 4, and accounting for various systematics, such as multiplicative shear bias, mean redshift bias, baryon feedback, intrinsic alignment, and shear-position coupling. These constraints are ˜25 per cent tighter than the constraints from the high-significance peaks alone (3 ≤ S/N ≤ 4), which typically trace single massive haloes. This demonstrates the gain of information from low-S/N peaks. However, we find that including S/N < 0 peaks does not add further information. Our results are in good agreement with the tomographic shear two-point correlation function measurement in KiDS-450. Combining shear peaks with non-tomographic measurements of the shear two-point correlation functions yields a ˜20 per cent improvement in the uncertainty on S8 compared to the shear two-point correlation functions alone, highlighting the great potential of peaks as a cosmological probe.
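    The parameter combination constrained above is a one-line computation; a minimal sketch, with illustrative input values rather than the survey's best-fit numbers:

```python
import math

def S8(sigma8, omega_m):
    """S_8 = sigma_8 * sqrt(Omega_m / 0.3), the combination that peak
    statistics constrain perpendicular to the (Omega_m, sigma_8)
    degeneracy."""
    return sigma8 * math.sqrt(omega_m / 0.3)

# Illustrative point in the (Omega_m, sigma_8) plane.
print(round(S8(0.85, 0.25), 3))  # -> 0.776
```

Any pair (Ωm, σ8) along the degeneracy direction maps to the same S8, which is why the combination, rather than either parameter alone, is quoted.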

  11. Community-based restaurant interventions to promote healthy eating: a systematic review.

    PubMed

    Valdivia Espino, Jennifer N; Guerrero, Natalie; Rhoads, Natalie; Simon, Norma-Jean; Escaron, Anne L; Meinen, Amy; Nieto, F Javier; Martinez-Donate, Ana P

    2015-05-21

    Eating in restaurants is associated with high caloric intake. This review summarizes and evaluates the evidence supporting community-based restaurant interventions. We searched all years of PubMed and Web of Knowledge through January 2014 for original articles describing or evaluating community-based restaurant interventions to promote healthy eating. We extracted summary information and classified the interventions into 9 categories according to the strategies implemented. A scoring system was adapted to evaluate the evidence, assigning 0 to 3 points to each intervention for study design, public awareness, and effectiveness. The average values were summed and then multiplied by 1 to 3 points, according to the volume of research available for each category. These summary scores were used to determine the level of evidence (insufficient, sufficient, or strong) supporting the effectiveness of each category. This review included 27 interventions described in 25 studies published since 1979. Most interventions took place in exclusively urban areas of the United States, either in the West or the South. The most common intervention categories were the use of point-of-purchase information with promotion and communication (n = 6), and point-of-purchase information with increased availability of healthy choices (n = 6). Only the latter category had sufficient evidence. The remaining 8 categories had insufficient evidence because of interventions showing no, minimal, or mixed findings; limited reporting of awareness and effectiveness; low volume of research; or weak study designs. No intervention reported an average negative impact on outcomes. Evidence about effective community-based strategies to promote healthy eating in restaurants is limited, especially for interventions in rural areas. To expand the evidence base, more studies should be conducted using robust study designs, standardized evaluation methods, and measures of sales, behavior, and health outcomes.
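    The review's scoring system can be sketched directly from the description above. The score tuples below are invented for illustration, and the cutoffs mapping summary scores to insufficient/sufficient/strong evidence are not given in the abstract, so no classification step is included.

```python
def category_score(interventions, volume_weight):
    """Summary score for one intervention category.

    Each intervention is a (design, awareness, effectiveness) tuple of
    0-3 points. The per-dimension averages across interventions are
    summed, then multiplied by a 1-3 volume-of-research weight, as in
    the review's scoring system.
    """
    n = len(interventions)
    averages = [sum(dim) / n for dim in zip(*interventions)]
    return sum(averages) * volume_weight

# Hypothetical category: two interventions, moderate research volume.
print(category_score([(2, 1, 3), (3, 2, 2)], volume_weight=2))  # -> 13.0
```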

  12. Community-Based Restaurant Interventions to Promote Healthy Eating: A Systematic Review

    PubMed Central

    Valdivia Espino, Jennifer N.; Guerrero, Natalie; Rhoads, Natalie; Simon, Norma-Jean; Escaron, Anne L.; Meinen, Amy; Nieto, F. Javier

    2015-01-01

    Introduction Eating in restaurants is associated with high caloric intake. This review summarizes and evaluates the evidence supporting community-based restaurant interventions. Methods We searched all years of PubMed and Web of Knowledge through January 2014 for original articles describing or evaluating community-based restaurant interventions to promote healthy eating. We extracted summary information and classified the interventions into 9 categories according to the strategies implemented. A scoring system was adapted to evaluate the evidence, assigning 0 to 3 points to each intervention for study design, public awareness, and effectiveness. The average values were summed and then multiplied by 1 to 3 points, according to the volume of research available for each category. These summary scores were used to determine the level of evidence (insufficient, sufficient, or strong) supporting the effectiveness of each category. Results This review included 27 interventions described in 25 studies published since 1979. Most interventions took place in exclusively urban areas of the United States, either in the West or the South. The most common intervention categories were the use of point-of-purchase information with promotion and communication (n = 6), and point-of-purchase information with increased availability of healthy choices (n = 6). Only the latter category had sufficient evidence. The remaining 8 categories had insufficient evidence because of interventions showing no, minimal, or mixed findings; limited reporting of awareness and effectiveness; low volume of research; or weak study designs. No intervention reported an average negative impact on outcomes. Conclusion Evidence about effective community-based strategies to promote healthy eating in restaurants is limited, especially for interventions in rural areas. 
To expand the evidence base, more studies should be conducted using robust study designs, standardized evaluation methods, and measures of sales, behavior, and health outcomes. PMID:25996986

  13. Managing distance and covariate information with point-based clustering.

    PubMed

    Whigham, Peter A; de Graaf, Brandon; Srivastava, Rashmi; Glue, Paul

    2016-09-01

    Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach based on Ripley's K, applied to the problem of clustering of deliberate self-harm (DSH), is presented. Point-based Monte-Carlo simulation of Ripley's K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data were derived from an audit of 2 years' emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). The study area was defined by residential (housing) land parcels representing a finite set of possible point addresses. Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH at spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present in this urban cohort. Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley's K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for covariate measures that exhibit spatial clustering, such as deprivation, is also crucial when assessing point-based clustering.
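    The Monte-Carlo treatment of Ripley's K for a finite candidate set can be sketched as follows. This toy version uses plain Euclidean distance with no edge correction, no deprivation covariate, and none of the paper's distance-bias models; its null hypothesis is simply a random draw of addresses from the finite parcel set, and all coordinates are invented.

```python
import random
from itertools import combinations

def ripley_k(points, r, area):
    """Naive Ripley's K estimate at scale r:
    K(r) = (A / n^2) * (number of ordered pairs within distance r),
    with plain Euclidean distance and no edge correction."""
    n = len(points)
    pairs = sum(1 for p, q in combinations(points, 2)
                if (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= r * r)
    return area * 2 * pairs / (n * n)

def mc_envelope(candidates, n_obs, r, area, sims=199, seed=1):
    """One-sided 95% Monte-Carlo envelope for K(r) under the null that
    the n_obs observed addresses are a random draw from the finite set
    of candidate locations (e.g. residential land parcels)."""
    rng = random.Random(seed)
    ks = sorted(ripley_k(rng.sample(candidates, n_obs), r, area)
                for _ in range(sims))
    return ks[int(0.95 * sims)]

# Toy example: a 10 x 10 parcel grid; the "observed" addresses are packed
# into one corner, so K at r = 1.5 exceeds the random-draw envelope.
candidates = [(x, y) for x in range(10) for y in range(10)]
observed = [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2), (2, 0), (2, 1)]
print(ripley_k(observed, 1.5, 100.0) > mc_envelope(candidates, 8, 1.5, 100.0))  # -> True
```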

  14. Optical layout and mechanical structure of polarimeter-interferometer system for Experimental Advanced Superconducting Tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Z. Y.; Liu, H. Q., E-mail: hqliu@ipp.ac.cn; Jie, Y. X.

    A Far-InfraRed (FIR) three-wave POlarimeter-INTerferometer (POINT) system for measurement of the current density profile and electron density profile is under development for the EAST tokamak. The FIR beams are transmitted from the laser room to the optical tower adjacent to EAST via an ∼20 m overmoded dielectric waveguide and then divided into 5 horizontal chords. The optical arrangement was designed using ZEMAX, which provides information on the beam spot size and energy distribution throughout the optical system. ZEMAX calculations used to optimize the optical layout design are combined with the mechanical design from CATIA, providing a 3D visualization of the entire POINT system.

  15. Optical layout and mechanical structure of polarimeter-interferometer system for Experimental Advanced Superconducting Tokamak.

    PubMed

    Zou, Z Y; Liu, H Q; Jie, Y X; Ding, W X; Brower, D L; Wang, Z X; Shen, J S; An, Z H; Yang, Y; Zeng, L; Wei, X C; Li, G S; Zhu, X; Lan, T

    2014-11-01

    A Far-InfraRed (FIR) three-wave POlarimeter-INTerferometer (POINT) system for measurement of the current density profile and electron density profile is under development for the EAST tokamak. The FIR beams are transmitted from the laser room to the optical tower adjacent to EAST via an ∼20 m overmoded dielectric waveguide and then divided into 5 horizontal chords. The optical arrangement was designed using ZEMAX, which provides information on the beam spot size and energy distribution throughout the optical system. ZEMAX calculations used to optimize the optical layout design are combined with the mechanical design from CATIA, providing a 3D visualization of the entire POINT system.

  16. Experimental Estimating Deflection of a Simple Beam Bridge Model Using Grating Eddy Current Sensors

    PubMed Central

    Lü, Chunfeng; Liu, Weiwen; Zhang, Yongjie; Zhao, Hui

    2012-01-01

    A novel three-point method using a grating eddy current absolute position sensor (GECS) for bridge deflection estimation is proposed in this paper. The real spatial positions of the measuring points along the span axis serve directly as relative reference points for one another, rather than relying on auxiliary static reference points for the measuring devices as in conventional methods. Every three adjacent measuring points define a measuring unit, and a straight connecting bar, with a GECS fixed at its center section, links the two endpoints. In each measuring unit, the displacement of the mid-measuring point relative to the connecting bar, measured by the GECS, is defined as the relative deflection. Absolute deflections of each measuring point can then be calculated from the relative deflections of all the measuring units directly, without any correction steps. The principles of the three-point method and of displacement measurement with the GECS are introduced in detail. Both static and dynamic experiments have been carried out on a simple beam bridge model, demonstrating that the three-point deflection estimation method using the GECS is effective and offers a reliable approach to bridge deflection estimation, especially for long-term monitoring. PMID:23112583
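    The reconstruction of absolute deflections from the relative (mid-chord) deflections can be sketched under simplifying assumptions: equally spaced measuring points, one overlapping measuring unit per interior point, and known deflections at the two supports. The paper's actual unit layout may differ; this shows only the linear-algebra core of the idea.

```python
def absolute_deflections(rel, w_left=0.0, w_right=0.0):
    """Recover absolute deflections w[1..n] from relative deflections.

    Assumed model (one measuring unit centred on each interior point i,
    equal spacing): rel[i] = w[i] - (w[i-1] + w[i+1]) / 2, with known
    endpoint deflections w_left, w_right (e.g. zero at the supports).
    Solves the resulting tridiagonal system with the Thomas algorithm.
    """
    n = len(rel)
    a, b, c = -0.5, 1.0, -0.5          # sub-, main, super-diagonal
    d = list(rel)
    d[0] -= a * w_left                 # fold known endpoints into RHS
    d[-1] -= c * w_right
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, d[0] / b
    for i in range(1, n):              # forward elimination
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (d[i] - a * dp[i - 1]) / m
    w = [0.0] * n
    w[-1] = dp[-1]
    for i in range(n - 2, -1, -1):     # back substitution
        w[i] = dp[i] - cp[i] * w[i + 1]
    return w

# A mid-span bump: relative deflections [0, 1, 0] between zero supports.
print(absolute_deflections([0.0, 1.0, 0.0]))
```

Verifying against the assumed model: for w = [1, 2, 1] with zero supports, rel = [1 - 2/2, 2 - 2/2, 1 - 2/2] = [0, 1, 0], which the solver inverts.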

  17. Experimental estimating deflection of a simple beam bridge model using grating eddy current sensors.

    PubMed

    Lü, Chunfeng; Liu, Weiwen; Zhang, Yongjie; Zhao, Hui

    2012-01-01

    A novel three-point method using a grating eddy current absolute position sensor (GECS) for bridge deflection estimation is proposed in this paper. The real spatial positions of the measuring points along the span axis serve directly as relative reference points for one another, rather than relying on auxiliary static reference points for the measuring devices as in conventional methods. Every three adjacent measuring points define a measuring unit, and a straight connecting bar, with a GECS fixed at its center section, links the two endpoints. In each measuring unit, the displacement of the mid-measuring point relative to the connecting bar, measured by the GECS, is defined as the relative deflection. Absolute deflections of each measuring point can then be calculated from the relative deflections of all the measuring units directly, without any correction steps. The principles of the three-point method and of displacement measurement with the GECS are introduced in detail. Both static and dynamic experiments have been carried out on a simple beam bridge model, demonstrating that the three-point deflection estimation method using the GECS is effective and offers a reliable approach to bridge deflection estimation, especially for long-term monitoring.

  18. Arenani: pointing and information query system for object beyond your reach

    NASA Astrophysics Data System (ADS)

    Adachi, Mariko; Sakamoto, Kunio

    2008-03-01

    The authors developed a prototype information query system. It is easy to get information about an object within your reach, but troublesome when the object is far away. If someone is nearby, you can ask a simple question with a finger pointing: "What is that?" Our system realizes the same approach using information technologies. The system consists of a laser pointer and transmitter and receiver units for optical communication. The laser pointer is used to point at an object; the laser light is also modulated to send the user's identification (ID) code, identifying who is asking the question. Each object has a receiver for the laser-light communication and sends the user's identification to a main computer. After pointing at an object, the questioner receives an answer through a wireless information network, for example as an email on a cellular phone.

  19. Synthesis of hover autopilots for rotary-wing VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Hall, W. E.; Bryson, A. E., Jr.

    1972-01-01

    The practical situation is considered where imperfect information on only a few rotor and fuselage state variables is available. Filters are designed to estimate all the state variables from noisy measurements of fuselage pitch/roll angles and from noisy measurements of both fuselage and rotor pitch/roll angles. The mean square response of the vehicle to a very gusty, random wind is computed using various filter/controllers and is found to be quite satisfactory although, of course, not so good as when one has perfect information (idealized case). The second part of the report considers precision hover over a point on the ground. A vehicle model without rotor dynamics is used and feedback signals in position and integral of position error are added. The mean square response of the vehicle to a very gusty, random wind is computed, assuming perfect information feedback, and is found to be excellent. The integral error feedback gives zero position error for a steady wind, and smaller position error for a random wind.

  20. Dynamics of fingertip contact during the onset of tangential slip

    PubMed Central

    Delhaye, Benoit; Lefèvre, Philippe; Thonnard, Jean-Louis

    2014-01-01

    Through highly precise perceptual and sensorimotor activities, the human tactile system continuously acquires information about the environment. Mechanical interactions between the skin at the point of contact and a touched surface serve as the source of this tactile information. Using a dedicated custom robotic platform, we imaged skin deformation at the contact area between the finger and a flat surface during the onset of tangential sliding movements in four different directions (proximal, distal, radial and ulnar) and with varying normal force and tangential speeds. This simple tactile event revealed complex mechanics. We observed a reduction of the contact area with increasing tangential force and proposed to explain this phenomenon by nonlinear stiffening of the skin. The deformation's shape and amplitude were highly dependent on stimulation direction. We conclude that the complex, but highly patterned and reproducible, deformations measured in this study are a potential source of information for the central nervous system and that further mechanical measurements are needed to better understand tactile perceptual and motor performance. PMID:25253033

  1. Mnemonic convergence in social networks: The emergent properties of cognition at a collective level.

    PubMed

    Coman, Alin; Momennejad, Ida; Drach, Rae D; Geana, Andra

    2016-07-19

    The development of shared memories, beliefs, and norms is a fundamental characteristic of human communities. These emergent outcomes are thought to occur owing to a dynamic system of information sharing and memory updating, which fundamentally depends on communication. Here we report results on the formation of collective memories in laboratory-created communities. We manipulated conversational network structure in a series of real-time, computer-mediated interactions in fourteen 10-member communities. The results show that mnemonic convergence, measured as the degree of overlap among community members' memories, is influenced by both individual-level information-processing phenomena and by the conversational social network structure created during conversational recall. By studying laboratory-created social networks, we show how large-scale social phenomena (i.e., collective memory) can emerge out of microlevel local dynamics (i.e., mnemonic reinforcement and suppression effects). The social-interactionist approach proposed herein points to optimal strategies for spreading information in social networks and provides a framework for measuring and forging collective memories in communities of individuals.
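    One plausible way to operationalize mnemonic convergence as the "degree of overlap among community members' memories" is the average pairwise Jaccard index over remembered item sets; the paper's exact measure may differ, and the sets below are invented for illustration.

```python
from itertools import combinations

def mnemonic_convergence(memories):
    """Average pairwise Jaccard overlap between members' remembered
    item sets: 1.0 means everyone remembers the same items, 0.0 means
    no two members share any item."""
    pairs = list(combinations(memories, 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

# Toy community of three members, before and after conversational recall.
before = [{"a", "b"}, {"b", "c"}, {"c", "d"}]
after = [{"a", "b"}, {"a", "b"}, {"a", "b", "c"}]
print(round(mnemonic_convergence(before), 3))  # -> 0.222
print(round(mnemonic_convergence(after), 3))   # -> 0.778
```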

  2. An Information-Theoretical Approach to Image Resolution Applied to Neutron Imaging Detectors Based Upon Individual Discriminator Signals

    NASA Astrophysics Data System (ADS)

    Clergeau, Jean-François; Ferraton, Matthieu; Guérard, Bruno; Khaplanov, Anton; Piscitelli, Francesco; Platz, Martin; Rigal, Jean-Marie; Van Esch, Patrick; Daullé, Thibault

    2017-01-01

    1D or 2D neutron position-sensitive detectors with individual wire or strip readout using discriminators have the advantage of being able to treat several neutron impacts partially overlapping in time, hence reducing global dead time. A single neutron impact usually gives rise to several discriminator signals. In this paper, we introduce an information-theoretical definition of image resolution. Two point-like spots of neutron impacts with a given distance between them act as a source of information (each neutron hit belongs to one spot or the other), and the detector plus signal treatment is regarded as an imperfect communication channel that transmits this information. The maximal mutual information obtained from this channel as a function of the distance between the spots allows one to define a calibration-independent measure of position resolution. We then apply this measure to quantify the position resolution of different algorithms that treat these individual discriminator signals and can be implemented in firmware. The method is then applied to different detectors in operation at the ILL. Center-of-gravity methods usually improve the position resolution over best-wire algorithms, which are the standard way of treating these signals.
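    The information-theoretical resolution measure can be sketched in the simplest setting: the source is one bit (which of two spots a neutron hit), and the detector plus positioning algorithm acts as a binary symmetric channel. The Gaussian position-response model and its width below are assumptions for illustration, not the paper's measured detector response.

```python
import math

def mutual_information_bits(p_error):
    """Mutual information (bits) of a binary symmetric channel whose
    input is 'which of the two spots did this neutron hit' and whose
    error rate is p_error: I = 1 - H(p_error)."""
    if p_error in (0.0, 1.0):
        return 1.0
    h = -p_error * math.log2(p_error) - (1 - p_error) * math.log2(1 - p_error)
    return 1.0 - h

def p_error_gaussian(distance, sigma):
    """Assumed response model: the reported position is the true spot
    blurred by a Gaussian of width sigma; a hit is misassigned when it
    falls on the wrong side of the midpoint between the spots."""
    return 0.5 * math.erfc(distance / (2 * sigma * math.sqrt(2)))

# Mutual information rises from ~0 (unresolvable spots) toward 1 bit as
# the separation grows; a resolution figure can be read off as the
# distance at which I crosses some threshold, e.g. 0.5 bit.
for d in (0.5, 2.0, 8.0):
    print(round(mutual_information_bits(p_error_gaussian(d, sigma=1.0)), 3))
```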

  3. Experiential knowledge of expert coaches can help identify informational constraints on performance of dynamic interceptive actions.

    PubMed

    Greenwood, Daniel; Davids, Keith; Renshaw, Ian

    2014-01-01

    Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches' experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches' experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches' knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.

  4. Predicting Bradycardia in Preterm Infants Using Point Process Analysis of Heart Rate.

    PubMed

    Gee, Alan H; Barbieri, Riccardo; Paydarfar, David; Indic, Premananda

    2017-09-01

    Episodes of bradycardia are common and recur sporadically in preterm infants, posing a threat to the developing brain and other vital organs. We hypothesize that bradycardias are a result of transient temporal destabilization of the cardiac autonomic control system and that fluctuations in the heart rate signal might contain information that precedes bradycardia. We investigate infant heart rate fluctuations with a novel application of point process theory. In ten preterm infants, we estimate instantaneous linear measures of the heart rate signal, use these measures to extract statistical features of bradycardia, and propose a simplistic framework for prediction of bradycardia. We present the performance of a prediction algorithm using instantaneous linear measures (mean area under the curve = 0.79 ± 0.018) for over 440 bradycardia events. The algorithm achieves an average forecast time of 116 s prior to bradycardia onset (FPR = 0.15). Our analysis reveals that increased variance in the heart rate signal is a precursor of severe bradycardia. This increase in variance is associated with an increase in power from low-frequency dynamics in the LF band (0.04-0.2 Hz) and lower multiscale entropy values prior to bradycardia. Point process analysis of the heartbeat time series reveals instantaneous measures that can be used to predict infant bradycardia prior to onset. Our findings are relevant to risk stratification, predictive monitoring, and implementation of preventative strategies for reducing morbidity and mortality associated with bradycardia in neonatal intensive care units.

  5. The CTBTO/WMO Atmospheric Backtracking Response System and the Data Fusion Exercise 2007

    DTIC Science & Technology

    2008-09-01

    sensitivity of the measurement (sample) towards releases at all points on the globe. For a more comprehensive description, see the presentation from last...localization information, including the error ellipse, is comparatively small. The red spots on the right image mark seismic events that occurred on...hours indicated in the calendar of the PTS post-processing software WEB-GRAPE. 2008 Monitoring Research Review: Ground-Based Nuclear

  6. Longitudinal costs of caring for people with Alzheimer's disease.

    PubMed

    Gillespie, Paddy; O'Shea, Eamon; Cullinan, John; Buchanan, Jacqui; Bobula, Joel; Lacey, Loretto; Gallagher, Damien; Mhaolain, Aine Ni; Lawlor, Brian

    2015-05-01

    There has been an increasing interest in the relationship between severity of disease and costs in the care of people with dementia. Much of the current evidence is based on cross-sectional data, suggesting the need to examine trends over time for this important and growing cohort of the population. This paper estimates resource use and costs of care based on longitudinal data for 72 people with dementia in Ireland. Data were collected from the Enhancing Care in Alzheimer's Disease (ECAD) study at two time points: baseline and follow-up, two years later. Patients' dependence on others was measured using the Dependence Scale (DS), while patient function was measured using the Disability Assessment for Dementia (DAD) scale. Univariate and multivariate analyses were used to explore the effects of a range of variables on formal and informal care costs. Total costs of formal and informal care over six months rose from €9,266 (Standard Deviation (SD): 12,947) per patient at baseline to €21,266 (SD: 26,883) at follow-up, two years later. This constituted a statistically significant (p = 0.0014) increase in costs over time, driven primarily by an increase in estimated informal care costs. In the multivariate analysis, a one-point increase in the DS score, that is a one-unit increase in the patient's dependence on others, was associated with a 19% increase in total costs (p = 0.0610). Higher levels of dependence in people with Alzheimer's disease are significantly associated with increased costs of informal care as the disease progresses. Formal care services did not respond to increased dependence in people with dementia, leaving it to families to fill the caring gap, mainly through increased supervision as the disease progressed.

  7. Effects of Detector Thickness on Geometric Sensitivity and Event Positioning Errors in the Rectangular PET/X Scanner

    NASA Astrophysics Data System (ADS)

    MacDonald, Lawrence R.; Hunter, William C. J.; Kinahan, Paul E.; Miyaoka, Robert S.

    2013-10-01

    We used simulations to investigate the relationship between sensitivity and spatial resolution as a function of crystal thickness in a rectangular PET scanner intended for quantitative assessment of breast cancers. The system had two 20 × 15-cm2 and two 10 × 15-cm2 flat detectors forming a box, with the larger detectors separated by 4 or 8 cm. Depth-of-interaction (DOI) resolution was modeled as a function of crystal thickness based on prior measurements. Spatial resolution was evaluated independent of image reconstruction by deriving and validating a surrogate metric from list-mode data (dFWHM). When increasing crystal thickness from 5 to 40 mm, and without using DOI information, the dFWHM for a centered point source increased from 0.72 to 1.6 mm. Including DOI information improved dFWHM by 12% and 27% for 5- and 40-mm-thick crystals, respectively. For a point source in the corner of the FOV, use of DOI information improved dFWHM by 20% (5-mm crystal) and 44% (40-mm crystal). Sensitivity was 7.7% for 10-mm-thick crystals (8-cm object). Increasing crystal thickness on the smaller side detectors from 10 to 20 mm (keeping 10-mm crystals on the larger detectors) boosted sensitivity by 24% (relative) and degraded dFWHM by only 3%/8% with/without DOI information. The benefits of measuring DOI must be evaluated in terms of the intended clinical task of assessing tracer uptake in small lesions. Increasing crystal thickness on the smaller side detectors provides substantial sensitivity increase with minimal accompanying loss in resolution.

  8. Associations between structural capabilities of primary care practices and performance on selected quality measures.

    PubMed

    Friedberg, Mark W; Coltin, Kathryn L; Safran, Dana Gelb; Dresser, Marguerite; Zaslavsky, Alan M; Schneider, Eric C

    2009-10-06

    Recent proposals to reform primary care have encouraged physician practices to adopt such structural capabilities as performance feedback and electronic health records. Whether practices with these capabilities have higher performance on measures of primary care quality is unknown. To measure associations between structural capabilities of primary care practices and performance on commonly used quality measures. Cross-sectional analysis. Massachusetts. 412 primary care practices. During 2007, 1 physician from each participating primary care practice (median size, 4 physicians) was surveyed about structural capabilities of the practice (responses representing 308 practices were obtained). Data on practice structural capabilities were linked to multipayer performance data on 13 Healthcare Effectiveness Data and Information Set (HEDIS) process measures in 4 clinical areas: screening, diabetes, depression, and overuse. Frequently used multifunctional electronic health records were associated with higher performance on 5 HEDIS measures (3 in screening and 2 in diabetes), with statistically significant differences in performance ranging from 3.1 to 7.6 percentage points. Frequent meetings to discuss quality were associated with higher performance on 3 measures of diabetes care (differences ranging from 2.3 to 3.1 percentage points). Physician awareness of patient experience ratings was associated with higher performance on screening for breast cancer and cervical cancer (1.9 and 2.2 percentage points, respectively). No other structural capabilities were associated with performance on more than 1 measure. No capabilities were associated with performance on depression care or overuse. Structural capabilities of primary care practices were assessed by physician survey. Among the investigated structural capabilities of primary care practices, electronic health records were associated with higher performance across multiple HEDIS measures. 
Overall, the modest magnitude and limited number of associations between structural capabilities and clinical performance suggest the importance of continuing to measure the processes and outcomes of care for patients. The Commonwealth Fund.

  9. Rapid estimation of recharge potential in ephemeral-stream channels using electromagnetic methods, and measurements of channel and vegetation characteristics

    USGS Publications Warehouse

    Callegary, J.B.; Leenhouts, J.M.; Paretti, N.V.; Jones, Christopher A.

    2007-01-01

    To classify recharge potential (RCP) in ephemeral-stream channels, a method was developed that incorporates information about channel geometry, vegetation characteristics, and bed-sediment apparent electrical conductivity (σa). Recharge potential is not independently measurable, but is instead formulated as a site-specific, qualitative parameter. We used data from 259 transects across two ephemeral-stream channels near Sierra Vista, Arizona, a location with a semiarid climate. Seven data types were collected: σa averaged over two depth intervals (0-3 m and 0-6 m), channel incision depth and width, diameter-at-breast-height of the largest tree, and woody-plant and grass density. A two-tiered system was used to classify a transect's RCP. In the first tier, transects were categorized by estimates of near-surface-sediment hydraulic permeability as low, moderate, or high using measurements of 0-3 m-depth σa. Each of these categories was subdivided into low, medium, or high RCP classes using the remaining six data types, yielding a total of nine RCP designations. Six sites in the study area were used to compare RCP and σa with previously measured surrogates for hydraulic permeability. Borehole-averaged percent fines showed a moderate correlation with both shallow and deep σa measurements; however, the correlation of point measurements of saturated hydraulic conductivity, percent fines, and cylinder infiltrometer measurements with σa and RCP was generally poor. The poor correlation was probably caused by the relatively large measurement volume and spatial averaging of σa compared with the spatially limited point measurements. Because of the comparatively large spatial extent of the measurement transects and the variety of data types collected, RCP estimates can give a more complete picture of the major factors affecting recharge at a site than is possible through point or borehole-averaged estimates of hydraulic permeability alone. © 2007 Elsevier B.V. All rights reserved.

  10. A PC-based magnetometer-only attitude and rate determination system for gyroless spacecraft

    NASA Technical Reports Server (NTRS)

    Challa, M.; Natanson, G.; Deutschmann, J.; Galal, K.

    1995-01-01

    This paper describes a prototype PC-based system that uses measurements from a three-axis magnetometer (TAM) to estimate the state (three-axis attitude and rates) of a spacecraft given no a priori information other than the mass properties. The system uses two algorithms that estimate the spacecraft's state - a deterministic magnetic-field only algorithm and a Kalman filter for gyroless spacecraft. The algorithms are combined by invoking the deterministic algorithm to generate the spacecraft state at epoch using a small batch of data and then using this deterministic epoch solution as the initial condition for the Kalman filter during the production run. System input comprises processed data that includes TAM and reference magnetic field data. Additional information, such as control system data and measurements from line-of-sight sensors, can be input to the system if available. Test results are presented using in-flight data from two three-axis stabilized spacecraft: Solar, Anomalous, and Magnetospheric Particle Explorer (SAMPEX) (gyroless, Sun-pointing) and Earth Radiation Budget Satellite (ERBS) (gyro-based, Earth-pointing). The results show that, using as little as 700 s of data, the system is capable of accuracies of 1.5 deg in attitude and 0.01 deg/s in rates; i.e., within SAMPEX mission requirements.

  11. Estimation of error on the cross-correlation, phase and time lag between evenly sampled light curves

    NASA Astrophysics Data System (ADS)

    Misra, R.; Bora, A.; Dewangan, G.

    2018-04-01

    Temporal analysis of radiation from astrophysical sources like Active Galactic Nuclei, X-ray binaries, and gamma-ray bursts provides information on the geometry and sizes of the emitting regions. Establishing that two light curves in different energy bands are correlated, and measuring the phase and time lag between them, is an important and frequently used temporal diagnostic. Generally the estimates are made by dividing the light curves into a large number of adjacent intervals to find the variance, or by using numerically expensive simulations. In this work we present alternative expressions for estimating the errors on the cross-correlation, phase, and time lag between two shorter light curves that cannot be divided into segments. The estimates presented here therefore allow for analysis of light curves with a relatively small number of points, and for obtaining information on the longest time-scales available. The expressions have been tested using 200 light curves simulated from both white and 1 / f stochastic processes with measurement errors. We also present an application to the XMM-Newton light curves of the Active Galactic Nucleus Akn 564. The example shows that the estimates presented here allow for analysis of light curves with a relatively small (∼ 1000) number of points.
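A minimal sketch (not the error expressions derived in the paper) of how a time lag between two evenly sampled light curves can be read off the peak of their cross-correlation; the sinusoidal test series here are illustrative assumptions:

```python
import numpy as np

def time_lag(a, b, dt=1.0):
    """Estimate the time lag of series `b` relative to `a` from the
    peak of their mean-subtracted cross-correlation."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    # full cross-correlation; lag index runs from -(n-1) to +(n-1)
    cc = np.correlate(b, a, mode="full")
    lags = np.arange(-(len(a) - 1), len(a)) * dt
    return lags[np.argmax(cc)]

# A sinusoid and a copy delayed by 5 samples (full periods, so the
# circular shift is an exact phase shift)
t = np.arange(256)
x = np.sin(2 * np.pi * t / 32)
y = np.roll(x, 5)          # y lags x by 5 samples
print(time_lag(x, y))      # → 5.0
```

For real light curves the peak would be broadened by noise, which is exactly where error estimates such as those in the paper become necessary.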

  12. Puerto Rican understandings of child disability: methods for the cultural validation of standardized measures of child health.

    PubMed

    Gannotti, Mary E; Handwerker, W Penn

    2002-12-01

    Validating the cultural context of health is important for obtaining accurate and useful information from standardized measures of child health adapted for cross-cultural applications. This paper describes the application of ethnographic triangulation for cultural validation of a measure of childhood disability, the Pediatric Evaluation of Disability Inventory (PEDI) for use with children living in Puerto Rico. The key concepts include macro-level forces such as geography, demography, and economics, specific activities children performed and their key social interactions, beliefs, attitudes, emotions, and patterns of behavior surrounding independence in children and childhood disability, as well as the definition of childhood disability. Methods utilize principal components analysis to establish the validity of cultural concepts and multiple regression analysis to identify intracultural variation. Findings suggest culturally specific modifications to the PEDI, provide contextual information for informed interpretation of test scores, and point to the need to re-standardize normative values for use with Puerto Rican children. Without this type of information, Puerto Rican children may appear more disabled than expected for their level of impairment or not to be making improvements in functional status. The methods also allow for cultural boundaries to be quantitatively established, rather than presupposed. Copyright 2002 Elsevier Science Ltd.

  13. Image matching for digital close-range stereo photogrammetry based on constraints of Delaunay triangulated network and epipolar-line

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Sheng, Y. H.; Li, Y. Q.; Han, B.; Liang, Ch.; Sha, W.

    2006-10-01

    In the field of digital photogrammetry and computer vision, the determination of conjugate points in a stereo image pair, referred to as "image matching," is the critical step in realizing automatic surveying and recognition. Traditional matching methods encounter problems in digital close-range stereo photogrammetry because changes in gray-scale or texture are not obvious in close-range stereo images. The main shortcoming of traditional matching methods is that the geometric information of matching points is not fully used, which leads to wrong matches in regions with poor texture. To fully use both geometric and gray-scale information, a new stereo image matching algorithm is proposed in this paper, tailored to the characteristics of digital close-range photogrammetry. Compared with traditional matching methods, the new algorithm makes three improvements. First, shape factors, fuzzy mathematics, and gray-scale projection are introduced into the design of a synthetic matching measure. Second, the topological connection relations of matching points in a Delaunay triangulated network, together with the epipolar line, are used to decide the matching order and narrow the search scope for the conjugate point of each matching point. Last, the theory of parameter adjustment with constraints is introduced into least-squares image matching to carry out subpixel-level matching under the epipolar-line constraint. The new algorithm is applied to actual stereo images of a building taken by a digital close-range photogrammetric system. The experimental results show that the algorithm has higher matching speed and accuracy than a pyramid image matching algorithm based on gray-scale correlation.
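The epipolar-line search underlying such matching can be illustrated with a toy sketch. This shows only a normalized cross-correlation search along one row of a rectified pair; the Delaunay-network ordering and least-squares refinement of the paper are omitted, and all data are synthetic:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / d if d > 0 else 0.0

def match_along_epipolar(left, right, row, col, half=2):
    """Find the column in `right` (same row, rectified images) whose
    window best matches the window around (row, col) in `left`."""
    win = left[row - half:row + half + 1, col - half:col + half + 1]
    best_col, best_score = -1, -2.0
    for c in range(half, right.shape[1] - half):
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        score = ncc(win, cand)
        if score > best_score:
            best_col, best_score = c, score
    return best_col, best_score

# Synthetic rectified pair: the right image is the left shifted 3 columns
rng = np.random.default_rng(0)
left = rng.random((20, 40))
right = np.roll(left, 3, axis=1)
col, score = match_along_epipolar(left, right, row=10, col=15)
print(col)   # → 18 (the 3-column disparity), with score ≈ 1.0
```

In poorly textured regions the NCC peak is ambiguous, which is the motivation for adding geometric constraints such as the Delaunay topology used in the paper.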

  14. Particle Filter-Based Recursive Data Fusion With Sensor Indexing for Large Core Neutron Flux Estimation

    NASA Astrophysics Data System (ADS)

    Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol

    2017-06-01

    We introduce a sequential importance sampling particle filter (PF)-based multisensor multivariate nonlinear estimator for estimating the in-core neutron flux distribution of a pressurized heavy water reactor core. Many critical applications such as reactor protection and control rely upon neutron flux information, and thus its reliability is of utmost importance. The point kinetic model based on neutron transport conveniently describes the dynamics of a nuclear reactor. The neutron flux in a large, loosely coupled reactor core is sensed by multiple sensors measuring point fluxes at various locations inside the core. The flux values are coupled to each other through the diffusion equation, and this coupling provides redundancy in the information. It is shown that multiple independent measurements of the localized flux can be fused together to enhance the estimation accuracy to a great extent. We also propose a sensor anomaly handling feature in the multisensor PF that maintains the estimation process even when a sensor is faulty or generates anomalous data.
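A minimal bootstrap particle filter illustrating the fusion idea, in which each time step's particle weight is the product of per-sensor likelihoods. The scalar random-walk process model, noise levels, and sensor count are illustrative assumptions, not the reactor point-kinetics model of the paper:

```python
import numpy as np

def particle_filter(measurements, n_particles=2000, q=0.05, r=0.2, seed=1):
    """Bootstrap particle filter fusing several sensors per time step.
    `measurements` has shape (T, n_sensors); each column observes the
    same scalar state with independent Gaussian noise of std `r`."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for z in measurements:
        # propagate: random-walk process model with std `q`
        particles = particles + rng.normal(0.0, q, n_particles)
        # fuse sensors: log-weight = sum of per-sensor log-likelihoods
        logw = np.zeros(n_particles)
        for zi in z:
            logw += -0.5 * ((zi - particles) / r) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        # multinomial resampling
        idx = rng.choice(n_particles, n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)

# Constant true state 1.0 observed by 3 noisy sensors over 50 steps
rng = np.random.default_rng(2)
z = 1.0 + rng.normal(0.0, 0.2, (50, 3))
est = particle_filter(z)
print(round(est[-1], 2))   # fused estimate settles near the true value 1.0
```

Fusing the three sensors tightens the effective measurement noise by roughly a factor of sqrt(3) per step, which is the redundancy benefit the abstract describes.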

  15. The Unified Lunar Control Network 2005

    USGS Publications Warehouse

    Archinal, Brent A.; Rosiek, Mark R.; Kirk, Randolph L.; Redding, Bonnie L.

    2006-01-01

    This report documents a new general unified lunar control network and lunar topographic model based on a combination of Clementine images and a previous network derived from Earth-based and Apollo photographs and Mariner 10 and Galileo images. This photogrammetric network solution is the largest planetary control network ever completed. It includes the determination of the 3-D positions of 272,931 points on the lunar surface and the correction of the camera angles for 43,866 Clementine images, using 546,126 tie-point measurements. The solution RMS is 20 μm (= 0.9 pixels) in the image plane, with the largest residual being 6.4 pixels. The explanation given here, along with the accompanying files, comprises the release of the network information and of global lunar digital elevation models (DEMs) derived from the network. A paper describing the solution and network in further detail will be submitted to a refereed journal and will include additional background information, solution details, discussion of accuracy and precision, and explanatory figures.

  16. Design with limited anthropometric data: A method of interpreting sums of percentiles in anthropometric design.

    PubMed

    Albin, Thomas J

    2017-07-01

    Occasionally practitioners must work with single dimensions defined as combinations (sums or differences) of percentile values, but lack the information (e.g. variances) needed to estimate the accommodation achieved. This paper describes methods to predict accommodation proportions for such combinations of percentile values, e.g. two 90th percentile values. Kreifeldt and Nah z-score multipliers were used to estimate the proportions accommodated by combinations of percentile values of 2-15 variables; two simplified versions required less information about variance and/or correlation. The estimates were compared to actual observed proportions; for combinations of 2-15 percentile values the average absolute differences ranged between 0.5 and 1.5 percentage points. The multipliers were also used to estimate adjusted percentile values that, when combined, accommodate a desired proportion of the combined measurements. For combinations of two and three adjusted variables, the average absolute difference between predicted and observed proportions ranged between 0.5 and 3.0 percentage points. Copyright © 2017 Elsevier Ltd. All rights reserved.
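The underlying normal-theory calculation can be sketched as follows. This is a generic illustration of why summed percentiles over-accommodate, assuming jointly normal, equicorrelated dimensions; it is not the Kreifeldt and Nah multiplier tables themselves:

```python
from math import sqrt
from statistics import NormalDist

def accommodated(p, sigmas, rho=0.0):
    """Proportion of a population whose *sum* of zero-mean normal
    dimensions falls below the sum of each dimension's p-th percentile.
    Assumes equicorrelated (correlation `rho`) normal variables."""
    z = NormalDist().inv_cdf(p)
    threshold = z * sum(sigmas)            # sum of per-dimension percentiles
    var = sum(s * s for s in sigmas)
    var += 2 * rho * sum(sigmas[i] * sigmas[j]
                         for i in range(len(sigmas))
                         for j in range(i + 1, len(sigmas)))
    return NormalDist().cdf(threshold / sqrt(var))

# Summing two uncorrelated 90th percentiles accommodates ~96.5%, not 90%:
print(round(accommodated(0.90, [1.0, 1.0]), 3))           # → 0.965
# Perfect correlation recovers exactly 90%:
print(round(accommodated(0.90, [1.0, 1.0], rho=1.0), 3))  # → 0.9
```

The gap between 96.5% and the naive 90% is precisely the kind of discrepancy the paper's multipliers are designed to quantify and correct for.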

  17. Monitoring system of hydraulic lifting device based on the fiber optic sensors

    NASA Astrophysics Data System (ADS)

    Fajkus, Marcel; Nedoma, Jan; Novak, Martin; Martinek, Radek; Vanus, Jan; Mec, Pavel; Vasinek, Vladimir

    2017-10-01

    This article describes a monitoring system for a hydraulic lifting device based on fiber-optic sensors. To minimize the financial costs of the proposed monitoring system, a power-based evaluation of the measured signal was chosen. The solution is based on evaluating the signal obtained from single-point optical fiber sensors with overlapping reflective spectra. Polydimethylsiloxane (PDMS) polymer was used to encapsulate the sensors. Loading information is obtained from the deformation that the lifting device imposes on a pair of single-point optical fiber sensors mounted on the lifting device of the tested car. The proposed algorithm determines the pressure with an accuracy of +/- 5%. The proposed system was verified on several types of tested cars under different loads. The original contribution of the paper is the verification of a new low-cost system for monitoring a hydraulic lifting device based on fiber-optic sensors.

  18. Multiple-component Decomposition from Millimeter Single-channel Data

    NASA Astrophysics Data System (ADS)

    Rodríguez-Montoya, Iván; Sánchez-Argüelles, David; Aretxaga, Itziar; Bertone, Emanuele; Chávez-Dagostino, Miguel; Hughes, David H.; Montaña, Alfredo; Wilson, Grant W.; Zeballos, Milagros

    2018-03-01

    We present an implementation of a blind source separation algorithm to remove foregrounds from millimeter surveys made by single-channel instruments. To make such a decomposition possible over single-wavelength data, we generate levels of artificial redundancy, then perform a blind decomposition, calibrate the resulting maps, and lastly measure physical information. We simulate the reduction pipeline using mock data: atmospheric fluctuations, extended astrophysical foregrounds, and point-like sources; we then apply the same methodology to the Aztronomical Thermal Emission Camera/ASTE survey of the Great Observatories Origins Deep Survey–South (GOODS-S). In both applications, our technique robustly decomposes redundant maps into their underlying components, reducing flux bias, improving the signal-to-noise ratio, and minimizing information loss. In particular, GOODS-S is decomposed into four independent physical components: one is the already-known map of point sources, two are atmospheric and systematic foregrounds, and the fourth is an extended emission that can be interpreted as the confusion background of faint sources.
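The role of redundancy can be illustrated with a much simpler decomposition than the paper's blind source separation: a PCA (via SVD) of a stack of redundant mock one-dimensional "maps", in which the leading component recovers what is common to all realizations. All data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# A common "astrophysical" signal plus independent "atmospheric" noise
# in each of several redundant map realizations.
n_pix, n_maps = 500, 8
signal = np.sin(np.linspace(0, 6 * np.pi, n_pix))   # underlying component
maps = np.stack([signal + rng.normal(0, 1.0, n_pix) for _ in range(n_maps)])

# Blind decomposition of the redundant stack via SVD (PCA): the leading
# right-singular vector captures what is common to all maps.
centered = maps - maps.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
recovered = vt[0]
recovered *= np.sign(recovered @ signal)   # fix the SVD sign ambiguity

corr = np.corrcoef(recovered, signal)[0, 1]
print(round(corr, 2))   # leading component correlates strongly with signal
```

PCA can only separate components by variance; the independent component analysis family used for genuine blind source separation additionally exploits statistical independence, which is what lets the paper split atmosphere, systematics, point sources, and confusion background.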

  19. Outcry Consistency and Prosecutorial Decisions in Child Sexual Abuse Cases.

    PubMed

    Bracewell, Tammy E

    2018-05-18

    This study examines the correlation between the consistency of a child's sexual abuse outcry and the prosecutorial decision to accept or reject cases of child sexual abuse. Case-specific information was obtained from one Texas Children's Advocacy Center on all cases from 2010 to 2013. After the necessary deletions, the total number of cases included in the analysis was 309. An outcry was defined as a sexual abuse disclosure. Consistency was measured at both the forensic interview and the sexual assault exam. Logistic regression was used to evaluate whether a correlation existed between disclosure and prosecutorial decisions. Disclosure was statistically significant: partial disclosure (disclosure at one point in time and denial at another) versus full disclosure (disclosure at two points in time) had a statistically significant odds ratio of 4.801. Implications are discussed; specifically, how the different disciplines involved in child protection can take advantage of the expertise of both forensic interviewers and forensic nurses to inform their decisions.
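How an odds ratio of this kind relates to a logistic regression coefficient can be shown with a hypothetical 2x2 table. The counts below are invented for illustration (chosen to give an odds ratio near the reported 4.801) and are not the study's data:

```python
from math import log, exp

# Hypothetical 2x2 table: disclosure consistency vs. case acceptance
#                   accepted  rejected
# partial outcry         40        20
# full outcry            50       120
a, b = 40, 20    # partial disclosure: accepted, rejected
c, d = 50, 120   # full disclosure: accepted, rejected

odds_partial = a / b
odds_full = c / d
odds_ratio = odds_partial / odds_full
print(round(odds_ratio, 2))   # → 4.8

# In a logistic regression with a partial-vs-full indicator, the
# indicator's coefficient beta satisfies exp(beta) == this odds ratio.
beta = log(odds_partial) - log(odds_full)
assert abs(exp(beta) - odds_ratio) < 1e-12
```

An odds ratio of about 4.8 means the odds of acceptance for partial-disclosure cases are nearly five times the odds for full-disclosure cases, which is how the reported 4.801 should be read.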

  20. Fundamental procedures of geographic information analysis

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1981-01-01

    Analytical procedures common to most computer-oriented geographic information systems are composed of fundamental map processing operations. A conceptual framework for such procedures is developed and basic operations common to a broad range of applications are described. Among the major classes of primitive operations identified are those associated with: reclassifying map categories as a function of the initial classification, the shape, the position, or the size of the spatial configuration associated with each category; overlaying maps on a point-by-point, a category-wide, or a map-wide basis; measuring distance; establishing visual or optimal path connectivity; and characterizing cartographic neighborhoods based on the thematic or spatial attributes of the data values within each neighborhood. By organizing such operations in a coherent manner, the basis for a generalized cartographic modeling structure can be developed which accommodates a variety of needs in a common, flexible and intuitive manner. The use of each is limited only by the general thematic and spatial nature of the data to which it is applied.
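A few of these primitive operations (reclassification, point-by-point overlay, and distance measurement) can be sketched on a toy raster; the categories and cell values here are invented for illustration:

```python
import numpy as np

# A tiny raster "map": cell values are land-cover categories 1-3.
landcover = np.array([[1, 1, 2],
                      [2, 3, 3],
                      [3, 3, 1]])
elevation = np.array([[10, 12, 30],
                      [31, 45, 44],
                      [46, 48, 11]])

# Reclassify: map each category to a new value (here, a suitability score).
suitability = np.select([landcover == 1, landcover == 2, landcover == 3],
                        [3, 1, 2])

# Point-by-point overlay: combine two layers cell by cell.
high_and_suitable = (elevation > 30) & (suitability >= 2)

# Distance measurement: Euclidean cell distance from a target cell.
rows, cols = np.indices(landcover.shape)
dist = np.hypot(rows - 0, cols - 0)   # distance from the top-left cell

print(high_and_suitable.astype(int))  # → [[0 0 0] [0 1 1] [1 1 0]]
print(np.round(dist, 2))
```

Chaining such primitives cell by cell is the essence of the cartographic modeling structure the paper describes: each operation takes one or more map layers and yields a new layer.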
