Sample records for objective analysis precision

  1. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) aids in supporting the decision making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution at each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
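
    A minimal sketch of the two archiving strategies contrasted above may help. The code below is an illustration under assumptions (minimization, at most one solution kept per grid cell, ties resolved in favor of the incumbent), not the authors' implementation:

      # Pareto dominance for minimization problems.
      def dominates(a, b):
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      # Grid-cell (box) index of an objective vector for grid size eps.
      def box(objs, eps):
          return tuple(int(o // e) for o, e in zip(objs, eps))

      def eps_archive(archive, new, eps):
          # Epsilon archiving: box indices are recomputed and compared on
          # every insertion attempt.
          nb = box(new, eps)
          if any(dominates(box(a, eps), nb) for a in archive):
              return archive                       # new solution's box is dominated
          archive = [a for a in archive if not dominates(nb, box(a, eps))]
          same = [a for a in archive if box(a, eps) == nb]
          if not same or dominates(new, same[0]):  # simplified tie-breaking
              archive = [a for a in archive if box(a, eps) != nb] + [new]
          return archive

      def rounded_archive(archive, new, eps):
          # Alternative: round each objective to its precision level once,
          # then apply plain Pareto archiving on the rounded values.
          r = tuple(round(o / e) * e for o, e in zip(new, eps))
          if any(dominates(a, r) for a in archive):
              return archive
          return [a for a in archive if not dominates(r, a)] + [r]

    In the rounding variant each solution is quantized once on entry, so later comparisons are plain dominance checks on stored values rather than repeated box-index computations; this is the efficiency argument sketched in the abstract.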

  2. The neural basis of precise visual short-term memory for complex recognisable objects.

    PubMed

    Veldsman, Michele; Mitchell, Daniel J; Cusack, Rhodri

    2017-10-01

    Recent evidence suggests that visual short-term memory (VSTM) capacity estimated using simple objects, such as colours and oriented bars, may not generalise well to more naturalistic stimuli. More visual detail can be stored in VSTM when complex, recognisable objects are maintained compared to simple objects. It is not yet known if it is recognisability that enhances memory precision, nor whether maintenance of recognisable objects is achieved with the same network of brain regions supporting maintenance of simple objects. We used a novel stimulus generation method to parametrically warp photographic images along a continuum, allowing separate estimation of the precision of memory representations and the number of items retained. The stimulus generation method was also designed to create unrecognisable, though perceptually matched, stimuli, to investigate the impact of recognisability on VSTM. We adapted the widely-used change detection and continuous report paradigms for use with complex, photographic images. Across three functional magnetic resonance imaging (fMRI) experiments, we demonstrated greater precision for recognisable objects in VSTM compared to unrecognisable objects. This clear behavioural advantage was not the result of recruitment of additional brain regions, or of stronger mean activity within the core network. Representational similarity analysis revealed greater variability across item repetitions in the representations of recognisable, compared to unrecognisable complex objects. We therefore propose that a richer range of neural representations support VSTM for complex recognisable objects. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. The role of judgment frames and task precision in object attention: Reduced template sharpness limits dual-object performance.

    PubMed

    Liu, Shiau-Hua; Dosher, Barbara Anne; Lu, Zhong-Lin

    2009-06-01

    Multiple attributes of a single object are often processed more easily than attributes of different objects, a phenomenon associated with object attention. Here we investigate the influence of two factors, judgment frames and judgment precision, on dual-object report deficits as an index of object attention. [Han, S., Dosher, B., & Lu, Z.-L. (2003). Object attention revisited: Identifying mechanisms and boundary conditions. Psychological Science, 14, 598-604] predicted that consistency of the frame for judgments about two separate objects could reduce or eliminate the expression of object attention limitations. The current studies examine the effects of judgment frames and of task precision in orientation identification and find that dual-object report deficits within one feature are indeed affected modestly by the congruency of the judgments and more substantially by the required precision of the judgments. The observed dual-object deficits affected contrast thresholds for incongruent frame conditions and for high-precision judgments and reduced psychometric asymptotes. These dual-object deficits reflect a combined effect of multiplicative noise and external noise exclusion in dual-object conditions, both related to the effects of attention on the tuning of perceptual templates. These results have implications for the modification of object attention theory and for understanding limitations on concurrent tasks.

  4. Multiple-objective optimization in precision laser cutting of different thermoplastics

    NASA Astrophysics Data System (ADS)

    Tamrin, K. F.; Nukman, Y.; Choudhury, I. A.; Shirley, S.

    2015-04-01

    Thermoplastics are increasingly being used in the biomedical, automotive and electronics industries due to their excellent physical and chemical properties. Because the process is localized and non-contact, the use of lasers for cutting can result in a precise cut with a small heat-affected zone (HAZ). Precision laser cutting of various materials is important in high-volume manufacturing processes to minimize operational cost, reduce errors and improve product quality. This study uses grey relational analysis to determine a single optimized set of cutting parameters for three different thermoplastics. The set of optimized processing parameters is determined based on the highest relational grade and was found at low laser power (200 W), high cutting speed (0.4 m/min) and low compressed air pressure (2.5 bar). The result matches the objective set in the present study. Analysis of variance (ANOVA) is then carried out to ascertain the relative influence of process parameters on the cutting characteristics. It was found that the laser power has a dominant effect on the HAZ for all thermoplastics.
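
    Grey relational analysis itself is mechanical enough to sketch. The following is a generic implementation of the standard steps (normalization, grey relational coefficients against the ideal sequence, equal-weight grades), with the distinguishing coefficient zeta = 0.5 assumed; it is not the authors' code:

      import numpy as np

      def grey_relational_grade(X, larger_is_better, zeta=0.5):
          """X: (n_runs, n_responses) measured responses; larger_is_better:
          one boolean per response. Returns one grade per experimental run;
          the highest grade marks the optimized run."""
          X = np.asarray(X, dtype=float)
          lo, hi = X.min(axis=0), X.max(axis=0)
          # Normalize so that 1.0 is ideal for every response.
          norm = np.where(larger_is_better, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
          delta = np.abs(1.0 - norm)              # deviation from the ideal
          coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
          return coeff.mean(axis=1)               # equal response weights assumed

    Ranking runs by this grade is how a single compromise setting (here, the reported 200 W, 0.4 m/min, 2.5 bar) can be selected for all three thermoplastics at once.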

  5. Analysis and Test Support for Phillips Laboratory Precision Structures

    DTIC Science & Technology

    1998-11-01

    Air Force Research Laboratory (AFRL), Phillips Research Site. Task objectives centered...around analysis and structural dynamic test support on experiments within the Space Vehicles Directorate at Kirtland Air Force Base. These efforts help...support for Phillips Laboratory Precision Structures." Mr. James Goodding of CSA Engineering was the principal investigator for this task. Mr.

  6. Effects of material properties and object orientation on precision grip kinematics.

    PubMed

    Paulun, Vivian C; Gegenfurtner, Karl R; Goodale, Melvyn A; Fleming, Roland W

    2016-08-01

    Successfully picking up and handling objects requires taking into account their physical properties (e.g., material) and position relative to the body. Such features are often inferred by sight, but it remains unclear to what extent observers vary their actions depending on the perceived properties. To investigate this, we asked participants to grasp, lift and carry cylinders to a goal location with a precision grip. The cylinders were made of four different materials (Styrofoam, wood, brass and an additional brass cylinder covered with Vaseline) and were presented at six different orientations with respect to the participant (0°, 30°, 60°, 90°, 120°, 150°). Analysis of their grasping kinematics revealed differences in timing and spatial modulation at all stages of the movement that depended on both material and orientation. Object orientation affected the spatial configuration of index finger and thumb during the grasp, but also the timing of handling and transport duration. Material affected the choice of local grasp points and the duration of the movement from the first visual input until release of the object. We find that conditions that make grasping more difficult (orientation with the base pointing toward the participant, high weight and low surface friction) lead to longer durations of individual movement segments and a more careful placement of the fingers on the object.

  7. Precision Teaching.

    ERIC Educational Resources Information Center

    Couch, Richard W.

    Precision teaching (PT) is an approach to the science of human behavior that focuses on precise monitoring of carefully defined behaviors in an attempt to construct an environmental analysis of that behavior and its controlling variables. A variety of subjects have been used with PT, ranging in academic objectives from beginning reading to college…

  8. Memory for a single object has differently variable precisions for relevant and irrelevant features.

    PubMed

    Swan, Garrett; Collins, John; Wyble, Brad

    2016-01-01

    Working memory is a limited resource. To further characterize its limitations, it is vital to understand exactly what is encoded about a visual object beyond the "relevant" features probed in a particular task. We measured the memory quality of a task-irrelevant feature of an attended object by coupling a delayed estimation task with a surprise test. Participants were presented with a single colored arrow and were asked to retrieve just its color for the first half of the experiment before unexpectedly being asked to report its direction. Mixture modeling of the data revealed that participants had highly variable precision on the surprise test, indicating a coarse-grained memory for the irrelevant feature. Following the surprise test, all participants could precisely recall the arrow's direction; however, this improvement in direction memory came at a cost in precision for color memory even though only a single object was being remembered. We attribute these findings to varying levels of attention to different features during memory encoding.
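
    The mixture modeling mentioned here is commonly the two-component model of delayed-estimation data: a von Mises distribution around the target plus a uniform guessing component. A minimal sketch of such a fit, assuming circular report errors wrapped to (-pi, pi] and not reflecting the authors' exact model:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import vonmises

      def neg_log_lik(params, errors):
          kappa, p_mem = params
          # Memory component (von Mises) plus uniform guessing on the circle.
          like = p_mem * vonmises.pdf(errors, kappa) + (1 - p_mem) / (2 * np.pi)
          return -np.sum(np.log(like))

      def fit_mixture(errors):
          res = minimize(neg_log_lik, x0=[5.0, 0.8], args=(errors,),
                         bounds=[(0.01, 200.0), (0.001, 1.0)])
          kappa, p_mem = res.x
          return kappa, p_mem  # kappa tracks precision, p_mem retention

    Highly variable precision on the surprise test would show up as unstable kappa estimates across participants in a fit like this one.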

  9. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    NASA Astrophysics Data System (ADS)

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    In order to investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was numerically simulated. The distributions of dynamic pressure and turbulent viscosity of the abrasive flow field in the high-precision dispensing needle were analyzed under different volume fraction conditions. Through comparative analysis, the effectiveness of abrasive-grain polishing of the high-precision dispensing needle was studied. Controlling the volume fraction of silicon carbide can change the viscosity characteristics of the abrasive flow during the polishing process, so that the polishing quality of the abrasive grains can be controlled.

  10. Quantization and training of object detection networks with low-precision weights and activations

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Liu, Jian; Zhou, Li; Wang, Yun; Chen, Jie

    2018-01-01

    As convolutional neural networks have demonstrated state-of-the-art performance in object recognition and detection, there is a growing need for deploying these systems on resource-constrained mobile platforms. However, the computational burden and energy consumption of inference for these networks are significantly higher than what most low-power devices can afford. To address these limitations, this paper proposes a method to train object detection networks with low-precision weights and activations. The probability density functions of the weights and activations of each layer are first directly estimated using piecewise Gaussian models. Then, the optimal quantization intervals and step sizes for each convolution layer are adaptively determined according to the distribution of the weights and activations. As the most computationally expensive convolutions can be replaced by effective fixed-point operations, the proposed method can drastically reduce computation complexity and memory footprint. Applied to the tiny You Only Look Once (YOLO) and full YOLO architectures, the proposed method achieves accuracy comparable to their 32-bit counterparts. As an illustration, the proposed 4-bit and 8-bit quantized versions of the YOLO model achieve a mean average precision (mAP) of 62.6% and 63.9%, respectively, on the Pascal Visual Object Classes 2012 test dataset, while the mAP of the 32-bit full-precision baseline model is 64.0%.
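
    For illustration, a much-simplified version of the weight-quantization step: the paper fits piecewise Gaussian models to choose intervals, whereas the sketch below just clips at an empirical quantile and rounds to a uniform n-bit grid (all parameter choices here are assumptions):

      import numpy as np

      def quantize_weights(w, n_bits=4, coverage=0.999):
          levels = 2 ** (n_bits - 1) - 1            # symmetric signed grid
          t = np.quantile(np.abs(w), coverage)      # clipping threshold
          step = t / levels                         # quantization step size
          q = np.clip(np.round(w / step), -levels, levels)
          return q * step, step                     # dequantized weights, step

    Replacing 32-bit multiplications with such fixed-point values is what yields the memory and compute savings the abstract reports for the 4-bit and 8-bit YOLO variants.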

  11. A Mission Planning Approach for Precision Farming Systems Based on Multi-Objective Optimization.

    PubMed

    Zhai, Zhaoyu; Martínez Ortega, José-Fernán; Lucas Martínez, Néstor; Rodríguez-Molina, Jesús

    2018-06-02

    As the demand for food grows continuously, intelligent agriculture has drawn much attention due to its capability of producing great quantities of food efficiently. The main purpose of intelligent agriculture is to plan agricultural missions properly and use limited resources reasonably with minor human intervention. This paper proposes a Precision Farming System (PFS) as a Multi-Agent System (MAS). Components of the PFS are treated as agents with different functionalities. These agents can form several coalitions to complete complex agricultural missions cooperatively. In the PFS, mission planning should consider several criteria, like expected benefit, energy consumption or equipment loss. Hence, mission planning can be treated as a Multi-objective Optimization Problem (MOP). In order to solve the MOP, an improved algorithm, MP-PSOGA, is proposed, taking advantage of Genetic Algorithms and Particle Swarm Optimization. A simulation, called the precise pesticide spraying mission, is performed to verify the feasibility of the proposed approach. The simulation results illustrate that the proposed approach works properly and enables the PFS to plan missions and allocate scarce resources efficiently. The theoretical analysis and simulation are a good foundation for future study. Once the proposed approach is applied to a real scenario, it is expected to bring significant economic improvement.

  12. Determining characteristics of artificial near-Earth objects using observability analysis

    NASA Astrophysics Data System (ADS)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
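
    In the linear(ized) setting, the observability test reduces to a rank condition. A generic sketch, assuming a measurement-partials matrix H and state transition matrices accumulated over the arc (the paper's pre-whitened, extended-state analysis is more involved):

      import numpy as np

      def observability_matrix(phis, H):
          """phis: state transition matrices Phi(t_k, t_0) over the arc;
          H: measurement partials. Stacks H @ Phi_k for every epoch."""
          return np.vstack([H @ phi for phi in phis])

      def is_observable(phis, H, tol=1e-10):
          O = observability_matrix(phis, H)
          return np.linalg.matrix_rank(O, tol) == phis[0].shape[0]

    Extending the state with a solar radiation pressure parameter enlarges Phi and H accordingly, and the epoch at which the stacked matrix first reaches full rank indicates when that parameter becomes observable.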

  13. Statistical analysis for improving data precision in the SPME GC-MS analysis of blackberry (Rubus ulmifolius Schott) volatiles.

    PubMed

    D'Agostino, M F; Sanz, J; Martínez-Castro, I; Giuffrè, A M; Sicari, V; Soria, A C

    2014-07-01

    Statistical analysis has been used for the first time to evaluate the dispersion of quantitative data in the solid-phase microextraction (SPME) followed by gas chromatography-mass spectrometry (GC-MS) analysis of blackberry (Rubus ulmifolius Schott) volatiles, with the aim of improving their precision. Experimental and randomly simulated data were compared using different statistical parameters (correlation coefficients, Principal Component Analysis loadings and eigenvalues). Non-random factors were shown to contribute significantly to total dispersion; groups of volatile compounds could be associated with these factors. A significant improvement in precision was achieved when considering percent concentration ratios, rather than percent values, among those blackberry volatiles with a similar dispersion behavior. As a novelty over previous references, and to complement this main objective, the presence of non-random dispersion trends in data from simple blackberry model systems was evidenced. Although the influence of the type of matrix on data precision was proved, a better understanding of the dispersion patterns in real samples could not be obtained from the model systems. The approach used here was validated for the first time through the multicomponent characterization of Italian blackberries from different harvest years. Copyright © 2014 Elsevier B.V. All rights reserved.
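
    The comparison of experimental against randomly simulated data can be pictured with a parallel-analysis-style check: compare PCA eigenvalues of the real data matrix with those of column-shuffled surrogates. This is a hedged sketch of the idea, not the authors' exact procedure:

      import numpy as np

      def eigenvalues(X):
          Xc = X - X.mean(axis=0)
          return np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]

      def non_random_components(X, n_sim=200, seed=0):
          rng = np.random.default_rng(seed)
          ev = eigenvalues(X)
          sims = np.array([eigenvalues(np.column_stack(
              [rng.permutation(col) for col in X.T])) for _ in range(n_sim)])
          # Components whose eigenvalue exceeds the 95th percentile of the
          # shuffled distribution are attributed to non-random factors.
          return ev > np.percentile(sims, 95, axis=0)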

  14. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation.

    PubMed

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high-precision and high-fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges at its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, the pseudo-rigid-body model and finite element analysis, was utilized to expedite the prototyping procedure, leading to the establishment of a high-performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within the microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages of each individual configuration while mutually compensating for the limitations inherent in them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high-grade aluminum alloy (Al 7075-T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate a high level of compliance in comparison to the computational results. A high amplification characteristic and a maximum achievable stroke of 100 μm can be achieved.

  15. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation

    NASA Astrophysics Data System (ADS)

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high-precision and high-fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges at its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, the pseudo-rigid-body model and finite element analysis, was utilized to expedite the prototyping procedure, leading to the establishment of a high-performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within the microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages of each individual configuration while mutually compensating for the limitations inherent in them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high-grade aluminum alloy (Al 7075-T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate a high level of compliance in comparison to the computational results. A high amplification characteristic and a maximum achievable stroke of 100 μm can be achieved.

  16. Precision Efficacy Analysis for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.

    When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross-validity approach to select sample sizes…

  17. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder

    PubMed Central

    Chen, Shuoyang; Xu, Tingfa; Li, Daqun; Zhang, Jizhou; Jiang, Shenwang

    2016-01-01

    During the process of moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as "frame difference" and "optical flow", may not be able to deal with the problem very well. In such scenarios, we use a modified algorithm to do the background modeling work. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. Then we use a "multi-block temporal-analyzing LBP (Local Binary Pattern)" algorithm to do the segmentation. Finally, connected-component analysis is used to locate the object. We also produced a hardware platform, the core of which consists of DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder. PMID:27775671

  18. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder.

    PubMed

    Chen, Shuoyang; Xu, Tingfa; Li, Daqun; Zhang, Jizhou; Jiang, Shenwang

    2016-10-21

    During the process of moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as "frame difference" and "optical flow", may not be able to deal with the problem very well. In such scenarios, we use a modified algorithm to do the background modeling work. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. Then we use a "multi-block temporal-analyzing LBP (Local Binary Pattern)" algorithm to do the segmentation. Finally, connected-component analysis is used to locate the object. We also produced a hardware platform, the core of which consists of DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder.
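
    A condensed sketch of the pipeline described above, with the multi-block temporal LBP segmentation simplified away and all thresholds assumed:

      import cv2

      def detect_moving_object(prev_gray, curr_gray):
          # Edge difference is more robust to illumination change than raw
          # frame difference.
          e1 = cv2.Canny(prev_gray, 50, 150)
          e2 = cv2.Canny(curr_gray, 50, 150)
          diff = cv2.absdiff(e2, e1)
          _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY)
          mask = cv2.dilate(mask, None, iterations=2)
          # Connected components locate the object; keep the largest blob.
          n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
          if n < 2:
              return None
          i = 1 + int(stats[1:, cv2.CC_STAT_AREA].argmax())
          x, y, w, h = stats[i, :4]
          return x, y, w, h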

  19. An object-based image analysis approach for aquaculture ponds precise mapping and monitoring: a case study of Tam Giang-Cau Hai Lagoon, Vietnam.

    PubMed

    Virdis, Salvatore Gonario Pasquale

    2014-01-01

    Monitoring and mapping shrimp farms, including their impact on land cover and land use, is critical to the sustainable management and planning of coastal zones. In this work, a methodology was proposed to set up a cost-effective and reproducible procedure that made use of satellite remote sensing, an object-based classification approach, and open-source software for mapping aquaculture areas with high planimetric and thematic accuracy between 2005 and 2008. The analysis focused on two characteristic areas of interest of the Tam Giang-Cau Hai Lagoon (in central Vietnam), which have similar farming systems to other coastal aquaculture worldwide: the first was primarily characterised by locally referred to as "low tide" shrimp ponds, which are partially submerged areas; the second by earthed shrimp ponds, locally referred to as "high tide" ponds, which are non-submerged areas on the lagoon coast. The approach was based on the region-growing segmentation of high- and very high-resolution panchromatic images, SPOT5 and Worldview-1, and the unsupervised clustering classifier ISOSEG embedded in SPRING non-commercial software. The results, the accuracy of which was tested with a field-based aquaculture inventory, showed that in favourable situations (high tide shrimp ponds), the classification provided high rates of accuracy (>95%) through a fully automatic object-based classification. In unfavourable situations (low tide shrimp ponds), the performance degraded due to the low contrast between the water and the pond embankments. In these situations, the automatic results were improved by manual delineation of the embankments. Worldview-1 necessarily showed better thematic accuracy, and precise maps have been realised at a scale of up to 1:2,000. However, SPOT5 provided comparable results in terms of the number of correctly classified ponds, but less accurate results in terms of the precision of mapped features. The procedure also demonstrated high degrees of reproducibility

  20. Precision and Accuracy of Analysis for Boron in ITP Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tovo, L.L.

    Inductively Coupled Plasma Emission Spectroscopy (ICPES) has been used by the Analytical Development Section (ADS) to measure boron in catalytic tetraphenylboron decomposition studies performed by the Waste Processing Technology (WPT) section. Analysis of these samples is complicated due to the presence of high concentrations of sodium and organic compounds. Previously, we found signal suppression in samples analyzed "as received". We suspected that the suppression was due to the high organic concentration (up to 0.01 molar organic decomposition products) in the samples. When the samples were acid digested prior to analysis, the suppression was eliminated. The precision of the reported boron concentration was estimated as 10 percent based on the known precision of the inorganic boron standard used for calibration and quality control check of the ICPES analysis. However, a precision better than 10 percent was needed to evaluate ITP process operating parameters. Therefore, the purpose of this work was (1) to measure, instead of estimating, the precision of the boron measurement on ITP samples and (2) to determine the optimum precision attainable with current instrumentation.

  1. In vivo glenohumeral analysis using 3D MRI models and a flexible software tool: feasibility and precision.

    PubMed

    Busse, Harald; Thomas, Michael; Seiwerts, Matthias; Moche, Michael; Busse, Martin W; von Salis-Soglio, Georg; Kahn, Thomas

    2008-01-01

    To implement a PC-based morphometric analysis platform and to evaluate the feasibility and precision of MRI measurements of glenohumeral translation. Using a vertically open 0.5T MRI scanner, the shoulders of 10 healthy subjects were scanned in apprehension (AP) and in neutral position (NP), respectively. Surface models of the humeral head (HH) and the glenoid cavity (GC) were created from segmented MR images by three readers. Glenohumeral translation was determined by the projection point of the manually fitted HH center on the GC plane defined by the two main principal axes of the GC model. Positional precision, given as mean (extreme value at 95% confidence level), was 0.9 (1.8) mm for the HH center and 0.7 (1.6) mm for the GC centroid; angular GC precision was 1.3 degrees (2.3 degrees) for the normal and about 4 degrees (7 degrees) for the anterior and superior coordinate axes. The two-dimensional (2D) precision of the HH projection point was 1.1 (2.2) mm. A significant HH translation between AP and NP was found. Despite the limited quality of the underlying model data, our PC-based analysis platform allows a precise morphometric analysis of the glenohumeral joint. The software is easily extendable and may potentially be used for an objective evaluation of therapeutic measures.
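
    The geometric core of the measurement (defining the GC plane by the two main principal axes and projecting the fitted HH center onto it) can be sketched as follows; point clouds and units are assumed, and this is not the authors' software:

      import numpy as np

      def project_hh_on_gc(gc_points, hh_center):
          """gc_points: (n, 3) vertices of the glenoid cavity model;
          hh_center: fitted humeral head center. Returns 2D coordinates of
          the projection in the GC plane."""
          centroid = gc_points.mean(axis=0)
          # Principal axes of the centered GC surface via SVD.
          _, _, vt = np.linalg.svd(gc_points - centroid, full_matrices=False)
          e1, e2 = vt[0], vt[1]          # in-plane axes; vt[2] is the normal
          d = hh_center - centroid
          return np.array([d @ e1, d @ e2])

    Glenohumeral translation between apprehension and neutral positions is then the difference of these 2D projection points.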

  2. The economic case for precision medicine.

    PubMed

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers.

  3. The economic case for precision medicine

    PubMed Central

    Gavan, Sean P.; Thompson, Alexander J.; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers. PMID:29682615

  4. Taxonomy based analysis of force exchanges during object grasping and manipulation

    PubMed Central

    Martin-Brevet, Sandra; Jarrassé, Nathanaël; Burdet, Etienne

    2017-01-01

    The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies describe qualitatively the classes of hand postures for object grasping and manipulation. On the other hand, the quantitative analysis of hand function has been generally restricted to precision grip (with thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of descriptions, by investigating quantitatively the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The instrumented object was a parallelepiped able to measure the force exerted on each of its six faces and its own acceleration. The grasping force was estimated from the lateral forces and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four tasks corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend this to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans. This protocol will be used in the future to

  5. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  6. Object properties and cognitive load in the formation of associative memory during precision lifting.

    PubMed

    Li, Yong; Randerath, Jennifer; Bauer, Hans; Marquardt, Christian; Goldenberg, Georg; Hermsdörfer, Joachim

    2009-01-03

    When we manipulate familiar objects in our daily life, our grip force anticipates the physical demands right from the moment of contact with the object, indicating the existence of a memory for relevant object properties. This study explores the formation and consolidation of the memory processes that associate either familiar (size) or arbitrary object features (color) with object weight. In the general task, participants repetitively lifted two differently weighted objects (580 and 280 g) in a pseudo-random order. Forty young healthy adults participated in this study and were randomly distributed into four groups: Color Cue Single task (CCS, blue and red, 9.8^3 cm^3), Color Cue Dual task (CCD), No Cue (NC) and Size Cue (SC, 9.8^3 and 6^3 cm^3) group. All groups performed a repetitive precision grasp-lift task and were retested with the same protocol after a 5-min pause. The CCD group was also required to simultaneously perform a memory task during each lift of differently weighted objects coded by color. The results show that groups lifting objects with arbitrary or familiar features successfully formed the association between object weight and manipulated object features and incorporated this into grip force programming, as observed in the different scaling of grip force and grip force rate for different object weights. An arbitrary feature, i.e., color, can be sufficiently associated with object weight, however with less strength than the familiar feature of size. The simultaneous memory task impaired anticipatory force scaling during repetitive object lifting but did not jeopardize the learning process and the consolidation of the associative memory.

  7. Effectiveness of Spectral Similarity Measures to Develop Precise Crop Spectra for Hyperspectral Data Analysis

    NASA Astrophysics Data System (ADS)

    Chauhan, H.; Krishna Mohan, B.

    2014-11-01

    The present study was undertaken with the objective of checking the effectiveness of spectral similarity measures for developing precise crop spectra from collected hyperspectral field spectra. In multispectral and hyperspectral remote sensing, classification of pixels is obtained by statistical comparison (by means of spectral similarity) of known field or library spectra to unknown image spectra. Though these algorithms are readily used, little emphasis has been placed on the use of spectral similarity measures to select precise crop spectra from a set of field spectra. Conventionally, crop spectra are developed after rejecting outliers based only on broad-spectrum analysis. Here a successful attempt has been made to develop precise crop spectra based on spectral similarity. As the use of unevaluated data leads to uncertainty in image classification, it is crucial to evaluate the data; hence, notwithstanding the conventional method, data precision was assessed to serve the purpose of the present research. The effectiveness of the precisely developed field spectra was evaluated by spectral discrimination measures, which yielded higher discrimination values than spectra developed conventionally. Overall classification accuracy is 51.89% for the image classified by conventionally selected field spectra and 75.47% for the image classified by field spectra selected precisely on the basis of spectral similarity. KHAT values are 0.37 and 0.62, and Z values are 2.77 and 9.59, for the images classified using conventional and precise field spectra, respectively. The reasonably higher classification accuracy, KHAT, and Z values show the possibility of a new approach to field spectra selection based on spectral similarity measures.
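
    One concrete instance of a spectral similarity measure is the spectral angle; below is a sketch of outlier rejection built on it (the threshold and aggregation rule are assumptions, and the paper evaluates several measures):

      import numpy as np

      def spectral_angle(s1, s2):
          cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
          return np.arccos(np.clip(cos, -1.0, 1.0))  # radians; smaller = more similar

      def select_precise_spectra(spectra, threshold):
          """Keep field spectra whose mean angle to all others is below the
          threshold; the survivors define the 'precise' crop spectrum."""
          n = len(spectra)
          mean_ang = [np.mean([spectral_angle(spectra[i], spectra[j])
                               for j in range(n) if j != i]) for i in range(n)]
          return [s for s, a in zip(spectra, mean_ang) if a < threshold]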

  8. Control of Precision Grip Force in Lifting and Holding of Low-Mass Objects

    PubMed Central

    Kimura, Daisuke; Kadota, Koji; Ito, Taro

    2015-01-01

    Few studies have investigated the control of grip force when manipulating an object with an extremely small mass using a precision grip, although some related information has been provided by studies conducted in an unusual microgravity environment. Grip-load force coordination was examined while healthy adults (N = 17) held a moveable instrumented apparatus with its mass changed between 6 g and 200 g in 14 steps, with its grip surface set as either sandpaper or rayon. Additional measurements of grip-force-dependent finger-surface contact area and finger skin indentation, as well as a test of weight discrimination, were also performed. For each surface condition, the static grip force was modulated in parallel with load force while holding the object of a mass above 30 g. For objects with mass smaller than 30 g, on the other hand, the parallel relationship was changed, resulting in a progressive increase in grip-to-load force (GF/LF) ratio. The rayon had a higher GF/LF force ratio across all mass levels. The proportion of safety margin in the static grip force and normalized moment-to-moment variability of the static grip force were also elevated towards the lower end of the object mass for both surfaces. These findings indicate that the strategy of grip force control for holding objects with an extremely small mass differs from that with a mass above 30 g. The data for the contact area, skin indentation, and weight discrimination suggest that a decreased level of cutaneous feedback signals from the finger pads could have played some role in a cost function in efficient grip force control with low-mass objects. The elevated grip force variability associated with signal-dependent and internal noises, and anticipated inertial force on the held object due to acceleration of the arm and hand, could also have contributed to the cost function. PMID:26376484

  9. Precision Machining Technologies. Occupational Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Occupational Competency Analysis Profile (OCAP), which is one of a series of OCAPs developed to identify the skills that Ohio employers deem necessary to entering a given occupation/occupational area, lists the occupational, academic, and employability skills required of individuals entering the occupation of precision machinist. The…

  10. Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis

    PubMed Central

    Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.

    2015-01-01

    Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505

  11. Droplet-counting Microtitration System for Precise On-site Analysis.

    PubMed

    Kawakubo, Susumu; Omori, Taichi; Suzuki, Yasutada; Ueta, Ikuo

    2018-01-01

    A new microtitration system based on the counting of titrant droplets has been developed for precise on-site analysis. The dropping rate was controlled by inserting a capillary tube as a flow resistance in a laboratory-made micropipette. The error of titration was 3% in a simulated titration with 20 droplets. The pre-addition of a titrant was proposed for precise titration within an error of 0.5%. The analytical performances were evaluated for chelate titration, redox titration and acid-base titration.
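
    The precision figures follow from simple counting arithmetic: resolution is about one droplet, so the relative error scales as one over the total titrant volume expressed in droplets. A back-of-envelope sketch with hypothetical volumes:

      droplet_vol = 0.025                 # mL per droplet (hypothetical)
      counted = 20                        # droplets counted at the endpoint
      error_counting_only = 1 / counted   # ~5% worst case; 3% was observed

      pre_added_droplets = 180            # titrant pre-added in bulk (hypothetical)
      total = pre_added_droplets + counted
      error_with_preaddition = 1 / total  # 1/200 = 0.5%, matching the abstract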

  12. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, the application of mass spectrometers making use of time-of-flight mass analysers is a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative data analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
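
    The peak finding plus numerical integration step can be pictured with library routines; the sketch below (the noise estimate, window width and prominence threshold are all assumptions) stands in for the paper's more robust algorithm:

      import numpy as np
      from scipy.signal import find_peaks

      def peak_areas(t, y, window=50):
          noise = np.std(y[:200])                 # baseline assumed up front
          peaks, _ = find_peaks(y, prominence=5 * noise)
          areas = []
          for p in peaks:
              lo, hi = max(p - window, 0), min(p + window, len(y))
              areas.append(np.trapz(y[lo:hi], t[lo:hi]))  # trapezoid rule
          return peaks, areas

    Isotope ratios then follow from pairwise division of the integrated peak areas, and their scatter across spectra gives the quoted accuracy figures.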

  13. Defining precision: The precision medicine initiative trials NCI-MPACT and NCI-MATCH.

    PubMed

    Coyne, Geraldine O'Sullivan; Takebe, Naoko; Chen, Alice P

    "Precision" trials, using rationally incorporated biomarker targets and molecularly selective anticancer agents, have become of great interest to both patients and their physicians. In the endeavor to test the cornerstone premise of precision oncotherapy, that is, determining if modulating a specific molecular aberration in a patient's tumor with a correspondingly specific therapeutic agent improves clinical outcomes, the design of clinical trials with embedded genomic characterization platforms which guide therapy are an increasing challenge. The National Cancer Institute Precision Medicine Initiative is an unprecedented large interdisciplinary collaborative effort to conceptualize and test the feasibility of trials incorporating sequencing platforms and large-scale bioinformatics processing that are not currently uniformly available to patients. National Cancer Institute-Molecular Profiling-based Assignment of Cancer Therapy and National Cancer Institute-Molecular Analysis for Therapy Choice are 2 genomic to phenotypic trials under this National Cancer Institute initiative, where treatment is selected according to predetermined genetic alterations detected using next-generation sequencing technology across a broad range of tumor types. In this article, we discuss the objectives and trial designs that have enabled the public-private partnerships required to complete the scale of both trials, as well as interim trial updates and strategic considerations that have driven data analysis and targeted therapy assignment, with the intent of elucidating further the benefits of this treatment approach for patients. Copyright © 2017. Published by Elsevier Inc.

  14. Fully Nonlinear Modeling and Analysis of Precision Membranes

    NASA Technical Reports Server (NTRS)

    Pai, P. Frank; Young, Leyland G.

    2003-01-01

    High precision membranes are used in many current space applications. This paper presents a fully nonlinear membrane theory with forward and inverse analyses of high precision membrane structures. The fully nonlinear membrane theory is derived from Jaumann strains and stresses, exact coordinate transformations, the concept of local relative displacements, and orthogonal virtual rotations. In this theory, energy and Newtonian formulations are fully correlated, and every structural term can be interpreted in terms of vectors. Fully nonlinear ordinary differential equations (ODEs) governing the large static deformations of known axisymmetric membranes under known axisymmetric loading (i.e., forward problems) are presented as first-order ODEs, and a method for obtaining numerically exact solutions using the multiple shooting procedure is shown. A method for obtaining the undeformed geometry of any axisymmetric membrane with a known inflated geometry and a known internal pressure (i.e., inverse problems) is also derived. Numerical results from the forward analysis are verified using results in the literature, and results from the inverse analysis are verified using known exact solutions and solutions from the forward analysis. Results show that the membrane theory and the proposed numerical methods for solving nonlinear forward and inverse membrane problems are accurate.
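
    The forward problems are first-order ODE boundary value problems, which the paper solves with multiple shooting. A simple-shooting sketch of the same idea for a generic two-state system (the bracketing interval for the unknown initial value is an assumption):

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      def shoot(rhs, t_span, y0_known, bc_residual, s_lo, s_hi):
          """Find the unknown initial value s so that the far-end boundary
          condition bc_residual(y(t_end)) = 0 holds. rhs(t, y) defines the
          first-order ODE system; s_lo, s_hi must bracket a sign change."""
          def residual(s):
              sol = solve_ivp(rhs, t_span, [y0_known, s], rtol=1e-8)
              return bc_residual(sol.y[:, -1])
          return brentq(residual, s_lo, s_hi)

    Multiple shooting subdivides the interval and enforces continuity between segments, which keeps the procedure well conditioned when single shooting breaks down over long integration spans.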

  15. System and method for high precision isotope ratio destructive analysis

    DOEpatents

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  16. Precision medicine in myasthenia gravis: begin from the data precision

    PubMed Central

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. The data of MG is far from individually precise now, partially due to the rarity and heterogeneity of this disease. In this review, we provide the basic insights of MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment based on references and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and scientific bases of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  17. The emerging potential for network analysis to inform precision cancer medicine.

    PubMed

    Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah

    2018-06-14

    Precision cancer medicine promises to tailor clinical decisions to patients using genomic information. Indeed, successes of drugs targeting genetic alterations in tumors, such as imatinib that targets BCR-ABL in chronic myelogenous leukemia, have demonstrated the power of this approach. However, biological systems are complex, and patients may differ not only by the specific genetic alterations in their tumor, but by more subtle interactions among such alterations. Systems biology, and more specifically network analysis, provides a framework for advancing precision medicine beyond the clinical actionability of individual mutations. Here we discuss applications of network analysis to study tumor biology, early methods for N-of-1 tumor genome analysis, and the path for such tools to the clinic. Copyright © 2018. Published by Elsevier Ltd.

  18. Accuracy assessment of the Precise Point Positioning method applied for surveys and tracking moving objects in GIS environment

    NASA Astrophysics Data System (ADS)

    Ilieva, Tamara; Gekov, Svetoslav

    2017-04-01

    The Precise Point Positioning (PPP) method gives users the opportunity to determine point locations using a single GNSS receiver. The accuracy of point locations determined by PPP is better than that of standard point positioning, due to the precise satellite orbit and clock corrections that are developed and maintained by the International GNSS Service (IGS). The aim of our current research is the accuracy assessment of the PPP method applied for surveys and tracking moving objects in a GIS environment. The PPP data are collected using a software application we developed previously, which allows different sets of attribute data for the measurements and their accuracy to be used. The results from the PPP measurements are directly compared within the geospatial database to other sets of terrestrial data: measurements obtained by total stations, real-time kinematic and static GNSS.

  19. Error analysis of high-rate GNSS precise point positioning for seismic wave measurement

    NASA Astrophysics Data System (ADS)

    Shu, Yuanming; Shi, Yun; Xu, Peiliang; Niu, Xiaoji; Liu, Jingnan

    2017-06-01

    High-rate GNSS precise point positioning (PPP) has been playing a more and more important role in providing precise positioning information in fast time-varying environments. Although kinematic PPP is commonly known to have a precision of a few centimeters, the precision of high-rate PPP within a short period of time has recently been reported, with experiments, to reach a few millimeters in the horizontal components and sub-centimeters in the vertical component when measuring seismic motion, which is several times better than conventional kinematic PPP practice. To fully understand the mechanism behind this seemingly mysterious excellent performance of high-rate PPP within a short period of time, we have carried out a theoretical error analysis of PPP and conducted the corresponding simulations within a short period of time. The theoretical analysis has clearly indicated that the high-rate PPP errors consist of two types: the residual systematic errors at the starting epoch, which affect high-rate PPP through the change of satellite geometry, and the time-varying systematic errors between the starting epoch and the current epoch. Both the theoretical error analysis and simulated results are fully consistent with and thus have unambiguously confirmed the reported high precision of high-rate PPP, which has been further affirmed here by the real data experiments, indicating that high-rate PPP can indeed achieve the millimeter level of precision in the horizontal components and the sub-centimeter level of precision in the vertical component to measure motion within a short period of time. The simulation results have clearly shown that the random noise of carrier phases and higher-order ionospheric errors are two major factors that affect the precision of high-rate PPP within a short period of time. The experiments with real data have also indicated that the precision of PPP solutions can degrade to the cm level in both the horizontal and vertical components, if the geometry of satellites is

  20. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on the measurement precision of conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work using the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms.

    PubMed

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-08-14

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.

  2. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms

    PubMed Central

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-01-01

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is therefore necessary to propose an accelerometer configuration and measurement model that yields such high measurement precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms. PMID:26287203

  3. Location precision analysis of stereo thermal anti-sniper detection system

    NASA Astrophysics Data System (ADS)

    He, Yuqing; Lu, Ya; Zhang, Xiaoyan; Jin, Weiqi

    2012-06-01

    Anti-sniper detection devices are an urgent requirement in modern warfare, and the precision of an anti-sniper detection system is especially important. This paper discusses the location precision analysis of an anti-sniper detection system based on a dual-thermal imaging system. It focuses on the two main error sources: the digital quantization effects of the cameras, and the error in estimating the coordinates of the bullet trajectory from the infrared images during image matching. The error-analysis formula is derived from the stereovision model and the cameras' quantization effects, yielding the relationship between detection accuracy and the system's parameters. The analysis in this paper provides the theoretical basis for error compensation algorithms intended to improve the accuracy of 3D reconstruction of the bullet trajectory in anti-sniper detection devices.
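
The dominant contribution of camera quantization to ranging error follows from first-order propagation through the stereo triangulation formula Z = fB/d. The numbers below are illustrative assumptions, not the system's actual parameters.

```python
# First-order range-error propagation for stereo triangulation Z = f*B/d.
f_px = 2000.0    # focal length in pixels (assumed)
B = 1.0          # stereo baseline in meters (assumed)
Z = 100.0        # target range in meters (assumed)
dd = 0.5         # disparity uncertainty in pixels: quantization + matching error

# dZ/dd = -f*B/d**2 = -Z**2/(f*B), so |dZ| = Z**2/(f*B) * |dd|
dZ = Z**2 / (f_px * B) * dd
print(f"range error at {Z:.0f} m: about +/- {dZ:.1f} m")   # ~2.5 m here
```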

  4. [Refractive precision and objective quality of vision after toric lens implantation in cataract surgery].

    PubMed

    Debois, A; Nochez, Y; Bezo, C; Bellicaud, D; Pisella, P-J

    2012-10-01

    To study efficacy and predictability of toric IOL implantation for correction of preoperative corneal astigmatism by analysing spherocylindrical refractive precision and objective quality of vision. Prospective study of 13 eyes undergoing micro-incisional cataract surgery through a 1.8 mm corneal incision with toric IOL implantation (Lentis L313T(®), Oculentis) to treat more than one diopter of preoperative corneal astigmatism. Preoperative evaluation included keratometry, subjective refraction, and total and corneal aberrometry (KR-1(®), Topcon). Six months postoperatively, measurements included slit-lamp photography documenting IOL rotation, tilt or decentration, uncorrected visual acuity, best-corrected visual acuity, and objective quality of vision measurement (OQAS(®), Visiometrics, Spain). Postoperatively, mean uncorrected distance visual acuity was 8.33/10 ± 1.91 (0.09 ± 0.11 logMAR). Mean postoperative refractive sphere was 0.13 ± 0.73 diopters. Mean refractive astigmatism was -0.66 ± 0.56 diopters with corneal astigmatism of 2.17 ± 0.68 diopters. Mean IOL rotation was 4.4° ± 3.6° (range 0° to 10°). Mean rotation of this IOL at 6 months was less than 5°, demonstrating stability of the optic within the capsular bag. Objective quality of vision measurements were consistent with subjective uncorrected visual acuity. Implantation of the L313T(®) IOL is safe and effective for correction of corneal astigmatism in 1.8 mm micro-incisional cataract surgery. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  5. Selective Attention to Auditory Memory Neurally Enhances Perceptual Precision.

    PubMed

    Lim, Sung-Joo; Wöstmann, Malte; Obleser, Jonas

    2015-12-09

    Selective attention to a task-relevant stimulus facilitates encoding of that stimulus into a working memory representation. It is less clear whether selective attention also improves the precision of a stimulus already represented in memory. Here, we investigate the behavioral and neural dynamics of selective attention to representations in auditory working memory (i.e., auditory objects) using psychophysical modeling and model-based analysis of electroencephalographic signals. Human listeners performed a syllable pitch discrimination task where two syllables served as to-be-encoded auditory objects. Valid (vs neutral) retroactive cues were presented during retention to allow listeners to selectively attend to the to-be-probed auditory object in memory. Behaviorally, listeners represented auditory objects in memory more precisely (expressed by steeper slopes of a psychometric curve) and made faster perceptual decisions when valid compared to neutral retrocues were presented. Neurally, valid compared to neutral retrocues elicited a larger frontocentral sustained negativity in the evoked potential as well as enhanced parietal alpha/low-beta oscillatory power (9-18 Hz) during memory retention. Critically, individual magnitudes of alpha oscillatory power (7-11 Hz) modulation predicted the degree to which valid retrocues benefitted individuals' behavior. Our results indicate that selective attention to a specific object in auditory memory does benefit human performance not by simply reducing memory load, but by actively engaging complementary neural resources to sharpen the precision of the task-relevant object in memory. Can selective attention improve the representational precision with which objects are held in memory? And if so, what are the neural mechanisms that support such improvement? These issues have been rarely examined within the auditory modality, in which acoustic signals change and vanish on a milliseconds time scale. Introducing a new auditory memory

  6. Bridging the gap between genome analysis and precision breeding in potato.

    PubMed

    Gebhardt, Christiane

    2013-04-01

    Efficiency and precision in plant breeding can be enhanced by using diagnostic DNA-based markers for the selection of superior cultivars. This technique has been applied to many crops, including potatoes. The first generation of diagnostic DNA-based markers useful in potato breeding was enabled by several developments: genetic linkage maps based on DNA polymorphisms, linkage mapping of qualitative and quantitative agronomic traits, cloning and functional analysis of genes for pathogen resistance and genes controlling plant metabolism, and association genetics in collections of tetraploid varieties and advanced breeding clones. Although these have led to significant improvements in potato genetics, the prediction of most, if not all, natural variation in agronomic traits by diagnostic markers ultimately requires the identification of the causal genes and their allelic variants. This objective will be facilitated by new genomic tools, such as genomic resequencing and comparative profiling of the proteome, transcriptome, and metabolome in combination with phenotyping genetic materials relevant for variety development. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Precision Attitude Determination System (PADS) design and analysis. Two-axis gimbal star tracker

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the Precision Attitude Determination System (PADS) focused chiefly on the two-axis gimballed star tracker and electronics design, improved from that of the Precision Pointing Control System (PPCS), and on application of the improved tracker for PADS at geosynchronous altitude. System design, system analysis, software design, and hardware design activities are reported. The system design encompasses the PADS configuration, system performance characteristics, component design summaries, and interface considerations. The PADS design and performance analysis includes error analysis, performance analysis via attitude determination simulation, and star tracker servo design analysis. The design of the star tracker and electronics is discussed. Sensor electronics schematics are included. A detailed characterization of the application software algorithms and computer requirements is provided.

  8. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  9. Objective monitoring of mTOR inhibitor therapy by three-dimensional facial analysis.

    PubMed

    Baynam, Gareth S; Walters, Mark; Dawkins, Hugh; Bellgard, Matthew; Halbert, Anne R; Claes, Peter

    2013-08-01

    With advances in therapeutics for rare, genetic and syndromic diseases, there is an increasing need for objective assessments of phenotypic endpoints. These assessments will preferably be high-precision, non-invasive, non-irradiating, and relatively inexpensive and portable. We report a case of a child with an extensive lymphatic vascular malformation of the head and neck, treated with a mammalian target of rapamycin (mTOR) inhibitor, in which treatment was assessed using 3D facial analysis. This case illustrates that the technology is prospectively a cost-effective modality for treatment monitoring, and it suggests that it may also be used for novel explorations of disease biology in conditions associated with disturbances in the mTOR and interrelated pathways.

  10. An Integrative Object-Based Image Analysis Workflow for Uav Images

    NASA Astrophysics Data System (ADS)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, after geometric calibration and radiometric correction we first introduce an efficient image stitching algorithm, which employs fast feature extraction and matching by combining the local difference binary descriptor with locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships, and an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.
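
The over-segmentation stage can be sketched with the SLIC implementation in scikit-image (this is not the authors' code, and the BPT merging stage is only hinted at by the per-superpixel features):

```python
import numpy as np
from skimage import data, segmentation

img = data.astronaut()                      # stand-in for a mosaicked UAV image
labels = segmentation.slic(img, n_segments=400, compactness=10, start_label=0)

# Mean color of each superpixel: such features (plus spatial and topological
# information) would feed the hierarchical region-merging stage.
mean_colors = np.array([img[labels == k].mean(axis=0)
                        for k in range(labels.max() + 1)])
print(f"{labels.max() + 1} superpixels, feature matrix {mean_colors.shape}")
```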

  11. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  12. Precise, High-throughput Analysis of Bacterial Growth.

    PubMed

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
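
The manual calculation of the maximal growth rate can be sketched as a sliding log-linear fit over the OD time series; the helper below, including its name and the synthetic logistic curve, is an assumed illustration rather than the protocol's published script.

```python
import numpy as np

def max_growth_rate(time_h, od, window=5):
    """Largest slope of ln(OD) over any contiguous window of readings (per hour)."""
    ln_od = np.log(od)
    best = 0.0
    for i in range(len(od) - window + 1):
        slope = np.polyfit(time_h[i:i + window], ln_od[i:i + window], 1)[0]
        best = max(best, slope)
    return best

t = np.arange(0, 12, 0.5)    # 30-minute plate-reader interval over 12 h
# Logistic growth: N0 = 0.02, r = 0.6 / h, carrying capacity K = 1.2
od = 0.02 * np.exp(0.6 * t) / (1 + 0.02 * (np.exp(0.6 * t) - 1) / 1.2)
print(f"mu_max ~ {max_growth_rate(t, od):.2f} per hour")   # ~0.6 for this curve
```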

  13. Breath alcohol analysis incorporating standardization to water vapour is as precise as blood alcohol analysis.

    PubMed

    Grubb, D; Rasmussen, B; Linnet, K; Olsson, S G; Lindberg, L

    2012-03-10

    A novel breath-alcohol analyzer based on the standardization of the breath alcohol concentration (BrAC) to the alveolar-air water vapour concentration has been developed and evaluated. The present study compares results from this breath analyzer with arterial blood alcohol concentrations (ABAC), the most relevant quantitative measure of brain alcohol exposure. The precision of analysis of alcohol in arterial blood and breath was determined, as well as the agreement between ABAC and BrAC over time post-dosing. Twelve healthy volunteers were administered 0.6 g alcohol/kg bodyweight via an orogastric tube. Duplicate breath and arterial blood samples were obtained simultaneously during the absorption, distribution and elimination phases of alcohol metabolism, with particular emphasis on the absorption phase. The precision of the breath analyzer was similar to that of blood alcohol determination by headspace gas chromatography (CV 2.40 vs. 2.38%, p=0.43). The ABAC/BrAC ratio stabilized 30 min post-dosing (2089±99; mean±SD); before this, the BrAC tended to underestimate the coexisting ABAC. In conclusion, breath alcohol analysis utilizing standardization of alcohol to water vapour was as precise as blood alcohol analysis, the present "gold standard" method. The BrAC reliably predicted the coexisting ABAC from 30 min onwards after the intake of alcohol. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  14. Attention Modulates Spatial Precision in Multiple-Object Tracking.

    PubMed

    Srivastava, Nisheeth; Vul, Ed

    2016-01-01

    We present a computational model of multiple-object tracking that makes trial-level predictions about the allocation of visual attention and the effect of this allocation on observers' ability to track multiple objects simultaneously. This model follows the intuition that increased attention to a location increases the spatial resolution of its internal representation. Using a combination of empirical and computational experiments, we demonstrate the existence of a tight coupling between cognitive and perceptual resources in this task: low-level tracking of objects generates bottom-up predictions of error likelihood, and high-level attention allocation selectively reduces error probabilities at attended locations while increasing them at non-attended locations. Whereas earlier models of multiple-object tracking have predicted the big-picture relationship between stimulus complexity and response accuracy, our approach makes accurate predictions of both the macro-scale effect of target number and velocity on tracking difficulty and micro-scale variations in difficulty across individual trials and targets arising from the idiosyncratic within-trial interactions of targets and distractors. Copyright © 2016 Cognitive Science Society, Inc.

  15. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  17. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    NASA Technical Reports Server (NTRS)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, is described. Preliminary 2D experiments, motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive stock hardware and low-precision welding, are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place while an external low-precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low-precision tools) to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work to extend the IPJR paradigm to building 3D structures at micron precision are also summarized.

  18. CognitionMaster: an object-based image analysis framework

    PubMed Central

    2013-01-01

    Background Automated image analysis methods are becoming more and more important for extracting and quantifying image features in microscopy-based biomedical studies, and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and when user-interactivity on the object level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as the basic processing unit instead of individual pixels. Our approach enables even users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for immunohistochemistry-based cell proliferation quantification in breast cancer and for two-photon fluorescence microscopy data on bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open-source software system that offers object-based image analysis. The object-based concept allows for the straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542

  19. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    NASA Astrophysics Data System (ADS)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory-built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, such as fracturing and layer separation, of painted works of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct numerical fitting or the Hilbert transform for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of the physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio, where the vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has considerable potential to facilitate condition surveys of works of art and to render them more precise.

  20. Using Public Data for Comparative Proteome Analysis in Precision Medicine Programs.

    PubMed

    Hughes, Christopher S; Morin, Gregg B

    2018-03-01

    Maximizing the clinical utility of information obtained in longitudinal precision medicine programs would benefit from robust comparative analyses against known information, to assess biological features of patient material and identify the underlying features driving the disease phenotype. Herein, the potential for utilizing publicly deposited mass-spectrometry-based proteomics data to perform inter-study comparisons of cell-line or tumor-tissue materials is investigated. To investigate the robustness of comparisons between MS-based proteomics studies carried out with different methodologies, deposited data representative of label-free (MS1) and isobaric-tagging (MS2 and MS3) quantification are utilized. In-depth quantitative proteomics data acquired from analysis of ovarian cancer cell lines revealed the robust recapitulation of observable gene expression dynamics between individual studies carried out using significantly different methodologies. The observed signatures enable robust inter-study clustering of cell line samples. In addition, the ability to classify and cluster tumor samples based on observed gene expression trends when using a single patient sample is established. With this analysis, relevant gene expression dynamics are obtained from a single patient tumor, in the context of a precision medicine analysis, by leveraging a large cohort of repository data as a comparator. Together, these data establish the potential for state-of-the-art MS-based proteomics data to serve as resources for robust comparative analyses in precision medicine applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Object locating system

    DOEpatents

    Novak, J.L.; Petterson, B.

    1998-06-09

    A sensing system locates an object by sensing the object's effect on electric fields. The object's effect on the mutual capacitance of electrode pairs varies according to the distance between the object and the electrodes. A single electrode pair can sense the distance from the object to the electrodes. Multiple electrode pairs can more precisely locate the object in one or more dimensions. 12 figs.

  2. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
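
The study's qualitative conclusions are easy to reproduce in miniature. The following simplified re-creation (not the authors' simulation code; a single scenario only) shows the characteristic pattern: ANOVA biased by roughly rho times the imbalance, CSA biased the opposite way by (1 - rho) times the imbalance, and ANCOVA unbiased and most precise.

```python
import numpy as np

rng = np.random.default_rng(42)
n, effect, rho, imbalance = 100, 0.5, 0.6, 0.3   # per-arm size, true effect,
                                                 # pre-post correlation, baseline shift
est = {"ANOVA": [], "CSA": [], "ANCOVA": []}
for _ in range(2000):
    pre_c = rng.standard_normal(n)
    pre_t = rng.standard_normal(n) + imbalance           # imbalanced baseline
    post_c = rho * pre_c + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    post_t = effect + rho * pre_t + np.sqrt(1 - rho**2) * rng.standard_normal(n)

    est["ANOVA"].append(post_t.mean() - post_c.mean())
    est["CSA"].append((post_t - pre_t).mean() - (post_c - pre_c).mean())

    # ANCOVA: post ~ intercept + group + baseline, via ordinary least squares
    g = np.r_[np.zeros(n), np.ones(n)]
    X = np.column_stack([np.ones(2 * n), g, np.r_[pre_c, pre_t]])
    y = np.r_[post_c, post_t]
    est["ANCOVA"].append(np.linalg.lstsq(X, y, rcond=None)[0][1])

for name, v in est.items():
    v = np.asarray(v)
    print(f"{name:7s} bias={v.mean() - effect:+.3f}  SD={v.std():.3f}")
```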

  3. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  4. Object locating system

    DOEpatents

    Novak, James L.; Petterson, Ben

    1998-06-09

    A sensing system locates an object by sensing the object's effect on electric fields. The object's effect on the mutual capacitance of electrode pairs varies according to the distance between the object and the electrodes. A single electrode pair can sense the distance from the object to the electrodes. Multiple electrode pairs can more precisely locate the object in one or more dimensions.

  5. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
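
NAO itself is a C++ class library; the Python sketch below, with all names invented, only illustrates the central design idea: tools written against shared abstract interfaces for functions and operators compose without knowing each other's internals.

```python
from abc import ABC, abstractmethod
from math import sin

class Function(ABC):
    """A function u: R -> R usable by any tool written against this interface."""
    @abstractmethod
    def __call__(self, x: float) -> float: ...

class Operator(ABC):
    """Maps one Function to another, e.g. a discretized differential operator."""
    @abstractmethod
    def apply(self, u: Function) -> Function: ...

class Sine(Function):
    def __call__(self, x: float) -> float:
        return sin(x)

class CentralDifference(Operator):
    """Approximates d/dx by central differences; works on any Function."""
    def __init__(self, h: float = 1e-5):
        self.h = h
    def apply(self, u: Function) -> Function:
        h = self.h
        class Derivative(Function):
            def __call__(self, x: float) -> float:
                return (u(x + h) - u(x - h)) / (2 * h)
        return Derivative()

d = CentralDifference().apply(Sine())
print(d(0.0))   # ~1.0 == cos(0); the operator never inspects Sine's internals
```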

  6. [Research of Identify Spatial Object Using Spectrum Analysis Technique].

    PubMed

    Song, Wei; Feng, Shi-qi; Shi, Jing; Xu, Rong; Wang, Gong-chang; Li, Bin-yu; Liu, Yu; Li, Shuang; Cao Rui; Cai, Hong-xing; Zhang, Xi-he; Tan, Yong

    2015-06-01

    The high-precision scattering spectrum of space objects with a minimum brightness of 4.2 and a resolution of 0.5 nm has been observed using ground-based spectrum detection technology. Clear differences between different types of objects are obtained by normalization and discrete-rate analysis of the spectral data. The normalized multi-frame scattering spectral line shapes for rocket debris are identical to one another, whereas those for defunct satellites differ. The discrete rate of the normalized single-frame spectrum ranges from 0.978% to 3.067% for rocket debris, with small oscillation about the average value, and from 3.1184% to 19.4727% for defunct satellites, with relatively large oscillation about the average value. The reason is that the composition of rocket debris is uniform, while that of defunct satellites is complex. Therefore, ground-based spectrum detection can be used for the classification of space objects.

  7. The accuracy and precision of radiostereometric analysis in upper limb arthroplasty.

    PubMed

    Ten Brinke, Bart; Beumer, Annechien; Koenraadt, Koen L M; Eygendaal, Denise; Kraan, Gerald A; Mathijssen, Nina M C

    2017-06-01

    Background and purpose - Radiostereometric analysis (RSA) is an accurate method for measurement of early migration of implants. Since a relation has been shown between early migration and future loosening of total knee and hip prostheses, RSA plays an important role in the development and evaluation of prostheses. However, there have been few RSA studies of the upper limb, and the value of RSA of the upper limb is not yet clear. We therefore performed a systematic review to investigate the accuracy and precision of RSA of the upper limb. Patients and methods - PRISMA guidelines were followed and the protocol for this review was published online at PROSPERO under registration number CRD42016042014. A systematic search of the literature was performed in the databases Embase, Medline, Cochrane, Web of Science, Scopus, Cinahl, and Google Scholar on April 25, 2015, based on the keywords radiostereometric analysis, shoulder prosthesis, elbow prosthesis, wrist prosthesis, trapeziometacarpal joint prosthesis, humerus, ulna, radius, carpus. Articles concerning RSA for the analysis of early migration of prostheses of the upper limb were included. Quality assessment was performed using the MINORS score, the Downs and Black checklist, and the ISO RSA standard. Results - 23 studies were included. Precision values were in the 0.06-0.88 mm and 0.05-10.7° range for the shoulder, the 0.05-0.34 mm and 0.16-0.76° range for the elbow, and the 0.16-1.83 mm and 11-124° range for the TMC joint. Accuracy data from marker- and model-based RSA were not reported in the included studies. Interpretation - RSA is a highly precise method for measurement of early migration of orthopedic implants in the upper limb. However, the precision of rotation measurement is poor for some components. Challenges with RSA in the upper limb include the symmetrical shape of prostheses and the limited size of surrounding bone, leading to over-projection of the markers by the prosthesis. We recommend higher adherence to

  8. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.

  9. Precision Medicine: Functional Advancements.

    PubMed

    Caskey, Thomas

    2018-01-29

    Precision medicine was conceptualized on the strength of genomic sequence analysis. High-throughput functional metrics have enhanced sequence interpretation and clinical precision. These technologies include metabolomics, magnetic resonance imaging, and I rhythm (cardiac monitoring), among others. These technologies are discussed and placed in clinical context for the medical specialties of internal medicine, pediatrics, obstetrics, and gynecology. Publications in these fields support the concept of a higher level of precision in identifying disease risk. Precise disease risk identification has the potential to enable intervention with greater specificity, resulting in disease prevention, an important goal of precision medicine.

  10. Object detection in cinematographic video sequences for automatic indexing

    NASA Astrophysics Data System (ADS)

    Stauder, Jurgen; Chupeau, Bertrand; Oisel, Lionel

    2003-06-01

    This paper presents an object detection framework applied to cinematographic post-processing of video sequences. Post-processing is done after production and before editing. At the beginning of each shot of a video, a slate (also called a clapperboard) is shown. The slate notably contains an electronic audio timecode that is necessary for audio-visual synchronization. This paper presents an object detection framework to detect slates in video sequences for automatic indexing and post-processing. It is based on five steps. The first two steps aim to drastically reduce the video data to be analyzed; they ensure a high recall rate but have low precision. The first step detects images at the beginning of a shot that may show a slate, while the second step searches these images for candidate regions with a color distribution similar to slates. The objective is to miss no slate while eliminating long parts of video without slate appearance. The third and fourth steps use statistical classification and pattern matching to detect and precisely locate slates in candidate regions. These steps ensure a high recall rate and high precision; the objective is to detect slates with very few false alarms, to minimize interactive corrections. In a last step, electronic timecodes are read from the slates to automate audio-visual synchronization. The presented slate detector has a recall rate of 89% and a precision of 97.5%. By temporal integration, much more than 89% of shots in dailies are detected, and by timecode coherence analysis the precision can be raised further. Issues for future work are to accelerate the system to be faster than real time and to extend the framework to several slate types.

  11. Behavior analysis of video object in complicated background

    NASA Astrophysics Data System (ADS)

    Zhao, Wenting; Wang, Shigang; Liang, Chao; Wu, Wei; Lu, Yang

    2016-10-01

    This paper aims to achieve robust behavior recognition of video objects against complicated backgrounds. Features of the video object are described and modeled according to the depth information of three-dimensional video. Multi-dimensional eigenvectors are constructed and used to process the high-dimensional data. Stable object tracking in complex scenes can be achieved with multi-feature-based behavior analysis, so as to obtain the motion trail. Subsequently, effective behavior recognition of the video object is obtained according to the decision criteria. Moreover, both the real-time performance of the algorithms and the accuracy of the analysis are greatly improved. The theory and methods for behavior analysis of video objects in real scenes put forward by this project have broad application prospects and important practical significance in security, counter-terrorism, the military, and many other fields.

  12. Analysis and the Derivation of Valid Objectives

    ERIC Educational Resources Information Center

    Tiemann, Philip W.

    1973-01-01

    Author states that "to the extent that behavioral objectives are derived from an analysis of relatively broad objectives, they can serve as valid criteria which enable our students to avoid trivia." (Author)

  13. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    According to the working principle of the binocular photoelectric instrument optical-axis-parallelism digital calibration instrument, the various factors affecting system precision are analyzed across all components of the instrument, and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide error allocation, optimize control of the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical-axis-parallelism digital calibration instrument.
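
The Monte Carlo scheme is straightforward to sketch: draw each component error from its distribution, propagate it to the target-center coordinate, and attribute variance shares to guide error allocation. All distributions and magnitudes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Assumed elementary error sources, expressed in image pixels after projection
detector_noise = rng.normal(0.0, 0.15, N)      # centroiding noise
mount_misalign = rng.normal(0.0, 0.30, N)      # mechanical alignment error
optics_distort = rng.uniform(-0.2, 0.2, N)     # residual optical distortion

center_shift = detector_noise + mount_misalign + optics_distort
print(f"comprehensive error: sigma = {center_shift.std():.3f} px")

# Variance share of each source, to guide the error-budget allocation
for name, e in [("detector", detector_noise), ("mount", mount_misalign),
                ("optics", optics_distort)]:
    print(f"{name:9s} contributes {e.var() / center_shift.var():5.1%} of variance")
```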

  14. Performance Analysis of Beidou-2/Beidou-3e Combined Solution with Emphasis on Precise Orbit Determination and Precise Point Positioning

    PubMed Central

    Xu, Xiaolong; Li, Min; Li, Wenwen; Liu, Jingnan

    2018-01-01

    In 2015, the plan for global coverage by the Chinese BeiDou Navigation Satellite System was launched. Five global BeiDou experimental satellites (BeiDou-3e) are in orbit for testing. To analyze the performances of precise orbit determination (POD) and precise point positioning (PPP) of onboard BeiDou satellites, about two months of data from 24 tracking stations were used. According to quality analysis of BeiDou-2/BeiDou-3e data, there is no satellite-induced code bias in BeiDou-3e satellites, which has been found in BeiDou-2 satellites. This phenomenon indicates that the quality issues of pseudorange data in BeiDou satellites have been solved well. POD results indicate that the BeiDou-3e orbit precision is comparable to that of BeiDou-2 satellites. The ambiguity fixed solution improved the orbit consistency of inclined geosynchronous orbit satellites in along-track and cross-track directions, but had little effect in the radial direction. Satellite laser ranging of BeiDou-3e medium Earth orbit satellites (MEOs) achieved a standard deviation of about 4 cm. Differences in clock offset series after the removal of reference clock in overlapping arcs were used to assess clock quality, and standard deviation of clock offset could reach 0.18 ns on average, which was in agreement with the orbit precision. For static PPP, when BeiDou-3e satellites were included, the positioning performance for horizontal components was improved slightly. For kinematic PPP, when global positioning satellites (GPS) were combined with BeiDou-2 and BeiDou-3e satellites, the convergence time was 13.5 min with a precision of 2–3 cm for horizontal components, and 3–4 cm for the vertical component. PMID:29304000

  15. Precise orbit computation and sea surface modeling

    NASA Technical Reports Server (NTRS)

    Wakker, Karel F.; Ambrosius, B. A. C.; Rummel, R.; Vermaat, E.; Deruijter, W. P. M.; Vandermade, J. W.; Zimmerman, J. T. F.

    1991-01-01

    The research project described below is part of a long-term program at Delft University of Technology aiming at the application of European Remote Sensing satellite (ERS-1) and TOPEX/POSEIDON altimeter measurements for geophysical purposes. This program started in 1980 with the processing of Seasat laser range and altimeter height measurements and concentrates today on the analysis of Geosat altimeter data. The objectives of the TOPEX/POSEIDON research project are the tracking of the satellite by the Dutch mobile laser tracking system MTLRS-2, the computation of precise TOPEX/POSEIDON orbits, the analysis of the spatial and temporal distribution of the orbit errors, the improvement of ERS-1 orbits through the information obtained from the altimeter crossover difference residuals for crossing ERS-1 and TOPEX/POSEIDON tracks, the combination of ERS-1 and TOPEX/POSEIDON altimeter data into a single high-precision data set, and the application of this data set to model the sea surface. The latter application will focus on the determination of detailed regional mean sea surfaces, sea surface variability, ocean topography, and ocean currents in the North Atlantic, the North Sea, the seas around Indonesia, the West Pacific, and the oceans around South Africa.

  16. Grid-based precision aim system and method for disrupting suspect objects

    DOEpatents

    Gladwell, Thomas Scott; Garretson, Justin; Hobart, Clinton G.; Monda, Mark J.

    2014-06-10

    A system and method for disrupting at least one component of a suspect object is provided. The system has a source for passing radiation through the suspect object, a grid board positionable adjacent the suspect object (the grid board having a plurality of grid areas, the radiation from the source passing through the grid board), a screen for receiving the radiation passing through the suspect object and generating at least one image, a weapon for deploying a discharge, and a targeting unit for displaying the image of the suspect object and aiming the weapon according to a disruption point on the displayed image and deploying the discharge into the suspect object to disable the suspect object.

  17. Measurement precision and noise analysis of CCD cameras

    NASA Astrophysics Data System (ADS)

    Wu, ZhenSen; Li, Zhiyang; Zhang, Ping

    1993-09-01

    The limit precision of a CCD camera with 10-bit analogue-to-digital conversion is estimated in this paper. The effect of noise on measurement precision and the noise characteristics are analyzed in detail. Noise processing methods are also discussed, and a diagram of the noise properties is given.
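
The kind of limit-precision estimate the paper describes can be sketched by combining the quantization noise of a 10-bit ADC (one LSB over the square root of 12) with shot and read noise; every number below is an assumption, not a value from the paper.

```python
import numpy as np

full_well = 20_000.0    # electrons at ADC full scale (assumed)
adc_bits = 10           # as in the paper's 10-bit conversion
read_noise = 15.0       # electrons rms (assumed)
signal = 10_000.0       # detected electrons in the pixel (assumed)

lsb = full_well / 2**adc_bits            # electrons per ADC step
quant = lsb / np.sqrt(12)                # quantization noise, e- rms
shot = np.sqrt(signal)                   # Poisson shot noise, e- rms
total = np.sqrt(quant**2 + shot**2 + read_noise**2)

print(f"quantization {quant:.1f}, shot {shot:.1f}, read {read_noise:.1f} (e- rms)")
print(f"total {total:.1f} e-  ->  SNR ~ {signal / total:.0f}, "
      f"relative precision ~ {total / signal:.2%}")
```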

  18. Closed tubes preparation of graphite for high-precision AMS radiocarbon analysis

    NASA Astrophysics Data System (ADS)

    Hajdas, I.; Michczynska, D.; Bonani, G.; Maurer, M.; Wacker, L.

    2009-04-01

    Radiocarbon dating is an established tool in geochronology. Technical developments in accelerator mass spectrometry (AMS), which allow measurements on samples containing less than 1 mg of carbon, have opened opportunities for new applications. Moreover, high-resolution records of past changes require high-resolution chronologies, i.e., dense sampling for 14C dating. As a result, the field of applications is expanding and the number of radiocarbon analyses is growing rapidly. Dedicated 14C AMS machines now have great capacity for analysis, but to keep up with the demand and provide results as fast as possible, a very efficient way of sample preparation is required. Sample preparation for 14C AMS analysis consists of two steps: separation of the relevant carbon from the sample material (removing contamination) and preparation of graphite for AMS analysis. The last step usually involves reaction of CO2 with H2 in the presence of a metal catalyst (Fe or Co) of specific mesh size heated to 550-625°C, as originally suggested by Vogel et al. (1984). Various graphitization systems have been built to fulfil the sample-quality requirements of high-precision radiocarbon data. In the early 1990s another method was proposed (Vogel 1992) and applied by a few laboratories, mainly for environmental or biomedical samples. This method uses TiH2 as a source of H2 and can be applied easily and flexibly to produce graphite. A sample of CO2 is frozen into a tube containing pre-conditioned Zn/TiH2 and Fe catalyst. The torch-sealed tubes are then placed in a stepwise-heated oven at 500/550°C and left to react for several hours. The greatest problem is the lack of control over reaction completeness and the considerable fractionation. However, recently reported results (Xu et al. 2007) suggest that high-precision dating using graphite produced in closed tubes might be possible. We will present results of radiocarbon dating of the set of standards

  19. Precise and Efficient Static Array Bound Checking for Large Embedded C Programs

    NASA Technical Reports Server (NTRS)

    Venet, Arnaud

    2004-01-01

    In this paper we describe the design and implementation of a static array-bound checker for a family of embedded programs: the flight control software of recent Mars missions. These codes are large (up to 250 KLOC), pointer intensive, heavily multithreaded and written in an object-oriented style, which makes their analysis very challenging. We designed a tool called C Global Surveyor (CGS) that can analyze the largest code in a couple of hours with a precision of 80%. The scalability and precision of the analyzer are achieved by using an incremental framework in which a pointer analysis and a numerical analysis of array indices mutually refine each other. CGS has been designed so that it can distribute the analysis over several processors in a cluster of machines. To the best of our knowledge this is the first distributed implementation of static analysis algorithms. Throughout the paper we will discuss the scalability setbacks that we encountered during the construction of the tool and their impact on the initial design decisions.
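
The core idea behind such a checker can be sketched as interval abstract interpretation of index expressions; the toy below is illustrative only and reproduces none of CGS's pointer analysis, widening, or distributed machinery.

```python
class Interval:
    """Abstract value: the set of integers a variable may take."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def check_index(idx: Interval, length: int) -> str:
    """Classify an array access buf[idx] for an array of the given length."""
    if 0 <= idx.lo and idx.hi < length:
        return "safe"
    if idx.hi < 0 or idx.lo >= length:
        return "definite error"
    return "possible error"          # warnings a human (or refinement) must review

# for (i = 0; i < 10; i++) buf[i + offset] ...   with offset known in [0, 5]
i = Interval(0, 9)
offset = Interval(0, 5)
print(check_index(i + offset, 16))   # safe: [0, 14] fits within [0, 15]
print(check_index(i + offset, 12))   # possible error: [0, 14] may exceed 11
```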

  20. F-16 Task Analysis Criterion-Referenced Objective and Objectives Hierarchy Report. Volume 4

    DTIC Science & Technology

    1981-03-01

    Excerpt of task-analysis entries (fragmentary in the source):
    Initiation cues: Engine flameout. Systems presenting cues: Aircraft fuel, engine. STANDARD: Authority: TACR 60-2; Performance precision: TD in first 1/3 of...
    Initiation cues: On short final. Systems presenting cues: N/A. STANDARD: Authority: 60-2; Performance precision: +/- .5 AOA; TD zone 150-1000...
    Performance precision: +/- .05 AOA; TD zone 150-1000; Computational accuracy: N/A.
    TASK NO.: 1.9.4. BEHAVIOR: Perform short field landing.

  1. Precision analysis for standard deviation measurements of immobile single fluorescent molecule images.

    PubMed

    DeSantis, Michael C; DeCenzo, Shawn H; Li, Je-Luen; Wang, Y M

    2010-03-29

    Standard deviation measurements of intensity profiles of stationary single fluorescent molecules are useful for studying axial localization, molecular orientation, and a fluorescence imaging system's spatial resolution. Here we report on the analysis of the precision of standard deviation measurements of intensity profiles of single fluorescent molecules imaged using an EMCCD camera. We have developed an analytical expression for the standard deviation measurement error of a single image as a function of the total number of detected photons, the background photon noise, and the camera pixel size. The theoretical results agree well with the experimental, simulation, and numerical integration results. Using this expression, we show that single-molecule standard deviation measurements offer nanometer precision for a large range of experimental parameters.
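
The paper's analytical expression is not reproduced here, but the scaling it captures is easy to see by simulation: the precision of the standard-deviation estimate improves roughly as the inverse square root of the photon count. The sketch below pixelates photon positions but, for brevity, omits background noise.

```python
import numpy as np

rng = np.random.default_rng(3)
psf_sigma, pixel = 1.3, 1.0      # PSF width and pixel size in pixels (assumed)

for n_photons in (100, 500, 2000):
    est = []
    for _ in range(2000):
        x = rng.normal(0.0, psf_sigma, n_photons)   # photon arrival positions
        x = np.round(x / pixel) * pixel             # pixelation by the camera
        est.append(np.std(x, ddof=1))               # one std measurement per image
    est = np.asarray(est)
    gauss_limit = psf_sigma / np.sqrt(2 * (n_photons - 1))  # no-pixelation case
    print(f"N={n_photons:5d}: precision {est.std():.4f} px "
          f"(Gaussian-sample limit {gauss_limit:.4f} px)")
```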

  2. Elevation data fitting and precision analysis of Google Earth in road survey

    NASA Astrophysics Data System (ADS)

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

    Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility-study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on finding several different fitting or interpolation methods to improve the data precision, so as to meet, as far as possible, the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of public control points, the elevation difference at any other point can be fitted or interpolated; the precise elevation can then be obtained by subtracting the elevation difference from the Google Earth data. Quadratic polynomial surface fitting, cubic polynomial surface fitting, MATLAB's V4 interpolation, and a neural network method are used in this paper to process the Google Earth elevation data, with internal conformity, external conformity, and the cross-correlation coefficient used as evaluation indexes of the data processing effect. Results: The V4 interpolation method has no fitting difference at the fitting points; its external conformity is the largest and its accuracy improvement is the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to the cubic polynomial surface fitting method, but fits better in the case of larger elevation differences. Because the neural network method is a less manageable fitting model, the cubic polynomial surface fitting method should be used as the main method, with the neural network method as an auxiliary in cases of larger elevation differences. Conclusions: The cubic polynomial surface fitting method can obviously
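
The fitting-and-correction workflow can be sketched as a least-squares polynomial surface over the elevation differences at control points, with the RMS residual playing the role of the internal conformity index. The data and the design() helper below are invented for illustration.

```python
import numpy as np

def design(x, y, degree=3):
    """Design matrix with columns x**i * y**j for all i + j <= degree."""
    return np.column_stack([x**i * y**j
                            for i in range(degree + 1)
                            for j in range(degree + 1 - i)])

rng = np.random.default_rng(5)
x, y = rng.uniform(0, 1, 60), rng.uniform(0, 1, 60)      # control points
true_diff = 2.0 + 1.5 * x - 0.8 * y**2 + 0.6 * x * y     # smooth elevation bias (m)
diff = true_diff + 0.1 * rng.standard_normal(60)         # observed differences

coef, *_ = np.linalg.lstsq(design(x, y), diff, rcond=None)

# Internal conformity: residuals at the control points themselves
res = design(x, y) @ coef - diff
print(f"internal conformity (RMS residual): {np.sqrt(np.mean(res**2)):.3f} m")

# Correcting a query point: precise = Google Earth elevation - fitted difference
xq, yq = np.array([0.5]), np.array([0.5])
print(f"fitted difference at (0.5, 0.5): {(design(xq, yq) @ coef).item():.3f} m")
```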

  3. Super-resolution imaging applied to moving object tracking

    NASA Astrophysics Data System (ADS)

    Swalaganata, Galandaru; Ratna Sulistyaningrum, Dwi; Setiyono, Budi

    2017-10-01

    Moving object tracking in a video detects and analyzes changes in an object under observation. Both visual quality and the precision of the tracked target are highly desired in modern tracking systems, but the tracked object does not always appear clearly, which makes the tracking result less precise; causes include low-quality video, system noise, small object size, and other factors. To improve the precision of the tracked object, especially for small objects, we propose a two-step solution that integrates a super-resolution technique into the tracking approach. The first step applies super-resolution imaging to the frame sequence, cropping several frames or all of the frames. The second step tracks the resulting super-resolution images. Super-resolution is a technique for obtaining high-resolution images from low-resolution images; in this research, a single-frame super-resolution technique, which has the advantage of fast computation, is proposed for the tracking approach. The tracking method used is Camshift, whose advantage is a simple calculation based on an HSV color histogram that copes with varying object color. The computational complexity and large memory requirements of super-resolution and tracking were reduced, and the precision of the tracked target was good. Experiments showed that integrating super-resolution imaging into the tracking technique can track objects precisely across various backgrounds, shape changes of the object, and good lighting conditions.
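
    A rough sketch of such a pipeline in Python with OpenCV, assuming a hypothetical input file and initial window, and with a bicubic resize standing in for the single-frame super-resolution step:

      import cv2

      cap = cv2.VideoCapture("video.mp4")              # hypothetical input file
      ok, frame = cap.read()
      x, y, w, h = 200, 150, 40, 40                    # hypothetical initial window

      # Hue histogram of the target region, as Camshift expects.
      hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
      roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
      cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
      criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

      track_window = (2 * x, 2 * y, 2 * w, 2 * h)      # window in upscaled coordinates
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          # The super-resolution step slots in here; bicubic upscaling is a stand-in.
          frame = cv2.resize(frame, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
          rot_box, track_window = cv2.CamShift(back_proj, track_window, criteria)
          print("tracked window:", track_window)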

  4. Direction information in multiple object tracking is limited by a graded resource.

    PubMed

    Horowitz, Todd S; Cohen, Michael A

    2010-10-01

    Is multiple object tracking (MOT) limited by a fixed set of structures (slots), a limited but divisible resource, or both? Here, we answer this question by measuring the precision of the direction representation for tracked targets. The signature of a limited resource is a decrease in precision as the square root of the tracking load. The signature of fixed slots is a fixed precision. Hybrid models predict a rapid decrease to asymptotic precision. In two experiments, observers tracked moving disks and reported target motion direction by adjusting a probe arrow. We derived the precision of representation of correctly tracked targets using a mixture distribution analysis. Precision declined with target load according to the square-root law up to six targets. This finding is inconsistent with both pure and hybrid slot models. Instead, directional information in MOT appears to be limited by a continuously divisible resource.
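
    In LaTeX form, the competing predictions for the standard deviation \sigma of the direction estimate under tracking load n can be summarized as follows (\sigma_1 is the single-target value; a schematic summary, not the authors' notation):

      \text{resource: } \sigma(n) = \sigma_1 \sqrt{n}, \qquad
      \text{slots: } \sigma(n) = \sigma_1, \qquad
      \text{hybrid: } \sigma(n) \approx \min\bigl(\sigma_1 \sqrt{n},\, \sigma_{\max}\bigr)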

  5. A novel algorithm for a precise analysis of subchondral bone alterations

    PubMed Central

    Gao, Liang; Orth, Patrick; Goebel, Lars K. H.; Cucchiarini, Magali; Madry, Henning

    2016-01-01

    Subchondral bone alterations are emerging as considerable clinical problems associated with articular cartilage repair. Their analysis exposes a pattern of variable changes, including intra-lesional osteophytes, residual microfracture holes, peri-hole bone resorption, and subchondral bone cysts. A precise distinction between them is becoming increasingly important. Here, we present a tailored algorithm based on continuous data to analyse subchondral bone changes using micro-CT images, allowing for a clear definition of each entity. We evaluated this algorithm using data sets originating from two large animal models of osteochondral repair. Intra-lesional osteophytes were detected in 3 of 10 defects in the minipig and in 4 of 5 defects in the sheep model. Peri-hole bone resorption was found in 22 of 30 microfracture holes in the minipig and in 17 of 30 microfracture holes in the sheep model. Subchondral bone cysts appeared in 1 microfracture hole in the minipig and in 5 microfracture holes in the sheep model (n = 30 holes each). Calculation of inter-rater agreement (90% agreement) and Cohen’s kappa (kappa = 0.874) revealed that the novel algorithm is highly reliable, reproducible, and valid. Comparison analysis with the best existing semi-quantitative evaluation method was also performed, supporting the enhanced precision of this algorithm. PMID:27596562

  6. From fields to objects: A review of geographic boundary analysis

    NASA Astrophysics Data System (ADS)

    Jacquez, G. M.; Maruca, S.; Fortin, M.-J.

    Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for, geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.

  7. Using frequency analysis to improve the precision of human body posture algorithms based on Kalman filters.

    PubMed

    Olivares, Alberto; Górriz, J M; Ramírez, J; Olivares, G

    2016-05-01

    With the advent of miniaturized inertial sensors, many systems have been developed within the last decade to study and analyze human motion and posture, especially in the medical field. Data measured by the sensors are usually processed by algorithms based on Kalman filters in order to estimate the orientation of the body parts under study. These filters traditionally include fixed parameters, such as the process and observation noise variances, whose values have a large influence on the overall performance. It has been demonstrated that the optimal value of these parameters differs considerably for different motion intensities. Therefore, in this work, we show that, by applying frequency analysis to determine motion intensity and varying the formerly fixed parameters accordingly, the overall precision of orientation estimation algorithms can be improved, thereby providing physicians with reliable objective data they can use in their daily practice. Copyright © 2015 Elsevier Ltd. All rights reserved.
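
    A minimal sketch of the idea in Python (a generic scalar filter; the cutoff frequency, scaling factors, and variable names are assumptions, not the authors' implementation):

      import numpy as np

      def motion_intensity(window, fs):
          # Fraction of signal power above a low-frequency cutoff (simple FFT measure).
          spec = np.abs(np.fft.rfft(window - window.mean())) ** 2
          freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
          return spec[freqs > 0.5].sum() / max(spec.sum(), 1e-12)

      def kalman_step(x, P, z, q_base, r_base, intensity):
          # One scalar Kalman update with noise variances scaled by motion intensity.
          Q = q_base * (1.0 + 10.0 * intensity)  # trust the model less in fast motion
          R = r_base * (1.0 + 10.0 * intensity)  # observations noisier in fast motion
          P = P + Q                              # predict (identity state model)
          K = P / (P + R)                        # Kalman gain
          return x + K * (z - x), (1.0 - K) * P  # updated state and covariance

      accel = np.random.default_rng(1).normal(size=256)   # stand-in sensor window
      x, P = 0.0, 1.0
      x, P = kalman_step(x, P, z=0.2, q_base=1e-4, r_base=1e-2,
                         intensity=motion_intensity(accel, fs=100.0))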

  8. Analysis of achievable disturbance attenuation in a precision magnetically-suspended motion control system

    NASA Technical Reports Server (NTRS)

    Kuzin, Alexander V.; Holmes, Michael L.; Behrouzjou, Roxana; Trumper, David L.

    1994-01-01

    This paper presents an analysis of the disturbance attenuation achievable in a precision magnetically-suspended motion control system that must deliver Angstrom-level motion control resolution over macroscopic travel. Noise sources in the transducers, electronics, and mechanical vibrations are used to develop the control design.

  9. Cosmological surveys with multi-object spectrographs

    NASA Astrophysics Data System (ADS)

    Colless, Matthew

    2016-08-01

    Multi-object spectroscopy has been a key technique contributing to the current era of `precision cosmology.' From the first exploratory surveys of the large-scale structure and evolution of the universe to the current generation of superbly detailed maps spanning a wide range of redshifts, multi-object spectroscopy has been a fundamentally important tool for mapping the rich structure of the cosmic web and extracting cosmological information of increasing variety and precision. This will continue to be true for the foreseeable future, as we seek to map the evolving geometry and structure of the universe over the full extent of cosmic history in order to obtain the most precise and comprehensive measurements of cosmological parameters. Here I briefly summarize the contributions that multi-object spectroscopy has made to cosmology so far, then review the major surveys and instruments currently in play and their prospects for pushing back the cosmological frontier. Finally, I examine some of the next generation of instruments and surveys to explore how the field will develop in coming years, with a particular focus on specialised multi-object spectrographs for cosmology and the capabilities of multi-object spectrographs on the new generation of extremely large telescopes.

  10. Precise Orbit Determination for ALOS

    NASA Technical Reports Server (NTRS)

    Nakamura, Ryo; Nakamura, Shinichi; Kudo, Nobuo; Katagiri, Seiji

    2007-01-01

    The Advanced Land Observing Satellite (ALOS) has been developed to contribute to the fields of mapping, precise regional land coverage observation, disaster monitoring, and resource surveying. Because the mounted sensors need high geometrical accuracy, precise orbit determination for ALOS is essential for satisfying the mission objectives; ALOS therefore carries a GPS receiver and a Laser Reflector (LR) for Satellite Laser Ranging (SLR). This paper deals with precise orbit determination experiments for ALOS using the Global and High Accuracy Trajectory determination System (GUTS) and the evaluation of the orbit determination accuracy by SLR data. The results show that, even though the GPS receiver loses lock on GPS signals more frequently than expected, the GPS-based orbit is consistent with the SLR-based orbit. Considering the 1-sigma error, an orbit determination accuracy of a few decimeters (peak-to-peak) was achieved.

  11. Analysis of the Bayesian Cramér-Rao lower bound in astrometry. Studying the impact of prior information in the location of an object

    NASA Astrophysics Data System (ADS)

    Echeverria, Alex; Silva, Jorge F.; Mendez, Rene A.; Orchard, Marcos

    2016-10-01

    Context. The best precision that can be achieved to estimate the location of a stellar-like object is a topic of permanent interest in the astrometric community. Aims: We analyze bounds for the best position estimation of a stellar-like object on a CCD detector array in a Bayesian setting where the position is unknown, but where we have access to a prior distribution. In contrast to a parametric setting, where we estimate a parameter from observations, the Bayesian approach estimates a random object (i.e., the position is a random variable) from observations that are statistically dependent on the position. Methods: We characterize the Bayesian Cramér-Rao (CR) bound on the minimum mean square error (MMSE) of the best estimator of the position of a point source on a linear CCD-like detector, as a function of the properties of the detector, the source, and the background. Results: We quantify and analyze the increase in astrometric performance from the use of a prior distribution of the object position, which is not available in the classical parametric setting. This gain is shown to be significant for various observational regimes, in particular in the case of faint objects or when the observations are taken under poor conditions. Furthermore, we present numerical evidence that the MMSE estimator of this problem tightly achieves the Bayesian CR bound. This is a remarkable result, demonstrating that all the performance gains presented in our analysis can be achieved with the MMSE estimator. Conclusions: The Bayesian CR bound can be used as a benchmark indicator of the expected maximum positional precision of a set of astrometric measurements in which prior information can be incorporated. This bound can be achieved through the conditional mean estimator, in contrast to the parametric case, where no unbiased estimator precisely reaches the CR bound.
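
    In the scalar case, the Bayesian CR bound characterized here has the classical Van Trees form (generic notation, not the paper's):

      \mathrm{MMSE} \ \ge\ \bigl( \mathbb{E}_{x}[\, I(x) \,] + I_{p} \bigr)^{-1},
      \qquad
      I_{p} = \mathbb{E}\!\left[ \left( \frac{\partial \ln p(x)}{\partial x} \right)^{\!2} \right],

    where I(x) is the Fisher information of the pixel counts given the source position x, p(x) is the prior density, and the I_p term is the contribution of prior knowledge that is absent in the parametric setting.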

  12. High-Precision Image Aided Inertial Navigation with Known Features: Observability Analysis and Performance Evaluation

    PubMed Central

    Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun

    2014-01-01

    A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision position (centimeter-level) and attitude (half-degree-level)-integrated solutions can be achieved in a global reference. PMID:25330046

  13. Large-scale weakly supervised object localization via latent category learning.

    PubMed

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds, and there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts, or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose an online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision that outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.

  14. Frame sequences analysis technique of linear objects movement

    NASA Astrophysics Data System (ADS)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering, and is achieved through video recording at various frame rates and light spectra. In doing so, quantitative analysis of the motion of the objects being studied becomes an important component of the research. This work discusses analysis of the motion of linear objects on the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of the objects' motion. This velocity was found as an average over 8-12 objects, with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro with subsequent approximation of the data obtained using the Hill equation.

  15. Centering Objects in the Workspace

    ERIC Educational Resources Information Center

    Free, Cory

    2005-01-01

    Drafters must be detail-oriented people. The objects they draw are interpreted and then built with the extreme precision required by today's manufacturers. Now that computer-aided drafting (CAD) has taken over the drafting profession, anything less than exact precision is unacceptable. In her drafting classes, the author expects her students to…

  16. Objective analysis of observational data from the FGGE observing systems

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.

    1981-01-01

    An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the first GARP global experiment is described. The objective analysis procedure is based on a successive corrections method and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and description of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.

  17. Precision Nutrition 4.0: A Big Data and Ethics Foresight Analysis--Convergence of Agrigenomics, Nutrigenomics, Nutriproteomics, and Nutrimetabolomics.

    PubMed

    Özdemir, Vural; Kolker, Eugene

    2016-02-01

    Nutrition is central to sustenance of good health, not to mention its role as a cultural object that brings together or draws lines among societies. Undoubtedly, understanding the future paths of nutrition science in the current era of Big Data remains firmly on science, technology, and innovation strategy agendas around the world. Nutrigenomics, the confluence of nutrition science with genomics, brought about a new focus on and legitimacy for "variability science" (i.e., the study of mechanisms of person-to-person and population differences in response to food, and the ways in which food variably impacts the host, for example, nutrient-related disease outcomes). Societal expectations, both public and private, and claims over genomics-guided and individually-tailored precision diets continue to proliferate. While the prospects of nutrition science, and nutrigenomics in particular, are established, there is a need to integrate the efforts in four Big Data domains that are naturally allied--agrigenomics, nutrigenomics, nutriproteomics, and nutrimetabolomics--that address complementary variability questions pertaining to individual differences in response to food-related environmental exposures. The joint use of these four omics knowledge domains, coined as Precision Nutrition 4.0 here, has sadly not been realized to date, but the potentials for such integrated knowledge innovation are enormous. Future personalized nutrition practices would benefit from a seamless planning of life sciences funding, research, and practice agendas from "farm to clinic to supermarket to society," and from "genome to proteome to metabolome." Hence, this innovation foresight analysis explains the already existing potentials waiting to be realized, and suggests ways forward for innovation in both technology and ethics foresight frames on precision nutrition. We propose the creation of a new Precision Nutrition Evidence Barometer for periodic, independent, and ongoing retrieval, screening

  18. Precision of coherence analysis to detect cerebral autoregulation by near-infrared spectroscopy in preterm infants

    NASA Astrophysics Data System (ADS)

    Hahn, Gitte Holst; Christensen, Karl Bang; Leung, Terence S.; Greisen, Gorm

    2010-05-01

    Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with high signal-to-noise ratio, we hypothesized that coherence is more precisely determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants. We also examined total monitoring time required to discriminate among infants with a simulation study. We studied 22 preterm infants (GA<30) yielding 215 10-min measurements. Surprisingly, adjusting for variabilityABP within the power spectrum did not improve the precision. However, adjusting for the variabilityABP among repeated measurements (i.e., weighting measurements with high variabilityABP in favor of those with low) improved the precision. The evidence of drift in individual infants was weak. Minimum monitoring time needed to discriminate among infants was 1.3-3.7 h. Coherence analysis in low frequencies (0.04-0.1 Hz) had higher precision and statistically more power than in very low frequencies (0.003-0.04 Hz). In conclusion, a reliable detection of cerebral autoregulation takes hours and the precision is improved by adjusting for variabilityABP between repeated measurements.
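
    For reference, band-averaged coherence of the kind used here can be computed with SciPy as in the following sketch (synthetic stand-in signals; the sampling rate and segment length are hypothetical):

      import numpy as np
      from scipy.signal import coherence

      fs = 1.0                              # Hz, hypothetical sampling rate
      rng = np.random.default_rng(2)
      abp = rng.normal(size=6000)           # stand-in ABP series
      nirs = 0.5 * abp + rng.normal(size=6000)   # stand-in NIRS series

      f, Cxy = coherence(abp, nirs, fs=fs, nperseg=600)
      lf = (f >= 0.04) & (f <= 0.1)         # low-frequency band from the study
      print(f"mean LF coherence: {Cxy[lf].mean():.2f}")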

  19. Object width modulates object-based attentional selection.

    PubMed

    Nah, Joseph C; Neppi-Modona, Marco; Strother, Lars; Behrmann, Marlene; Shomstein, Sarah

    2018-04-24

    Visual input typically includes a myriad of objects, some of which are selected for further processing. While these objects vary in shape and size, most evidence supporting object-based guidance of attention is drawn from paradigms employing two identical objects. Importantly, object size is a readily perceived stimulus dimension, and whether it modulates the distribution of attention remains an open question. Across four experiments, the size of the objects in the display was manipulated in a modified version of the two-rectangle paradigm. In Experiment 1, two identical parallel rectangles of two sizes (thin or thick) were presented. Experiments 2-4 employed identical trapezoids (each having a thin and thick end), inverted in orientation. In the experiments, one end of an object was cued and participants performed either a T/L discrimination or a simple target-detection task. Combined results show that, in addition to the standard object-based attentional advantage, there was a further attentional benefit for processing information contained in the thick versus thin end of objects. Additionally, eye-tracking measures demonstrated increased saccade precision towards thick object ends, suggesting that Fitts's Law may play a role in object-based attentional shifts. Taken together, these results suggest that object-based attentional selection is modulated by object width.

  20. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, James

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, and discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective lagrangian, new physics theories will be developed that explain the anomaly and put it into a more unified

  1. Comparative analysis of gene expression level by quantitative real-time PCR has limited application in objects with different morphology.

    PubMed

    Demidenko, Natalia V; Penin, Aleksey A

    2012-01-01

    qRT-PCR is a generally acknowledged method for gene expression analysis due to its precision and reproducibility. However, it is well known that the accuracy of qRT-PCR data varies greatly depending on the experimental design and data analysis. Recently, a set of guidelines has been proposed that aims to improve the reliability of qRT-PCR. However, there are additional factors that have not been taken into consideration in these guidelines that can seriously affect the data obtained using this method. In this study, we report the influence that object morphology can have on qRT-PCR data. We have used a number of Arabidopsis thaliana mutants with altered floral morphology as models for this study. These mutants have been well characterised (including in terms of gene expression levels and patterns) by other techniques. This allows us to compare the results from the qRT-PCR with the results inferred from other methods. We demonstrate that the comparison of gene expression levels in objects that differ greatly in their morphology can lead to erroneous results.

  2. Analysis of de-noising methods to improve the precision of the ILSF BPM electronic readout system

    NASA Astrophysics Data System (ADS)

    Shafiee, M.; Feghhi, S. A. H.; Rahighi, J.

    2016-12-01

    In order to have optimum operation and a precise control system at particle accelerators, the beam position must be measured with sub-μm precision. We developed a BPM electronic readout system at the Iranian Light Source Facility, and it has been experimentally tested at the ALBA accelerator facility. The results show a precision of 0.54 μm in beam position measurements. To improve the precision of this beam position monitoring system to the sub-μm level, we have studied different de-noising methods such as principal component analysis, wavelet transforms, FIR filtering, and direct averaging. An evaluation of the noise reduction was performed to verify the ability of these methods. The results show that noise reduction based on the Daubechies wavelet transform outperforms the other algorithms, and the method is suitable for signal noise reduction in a beam position monitoring system.
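
    A minimal sketch of Daubechies-wavelet de-noising in Python with PyWavelets (universal soft threshold; the wavelet choice, decomposition level, and test signal are assumptions, not the ILSF pipeline):

      import numpy as np
      import pywt

      def wavelet_denoise(signal, wavelet="db4", level=4):
          # Soft-threshold the detail coefficients and reconstruct.
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise from finest scale
          thresh = sigma * np.sqrt(2 * np.log(signal.size))  # universal threshold
          coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: signal.size]

      rng = np.random.default_rng(3)
      raw = np.sin(np.linspace(0, 20, 1024)) + 0.3 * rng.normal(size=1024)
      clean = wavelet_denoise(raw)           # de-noised BPM-like signal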

  3. Multi-objective optimization in quantum parameter estimation

    NASA Astrophysics Data System (ADS)

    Gong, BeiLi; Cui, Wei

    2018-04-01

    We investigate quantum parameter estimation based on linear and Kerr-type nonlinear controls in an open quantum system, and consider the dissipation rate as an unknown parameter. We show that while such controls improve the precision of parameter estimation, they usually introduce a significant deformation of the system state. Moreover, we propose a multi-objective model to optimize the two conflicting objectives: (1) maximizing the Fisher information to improve the parameter estimation precision, and (2) minimizing the deformation of the system state to maintain its fidelity. Finally, simulations of a simplified ɛ-constrained model demonstrate the feasibility of Hamiltonian control in improving the precision of quantum parameter estimation.
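
    Schematically, the epsilon-constrained scalarization of such a two-objective problem can be written as (generic symbols, not the paper's notation):

      \max_{u} \ F_{Q}(u)
      \quad \text{subject to} \quad
      1 - F(\rho_{u}, \rho_{0}) \le \varepsilon,

    where F_Q(u) is the Fisher information about the dissipation rate under control u, F(\rho_u, \rho_0) is the fidelity between the controlled state and a reference state, and \varepsilon caps the allowed deformation.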

  4. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    PubMed

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of delta18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry that uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(-3) and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of delta44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of +/- 0.15 per thousand (1sigma) for delta18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw delta18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the

  5. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.

  6. Precision medicine at the crossroads.

    PubMed

    Olson, Maynard V

    2017-10-11

    There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.

  7. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value.

    PubMed

    Chen, Yixi; Guzauskas, Gregory F; Gu, Chengming; Wang, Bruce C M; Furnback, Wesley E; Xie, Guotong; Dong, Peng; Garrison, Louis P

    2016-11-02

    The "big data" era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient-level HEOR analyses. We propose the concept of "precision HEOR", which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient.

  8. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value

    PubMed Central

    Chen, Yixi; Guzauskas, Gregory F.; Gu, Chengming; Wang, Bruce C. M.; Furnback, Wesley E.; Xie, Guotong; Dong, Peng; Garrison, Louis P.

    2016-01-01

    The “big data” era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient–level HEOR analyses. We propose the concept of “precision HEOR”, which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient. PMID:27827859

  9. Outlier analysis of functional genomic profiles enriches for oncology targets and enables precision medicine.

    PubMed

    Zhu, Zhou; Ihle, Nathan T; Rejto, Paul A; Zarrinkar, Patrick P

    2016-06-13

    Genome-scale functional genomic screens across large cell line panels provide a rich resource for discovering tumor vulnerabilities that can lead to the next generation of targeted therapies. Their data analysis has typically focused on identifying genes whose knockdown enhances response in various pre-defined genetic contexts, which are limited by biological complexities as well as the incompleteness of our knowledge. We thus introduce a complementary data mining strategy to identify genes with exceptional sensitivity in subsets, or outlier groups, of cell lines, allowing an unbiased analysis without any a priori assumption about the underlying biology of dependency. Genes with outlier features are strongly and specifically enriched for those known to be associated with cancer and relevant biological processes, despite no a priori knowledge being used to drive the analysis. Identification of exceptional responders (outliers) may lead not only to new candidates for therapeutic intervention, but also to tumor indications and response biomarkers for companion precision medicine strategies. Several tumor suppressors have an outlier sensitivity pattern, supporting and generalizing the notion that tumor suppressors can play context-dependent oncogenic roles. The novel application of outlier analysis described here demonstrates a systematic and data-driven analytical strategy to decipher large-scale functional genomic data for oncology target and precision medicine discoveries.
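
    A schematic version of such an outlier-group scan in Python (MAD-based robust z-scores; the paper's actual statistic and thresholds may differ):

      import numpy as np

      def outlier_groups(sens, k=3.0):
          # sens: (genes, cell_lines) knockdown sensitivities; lower = more sensitive.
          med = np.median(sens, axis=1, keepdims=True)
          mad = np.median(np.abs(sens - med), axis=1, keepdims=True) + 1e-9
          robust_z = (sens - med) / (1.4826 * mad)
          return robust_z < -k               # exceptionally sensitive (gene, line) pairs

      rng = np.random.default_rng(5)
      data = rng.normal(size=(100, 60))
      data[7, :5] -= 6.0                     # gene 7: an outlier-sensitive subset of lines
      flags = outlier_groups(data)
      print("genes with outlier groups:", np.where(flags.any(axis=1))[0])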

  10. Sternal instability measured with radiostereometric analysis. A study of method feasibility, accuracy and precision.

    PubMed

    Vestergaard, Rikke Falsig; Søballe, Kjeld; Hasenkam, John Michael; Stilling, Maiken

    2018-05-18

    A small, but unstable, saw-gap may hinder bone-bridging and induce development of painful sternal dehiscence. We propose the use of Radiostereometric Analysis (RSA) for evaluation of sternal instability and present a method validation. Four bone analogs (phantoms) were sternotomized and tantalum beads were inserted in each half. The models were reunited with wire cerclage and placed in a radiolucent separation device. Stereoradiographs (n = 48) of the phantoms in 3 positions were recorded at 4 imposed separation points. The accuracy and precision were compared statistically and presented as translations along the 3 orthogonal axes. Seven sternotomized patients were evaluated for clinical RSA precision by double-examination stereoradiographs (n = 28). In the phantom study, we found no systematic error (p > 0.3) between the three phantom positions, and the precision for evaluation of sternal separation was 0.02 mm. Phantom accuracy was mean 0.13 mm (SD 0.25). In the clinical study, we found a detection limit of 0.42 mm for sternal separation and of 2 mm for anterior-posterior dislocation of the sternal halves for the individual patient. RSA is a precise and low-dose image modality feasible for clinical evaluation of sternal stability in research. ClinicalTrials.gov Identifier: NCT02738437, retrospectively registered.

  11. Evaluation of Alzheimer's disease by analysis of MR images using Objective Dialectical Classifiers as an alternative to ADC maps.

    PubMed

    Dos Santos, Wellington P; de Assis, Francisco M; de Souza, Ricardo E; Dos Santos Filho, Plinio B

    2008-01-01

    Alzheimer's disease is the most common cause of dementia, yet it is hard to diagnose precisely without invasive techniques, particularly at the onset of the disease. This work approaches image analysis and classification of synthetic multispectral images composed of diffusion-weighted (DW) magnetic resonance (MR) cerebral images for the evaluation of the cerebrospinal fluid area and for measuring the advance of Alzheimer's disease. A clinical 1.5 T MR imaging system was used to acquire all images presented. The classification methods are based on Objective Dialectical Classifiers, a new method based on Dialectics as defined in the Philosophy of Praxis. A 2-degree polynomial network with supervised training is used to generate the ground truth image. The classification results are used to improve the usual analysis of the apparent diffusion coefficient map.

  12. Precision digital control systems

    NASA Astrophysics Data System (ADS)

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  13. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  14. Towards Precision Spectroscopy of Baryonic Resonances

    NASA Astrophysics Data System (ADS)

    Döring, Michael; Mai, Maxim; Rönchen, Deborah

    2017-01-01

    Recent progress in baryon spectroscopy is reviewed. In a common effort, various groups have analyzed a set of new high-precision polarization observables from ELSA. The Jülich-Bonn group has finalized the analysis of pion-induced meson-baryon production, the photoproduction of pions and eta mesons, and (almost) the KΛ final state. As data become more precise, statistical aspects in the analysis of excited baryons become increasingly relevant, and several advances in this direction are proposed.

  15. Towards precision spectroscopy of baryonic resonances

    DOE PAGES

    Doring, Michael; Mai, Maxim; Ronchen, Deborah

    2017-01-26

    Recent progress in baryon spectroscopy is reviewed. In a common effort, various groups have analyzed a set of new high-precision polarization observables from ELSA. The Julich-Bonn group has finalized the analysis of pion-induced meson-baryon production, the photoproduction of pions and eta mesons, and (almost) the KΛ final state. Lastly, as data become more precise, statistical aspects in the analysis of excited baryons become increasingly relevant, and several advances in this direction are proposed.

  16. Omics Profiling in Precision Oncology*

    PubMed Central

    Yu, Kun-Hsing; Snyder, Michael

    2016-01-01

    Cancer causes significant morbidity and mortality worldwide, and is the area most targeted in precision medicine. Recent development of high-throughput methods enables detailed omics analysis of the molecular mechanisms underpinning tumor biology. These studies have identified clinically actionable mutations, gene and protein expression patterns associated with prognosis, and provided further insights into the molecular mechanisms indicative of cancer biology and new therapeutics strategies such as immunotherapy. In this review, we summarize the techniques used for tumor omics analysis, recapitulate the key findings in cancer omics studies, and point to areas requiring further research on precision oncology. PMID:27099341

  17. The GEMPAK Barnes objective analysis scheme

    NASA Technical Reports Server (NTRS)

    Koch, S. E.; Desjardins, M.; Kocin, P. J.

    1981-01-01

    GEMPAK, an interactive computer software system developed for the purpose of assimilating, analyzing, and displaying various conventional and satellite meteorological data types, is discussed. The objective map analysis scheme possesses certain characteristics that allowed it to be adapted to meet the analysis needs of GEMPAK. Those characteristics and the specific adaptation of the scheme to GEMPAK are described. A step-by-step guide for using the GEMPAK Barnes scheme on an interactive computer (in real time) to analyze various types of meteorological datasets is also presented.
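
    The successive-corrections idea at the core of a Barnes scheme can be sketched in a few lines of Python (Gaussian weights; a generic illustration, not GEMPAK's implementation, and the kappa/gamma values are placeholders):

      import numpy as np

      def barnes(obs_xy, obs_val, grid_xy, kappa=1.0, gamma=0.3, passes=2):
          # Barnes analysis: Gaussian-weighted means plus correction passes.
          def weights(targets, kap):
              d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
              w = np.exp(-d2 / kap)
              return w / w.sum(axis=1, keepdims=True)

          grid = weights(grid_xy, kappa) @ obs_val     # first pass on the grid
          at_obs = weights(obs_xy, kappa) @ obs_val    # first pass at the obs sites
          kap = kappa
          for _ in range(passes - 1):
              kap *= gamma                             # tighter weights each pass
              resid = obs_val - at_obs
              grid += weights(grid_xy, kap) @ resid
              at_obs += weights(obs_xy, kap) @ resid
          return grid

      rng = np.random.default_rng(4)
      obs_xy = rng.uniform(0, 10, size=(50, 2))
      obs_val = np.sin(obs_xy[:, 0]) + rng.normal(scale=0.1, size=50)
      gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
      field = barnes(obs_xy, obs_val, np.column_stack([gx.ravel(), gy.ravel()]))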

  18. Advanced bioanalytics for precision medicine.

    PubMed

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  19. Segmentation quality evaluation using region-based precision and recall measures for remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Xueliang; Feng, Xuezhi; Xiao, Pengfeng; He, Guangjun; Zhu, Liujun

    2015-04-01

    Segmentation of remote sensing images is a critical step in geographic object-based image analysis. Evaluating the performance of segmentation algorithms is essential to identify effective segmentation methods and optimize their parameters. In this study, we propose region-based precision and recall measures and use them to compare two image partitions for the purpose of evaluating segmentation quality. The two measures are calculated based on region overlapping and presented as a point or a curve in a precision-recall space, which can indicate segmentation quality in both geometric and arithmetic respects. Furthermore, the precision and recall measures are combined by using four different methods. We examine and compare the effectiveness of the combined indicators through geometric illustration, in an effort to reveal segmentation quality clearly and capture the trade-off between the two measures. In the experiments, we adopted the multiresolution segmentation (MRS) method for evaluation. The proposed measures are compared with four existing discrepancy measures to further confirm their capabilities. Finally, we suggest using a combination of the region-based precision-recall curve and the F-measure for supervised segmentation evaluation.
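
    One plausible operationalization of region-based precision and recall for labeled partitions, with an F-measure combination, is sketched below in Python (not necessarily the paper's exact definitions; the overlap threshold is an assumption):

      import numpy as np

      def region_precision_recall(seg, ref, overlap_thresh=0.5):
          # A region counts as matched if one region of the other partition
          # covers more than overlap_thresh of its area.
          def matched_fraction(a, b):
              ids = np.unique(a)
              hits = 0
              for i in ids:
                  mask = a == i
                  best = max(np.mean(b[mask] == j) for j in np.unique(b[mask]))
                  hits += best > overlap_thresh
              return hits / ids.size

          precision = matched_fraction(seg, ref)   # segments supported by reference
          recall = matched_fraction(ref, seg)      # reference regions recovered
          f = 2 * precision * recall / max(precision + recall, 1e-12)
          return precision, recall, f

      seg = np.array([[0, 0, 1], [0, 1, 1]])
      ref = np.array([[0, 0, 0], [1, 1, 1]])
      print(region_precision_recall(seg, ref))     # (1.0, 1.0, 1.0) for this toy case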

  20. Department of Defense Precise Time and Time Interval program improvement plan

    NASA Technical Reports Server (NTRS)

    Bowser, J. R.

    1981-01-01

    The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.

  1. Touch Precision Modulates Visual Bias.

    PubMed

    Misceo, Giovanni F; Jones, Maurice D

    2018-01-01

    The sensory precision hypothesis holds that different seen and felt cues about the size of an object resolve themselves in favor of the more reliable modality. To examine this precision hypothesis, 60 college students were asked to look at one size while manually exploring another unseen size either with their bare fingers or, to lessen the reliability of touch, with their fingers sleeved in rigid tubes. Afterwards, the participants estimated either the seen size or the felt size by finding a match from a visual display of various sizes. Results showed that the seen size biased the estimates of the felt size when the reliability of touch decreased. This finding supports the interaction between touch reliability and visual bias predicted by statistically optimal models of sensory integration.

  2. Foreign object detection and removal to improve automated analysis of chest radiographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime

    2013-07-15

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassier clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free response operator characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
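
    A minimal sketch of the detect-segment-inpaint chain in Python (scikit-learn kNN on simple pixel features, with OpenCV inpainting as a stand-in for the paper's texture inpainting; the file names and feature choice are hypothetical):

      import cv2
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      def pixel_features(img):
          # Per-pixel features: intensity plus local mean and standard deviation.
          img = img.astype(np.float32)
          mean = cv2.blur(img, (9, 9))
          sq_mean = cv2.blur(img * img, (9, 9))
          std = np.sqrt(np.maximum(sq_mean - mean * mean, 0))
          return np.stack([img, mean, std], axis=-1).reshape(-1, 3)

      train_img = cv2.imread("train_cxr.png", cv2.IMREAD_GRAYSCALE)   # hypothetical
      train_mask = cv2.imread("train_mask.png", cv2.IMREAD_GRAYSCALE) > 0
      knn = KNeighborsClassifier(n_neighbors=15)
      knn.fit(pixel_features(train_img), train_mask.ravel())

      test_img = cv2.imread("test_cxr.png", cv2.IMREAD_GRAYSCALE)
      prob = knn.predict_proba(pixel_features(test_img))[:, 1].reshape(test_img.shape)
      obj_mask = (prob > 0.5).astype(np.uint8)     # segment by thresholding + grouping
      cleaned = cv2.inpaint(test_img, obj_mask, inpaintRadius=5,
                            flags=cv2.INPAINT_TELEA)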

  3. IDL Object Oriented Software for Hinode/XRT Image Analysis

    NASA Astrophysics Data System (ADS)

    Higgins, P. A.; Gallagher, P. T.

    2008-09-01

    We have developed a set of object oriented IDL routines that enable users to search, download and analyse images from the X-Ray Telescope (XRT) on-board Hinode. In this paper, we give specific examples of how the object can be used and how multi-instrument data analysis can be performed. The XRT object is a highly versatile and powerful IDL object, which will prove to be a useful tool for solar researchers. This software utilizes the generic Framework object available within the GEN branch of SolarSoft.

  4. A light and faster regional convolutional neural network for object detection in optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Ding, Peng; Zhang, Ye; Deng, Wei-Jian; Jia, Ping; Kuijper, Arjan

    2018-07-01

    Detection of objects from satellite optical remote sensing images is very important for many commercial and governmental applications. With the development of deep convolutional neural networks (deep CNNs), the field of object detection has seen tremendous advances. Currently, objects in satellite remote sensing images can be detected using deep CNNs. In general, optical remote sensing images contain many dense and small objects, and the use of the original Faster Regional CNN framework does not yield suitably high precision. Therefore, after careful analysis, we adopt dense convolutional networks, a multi-scale representation, and various combinations of improvement schemes to enhance the structure of the base VGG16-Net and improve the precision. We propose an approach to reduce the test (detection) time and memory requirements. To validate the effectiveness of our approach, we perform experiments using satellite remote sensing image datasets of aircraft and automobiles. The results show that the improved network structure can detect objects in satellite optical remote sensing images more accurately and efficiently.

  5. Platform Precision Autopilot Overview and Mission Performance

    NASA Technical Reports Server (NTRS)

    Strovers, Brian K.; Lee, James A.

    2009-01-01

    The Platform Precision Autopilot is an instrument landing system-interfaced autopilot system, developed to enable an aircraft to repeatedly fly nearly the same trajectory hours, days, or weeks later. The Platform Precision Autopilot uses a novel design to interface with a NASA Gulfstream III jet by imitating the output of an instrument landing system approach. This technique minimizes, as much as possible, modifications to the baseline Gulfstream III jet and retains the safety features of the aircraft autopilot. The Platform Precision Autopilot requirement is to fly within a 5-m (16.4-ft) radius tube for distances to 200 km (108 nmi) in the presence of light turbulence for at least 90 percent of the time. This capability allows precise repeat-pass interferometry for the Unmanned Aerial Vehicle Synthetic Aperture Radar program, whose primary objective is to develop a miniaturized, polarimetric, L-band synthetic aperture radar. Precise navigation is achieved using an accurate differential global positioning system developed by the Jet Propulsion Laboratory. Flight-testing has demonstrated the ability of the Platform Precision Autopilot to control the aircraft within the specified tolerance greater than 90 percent of the time in the presence of aircraft system noise and nonlinearities, constant pilot throttle adjustments, and light turbulence.
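
    The 5 m / 90 percent tube requirement reduces to a simple statistic over the flown trajectory. A toy check of that statistic, with synthetic tracks standing in for real flight data:

        import numpy as np

        rng = np.random.default_rng(11)
        ref = np.cumsum(rng.normal(0, 1, (5000, 2)), axis=0)   # reference pass (m)
        flown = ref + rng.normal(0, 2.0, ref.shape)            # repeat pass with error

        radial_error = np.linalg.norm(flown - ref, axis=1)     # distance from tube axis
        inside = np.mean(radial_error <= 5.0)
        print(f"within the 5 m tube {inside:.1%} of the time (requirement: >= 90%)")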

  6. Categorical data processing for real estate objects valuation using statistical analysis

    NASA Astrophysics Data System (ADS)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
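
    A minimal sketch of this workflow in Python, assuming invented column names and plain one-hot coding as a stand-in for the paper's proposed coding method:

        import pandas as pd
        from sklearn.linear_model import LinearRegression

        df = pd.DataFrame({
            "district": ["center", "suburb", "center", "outskirts"],
            "material": ["brick", "panel", "brick", "wood"],
            "area_m2":  [54.0, 42.5, 61.0, 38.0],
            "price":    [90_000, 55_000, 105_000, 35_000],
        })

        # Encode categorical property attributes, then fit a price regression.
        X = pd.get_dummies(df[["district", "material", "area_m2"]])
        y = df["price"]
        model = LinearRegression().fit(X, y)
        print("in-sample R^2:", model.score(X, y))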

  7. System identification of the JPL micro-precision interferometer truss - Test-analysis reconciliation

    NASA Technical Reports Server (NTRS)

    Red-Horse, J. R.; Marek, E. L.; Levine-West, M.

    1993-01-01

    The JPL Micro-Precision Interferometer (MPI) is a testbed for studying the use of control-structure interaction technology in the design of space-based interferometers. A layered control architecture will be employed to regulate the interferometer optical system to tolerances in the nanometer range. An important aspect of designing and implementing the control schemes for such a system is the need for high fidelity, test-verified analytical structural models. This paper focuses on one aspect of the effort to produce such a model for the MPI structure, test-analysis model reconciliation. Pretest analysis, modal testing, and model refinement results are summarized for a series of tests at both the component and full system levels.

  8. High-precision Orbit Fitting and Uncertainty Analysis of (486958) 2014 MU69

    NASA Astrophysics Data System (ADS)

    Porter, Simon B.; Buie, Marc W.; Parker, Alex H.; Spencer, John R.; Benecchi, Susan; Tanga, Paolo; Verbiscer, Anne; Kavelaars, J. J.; Gwyn, Stephen D. J.; Young, Eliot F.; Weaver, H. A.; Olkin, Catherine B.; Parker, Joel W.; Stern, S. Alan

    2018-07-01

    NASA’s New Horizons spacecraft will conduct a close flyby of the cold-classical Kuiper Belt Object (KBO) designated (486958) 2014 MU69 on 2019 January 1. At a heliocentric distance of 44 au, “MU69” will be the most distant object ever visited by a spacecraft. To enable this flyby, we have developed an extremely high-precision orbit fitting and uncertainty processing pipeline, making maximal use of the Hubble Space Telescope’s Wide Field Camera 3 (WFC3) and pre-release versions of the ESA Gaia Data Release 2 (DR2) catalog. This pipeline also enabled successful predictions of a stellar occultation by MU69 in 2017 July. We describe how we process the WFC3 images to match the Gaia DR2 catalog, extract positional uncertainties for this extremely faint target (typically 140 photons per WFC3 exposure), and translate those uncertainties into probability distribution functions for MU69 at any given time. We also describe how we use these uncertainties to guide New Horizons, plan stellar occultations of MU69, and derive MU69's orbital evolution and long-term stability.

  9. [Precision nutrition in the era of precision medicine].

    PubMed

    Chen, P Z; Wang, H

    2016-12-06

    Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, by fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition: individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection, and the applicable therapeutic or intervention methods. It was suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice are leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.

  10. Classification of LIDAR Data for Generating a High-Precision Roadway Map

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Lee, I.

    2016-06-01

    The generation of highly precise maps is growing in importance with the development of autonomous driving vehicles. A highly precise map has centimetre-level precision, unlike existing commercial maps with metre-level precision. Such maps are important for understanding road environments and making driving decisions, since robust localization is one of the critical challenges for an autonomous car. One key data source is a Lidar, because it provides a highly dense point cloud with three-dimensional positions, intensities, and ranges from the sensor to targets. In this paper, we focus on how to segment point cloud data from a vehicle-mounted Lidar and classify objects on the road for the highly precise map. In particular, we propose combining a feature descriptor with a machine-learning classification algorithm. Objects can be distinguished by geometrical features based on the surface normal of each point. To achieve correct classification with limited point cloud data sets, a Support Vector Machine algorithm is used. The final step is to evaluate the accuracy of the obtained results by comparing them to reference data. The results show sufficient accuracy, and the method will be utilized to generate a highly precise road map.
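
    A compact sketch of the classification step with synthetic points and labels standing in for real data: surface normals estimated by local PCA feed an SVM, mirroring the feature-descriptor-plus-classifier combination described above.

        import numpy as np
        from scipy.spatial import cKDTree
        from sklearn.svm import SVC

        def point_normals(points, k=10):
            """Normal per point: eigenvector of the smallest eigenvalue of the
            local neighbourhood covariance (PCA plane fit)."""
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)
            normals = np.empty_like(points)
            for i, nb in enumerate(idx):
                eigval, eigvec = np.linalg.eigh(np.cov(points[nb].T))
                normals[i] = eigvec[:, 0]
            return normals

        points = np.random.rand(1000, 3)            # stand-in for real Lidar returns
        labels = (points[:, 2] > 0.5).astype(int)   # stand-in for object classes
        feats = np.hstack([points[:, 2:3], point_normals(points)])  # height + normal

        clf = SVC(kernel="rbf").fit(feats, labels)
        print("training accuracy:", clf.score(feats, labels))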

  11. Enhanced online convolutional neural networks for object tracking

    NASA Astrophysics Data System (ADS)

    Zhang, Dengzhuo; Gao, Yun; Zhou, Hao; Li, Tianwen

    2018-04-01

    In recent years, object tracking based on convolutional neural networks has gained more and more attention. The initialization and update of the convolution filters directly affect object tracking precision. In this paper, a novel object tracker based on an enhanced online convolutional neural network without offline training is proposed, which initializes the convolution filters with a k-means++ algorithm and updates the filters by error back-propagation. Comparative experiments with 7 trackers on 15 challenging sequences showed that our tracker performs better than the other trackers in terms of AUC and precision.
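
    One plausible reading of the k-means++ filter initialization, sketched with scikit-learn's kmeans_plusplus seeding over zero-mean image patches; the patch size, patch count, and filter count are assumptions, not the paper's settings.

        import numpy as np
        from sklearn.cluster import kmeans_plusplus

        def init_conv_filters(image, n_filters=16, ksize=5, seed=0):
            """Seed convolution filters from image patches via k-means++."""
            rs = np.random.RandomState(seed)
            ys = rs.randint(0, image.shape[0] - ksize, 500)
            xs = rs.randint(0, image.shape[1] - ksize, 500)
            patches = np.stack([image[y:y + ksize, x:x + ksize].ravel()
                                for y, x in zip(ys, xs)])
            patches -= patches.mean(axis=1, keepdims=True)   # zero-mean patches
            centers, _ = kmeans_plusplus(patches, n_clusters=n_filters,
                                         random_state=seed)
            return centers.reshape(n_filters, ksize, ksize)  # one filter per centre

        filters = init_conv_filters(np.random.rand(64, 64))
        print(filters.shape)   # (16, 5, 5)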

  12. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    PubMed

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  13. Sensor-based precision fertilization for field crops

    USDA-ARS?s Scientific Manuscript database

    From the development of the first viable variable-rate fertilizer systems in the upper Midwest USA, precision agriculture is now approaching three decades old. Early precision fertilization practice relied on laboratory analysis of soil samples collected on a spatial pattern to define the nutrient-s...

  14. Consumer Education: Consumer Education I and Consumer Education II. Course Objectives, Content Analysis, Supporting Objectives and Content Generalizations.

    ERIC Educational Resources Information Center

    Crow, Karen, Comp.; Martin, Joan, Ed.

    Consumer education course objectives, content analysis, supporting objectives, and content generalizations are presented in this teacher's guide for Consumer Education 1 and 2 for the San Diego Unified School District. Course objectives are aimed at several areas of consumer and family studies: consumer education, cultural awareness, human…

  15. Significantly improved precision of cell migration analysis in time-lapse video microscopy through use of a fully automated tracking system

    PubMed Central

    2010-01-01

    Background Cell motility is a critical parameter in many physiological as well as pathophysiological processes. In time-lapse video microscopy, manual cell tracking remains the most common method of analyzing the migratory behavior of cell populations. In addition to being labor-intensive, this method is susceptible to user-dependent errors regarding the selection of "representative" subsets of cells and the manual determination of precise cell positions. Results We have quantitatively analyzed these error sources, demonstrating that manual cell tracking of pancreatic cancer cells led to miscalculations of migration rates of up to 410%. In order to provide objective measurements of cell migration rates, we have employed multi-target tracking technologies commonly used in radar applications to develop a fully automated cell identification and tracking system suitable for high-throughput screening of video sequences of unstained living cells. Conclusion We demonstrate that our automatic multi-target tracking system identifies cell objects, follows individual cells and computes migration rates with high precision, clearly outperforming manual procedures. PMID:20377897
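
    A toy stand-in for the automated tracker, assuming pre-detected centroids per frame: greedy nearest-neighbour linking with a distance gate, followed by per-track migration speed. The real system uses radar-style multi-target tracking, which this sketch does not reproduce.

        import numpy as np
        from scipy.spatial import cKDTree

        def track_cells(frames, dt_minutes=5.0, max_step=20.0):
            """Link each cell to its nearest detection in the next frame,
            then report path length per unit time for each track."""
            tracks = [[p] for p in frames[0]]
            for curr in frames[1:]:
                tree = cKDTree(curr)
                for t in tracks:
                    d, j = tree.query(t[-1])
                    if d <= max_step:                 # gate implausible jumps
                        t.append(curr[j])
            speeds = []
            for t in tracks:
                if len(t) > 1:
                    steps = np.diff(np.asarray(t), axis=0)
                    path = np.linalg.norm(steps, axis=1).sum()
                    speeds.append(path / (dt_minutes * len(steps)))  # um/min
            return speeds

        rng = np.random.default_rng(0)
        frames = [rng.uniform(0, 500, (20, 2))]       # 20 cells, positions in um
        for _ in range(9):
            frames.append(frames[-1] + rng.normal(0, 3.0, (20, 2)))  # random motility
        print("mean speed (um/min):", np.mean(track_cells(frames)))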

  16. PRECISE ANGLE MONITOR BASED ON THE CONCEPT OF PENCIL-BEAM INTERFEROMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    QIAN,S.; TAKACS,P.

    2000-07-30

    Precise angle monitoring is a very important metrology task for research, development, and industrial applications. The autocollimator, based on the principle of geometric optics, is one of the most powerful and widely applied instruments for small-angle monitoring. In this paper the authors introduce a new precise angle monitoring system, the Pencil-beam Angle Monitor (PAM), based on pencil-beam interferometry. Its principle of operation is a combination of physical and geometrical optics. The angle calculation method is similar to that of the autocollimator; however, where the autocollimator creates a cross image, the pencil-beam angle monitoring system produces an interference fringe on the focal plane. The advantages of the PAM are: high angular sensitivity; long-term stability, making angle monitoring over long time periods possible; high measurement accuracy on the order of sub-microradians; the ability to measure simultaneously in two perpendicular directions or on two different objects; dynamic measurement capability; insensitivity to vibration and air turbulence; automatic display, storage, and analysis by computer; a small beam diameter that makes alignment extremely easy; and a longer test distance. Some test examples are presented.

  17. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    NASA Astrophysics Data System (ADS)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  18. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  19. Present situation and trend of precision guidance technology and its intelligence

    NASA Astrophysics Data System (ADS)

    Shang, Zhengguo; Liu, Tiandong

    2017-11-01

    This paper first introduces the basic concepts of precision guidance technology and artificial intelligence technology. It then gives a brief introduction to intelligent precision guidance technology and, with the help of foreign deep-learning-based intelligent weapon development projects (the LRASM missile project, the TRACE project, and the BLADE project), an overview of current foreign precision guidance technology. Finally, the future development trend of intelligent precision guidance technology is summarized, mainly concentrated in multi-objective engagement, intelligent classification, weak target detection and recognition, intelligent anti-jamming in complex environments, and multi-source, multi-missile cooperative fighting, among other aspects.

  20. Meshing complex macro-scale objects into self-assembling bricks

    PubMed Central

    Hacohen, Adar; Hanniel, Iddo; Nikulshin, Yasha; Wolfus, Shuki; Abu-Horowitz, Almogit; Bachelet, Ido

    2015-01-01

    Self-assembly provides an information-economical route to the fabrication of objects at virtually all scales. However, there is no known algorithm to program self-assembly in macro-scale, solid, complex 3D objects. Here such an algorithm is described, which is inspired by the molecular assembly of DNA, and based on bricks designed by tetrahedral meshing of arbitrary objects. Assembly rules are encoded by topographic cues imprinted on brick faces while attraction between bricks is provided by embedded magnets. The bricks can then be mixed in a container and agitated, leading to properly assembled objects at high yields and zero errors. The system and its assembly dynamics were characterized by video and audio analysis, enabling the precise time- and space-resolved characterization of its performance and accuracy. Improved designs inspired by our system could lead to successful implementation of self-assembly at the macro-scale, allowing rapid, on-demand fabrication of objects without the need for assembly lines. PMID:26226488

  1. Angular trapping of anisometric nano-objects in a fluid.

    PubMed

    Celebrano, Michele; Rosman, Christina; Sönnichsen, Carsten; Krishnan, Madhavi

    2012-11-14

    We demonstrate the ability to trap, levitate, and orient single anisometric nanoscale objects with high angular precision in a fluid. An electrostatic fluidic trap confines a spherical object at a spatial location defined by the minimum of the electrostatic system free energy. For an anisometric object and a potential well lacking angular symmetry, the system free energy can further strongly depend on the object's orientation in the trap. Engineering the morphology of the trap thus enables precise spatial and angular confinement of a single levitating nano-object, and the process can be massively parallelized. Since the physics of the trap depends strongly on the surface charge of the object, the method is insensitive to the object's dielectric function. Furthermore, levitation of the assembled objects renders them amenable to individual manipulation using externally applied optical, electrical, or hydrodynamic fields, raising prospects for reconfigurable chip-based nano-object assemblies.

  2. Trace element analysis by EPMA in geosciences: detection limit, precision and accuracy

    NASA Astrophysics Data System (ADS)

    Batanova, V. G.; Sobolev, A. V.; Magnin, V.

    2018-01-01

    Use of the electron probe microanalyser (EPMA) for trace element analysis has increased over the last decade, mainly because of improved stability of spectrometers and the electron column when operated at high probe current; the development of new large-area crystal monochromators and ultra-high count rate spectrometers; full integration of energy-dispersive / wavelength-dispersive X-ray spectrometry (EDS/WDS) signals; and the development of powerful software packages. For phases that are stable under a dense electron beam, the detection limit and precision can be decreased to the ppm level by using high acceleration voltage and beam current combined with long counting time. Data on 10 elements (Na, Al, P, Ca, Ti, Cr, Mn, Co, Ni, Zn) in olivine obtained on a JEOL JXA-8230 microprobe with a tungsten filament show that the detection limit decreases proportionally to the square root of counting time and probe current. For all elements equal to or heavier than phosphorus (Z = 15), the detection limit decreases with increasing accelerating voltage. The analytical precision for minor and trace elements analysed in olivine at 25 kV accelerating voltage and 900 nA beam current is 4 - 18 ppm (2 standard deviations of repeated measurements of the olivine reference sample) and is similar to the detection limit of the corresponding elements. Accurate trace element analysis requires careful estimation of the background, and consideration of sample damage under the beam and secondary fluorescence from phase boundaries. The development and use of matrix reference samples with well-characterised trace elements of interest is important for monitoring and improving accuracy. An evaluation of the accuracy of trace element analyses in olivine has been made by comparing EPMA data for new reference samples with data obtained by different in-situ and bulk analytical methods in six different laboratories worldwide. For all elements, the measured concentrations in the olivine reference sample
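
    The stated scaling (the detection limit falling with the square root of counting time and probe current) follows directly from counting statistics. A back-of-the-envelope sketch with invented count rates and a 3-sigma criterion:

        import numpy as np

        def detection_limit_ppm(peak_cps_per_ppm, bkg_cps, t_s, current_nA,
                                ref_current_nA=100.0):
            """3-sigma detection limit; count rates scale linearly with current."""
            scale = current_nA / ref_current_nA
            peak = peak_cps_per_ppm * scale         # counts/s per ppm of analyte
            bkg = bkg_cps * scale                   # background counts/s
            return 3.0 * np.sqrt(bkg * t_s) / (peak * t_s)

        # Quadrupling time (or current) halves the detection limit.
        for t, i in [(10, 20), (40, 20), (10, 900), (160, 900)]:
            dl = detection_limit_ppm(50.0, 200.0, t, i)
            print(f"t={t:4d} s, I={i:4d} nA -> DL = {dl:6.2f} ppm")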

  3. Critical Steps in Data Analysis for Precision Casimir Force Measurements with Semiconducting Films

    NASA Astrophysics Data System (ADS)

    Banishev, A. A.; Chang, Chia-Cheng; Mohideen, U.

    2011-06-01

    Some experimental procedures and corresponding results of the precision measurement of the Casimir force between low doped Indium Tin Oxide (ITO) film and gold sphere are described. Measurements were performed using an Atomic Force Microscope in high vacuum. It is shown that the magnitude of the Casimir force decreases after prolonged UV treatment of the ITO film. Some critical data analysis steps such as the correction for the mechanical drift of the sphere-plate system and photodiodes are discussed.

  5. Multisensory Self-Motion Compensation During Object Trajectory Judgments

    PubMed Central

    Dokka, Kalpana; MacNeilage, Paul R.; DeAngelis, Gregory C.; Angelaki, Dora E.

    2015-01-01

    Judging object trajectory during self-motion is a fundamental ability for mobile organisms interacting with their environment. This fundamental ability requires the nervous system to compensate for the visual consequences of self-motion in order to make accurate judgments, but the mechanisms of this compensation are poorly understood. We comprehensively examined both the accuracy and precision of observers' ability to judge object trajectory in the world when self-motion was defined by vestibular, visual, or combined visual–vestibular cues. Without decision feedback, subjects demonstrated no compensation for self-motion that was defined solely by vestibular cues, partial compensation (47%) for visually defined self-motion, and significantly greater compensation (58%) during combined visual–vestibular self-motion. With decision feedback, subjects learned to accurately judge object trajectory in the world, and this generalized to novel self-motion speeds. Across conditions, greater compensation for self-motion was associated with decreased precision of object trajectory judgments, indicating that self-motion compensation comes at the cost of reduced discriminability. Our findings suggest that the brain can flexibly represent object trajectory relative to either the observer or the world, but a world-centered representation comes at the cost of decreased precision due to the inclusion of noisy self-motion signals. PMID:24062317

  6. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  7. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    PubMed

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro; source codes are available at http://tinyurl.com/oasisproSourceCode . © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
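
    The analysis pattern OASISPRO automates is the standard supervised-learning loop with held-out evaluation; a self-contained sketch with synthetic data standing in for TCGA omics matrices:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)
        X = rng.normal(size=(200, 500))        # 200 patients x 500 omics features
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200)) > 0  # synthetic phenotype

        # Hold out a test set, train a classifier of choice, evaluate on the held-out set.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("held-out AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))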

  8. Is There Space for the Objective Force?

    DTIC Science & Technology

    2003-04-07

    force through the combination of precision weapons and knowledge-based warfare. Army forces will survive through information dominance, provided by a... Objective Forces. Space-based systems will be foundational building blocks for the Objective Force to achieve information dominance and satellite... communications required for information dominance across a distributed battlefield? Second, what exists to provide the Objective Force information

  9. Platform Precision Autopilot Overview and Flight Test Results

    NASA Technical Reports Server (NTRS)

    Lin, V.; Strovers, B.; Lee, J.; Beck, R.

    2008-01-01

    The Platform Precision Autopilot is an instrument landing system interfaced autopilot system, developed to enable an aircraft to repeatedly fly nearly the same trajectory hours, days, or weeks later. The Platform Precision Autopilot uses a novel design to interface with a NASA Gulfstream III jet by imitating the output of an instrument landing system approach. This technique minimizes, as much as possible, modifications to the baseline Gulfstream III jet and retains the safety features of the aircraft autopilot. The Platform Precision Autopilot requirement is to fly within a 5-m (16.4-ft) radius tube for distances to 200 km (108 nmi) in the presence of light turbulence for at least 90 percent of the time. This capability allows precise repeat-pass interferometry for the Uninhabited Aerial Vehicle Synthetic Aperture Radar program, whose primary objective is to develop a miniaturized, polarimetric, L-band synthetic aperture radar. Precise navigation is achieved using an accurate differential global positioning system developed by the Jet Propulsion Laboratory. Flight-testing has demonstrated the ability of the Platform Precision Autopilot to control the aircraft within the specified tolerance greater than 90 percent of the time in the presence of aircraft system noise and nonlinearities, constant pilot throttle adjustments, and light turbulence.

  10. Toward Precision Healthcare: Context and Mathematical Challenges

    PubMed Central

    Colijn, Caroline; Jones, Nick; Johnston, Iain G.; Yaliraki, Sophia; Barahona, Mauricio

    2017-01-01

    Precision medicine refers to the idea of delivering the right treatment to the right patient at the right time, usually with a focus on a data-centered approach to this task. In this perspective piece, we use the term “precision healthcare” to describe the development of precision approaches that bridge from the individual to the population, taking advantage of individual-level data, but also taking the social context into account. These problems give rise to a broad spectrum of technical, scientific, policy, ethical and social challenges, and new mathematical techniques will be required to meet them. To ensure that the science underpinning “precision” is robust, interpretable and well-suited to meet the policy, ethical and social questions that such approaches raise, the mathematical methods for data analysis should be transparent, robust, and able to adapt to errors and uncertainties. In particular, precision methodologies should capture the complexity of data, yet produce tractable descriptions at the relevant resolution while preserving intelligibility and traceability, so that they can be used by practitioners to aid decision-making. Through several case studies in this domain of precision healthcare, we argue that this vision requires the development of new mathematical frameworks, both in modeling and in data analysis and interpretation. PMID:28377724

  11. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied to the elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures applied to the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). Pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved with an increasing number of laser pulses accumulated per spectrum as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
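
    A minimal sketch of the PLSR calibration loop with synthetic spectra: the number of latent components is chosen by cross-validated prediction error, the optimization the abstract describes.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(60, 1000))        # 60 spectra x 1000 channels
        conc = 2.0 * spectra[:, 100] + spectra[:, 400] + rng.normal(0, 0.1, 60)

        best_n, best_rmse = None, np.inf
        for n in range(1, 11):                       # optimize the component count
            pred = cross_val_predict(PLSRegression(n_components=n),
                                     spectra, conc, cv=5).ravel()
            rmse = np.sqrt(np.mean((pred - conc) ** 2))   # CV error of prediction
            if rmse < best_rmse:
                best_n, best_rmse = n, rmse
        print(f"optimal components: {best_n}, RMSECV: {best_rmse:.3f}")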

  12. Robotic Observatory System Design-Specification Considerations for Achieving Long-Term Sustainable Precision Performance

    NASA Astrophysics Data System (ADS)

    Wray, J. D.

    2003-05-01

    The robotic observatory telescope must point precisely on the target object, and then track autonomously to a fraction of the FWHM of the system PSF for durations of ten to twenty minutes or more. It must retain this precision while continuing to function at rates approaching thousands of observations per night for all its years of useful life. These stringent requirements raise new challenges unique to robotic telescope systems design. Critical design considerations are driven by the applicability of the above requirements to all systems of the robotic observatory, including telescope and instrument systems, telescope-dome enclosure systems, combined electrical and electronics systems, environmental (e.g. seeing) control systems and integrated computer control software systems. Traditional telescope design considerations include the effects of differential thermal strain, elastic flexure, plastic flexure and slack or backlash with respect to focal stability, optical alignment and angular pointing and tracking precision. Robotic observatory design must holistically encapsulate these traditional considerations within the overall objective of maximized long-term sustainable precision performance. This overall objective is accomplished through combining appropriate mechanical and dynamical system characteristics with a full-time real-time telescope mount model feedback computer control system. Important design considerations include: identifying and reducing quasi-zero-backlash; increasing size to increase precision; directly encoding axis shaft rotation; pointing and tracking operation via real-time feedback between precision mount model and axis mounted encoders; use of monolithic construction whenever appropriate for sustainable mechanical integrity; accelerating dome motion to eliminate repetitive shock; ducting internal telescope air to outside dome; and the principal design criteria: maximizing elastic repeatability while minimizing slack, plastic deformation

  13. Accounting for Limited Detection Efficiency and Localization Precision in Cluster Analysis in Single Molecule Localization Microscopy

    PubMed Central

    Shivanandan, Arun; Unnikrishnan, Jayakrishnan; Radenovic, Aleksandra

    2015-01-01

    Single Molecule Localization Microscopy techniques like PhotoActivated Localization Microscopy, with their sub-diffraction limit spatial resolution, have been popularly used to characterize the spatial organization of membrane proteins, by means of quantitative cluster analysis. However, such quantitative studies remain challenged by the techniques' inherent sources of errors, such as a limited detection efficiency of less than 60%, due to incomplete photo-conversion, and a limited localization precision in the range of 10-30 nm, varying across the detected molecules, mainly depending on the number of photons collected from each. We provide analytical methods to estimate the effect of these errors in cluster analysis and to correct for them. These methods, based on the Ripley's L(r) - r or Pair Correlation Function popularly used by the community, can facilitate potentially breakthrough results in quantitative biology by providing a more accurate and precise quantification of protein spatial organization. PMID:25794150
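
    For reference, a naive (edge-correction-free) estimator of the Ripley's L(r) - r statistic that the proposed corrections build on; for complete spatial randomness it stays near zero.

        import numpy as np
        from scipy.spatial.distance import pdist

        def ripley_L_minus_r(points, r_values, area):
            """Naive L(r) - r for a 2-D point pattern (no edge correction)."""
            n = len(points)
            d = pdist(points)                        # all pairwise distances
            L = []
            for r in r_values:
                K = area * 2.0 * np.sum(d < r) / n**2    # naive K(r) estimate
                L.append(np.sqrt(K / np.pi) - r)
            return np.array(L)

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 1000, (500, 2))         # localizations in a 1 um square (nm)
        print(ripley_L_minus_r(pts, np.array([20.0, 50.0, 100.0]), 1000.0 ** 2))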

  14. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting force and its power spectrum analysis is thought to be an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics under different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals under different tool wear stages were analyzed using power spectrum analysis. The analysis indicates that a characteristic frequency does exist in the power spectrum of the cutting force, whose power spectral density increases with increasing tool wear; this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
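
    A sketch of the monitoring idea with a synthetic force signal: estimate the power spectrum (here via Welch's method) and track the power near an assumed characteristic frequency as wear grows. The sample rate and the 2 kHz band are invented for illustration.

        import numpy as np
        from scipy.signal import welch

        fs = 20_000.0                               # sampling rate in Hz (assumed)
        t = np.arange(0, 1.0, 1.0 / fs)
        rng = np.random.default_rng(5)

        def cutting_force(wear_level):
            """Toy force signal: spindle component plus a wear-driven band."""
            return (np.sin(2 * np.pi * 50 * t)                   # spindle rotation
                    + wear_level * np.sin(2 * np.pi * 2000 * t)  # wear signature
                    + 0.5 * rng.normal(size=t.size))             # measurement noise

        for wear in (0.1, 0.5, 1.0):
            f, pxx = welch(cutting_force(wear), fs=fs, nperseg=4096)
            band = (f > 1900) & (f < 2100)
            print(f"wear={wear:.1f}: peak PSD near 2 kHz = {pxx[band].max():.3e}")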

  15. Introductory Psychology Textbooks: An Objective Analysis Update

    ERIC Educational Resources Information Center

    Griggs, Richard A.; Jackson, Sherri L.

    2013-01-01

    It has been 13 years since the last objective analysis of full-length introductory psychology textbooks was published and 15 years since the textbook copyright period used in that study, 1995-1997. Given the importance of informed textbook evaluation and selection to the introductory course but the difficulty of this task because of the large…

  16. Precise Analysis of Gallium Isotopic Composition by MC-ICP-MS.

    PubMed

    Yuan, Wei; Chen, Jiu Bin; Birck, Jean-Louis; Yin, Zuo Ying; Yuan, Sheng Liu; Cai, Hong Ming; Wang, Zhong Wei; Huang, Qiang; Wang, Zhu Hong

    2016-10-04

    Though an isotope approach could be beneficial for better understanding the biogeochemical cycle of gallium (Ga), an analogue of the monoisotopic element aluminum (Al), the geochemistry of Ga isotopes has not been widely elaborated. We developed a two-step method for purifying Ga from geological (biological) samples for precise measurement of the Ga isotope ratio using multicollector inductively coupled plasma mass spectrometry (MC-ICP-MS). Ga was thoroughly separated from other matrix elements using two chromatographic columns loaded with AG 1-X4 and Ln-spec resin, respectively. The separation method was carefully calibrated using both synthetic and natural samples and validated by assessing the extraction yield (99.8 ± 0.8%, 2SD, n = 23) and the reproducibility (2SD uncertainty better than 0.05‰, n = 116) of the measured isotopic ratio (expressed as δ71Ga). The validation of the whole protocol, together with instrumental analysis, was confirmed by investigation of the matrix effect, the result of a standard addition experiment, and a comparison of Ga isotope measurements on two mass spectrometers, the Nu Plasma II and the Neptune Plus. Although measurements using the sample-standard bracketing (SSB) correction method on both instruments resulted in identical δ71Ga values for reference materials, the modified empirical external normalization (MEEN) method gave relatively better precision compared to SSB on the Neptune. Our preliminary results showed large variation of δ71Ga (up to 1.83‰) among 10 standards, with higher values in industrially produced materials, implying potential application of Ga isotopes.
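
    A worked example of the sample-standard bracketing (SSB) correction named above, with invented 71Ga/69Ga ratios: the sample ratio is referenced to the mean of the two bracketing standard runs.

        def delta71Ga_ssb(r_sample, r_std_before, r_std_after):
            """delta-71Ga in per mil relative to the bracketing standard."""
            r_std = 0.5 * (r_std_before + r_std_after)
            return (r_sample / r_std - 1.0) * 1000.0

        print(delta71Ga_ssb(0.66280, 0.66251, 0.66257))   # ~0.39 per mil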

  17. An interval precise integration method for transient unbalance response analysis of rotor system with uncertainty

    NASA Astrophysics Data System (ADS)

    Fu, Chao; Ren, Xingmin; Yang, Yongfeng; Xia, Yebao; Deng, Wangqun

    2018-07-01

    A non-intrusive interval precise integration method (IPIM) is proposed in this paper to analyze the transient unbalance response of uncertain rotor systems. The transfer matrix method (TMM) is used to derive the deterministic equations of motion of a hollow-shaft overhung rotor. The uncertain transient dynamic problem is solved by combining Chebyshev approximation theory with the modified precise integration method (PIM). Transient response bounds are calculated by interval arithmetic on the expansion coefficients. A brief theoretical error analysis of the proposed method is provided, and its accuracy is further validated by comparison with the scanning method in simulations. Numerical results show that the IPIM keeps good accuracy in vibration prediction for the start-up transient process. Furthermore, the proposed method can also provide theoretical guidance for other transient dynamic mechanical systems with uncertainties.

  18. Optimal tracers for parallel labeling experiments and 13C metabolic flux analysis: A new precision and synergy scoring system.

    PubMed

    Crown, Scott B; Long, Christopher P; Antoniewicz, Maciek R

    2016-11-01

    13C-Metabolic flux analysis (13C-MFA) is a widely used approach in metabolic engineering for quantifying intracellular metabolic fluxes. The precision of fluxes determined by 13C-MFA depends largely on the choice of isotopic tracers and the specific set of labeling measurements. A recent advance in the field is the use of parallel labeling experiments for improved flux precision and accuracy. However, as of today, no systematic methods exist for identifying optimal tracers for parallel labeling experiments. In this contribution, we have addressed this problem by introducing a new scoring system and evaluating thousands of different isotopic tracer schemes. Based on this extensive analysis we have identified optimal tracers for 13C-MFA. The best single tracers were doubly 13C-labeled glucose tracers, including [1,6-13C]glucose, [5,6-13C]glucose and [1,2-13C]glucose, which consistently produced the highest flux precision independent of the metabolic flux map (here, 100 random flux maps were evaluated). Moreover, we demonstrate that pure glucose tracers perform better overall than mixtures of glucose tracers. For parallel labeling experiments the optimal isotopic tracers were [1,6-13C]glucose and [1,2-13C]glucose. Combined analysis of [1,6-13C]glucose and [1,2-13C]glucose labeling data improved the flux precision score by nearly 20-fold compared to the widely used tracer mixture 80% [1-13C]glucose + 20% [U-13C]glucose. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  19. High precision calcium isotope analysis using 42Ca-48Ca double-spike TIMS technique

    NASA Astrophysics Data System (ADS)

    Feng, L.; Zhou, L.; Gao, S.; Tong, S. Y.; Zhou, M. L.

    2014-12-01

    Double spike techniques are widely used for determining the calcium isotopic compositions of natural samples. The most important factors controlling the precision of the double spike technique are the choice of an appropriate spike isotope pair, the composition of the double spike, and the ratio of spike to sample (CSp/CN). We propose an optimal 42Ca-48Ca double spike protocol which yields the best internal precision for calcium isotopic composition determinations among all spike pairs and various spike compositions and spike-to-sample ratios, as predicted by the linear error propagation method. It is suggested to use a spike composition of 42Ca/(42Ca+48Ca) = 0.44 mol/mol and CSp/(CN + CSp) = 0.12 mol/mol, because this takes advantage of both the largest mass dispersion between 42Ca and 48Ca (14%) and the lowest spike cost. Spiked samples were purified by passing through a homemade micro-column filled with Ca special resin. K, Ti and other interfering elements were completely separated, while 100% of the calcium was recovered with negligible blank. Data collection parameters, including integration time, idle time, and focus and peak-center frequency, were all carefully designed for the highest internal precision and lowest analysis time. All beams were measured automatically in a sequence by the Triton TIMS so as to eliminate differences in analytical conditions between samples and standards, and also to increase analytical throughput. The typical internal precision of 100 duty cycles for one beam is 0.012‒0.015 ‰ (2σSEM), which agrees well with the predicted internal precision of 0.0124 ‰ (2σSEM). Our method improves internal precision by a factor of 2‒10 compared to previous methods for determining calcium isotopic compositions by double spike TIMS. We analyzed NIST SRM 915a, NIST SRM 915b and Pacific Seawater, as well as interspersed geological samples, during two months. The obtained average δ44/40Ca (all relative to NIST SRM 915a) is 0.02 ± 0.02 ‰ (n=28), 0.72 ± 0.04 ‰ (n=10) and 1

  20. Precise Truss Assembly using Commodity Parts and Low Precision Welding

    NASA Technical Reports Server (NTRS)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, William R.; Correll, Nikolaus

    2013-01-01

    We describe an Intelligent Precision Jigging Robot (IPJR), which allows high precision assembly of commodity parts with low-precision bonding. We present preliminary experiments in 2D that are motivated by the problem of assembling a space telescope optical bench on orbit using inexpensive, stock hardware and low-precision welding. An IPJR is a robot that acts as the precise "jigging", holding parts of a local assembly site in place while an external low precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (in this case, a human using only low precision tools), to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. We report the challenges of designing the IPJR hardware and software, analyze the error in assembly, document the test results over several experiments including a large-scale ring structure, and describe future work to implement the IPJR in 3D and with micron precision.

  1. High-Precision Pinpointing of Luminescent Targets in Encoder-Assisted Scanning Microscopy Allowing High-Speed Quantitative Analysis.

    PubMed

    Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong

    2016-01-19

    Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders into a motorized stage as the virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm² in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in the X and Y directions. Alongside implementation of an autofocus unit that compensates for the tilt of a slide in the Z-axis in real time, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA functionalized upconversion nanoparticles.

  2. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

    A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminative texture information that separates the object from background data. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that complement the texture information to cope with both appearance variations and background clutter. Moreover, to reduce the risk of drift, which increases for textureless depth templates, an update mechanism is proposed that selects more reliable tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and more accurate tracking results than other well-known algorithms.

  3. Plasmonic micropillars for precision cell force measurement across a large field-of-view

    NASA Astrophysics Data System (ADS)

    Xiao, Fan; Wen, Ximiao; Tan, Xing Haw Marvin; Chiou, Pei-Yu

    2018-01-01

    A plasmonic micropillar platform with self-organized gold nanospheres is reported for precision cell traction force measurement across a large field-of-view (FOV). Gold nanospheres were implanted into the tips of polymer micropillars by annealing gold microdisks with nanosecond laser pulses. Each gold nanosphere is physically anchored in the center of a pillar tip and serves as a strong, point-source-like light scattering center for its micropillar. This allows a micropillar to be clearly observed and precisely tracked even under a low-magnification objective lens, enabling concurrent, precise measurement across a large FOV. A spatial resolution of 30 nm for the pillar deflection measurement has been accomplished on this platform with a 20× objective lens.

  4. Variation objective analyses for cyclone studies

    NASA Technical Reports Server (NTRS)

    Achtemeier, G. L.; Kidder, S. Q.; Ochs, H. T.

    1985-01-01

    The objectives were to: (1) develop an objective analysis technique that will maximize the information content of data available from diverse sources, with particular emphasis on the incorporation of observations from satellites with those from more traditional immersion techniques; and (2) develop a diagnosis of the state of the synoptic-scale atmosphere on a much finer scale, over a much broader region, than is presently possible, to permit studies of the interactions and energy transfers between global, synoptic and regional scale atmospheric processes. The variational objective analysis model consists of the two horizontal momentum equations, the hydrostatic equation, and the integrated continuity equation for a dry hydrostatic atmosphere. Preliminary tests of the model with the SESAME I data set are underway for 12 GMT 10 April 1979. At this stage the purpose of the analysis is not the diagnosis of atmospheric structures but rather the validation of the model. Model runs with rawinsonde data, and with the precision modulus weights set to force most of the adjustment of the wind field to the mass field, have produced 90 to 95 percent reductions in the imbalance of the initial data after only 4 cycles through the Euler-Lagrange equations. Sensitivity tests for linear stability of the 11 Euler-Lagrange equations that make up the VASP Model 1 indicate that there will be a lower limit to the scales of motion that can be resolved by this method. Linear stability criteria are violated where there is large horizontal wind shear near the upper-tropospheric jet.

  5. Precise terrestrial time: A means for improved ballistic missile guidance analysis

    NASA Technical Reports Server (NTRS)

    Ehrsam, E. E.; Cresswell, S. A.; Mckelvey, G. R.; Matthews, F. L.

    1978-01-01

    An approach developed to improve ground instrumentation time tagging accuracy, adapted to support the Minuteman ICBM program, is described. The Timing Insertion Unit (TIU) technique produces a telemetry data time tagging resolution of one tenth of a microsecond with high relative intersite accuracy after corrections. Metric position and velocity data (range, azimuth, elevation and range rate), also used in missile guidance system analysis, can be correlated to within ten microseconds of the telemetry guidance data. This requires precise timing synchronization between the metric and telemetry instrumentation sites. The timing synchronization can be achieved by using the radar automatic phasing system time correlation methods. Other time correlation techniques, such as Television (TV) Line-10 and the Geostationary Operational Environmental Satellites (GOES) terrestrial timing receivers, are also considered.

  6. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
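
    A compact modern equivalent of the application's three phases, using scipy.ndimage on a small synthetic binary image: segmentation into connected components, object counting, and simple geometric measurements.

        import numpy as np
        from scipy import ndimage

        binary = np.zeros((64, 64), dtype=np.uint8)
        binary[5:15, 5:15] = 1                      # a square object
        binary[30:34, 40:60] = 1                    # a bar-shaped object

        labels, n_objects = ndimage.label(binary)   # 4-connectivity by default
        print("objects found:", n_objects)

        index = range(1, n_objects + 1)
        areas = ndimage.sum(binary, labels, index)            # pixel counts
        centroids = ndimage.center_of_mass(binary, labels, index)
        boxes = ndimage.find_objects(labels)                  # bounding-box slices
        for i, (a, c, b) in enumerate(zip(areas, centroids, boxes), start=1):
            print(f"object {i}: area={a:.0f}px centroid={c} bbox={b}")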

  7. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    PubMed

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  8. Accuracy assessment of BDS precision orbit determination and the influence analysis of site distribution

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Guo, Jiming; Li, Zhicai; Zhang, Peng; Wu, Junli; Song, Weiwei

    2017-04-01

    BDS precision orbit determination is a key part of BDS applications, but the inadequate number of ground stations and the poor distribution of the network are the main reasons for the low accuracy of BDS precise orbit determination. In this paper, BDS precise orbit determination results are obtained by using the IGS MGEX stations and the Chinese national reference stations. The orbit determination accuracy for the GEO, IGSO and MEO satellites is 10.3 cm, 2.8 cm and 3.2 cm, and the radial accuracy is 1.6 cm, 1.9 cm and 1.5 cm, respectively. The influence of the ground reference station distribution on BDS precise orbit determination is then studied. The results show that the Chinese national reference stations contribute significantly to BDS orbit determination: the overlap precision of the GEO/IGSO/MEO satellites improved by 15.5%, 57.5% and 5.3%, respectively, after adding the Chinese stations. Finally, the results are verified against ODOP (orbit distribution of precision) and SLR. Key words: BDS precise orbit determination; accuracy assessment; Chinese national reference stations; reference station distribution; orbit distribution of precision

  9. Geographic Object-Based Image Analysis - Towards a new paradigm.

    PubMed

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and on an analysis of peer-reviewed scientific literature, we conclude that GEOBIA is a new and evolving paradigm.

  10. Capacity and precision in an animal model of visual short-term memory.

    PubMed

    Lara, Antonio H; Wallis, Jonathan D

    2012-03-14

    Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining items. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM.
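
    As a quick illustration of the Gaussian error-modelling step, the sketch below fits a normal distribution to simulated response errors at two memory loads and reports 1/sigma as a precision index; the simulated error spreads and the 1/sigma definition are illustrative assumptions, not the paper's data.

        # Fit a Gaussian to response errors; precision = 1/sigma.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        errors = {
            "1 item": rng.normal(0.0, 10.0, 500),   # narrow error spread
            "4 items": rng.normal(0.0, 25.0, 500),  # broader at higher load
        }
        for load, errs in errors.items():
            mu, sigma = norm.fit(errs)              # ML Gaussian fit
            print(f"{load}: sigma={sigma:5.1f} deg, precision={1/sigma:.3f}")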

  11. Capacity and precision in an animal model of visual short-term memory

    PubMed Central

    Lara, Antonio H.; Wallis, Jonathan D.

    2013-01-01

    Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys’ VSTM capacity. Subjects’ performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining items. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM. PMID:22419756

  12. The forthcoming era of precision medicine.

    PubMed

    Gamulin, Stjepan

    2016-11-01

    The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients' groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism ("big data"), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  13. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    PubMed Central

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2]. Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. [7,8]). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created

  14. Creating objects and object categories for studying perception and perceptual learning.

    PubMed

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-11-02

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties. Many innovative and useful methods currently exist for creating novel objects and object categories (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection. Objects and object categories created by these simulations can

  15. Local x-ray structure analysis of optically manipulated biological micro-objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cojoc, Dan; Ferrari, Enrico; Santucci, Silvia C.

    2010-12-13

    X-ray diffraction using micro- and nanofocused beams is well suited for nanostructure analysis at different sites of a biological micro-object. To conduct in vitro studies without mechanical contact, we developed object manipulation by optical tweezers in a microfluidic cell. Here we report x-ray microdiffraction analysis of a micro-object optically trapped in three dimensions. We revealed the nanostructure of a single starch granule at different points and investigated local radiation damage induced by repeated x-ray exposures at the same position, demonstrating high stability and full control of the granule orientation by multiple optical traps.

  16. An Astronomical Test of CCD Photometric Precision

    NASA Technical Reports Server (NTRS)

    Koch, David G.; Dunham, Edward W.; Borucki, William J.; Jenkins, Jon M.

    2001-01-01

    Ground-based differential photometry is limited to a precision of order 10^-3 because of atmospheric effects. A space-based photometer should be limited only by the inherent instrument precision and shot noise. Laboratory tests have shown that a precision of order 10^-5 is achievable with commercially available charge-coupled devices (CCDs). We have proposed to take this one step further by performing measurements at a telescope using a Wollaston prism as a beam splitter. First-order atmospheric effects (e.g., extinction) will appear to be identical in the two images of each star formed by the prism and will be removed in the data analysis. This arrangement can determine the precision that is achievable under the influence of second-order atmospheric effects (e.g., variable point-spread function (PSF) from seeing). These telescopic observations will thus provide a lower limit to the precision that can be realized by a space-based differential photometer.

  17. Objective Bayesian analysis of neutrino masses and hierarchy

    NASA Astrophysics Data System (ADS)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still noticeably impact the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured, but with inconclusive posterior odds of 5.1:1. Better data are hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct and the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. It is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
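
    For readers unfamiliar with posterior odds, the arithmetic is a one-liner: the odds are the ratio of the two models' marginal likelihoods times the prior odds. The log-evidence values in the sketch below are invented, chosen only so the ratio lands near the quoted 5.1:1.

        # Posterior odds between two hypotheses from (hypothetical) evidences.
        import numpy as np

        log_Z_normal = -12.30     # invented log marginal likelihoods
        log_Z_inverted = -13.93
        prior_odds = 1.0          # equal prior weight on both hierarchies

        posterior_odds = np.exp(log_Z_normal - log_Z_inverted) * prior_odds
        print(f"posterior odds (normal : inverted) = {posterior_odds:.1f} : 1")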

  18. Multi-GNSS real-time precise orbit/clock/UPD products and precise positioning service at GFZ

    NASA Astrophysics Data System (ADS)

    Li, Xingxing; Ge, Maorong; Liu, Yang; Fritsche, Mathias; Wickert, Jens; Schuh, Harald

    2016-04-01

    The rapid development of multi-constellation GNSSs (Global Navigation Satellite Systems, e.g., BeiDou, Galileo, GLONASS, GPS) and the IGS (International GNSS Service) Multi-GNSS Experiment (MGEX) brings great opportunities and challenges for real-time precise positioning services. In this contribution, we present a GPS+GLONASS+BeiDou+Galileo four-system model to fully exploit the observations of all four navigation satellite systems for real-time precise orbit determination, clock estimation and positioning. A rigorous multi-GNSS analysis is performed to achieve the best possible consistency by processing the observations from the different GNSSs together in one common parameter estimation procedure. Meanwhile, an efficient multi-GNSS real-time precise positioning service system is designed and demonstrated by using MGEX and IGS data streams from stations all over the world. The addition of the BeiDou, Galileo and GLONASS systems to the standard GPS-only processing reduces the convergence time by almost 70%, while the positioning accuracy is improved by about 25%. Some outliers in the GPS-only solutions vanish when multi-GNSS observations are processed simultaneously. The availability and reliability of GPS precise positioning decrease dramatically as the elevation cutoff increases. However, the accuracy of multi-GNSS precise point positioning (PPP) is hardly degraded, and a few centimeters are still achievable in the horizontal components even with a 40° elevation cutoff.

  19. Effects of grasp compatibility on long-term memory for objects.

    PubMed

    Canits, Ivonne; Pecher, Diane; Zeelenberg, René

    2018-01-01

    Previous studies have shown action potentiation during conceptual processing of manipulable objects. In four experiments, we investigated whether these motor actions also play a role in long-term memory. Participants categorized objects that afforded either a power grasp or a precision grasp as natural or artifact by grasping cylinders with either a power grasp or a precision grasp. In all experiments, responses were faster when the affordance of the object was compatible with the type of grasp response. However, subsequent free recall and recognition memory tasks revealed no better memory for object pictures and object names for which the grasp affordance was compatible with the grasp response. The present results therefore do not support the hypothesis that motor actions play a role in long-term memory. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Universal, computer facilitated, steady state oscillator, closed loop analysis theory and some applications to precision oscillators

    NASA Technical Reports Server (NTRS)

    Parzen, Benjamin

    1992-01-01

    The theory of oscillator analysis in the immittance domain should be read in conjunction with the additional theory presented here. The combined theory enables the computer simulation of the steady state oscillator. The simulation makes the calculation of the oscillator total steady state performance practical, including noise at all oscillator locations. Some specific precision oscillators are analyzed.

  1. Accuracy or precision: Implications of sample design and methodology on abundance estimation

    USGS Publications Warehouse

    Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.

    2015-01-01

    Sampling by spatially replicated counts (point-counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either few large sample units or many small sample units, introducing biases into sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than scenarios with few sample units of large area. However, scenarios with few sample units of large area provided more precise abundance estimates than those with many sample units of small area. Accuracy and precision of abundance estimates should be considered during the sample design process, with study goals and objectives fully recognized; too often, and with consequence, they are instead an afterthought during the data analysis process.
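
    A stripped-down version of such a simulation is sketched below: a known clumped population is sampled with either many small or few large quadrats of equal total area, and the mean error (accuracy) and spread (precision) of the scaled-up estimates are reported. This skeleton uses simple quadrat expansion rather than the study's N-mixture models, and every parameter is invented.

        # Compare accuracy and precision of abundance estimates under two
        # sampling designs with equal total sampled area (toy simulation).
        import numpy as np

        rng = np.random.default_rng(1)
        GRID, TRUE_TOTAL = 100, 1000
        area = np.zeros((GRID, GRID), dtype=int)
        hotspots = rng.integers(0, GRID, size=(10, 2))  # clumped population
        for _ in range(TRUE_TOTAL):
            cy, cx = hotspots[rng.integers(10)]
            y = int(np.clip(rng.normal(cy, 5), 0, GRID - 1))
            x = int(np.clip(rng.normal(cx, 5), 0, GRID - 1))
            area[y, x] += 1

        def estimate_total(unit, n_units):
            """Count organisms in random unit-x-unit quadrats, scale up."""
            count = 0
            for _ in range(n_units):
                y, x = rng.integers(0, GRID - unit, size=2)
                count += area[y:y + unit, x:x + unit].sum()
            return count * area.size / (n_units * unit * unit)

        for unit, n_units in [(5, 64), (20, 4)]:        # equal total effort
            ests = np.array([estimate_total(unit, n_units)
                             for _ in range(500)])
            print(f"{n_units:2d} quadrats {unit:2d}x{unit:2d}: "
                  f"bias={ests.mean() - TRUE_TOTAL:+8.1f}  sd={ests.std():8.1f}")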

  2. Precision of MRI-based body composition measurements of postmenopausal women

    PubMed Central

    Romu, Thobias; Thorell, Sofia; Lindblom, Hanna; Berin, Emilia; Holm, Anna-Clara Spetz; Åstrand, Lotta Lindh; Karlsson, Anette; Borga, Magnus; Hammar, Mats; Leinhard, Olof Dahlqvist

    2018-01-01

    Objectives To determine the precision of magnetic resonance imaging (MRI) based fat and muscle quantification in a group of postmenopausal women, and to extend the method to individual muscles relevant to upper-body exercise. Materials and methods This was a sub-study to a randomized controlled trial investigating the effects of resistance training on hot flushes in postmenopausal women. Thirty-six women were included, mean age 56 ± 6 years. Each subject was scanned twice with a 3.0T MR scanner using a whole-body Dixon protocol. Water and fat images were calculated using a 6-peak lipid model including R2*-correction. Body composition analyses were performed to measure visceral and subcutaneous fat volumes, and the lean volumes and muscle fat infiltration (MFI) of three muscle groups (thigh, lower leg, and abdomen) as well as three individual muscles (pectoralis, latissimus, and rhomboideus). Analysis was performed using a multi-atlas, calibrated water-fat separated quantification method. Liver fat was measured as the average proton density fat-fraction (PDFF) of three regions-of-interest. Precision was determined with Bland-Altman analysis, repeatability, and coefficient of variation. Results All of the 36 included women were successfully scanned and analysed. The coefficient of variation was 1.1% to 1.5% for abdominal fat compartments (visceral and subcutaneous), 0.8% to 1.9% for volumes of muscle groups (thigh, lower leg, and abdomen), and 2.3% to 7.0% for individual muscle volumes (pectoralis, latissimus, and rhomboideus). Limits of agreement for MFI were within ± 2.06% for muscle groups and within ± 5.13% for individual muscles. The limits of agreement for liver PDFF were within ± 1.9%. Conclusion Whole-body Dixon MRI could characterize a range of different fat and muscle compartments with high precision, including individual muscles, in the study group of postmenopausal women. The inclusion of individual muscles, calculated from the
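
    The two headline precision statistics are easy to reproduce on paired test-retest data. The sketch below computes a within-subject coefficient of variation and Bland-Altman limits of agreement; the five paired volumes are fabricated placeholders, not study data.

        # Coefficient of variation and Bland-Altman limits of agreement
        # for two repeated measurements per subject (fabricated volumes, L).
        import numpy as np

        scan1 = np.array([4.12, 3.85, 5.01, 4.47, 3.98])
        scan2 = np.array([4.18, 3.80, 4.95, 4.52, 4.03])

        diff = scan1 - scan2
        within_sd = np.sqrt(np.mean(diff**2) / 2)   # within-subject SD
        cv = 100 * within_sd / np.mean((scan1 + scan2) / 2)

        bias = diff.mean()                          # Bland-Altman bias
        loa = 1.96 * diff.std(ddof=1)               # 95% limits of agreement
        print(f"CV={cv:.2f}%  bias={bias:+.3f} L  LoA=bias ± {loa:.3f} L")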

  3. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis.

    PubMed

    Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P

    2015-07-01

    Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method suitable for the analysis of both chloride and sodium in small volumes of sweat. Precision, linearity and limit of detection of the in-house developed IC/HPLC method were established. A method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing-Bablok method comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfill the criteria established by UK guidelines and are comparable with inductively coupled plasma mass spectrometry methods. Passing-Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values, for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, the method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test compared with a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurement of sodium concentrations will result in better detection of false positives, less test repetition, and thus faster, more accurate and more effective diagnosis. The described IC/HPLC method therefore provides a precise, relatively cheap and easy-to-handle application for the analysis of both chloride and sodium in sweat. © The Author(s) 2014.

  4. Developing web-based data analysis tools for precision farming using R and Shiny

    NASA Astrophysics Data System (ADS)

    Jahanshiri, Ebrahim; Mohd Shariff, Abdul Rashid

    2014-06-01

    Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and from samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that assist farmers and managers. However, until recently, analysing data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that can provide real-time web-based data analysis. In this paper, the development of a prototype web-based data analysis application using new facilities of the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.

  5. Linearity, Bias, and Precision of Hepatic Proton Density Fat Fraction Measurements by Using MR Imaging: A Meta-Analysis.

    PubMed

    Yokoo, Takeshi; Serai, Suraj D; Pirasteh, Ali; Bashir, Mustafa R; Hamilton, Gavin; Hernando, Diego; Hu, Houchun H; Hetterich, Holger; Kühn, Jens-Peter; Kukuk, Guido M; Loomba, Rohit; Middleton, Michael S; Obuchowski, Nancy A; Song, Ji Soo; Tang, An; Wu, Xinhuai; Reeder, Scott B; Sirlin, Claude B

    2018-02-01

    Purpose To determine the linearity, bias, and precision of hepatic proton density fat fraction (PDFF) measurements by using magnetic resonance (MR) imaging across different field strengths, imager manufacturers, and reconstruction methods. Materials and Methods This meta-analysis was performed in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. A systematic literature search identified studies that evaluated the linearity and/or bias of hepatic PDFF measurements by using MR imaging (hereafter, MR imaging-PDFF) against PDFF measurements by using colocalized MR spectroscopy (hereafter, MR spectroscopy-PDFF) or the precision of MR imaging-PDFF. The quality of each study was evaluated by using the Quality Assessment of Studies of Diagnostic Accuracy 2 tool. De-identified original data sets from the selected studies were pooled. Linearity was evaluated by using linear regression between MR imaging-PDFF and MR spectroscopy-PDFF measurements. Bias, defined as the mean difference between MR imaging-PDFF and MR spectroscopy-PDFF measurements, was evaluated by using Bland-Altman analysis. Precision, defined as the agreement between repeated MR imaging-PDFF measurements, was evaluated by using a linear mixed-effects model, with field strength, imager manufacturer, reconstruction method, and region of interest as random effects. Results Twenty-three studies (1679 participants) were selected for linearity and bias analyses and 11 studies (425 participants) were selected for precision analyses. MR imaging-PDFF was linear with MR spectroscopy-PDFF (R² = 0.96). Regression slope (0.97; P < .001) and mean Bland-Altman bias (-0.13%; 95% limits of agreement: -3.95%, 3.40%) indicated minimal underestimation by using MR imaging-PDFF. MR imaging-PDFF was precise at the region-of-interest level, with repeatability and reproducibility coefficients of 2.99% and 4.12%, respectively. Field strength, imager manufacturer, and reconstruction method
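
    The linearity and bias analyses reduce to a regression and a mean difference, as the hedged sketch below shows on synthetic PDFF pairs; the slope, offset, and noise level are chosen to mimic the reported numbers, not taken from the pooled data.

        # Linearity (regression, R^2) and bias (mean difference) of paired
        # MRI-PDFF vs MRS-PDFF measurements, on synthetic stand-in data.
        import numpy as np

        rng = np.random.default_rng(2)
        mrs = rng.uniform(0, 40, 100)                      # reference PDFF (%)
        mri = 0.97 * mrs - 0.13 + rng.normal(0, 1.0, 100)  # mimics the paper

        slope, intercept = np.polyfit(mrs, mri, 1)
        r2 = np.corrcoef(mrs, mri)[0, 1] ** 2
        diff = mri - mrs
        print(f"slope={slope:.2f}  intercept={intercept:+.2f}  R^2={r2:.2f}")
        print(f"bias={diff.mean():+.2f}%  LoA=±{1.96 * diff.std(ddof=1):.2f}%")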

  6. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  7. [Medical imaging in tumor precision medicine: opportunities and challenges].

    PubMed

    Xu, Jingjing; Tan, Yanbin; Zhang, Minming

    2017-05-25

    Tumor precision medicine is an emerging approach to tumor diagnosis, treatment and prevention that takes into account individual variability in environment, lifestyle and genetic information. Tumor precision medicine builds on the medical imaging innovations developed during the past decades, including new hardware, new imaging agents, standardized protocols, image analysis and multimodal imaging fusion technology. The development of automated and reproducible analysis algorithms has also allowed large amounts of information to be extracted from image-based features. With the continuous development and mining of tumor clinical and imaging databases, radiogenomics, radiomics and artificial intelligence have been flourishing. These new technological advances therefore bring new opportunities and challenges to the application of imaging in tumor precision medicine.

  8. Statistical analysis of an RNA titration series evaluates microarray precision and sensitivity on a whole-array basis

    PubMed Central

    Holloway, Andrew J; Oshlack, Alicia; Diyagama, Dileepa S; Bowtell, David DL; Smyth, Gordon K

    2006-01-01

    Background Concerns are often raised about the accuracy of microarray technologies and the degree of cross-platform agreement, but there are as yet no methods that can unambiguously evaluate precision and sensitivity for these technologies on a whole-array basis. Results A methodology is described for evaluating the precision and sensitivity of whole-genome gene expression technologies such as microarrays. The method consists of an easy-to-construct titration series of RNA samples and an associated statistical analysis using non-linear regression. The method evaluates the precision and responsiveness of each microarray platform on a whole-array basis, i.e., using all the probes, without the need to match probes across platforms. An experiment is conducted to assess and compare four widely used microarray platforms. All four platforms are shown to have satisfactory precision but the commercial platforms are superior for resolving differential expression for genes at lower expression levels. The effective precision of the two-color platforms is improved by allowing for probe-specific dye-effects in the statistical model. The methodology is used to compare three data extraction algorithms for the Affymetrix platforms, demonstrating poor performance for the commonly used proprietary algorithm relative to the other algorithms. For probes which can be matched across platforms, the cross-platform variability is decomposed into within-platform and between-platform components, showing that platform disagreement is almost entirely systematic rather than due to measurement variability. Conclusion The results demonstrate good precision and sensitivity for all the platforms, but highlight the need for improved probe annotation. They quantify the extent to which cross-platform measures can be expected to be less accurate than within-platform comparisons for predicting disease progression or outcome. PMID:17118209
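
    To make the titration idea concrete, the sketch below fits a two-sample mixture model to one probe's signal across known mixing fractions and uses the residual scatter as a precision read-out. The mixture model and the numbers are a simplification for intuition, not the paper's published regression.

        # Fit a nonlinear mixture model to titration-series probe signals;
        # residual SD around the fit serves as a simple precision measure.
        import numpy as np
        from scipy.optimize import curve_fit

        fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # known mixes
        signal = np.array([6.1, 6.9, 7.4, 7.8, 8.2])      # log2 intensities

        def mixture(f, a, b):
            # expected log2 signal when samples with log2 levels a and b
            # are mixed in proportion f : (1 - f)
            return np.log2(f * 2.0**b + (1 - f) * 2.0**a)

        (a, b), _ = curve_fit(mixture, fraction, signal, p0=(6.0, 8.0))
        resid_sd = np.std(signal - mixture(fraction, a, b), ddof=2)
        print(f"endpoints: a={a:.2f}, b={b:.2f}; residual SD={resid_sd:.3f}")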

  9. Geometrical accuracy of metallic objects produced with additive or subtractive manufacturing: A comparative in vitro study.

    PubMed

    Braian, Michael; Jönsson, David; Kevci, Mir; Wennerberg, Ann

    2018-07-01

    To evaluate the accuracy and precision of objects produced by additive manufacturing systems (AM) for use in dentistry and to compare with subtractive manufacturing systems (SM). Ten specimens of two geometrical objects were produced by five different AM machines and one SM machine. Object A mimics an inlay-shaped object, while object B imitates a four-unit bridge model. All the objects were sorted into different measurement dimensions (x, y, z), linear distances, angles and corner radius. None of the additive manufacturing or subtractive manufacturing groups presented a perfect match to the CAD file with regard to all parameters included in the present study. Considering linear measurements, the precision for the subtractive manufacturing group was consistent in all axes for object A, presenting results of <0.050 mm. The additive manufacturing groups had consistent precision in the x-axis and y-axis but not in the z-axis. With regard to corner radius measurements, the SM group had the best overall accuracy and precision for both objects A and B when compared to the AM groups. Within the limitations of this in vitro study, the conclusion can be made that subtractive manufacturing presented overall precision on all measurements below 0.050 mm. The AM machines also presented fairly good precision, <0.150 mm, on all axes except for the z-axis. Knowledge regarding accuracy and precision for different production techniques utilized in dentistry is of great clinical importance. The dental community has moved from casting to milling and additive techniques are now being implemented. Thus all these production techniques need to be tested, compared and validated. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.

  10. Object-Driven and Temporal Action Rules Mining

    ERIC Educational Resources Information Center

    Hajja, Ayman

    2013-01-01

    In this thesis, I present my complete research work in the field of action rules, more precisely object-driven and temporal action rules. The drive behind the introduction of object-driven and temporally based action rules is to bring forth an adapted approach to extract action rules from a subclass of systems that have a specific nature, in which…

  11. Fast grasping of unknown objects using principal component analysis

    NASA Astrophysics Data System (ADS)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force-balance optimization is employed to analyze possible graspable areas, and the graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with grasp uncertainty, the robot is moved to obtain a new viewpoint. Virtual exploration and experimental tests were carried out to verify this fast grasping algorithm, and both simulation and experimental tests demonstrated excellent performance in grasping a series of unknown objects. To minimize grasping uncertainty, the robot hardware's two 3D cameras can be utilized to complete the partial point cloud, greatly enhancing grasping reliability. This research therefore has practical significance for increasing grasping speed, and thus robot efficiency, in unpredictable environments.
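
    The core geometric step - finding the principal axis of a single-view point cloud - is a few lines of eigen-analysis, as the sketch below shows on a synthetic elongated cloud; a real cloud would come from the robot's depth camera, and the grasp-candidate allocation along the axis is omitted.

        # Principal axis of a 3-D point cloud via PCA (eigen-decomposition
        # of the covariance matrix); grasps would be sampled along this axis.
        import numpy as np

        rng = np.random.default_rng(3)
        cloud = rng.normal(0, 1, (500, 3)) * np.array([50.0, 8.0, 8.0])

        centered = cloud - cloud.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
        principal_axis = eigvecs[:, np.argmax(eigvals)]   # largest eigenvalue
        print("principal axis:", np.round(principal_axis, 3))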

  12. Analysis and Recognition of Curve Type as The Basis of Object Recognition in Image

    NASA Astrophysics Data System (ADS)

    Nugraha, Nurma; Madenda, Sarifuddin; Indarti, Dina; Dewi Agushinta, R.; Ernastuti

    2016-06-01

    An object in an image, when analyzed further, shows characteristics that distinguish it from other objects in the image. The characteristics used for object recognition in an image can be color, shape, pattern, texture and spatial information that represent objects in the digital image. A method has recently been developed for image feature extraction that characterizes objects by curve analysis (simple curves) and searches for features using the object's chain code. This study develops an algorithm for the analysis and recognition of curve type as the basis of object recognition in images, proposing the addition of complex-curve characteristics with a maximum of four branches to be used in the object recognition process. A complex curve is defined as a curve that has a point of intersection. Using several edge-detected images, the algorithm was able to analyze and recognize complex curve shapes well.
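
    A minimal version of the chain-code feature the abstract mentions is sketched below: an ordered boundary is encoded as 8-direction Freeman codes, the raw material for later curve-type analysis. The toy square boundary and the direction numbering are this sketch's assumptions.

        # Freeman 8-direction chain code of an ordered boundary pixel list.
        DIRS = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
                (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}

        def chain_code(boundary):
            """Map successive (row, col) steps to direction codes 0-7."""
            return [DIRS[(r1 - r0, c1 - c0)]
                    for (r0, c0), (r1, c1) in zip(boundary, boundary[1:])]

        square = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2),
                  (2, 1), (2, 0), (1, 0), (0, 0)]
        print(chain_code(square))   # [0, 0, 6, 6, 4, 4, 2, 2]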

  13. Mineral element analyses of switchgrass biomass: comparison of the accuracy and precision of laboratories

    USDA-ARS?s Scientific Manuscript database

    Mineral concentration of plant biomass can affect its use in thermal conversion to energy. The objective of this study was to compare the precision and accuracy of university and private laboratories that conduct mineral analyses of plant biomass on a fee basis. Accuracy and precision of the laborat...

  14. Precision medicine: In need of guidance and surveillance.

    PubMed

    Lin, Jian-Zhen; Long, Jun-Yu; Wang, An-Qiang; Zheng, Ying; Zhao, Hai-Tao

    2017-07-28

    Precision medicine, currently a hotspot in mainstream medicine, has been strongly promoted in recent years. With rapid technological development, such as next-generation sequencing, and fierce competition in molecular targeted drug exploitation, precision medicine represents an advance in science and technology; it also fulfills needs in public health care. The clinical translation and application of precision medicine - especially in the prevention and treatment of tumors - is far from satisfactory; however, the aims of precision medicine deserve approval. Thus, this medical approach is currently in its infancy; it has promising prospects, but it needs to overcome a number of problems and deficiencies. It is expected that in addition to conventional symptoms and signs, precision medicine will define disease in terms of the underlying molecular characteristics and other environmental susceptibility factors. Those expectations should be realized by constructing a novel data network, integrating clinical data from individual patients and personal genomic background with existing research on the molecular makeup of diseases. In addition, multi-omics analysis and multi-discipline collaboration will become crucial elements in precision medicine. Precision medicine deserves strong support, and its development demands directed momentum. We propose three kinds of impetus (research, application and collaboration impetus) for such directed momentum toward promoting precision medicine and accelerating its clinical translation and application.

  15. Precision medicine: In need of guidance and surveillance

    PubMed Central

    Lin, Jian-Zhen; Long, Jun-Yu; Wang, An-Qiang; Zheng, Ying; Zhao, Hai-Tao

    2017-01-01

    Precision medicine, currently a hotspot in mainstream medicine, has been strongly promoted in recent years. With rapid technological development, such as next-generation sequencing, and fierce competition in molecular targeted drug exploitation, precision medicine represents an advance in science and technology; it also fulfills needs in public health care. The clinical translation and application of precision medicine - especially in the prevention and treatment of tumors - is far from satisfactory; however, the aims of precision medicine deserve approval. Thus, this medical approach is currently in its infancy; it has promising prospects, but it needs to overcome a number of problems and deficiencies. It is expected that in addition to conventional symptoms and signs, precision medicine will define disease in terms of the underlying molecular characteristics and other environmental susceptibility factors. Those expectations should be realized by constructing a novel data network, integrating clinical data from individual patients and personal genomic background with existing research on the molecular makeup of diseases. In addition, multi-omics analysis and multi-discipline collaboration will become crucial elements in precision medicine. Precision medicine deserves strong support, and its development demands directed momentum. We propose three kinds of impetus (research, application and collaboration impetus) for such directed momentum toward promoting precision medicine and accelerating its clinical translation and application. PMID:28811702

  16. Concepts and analysis for precision segmented reflector and feed support structures

    NASA Technical Reports Server (NTRS)

    Miller, Richard K.; Thomson, Mark W.; Hedgepeth, John M.

    1990-01-01

    Several issues surrounding the design of a large (20-meter diameter) Precision Segmented Reflector are investigated. The concerns include development of a reflector support truss geometry that will permit deployment into the required doubly-curved shape without significant member strains. For deployable and erectable reflector support trusses, the reduction of structural redundancy was analyzed to achieve reduced weight and complexity for the designs. The stiffness and accuracy of such reduced-member trusses, however, were found to be affected to an unexpected degree. The Precision Segmented Reflector designs were developed with performance requirements that represent the Reflector application. A novel deployable sunshade concept was developed, and a detailed parametric study of various feed support structural concepts was performed. The results of the detailed study reveal what may be the most desirable feed support structure geometry for Precision Segmented Reflector/Large Deployable Reflector applications.

  17. Causal diagrams and multivariate analysis II: precision work.

    PubMed

    Jupiter, Daniel C

    2014-01-01

    In this Investigators' Corner, I continue my discussion of when and why we researchers should include variables in multivariate regression. My examination focuses on studies comparing treatment groups and situations for which we can either exclude variables from multivariate analyses or include them for reasons of precision. Copyright © 2014 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Precision Joining Center

    NASA Astrophysics Data System (ADS)

    Powell, J. W.; Westphal, D. A.

    1991-08-01

    A workshop to obtain input from industry on the establishment of the Precision Joining Center (PJC) was held on July 10-12, 1991. The PJC is a center for training Joining Technologists in advanced joining techniques and concepts in order to promote the competitiveness of U.S. industry. The center will be established as part of the DOE Defense Programs Technology Commercialization Initiative, and operated by EG&G Rocky Flats in cooperation with the American Welding Society and the Colorado School of Mines Center for Welding and Joining Research. The overall objectives of the workshop were to validate the need for a Joining Technologist to fill the gap between the welding operator and the welding engineer, and to assure that the PJC will train individuals to satisfy that need. The consensus of the workshop participants was that the Joining Technologist is a necessary position in industry, and is currently used, with some variation, by many companies. It was agreed that the PJC core curriculum, as presented, would produce a Joining Technologist of value to industries that use precision joining techniques. The advantage of the PJC would be to train the Joining Technologist much more quickly and more completely. The proposed emphasis of the PJC curriculum on equipment-intensive and hands-on training was judged to be essential.

  19. Search Filter Precision Can Be Improved By NOTing Out Irrelevant Content

    PubMed Central

    Wilczynski, Nancy L.; McKibbon, K. Ann; Haynes, R. Brian

    2011-01-01

    Background: Most methodologic search filters developed for use in large electronic databases such as MEDLINE have low precision. One method that has been proposed but not tested for improving precision is NOTing out irrelevant content. Objective: To determine if search filter precision can be improved by NOTing out the text words and index terms assigned to those articles that are retrieved but are off-target. Design: Analytic survey. Methods: NOTing out unique terms in off-target articles and testing search filter performance in the Clinical Hedges Database. Main Outcome Measures: Sensitivity, specificity, precision and number needed to read (NNR). Results: For all purpose categories (diagnosis, prognosis and etiology) except treatment and for all databases (MEDLINE, EMBASE, CINAHL and PsycINFO), constructing search filters that NOTed out irrelevant content resulted in substantive improvements in NNR (over four-fold for some purpose categories and databases). Conclusion: Search filter precision can be improved by NOTing out irrelevant content. PMID:22195215
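
    The mechanics of the filter refinement can be shown on a toy corpus: terms unique to retrieved-but-irrelevant records are NOTed out, and precision and NNR are recomputed. The records, terms, and base filter below are all invented for illustration.

        # NOT out noise terms found only in off-target records, then compare
        # precision and number needed to read (NNR = 1 / precision).
        records = [
            ("randomised trial of drug X", True),
            ("cohort study of drug X", True),
            ("editorial opinion on drug X", False),
            ("news item about drug X", False),
        ]
        noise_terms = ("editorial", "news")   # unique to irrelevant records

        def base(text):
            return "drug x" in text.lower()

        def refined(text):
            return base(text) and not any(t in text.lower()
                                          for t in noise_terms)

        for name, flt in [("base filter", base), ("with NOT terms", refined)]:
            hits = [relevant for text, relevant in records if flt(text)]
            precision = sum(hits) / len(hits)
            print(f"{name}: precision={precision:.2f}, NNR={1/precision:.1f}")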

  20. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    NASA Astrophysics Data System (ADS)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    The 2011 Great East Japan Earthquake (GEJE) showed that tsunami disasters are not limited to inundation damage in a specified region, but may destroy a wide area, causing a major disaster. Evaluating land structures and the damage to them requires highly precise evaluation of three-dimensional fluid motion - an expensive process. Our research goals were thus to develop a calculation system coupling STOC-CADMAS (Arikawa and Tomita, 2016) with structure analysis (Arikawa et al., 2009) that efficiently covers all stages from the tsunami source to runup, including the deformation of structures, and to verify its applicability. We also investigated the stability of the breakwaters at Kamaishi Bay. Fig. 1 shows the whole calculation system. The STOC-ML simulator approximates pressure as hydrostatic and calculates wave profiles from an equation of continuity, thereby lowering calculation cost; it primarily covers the region from the epicenter to shallow water. STOC-IC solves pressure from a Poisson equation to account for shallower, more complex topography, but slightly reduces computation cost by setting the water surface from an equation of continuity; it calculates the area near a port. CS3D solves a Navier-Stokes equation and sets the water surface by VOF to deal with the runup area, with its complex surfaces of overflows and bores. STR performs the structure analysis, including geotechnical analysis, based on Biot's formulation. By coupling these, the system efficiently calculates the tsunami profile from propagation to inundation. The numerical results were compared with the physical experiments of Arikawa et al. (2012) and showed good agreement. Finally, the system was applied to the local situation at Kamaishi Bay; almost all of the breakwaters were washed away, similar to the actual damage at Kamaishi Bay. REFERENCES T. Arikawa and T. Tomita (2016): "Development of High Precision Tsunami Runup

  1. An object-mediated updating account of insensitivity to transsaccadic change

    PubMed Central

    Tas, A. Caglar; Moore, Cathleen M.; Hollingworth, Andrew

    2012-01-01

    Recent evidence has suggested that relatively precise information about the location and visual form of a saccade target object is retained across a saccade. However, this information appears to be available for report only when the target is removed briefly, so that the display is blank when the eyes land. We hypothesized that the availability of precise target information depends on whether a post-saccade object is mapped to the same object representation established for the presaccade target. If so, then the post-saccade features of the target overwrite the presaccade features, a process of object-mediated updating in which visual masking is governed by object continuity. In two experiments, participants' sensitivity to the spatial displacement of a saccade target was improved when that object changed surface feature properties across the saccade, consistent with the prediction of the object-mediated updating account. Transsaccadic perception appears to depend on a mechanism of object-based masking that is observed across multiple domains of vision. In addition, the results demonstrate that surface-feature continuity contributes to visual stability across saccades. PMID:23092946

  2. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  3. Precision force sensing with optically-levitated nanospheres

    NASA Astrophysics Data System (ADS)

    Geraci, Andrew

    2017-04-01

    In high vacuum, optically-trapped dielectric nanospheres achieve excellent decoupling from their environment and experience minimal friction, making them ideal for precision force sensing. We have shown that 300 nm silica spheres can be used for calibrated zeptonewton force measurements in a standing-wave optical trap. In this optical potential, the known spacing of the standing wave anti-nodes can serve as an independent calibration tool for the displacement spectrum of the trapped particle. I will describe our progress towards using these sensors for tests of the Newtonian gravitational inverse square law at micron length scales. Optically levitated dielectric objects also show promise for a variety of other precision sensing applications, including searches for gravitational waves and other experiments in quantum optomechanics. National Science Foundation PHY-1205994, PHY-1506431, PHY-1509176.

  4. Brief Introductory Psychology Textbooks: An Objective Analysis Update

    ERIC Educational Resources Information Center

    Griggs, Richard A.; Jackson, Sherri L.

    2013-01-01

    It has been 12 years since the last objective analysis of brief introductory psychology textbooks was published and 13 years since the textbook copyright period used in that study, 1997-2000. Given the importance of informed textbook evaluation and selection to the introductory course but the difficulty of this task because of the large number of…

  5. Objective analysis of toolmarks in forensics

    NASA Astrophysics Data System (ADS)

    Grieve, Taylor N.

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc., the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profiles of two different marks are characterized and the marks' cross-sections are run through a comparative statistical algorithm to acquire a value intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm's application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge's primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.
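
    One common shape such comparative statistics take is a sliding normalized correlation between surface profiles; the hedged sketch below scores a same-tool and a different-tool mark against a reference signature. It illustrates the general approach only - it is not the specific algorithm discussed - and the profiles are synthetic.

        # Similarity score: maximum Pearson correlation of a questioned
        # mark's profile against windows of a reference profile (synthetic).
        import numpy as np

        rng = np.random.default_rng(4)
        reference = rng.normal(0, 1, 300)                 # reference profile
        same_tool = reference[50:190] + rng.normal(0, 0.2, 140)
        diff_tool = rng.normal(0, 1, 140)

        def max_correlation(ref, mark, max_lag=160):
            return max(np.corrcoef(ref[lag:lag + mark.size], mark)[0, 1]
                       for lag in range(max_lag))

        print("same tool:     ", round(max_correlation(reference, same_tool), 2))
        print("different tool:", round(max_correlation(reference, diff_tool), 2))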

  6. Objective analysis of toolmarks in forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grieve, Taylor N.

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc., the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profiles of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.

  7. High-precision radius automatic measurement using laser differential confocal technology

    NASA Astrophysics Data System (ADS)

    Jiang, Hongwei; Zhao, Weiqian; Yang, Jiamiao; Guo, Yongkui; Xiao, Yang

    2015-02-01

    A high-precision automatic radius measurement method using laser differential confocal technology is proposed. Based on the properties of the axial intensity curve - its null point precisely corresponds to the focus of the objective, and it is bipolar - the method uses composite PID (proportional-integral-derivative) control to ensure steady motor movement during quick-trigger scanning, uses least-squares linear fitting to obtain the cat-eye and confocal positions, and then calculates the radius of curvature of the lens. By setting the number of measurement repetitions, precise automatic repeated measurement of the radius of curvature is achieved. Experiments indicate that the method has a measurement accuracy of better than 2 ppm and a repeatability of better than 0.05 μm. In comparison with the existing manual single measurement, this method has high measurement precision, strong immunity to environmental interference, and a repeatability roughly one tenth that of the former.
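
    The least-squares step is the heart of the focus location: near the null the differential confocal response is almost linear in axial position, so a fitted line's zero crossing gives the focus. The tanh response model and noise level below are illustrative stand-ins for the real axial curve.

        # Locate the differential confocal null point by least-squares line
        # fitting around the zero crossing (simulated axial response).
        import numpy as np

        rng = np.random.default_rng(5)
        z = np.linspace(-2.0, 2.0, 41)                  # axial positions (um)
        response = np.tanh(1.5 * (z - 0.3)) + rng.normal(0, 0.01, z.size)

        near_null = np.abs(response) < 0.5              # quasi-linear region
        slope, intercept = np.polyfit(z[near_null], response[near_null], 1)
        z_focus = -intercept / slope                    # fitted zero crossing
        print(f"estimated focus: {z_focus:+.3f} um (true +0.300 um)")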

  8. Precision powder feeder

    DOEpatents

    Schlienger, M. Eric; Schmale, David T.; Oliver, Michael S.

    2001-07-10

    A new class of precision powder feeders is disclosed. These feeders provide a precision flow of a wide range of powdered materials, while remaining robust against jamming or damage. These feeders can be precisely controlled by feedback mechanisms.

  9. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    ERIC Educational Resources Information Center

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  10. Anticipatory scaling of grip forces when lifting objects of everyday life.

    PubMed

    Hermsdörfer, Joachim; Li, Yong; Randerath, Jennifer; Goldenberg, Georg; Eidenmüller, Sandra

    2011-07-01

    The ability to predict and anticipate the mechanical demands of the environment promotes smooth and skillful motor actions. Thus, the finger forces produced to grasp and lift an object are scaled to its physical properties, such as weight. While grip force scaling is well established for neutral objects, only a few studies have analyzed objects known from daily routine, and none studied grip forces. In the present study, eleven healthy subjects each lifted twelve objects of everyday life that encompassed a wide range of weights. The finger pads were covered with force sensors that enabled the measurement of grip force, and a scale registered load forces. In a control experiment, the objects were wrapped in paper to prevent recognition by the subjects. Data from the first lift of each object confirmed that object weight was anticipated by adequately scaled forces. The maximum grip force rate during the force increase phase emerged as the most reliable measure to verify that weight was actually predicted and to characterize the precision of this prediction, whereas other force measures were scaled to object weight even when object identity was not known. Variability and linearity of the grip force-weight relationship improved for time points after liftoff, suggesting that sensory information refined the force adjustment. The same mechanism appeared to be involved with unrecognizable objects, though a lower precision was reached. Repeated lifting of the same object within a second and third presentation block did not improve the precision of the grip force scaling: either practice was too variable or the motor system does not prioritize the optimization of the internal representation when objects are highly familiar.

  11. voom: precision weights unlock linear model analysis tools for RNA-seq read counts

    PubMed Central

    2014-01-01

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods. PMID:24485249

  12. voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.

    PubMed

    Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K

    2014-02-03

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods.
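
    The following sketch illustrates the voom idea on simulated counts: estimate a lowess trend of square-root standard deviation against mean log-count, then convert the interpolated trend into inverse-variance observation weights. It is a simplification (the published method interpolates at fitted values from a design-based linear model and feeds the weights to limma); counts and parameters here are invented.

```python
# A minimal sketch of the voom idea (not the limma implementation): estimate
# the mean-variance trend of log-counts and turn it into per-observation
# precision weights. Counts and samples are simulated for illustration.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
counts = rng.negative_binomial(n=5, p=0.01, size=(2000, 6))  # genes x samples
lib = counts.sum(axis=0)
logcpm = np.log2((counts + 0.5) / (lib + 1.0) * 1e6)

mean_log = logcpm.mean(axis=1)
sqrt_sd = np.sqrt(logcpm.std(axis=1, ddof=1))

# Lowess trend of sqrt-standard-deviation versus mean log-count.
trend = lowess(sqrt_sd, mean_log, frac=0.5, return_sorted=True)

# Interpolate the trend at each observation and take
# weight = predicted variance**-1 = trend**-4, as in the voom paper.
pred = np.interp(logcpm, trend[:, 0], trend[:, 1])
weights = pred ** -4.0
print(weights.shape, weights.min(), weights.max())
```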

  13. Structural Analysis and Testing of an Erectable Truss for Precision Segmented Reflector Application

    NASA Technical Reports Server (NTRS)

    Collins, Timothy J.; Fichter, W. B.; Adams, Richard R.; Javeed, Mehzad

    1995-01-01

    This paper describes analysis and test results obtained at Langley Research Center (LaRC) on a doubly curved testbed support truss for precision reflector applications. Descriptions of test procedures and experimental results that expand upon previous investigations are presented. A brief description of the truss is given, and finite-element-analysis models are described. Static-load and vibration test procedures are discussed, and experimental results are shown to be repeatable and in generally good agreement with linear finite-element predictions. Truss structural performance (as determined by static deflection and vibration testing) is shown to be predictable and very close to linear. Vibration test results presented herein confirm that an anomalous mode observed during initial testing was due to the flexibility of the truss support system. Photogrammetric surveys with two 131-in. reference scales show that the root-mean-square (rms) truss-surface accuracy is about 0.0025 in. Photogrammetric measurements also indicate that the truss coefficient of thermal expansion (CTE) is in good agreement with that predicted by analysis. A detailed description of the photogrammetric procedures is included as an appendix.

  14. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    NASA Astrophysics Data System (ADS)

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-computed tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually, investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or to analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). CT analysis made it possible to obtain virtual slices and information about the internal structure of these objects. With this technique the underlying metalwork was studied without removing the decorative enamel layer. Moreover, visible defects such as cracks were measured in both width and depth, and previously invisible defects and weaker areas were visualised. All these features are of great interest to restorers and conservators, as they allow a view inside these objects without so much as touching them.

  15. Precision thermometry and the quantum speed limit

    NASA Astrophysics Data System (ADS)

    Campbell, Steve; Genoni, Marco G.; Deffner, Sebastian

    2018-04-01

    We assess precision thermometry for an arbitrary single quantum system. For a d-dimensional harmonic system we show that the gap sets a single temperature that can be optimally estimated. Furthermore, we establish a simple linear relationship between the gap and this temperature, and show that the precision exhibits a quadratic relationship. We extend our analysis to explore systems with arbitrary spectra, showing that exploiting anharmonicity and degeneracy can greatly enhance the precision of thermometry. Finally, we critically assess the dynamical features of two thermometry protocols for a two level system. By calculating the quantum speed limit we find that, despite the gap fixing a preferred temperature to probe, there is no evidence of this emerging in the dynamical features.
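
    A small numeric illustration of the gap-temperature relationship (not the paper's derivation): for a two-level system with gap D, in units with k_B = 1, the excited-state population of the thermal state is p = 1/(1 + exp(D/T)), and the Fisher information about T carried by a population measurement is F(T) = (D/T^2)^2 p(1 - p). Maximizing F numerically shows one preferred temperature per gap, with a constant ratio T_opt/D, i.e. a linear relationship.

```python
# Numeric sketch: for a two-level system with gap D (k_B = 1), the thermal
# excited-state population is p = 1 / (1 + exp(D/T)), and the Fisher
# information about T from a population measurement is
# F(T) = (D/T**2)**2 * p * (1 - p). One optimal T exists per gap, linear in D.
import numpy as np

def fisher(T, D):
    p = 1.0 / (1.0 + np.exp(D / T))
    return (D / T**2) ** 2 * p * (1.0 - p)

T = np.linspace(0.01, 5.0, 200000)
for D in (0.5, 1.0, 2.0):
    T_opt = T[np.argmax(fisher(T, D))]
    print(f"gap {D:.1f}: optimal T = {T_opt:.3f}, T_opt/D = {T_opt / D:.3f}")
```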

  16. Negative emotion enhances mnemonic precision and subjective feelings of remembering in visual long-term memory.

    PubMed

    Xie, Weizhen; Zhang, Weiwei

    2017-09-01

    Negative emotion sometimes enhances memory (higher accuracy and/or vividness, e.g., flashbulb memories). The present study investigates whether it is the qualitative (precision) or quantitative (the probability of successful retrieval) aspect of memory that drives these effects. In a visual long-term memory task, observers memorized colors (Experiment 1a) or orientations (Experiment 1b) of sequentially presented everyday objects under negative, neutral, or positive emotions induced with International Affective Picture System images. In a subsequent test phase, observers reconstructed objects' colors or orientations using the method of adjustment. We found that mnemonic precision was enhanced under the negative condition relative to the neutral and positive conditions. In contrast, the probability of successful retrieval was comparable across the emotion conditions. Furthermore, the boost in memory precision was associated with elevated subjective feelings of remembering (vividness and confidence) and metacognitive sensitivity in Experiment 2. Altogether, these findings suggest a novel precision-based account for emotional memories. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Expertise for upright faces improves the precision but not the capacity of visual working memory.

    PubMed

    Lorenc, Elizabeth S; Pratte, Michael S; Angeloni, Christopher F; Tong, Frank

    2014-10-01

    Considerable research has focused on how basic visual features are maintained in working memory, but little is currently known about the precision or capacity of visual working memory for complex objects. How precisely can an object be remembered, and to what extent might familiarity or perceptual expertise contribute to working memory performance? To address these questions, we developed a set of computer-generated face stimuli that varied continuously along the dimensions of age and gender, and we probed participants' memories using a method-of-adjustment reporting procedure. This paradigm allowed us to separately estimate the precision and capacity of working memory for individual faces, on the basis of the assumptions of a discrete capacity model, and to assess the impact of face inversion on memory performance. We found that observers could maintain up to four to five items on average, with equally good memory capacity for upright and upside-down faces. In contrast, memory precision was significantly impaired by face inversion at every set size tested. Our results demonstrate that the precision of visual working memory for a complex stimulus is not strictly fixed but, instead, can be modified by learning and experience. We find that perceptual expertise for upright faces leads to significant improvements in visual precision, without modifying the capacity of working memory.
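
    As a rough sketch of how precision and capacity can be separated under a discrete-capacity mixture model (in the spirit of such models generally, not the authors' exact analysis), report errors can be fit as a mixture of a von Mises component centred on the target and a uniform guessing component; the concentration kappa indexes precision, and the non-guess proportion times set size estimates capacity. All data and parameters below are simulated.

```python
# A minimal mixture-model sketch for method-of-adjustment data: errors are a
# mixture of a von Mises around the target (in-memory trials, kappa ~
# precision) and a uniform guess distribution. Simulated data, not the study's.
import numpy as np
from scipy.optimize import minimize
from scipy.special import i0  # modified Bessel function of order 0

def neg_log_lik(params, errors):
    g, kappa = params                     # g = guess rate, kappa = precision
    vm = np.exp(kappa * np.cos(errors)) / (2 * np.pi * i0(kappa))
    return -np.sum(np.log((1 - g) * vm + g / (2 * np.pi)))

# Simulated errors (radians): 70% remembered trials, 30% guesses.
rng = np.random.default_rng(1)
remembered = rng.vonmises(0.0, 8.0, size=700)
guesses = rng.uniform(-np.pi, np.pi, size=300)
errors = np.concatenate([remembered, guesses])

fit = minimize(neg_log_lik, x0=[0.5, 2.0], args=(errors,),
               bounds=[(1e-3, 1 - 1e-3), (0.1, 100.0)])
g_hat, kappa_hat = fit.x
set_size = 6  # hypothetical
print(f"precision kappa = {kappa_hat:.2f}, "
      f"capacity K = {(1 - g_hat) * set_size:.2f}")
```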

  18. Geographic Object-Based Image Analysis – Towards a new paradigm

    PubMed Central

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  19. Correlating Subjective and Objective Sleepiness: Revisiting the Association Using Survival Analysis

    PubMed Central

    Aurora, R. Nisha; Caffo, Brian; Crainiceanu, Ciprian; Punjabi, Naresh M.

    2011-01-01

    Study Objectives: The Epworth Sleepiness Scale (ESS) and multiple sleep latency test (MSLT) are the most commonly used measures of subjective and objective sleepiness, respectively. The strength of the association between these measures as well as the optimal ESS threshold that indicates objective sleepiness remains a topic of significant interest in the clinical and research arenas. The current investigation sought to: (a) examine the association between the ESS and the average sleep latency from the MSLT using the techniques of survival analysis; (b) determine whether specific patient factors influence the association; (c) examine the utility of each ESS question; and (d) identify the optimal ESS threshold that indicates objective sleepiness. Design: Cross-sectional study. Patients and Settings: Patients (N = 675) referred for polysomnography and MSLT. Measurements and Results: Using techniques of survival analysis, a significant association was noted between the ESS score and the average sleep latency. The adjusted hazard ratios for sleep onset during the MSLT for the ESS quartiles were 1.00 (ESS < 9), 1.32 (ESS: 10–13), 1.85 (ESS: 14-17), and 2.53 (ESS ≥ 18), respectively. The association was independent of several patient factors and was distinct for the 4 naps. Furthermore, most of the ESS questions were individually predictive of the average sleep latency except the tendency to doze off when lying down to rest in the afternoon, which was only predictive in patients with less than a college education. Finally, an ESS score ≥ 13 optimally predicted an average sleep latency < 8 minutes. Conclusions: In contrast to previous reports, the association between the ESS and the average sleep latency is clearly apparent when the data are analyzed by survival analysis, and most of the ESS questions are predictive of objective sleepiness. An ESS score ≥ 13 most effectively predicts objective sleepiness, which is higher than what has typically been used in
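
    The survival-analysis framing can be sketched with simulated data as follows (this is not the study's dataset or code; the `lifelines` package and every parameter below are assumptions): nap latency is the time-to-event outcome, naps ending without sleep onset are censored, and the ESS score enters a Cox proportional-hazards model, yielding a hazard ratio for sleep onset per ESS point.

```python
# A schematic re-creation of the analysis style on simulated data: treat
# sleep-onset latency as time-to-event, censor naps that end without sleep,
# and regress on ESS with a Cox proportional-hazards model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 500
ess = rng.integers(0, 25, n)
# Higher ESS -> shorter latency (higher hazard of sleep onset); 20 min cap.
latency = rng.exponential(scale=15.0 * np.exp(-0.05 * ess))
observed = latency < 20.0                   # naps ending awake are censored
latency = np.minimum(latency, 20.0)

df = pd.DataFrame({"latency": latency,
                   "slept": observed.astype(int),
                   "ess": ess})
cph = CoxPHFitter()
cph.fit(df, duration_col="latency", event_col="slept")
print(cph.summary[["coef", "exp(coef)"]])   # hazard ratio per ESS point
```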

  20. Stochastic precision analysis of 2D cardiac strain estimation in vivo

    NASA Astrophysics Data System (ADS)

    Bunting, E. A.; Provost, J.; Konofagou, E. E.

    2014-11-01

    Ultrasonic strain imaging has been applied to echocardiography and carries great potential to be used as a tool in the clinical setting. Two-dimensional (2D) strain estimation may be useful when studying the heart due to the complex, 3D deformation of the cardiac tissue. Increasing the framerate used for motion estimation, i.e. motion estimation rate (MER), has been shown to improve the precision of the strain estimation, although maintaining the spatial resolution necessary to view the entire heart structure in a single heartbeat remains challenging at high MERs. Two previously developed methods, the temporally unequispaced acquisition sequence (TUAS) and the diverging beam sequence (DBS), have been used in the past to successfully estimate in vivo axial strain at high MERs without compromising spatial resolution. In this study, a stochastic assessment of 2D strain estimation precision is performed in vivo for both sequences at varying MERs (65, 272, 544, 815 Hz for TUAS; 250, 500, 1000, 2000 Hz for DBS). 2D incremental strains were estimated during left ventricular contraction in five healthy volunteers using a normalized cross-correlation function and a least-squares strain estimator. Both sequences were shown capable of estimating 2D incremental strains in vivo. The conditional expected value of the elastographic signal-to-noise ratio (E(SNRe|ɛ)) was used to compare strain estimation precision of both sequences at multiple MERs over a wide range of clinical strain values. The results here indicate that axial strain estimation precision is much more dependent on MER than lateral strain estimation, while lateral estimation is more affected by strain magnitude. MER should be increased at least above 544 Hz to avoid suboptimal axial strain estimation. Radial and circumferential strain estimations were influenced by the axial and lateral strain in different ways. Furthermore, the TUAS and DBS were found to be of comparable precision at similar MERs.
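
    The least-squares strain estimator mentioned above reduces, in one dimension, to the slope of a local line fit to displacement versus depth. The sketch below applies it to a synthetic axial displacement profile; the kernel size, sampling, and noise are invented, and the real processing operates on 2D displacement fields obtained by normalized cross-correlation.

```python
# A minimal least-squares strain estimator: incremental strain is the slope
# of a local line fit to the displacement-versus-depth profile inside a
# sliding kernel. Synthetic data, not the in vivo acquisitions.
import numpy as np

def ls_strain(displacement, dz, kernel=21):
    """Slope of displacement vs depth in a sliding window (closed-form LS)."""
    half = kernel // 2
    z = np.arange(kernel) * dz
    z = z - z.mean()
    denom = np.sum(z * z)
    strain = np.full(displacement.size, np.nan)
    for i in range(half, displacement.size - half):
        d = displacement[i - half:i + half + 1]
        strain[i] = np.sum(z * (d - d.mean())) / denom
    return strain

# Synthetic axial displacement: 1% compression plus tracking noise.
dz = 0.05                                   # mm between samples (hypothetical)
depth = np.arange(0, 40, dz)
disp = -0.01 * depth + 0.002 * np.random.randn(depth.size)
print(np.nanmean(ls_strain(disp, dz)))      # approximately -0.01
```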

  1. Design and algorithm research of high precision airborne infrared touch screen

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Bing; Wang, Shuang-Jie; Fu, Yan; Chen, Zhao-Quan

    2016-10-01

    Infrared touch screens suffer from low precision, touch jitter, and a sharp drop in touch precision when emitting or receiving tubes fail. A high-precision positioning algorithm based on an extended axis is proposed to solve these problems. First, the unimpeded state of the beam between an emitting and a receiving tube is recorded as 0, while the impeded state is recorded as 1. Then an oblique scanning method is used, in which the light of one emitting tube is received by five receiving tubes, and the impeded-state information of all emitting and receiving tubes is collected as a matrix. Finally, the position of the touch object is calculated by arithmetic averaging. The extended-axis positioning algorithm retains high precision even when individual infrared tubes fail, with only a slight loss of accuracy. Experimental results show that over 90% of the display area the touch error is less than 0.25D, where D is the distance between adjacent emitting tubes. The algorithm based on the extended axis thus offers high precision, little degradation when individual infrared tubes fail, and ease of use.
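
    A toy rendition of the final averaging step is sketched below: each blocked emitter-receiver beam is reduced to a representative crossing point, and the touch coordinate is taken as the arithmetic mean of those points. The geometry, fan width, and blocking rule are invented for illustration; the published extended-axis construction is not reproduced.

```python
# Toy sketch of the averaging step: reduce each blocked oblique beam to its
# midline crossing, then average. Geometry and blocking rule are invented.
import numpy as np

D = 5.0                                     # spacing of emitting tubes (mm)
emitters  = np.arange(32) * D               # x positions on one edge
receivers = np.arange(32) * D               # x positions on opposite edge

# Hypothetical impeded-state matrix: blocked[i, j] = 1 if the beam from
# emitter i to receiver j (|i - j| <= 2, an oblique fan of five) is cut.
touch_x, half_width = 77.0, 4.0
blocked = np.zeros((32, 32), dtype=int)
for i in range(32):
    for j in range(max(0, i - 2), min(32, i + 3)):
        mid = 0.5 * (emitters[i] + receivers[j])   # beam midline crossing
        blocked[i, j] = int(abs(mid - touch_x) <= half_width)

mids = [0.5 * (emitters[i] + receivers[j])
        for i, j in zip(*np.nonzero(blocked))]
print(f"estimated touch x = {np.mean(mids):.2f} mm (true {touch_x} mm)")
```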

  2. A decision analysis approach for risk management of near-earth objects

    NASA Astrophysics Data System (ADS)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in
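
    Steps two through five of this process can be miniaturized as a weighted-sum ranking, sketched below. The strategies, scores, and weights are illustrative placeholders, not values from the paper; a real analysis would elicit them from stakeholders and follow with the sensitivity analysis of step six.

```python
# Schematic sketch of objectives, measures, alternatives, and ranking.
# All names, scores, and weights are invented for illustration.
ALTERNATIVES = {
    # objective scores on a common 0-1 scale:
    # (risk reduced, cost kept low, mission reliability)
    "observe only":       (0.10, 0.95, 0.90),
    "kinetic impactor":   (0.70, 0.50, 0.60),
    "gravity tractor":    (0.55, 0.40, 0.75),
    "nuclear deflection": (0.90, 0.20, 0.45),
}
WEIGHTS = (0.5, 0.2, 0.3)   # stakeholder value trade-offs (assumed)

def score(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

ranking = sorted(ALTERNATIVES.items(),
                 key=lambda kv: score(kv[1], WEIGHTS), reverse=True)
for name, s in ranking:
    print(f"{score(s, WEIGHTS):.3f}  {name}")

# Step six (sensitivity analysis) would repeat the ranking under varied WEIGHTS.
```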

  3. Multiresolution saliency map based object segmentation

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Wang, Xin; Dai, ZhenYou

    2015-11-01

    Salient object detection and segmentation have gained increasing research interest in recent years. A saliency map can be obtained from different models presented in previous studies, and from this saliency map the most salient region (MSR) in an image can be extracted. This MSR, generally a rectangle, can be used to initialize the parameters of object segmentation algorithms. However, to our knowledge, all of those saliency maps are represented at a single resolution, although some models have introduced multiscale principles into the calculation process. Furthermore, some segmentation methods, such as the well-known GrabCut algorithm, need more iteration time or additional interactions to reach precise results when pixel types are not predefined. Here, the concept of a multiresolution saliency map is introduced. This saliency map is provided in a multiresolution format, which naturally follows the principle of the human visual mechanism. Moreover, the points in this map can be used to initialize parameters for GrabCut segmentation by labeling the feature pixels automatically. Both computing speed and segmentation precision are evaluated, and the results imply that this multiresolution saliency map-based object segmentation method is simple and efficient.
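
    One way to realize this idea with standard tools (a generic illustration, not the authors' implementation) is to coarsen a saliency map with a Gaussian pyramid and use the coarse level to label probable foreground and background pixels for mask-initialized GrabCut, replacing the usual hand-drawn rectangle. OpenCV is assumed; `img` is an 8-bit BGR image and `saliency` a float32 map in [0, 1] aligned with it.

```python
# Rough sketch: coarse pyramid level of a saliency map seeds a GrabCut mask.
import cv2
import numpy as np

def grabcut_from_saliency(img, saliency, levels=3, fg_thr=0.7, bg_thr=0.2):
    coarse = saliency.copy()
    for _ in range(levels):                 # coarser levels suppress noise
        coarse = cv2.pyrDown(coarse)
    coarse = cv2.resize(coarse, (img.shape[1], img.shape[0]))

    mask = np.full(img.shape[:2], cv2.GC_PR_BGD, np.uint8)
    mask[coarse > fg_thr] = cv2.GC_PR_FGD   # salient at the coarse scale
    mask[coarse < bg_thr] = cv2.GC_BGD      # clearly non-salient

    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(img, mask, None, bgd, fgd, 5, cv2.GC_INIT_WITH_MASK)
    return np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0)

# Usage: seg = grabcut_from_saliency(img, saliency_map)
```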

  4. The Analysis of Object-Based Change Detection in Mining Area: a Case Study with Pingshuo Coal Mine

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Zhou, W.; Li, Y.

    2017-09-01

    Accurate information on mining land use and land cover change is crucial for monitoring and environmental change studies. In this paper, a RapidEye remote sensing image (2012) and a SPOT7 remote sensing image (2015) of the Pingshuo mining area are used to monitor changes through a combination of object-based classification and change vector analysis; we also used R for mining land classification from the high-resolution imagery, demonstrating the feasibility and flexibility of open-source software. The results show that (1) the classification of reclaimed mining land has high precision; the overall accuracy and kappa coefficient of the classification of the change-region map were 86.67% and 89.44%, indicating that object-based classification combined with change vector analysis can improve monitoring accuracy for mining land, and especially for reclaimed mining land; (2) the share of vegetation fell from 46% to 40% of the total area between 2012 and 2015, with most of the loss converted to arable land, while the combined share of arable land and vegetation increased from 51% to 70%; built-up land increased somewhat and part of the water area was converted to arable land, though neither change was large. These results illustrate the transformation of the reclaimed mining area; at the same time, some land is still being converted to mining land, showing that the mine is still operating and that mining land use and land cover are a dynamic process.

  5. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    NASA Astrophysics Data System (ADS)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaiced LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is essential for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. Primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble Realworks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point

  6. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    PubMed

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the precise space required for designing an optimal prosthesis. TRS consists of internal and external tooth spaces that determine the esthetics and function of the final restoration. Assisted by quantitative analysis and transfer, TRS quantitative analysis is therefore a significant improvement for minimal tooth preparation. This article presents TRS quantity-related measurement, analysis, and transfer, and the internal relationships among the three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  7. DeepID-Net: Deformable Deep Convolutional Neural Networks for Object Detection.

    PubMed

    Ouyang, Wanli; Zeng, Xingyu; Wang, Xiaogang; Qiu, Shi; Luo, Ping; Tian, Yonglong; Li, Hongsheng; Yang, Shuo; Wang, Zhe; Li, Hongyang; Loy, Chen Change; Wang, Kun; Yan, Junjie; Tang, Xiaoou

    2016-07-07

    In this paper, we propose deformable deep convolutional neural networks for generic object detection. This new deep learning object detection framework has innovations in multiple aspects. In the proposed new deep architecture, a new deformation constrained pooling (def-pooling) layer models the deformation of object parts with geometric constraint and penalty. A new pre-training strategy is proposed to learn feature representations more suitable for the object detection task and with good generalization capability. By changing net structures and training strategies, and by adding and removing key components in the detection pipeline, a set of models with large diversity is obtained, which significantly improves the effectiveness of model averaging. The proposed approach improves the mean average precision obtained by RCNN [16], which was the state of the art, from 31% to 50.3% on the ILSVRC2014 detection test set. It also outperforms the winner of ILSVRC2014, GoogLeNet, by 6.1%. Detailed component-wise analysis is also provided through extensive experimental evaluation, which provides a global view for people to understand the deep learning object detection pipeline.

  8. Tendency for interlaboratory precision in the GMO analysis method based on real-time PCR.

    PubMed

    Kodama, Takashi; Kurosawa, Yasunori; Kitta, Kazumi; Naito, Shigehiro

    2010-01-01

    The Horwitz curve estimates interlaboratory precision as a function only of concentration and is frequently used as a method performance criterion in food analysis with chemical methods. Quantitative biochemical methods based on real-time PCR require an analogous criterion to progressively promote method validation. We analyzed the tendency of precision using a simplex real-time PCR technique in 53 collaborative studies of seven genetically modified (GM) crops. The reproducibility standard deviation (S_R) and repeatability standard deviation (S_r) of the genetically modified organism (GMO) amount (%) were more or less independent of the GM crop (i.e., maize, soybean, cotton, oilseed rape, potato, sugar beet, and rice) and of the evaluation procedure steps: some studies evaluated whole steps consisting of DNA extraction and PCR quantitation, whereas others focused only on the PCR quantitation step by using DNA extraction solutions. Therefore, S_R and S_r for the GMO amount (%) are functions only of concentration, similar to the Horwitz curve. We propose S_R = 0.1971C^0.8685 and S_r = 0.1478C^0.8424, where C is the GMO amount (%). We also propose a method performance index for GMO quantitative methods that is analogous to the Horwitz ratio.
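
    Written out, the proposed precision functions are simple power laws in the GMO amount C (%), playing the same role as the Horwitz curve; the snippet below just tabulates them (the exponent placement follows the usual Horwitz power-law form, which is an assumption about the garbled source notation).

```python
# Transcription of the proposed precision power laws (C = GMO amount in %).
def s_R(c):   # reproducibility standard deviation
    return 0.1971 * c ** 0.8685

def s_r(c):   # repeatability standard deviation
    return 0.1478 * c ** 0.8424

for c in (0.1, 0.5, 1.0, 5.0):
    print(f"C = {c:4.1f}%  S_R = {s_R(c):.4f}  S_r = {s_r(c):.4f}")
```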

  9. An object correlation and maneuver detection approach for space surveillance

    NASA Astrophysics Data System (ADS)

    Huang, Jian; Hu, Wei-Dong; Xin, Qin; Du, Xiao-Yong

    2012-10-01

    Object correlation and maneuver detection are persistent problems in space surveillance and maintenance of a space object catalog. We integrate these two problems into one interrelated problem, and consider them simultaneously under a scenario where space objects only perform a single in-track orbital maneuver during the time intervals between observations. We mathematically formulate this integrated scenario as a maximum a posteriori (MAP) estimation. In this work, we propose a novel approach to solve the MAP estimation. More precisely, the corresponding posterior probability of an orbital maneuver and a joint association event can be approximated by the Joint Probabilistic Data Association (JPDA) algorithm. Subsequently, the maneuvering parameters are estimated by optimally solving the constrained non-linear least squares iterative process based on the second-order cone programming (SOCP) algorithm. The desired solution is derived according to the MAP criterions. The performance and advantages of the proposed approach have been shown by both theoretical analysis and simulation results. We hope that our work will stimulate future work on space surveillance and maintenance of a space object catalog.

  10. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    NASA Astrophysics Data System (ADS)

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.

  11. Department of Defense Tri-Service Precision Machine-Tool Program. Quarterly report, February--April 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-06-01

    Following a planning period during which the Lawrence Livermore Laboratory and the Department of Defense managing sponsor, the USAF Materials Laboratory, agreed on work statements, the Department of Defense Tri-Service Precision Machine-Tool Program began in February 1978. Milestones scheduled for the first quarter have been met. Tasks and manpower requirements for two basic projects, precision-machining commercialization (PMC) and a machine-tool task force (MTTF), were defined. Progress by PMC includes: (1) documentation of existing precision machine-tool technology by initiation and compilation of a bibliography containing several hundred entries; (2) identification of the problems and needs of precision turning-machine builders and of precision turning-machine users interested in developing high-precision machining capability; and (3) organization of the schedule and content of the first seminar, to be held in October 1978, which will bring together representatives from the machine-tool and optics communities to address the problems and begin the process of high-precision machining commercialization. Progress by MTTF includes: (1) planning for the organization of a team effort of approximately 60 to 80 international experts to contribute in various ways to project objectives, namely, to summarize state-of-the-art cutting-machine-tool technology and to identify areas where future R and D should prove technically and economically profitable; (2) preparation of a comprehensive plan to achieve those objectives; and (3) preliminary arrangements for a plenary session, also in October, when the task force will meet to formalize the details for implementing the plan.

  12. Optimization Techniques for Improving the Precision of Isotopic Analysis by Thermal Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Wang, G. Q.; Xu, J. F.; Wu-Yang, S. Q.

    2016-12-01

    In addition to instrument hardware, instrument operation and sample preparation are significant factors affecting the precision of TIMS analyses. We have reviewed the isotopic data of several standard materials at our TIMS lab over 5 years. The review suggests that several optimization techniques should be used in order to obtain high-precision isotopic ratio data. (1) It is important to choose a suitable filament material for isotopic measurements. We have established that W filament is likely the most efficient for ionizing Sr when selecting from W, Re, and Ta; meanwhile, Re filament can produce a higher intensity for Nd isotopes than W and Ta filaments can. It is concluded that the best TIMS signals are obtained for Sr using W single filaments and for Nd using Re double filaments. (2) The preparation of the activator plays a key role in the analysis of some isotopic ratios. This study indicates that choosing a suitable activator can greatly elevate the precision of 206Pb/204Pb ratios during Pb isotopic measurements. We have suggested a new scheme for making an activator by using a mixture of 10% Si-gel + 7.5% H3PO3 + 82.5% H2O (weight %). (3) It is necessary to re-set the cup configuration to avoid cup degradation when operating for a long period of time (a year or more). We propose a new cup configuration to avoid this disadvantage during Sr isotopic analyses. (4) The contamination of 187Re and 185Re after using Re filaments can be eliminated by cleaning the ion source and baking the source housing.

  13. Precision pointing and control of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Bantell, M. H., Jr.

    1987-01-01

    The problem and long term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principle tasks. Under Task 1, robust low order controllers, improved structural modeling methods for control applications and identification methods for structural dynamics are being developed. Under Task 2, a lab test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.

  14. COSMOS: Carnegie Observatories System for MultiObject Spectroscopy

    NASA Astrophysics Data System (ADS)

    Oemler, A.; Clardy, K.; Kelson, D.; Walth, G.; Villanueva, E.

    2017-05-01

    COSMOS (Carnegie Observatories System for MultiObject Spectroscopy) reduces multislit spectra obtained with the IMACS and LDSS3 spectrographs on the Magellan Telescopes. It can be used for the quick-look analysis of data at the telescope as well as for pipeline reduction of large data sets. COSMOS is based on a precise optical model of the spectrographs, which allows (after alignment and calibration) an accurate prediction of the location of spectra features. This eliminates the line search procedure which is fundamental to many spectral reduction programs, and allows a robust data pipeline to be run in an almost fully automatic mode, allowing large amounts of data to be reduced with minimal intervention.

  15. Microhartree precision in density functional theory calculations

    NASA Astrophysics Data System (ADS)

    Gulans, Andris; Kozhevnikov, Anton; Draxl, Claudia

    2018-04-01

    To address ultimate precision in density functional theory calculations we employ the full-potential linearized augmented plane-wave + local-orbital (LAPW + lo) method and justify its usage as a benchmark method. LAPW + lo and two completely unrelated numerical approaches, the multiresolution analysis (MRA) and the linear combination of atomic orbitals, yield total energies of atoms with mean deviations of 0.9 and 0.2 μHa, respectively. Spectacular agreement with the MRA is reached also for total and atomization energies of the G2-1 set consisting of 55 molecules. With the example of α-iron we demonstrate the capability of LAPW + lo to reach μHa/atom precision also for periodic systems, which also allows for the distinction between the numerical precision and the accuracy of a given functional.

  16. An object-based approach to weather analysis and its applications

    NASA Astrophysics Data System (ADS)

    Troemel, Silke; Diederich, Malte; Horvath, Akos; Simmer, Clemens; Kumjian, Matthew

    2013-04-01

    The research group 'Object-based Analysis and SEamless prediction' (OASE) within the Hans Ertel Centre for Weather Research programme (HErZ) pursues an object-based approach to weather analysis. The object-based tracking approach adopts the Lagrangian perspective by identifying and following convective events over the course of their lifetime. Prerequisites of the object-based analysis are a highly resolved observational database and a tracking algorithm. A near-real-time radar and satellite remote-sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. A 3D scale-space tracking algorithm identifies convective rain events in the dual composite and monitors their development over the course of their lifetime. The OASE group exploits the object-based approach in several fields of application: (1) for a better understanding and analysis of the precipitation processes responsible for extreme weather events, (2) in nowcasting, (3) as a novel approach for the validation of meso-γ atmospheric models, and (4) in data assimilation. Results from the different fields of application will be presented. The basic idea of the object-based approach is to identify a small set of radar- and satellite-derived descriptors which characterize the temporal development of the precipitation systems that constitute the objects. Proxies of the precipitation process include, for example, the temporal change of the brightband, vertically extensive columns of enhanced differential reflectivity ZDR, and the cloud-top temperatures and heights identified in the 4D field of ground-based radar reflectivities and satellite retrievals generated by a cell during its lifetime. They quantify (micro-)physical differences among rain events and relate to the precipitation yield. Analyses of the informative content of ZDR columns as a precursor of storm evolution, for example, will be presented to demonstrate

  17. Wireless sensor networks for heritage object deformation detection and tracking algorithm.

    PubMed

    Xie, Zhijun; Huang, Guangyan; Zarei, Roozbeh; He, Jing; Zhang, Yanchun; Ye, Hongwu

    2014-10-31

    Deformation is the direct cause of heritage object collapse, so it is important to monitor heritage objects and to signal early warnings of deformation. However, traditional heritage object monitoring methods only roughly monitor a simple-shaped heritage object as a whole and cannot monitor complicated heritage objects, which may have a large number of surfaces inside and outside. Wireless sensor networks, comprising many small-sized, low-cost, low-power intelligent sensor nodes, are better suited to detecting the deformation of every small part of a heritage object. Wireless sensor networks need an effective mechanism to reduce both communication costs and energy consumption in order to monitor heritage objects in real time. In this paper, we provide an effective heritage object deformation detection and tracking method using wireless sensor networks (EffeHDDT). In EffeHDDT, we discover a connected core set of sensor nodes to reduce the communication cost of transmitting and collecting the data of the sensor networks. In particular, we propose a heritage object boundary detecting and tracking mechanism. Both theoretical analysis and experimental results demonstrate that our EffeHDDT method outperforms existing methods in terms of network traffic and the precision of deformation detection.

  18. Wireless Sensor Networks for Heritage Object Deformation Detection and Tracking Algorithm

    PubMed Central

    Xie, Zhijun; Huang, Guangyan; Zarei, Roozbeh; He, Jing; Zhang, Yanchun; Ye, Hongwu

    2014-01-01

    Deformation is the direct cause of heritage object collapse, so it is important to monitor heritage objects and to signal early warnings of deformation. However, traditional heritage object monitoring methods only roughly monitor a simple-shaped heritage object as a whole and cannot monitor complicated heritage objects, which may have a large number of surfaces inside and outside. Wireless sensor networks, comprising many small-sized, low-cost, low-power intelligent sensor nodes, are better suited to detecting the deformation of every small part of a heritage object. Wireless sensor networks need an effective mechanism to reduce both communication costs and energy consumption in order to monitor heritage objects in real time. In this paper, we provide an effective heritage object deformation detection and tracking method using wireless sensor networks (EffeHDDT). In EffeHDDT, we discover a connected core set of sensor nodes to reduce the communication cost of transmitting and collecting the data of the sensor networks. In particular, we propose a heritage object boundary detecting and tracking mechanism. Both theoretical analysis and experimental results demonstrate that our EffeHDDT method outperforms existing methods in terms of network traffic and the precision of deformation detection. PMID:25365458

  19. Sub-cell turning to accomplish micron-level alignment of precision assemblies

    NASA Astrophysics Data System (ADS)

    Kumler, James J.; Buss, Christian

    2017-08-01

    Higher performance expectations for complex optical systems demand tighter alignment requirements for lens assembly alignment. In order to meet diffraction limited imaging performance over wide spectral bands across the UV and visible wavebands, new manufacturing approaches and tools must be developed if the optical systems will be produced consistently in volume production. This is especially applicable in the field of precision microscope objectives for life science, semiconductor inspection and laser material processing systems. We observe a rising need for the improvement in the optical imaging performance of objective lenses. The key challenge lies in the micron-level decentration and tilt of each lens element. One solution for the production of high quality lens systems is sub-cell assembly with alignment turning. This process relies on an automatic alignment chuck to align the optical axis of a mounted lens to the spindle axis of the machine. Subsequently, the mount is cut with diamond tools on a lathe with respect to the optical axis of the mount. Software controlled integrated measurement technology ensures highest precision. In addition to traditional production processes, further dimensions can be controlled in a very precise manner, e.g. the air gaps between the lenses. Using alignment turning simplifies further alignment steps and reduces the risk of errors. This paper describes new challenges in microscope objective design and manufacturing, and addresses difficulties with standard production processes. A new measurement and alignment technique is described, and strengths and limitations are outlined.

  20. Error analysis of motion correction method for laser scanning of moving objects

    NASA Astrophysics Data System (ADS)

    Goel, S.; Lohani, B.

    2014-05-01

    A limitation of conventional laser scanning methods is that the objects being scanned must remain static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry for moving objects. Only limited literature is available on the few methods capable of handling object motion during scanning, all of which rely on their own models or sensors, and studies on error modelling or analysis of any of these motion-correction methods are lacking. In this paper, we develop the error budget and present an analysis of one such motion-correction method. The method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by using tracking devices. It then uses this information along with the laser scanner data to correct the laser data, resulting in correct geometry despite the object being mobile during scanning. The major applications of this method lie in the shipping industry, to scan ships either moving or parked at sea, and in scanning objects such as hot air balloons or aerostats. Notably, the other motion-correction methods described in the literature cannot be applied to the objects mentioned here, which makes the chosen method quite unique. This paper presents some interesting insights into the functioning of the motion-correction method, as well as a detailed account of the behaviour and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to gain insights into the optimal utilization of available components for achieving the best results.

  1. Evaluation and analysis of real-time precise orbits and clocks products from different IGS analysis centers

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Yang, Hongzhou; Gao, Yang; Yao, Yibin; Xu, Chaoqian

    2018-06-01

    To meet the increasing demands from the real-time Precise Point Positioning (PPP) users, the real-time satellite orbit and clock products are generated by different International GNSS Service (IGS) real-time analysis centers and can be publicly received through the Internet. Based on different data sources and processing strategies, the real-time products from different analysis centers therefore differ in availability and accuracy. The main objective of this paper is to evaluate availability and accuracy of different real-time products and their effects on real-time PPP. A total of nine commonly used Real-Time Service (RTS) products, namely IGS01, IGS03, CLK01, CLK15, CLK22, CLK52, CLK70, CLK81 and CLK90, will be evaluated in this paper. Because not all RTS products support multi-GNSS, only GPS products are analyzed in this paper. Firstly, the availability of all RTS products is analyzed in two levels. The first level is the epoch availability, indicating whether there is outage for that epoch. The second level is the satellite availability, which defines the available satellite number for each epoch. Then the accuracy of different RTS products is investigated on nominal accuracy and the accuracy degradation over time. Results show that Root-Mean-Square Error (RMSE) of satellite orbit ranges from 3.8 cm to 7.5 cm for different RTS products. While the mean Standard Deviations of Errors (STDE) of satellite clocks range from 1.9 cm to 5.6 cm. The modified Signal In Space Range Error (SISRE) for all products are from 1.3 cm to 5.5 cm for different RTS products. The accuracy degradation of the orbit has the linear trend for all RTS products and the satellite clock degradation depends on the satellite clock types. The Rb clocks on board of GPS IIF satellites have the smallest degradation rate of less than 3 cm over 10 min while the Cs clocks on board of GPS IIF have the largest degradation rate of more than 10 cm over 10 min. Finally, the real-time kinematic PPP is
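
    As one concrete example of the orbit/clock comparison metrics, a global-average SISRE can be computed from the radial, along-track, cross-track, and clock errors. The sketch below uses the weight factors commonly quoted for GPS altitude (w_R of about 0.98 and w_AC^2 of about 1/49); these coefficients, and the exact form of the paper's modified SISRE, are assumptions here rather than values taken from the text.

```python
# Sketch of a global-average SISRE computation; the paper's "modified SISRE"
# may differ. Weight factors are the commonly quoted GPS values (assumed).
import numpy as np

def sisre(dr, da, dc, dt, w_r=0.98, w_ac2=1.0 / 49.0):
    """dr/da/dc: radial/along/cross orbit errors (m); dt: clock error (m)."""
    return np.sqrt((w_r * dr - dt) ** 2 + w_ac2 * (da ** 2 + dc ** 2))

# Example: 5 cm radial, 8 cm along-track, 6 cm cross-track, 3 cm clock.
print(f"SISRE = {sisre(0.05, 0.08, 0.06, 0.03) * 100:.1f} cm")
```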

  2. An Object Oriented Analysis Method for Ada and Embedded Systems

    DTIC Science & Technology

    1989-12-01

    expansion of the paradigm from the coding and designing activities into the earlier activity of requirements analysis. This chapter begins by discussing the application of...response time: 0.1 seconds. Step 1e: Identify Known Restrictions on the Software. "The cruise control system object code must fit within 16K of memory."...application of object-oriented techniques to the coding and design phases of the life cycle, as well as various approaches to requirements analysis.

  3. Laser technology for high precision satellite tracking

    NASA Technical Reports Server (NTRS)

    Plotkin, H. H.

    1974-01-01

    Fixed and mobile laser ranging stations have been developed to track satellites equipped with retro-reflector arrays. These have operated consistently at data rates of once per second with range precision better than 50 cm, using Q-switched ruby lasers with pulse durations of 20 to 40 nanoseconds. Improvements are being incorporated to improve the precision to 10 cm, and to permit ranging to more distant satellites. These include improved reflector array designs, processing and analysis of the received reflection pulses, and use of sub-nanosecond pulse duration lasers.

  4. MultiMetEval: Comparative and Multi-Objective Analysis of Genome-Scale Metabolic Models

    PubMed Central

    Gevorgyan, Albert; Kierzek, Andrzej M.; Breitling, Rainer; Takano, Eriko

    2012-01-01

    Comparative metabolic modelling is emerging as a novel field, supported by the development of reliable and standardized approaches for constructing genome-scale metabolic models in high throughput. New software solutions are needed to allow efficient comparative analysis of multiple models in the context of multiple cellular objectives. Here, we present the user-friendly software framework Multi-Metabolic Evaluator (MultiMetEval), built upon SurreyFBA, which allows the user to compose collections of metabolic models that together can be subjected to flux balance analysis. Additionally, MultiMetEval implements functionalities for multi-objective analysis by calculating the Pareto front between two cellular objectives. Using a previously generated dataset of 38 actinobacterial genome-scale metabolic models, we show how these approaches can lead to exciting novel insights. Firstly, after incorporating several pathways for the biosynthesis of natural products into each of these models, comparative flux balance analysis predicted that species like Streptomyces that harbour the highest diversity of secondary metabolite biosynthetic gene clusters in their genomes do not necessarily have the metabolic network topology most suitable for compound overproduction. Secondly, multi-objective analysis of biomass production and natural product biosynthesis in these actinobacteria shows that the well-studied occurrence of discrete metabolic switches during the change of cellular objectives is inherent to their metabolic network architecture. Comparative and multi-objective modelling can lead to insights that could not be obtained by normal flux balance analyses. MultiMetEval provides a powerful platform that makes these analyses straightforward for biologists. Sources and binaries of MultiMetEval are freely available from https://github.com/PiotrZakrzewski/MetEval/downloads. PMID:23272111
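
    The Pareto-front functionality can be illustrated independently of the metabolic models: given candidate solutions scored on two objectives to be maximized (say biomass flux versus product flux), the non-dominated set is extracted as below. This is a generic filter for illustration, not MultiMetEval's code, which derives the front from flux balance analysis.

```python
# Generic two-objective Pareto-front filter (both objectives maximized).
def pareto_front(points):
    """Return the non-dominated points of a list of (f1, f2) pairs."""
    front = []
    for p in sorted(points, key=lambda q: (-q[0], -q[1])):
        # Points arrive in decreasing f1, so p is dominated exactly when an
        # already-kept point also has f2 >= p's f2.
        if not front or p[1] > front[-1][1]:
            front.append(p)
    return front

solutions = [(0.9, 0.1), (0.7, 0.4), (0.7, 0.3), (0.4, 0.8), (0.2, 0.7)]
print(pareto_front(solutions))   # [(0.9, 0.1), (0.7, 0.4), (0.4, 0.8)]
```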

  5. Effect of Correlated Precision Errors on Uncertainty of a Subsonic Venturi Calibration

    NASA Technical Reports Server (NTRS)

    Hudson, S. T.; Bordelon, W. J., Jr.; Coleman, H. W.

    1996-01-01

    An uncertainty analysis performed in conjunction with the calibration of a subsonic venturi for use in a turbine test facility produced some unanticipated results that may have a significant impact in a variety of test situations. Precision uncertainty estimates using the preferred propagation techniques in the applicable American National Standards Institute/American Society of Mechanical Engineers standards were an order of magnitude larger than precision uncertainty estimates calculated directly from a sample of results (discharge coefficient) obtained at the same experimental set point. The differences were attributable to the effect of correlated precision errors, which previously have been considered negligible. An analysis explaining this phenomenon is presented. The article is not meant to document the venturi calibration, but rather to give a real example of results where correlated precision terms are important. The significance of the correlated precision terms could apply to many test situations.
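
    A toy Monte Carlo sketch of the phenomenon (invented numbers, not the venturi data): when two measured quantities share a correlated error source that largely cancels in the derived result, root-sum-square propagation that ignores the correlation overstates the scatter relative to the standard deviation computed directly from the sample of results.

```python
# Toy Monte Carlo: r = x / y with a shared (correlated) precision error on
# x and y that nearly cancels in the ratio. Numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
n = 10000
common = rng.normal(0, 1.0, n)                # shared error source
x = 100.0 + common + rng.normal(0, 0.1, n)    # small independent parts
y = 50.0 + 0.5 * common + rng.normal(0, 0.1, n)

r = x / y
direct = r.std(ddof=1)                        # scatter of the results sample

# Propagation ignoring correlation: (s_r/r)^2 = (s_x/x)^2 + (s_y/y)^2
prop = r.mean() * np.sqrt((x.std(ddof=1) / x.mean()) ** 2 +
                          (y.std(ddof=1) / y.mean()) ** 2)
print(f"direct sample scatter:      {direct:.4f}")
print(f"no-correlation propagation: {prop:.4f}")  # several times larger
```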

  6. Some new mathematical methods for variational objective analysis

    NASA Technical Reports Server (NTRS)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  7. Object-oriented millisecond timers for the PC.

    PubMed

    Hamm, J P

    2001-11-01

    Object-oriented programming provides a useful structure for designing reusable code. Accurate millisecond timing is essential for many areas of research. With this in mind, this paper provides a Turbo Pascal unit containing an object-oriented millisecond timer. This approach allows for multiple timers to be running independently. The timers may also be set at different levels of temporal precision, such as 10^-3 sec (milliseconds) or 10^-5 sec. The object also is able to store the time of a flagged event for later examination without interrupting the ongoing timing operation.
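
    The published unit is in Turbo Pascal; a minimal Python analogue of the same object-oriented idea is sketched below: multiple timers run independently, and a flagged event's time is stored for later examination without interrupting the ongoing timing.

```python
# Python analogue (not the paper's Pascal unit) of independent OO timers.
import time

class MsTimer:
    """Millisecond-resolution stopwatch; instances run independently."""
    def __init__(self):
        self._start = time.perf_counter()
        self.flags = []            # flagged event times, for later inspection

    def elapsed_ms(self):
        return (time.perf_counter() - self._start) * 1000.0

    def flag(self, label=""):
        """Store the current time without stopping the ongoing timing."""
        self.flags.append((label, self.elapsed_ms()))

rt_timer, trial_timer = MsTimer(), MsTimer()   # two timers run side by side
time.sleep(0.05)
rt_timer.flag("stimulus onset")
time.sleep(0.12)
rt_timer.flag("response")
print(rt_timer.flags, f"trial total: {trial_timer.elapsed_ms():.1f} ms")
```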

  8. Sensitivity of Forecast Skill to Different Objective Analysis Schemes

    NASA Technical Reports Server (NTRS)

    Baker, W. E.

    1979-01-01

    Numerical weather forecasts are characterized by rapidly declining skill in the first 48 to 72 h. Recent estimates of the sources of forecast error indicate that the inaccurate specification of the initial conditions contributes substantially to this error. The sensitivity of the forecast skill to the initial conditions is examined by comparing a set of real-data experiments whose initial data were obtained with two different analysis schemes. Results are presented to emphasize the importance of the objective analysis techniques used in the assimilation of observational data.

  9. Spacecraft Alignment Determination and Control for Dual Spacecraft Precision Formation Flying

    NASA Technical Reports Server (NTRS)

    Calhoun, Philip; Novo-Gradac, Anne-Marie; Shah, Neerav

    2017-01-01

    Many proposed formation flying missions seek to advance the state of the art in spacecraft science imaging by utilizing precision dual spacecraft formation flying to enable a virtual space telescope. Using precision dual spacecraft alignment, very long focal lengths can be achieved by locating the optics on one spacecraft and the detector on the other. Proposed science missions include astrophysics concepts with spacecraft separations from 1000 km to 25,000 km, such as the Milli-Arc-Second Structure Imager (MASSIM) and the New Worlds Observer, and Heliophysics concepts for solar coronagraphs and X-ray imaging with smaller separations (50 m-500 m). All of these proposed missions require advances in guidance, navigation, and control (GNC) for precision formation flying. In particular, very precise astrometric alignment control and estimation is required for precise inertial pointing of the virtual space telescope to enable science imaging orders of magnitude better than can be achieved with conventional single spacecraft instruments. This work develops design architectures, algorithms, and performance analysis of proposed GNC systems for precision dual spacecraft astrometric alignment. These systems employ a variety of GNC sensors and actuators, including laser-based alignment and ranging systems, optical imaging sensors (e.g. guide star telescope), inertial measurement units (IMU), as well as microthruster and precision stabilized platforms. A comprehensive GNC performance analysis is given for a Heliophysics dual spacecraft precision formation flying (PFF) imaging mission concept.

  10. Spacecraft Alignment Determination and Control for Dual Spacecraft Precision Formation Flying

    NASA Technical Reports Server (NTRS)

    Calhoun, Philip C.; Novo-Gradac, Anne-Marie; Shah, Neerav

    2017-01-01

    Many proposed formation flying missions seek to advance the state of the art in spacecraft science imaging by utilizing precision dual spacecraft formation flying to enable a virtual space telescope. Using precision dual spacecraft alignment, very long focal lengths can be achieved by locating the optics on one spacecraft and the detector on the other. Proposed science missions include astrophysics concepts with spacecraft separations from 1000 km to 25,000 km, such as the Milli-Arc-Second Structure Imager (MASSIM) and the New Worlds Observer, and Heliophysics concepts for solar coronagraphs and X-ray imaging with smaller separations (50 m-500 m). All of these proposed missions require advances in guidance, navigation, and control (GNC) for precision formation flying. In particular, very precise astrometric alignment control and estimation is required for precise inertial pointing of the virtual space telescope to enable science imaging orders of magnitude better than can be achieved with conventional single spacecraft instruments. This work develops design architectures, algorithms, and performance analysis of proposed GNC systems for precision dual spacecraft astrometric alignment. These systems employ a variety of GNC sensors and actuators, including laser-based alignment and ranging systems, optical imaging sensors (e.g. guide star telescope), inertial measurement units (IMU), as well as micro-thruster and precision stabilized platforms. A comprehensive GNC performance analysis is given for a Heliophysics dual spacecraft precision formation flying (PFF) imaging mission concept.

  11. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  12. Precision mechatronics based on high-precision measuring and positioning systems and machines

    NASA Astrophysics Data System (ADS)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in the paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics. Nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with the smallest possible uncertainties is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement, as well as measurement of microoptics, precision molds, microgears, ring gauges, and small holes.

  13. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. Video-rate or high-precision: a flexible range imaging camera

    NASA Astrophysics Data System (ADS)

    Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.; Payne, Andrew D.; Conroy, Richard M.; Godbaz, John P.; Jongenelen, Adrian P. P.

    2008-02-01

    A range imaging camera produces an output similar to a digital photograph, but every pixel in the image contains distance information as well as intensity. This is useful for measuring the shape, size, and location of objects in a scene, hence is well suited to certain machine vision applications. Previously we demonstrated a heterodyne range imaging system operating in a relatively high-resolution (512-by-512 pixel) and high-precision (0.4 mm best case) configuration, but with a slow measurement rate (one measurement every 10 s). Although this high-precision range imaging is useful for some applications, the low acquisition speed is limiting in many situations. The system's frame rate and length of acquisition are fully configurable in software, which means the measurement rate can be increased by compromising precision and image resolution. In this paper we demonstrate the flexibility of our range imaging system by showing examples of high-precision ranging at slow acquisition speeds and video-rate ranging with reduced ranging precision and image resolution. We also show that the heterodyne approach and the use of more than four samples per beat cycle provide better linearity than the traditional homodyne quadrature detection approach. Finally, we comment on practical issues of frame rate and beat signal frequency selection.
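
    As a rough illustration of the sampling scheme mentioned above, the phase of the beat signal can be estimated from its fundamental DFT bin when N samples are taken per beat cycle, and that phase maps linearly to range. The modulation frequency, signal amplitudes, and N below are illustrative assumptions, not the camera's actual parameters.

    ```python
    # Sketch: recovering range from N samples per beat cycle of a heterodyne
    # range imager (N > 4, as the paper recommends for linearity).
    import numpy as np

    c = 3.0e8            # m/s
    f_mod = 40e6         # illumination modulation frequency (illustrative)
    true_range = 2.5     # m
    phase_true = 4 * np.pi * f_mod * true_range / c   # round-trip phase shift

    N = 10                                    # samples per beat cycle
    t = np.arange(N) / N                      # one beat period, normalized
    signal = 1.0 + 0.5 * np.cos(2 * np.pi * t + phase_true)

    bin1 = np.fft.fft(signal)[1]              # fundamental of the beat signal
    phase = np.angle(bin1) % (2 * np.pi)
    range_est = phase * c / (4 * np.pi * f_mod)
    print(f"estimated range: {range_est:.3f} m (true {true_range} m)")
    ```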

  15. An Assessment of Imaging Informatics for Precision Medicine in Cancer.

    PubMed

    Chennubhotla, C; Clarke, L P; Fedorov, A; Foran, D; Harris, G; Helton, E; Nordstrom, R; Prior, F; Rubin, D; Saltz, J H; Shalley, E; Sharma, A

    2017-08-01

    Objectives: Precision medicine requires the measurement, quantification, and cataloging of medical characteristics to identify the most effective medical intervention. However, the amount of available data exceeds our current capacity to extract meaningful information. We examine the informatics needs to achieve precision medicine from the perspective of quantitative imaging and oncology. Methods: The National Cancer Institute (NCI) organized several workshops on the topic of medical imaging and precision medicine. The observations and recommendations are summarized herein. Results: Recommendations include: use of standards in data collection and clinical correlates to promote interoperability; data sharing and validation of imaging tools; clinician's feedback in all phases of research and development; use of open-source architecture to encourage reproducibility and reusability; use of challenges which simulate real-world situations to incentivize innovation; partnership with industry to facilitate commercialization; and education in academic communities regarding the challenges involved with translation of technology from the research domain to clinical utility and the benefits of doing so. Conclusions: This article provides a survey of the role and priorities for imaging informatics to help advance quantitative imaging in the era of precision medicine. While these recommendations were drawn from oncology, they are relevant and applicable to other clinical domains where imaging aids precision medicine. Georg Thieme Verlag KG Stuttgart.

  16. Precise FIA plot registration using field and dense LIDAR data

    Treesearch

    Demetrios Gatziolis

    2009-01-01

    Precise registration of forest inventory and analysis (FIA) plots is a prerequisite for an effective fusion of field data with ancillary spatial information, which is an approach commonly employed in the mapping of various forest parameters. Although the adoption of Global Positioning System technology has improved the precision of plot coordinates obtained during...

  17. Scout: orbit analysis and hazard assessment for NEOCP objects

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers, or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, or a Potentially Hazardous Asteroid.
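
    A heavily simplified sketch of systematic ranging: scan a grid in topocentric range and range rate, weight each grid point by its goodness of fit, and sum the weights of hazardous orbits. The chi-square surface and impact flag below are stand-ins; a real implementation would propagate an orbit for each grid point and fit the recorded astrometry.

    ```python
    # Sketch of the systematic-ranging idea with a hypothetical fit surface.
    import numpy as np

    def chi2(rho, rho_dot):
        # stand-in goodness-of-fit surface for a short observation arc
        return ((np.log10(rho) - 0.3) / 0.5) ** 2 + ((rho_dot - 5.0) / 3.0) ** 2

    rhos = np.logspace(-3, 1, 80)        # topocentric range, au
    rho_dots = np.linspace(-20, 20, 80)  # range rate, km/s
    R, RD = np.meshgrid(rhos, rho_dots)

    w = np.exp(-0.5 * chi2(R, RD))
    w /= w.sum()                         # posterior weight of each grid orbit

    impacting = R < 0.01                 # stand-in flag for Earth-hitting orbits
    print(f"impact probability (toy): {w[impacting].sum():.3f}")
    ```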

  18. Optimization of Exposure Time Division for Multi-object Photometry

    NASA Astrophysics Data System (ADS)

    Popowicz, Adam; Kurek, Aleksander R.

    2017-09-01

    Optical observations of wide fields of view entail the problem of selecting the best exposure time. As many objects are usually observed simultaneously, the quality of photometry of the brightest ones is always better than that of the dimmer ones, even though all of them are frequently equally interesting for astronomers. Thus, measuring all objects with the highest possible precision is desirable. In this paper, we present a new optimization algorithm, dedicated for the division of exposure time into sub-exposures, which enables photometry with a more balanced noise budget. The proposed technique increases the photometric precision of dimmer objects at the expense of the measurement fidelity of the brightest ones. We have tested the method on real observations using two telescope setups, demonstrating its usefulness and good consistency with theoretical expectations. The main application of our approach is a wide range of sky surveys, including ones performed by space telescopes. The method can be used to plan virtually any photometric observation of objects that show a wide range of magnitudes.
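
    The paper's optimization algorithm is not spelled out in the abstract, but the trade-off it balances can be sketched with a simple noise model: more sub-exposures keep bright stars below saturation, while fewer sub-exposures spare dim stars the accumulated read noise. All numbers below are illustrative.

    ```python
    # Sketch: splitting a fixed total exposure T into n sub-exposures and
    # evaluating the SNR of bright, medium, and dim stars under a toy
    # photon-noise + read-noise + full-well model.
    import numpy as np

    T, full_well, read_noise = 100.0, 8e4, 5.0     # s, e-, e- RMS
    rates = np.array([3e4, 1e3, 30.0])             # e-/s: bright, medium, dim

    def snr(rate, n):
        t_sub = T / n
        if rate * t_sub > full_well:               # saturated in every sub-exposure
            return 0.0
        signal = rate * T
        noise = np.sqrt(signal + n * read_noise ** 2)
        return signal / noise

    for n in (1, 10, 50, 200):
        print(n, [f"{snr(r, n):8.1f}" for r in rates])

    best = max(range(1, 500), key=lambda n: min(snr(r, n) for r in rates))
    print("n maximizing the worst-star SNR:", best)
    ```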

  19. Precision positioning device

    DOEpatents

    McInroy, John E.

    2005-01-18

    A precision positioning device is provided. The precision positioning device comprises a precision measuring/vibration isolation mechanism. A first plate is provided with the precision measuring means secured to the first plate. A second plate is secured to the first plate. A third plate is secured to the second plate, with the first plate being positioned between the second plate and the third plate. A fourth plate is secured to the third plate, with the second plate being positioned between the third plate and the fourth plate. An adjusting mechanism is provided for adjusting the position of the first plate, the second plate, the third plate, and the fourth plate relative to each other.

  20. Virtual learning object and environment: a concept analysis.

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with a search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. It is concluded that a virtual learning environment brings together several different types of virtual learning objects in a common pedagogical context.

  1. Objective image analysis of the meibomian gland area.

    PubMed

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

    To evaluate objectively the meibomian gland area using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore=0), 105 eyelids of 57 patients with loss of less than one-third of the total meibomian gland area (meiboscore=1), 13 eyelids of 11 patients with between one-third and two-thirds loss of the meibomian gland area (meiboscore=2), and 20 eyelids of 14 patients with more than two-thirds loss of the meibomian gland area (meiboscore=3) were studied. Lid borders were automatically determined. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, automatically discriminated the meibomian gland area. The software calculated the ratio of the total meibomian gland area relative to the total analysis area in all subjects. Repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% in those with a meiboscore of 1, 32.0±4.4%/37.2±3.5% in those with a meiboscore of 2, and 16.7±6.4%/19.5±5.8% in subjects with a meiboscore of 3. The meibomian gland area was objectively evaluated using the developed software. This system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. Analysis on the precision of the dimensions of self-ligating brackets.

    PubMed

    Erduran, Rackel Hatice Milhomens Gualberto; Maeda, Fernando Akio; Ortiz, Sandra Regina Mota; Triviño, Tarcila; Fuziy, Acácio; Carvalho, Paulo Eduardo Guedes

    2016-12-01

    The present study aimed to evaluate the precision of the torque applied by 0.022" self-ligating brackets of different brands, the precision of the parallelism between the inner walls of their slots, and the precision of their slot height. Eighty brackets for upper central incisors from eight trademarked models were selected: Abzil, GAC, American Orthodontics, Morelli, Orthometric, Ormco, Forestadent, and Ortho Organizers. Images of the brackets were obtained using a scanning electron microscope (SEM) and measured using the AutoCAD 2011 software. The tolerance parameters stated in the ISO 27020 standard were used as references. The results showed that only the Orthometric, Morelli, and Ormco groups were inconsistent with the ISO standard. Regarding the parallelism of the internal walls of the slots, most of the models studied had results in line with the ISO prescription, except the Morelli group. In assessing bracket slot height, only the Forestadent, GAC, American Orthodontics, and Ormco groups presented results in accordance with the ISO standard. The GAC, Forestadent, and American Orthodontics groups did not differ in relation to the three factors of the ISO 27020 standard. Great variability was observed across all the variables. © 2016 Wiley Periodicals, Inc.

  3. Search for Cross-Correlations of Ultrahigh-Energy Cosmic Rays with BL Lacertae Objects

    NASA Astrophysics Data System (ADS)

    Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Belov, K.; Belz, J. W.; BenZvi, S.; Bergman, D. R.; Blake, S. A.; Boyer, J. H.; Burt, G. W.; Cao, Z.; Connolly, B. M.; Deng, W.; Fedorova, Y.; Findlay, J.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kim, K.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Maestas, M. M.; Manago, N.; Mannel, E. J.; Marek, L. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M. D.; Rodriguez, D.; Sasaki, M.; Schnetzer, S. R.; Seman, M.; Sinnis, G.; Smith, J. D.; Snow, R.; Sokolsky, P.; Springer, R. W.; Stokes, B. T.; Thomas, J. R.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.; HIRES Collaboration

    2006-01-01

    Data taken in stereo mode by the High Resolution Fly's Eye (HiRes) air fluorescence experiment are analyzed to search for correlations between the arrival directions of ultrahigh-energy cosmic rays with the positions of BL Lacertae objects. Several previous claims of significant correlations between BL Lac objects and cosmic rays observed by other experiments are tested. These claims are not supported by the HiRes data. However, we verify a recent analysis of correlations between HiRes events and a subset of confirmed BL Lac objects from the 10th Veron Catalog, and we study this correlation in detail. Due to the a posteriori nature of the search, the significance level cannot be reliably estimated and the correlation must be tested independently before any claim can be made. We identify the precise hypotheses that will be tested with statistically independent data.

  4. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method.

    PubMed

    Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A

    2018-02-01

    To evaluate a novel methodology using industrial scanners as a reference, to assess the in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions, and, further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of the upper premolars and incisors in five subjects. After three reference scans with an industrial scanner, ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI), and Trios 3 (TRIOS). One conventional impression (IMPR) was taken with 3M Impregum Penta Soft, and the poured models were digitized with the laboratory scanner 3shape D1000 (D1000). Best-fit alignment of the reference-bodies and a 3D compare analysis were performed. The precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. The accuracy of IOS and IMPR was analyzed using ATOS as the reference. The precision of IOS was evaluated through intra-system comparison. The precision of the ATOS reference scanner (mean 0.6 μm) and of D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference in accuracy between two scanner groups, 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference from IOS; deviations of IOS and IMPR were of similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing the accuracy of IOS and IMPR in vivo in up to five units bilaterally from the midline. 3M and TRIOS had higher accuracy than OMNI; IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. The role of color information on object recognition: a review and meta-analysis.

    PubMed

    Bramão, Inês; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2011-09-01

    In this study, we systematically review the scientific literature on the effect of color on object recognition. Thirty-five independent experiments, comprising 1535 participants, were included in a meta-analysis. We found a moderate effect of color on object recognition (d=0.28). Specific effects of moderator variables were analyzed and we found that color diagnosticity is the factor with the greatest moderator effect on the influence of color in object recognition; studies using color diagnostic objects showed a significant color effect (d=0.43), whereas a marginal color effect was found in studies that used non-color diagnostic objects (d=0.18). The present study did not permit the drawing of specific conclusions about the moderator effect of the object recognition task; while the meta-analytic review showed that color information improves object recognition mainly in studies using naming tasks (d=0.36), the literature review revealed a large body of evidence showing positive effects of color information on object recognition in studies using a large variety of visual recognition tasks. We also found that color is important for the ability to recognize artifacts and natural objects, to recognize objects presented as types (line-drawings) or as tokens (photographs), and to recognize objects that are presented without surface details, such as texture or shadow. Taken together, the results of the meta-analysis strongly support the contention that color plays a role in object recognition. This suggests that the role of color should be taken into account in models of visual object recognition. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. The effect of input data transformations on object-based image analysis

    PubMed Central

    LIPPITT, CHRISTOPHER D.; COULTER, LLOYD L.; FREEMAN, MARY; LAMANTIA-BISHOP, JEFFREY; PANG, WYSON; STOW, DOUGLAS A.

    2011-01-01

    The effect of using spectral transform images as input data on segmentation quality and its potential effect on products generated by object-based image analysis are explored in the context of land cover classification in Accra, Ghana. Five image data transformations are compared to untransformed spectral bands in terms of their effect on segmentation quality and final product accuracy. The relationship between segmentation quality and product accuracy is also briefly explored. Results suggest that input data transformations can aid in the delineation of landscape objects by image segmentation, but the effect is idiosyncratic to the transformation and object of interest. PMID:21673829

  7. Objective analysis of tidal fields in the Atlantic and Indian Oceans

    NASA Technical Reports Server (NTRS)

    Sanchez, B. V.; Rao, D. B.; Steenrod, S. D.

    1986-01-01

    An objective analysis technique has been developed to extrapolate tidal amplitudes and phases over entire ocean basins using existing gauge data and the altimetric measurements which are now beginning to be provided by satellite oceanography. The technique was previously tested in the Lake Superior basin. The method has now been developed and applied in the Atlantic-Indian ocean basins using a 6 deg x 6 deg grid to test its essential features. The functions used in the interpolation are the eigenfunctions of the velocity potential (Proudman functions), which are computed numerically from a knowledge of the basin's bottom topography, the horizontal plan form, and the necessary boundary conditions. These functions are characteristic of the particular basin. The gravitational normal modes of the basin are computed as part of the investigation; they are used to obtain the theoretical forced solutions for the tidal constituents, which provide the simulated data for testing the method and serve as a guide in choosing the most energetic modes for the objective analysis. The results of the objective analysis of the M2 and K1 tidal constituents indicate the possibility of recovering the tidal signal with a degree of accuracy well within the error bounds of present-day satellite techniques.
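
    The following sketch illustrates the general fitting step, with plain sine modes standing in for the basin-specific Proudman functions (which would be computed numerically from topography and boundary conditions): expand the field in a few energetic modes and fit the coefficients to sparse gauge data by least squares.

    ```python
    # Sketch: reconstruct a 1-D "tidal field" from sparse gauge samples by
    # least-squares fitting of a small basis-function expansion.
    import numpy as np

    rng = np.random.default_rng(0)
    x_grid = np.linspace(0, 1, 200)
    true = 1.0 * np.sin(np.pi * x_grid) + 0.4 * np.sin(3 * np.pi * x_grid)

    x_obs = rng.uniform(0, 1, 12)                  # sparse "gauge" locations
    y_obs = np.interp(x_obs, x_grid, true) + rng.normal(0, 0.02, x_obs.size)

    n_modes = 4                                    # keep the most energetic modes
    basis = lambda x: np.column_stack([np.sin((k + 1) * np.pi * x)
                                       for k in range(n_modes)])
    coeff, *_ = np.linalg.lstsq(basis(x_obs), y_obs, rcond=None)
    recon = basis(x_grid) @ coeff

    print("recovered coefficients:", np.round(coeff, 3))
    print("RMS reconstruction error:", np.sqrt(np.mean((recon - true) ** 2)))
    ```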

  8. Analysis of pesticide residues on museum objects repatriated to the Hupa tribe of California.

    PubMed

    Palmer, Peter T; Martin, Matthew; Wentworth, Gregory; Caldararo, Niccolo; Davis, Lee; Kane, Shawn; Hostler, David

    2003-03-15

    In the past, it was common practice for museum professionals and private collectors to apply a variety of pesticide agents to objects in their collections to preserve them from depredations by microorganisms, fungi, and other pests. The Native American Graves Protection and Repatriation Act allows federally recognized tribes to request that museums return objects taken from their ancestors. Given that poor records were kept on the treatment of individual objects, it is unknown whether specific objects are contaminated with these pesticide agents. Although chemical analysis represents the only reliable means to determine the types and levels of pesticides on these objects, surprisingly few publications document the extent of this contamination in museum collections. This paper reports on the determination of arsenic, mercury, and several organic pesticides on 17 objects that were recently repatriated to the Hupa tribe in northern California. Four samples were taken from each object: two for arsenic and mercury analysis via flame atomic absorption spectrophotometry and two for organic pesticide analysis via gas chromatography/mass spectrometry. Percent levels (wt/wt) of mercury were detected on many samples, and levels of 0.001 to 0.183% (wt/wt) of p-dichlorobenzene, naphthalene, thymol, lindane, and/or DDT were detected on many of the samples. These results indicate that Hupa tribal members should not wear these objects in religious ceremonies, that proper precautions should be followed when dealing with potentially contaminated objects, and that more serious consideration should be given to this issue at a national level.

  9. Santa Fe School Precision Teaching Program, Evaluation Report 1974-75.

    ERIC Educational Resources Information Center

    Spencer, Mary L.; Henderson, Joan C.

    The Santa Fe Precision Teaching for Effective Learning (PTEL) program, an ESEA Title III program, was selected as a remedial instructional approach to the performance and motivational problems of Santa Fe students. It proposed the following six major program objectives: (1) planning and implementation of start-up activities; (2) staff training in the…

  10. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    PubMed

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of a large body of literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  11. In vivo short-term precision of hip structure analysis variables in comparison with bone mineral density using paired dual-energy X-ray absorptiometry scans from multi-center clinical trials.

    PubMed

    Khoo, Benjamin C C; Beck, Thomas J; Qiao, Qi-Hong; Parakh, Pallav; Semanick, Lisa; Prince, Richard L; Singer, Kevin P; Price, Roger I

    2005-07-01

    Hip structural analysis (HSA) is a technique for extracting strength-related structural dimensions of bone cross-sections from two-dimensional hip scan images acquired by dual energy X-ray absorptiometry (DXA) scanners. Heretofore the precision of the method has not been thoroughly tested in the clinical setting. Using paired scans from two large clinical trials involving a range of different DXA machines, this study reports the first precision analysis of HSA variables, in comparison with that of conventional bone mineral density (BMD) on the same scans. A key HSA variable, section modulus (Z), biomechanically indicative of bone strength during bending, had a short-term precision percentage coefficient of variation (CV%) in the femoral neck of 3.4-10.1%, depending on the manufacturer or model of the DXA equipment. Cross-sectional area (CSA), a determinant of bone strength during axial loading and closely aligned with conventional DXA bone mineral content, had a range of CV% from 2.8% to 7.9%. Poorer precision was associated with inadequate inclusion of the femoral shaft or femoral head in the DXA-scanned hip region. Precision of HSA-derived BMD varied between 2.4% and 6.4%. Precision of DXA manufacturer-derived BMD varied between 1.9% and 3.4%, arising from the larger analysis region of interest (ROI). The precision of HSA variables was not generally dependent on magnitude, subject height, weight, or conventional femoral neck densitometric variables. The generally poorer precision of key HSA variables in comparison with conventional DXA-derived BMD highlights the critical roles played by correct limb repositioning and choice of an adequate and appropriately positioned ROI.
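
    Short-term precision from paired scans is conventionally summarized as the root-mean-square of the within-subject standard deviations, expressed as a percentage of the overall mean (CV%); a minimal sketch with made-up paired measurements:

    ```python
    # Sketch: short-term precision (CV%) from duplicate scans, via the RMS of
    # the per-subject standard deviations divided by the overall mean.
    import numpy as np

    # rows: subjects; columns: scan 1 and scan 2 of, e.g., section modulus Z
    pairs = np.array([[1.10, 1.16], [0.98, 1.02], [1.25, 1.19], [1.05, 1.08]])

    within_sd = pairs.std(axis=1, ddof=1)          # per-subject SD of the pair
    rms_sd = np.sqrt(np.mean(within_sd ** 2))
    cv_percent = 100 * rms_sd / pairs.mean()
    print(f"short-term precision: CV% = {cv_percent:.1f}")
    ```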

  12. mMass 3: a cross-platform software environment for precise analysis of mass spectrometric data.

    PubMed

    Strohalm, Martin; Kavan, Daniel; Novák, Petr; Volný, Michael; Havlícek, Vladimír

    2010-06-01

    While tools for the automated analysis of MS and LC-MS/MS data are continuously improving, it is still often the case that at the end of an experiment, the mass spectrometrist will spend time carefully examining individual spectra. Current software support is mostly provided only by the instrument vendors, and the available software tools are often instrument-dependent. Here we present a new generation of mMass, a cross-platform environment for the precise analysis of individual mass spectra. The software covers a wide range of processing tasks such as import from various data formats, smoothing, baseline correction, peak picking, deisotoping, charge determination, and recalibration. Functions presented in the earlier versions such as in silico digestion and fragmentation were redesigned and improved. In addition to Mascot, an interface for ProFound has been implemented. A specific tool is available for isotopic pattern modeling to enable precise data validation. The largest available lipid database (from the LIPID MAPS Consortium) has been incorporated and together with the new compound search tool lipids can be rapidly identified. In addition, the user can define custom libraries of compounds and use them analogously. The new version of mMass is based on a stand-alone Python library, which provides the basic functionality for data processing and interpretation. This library can serve as a good starting point for other developers in their projects. Binary distributions of mMass, its source code, a detailed user's guide, and video tutorials are freely available from www.mmass.org .

  13. Performance Analysis of Several GPS/Galileo Precise Point Positioning Models

    PubMed Central

    Afifi, Akram; El-Rabbany, Ahmed

    2015-01-01

    This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada’s GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference. PMID:26102495

  14. Performance Analysis of Several GPS/Galileo Precise Point Positioning Models.

    PubMed

    Afifi, Akram; El-Rabbany, Ahmed

    2015-06-19

    This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada's GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference.
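
    Two building blocks of the models compared above, the dual-frequency ionosphere-free combination and between-satellite single differencing (BSSD), can be shown in a few lines; the pseudoranges, ionospheric delays, and clock error below are synthetic.

    ```python
    # Sketch: the ionosphere-free combination cancels first-order ionospheric
    # delay, and differencing against a reference satellite (BSSD) removes the
    # common receiver clock error.
    import numpy as np

    f1, f2 = 1575.42e6, 1227.60e6                  # GPS L1/L2 frequencies, Hz
    a1 = f1**2 / (f1**2 - f2**2)
    a2 = -f2**2 / (f1**2 - f2**2)

    # synthetic pseudoranges to 4 satellites on both frequencies (meters)
    rho = np.array([21e6, 22e6, 23e6, 24e6])       # geometric ranges
    iono_L1 = np.array([3.0, 4.0, 2.5, 5.0])       # first-order iono delay on L1
    clk_rx = 150.0                                 # receiver clock error, meters
    P1 = rho + iono_L1 + clk_rx
    P2 = rho + iono_L1 * (f1 / f2) ** 2 + clk_rx

    P_if = a1 * P1 + a2 * P2                       # ionosphere-free combination
    print("iono removed :", np.allclose(P_if, rho + clk_rx))

    bssd = P_if[1:] - P_if[0]                      # difference vs. reference sat
    print("clock removed:", np.allclose(bssd, rho[1:] - rho[0]))
    ```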

  15. Linear and Nonlinear Time-Frequency Analysis for Parameter Estimation of Resident Space Objects

    DTIC Science & Technology

    2017-02-22

    AFRL-AFOSR-UK-TR-2017-0023. Linear and Nonlinear Time-Frequency Analysis for Parameter Estimation of Resident Space Objects. Marco Martorella. Grant number: FA9550-14-1-0183.

  16. Analysis of the Precision of the Pulsar Time Clock Model

    NASA Astrophysics Data System (ADS)

    Zhao, Cheng-shi; Tong, Ming-lei; Gao, Yu-ping; Yang, Ting-gao

    2018-04-01

    Millisecond pulsars have a very high rotation stability, which can be applied to many research fields, such as the establishment of a pulsar time standard, the detection of gravitational waves, spacecraft navigation using X-ray pulsars, and so on. In this paper, we employ two millisecond pulsars, PSR J0437-4715 and J1713+0747, observed by the International Pulsar Timing Array (IPTA), to analyze the precision of the pulsar clock parameters and the prediction accuracy of the pulse time of arrival (TOA). It is found that the uncertainty of the spin frequency is 10^-15 Hz, the uncertainty of the first derivative of the spin frequency is 10^-23 s^-2, and the precision of the measured rotational parameters increases by one order of magnitude with every 4∼5 years of accumulated observational data. In addition, the errors of TOAs predicted up to 4.8 yr ahead by the clock model established from 10 yr of J0437-4715 data are less than 1 μs. Therefore, one can use the pulsar time standard to calibrate an atomic clock and keep the atomic time within 1 μs of Terrestrial Time (TT) over 4.8 yr.
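
    A minimal sketch of the underlying clock model: pulse phase is fit as phi(t) = f0*t + (1/2)*f1*t^2, and the parameter covariance gives the quoted uncertainties on f0 and f1. The TOA noise level and time span below are assumptions chosen to be roughly in the regime the paper describes.

    ```python
    # Sketch: fit a two-parameter spin model (frequency and its derivative) to
    # synthetic pulse phases and report parameter uncertainties.
    import numpy as np

    f0, f1 = 173.6879458, -1.7e-15        # Hz, s^-2 (PSR J0437-4715-like values)
    t = np.linspace(0, 10 * 365.25 * 86400, 200)          # 10 yr of TOAs, in s
    noise = np.random.default_rng(2).normal(0, 1e-4, t.size)   # phase noise, cycles
    phase = f0 * t + 0.5 * f1 * t**2 + noise

    # linear least squares for (f0, f1): phase ~ f0*t + 0.5*f1*t^2
    A = np.column_stack([t, 0.5 * t**2])
    (f0_fit, f1_fit), *_ = np.linalg.lstsq(A, phase, rcond=None)

    cov = np.linalg.inv(A.T @ A) * (1e-4) ** 2            # parameter covariance
    print(f"f0 = {f0_fit:.10f} Hz   +/- {np.sqrt(cov[0, 0]):.2e}")
    print(f"f1 = {f1_fit:.3e} s^-2 +/- {np.sqrt(cov[1, 1]):.2e}")
    ```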

  17. Memory color of natural familiar objects: effects of surface texture and 3-D shape.

    PubMed

    Vurro, Milena; Ling, Yazhu; Hurlbert, Anya C

    2013-06-28

    Natural objects typically possess characteristic contours, chromatic surface textures, and three-dimensional shapes. These diagnostic features aid object recognition, as does memory color, the color most associated in memory with a particular object. Here we aim to determine whether polychromatic surface texture, 3-D shape, and contour diagnosticity improve memory color for familiar objects, separately and in combination. We use solid three-dimensional familiar objects rendered with their natural texture, which participants adjust in real time to match their memory color for the object. We analyze the mean, accuracy, and precision of the memory color settings relative to the natural color of the objects under the same conditions. We find that in all conditions, memory colors deviate slightly but significantly in the same direction from the natural color. Surface polychromaticity, shape diagnosticity, and three-dimensionality each improve memory color accuracy, relative to uniformly colored, generic, or two-dimensional shapes, respectively. Shape diagnosticity also improves the precision of memory color, and there is a trend for polychromaticity to do so as well. Unlike other studies, we find that the object contour alone also improves memory color. Thus, enhancing the naturalness of the stimulus, in terms of either surface or shape properties, enhances the accuracy and precision of memory color. The results support the hypothesis that memory color representations are polychromatic and are synergistically linked with diagnostic shape representations.

  18. Real-time GPS seismology using a single receiver: method comparison, error analysis and precision validation

    NASA Astrophysics Data System (ADS)

    Li, Xingxing

    2014-05-01

    Earthquake monitoring and early warning systems for hazard assessment and mitigation have traditionally been based on seismic instruments. However, for large seismic events, it is difficult for traditional seismic instruments to produce accurate and reliable displacements because of the saturation of broadband seismometers and the problematic integration of strong-motion data. Compared with traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making it particularly valuable in the case of large earthquakes and tsunamis. The GPS relative positioning approach is usually adopted to estimate seismic displacements, since centimeter-level accuracy can be achieved in real time by processing double-differenced carrier-phase observables. However, the relative positioning method requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Meanwhile, the relative/network approach is time-consuming, making the simultaneous and real-time analysis of GPS data from hundreds or thousands of ground stations particularly difficult. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference station problem of the relative positioning approach, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP), which relies on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs, with displacements then obtained by a single integration of the delta positions. This approach does not suffer from a convergence process, but the single integration from delta positions to
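
    A toy version of the variometric idea makes the drift issue concrete: noisy epoch-to-epoch delta positions are integrated once into displacement, and any small bias in the deltas accumulates linearly. The bias and noise figures below are invented for illustration.

    ```python
    # Sketch: integrate noisy, slightly biased delta positions into displacement
    # and measure the accumulated drift against the true motion.
    import numpy as np

    rng = np.random.default_rng(3)
    dt, n = 1.0, 300                               # 1 Hz, 5 minutes
    t = np.arange(n) * dt
    true_disp = 0.05 * np.sin(2 * np.pi * 0.02 * t) * (t > 100)   # coseismic wave

    true_delta = np.diff(true_disp, prepend=0.0)
    meas_delta = true_delta + rng.normal(2e-5, 2e-4, n)  # noise + 0.02 mm/epoch bias

    disp = np.cumsum(meas_delta)                   # single integration
    drift = disp - true_disp
    print(f"drift after {n} epochs: {drift[-1] * 1000:.2f} mm")
    ```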

  19. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
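
    In the same spirit (though with synthetic features standing in for object-code statistics), one can track classification accuracy as the number of retained principal components grows; the dataset parameters below are arbitrary.

    ```python
    # Sketch: PCA dimension reduction followed by a simple classifier, with
    # accuracy reported as the number of retained dimensions increases.
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=600, n_features=60, n_informative=8,
                               random_state=0)

    for k in (2, 4, 8, 16, 32):
        model = PCA(n_components=k).fit(X)
        acc = cross_val_score(LogisticRegression(max_iter=1000),
                              model.transform(X), y, cv=5).mean()
        print(f"{k:2d} principal components: accuracy = {acc:.3f}")
    ```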

  20. A Retrospective Analysis of Precision Medicine Outcomes in Patients With Advanced Cancer Reveals Improved Progression-Free Survival Without Increased Health Care Costs.

    PubMed

    Haslem, Derrick S; Van Norman, S Burke; Fulde, Gail; Knighton, Andrew J; Belnap, Tom; Butler, Allison M; Rhagunath, Sharanya; Newman, David; Gilbert, Heather; Tudor, Brian P; Lin, Karen; Stone, Gary R; Loughmiller, David L; Mishra, Pravin J; Srivastava, Rajendu; Ford, James M; Nadauld, Lincoln D

    2017-02-01

    The advent of genomic diagnostic technologies such as next-generation sequencing has recently enabled the use of genomic information to guide targeted treatment in patients with cancer, an approach known as precision medicine. However, clinical outcomes, including survival and the cost of health care associated with precision cancer medicine, have been challenging to measure and remain largely unreported. We conducted a matched cohort study of 72 patients with metastatic cancer of diverse subtypes in the setting of a large, integrated health care delivery system. We analyzed the outcomes of 36 patients who received genomic testing and targeted therapy (precision cancer medicine) between July 1, 2013, and January 31, 2015, compared with 36 historical control patients who received standard chemotherapy (n = 29) or best supportive care (n = 7). The average progression-free survival was 22.9 weeks for the precision medicine group and 12.0 weeks for the control group ( P = .002) with a hazard ratio of 0.47 (95% CI, 0.29 to 0.75) when matching on age, sex, histologic diagnosis, and previous lines of treatment. In a subset analysis of patients who received all care within the Intermountain Healthcare system (n = 44), per patient charges per week were $4,665 in the precision treatment group and $5,000 in the control group ( P = .126). These findings suggest that precision cancer medicine may improve survival for patients with refractory cancer without increasing health care costs. Although the results of this study warrant further validation, this precision medicine approach may be a viable option for patients with advanced cancer.

  1. Panel 3: Genetics and Precision Medicine of Otitis Media.

    PubMed

    Lin, Jizhen; Hafrén, Hena; Kerschner, Joseph; Li, Jian-Dong; Brown, Steve; Zheng, Qing Y; Preciado, Diego; Nakamura, Yoshihisa; Huang, Qiuhong; Zhang, Yan

    2017-04-01

    Objective The objective is to perform a comprehensive review of the literature up to 2015 on the genetics and precision medicine relevant to otitis media. Data Sources PubMed database of the National Library of Medicine. Review Methods Two subpanels were formed comprising experts in the genetics and precision medicine of otitis media. Each of the panels reviewed the literature in their respective fields and wrote draft reviews. The reviews were shared with all panel members, and a merged draft was created. The entire panel met at the 18th International Symposium on Recent Advances in Otitis Media in June 2015, discussed the review, and refined the content. A final draft was made, circulated, and approved by the panel members. Conclusion Many genes relevant to otitis media have been identified in the last 4 years, advancing our knowledge of the predisposition of the middle ear mucosa to commensals and pathogens. Advances include mutant animal models and clinical studies. Many signaling pathways are involved in the predisposition to otitis media. Implications for Practice New knowledge of the genetic background relevant to otitis media forms a basis for novel potential interventions, including potential new ways to treat otitis media.

  2. Tunable laser techniques for improving the precision of observational astronomy

    NASA Astrophysics Data System (ADS)

    Cramer, Claire E.; Brown, Steven W.; Lykke, Keith R.; Woodward, John T.; Bailey, Stephen; Schlegel, David J.; Bolton, Adam S.; Brownstein, Joel; Doherty, Peter E.; Stubbs, Christopher W.; Vaz, Amali; Szentgyorgyi, Andrew

    2012-09-01

    Improving the precision of observational astronomy requires not only new telescopes and instrumentation, but also advances in observing protocols, calibrations and data analysis. The Laser Applications Group at the National Institute of Standards and Technology in Gaithersburg, Maryland has been applying advances in detector metrology and tunable laser calibrations to problems in astronomy since 2007. Using similar measurement techniques, we have addressed a number of seemingly disparate issues: precision flux calibration for broad-band imaging, precision wavelength calibration for high-resolution spectroscopy, and precision PSF mapping for fiber spectrographs of any resolution. In each case, we rely on robust, commercially-available laboratory technology that is readily adapted to use at an observatory. In this paper, we give an overview of these techniques.

  3. Precision and repeatability of the Optotrak 3020 motion measurement system.

    PubMed

    States, R A; Pappas, E

    2006-01-01

    Several motion analysis systems are used by researchers to quantify human motion and to perform accurate surgical procedures. The Optotrak 3020 is one of these systems, and despite its widespread use there is no published information on its precision and repeatability. We used a repeated-measures design to evaluate the precision and repeatability of the Optotrak 3020 by measuring distance and angle in three sessions, at four distances, and under three conditions (motion, static vertical, and static tilted). Precision and repeatability were found to be excellent for both angle and distance, although they decreased with increasing distance from the sensors and with tilt from the plane of the sensors. Motion did not have a significant effect on the precision of the measurements. In conclusion, the measurement error of the Optotrak is minimal. Further studies are needed to evaluate its precision and repeatability under human motion conditions.

  4. Human genomics projects and precision medicine.

    PubMed

    Carrasco-Ramiro, F; Peiró-Pastor, R; Aguado, B

    2017-09-01

    The completion of the Human Genome Project (HGP) in 2001 opened the floodgates to a deeper understanding of medicine. Dozens of HGP-like projects, involving from a few tens to several million genomes, are currently in progress, ranging from specialized goals to more general approaches. However, data generation, storage, management, and analysis in public and private cloud computing platforms have raised concerns about privacy and security. The knowledge gained from further research has changed the field of genomics and is now slowly permeating into clinical medicine. The new precision (personalized) medicine, where genome sequencing and data analysis are essential components, allows tailored diagnosis and treatment according to information from the patient's own genome and specific environmental factors. P4 (predictive, preventive, personalized and participatory) medicine is introducing new concepts, challenges, and opportunities. This review summarizes current sequencing technologies, concentrates on ongoing human genomics projects, and provides some examples in which precision medicine has already demonstrated clinical impact in diagnosis and/or treatment.

  5. Precision Mass Property Measurements Using a Five-Wire Torsion Pendulum

    NASA Technical Reports Server (NTRS)

    Swank, Aaron J.

    2012-01-01

    A method for measuring the moment of inertia of an object using a five-wire torsion pendulum design is described here. Typical moment of inertia measurement devices are capable of 1 part in 10^3 accuracy, and current state-of-the-art techniques have capabilities of about one part in 10^4. The five-wire apparatus design shows the prospect of improving on the current state of the art. Current measurements using a laboratory prototype indicate a moment of inertia measurement precision better than a part in 10^4. In addition, the apparatus is shown to be capable of measuring the mass center offset from the geometric center. Typical mass center measurement devices exhibit a measurement precision up to approximately 1 micrometer. Although the five-wire pendulum was not originally designed for mass center measurements, preliminary results indicate an apparatus with a similar design may have the potential of achieving state-of-the-art precision.
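
    The basic torsion-pendulum relation behind such measurements is T = 2*pi*sqrt(I/kappa), so two calibration periods (platform alone, then platform plus a known reference inertia) determine the torsional stiffness kappa, and an unknown inertia follows from a third period. The periods and reference inertia below are synthetic; the five-wire design refines this principle rather than changing it.

    ```python
    # Sketch: moment of inertia from torsion-pendulum periods, using a
    # reference object of known inertia to calibrate the stiffness.
    import numpy as np

    def inertia_from_periods(T_empty, T_ref, T_unknown, I_ref):
        # kappa from the known added inertia: T^2 = 4*pi^2 * I / kappa
        kappa = 4 * np.pi**2 * I_ref / (T_ref**2 - T_empty**2)   # N*m/rad
        return kappa * (T_unknown**2 - T_empty**2) / (4 * np.pi**2)

    # synthetic periods (s): platform alone, + reference, + test object
    I_test = inertia_from_periods(T_empty=12.000, T_ref=16.971,
                                  T_unknown=14.697, I_ref=1.0e-2)
    print(f"test object inertia: {I_test:.4e} kg m^2")
    ```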

  6. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    NASA Astrophysics Data System (ADS)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    Acquiring attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, by means of ground-based optical observations is a challenge for space surveillance. In this paper, a useful method is proposed to estimate the space object (SO) attitude state based on simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics are useful for attitude inversion in a unique way. Thus, a new idea is provided for space object identification.
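
    As a rough illustration of the facet-model light curves described above, the sketch below computes the brightness of a hypothetical box-shaped object under a Lambertian BRDF while sweeping the observer direction; the paper's actual shape model and BRDF parameters are not reproduced here:

    ```python
    import numpy as np

    def lambertian_brightness(normals, areas, albedo, sun_dir, obs_dir):
        """Total reflected flux from a faceted object model under a
        Lambertian BRDF: each facet contributes only when it faces both
        the Sun and the observer."""
        mu_sun = normals @ sun_dir
        mu_obs = normals @ obs_dir
        lit = (mu_sun > 0) & (mu_obs > 0)
        return np.sum(albedo / np.pi * areas[lit] * mu_sun[lit] * mu_obs[lit])

    # Hypothetical box-shaped object: six facet normals and areas.
    normals = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                        [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
    areas = np.array([1.0, 1.0, 1.0, 1.0, 2.0, 2.0])
    sun = np.array([0.0, 0.0, 1.0])

    # Sweep an attitude angle and record the light curve, mirroring the
    # simulated photometric curves analyzed in the paper.
    for theta in np.linspace(0, np.pi / 2, 5):
        obs = np.array([np.sin(theta), 0.0, np.cos(theta)])
        b = lambertian_brightness(normals, areas, 0.3, sun, obs)
        print(f"theta = {np.degrees(theta):5.1f} deg -> brightness {b:.4f}")
    ```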

  7. Sleep deprivation accelerates delay-related loss of visual short-term memories without affecting precision.

    PubMed

    Wee, Natalie; Asplund, Christopher L; Chee, Michael W L

    2013-06-01

    Visual short-term memory (VSTM) is an important measure of information processing capacity and supports many higher-order cognitive processes. We examined how sleep deprivation (SD) and maintenance duration interact to influence the number and precision of items in VSTM using an experimental design that limits the contribution of lapses at encoding. For each trial, participants attempted to maintain the location and color of three stimuli over a delay. After a retention interval of either 1 or 10 seconds, participants reported the color of the item at the cued location by selecting it on a color wheel. The probability of reporting the probed item, the precision of report, and the probability of reporting a nonprobed item were determined using a mixture-modeling analysis. Participants were studied twice in counterbalanced order, once after a night of normal sleep and once following a night of sleep deprivation. Sleep laboratory. Nineteen healthy college age volunteers (seven females) with regular sleep patterns. Approximately 24 hours of total SD. SD selectively reduced the number of integrated representations that can be retrieved after a delay, while leaving the precision of object information in the stored representations intact. Delay interacted with SD to lower the rate of successful recall. Visual short-term memory is compromised during sleep deprivation, an effect compounded by delay. However, when memories are retrieved, they tend to be intact.
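
    The paper's mixture-modeling analysis additionally includes a nonprobed-item (swap) component; a minimal two-component sketch of the same idea, with simulated response errors and assumed starting values, separates the guess rate from report precision like so:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import vonmises

    def neg_log_likelihood(params, errors):
        """Two-component mixture: target reports (von Mises around 0)
        plus uniform guesses. `errors` are response errors in radians."""
        g, kappa = params
        p = (1 - g) * vonmises.pdf(errors, kappa) + g / (2 * np.pi)
        return -np.sum(np.log(p))

    # Hypothetical response errors (radians in [-pi, pi]).
    rng = np.random.default_rng(0)
    errors = np.concatenate([
        vonmises.rvs(8.0, size=140, random_state=rng),  # remembered items
        rng.uniform(-np.pi, np.pi, size=60),            # guesses
    ])

    res = minimize(neg_log_likelihood, x0=[0.3, 4.0], args=(errors,),
                   bounds=[(1e-3, 1 - 1e-3), (0.1, 100.0)])
    g_hat, kappa_hat = res.x
    print(f"guess rate: {g_hat:.2f}, precision (kappa): {kappa_hat:.2f}")
    ```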

  8. Precisely and Accurately Inferring Single-Molecule Rate Constants

    PubMed Central

    Kinz-Thompson, Colin D.; Bailey, Nevette A.; Gonzalez, Ruben L.

    2017-01-01

    The kinetics of biomolecular systems can be quantified by calculating the stochastic rate constants that govern the biomolecular state versus time trajectories (i.e., state trajectories) of individual biomolecules. To do so, the experimental signal versus time trajectories (i.e., signal trajectories) obtained from observing individual biomolecules are often idealized to generate state trajectories by methods such as thresholding or hidden Markov modeling. Here, we discuss approaches for idealizing signal trajectories and calculating stochastic rate constants from the resulting state trajectories. Importantly, we provide an analysis of how the finite length of signal trajectories restricts the precision of these approaches, and demonstrate how Bayesian inference-based versions of these approaches allow rigorous determination of this precision. Similarly, we provide an analysis of how the finite lengths and limited time resolutions of signal trajectories restrict the accuracy of these approaches, and describe methods that, by accounting for the effects of the finite length and limited time resolution of signal trajectories, substantially improve this accuracy. Collectively, therefore, the methods we consider here enable a rigorous assessment of the precision, and a significant enhancement of the accuracy, with which stochastic rate constants can be calculated from single-molecule signal trajectories. PMID:27793280
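
    A toy sketch of the idealization-then-rate-estimation pipeline discussed above, using simple thresholding and simulated data (the 1/sqrt(N) precision comment reflects the generic behavior of a maximum-likelihood exponential rate, not the paper's Bayesian treatment):

    ```python
    import numpy as np

    def idealize(signal, threshold):
        """Threshold a signal trajectory into a two-state (0/1) state trajectory."""
        return (signal > threshold).astype(int)

    def dwell_times(states, state):
        """Lengths (in frames) of uninterrupted visits to `state`."""
        dwells, run = [], 0
        for s in states:
            if s == state:
                run += 1
            elif run:
                dwells.append(run)
                run = 0
        if run:
            dwells.append(run)
        return np.array(dwells)

    # Simulated two-state trajectory: alternating 0/1 blocks, 10 ms frames.
    rng = np.random.default_rng(1)
    frame_s = 0.010
    means = np.repeat(np.tile([0.0, 1.0], 10), 50)      # 20 dwells of 0.5 s
    signal = rng.normal(loc=means, scale=0.15)

    states = idealize(signal, threshold=0.5)
    dwells = dwell_times(states, state=0) * frame_s

    # Rate constant for leaving state 0. With N observed dwells, the relative
    # standard error of an exponential rate estimate is roughly 1/sqrt(N) --
    # this is how the finite trajectory length limits precision.
    k = 1.0 / dwells.mean()
    rel_err = 1.0 / np.sqrt(len(dwells))
    print(f"k = {k:.2f} 1/s (+/- {100 * rel_err:.0f}% relative)")
    ```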

  9. Empirical analysis of web-based user-object bipartite networks

    NASA Astrophysics Data System (ADS)

    Shang, Ming-Sheng; Lü, Linyuan; Zhang, Yi-Cheng; Zhou, Tao

    2010-05-01

    Understanding the structure and evolution of web-based user-object networks is a significant task since they play a crucial role in e-commerce nowadays. This letter reports an empirical analysis of two large-scale web sites, audioscrobbler.com and del.icio.us, where users are connected with music groups and bookmarks, respectively. The degree distributions and degree-degree correlations for both users and objects are reported. We propose a new index, named collaborative similarity, to quantify the diversity of tastes based on collaborative selection. Accordingly, the correlation between degree and selection diversity is investigated. We report some novel phenomena that well characterize the selection mechanism of web users and outline the relevance of these phenomena to the information recommendation problem.
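
    The letter's exact definition of collaborative similarity is not given in the abstract; one plausible formalization, averaging the cosine similarity between the co-selection vectors of the objects a user picked, is sketched below on a toy adjacency matrix:

    ```python
    import numpy as np
    from itertools import combinations

    # Hypothetical user-object adjacency matrix (rows: users, cols: objects).
    A = np.array([
        [1, 1, 0, 1, 0],
        [0, 1, 1, 0, 0],
        [1, 0, 1, 1, 1],
        [0, 0, 0, 1, 1],
    ])

    user_degree = A.sum(axis=1)

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    def collaborative_similarity(A, user):
        """Mean pairwise cosine similarity between the objects a user selected,
        each object represented by its column of selections."""
        picked = np.flatnonzero(A[user])
        if len(picked) < 2:
            return 0.0
        sims = [cosine(A[:, i], A[:, j]) for i, j in combinations(picked, 2)]
        return float(np.mean(sims))

    for u in range(A.shape[0]):
        print(f"user {u}: degree {user_degree[u]}, "
              f"collaborative similarity {collaborative_similarity(A, u):.3f}")
    ```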

  10. Power and Precision in Confirmatory Factor Analytic Tests of Measurement Invariance

    ERIC Educational Resources Information Center

    Meade, Adam W.; Bauer, Daniel J.

    2007-01-01

    This study investigates the effects of sample size, factor overdetermination, and communality on the precision of factor loading estimates and the power of the likelihood ratio test of factorial invariance in multigroup confirmatory factor analysis. Although sample sizes are typically thought to be the primary determinant of precision and power,…

  11. Time Delay Embedding Increases Estimation Precision of Models of Intraindividual Variability

    ERIC Educational Resources Information Center

    von Oertzen, Timo; Boker, Steven M.

    2010-01-01

    This paper investigates the precision of parameters estimated from local samples of time dependent functions. We find that "time delay embedding," i.e., structuring data prior to analysis by constructing a data matrix of overlapping samples, increases the precision of parameter estimates and in turn statistical power compared to standard…
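
    Constructing the data matrix of overlapping samples is straightforward; a minimal sketch (the dimension and lag values are arbitrary):

    ```python
    import numpy as np

    def time_delay_embed(x, dim, lag=1):
        """Stack overlapping delayed copies of a series into a data matrix:
        row i is [x[i], x[i+lag], ..., x[i+(dim-1)*lag]]."""
        n = len(x) - (dim - 1) * lag
        return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

    x = np.sin(np.linspace(0, 8 * np.pi, 200))   # toy time-dependent signal
    X = time_delay_embed(x, dim=5, lag=2)
    print(X.shape)   # (192, 5): many overlapping samples from one series
    ```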

  12. High-precision measurement of chlorine stable isotope ratios

    USGS Publications Warehouse

    Long, A.; Eastoe, C.J.; Kaufmann, R.S.; Martin, J.G.; Wirt, L.; Finley, J.B.

    1993-01-01

    We present an analysis procedure that allows stable isotopes of chlorine to be analyzed with precision sufficient for geological and hydrological studies. The total analytical precision is ±0.09‰, and the present known range of chloride in the surface and near-surface environment is 3.5‰. As Cl- is essentially nonreactive in natural aquatic environments, it is a conservative tracer and its δ37Cl is also conservative. Thus, the δ37Cl parameter is valuable for quantitative evaluation of mixing of different sources of chloride in brines and aquifers. © 1993.

  13. Analysis of Hand and Wrist Postural Synergies in Tolerance Grasping of Various Objects

    PubMed Central

    Liu, Yuan; Jiang, Li; Yang, Dapeng; Liu, Hong

    2016-01-01

    Humans can successfully grasp various objects in different acceptable relative positions between the hand and the object. This functionality can be described as the grasp tolerance of the human hand, a significant feature of human grasping. To understand the motor control of the human hand completely, an analysis of hand and wrist postural synergies in tolerance grasping of various objects is needed. Ten healthy right-handed subjects were asked to perform tolerance grasping with the right hand using 6 objects of different shapes, sizes, and relative positions between hand and object. Subjects wore a CyberGlove with a motion tracker attached to the right hand, allowing measurement of hand and wrist postures. Correlation analysis of joints and inter-joint/inter-finger modules was carried out to explore the coordination between joints or modules. As the correlation between the hand and wrist modules is not obvious in tolerance grasping, individual analysis of wrist synergies is more practical. Postural synergies of the hand and wrist were therefore presented separately through principal component analysis (PCA), expressed through the principal component (PC) information transmitted ratio, the PC elements distribution, and the reconstructed angle error of joints. Results on the correlation comparison of different module movements can be well explained by the influence factors of joint movement correlation. Moreover, correlation analysis of joints and modules showed that the wrist module had the lowest correlation among all inter-finger and inter-joint modules. Hand and wrist postures were both sufficiently described by a few principal components. In terms of the PC elements distribution of hand postures, compared with previous investigations, there was a greater proportion of movement in the thumb joints, especially the interphalangeal (IP) and opposition rotation (ROT) joints. The research could contribute to a complete understanding of hand grasp, and the design
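
    A compact sketch of the PCA step on a hypothetical joint-angle matrix, including the variance-explained and reconstruction-error summaries analogous to those reported (all data below are synthetic):

    ```python
    import numpy as np

    # Hypothetical joint-angle matrix: rows are grasp trials, columns are
    # hand/wrist joint angles (degrees).
    rng = np.random.default_rng(2)
    latent = rng.normal(size=(100, 3))                # a few hidden synergies
    mixing = rng.normal(size=(3, 20))                 # 20 recorded joints
    angles = latent @ mixing + 0.1 * rng.normal(size=(100, 20))

    # PCA via SVD of the mean-centered data.
    centered = angles - angles.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    var_explained = s**2 / np.sum(s**2)
    print("variance explained by first 3 PCs:", var_explained[:3].round(3))

    # Reconstruction error when keeping k synergies, analogous to the
    # reconstructed joint-angle error reported in the paper.
    k = 3
    recon = U[:, :k] * s[:k] @ Vt[:k] + angles.mean(axis=0)
    rmse = np.sqrt(np.mean((angles - recon) ** 2))
    print(f"RMS reconstruction error with {k} PCs: {rmse:.3f} deg")
    ```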

  14. High precision spectroscopy and imaging in THz frequency range

    NASA Astrophysics Data System (ADS)

    Vaks, Vladimir L.

    2014-03-01

    Application of microwave methods to the development of the THz frequency range has resulted in high precision THz spectrometers based on nonstationary effects. The spectrometers' characteristics (spectral resolution and sensitivity) meet the requirements for high precision analysis. Gas analyzers based on these high precision spectrometers have been successfully applied to analytical investigations of gas impurities in high-purity substances. These investigations can be carried out both in an absorption cell and in a reactor. The devices can be used for ecological monitoring and for detecting components of chemical weapons and explosives in the atmosphere. Another promising field for THz investigations is medical application: using the THz spectrometers developed, one can detect markers of some diseases in exhaled air.

  15. Precision lens assembly with alignment turning system

    NASA Astrophysics Data System (ADS)

    Ho, Cheng-Fang; Huang, Chien-Yao; Lin, Yi-Hao; Kuo, Hui-Jean; Kuo, Ching-Hsiang; Hsu, Wei-Yao; Chen, Fong-Zhi

    2017-10-01

    The poker chip assembly with high precision lens barrels is widely applied to ultra-high performance optical systems. ITRC applies poker chip assembly technology to high numerical aperture objective lenses and lithography projection lenses because of its highly efficient assembly process. In order to achieve high precision lens cells for poker chip assembly, an alignment turning system (ATS) was developed. The ATS includes measurement, alignment, and turning modules. The measurement module is equipped with a non-contact displacement sensor (NCDS) and an autocollimator (ACM). The NCDS and ACM measure the centration errors of the top and bottom surfaces of a lens, respectively; the required adjustments in displacement and tilt with respect to the rotational axis of the turning machine can then be determined for the alignment module. After the measurement, alignment, and turning processes on the ATS, the centration error of a lens cell 200 mm in diameter can be controlled within 10 arcsec. Furthermore, a poker chip assembly lens cell with three sub-cells is demonstrated; each sub-cell was measured and finished with the alignment and turning processes. The lens assembly was tested five times by each of three technicians; the average transmission centration error of the assembled lens is 12.45 arcsec. The results show that the ATS can achieve high assembly efficiency for precision optical systems.

  16. A list of some bright objects which S-052 can observe

    NASA Technical Reports Server (NTRS)

    Mcquire, J. P.

    1972-01-01

    In order to find out the precise orientation of the photographs obtained by the High Altitude Observatory's ATM white light coronagraph, celestial objects must appear on each roll of film. A list of such bright objects and the times during which they can be observed is presented.

  17. [Assessment of precision and accuracy of digital surface photogrammetry with the DSP 400 system].

    PubMed

    Krimmel, M; Kluba, S; Dietz, K; Reinert, S

    2005-03-01

    The objective of the present study was to evaluate the precision and accuracy of facial anthropometric measurements obtained through digital 3-D surface photogrammetry with the DSP 400 system in comparison to traditional 2-D photogrammetry. Fifty plaster casts of cleft infants were imaged and 21 standard anthropometric measurements were obtained. For precision assessment the measurements were performed twice in a subsample. Accuracy was determined by comparison of direct measurements and indirect 2-D and 3-D image measurements. Precision of digital surface photogrammetry was almost as good as direct anthropometry and clearly better than 2-D photogrammetry. Measurements derived from 3-D images showed better congruence to direct measurements than from 2-D photos. Digital surface photogrammetry with the DSP 400 system is sufficiently precise and accurate for craniofacial anthropometric examinations.

  18. A Geometric Analysis to Protect Manned Assets from Newly Launched Objects - Cola Gap Analysis

    NASA Technical Reports Server (NTRS)

    Hametz, Mark E.; Beaver, Brian A.

    2013-01-01

    A safety risk was identified for the International Space Station (ISS) by The Aerospace Corporation, where the ISS would be unable to react to a conjunction with a newly launched object following the end of the launch Collision Avoidance (COLA) process. Once an object is launched, there is a finite period of time required to track, catalog, and evaluate that new object as part of standard on-orbit COLA screening processes. Additionally, should a conjunction be identified, there is an additional period of time required to plan and execute a collision avoidance maneuver. While the computed prelaunch probability of collision with any object is extremely low, NASA/JSC has requested that all US launches take additional steps to protect the ISS during this "COLA gap" period. This paper details a geometric-based COLA gap analysis method developed by the NASA Launch Services Program to determine if launch window cutouts are required to mitigate this risk. Additionally, this paper presents the results of several missions where this process has been used operationally.

  19. Laser-induced breakdown spectroscopy (LIBS) analysis of calcium ions dissolved in water using filter paper substrates: an ideal internal standard for precision improvement.

    PubMed

    Choi, Daewoong; Gong, Yongdeuk; Nam, Sang-Ho; Han, Song-Hee; Yoo, Jonghyun; Lee, Yonghoon

    2014-01-01

    We report an approach for selecting an internal standard to improve the precision of laser-induced breakdown spectroscopy (LIBS) analysis for determining calcium (Ca) concentration in water. The dissolved Ca(2+) ions were pre-concentrated on filter paper by evaporating water. The filter paper was dried and analyzed using LIBS. By adding strontium chloride to sample solutions and using a Sr II line at 407.771 nm for the intensity normalization of Ca II lines at 393.366 or 396.847 nm, the analysis precision could be significantly improved. The Ca II and Sr II line intensities were mapped across the filter paper, and they showed a strong positive shot-to-shot correlation with the same spatial distribution on the filter paper surface. We applied this analysis approach to the measurement of Ca(2+) in tap, bottled, and ground water samples. The Ca(2+) concentrations determined using LIBS are in good agreement with those obtained from flame atomic absorption spectrometry. Finally, we suggest a homologous relation among the strongest emission lines of period 4 and 5 elements in groups IA and IIA based on their similar electronic structures. Our results indicate that LIBS can be effectively applied to liquid analysis at the sub-parts per million level with high precision, by simply drying liquid solutions on filter paper and using internal standard elements whose valence electronic structure is similar to that of the analytes of interest.
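
    The precision gain from internal standardization can be illustrated with synthetic shot-to-shot data in which a common ablation factor drives both lines, mimicking the strong Ca II/Sr II correlation reported above (all numbers are assumptions):

    ```python
    import numpy as np

    # Hypothetical shot-to-shot line intensities from a dried filter-paper spot.
    rng = np.random.default_rng(3)
    ablation = rng.uniform(0.5, 1.5, size=200)            # shared shot-to-shot factor
    I_ca = 1000 * ablation * rng.normal(1.0, 0.02, 200)   # Ca II 393.366 nm
    I_sr = 400 * ablation * rng.normal(1.0, 0.02, 200)    # Sr II 407.771 nm

    rsd = lambda x: 100 * np.std(x) / np.mean(x)

    # Normalizing by the internal standard cancels the correlated ablation
    # fluctuations, which is the precision improvement described above.
    print(f"RSD of raw Ca signal: {rsd(I_ca):.1f} %")
    print(f"RSD of Ca/Sr ratio:   {rsd(I_ca / I_sr):.1f} %")
    ```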

  20. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial one. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
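
    A rough sketch of DE-based wavelength selection in the spirit described above: a continuous score vector is evolved and thresholded into a wavelength mask, with the calibration residual as fitness (spectra are synthetic; the paper's exact encoding and fitness are not reproduced, and a real pipeline would cross-validate):

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical THz absorbance spectra for binary mixtures: y is the
    # concentration of one component, X holds spectra (rows: samples).
    rng = np.random.default_rng(4)
    n_samples, n_wavelengths = 40, 60
    y = rng.uniform(0, 1, n_samples)
    informative = np.zeros(n_wavelengths)
    informative[[10, 25, 40]] = 1.0                   # useful absorption bands
    X = np.outer(y, informative) + 0.3 * rng.normal(size=(n_samples, n_wavelengths))

    def calibration_error(mask_scores):
        """Fitness: regression RMSE using only wavelengths whose score > 0.5."""
        picked = mask_scores > 0.5
        if not picked.any():
            return 1e6
        coef, *_ = np.linalg.lstsq(X[:, picked], y, rcond=None)
        resid = y - X[:, picked] @ coef
        return float(np.sqrt(np.mean(resid**2)))

    result = differential_evolution(calibration_error,
                                    bounds=[(0, 1)] * n_wavelengths,
                                    seed=0, maxiter=50, tol=1e-6)
    print("selected wavelength indices:", np.flatnonzero(result.x > 0.5))
    ```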

  1. Moving the Weber Fraction: The Perceptual Precision for Moment of Inertia Increases with Exploration Force

    PubMed Central

    Debats, Nienke B.; Kingma, Idsart; Beek, Peter J.; Smeets, Jeroen B. J.

    2012-01-01

    How does the magnitude of the exploration force influence the precision of haptic perceptual estimates? To address this question, we examined the perceptual precision for moment of inertia (i.e., an object's “angular mass”) under different force conditions, using the Weber fraction to quantify perceptual precision. Participants rotated a rod around a fixed axis and judged its moment of inertia in a two-alternative forced-choice task. We instructed different levels of exploration force, thereby manipulating the magnitude of both the exploration force and the angular acceleration. These are the two signals that are needed by the nervous system to estimate moment of inertia. Importantly, one can assume that the absolute noise on both signals increases with an increase in the signals' magnitudes, while the relative noise (i.e., noise/signal) decreases with an increase in signal magnitude. We examined how the perceptual precision for moment of inertia was affected by this neural noise. In a first experiment we found that a low exploration force caused a higher Weber fraction (22%) than a high exploration force (13%), which suggested that the perceptual precision was constrained by the relative noise. This hypothesis was supported by the result of a second experiment, in which we found that the relationship between exploration force and Weber fraction had a similar shape as the theoretical relationship between signal magnitude and relative noise. The present study thus demonstrated that the amount of force used to explore an object can profoundly influence the precision by which its properties are perceived. PMID:23028437
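
    One common way to obtain a Weber fraction from such a two-alternative forced-choice task is to fit a cumulative-Gaussian psychometric function and divide the discrimination threshold by the reference magnitude; a sketch with made-up response proportions (the paper's fitting procedure may differ):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Hypothetical 2AFC data: comparison moments of inertia (relative to the
    # reference) and the proportion of "comparison felt heavier" responses.
    ratio = np.array([0.80, 0.90, 0.95, 1.00, 1.05, 1.10, 1.20])
    p_heavier = np.array([0.05, 0.20, 0.35, 0.50, 0.68, 0.82, 0.97])

    def psychometric(x, mu, sigma):
        return norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(psychometric, ratio, p_heavier, p0=[1.0, 0.1])

    # A common convention: the Weber fraction is the just-noticeable
    # difference (the 50%-to-84% span, i.e. sigma) over the reference.
    print(f"Weber fraction ~ {100 * sigma / mu:.1f} %")
    ```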

  2. Energy transfer mechanism and probability analysis of submarine pipe laterally impacted by dropped objects

    NASA Astrophysics Data System (ADS)

    Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui

    2016-06-01

    The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. For the purpose of studying the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impact on pipes is presented through statistical analysis of experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown to be the major influence factors on energy transfer by sensitivity analysis of the finite element simulation.

  3. N-of-1-pathways MixEnrich: advancing precision medicine via single-subject analysis in discovering dynamic changes of transcriptomes.

    PubMed

    Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A

    2017-05-24

    Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with up- and down-regulated genes (bidirectional dysregulation) that are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectional and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in a pathway or of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and in the HNSCC data analysis (ROC curves; higher true positive rates; lower false positive rates). Bidirectional and concordant dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods to meet the promise of providing accurate personal transcriptome analysis to support precision medicine at point of care.

  4. A Retrospective Analysis of Precision Medicine Outcomes in Patients With Advanced Cancer Reveals Improved Progression-Free Survival Without Increased Health Care Costs

    PubMed Central

    Haslem, Derrick S.; Van Norman, S. Burke; Fulde, Gail; Knighton, Andrew J.; Belnap, Tom; Butler, Allison M.; Rhagunath, Sharanya; Newman, David; Gilbert, Heather; Tudor, Brian P.; Lin, Karen; Stone, Gary R.; Loughmiller, David L.; Mishra, Pravin J.; Srivastava, Rajendu; Ford, James M.; Nadauld, Lincoln D.

    2017-01-01

    Purpose: The advent of genomic diagnostic technologies such as next-generation sequencing has recently enabled the use of genomic information to guide targeted treatment in patients with cancer, an approach known as precision medicine. However, clinical outcomes, including survival and the cost of health care associated with precision cancer medicine, have been challenging to measure and remain largely unreported. Patients and Methods: We conducted a matched cohort study of 72 patients with metastatic cancer of diverse subtypes in the setting of a large, integrated health care delivery system. We analyzed the outcomes of 36 patients who received genomic testing and targeted therapy (precision cancer medicine) between July 1, 2013, and January 31, 2015, compared with 36 historical control patients who received standard chemotherapy (n = 29) or best supportive care (n = 7). Results: The average progression-free survival was 22.9 weeks for the precision medicine group and 12.0 weeks for the control group (P = .002) with a hazard ratio of 0.47 (95% CI, 0.29 to 0.75) when matching on age, sex, histologic diagnosis, and previous lines of treatment. In a subset analysis of patients who received all care within the Intermountain Healthcare system (n = 44), per patient charges per week were $4,665 in the precision treatment group and $5,000 in the control group (P = .126). Conclusion: These findings suggest that precision cancer medicine may improve survival for patients with refractory cancer without increasing health care costs. Although the results of this study warrant further validation, this precision medicine approach may be a viable option for patients with advanced cancer. PMID:27601506

  5. Artificial intelligence, physiological genomics, and precision medicine.

    PubMed

    Williams, Anna Marie; Liu, Yong; Regner, Kevin R; Jotterand, Fabrice; Liu, Pengyuan; Liang, Mingyu

    2018-04-01

    Big data are a major driver in the development of precision medicine. Efficient analysis methods are needed to transform big data into clinically-actionable knowledge. To accomplish this, many researchers are turning toward machine learning (ML), an approach of artificial intelligence (AI) that utilizes modern algorithms to give computers the ability to learn. Much of the effort to advance ML for precision medicine has been focused on the development and implementation of algorithms and the generation of ever larger quantities of genomic sequence data and electronic health records. However, relevance and accuracy of the data are as important as quantity of data in the advancement of ML for precision medicine. For common diseases, physiological genomic readouts in disease-applicable tissues may be an effective surrogate to measure the effect of genetic and environmental factors and their interactions that underlie disease development and progression. Disease-applicable tissue may be difficult to obtain, but there are important exceptions such as kidney needle biopsy specimens. As AI continues to advance, new analytical approaches, including those that go beyond data correlation, need to be developed and ethical issues of AI need to be addressed. Physiological genomic readouts in disease-relevant tissues, combined with advanced AI, can be a powerful approach for precision medicine for common diseases.

  6. Single photon ranging system using two wavelengths laser and analysis of precision

    NASA Astrophysics Data System (ADS)

    Chen, Yunfei; He, Weiji; Miao, Zhuang; Gu, Guohua; Chen, Qian

    2013-09-01

    Laser ranging systems based on time-correlated single photon counting technology and single photon detectors feature high precision and low emitted energy. In this paper, we established a single photon laser ranging system that uses a supercontinuum laser as the light source and two wavelengths (532 nm and 830 nm) of the echo signal as the stop signal. We propose a new method capable of improving single photon ranging system performance. The method is implemented by using two single-photon detectors to receive the two different wavelength signals simultaneously. We extracted the firings of the two detectors triggered by the same laser pulse and took the mean time of the two firings as the combined detection time-of-flight. Detection by two channels at two wavelengths effectively improves the detection precision and decreases the false alarm probability. Finally, an experimental single photon ranging system was established. Through extensive experiments, we obtained the system precision using both single- and two-wavelength detection and verified the effectiveness of the method.
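
    The variance benefit of averaging the two channels' time-of-flight is the usual 1/sqrt(2) for independent timing jitter; a toy sketch with an assumed jitter and target range:

    ```python
    import numpy as np

    C = 299_792_458.0          # speed of light, m/s

    # Hypothetical single-photon time-of-flight records (seconds) from the
    # 532 nm and 830 nm channels, triggered by the same laser pulses.
    rng = np.random.default_rng(5)
    true_tof = 2 * 1500.0 / C                      # 1.5 km target
    jitter = 150e-12                               # per-channel timing jitter
    tof_532 = true_tof + jitter * rng.standard_normal(1000)
    tof_830 = true_tof + jitter * rng.standard_normal(1000)

    # Combined detection time: mean of the two channels per pulse. For
    # independent channels the jitter shrinks by a factor sqrt(2).
    tof_combined = 0.5 * (tof_532 + tof_830)

    to_range = lambda t: C * t / 2
    print(f"single-channel range sigma: {np.std(to_range(tof_532)) * 1e3:.1f} mm")
    print(f"two-channel range sigma:    {np.std(to_range(tof_combined)) * 1e3:.1f} mm")
    ```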

  7. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  8. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).

  9. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    PubMed Central

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has long-lasting societal impact. PMID:27740470

  10. Precision measurements with LPCTrap at GANIL

    NASA Astrophysics Data System (ADS)

    Liénard, E.; Ban, G.; Couratin, C.; Delahaye, P.; Durand, D.; Fabian, X.; Fabre, B.; Fléchard, X.; Finlay, P.; Mauger, F.; Méry, A.; Naviliat-Cuncic, O.; Pons, B.; Porobic, T.; Quéméner, G.; Severijns, N.; Thomas, J. C.; Velten, Ph.

    2015-11-01

    The experimental achievements and the results obtained so far with the LPCTrap device installed at GANIL are presented. The apparatus is dedicated to the study of the weak interaction at low energy by means of precise measurements of the β - ν angular correlation parameter in nuclear β decays. So far, the data collected with three isotopes have enabled the first determination of the charge-state distributions of the recoiling ions induced by the shakeoff process. The analysis is presently being refined to deduce the correlation parameters, with the potential of improving both the constraint deduced at low energy on exotic tensor currents (6He1+) and the precision on the Vud element of the quark-mixing matrix (35Ar1+ and 19Ne1+) deduced from the mirror transitions dataset.

  11. A Flexile and High Precision Calibration Method for Binocular Structured Light Scanning System

    PubMed Central

    Yuan, Jianying; Wang, Qiong; Li, Bailin

    2014-01-01

    3D (three-dimensional) structured light scanning systems are widely used in the fields of reverse engineering, quality inspection, and so forth. Camera calibration is the key to scanning precision. Currently, finely machined 2D (two-dimensional) or 3D calibration reference objects are usually required for high calibration precision; these are difficult to handle and costly. In this paper, a novel calibration method is proposed that uses a scale bar and some artificial coded targets placed randomly in the measuring volume. The principle of the proposed method is based on hierarchical self-calibration and bundle adjustment. We obtain initial intrinsic parameters from images. Initial extrinsic parameters in projective space are estimated with the method of factorization and then upgraded to Euclidean space using the orthogonality of the rotation matrix and the rank-3 constraint on the absolute quadric. Finally, all camera parameters are refined through bundle adjustment. Real experiments show that the proposed method is robust and has the same precision level as results obtained with a delicate artificial reference object, but the hardware cost is very low compared with current calibration methods used in 3D structured light scanning systems. PMID:25202736

  12. Object classification and outliers analysis in the forthcoming Gaia mission

    NASA Astrophysics Data System (ADS)

    Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.

    2010-12-01

    Astrophysics is evolving towards the rational optimization of costly observational material through the intelligent exploitation of large astronomical databases from both ground-based telescopes and space mission archives. However, there has been relatively little advance in the development of the highly scalable data exploitation and analysis tools needed to generate the scientific returns from these large and expensively obtained datasets. Among the upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This work reviews some of the work being developed by the Gaia Data Processing and Analysis Consortium for object classification and the analysis of outliers in the forthcoming mission.

  13. Doublet Pulse Coherent Laser Radar for Tracking of Resident Space Objects

    NASA Technical Reports Server (NTRS)

    Prasad, Narasimha S.; Rudd, Van; Shald, Scott; Sandford, Stephen; Dimarcantonio, Albert

    2014-01-01

    In this paper, the development of a long range ladar system known as ExoSPEAR at NASA Langley Research Center for tracking rapidly moving resident space objects is discussed. Based on a 100 W, nanosecond-class, near-IR laser, this ladar system with a coherent detection technique is currently being investigated for short dwell time measurements of resident space objects (RSOs) in LEO and beyond for space surveillance applications. This unique ladar architecture is configured using a continuously agile doublet-pulse waveform scheme coupled to a closed-loop tracking and control approach to simultaneously achieve mm-class range precision and mm/s velocity precision and hence obtain unprecedented track accuracies. Salient features of the design architecture followed by performance modeling and engagement simulations illustrating the dependence of range and velocity precision in LEO orbits on ladar parameters are presented. Estimated limits on detectable optical cross sections of RSOs in LEO orbits are discussed.

  14. Sensitivity analysis of multi-objective optimization of CPG parameters for quadruped robot locomotion

    NASA Astrophysics Data System (ADS)

    Oliveira, Miguel; Santos, Cristina P.; Costa, Lino

    2012-09-01

    In this paper, a study based on sensitivity analysis is performed for a gait multi-objective optimization system that combines bio-inspired Central Pattern Generators (CPGs) and a multi-objective evolutionary algorithm based on NSGA-II. In this system, CPGs are modeled as autonomous differential equations that generate the necessary limb movement to perform the required walking gait. In order to optimize the walking gait, a multi-objective problem with three conflicting objectives is formulated: maximization of the velocity, the wide stability margin, and the behavioral diversity. The experimental results highlight the effectiveness of this multi-objective approach and the importance of the objectives in finding different walking gait solutions for the quadruped robot.
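
    At the core of NSGA-II sits the Pareto dominance relation over the three objectives named above; a minimal sketch (the gait objective values are invented):

    ```python
    import numpy as np

    def dominates(a, b):
        """Pareto dominance for maximization: a dominates b if it is at
        least as good in every objective and strictly better in one."""
        return np.all(a >= b) and np.any(a > b)

    def nondominated(points):
        """Indices of the Pareto-optimal points (the first front that
        NSGA-II sorts out)."""
        return [i for i, p in enumerate(points)
                if not any(dominates(q, p)
                           for j, q in enumerate(points) if j != i)]

    # Hypothetical gait evaluations: velocity, stability margin, diversity.
    gaits = np.array([
        [0.40, 0.10, 0.70],
        [0.35, 0.20, 0.60],
        [0.30, 0.05, 0.50],   # dominated by the first gait
        [0.45, 0.08, 0.40],
    ])
    print("Pareto front indices:", nondominated(gaits))
    ```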

  15. Using object-based image analysis to guide the selection of field sample locations

    USDA-ARS?s Scientific Manuscript database

    One of the most challenging tasks for resource management and research is designing field sampling schemes to achieve unbiased estimates of ecosystem parameters as efficiently as possible. This study focused on the potential of fine-scale image objects from object-based image analysis (OBIA) to be u...

  16. A Geometric Analysis to Protect Manned Assets from Newly Launched Objects - COLA Gap Analysis

    NASA Technical Reports Server (NTRS)

    Hametz, Mark E.; Beaver, Brian A.

    2012-01-01

    A safety risk was identified for the International Space Station (ISS) by The Aerospace Corporation following the launch of GPS IIR-20 (March 24, 2009), when the spent upper stage of the launch vehicle unexpectedly crossed inside the ISS notification box shortly after launch. This event highlighted a 56-hour vulnerability period following the end of the launch Collision Avoidance (COLA) process where the ISS would be unable to react to a conjunction with a newly launched object. Current launch COLA processes screen each launched object across the launch window to determine if an object's nominal trajectory is predicted to pass within 200 km of the ISS (or any other manned/mannable object), resulting in a launch time closure. These launch COLA screens are performed from launch through separation plus 100 minutes. Once the objects are in orbit, they are cataloged and evaluated as part of routine on-orbit conjunction assessment processes. However, as the GPS IIR-20 scenario illustrated, there is a vulnerability period in the timeline between the end of launch COLA coverage and the beginning of standard on-orbit COLA assessment activities. The gap between existing launch and on-orbit COLA processes is driven by the time it takes to track and catalog a launched object, identify a conjunction, and plan and execute a collision avoidance maneuver. For the ISS, the total time required to accomplish all of these steps is 56 hours. To protect human lives, NASA/JSC has requested that all US launches take additional steps to protect the ISS during this "COLA gap" period. The uncertainty in the state of a spent upper stage can be quite large after all burns are complete and all remaining propellants are expelled to safe the stage. Simply extending the launch COLA process an additional 56 hours is not a viable option as the 3-sigma position uncertainty will far exceed the 200 km miss-distance criterion. Additionally, performing a probability of collision (Pc) analysis over this

  17. High Precision Prediction of Functional Sites in Protein Structures

    PubMed Central

    Buturovic, Ljubomir; Wong, Mike; Tang, Grace W.; Altman, Russ B.; Petkovic, Dragutin

    2014-01-01

    We address the problem of assigning biological function to solved protein structures. Computational tools play a critical role in identifying potential active sites and informing screening decisions for further lab analysis. A critical parameter in the practical application of computational methods is the precision, or positive predictive value. Precision measures the level of confidence the user should have in a particular computed functional assignment. Low precision annotations lead to futile laboratory investigations and waste scarce research resources. In this paper we describe an advanced version of the protein function annotation system FEATURE, which achieved 99% precision and average recall of 95% across 20 representative functional sites. The system uses a Support Vector Machine classifier operating on the microenvironment of physicochemical features around an amino acid. We also compared performance of our method with state-of-the-art sequence-level annotator Pfam in terms of precision, recall and localization. To our knowledge, no other functional site annotator has been rigorously evaluated against these key criteria. The software and predictive models are incorporated into the WebFEATURE service at http://feature.stanford.edu/wf4.0-beta. PMID:24632601
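
    Precision and recall here carry their standard definitions; as a quick reference (the confusion counts below are illustrative, not the paper's):

    ```python
    def precision_recall(tp, fp, fn):
        """Precision is the positive predictive value TP/(TP+FP); recall is
        the sensitivity TP/(TP+FN)."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return precision, recall

    # Hypothetical confusion counts for one functional-site model.
    p, r = precision_recall(tp=95, fp=1, fn=5)
    print(f"precision = {p:.2%}, recall = {r:.2%}")
    ```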

  18. Enhanced Precision of the New Hologic Horizon Model Compared With the Old Discovery Model Is Less Evident When Fewer Vertebrae Are Included in the Analysis.

    PubMed

    McNamara, Elizabeth A; Kilim, Holly P; Malabanan, Alan O; Whittaker, LaTarsha G; Rosen, Harold N

    The International Society for Clinical Densitometry guidelines recommend using locally derived precision data for spine bone mineral densities (BMDs), but do not specify whether data derived from L1-L4 spines correctly reflect the precision for spines reporting fewer than 4 vertebrae. Our experience suggested that the decrease in precision is progressive as more vertebrae are excluded and that the precision of the newer Horizon Hologic model might be better than that of the previous model, and we sought to quantify both effects. Precision studies were performed on Hologic densitometers by acquiring spine BMD in fast array mode twice on 30 patients, according to International Society for Clinical Densitometry guidelines. This was done 10 different times on various Discovery densitometers, and once on a Horizon densitometer. When 1 vertebral body was excluded from analysis, there was no significant deterioration in precision. When 2 vertebrae were excluded, there was a nonsignificant trend to poorer precision, and when 3 vertebrae were excluded, there was significantly worse precision. When 3 or 4 vertebrae were reported, the precision of the spine BMD measurement was significantly better on the Hologic Horizon than on the Discovery, but the difference in precision between densitometers narrowed and was no longer significant when 1 or 2 vertebrae were reported. The results suggest that (1) the measurement of in vivo spine BMD on the new Hologic Horizon densitometer is significantly more precise than on the older Discovery model; (2) the difference in precision between the Horizon and Discovery models decreases as fewer vertebrae are included; (3) the measurement of spine BMD is less precise as more vertebrae are excluded, but still quite reasonable even when only 1 vertebral body is included; and (4) when 3 vertebrae are reported, L1-L4 precision data can reasonably be used to report significance of changes in BMD. When 1 or 2 vertebrae are

  19. Evaluation of High-Precision Sensors in Structural Monitoring

    PubMed Central

    Erol, Bihter

    2010-01-01

    One of the most intricate branches of metrology involves the monitoring of displacements and deformations of natural and anthropogenic structures under environmental forces, such as tidal or tectonic phenomena, or ground water level changes. Technological progress has changed the measurement process, and steadily increasing accuracy requirements have led to the continued development of new measuring instruments. The adoption of an appropriate measurement strategy, with proper instruments suited to the characteristics of the observed structure and its environmental conditions, is of high priority in the planning of deformation monitoring processes. This paper describes the use of precise digital inclination sensors in continuous monitoring of structural deformations. The topic is treated from two viewpoints: (i) evaluation of the performance of inclination sensors by comparing them to static and continuous GPS observations in deformation monitoring and (ii) providing a strategy for analyzing the structural deformations. The movements of two case study objects, a tall building and a geodetic monument in Istanbul, were separately monitored using dual-axis micro-radian precision inclination sensors (inclinometers) and GPS. The time series of continuous deformation observations were analyzed using the Least Squares Spectral Analysis technique (LSSA). Overall, the inclinometers showed good performance for continuous monitoring of structural displacements, even at the sub-millimeter level. Static GPS observations remained insufficient for resolving the deformations to the sub-centimeter level due to the errors that affect GPS signals. Given the accuracy advantage of inclination sensors, their use with GPS provides more detailed investigation of deformation phenomena. Using inclinometers together with GPS helps identify the components of structural responses to natural forces as static, quasi-static, or resonant. PMID:22163499
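
    The Least Squares Spectral Analysis step fits sinusoids at trial frequencies by least squares, which, unlike an FFT, tolerates the uneven sampling typical of monitoring records; a minimal sketch on synthetic inclinometer data:

    ```python
    import numpy as np

    def lssa(t, y, freqs):
        """Least Squares Spectral Analysis: fit a sine/cosine pair at each
        trial frequency and return the fitted amplitude."""
        y = y - y.mean()
        amps = []
        for f in freqs:
            G = np.column_stack([np.sin(2 * np.pi * f * t),
                                 np.cos(2 * np.pi * f * t)])
            coef, *_ = np.linalg.lstsq(G, y, rcond=None)
            amps.append(np.hypot(*coef))
        return np.array(amps)

    # Hypothetical inclinometer record with a daily (thermal) component,
    # sampled irregularly over 30 days.
    rng = np.random.default_rng(6)
    t = np.sort(rng.uniform(0, 30, 500))                  # days
    y = 0.5 * np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.standard_normal(500)

    freqs = np.linspace(0.05, 3, 200)                     # cycles/day
    amps = lssa(t, y, freqs)
    print(f"peak at {freqs[np.argmax(amps)]:.2f} cycles/day")
    ```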

  20. Network-based machine learning and graph theory algorithms for precision oncology.

    PubMed

    Zhang, Wei; Chien, Jeremy; Yong, Jeongsik; Kuang, Rui

    2017-01-01

    Network-based analytics plays an increasingly important role in precision oncology. Growing evidence in recent studies suggests that cancer can be better understood through mutated or dysregulated pathways or networks rather than individual mutations and that the efficacy of repositioned drugs can be inferred from disease modules in molecular networks. This article reviews network-based machine learning and graph theory algorithms for integrative analysis of personal genomic data and biomedical knowledge bases to identify tumor-specific molecular mechanisms, candidate targets and repositioned drugs for personalized treatment. The review focuses on the algorithmic design and mathematical formulation of these methods to facilitate applications and implementations of network-based analysis in the practice of precision oncology. We review the methods applied in three scenarios to integrate genomic data and network models in different analysis pipelines, and we examine three categories of network-based approaches for repositioning drugs in drug-disease-gene networks. In addition, we perform a comprehensive subnetwork/pathway analysis of mutations in 31 cancer genome projects in the Cancer Genome Atlas and present a detailed case study on ovarian cancer. Finally, we discuss interesting observations, potential pitfalls and future directions in network-based precision oncology.

  1. A functional analysis of photo-object matching skills of severely retarded adolescents.

    PubMed

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  2. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    ERIC Educational Resources Information Center

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  3. CCD Photometry of bright stars using objective wire mesh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamiński, Krzysztof; Zgórz, Marika; Schwarzenberg-Czerny, Aleksander, E-mail: chrisk@amu.edu.pl

    2014-06-01

    Obtaining accurate photometry of bright stars from the ground remains problematic due to the danger of overexposing the target and/or the lack of suitable nearby comparison stars. The century-old method of using an objective wire mesh to produce multiple stellar images seems promising for the precise CCD photometry of such stars. Furthermore, our tests on β Cep and its comparison star, differing by 5 mag, are very encouraging. Using a CCD camera and a 20 cm telescope with the objective covered by a plastic wire mesh, in poor weather conditions, we obtained differential photometry with a precision of 4.5 mmag per two-minute exposure. Our technique is flexible and may be tuned to cover a range as big as 6-8 mag. We discuss the possibility of installing a wire mesh directly in the filter wheel.

  4. [Implementation of precision control to achieve the goal of schistosomiasis elimination in China].

    PubMed

    Zhou, Xiao-nong

    2016-02-01

    The integrated strategy for schistosomiasis control with a focus on infectious source control, implemented since 2004, accelerated progress towards schistosomiasis control in China and achieved transmission control of the disease across the country by the end of 2015, thereby meeting the overall objective of the Mid- and Long-term National Plan for Prevention and Control of Schistosomiasis (2004-2015) on schedule. The goal of schistosomiasis elimination by 2025 was then proposed in China in 2014. To achieve this new goal on schedule, we have to address the key issues and implement precision control measures with more precise identification of control targets, so that we are able to completely eradicate the potential factors leading to a resurgence of schistosomiasis transmission and enable the achievement of schistosomiasis elimination on schedule. Precision schistosomiasis control, a theoretical innovation of precision medicine in schistosomiasis control, will provide new insights into schistosomiasis control based on the conception of precision medicine. This paper describes the definition, interventions and role of precision schistosomiasis control in the elimination of schistosomiasis in China, and demonstrates that sustainable improvement of professionals and integrated control capability at the grass-roots level is a prerequisite to the implementation of schistosomiasis control, that precision schistosomiasis control is key to the further implementation of the integrated strategy with a focus on infectious source control, and that precision schistosomiasis control is a guarantee of curing schistosomiasis patients and implementing schistosomiasis control programs and interventions.

  5. The Density of Mid-sized Kuiper Belt Objects from ALMA Thermal Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Michael E.; Butler, Bryan J.

    The densities of mid-sized Kuiper Belt objects (KBOs) are a key constraint in understanding the assembly of objects in the outer solar system. These objects are critical for understanding the currently unexplained transition from the smallest KBOs with densities lower than that of water, to the largest objects with significant rock content. Mapping this transition is made difficult by the uncertainties in the diameters of these objects, which map into an even larger uncertainty in volume and thus density. The substantial collecting area of the Atacama Large Millimeter Array allows significantly more precise measurements of thermal emission from outer solar system objects and could potentially greatly improve the density measurements. Here we use new thermal observations of four objects with satellites to explore the improvements possible with millimeter data. We find that effects due to effective emissivity at millimeter wavelengths make it difficult to use the millimeter data directly to find diameters and thus volumes for these bodies. In addition, we find that when including the effects of model uncertainty, the true uncertainties on the sizes of outer solar system objects measured with radiometry are likely larger than those previously published. Substantial improvement in object sizes will likely require precise occultation measurements.

  6. Patient similarity for precision medicine: a systematic review.

    PubMed

    Parimbelli, E; Marini, S; Sacchi, L; Bellazzi, R

    2018-06-01

    Evidence-based medicine is the most prevalent paradigm adopted by physicians. Clinical practice guidelines typically define a set of recommendations together with eligibility criteria that restrict their applicability to a specific group of patients. The ever-growing size and availability of health-related data is currently challenging the broad definitions of guideline-defined patient groups. Precision medicine leverages genetic, phenotypic, or psychosocial characteristics to provide precise identification of patient subsets for treatment targeting. Defining a patient similarity measure is thus an essential step to allow stratification of patients into clinically meaningful subgroups. The present review investigates the use of patient similarity as a tool to enable precision medicine. 279 articles were analyzed along four dimensions: data types considered, clinical domains of application, data analysis methods, and translational stage of findings. Cancer-related research employing molecular profiling and standard data analysis techniques such as clustering constitutes the majority of the retrieved studies. Chronic and psychiatric diseases follow as the second most represented clinical domains. Interestingly, almost one quarter of the studies analyzed presented a novel methodology, with the most advanced employing data integration strategies and being portable to different clinical domains. Integration of such techniques into decision support systems constitutes an interesting trend for future research. Copyright © 2018. Published by Elsevier Inc.

  7. Validity evidence for the Simulated Colonoscopy Objective Performance Evaluation scoring system.

    PubMed

    Trinca, Kristen D; Cox, Tiffany C; Pearl, Jonathan P; Ritter, E Matthew

    2014-02-01

    Low-cost, objective systems to assess and train endoscopy skills are needed. The aim of this study was to evaluate the ability of Simulated Colonoscopy Objective Performance Evaluation to assess the skills required to perform endoscopy. Thirty-eight subjects were included in this study, all of whom performed 4 tasks. The scoring system measured performance by calculating precision and efficiency. Data analysis assessed the relationship between colonoscopy experience and performance on each task and the overall score. Endoscopic trainees' Simulated Colonoscopy Objective Performance Evaluation scores correlated significantly with total colonoscopy experience (r = .61, P = .003) and experience in the past 12 months (r = .63, P = .002). Significant differences were seen among practicing endoscopists, nonendoscopic surgeons, and trainees (P < .0001). When the 4 tasks were analyzed, each showed significant correlation with colonoscopy experience (scope manipulation, r = .44, P = .044; tool targeting, r = .45, P = .04; loop management, r = .47, P = .032; mucosal inspection, r = .65, P = .001) and significant differences in performance between the endoscopist groups, except for mucosal inspection (scope manipulation, P < .0001; tool targeting, P = .002; loop management, P = .0008; mucosal inspection, P = .27). Simulated Colonoscopy Objective Performance Evaluation objectively assesses the technical skills required to perform endoscopy and shows promise as a platform for proficiency-based skills training. Published by Elsevier Inc.

  8. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper-air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface and on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface passing through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The version A objective products generally correspond fairly well with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams fairly well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
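
    One plausible form of such an anisotropic weighting is a Gaussian that decays more slowly along a preferred axis (for example, the local flow) than across it. The sketch below is a generic illustration with assumed length scales, not the paper's exact formulation:

    ```python
    import numpy as np

    # Anisotropic Gaussian weights: data points are down-weighted faster across
    # the preferred axis than along it. Length scales and the axis orientation
    # are illustrative assumptions.
    def anisotropic_weights(dx, dy, along_scale=500.0, cross_scale=150.0, angle_rad=0.0):
        """dx, dy: km offsets of stations from the grid point;
        angle_rad: orientation of the 'along' axis (e.g., local wind direction)."""
        c, s = np.cos(angle_rad), np.sin(angle_rad)
        d_along = c * dx + s * dy
        d_cross = -s * dx + c * dy
        return np.exp(-(d_along / along_scale) ** 2 - (d_cross / cross_scale) ** 2)

    # Weighted interpolation of station values to a grid point at the origin:
    dx = np.array([120.0, -300.0, 80.0]); dy = np.array([40.0, 10.0, -200.0])
    vals = np.array([288.0, 285.5, 290.1])   # e.g., potential temperature (K)
    w = anisotropic_weights(dx, dy, angle_rad=np.deg2rad(30.0))
    print(np.sum(w * vals) / np.sum(w))
    ```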

  9. [Precision and personalized medicine].

    PubMed

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms "phenotype", "endotype" and "biomarker" in order to characterize the various diseases more precisely. Using "biomarkers", a homogeneous type of a disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation to allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for participants, controllers and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  10. Are Currently Available Wearable Devices for Activity Tracking and Heart Rate Monitoring Accurate, Precise, and Medically Beneficial?

    PubMed Central

    El-Amrawy, Fatema

    2015-01-01

    Objectives The new wave of wireless technologies, fitness trackers, and body sensor devices can have a great impact on healthcare systems and the quality of life. However, there have not been enough studies to establish the accuracy and precision of these trackers. The objective of this study was to evaluate the accuracy, precision, and overall performance of seventeen currently available wearable devices compared with direct observation of step counts and heart rate monitoring. Methods Each participant in this study used three accelerometers at a time, running the three corresponding applications of each tracker on an Android or iOS device simultaneously. Each participant was instructed to walk 200, 500, and 1,000 steps. Each set was repeated 40 times. Data were recorded after each trial, and the mean step count, standard deviation, accuracy, and precision were estimated for each tracker. Heart rate was measured by all trackers that support heart rate monitoring and compared to a positive control, the Onyx Vantage 9590 professional clinical pulse oximeter. Results The accuracy of the tested products ranged between 79.8% and 99.1%, while the coefficient of variation (precision) ranged between 4% and 17.5%. MisFit Shine showed the highest accuracy and precision (along with Qualcomm Toq), while Samsung Gear 2 showed the lowest accuracy, and Jawbone UP showed the lowest precision. However, the Xiaomi Mi Band offered the best value for its price. Conclusions The accuracy and precision of the selected fitness trackers are reasonable and can indicate the average level of activity and thus average energy expenditure. PMID:26618039
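
    Accuracy and precision in this sense can be illustrated with the usual definitions: accuracy as the closeness of the mean count to the true step count, and precision as the coefficient of variation across repeated trials. A sketch with invented trial data (the study's exact formulas may differ):

    ```python
    import numpy as np

    # Hypothetical step counts from one tracker over repeated 500-step walks.
    counts = np.array([492, 507, 488, 515, 496, 501, 489, 511])
    true_steps = 500

    mean = counts.mean()
    accuracy = 100.0 * (1.0 - abs(mean - true_steps) / true_steps)  # % accuracy of the mean
    cv = 100.0 * counts.std(ddof=1) / mean                          # coefficient of variation (%)
    print(f"accuracy = {accuracy:.1f}%, precision (CV) = {cv:.1f}%")
    ```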

  11. Use of Terrestrial Laser Scanning Technology for Long Term High Precision Deformation Monitoring

    PubMed Central

    Vezočnik, Rok; Ambrožič, Tomaž; Sterle, Oskar; Bilban, Gregor; Pfeifer, Norbert; Stopar, Bojan

    2009-01-01

    The paper presents a new methodology for high precision monitoring of deformations with a long term perspective using terrestrial laser scanning technology. In order to solve the problem of a stable reference system and to assure the high quality of possible position changes of point clouds, scanning is integrated with two complementary surveying techniques, i.e., high quality static GNSS positioning and precise tacheometry. The case study object where the proposed methodology was tested is a high pressure underground pipeline situated in an area which is geologically unstable. PMID:22303152

  12. Reliability of Pressure Ulcer Rates: How Precisely Can We Differentiate Among Hospital Units, and Does the Standard Signal-Noise Reliability Measure Reflect This Precision?

    PubMed

    Staggs, Vincent S; Cramer, Emily

    2016-08-01

    Hospital performance reports often include rankings of unit pressure ulcer rates. Differentiating among units on the basis of quality requires reliable measurement. Our objectives were to describe and apply methods for assessing reliability of hospital-acquired pressure ulcer rates and evaluate a standard signal-noise reliability measure as an indicator of precision of differentiation among units. Quarterly pressure ulcer data from 8,199 critical care, step-down, medical, surgical, and medical-surgical nursing units from 1,299 US hospitals were analyzed. Using beta-binomial models, we estimated between-unit variability (signal) and within-unit variability (noise) in annual unit pressure ulcer rates. Signal-noise reliability was computed as the ratio of between-unit variability to the total of between- and within-unit variability. To assess precision of differentiation among units based on ranked pressure ulcer rates, we simulated data to estimate the probabilities of a unit's observed pressure ulcer rate rank in a given sample falling within five and ten percentiles of its true rank, and the probabilities of units with ulcer rates in the highest quartile and highest decile being identified as such. We assessed the signal-noise measure as an indicator of differentiation precision by computing its correlations with these probabilities. Pressure ulcer rates based on a single year of quarterly or weekly prevalence surveys were too susceptible to noise to allow for precise differentiation among units, and signal-noise reliability was a poor indicator of precision of differentiation. To ensure precise differentiation on the basis of true differences, alternative methods of assessing reliability should be applied to measures purported to differentiate among providers or units based on quality. © 2016 The Authors. Research in Nursing & Health published by Wiley Periodicals, Inc.
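
    The signal-noise ratio described above can be illustrated with a small variance-components simulation: reliability is the share of between-unit variance in the total variance of annual unit rates. The sketch below is a generic illustration of that ratio, not the paper's beta-binomial estimation procedure; all sample sizes are invented:

    ```python
    import numpy as np

    # Simulate quarterly ulcer rates for many units, then decompose the variance
    # of the annual (mean) rates into between-unit signal and within-unit noise.
    rng = np.random.default_rng(1)
    n_units, n_quarters = 200, 4
    true_rates = rng.beta(2, 50, size=n_units)                  # each unit's true rate
    n_patients = 80                                             # surveyed per quarter (assumed)
    obs = rng.binomial(n_patients, true_rates[:, None],
                       size=(n_units, n_quarters)) / n_patients # observed quarterly rates

    within = np.var(obs, axis=1, ddof=1).mean() / n_quarters    # noise left in the annual mean
    between = np.var(obs.mean(axis=1), ddof=1) - within         # signal: true unit differences
    reliability = between / (between + within)
    print(f"signal-noise reliability ~ {reliability:.2f}")
    ```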

  14. The Too-Much-Precision Effect.

    PubMed

    Loschelder, David D; Friese, Malte; Schaerer, Michael; Galinsky, Adam D

    2016-12-01

    Past research has suggested a fundamental principle of price precision: The more precise an opening price, the more it anchors counteroffers. The present research challenges this principle by demonstrating a too-much-precision effect. Five experiments (involving 1,320 experts and amateurs in real-estate, jewelry, car, and human-resources negotiations) showed that increasing the precision of an opening offer had positive linear effects for amateurs but inverted-U-shaped effects for experts. Anchor precision backfired because experts saw too much precision as reflecting a lack of competence. This negative effect held unless first movers gave rationales that boosted experts' perception of their competence. Statistical mediation and experimental moderation established the critical role of competence attributions. This research disentangles competing theoretical accounts (attribution of competence vs. scale granularity) and qualifies two putative truisms: that anchors affect experts and amateurs equally, and that more precise prices are linearly more potent anchors. The results refine current theoretical understanding of anchoring and have significant implications for everyday life.

  15. Method of high precision interval measurement in pulse laser ranging system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong

    2013-09-01

    Laser ranging is widely used because it offers high measuring precision, fast measuring speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time-interval measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of the time interval measurement. This paper introduces the principal structure of a laser ranging system and establishes a method of high precision time interval measurement for pulsed laser ranging. Based on an analysis of the factors that affect the precision of range measurement, a pulse rising-edge discriminator was adopted to produce the timing mark for start-stop time discrimination, and a TDC-GP2 high precision interval measurement system based on a TMS320F2812 DSP was designed to improve the measurement precision. Experimental results indicate that this time interval measurement method can obtain higher range accuracy. Compared with traditional time interval measurement systems, the method simplifies the system design, reduces the influence of bad weather conditions, and satisfies the requirements of low cost and miniaturization.
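
    The link between timing precision and range precision is direct: for pulsed time of flight, R = c·Δt/2, so range resolution is the timing resolution times half the speed of light. A small worked sketch (the ~65 ps figure is the commonly quoted TDC-GP2 resolution, used here as an assumption rather than a value from the paper):

    ```python
    C = 299_792_458.0  # speed of light, m/s

    # In pulsed time-of-flight ranging the target distance is R = c * dt / 2,
    # so the range resolution is set directly by the timing resolution.
    def range_from_interval(dt_seconds: float) -> float:
        return C * dt_seconds / 2.0

    tdc_resolution = 65e-12  # ~65 ps, a typical TDC-GP2 datasheet figure (assumed)
    print(f"{range_from_interval(1e-6):.1f} m for a 1 us round trip")             # ~149.9 m
    print(f"range resolution ~ {range_from_interval(tdc_resolution)*100:.1f} cm") # ~1 cm
    ```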

  16. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    NASA Technical Reports Server (NTRS)

    Jones, Scott

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses, simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al. (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  18. Fast and objective detection and analysis of structures in downhole images

    NASA Astrophysics Data System (ADS)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour-intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, to improve efficiency and to assist, rather than replace, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis and further detection of structures, e.g., limited to specific orientations.
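
    A planar structure crossing a cylindrical borehole traces a sinusoid in the unrolled image, so detection ultimately reduces to fitting z(θ) = z0 + A·sin(θ − φ) and converting the amplitude to a dip angle. A least-squares sketch of that geometric core (not the paper's detection algorithm, which adds quality assessment and confidence-based selection):

    ```python
    import numpy as np

    # Fit z = z0 + a*sin(theta) + b*cos(theta), the linear form of
    # z0 + A*sin(theta - phi); the dip follows from amplitude and hole radius.
    def fit_sinusoid(theta, z, radius_m):
        Amat = np.column_stack([np.ones_like(theta), np.sin(theta), np.cos(theta)])
        (z0, a, b), *_ = np.linalg.lstsq(Amat, z, rcond=None)
        amplitude = np.hypot(a, b)
        dip_deg = np.degrees(np.arctan2(amplitude, radius_m))
        return z0, amplitude, dip_deg

    theta = np.linspace(0, 2 * np.pi, 72, endpoint=False)
    z_true = 102.40 + 0.06 * np.sin(theta - 0.8)      # depth trace of one structure (m)
    z_obs = z_true + np.random.default_rng(2).normal(0, 0.005, theta.size)
    print(fit_sinusoid(theta, z_obs, radius_m=0.05))  # ~50 deg dip for a 10 cm hole
    ```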

  19. Precision Optics Curriculum.

    ERIC Educational Resources Information Center

    Reid, Robert L.; And Others

    This guide outlines the competency-based, two-year precision optics curriculum that the American Precision Optics Manufacturers Association has proposed to fill the void that it suggests will soon exist as many of the master opticians currently employed retire. The model, which closely resembles the old European apprenticeship model, calls for 300…

  20. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    PubMed

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
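
    The core computation, comparing two stem-to-bone rigid registrations, amounts to forming the relative transform T_rel = T2·T1⁻¹ and reading translation and rotation from it. A minimal homogeneous-matrix sketch (the coordinate conventions and values are illustrative, not the paper's anatomical frame):

    ```python
    import numpy as np

    # Relative motion between two rigid stem-to-bone registrations.
    def relative_motion(T1, T2):
        T_rel = T2 @ np.linalg.inv(T1)
        R, t = T_rel[:3, :3], T_rel[:3, 3]
        # Rotation angle (axis-angle magnitude) from the trace of R:
        angle = np.degrees(np.arccos(np.clip((np.trace(R) - 1) / 2, -1, 1)))
        return t, angle

    def rot_z(deg):
        a = np.radians(deg); T = np.eye(4)
        T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
        return T

    T1 = np.eye(4)                                      # baseline stem-bone pose
    T2 = rot_z(0.15); T2[:3, 3] = [0.05, -0.02, 0.10]   # follow-up: small migration (mm, deg)
    translation, rotation_deg = relative_motion(T1, T2)
    print(translation, rotation_deg)                    # ~[0.05 -0.02 0.10] mm, ~0.15 deg
    ```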

  1. Experimental assessment of precision and accuracy of radiostereometric analysis for the determination of polyethylene wear in a total hip replacement model.

    PubMed

    Bragdon, Charles R; Malchau, Henrik; Yuan, Xunhua; Perinchief, Rebecca; Kärrholm, Johan; Börlin, Niclas; Estok, Daniel M; Harris, William H

    2002-07-01

    The purpose of this study was to develop and test a phantom model based on actual total hip replacement (THR) components to simulate the true penetration of the femoral head resulting from polyethylene wear. This model was used to study both the accuracy and the precision of radiostereometric analysis (RSA) in measuring wear. We also used this model to evaluate the optimum tantalum bead configuration for this particular cup design when used in a clinical setting. A physical model of a total hip replacement (a phantom) was constructed which could simulate progressive, three-dimensional (3-D) penetration of the femoral head into the polyethylene component of a THR. Using a coordinate measuring machine (CMM), the positioning of the femoral head using the phantom was measured to be accurate to within 7 µm. The accuracy and precision of an RSA analysis system was determined from five repeat examinations of the phantom using various experimental set-ups. The accuracy of the radiostereometric analysis in the optimal experimental set-up studied was 33 µm for the medial direction, 22 µm for the superior direction, 86 µm for the posterior direction, and 55 µm for the resultant 3-D vector length. The corresponding precision at the 95% confidence interval of the test results for repositioning the phantom five times measured 8.4 µm for the medial direction, 5.5 µm for the superior direction, 16.0 µm for the posterior direction, and 13.5 µm for the resultant 3-D vector length. This in vitro model is proposed as a useful tool for developing a standard for the evaluation of radiostereometric and other radiographic methods used to measure in vivo wear.

  2. Examination about Influence for Precision of 3d Image Measurement from the Ground Control Point Measurement and Surface Matching

    NASA Astrophysics Data System (ADS)

    Anai, T.; Kochi, N.; Yamada, M.; Sasaki, T.; Otani, H.; Sasaki, D.; Nishimura, S.; Kimoto, K.; Yasui, N.

    2015-05-01

    As 3D image measurement software has become widely used with recent developments in computer-vision technology, 3D measurement from images has found applications ranging from desktop objects to topographic surveys over large geographical areas. In particular, the orientation step, formerly a complicated process in image measurement, can now be performed automatically simply by taking many pictures around the object. For fully textured objects, the 3D measurement of surface features is done entirely automatically from the oriented images, which greatly facilitates the acquisition of dense, high-precision 3D point clouds. Against this background, for small and medium-sized objects we now provide all-around 3D measurement with a single commercially available digital camera, and we have also developed technology for topographic measurement using airborne images taken by a small UAV [1-5]. In the present study, for small objects, we examine the accuracy of surface measurement (matching) using experimental data. For topographic measurement, we examine the influence of GCP distribution on accuracy using experimental data. In addition, we examined the differences in the analytical results of several 3D image measurement software packages. This document reviews the processing flow of orientation and 3D measurement in each package and explains the features of each. To verify the precision of stereo matching, we measured a test plane and a test sphere of known form and assessed the results. For the topography measurement, we used airborne image data photographed at the test field in Yadorigi, Matsuda City, Kanagawa Prefecture, Japan, with ground control points measured by RTK-GPS and total station. We show the results of the analysis made

  3. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  4. Precision medicine in cardiology.

    PubMed

    Antman, Elliott M; Loscalzo, Joseph

    2016-10-01

    The cardiovascular research and clinical communities are ideally positioned to address the epidemic of noncommunicable causes of death, as well as advance our understanding of human health and disease, through the development and implementation of precision medicine. New tools will be needed for describing the cardiovascular health status of individuals and populations, including 'omic' data, exposome and social determinants of health, the microbiome, behaviours and motivations, patient-generated data, and the array of data in electronic medical records. Cardiovascular specialists can build on their experience and use precision medicine to facilitate discovery science and improve the efficiency of clinical research, with the goal of providing more precise information to improve the health of individuals and populations. Overcoming the barriers to implementing precision medicine will require addressing a range of technical and sociopolitical issues. Health care under precision medicine will become a more integrated, dynamic system, in which patients are no longer a passive entity on whom measurements are made, but instead are central stakeholders who contribute data and participate actively in shared decision-making. Many traditionally defined diseases have common mechanisms; therefore, elimination of a siloed approach to medicine will ultimately pave the path to the creation of a universal precision medicine environment.

  5. Accuracy and precision of protein-ligand interaction kinetics determined from chemical shift titrations.

    PubMed

    Markin, Craig J; Spyracopoulos, Leo

    2012-12-01

    NMR-monitored chemical shift titrations for the study of weak protein-ligand interactions represent a rich source of information regarding thermodynamic parameters such as dissociation constants (K(D)) in the micro- to millimolar range, populations for the free and ligand-bound states, and the kinetics of interconversion between states, which are typically within the fast exchange regime on the NMR timescale. We recently developed two chemical shift titration methods wherein co-variation of the total protein and ligand concentrations gives increased precision for the K(D) value of a 1:1 protein-ligand interaction (Markin and Spyracopoulos in J Biomol NMR 53:125-138, 2012). In this study, we demonstrate that classical line shape analysis, applied to a single set of (1)H-(15)N 2D HSQC NMR spectra acquired using the precise protein-ligand chemical shift titration methods we developed, produces accurate and precise kinetic parameters such as the off-rate (k(off)). For experimentally determined kinetics in the fast exchange regime on the NMR timescale, k(off) ~ 3,000 s(-1) in this work, the accuracy of classical line shape analysis was determined to be better than 5% by conducting quantum mechanical NMR simulations of the chemical shift titration methods with the magnetic resonance toolkit GAMMA. Using Monte Carlo simulations, the experimental precision for k(off) from line shape analysis of NMR spectra was determined to be 13%, in agreement with the theoretical precision of 12% from line shape analysis of the GAMMA simulations in the presence of noise and protein concentration errors. In addition, GAMMA simulations were employed to demonstrate that line shape analysis has the potential to provide reasonably accurate and precise k(off) values over a wide range, from 100 to 15,000 s(-1). The validity of line shape analysis for k(off) values approaching intermediate exchange (~100 s(-1)) may be facilitated by more accurate K(D) measurements
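
    For a 1:1 interaction in fast exchange, the observed chemical shift is the population-weighted average of the free and bound shifts, with the bound fraction given by the standard quadratic solution; this forward model underlies K(D) fitting from a titration. A sketch with generic values (not the paper's system):

    ```python
    import numpy as np

    # Bound fraction for 1:1 binding from total concentrations and K_D:
    # [PL] = (s - sqrt(s^2 - 4*P*L)) / 2 with s = P + L + K_D.
    def bound_fraction(P_tot, L_tot, Kd):
        s = P_tot + L_tot + Kd
        PL = (s - np.sqrt(s**2 - 4.0 * P_tot * L_tot)) / 2.0
        return PL / P_tot

    # Fast-exchange observed shift: population-weighted average of the states.
    def observed_shift(P_tot, L_tot, Kd, delta_free, delta_bound):
        fb = bound_fraction(P_tot, L_tot, Kd)
        return (1.0 - fb) * delta_free + fb * delta_bound

    L = np.linspace(0.0, 2.0e-3, 9)   # ligand titration points (M), invented
    print(observed_shift(P_tot=100e-6, L_tot=L, Kd=250e-6,
                         delta_free=8.10, delta_bound=8.35))  # ppm, moves toward 8.35
    ```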

  6. Cognitive Task Analysis of Experts in Designing Multimedia Learning Object Guideline (M-LOG)

    ERIC Educational Resources Information Center

    Razak, Rafiza Abdul; Palanisamy, Punithavathy

    2013-01-01

    The purpose of this study was to design and develop a set of guidelines for multimedia learning objects to inform instructional designers (IDs) about the procedures involved in the process of content analysis. This study was motivated by the absence of standardized procedures in the beginning phase of the multimedia learning object design which is…

  7. A Concept for Airborne Precision Spacing for Dependent Parallel Approaches

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay

    2012-01-01

    The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrivals routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension to the Airborne Precision Spacing concept to enable dependent parallel approach operations where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed as well as procedures for pilots and controllers. An analysis is performed to identify the required information and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.

  8. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  9. An Emerging Role for Polystores in Precision Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Begoli, Edmon; Christian, J. Blair; Gadepally, Vijay

    Medical data is organically heterogeneous, and it usually varies significantly in both size and composition. Yet this data is also a key for the recent and promising field of precision medicine, which focuses on identifying and tailoring appropriate medical treatments for the needs of individual patients, based on their specific conditions, their medical history, lifestyle, genetic, and other individual factors. As we, and the database community at large, recognize that a "one size fits all" solution does not work for such data, we present in this paper our observations based on our experiences and the applications in the field of precision medicine. Finally, we make the case for the use of polystore architecture: how it applies to precision medicine; we discuss the reference architecture; describe some of its critical components (the array database); and discuss the specific types of analysis that directly benefit from this database architecture and the ways it serves the data.

  10. Precision medicine in breast cancer: reality or utopia?

    PubMed

    Bettaieb, Ali; Paul, Catherine; Plenchette, Stéphanie; Shan, Jingxuan; Chouchane, Lotfi; Ghiringhelli, François

    2017-06-17

    For many cancers, including breast cancer, prognosis and supportive care have improved thanks to the discovery of targeted therapies. The advent of these new approaches marked the rise of precision medicine, which leads to improved diagnosis, prognosis and treatment of cancer. Precision medicine takes into account the molecular and biological specificities of the patient and their tumors that will influence the treatment determined by physicians. This new era of medicine has become accessible through molecular genetics platforms, the development of high-throughput sequencers, and the means of analyzing these data. Despite the spectacular results in the treatment of cancers, including breast cancer, described in this review, not all patients can benefit from this new strategy. This seems to be related to the many genetic mutations, which may differ from one patient to another or even within the same patient. This should give new impetus to research, from both a technological and a biological point of view, to make the hope of precision medicine accessible to all.

  11. Modification of a successive corrections objective analysis for improved higher order calculations

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.

    1988-01-01

    The use of objectively analyzed fields of meteorological data for the initialization of numerical prediction models and for complex diagnostic studies places the requirement upon the objective method that derivatives of the gridded fields be accurate and free from interpolation error. A modification was proposed for an objective analysis developed by Barnes that provides improvements in the analysis of both the field and its derivatives. Theoretical comparisons, comparisons between analyses of analytical monochromatic waves, and comparisons between analyses of actual weather data are used to show the potential of the new method. The new method restores more of the amplitudes of desired wavelengths while simultaneously filtering more of the amplitudes of undesired wavelengths. These results also hold for the first and second derivatives calculated from the gridded fields. The greatest improvements were for the Laplacian of the height field; the new method reduced the variance of undesirable very short wavelengths by 72 percent. Other improvements were found in the divergence of the gridded wind field and near the boundaries of the field of data.
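
    The flavor of a Barnes-type successive-correction analysis, a smooth first Gaussian-weighted pass plus a second pass that adds back weighted residuals to restore amplitude at the desired wavelengths, can be sketched as below. Length scales and the convergence factor are illustrative, and this is the classical two-pass idea rather than the paper's modified scheme:

    ```python
    import numpy as np

    # Two-pass Barnes-style successive correction on scattered observations.
    def barnes(grid_xy, obs_xy, obs_val, kappa=250.0**2, gamma=0.3):
        def pass_weights(scale):
            d2 = ((grid_xy[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / scale)
            return w / w.sum(axis=1, keepdims=True)

        first = pass_weights(kappa) @ obs_val           # pass 1: smooth background
        # Residuals at the observation points (here simply re-evaluated with the
        # same Gaussian weights at the obs locations, a simplification):
        d2o = ((obs_xy[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
        wo = np.exp(-d2o / kappa); wo /= wo.sum(axis=1, keepdims=True)
        resid = obs_val - wo @ obs_val
        return first + pass_weights(gamma * kappa) @ resid  # pass 2: tighter weights

    rng = np.random.default_rng(3)
    obs_xy = rng.uniform(0, 1000, size=(40, 2))         # station positions (km)
    obs_val = np.sin(obs_xy[:, 0] / 150.0) + 0.05 * rng.normal(size=40)
    grid_xy = np.column_stack([np.linspace(0, 1000, 21), np.full(21, 500.0)])
    print(barnes(grid_xy, obs_xy, obs_val).round(2))
    ```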

  12. Moire measuring technology for three-dimensional profile of the object

    NASA Astrophysics Data System (ADS)

    Fu, Yanjun; Yang, Kuntao

    2006-02-01

    An optical system is designed to project a transmission grating so that a deformed grating is formed on the surface of the object. The image of the deformed grating is formed by a lens, the reference grating is placed in the image plane, and a moire fringe pattern is obtained. The magnification principle of the moire fringes is used to measure the profile of the object. The optical principle of the projection is analyzed, and the relation between phase and object height is derived. The optical system is analyzed from the viewpoints of both geometrical and physical optics, and the factors that influence image quality and the measurement result are identified, from which improvements to the measuring precision are proposed. In the subsequent information processing, diffuse reflection degrades image quality, so a digital filter is first used to suppress noise and smooth the image, and subdivision is then applied to improve measurement precision. Fourier transform profilometry and phase-shifting technology are used in the calculation, with detailed analysis in both the time and frequency domains, and methods for improving the measuring precision are put forward, including a good digital filtering algorithm for Fourier transform profilometry. For the phase-shifting technology, detailed three-step and four-step formulas are given. The phase related to the height information of the object is obtained in wrapped form; after an unwrapping algorithm, it is converted to a continuous phase. Using the relation between phase and height, the height is obtained, and the three-dimensional profile of the measured object can be reconstructed. The system is very convenient for non-contact measurement of the profile of objects.
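
    The Fourier-transform profilometry step mentioned above can be sketched on a single image row: band-pass the fringe carrier in the spectrum, shift it to baseband, and take the unwrapped phase, which carries the height information. The carrier frequency and filter width below are illustrative assumptions:

    ```python
    import numpy as np

    # Skeleton of Fourier-transform profilometry on one image row.
    def ftp_phase(row, carrier_px, half_width_px):
        F = np.fft.fft(row)
        band = np.zeros(row.size, dtype=complex)
        lo, hi = carrier_px - half_width_px, carrier_px + half_width_px + 1
        band[lo:hi] = F[lo:hi]                  # keep only the +1 fringe order
        analytic = np.fft.ifft(np.roll(band, -carrier_px))  # shift carrier to DC
        return np.unwrap(np.angle(analytic))    # continuous phase along the row

    x = np.arange(512)
    height_phase = 0.8 * np.exp(-((x - 256) / 80.0) ** 2)  # phase bump from a 3D shape
    row = 1 + 0.5 * np.cos(2 * np.pi * 32 * x / 512 + height_phase)
    phi = ftp_phase(row, carrier_px=32, half_width_px=16)
    print(phi.max().round(2))                   # recovers ~0.8 rad at the bump center
    ```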

  13. Analysis of precision in chemical oscillators: implications for circadian clocks

    NASA Astrophysics Data System (ADS)

    d'Eysmond, Thomas; De Simone, Alessandro; Naef, Felix

    2013-10-01

    Biochemical reaction networks often exhibit spontaneous self-sustained oscillations. An example is the circadian oscillator that lies at the heart of daily rhythms in behavior and physiology in most organisms including humans. While the period of these oscillators evolved so that it resonates with the 24 h daily environmental cycles, the precision of the oscillator (quantified via the Q factor) is another relevant property of these cell-autonomous oscillators. Since this quantity can be measured in individual cells, it is of interest to better understand how this property behaves across mathematical models of these oscillators. Current theoretical schemes for computing the Q factors show limitations for both high-dimensional models and in the vicinity of Hopf bifurcations. Here, we derive low-noise approximations that lead to numerically stable schemes also in high-dimensional models. In addition, we generalize normal form reductions that are appropriate near Hopf bifurcations. Applying our approximations to two models of circadian clocks, we show that while the low-noise regime is faithfully recapitulated, increasing the level of noise leads to species-dependent precision. We emphasize that subcomponents of the oscillator gradually decouple from the core oscillator as noise increases, which allows us to identify the subnetworks responsible for robust rhythms.
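
    In the simplest spectral view, the Q factor of a noisy oscillator is the peak frequency divided by the full width at half maximum of its spectral peak. A generic illustration with a simulated phase-diffusing ~24 h rhythm (this assumes a single dominant peak and is not the paper's numerical scheme):

    ```python
    import numpy as np

    # Estimate Q = f0 / FWHM from the power spectrum of a long noisy trace.
    rng = np.random.default_rng(4)
    dt, n = 0.1, 2**18                              # hours per sample, samples
    t = np.arange(n) * dt
    phase_noise = np.cumsum(rng.normal(0, 0.02, n)) # phase diffusion
    x = np.cos(2 * np.pi * t / 24.0 + phase_noise)  # ~24 h rhythm

    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(n, dt)
    k0 = psd.argmax()                               # dominant peak
    above = np.where(psd >= psd[k0] / 2)[0]         # bins above half maximum
    fwhm = f[above.max()] - f[above.min()]
    print(f"Q ~ {f[k0] / fwhm:.0f}")                # a few tens for these settings
    ```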

  14. Yale High Energy Physics Research: Precision Studies of Reactor Antineutrinos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heeger, Karsten M.

    2014-09-13

    This report presents experimental research at the intensity frontier of particle physics, with particular focus on the study of reactor antineutrinos and the precision measurement of neutrino oscillations. The experimental neutrino physics group of Professor Heeger and Senior Scientist Band at Yale University has had leading responsibilities in the construction and operation of the Daya Bay Reactor Antineutrino Experiment and made critical contributions to the discovery of non-zero θ13. Heeger and Band led the Daya Bay detector management team and are now overseeing the operations of the antineutrino detectors. Postdoctoral researchers and students in this group have made leading contributions to the Daya Bay analysis, including the prediction of the reactor antineutrino flux and spectrum, the analysis of the oscillation signal, and the precision determination of the target mass, yielding unprecedented precision in the relative detector uncertainty. Heeger's group is now leading an R&D effort towards a short-baseline oscillation experiment, called PROSPECT, at a US research reactor and the development of antineutrino detectors with advanced background discrimination.

  15. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    PubMed

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism, but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and a non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position-one-labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective
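
    The D-criterion referred to above scores a candidate design by the determinant of the Fisher information matrix F = JᵀJ/σ², where J holds the sensitivities of the measurements to the free parameters; larger determinants mean tighter joint confidence regions. A toy comparison of two hypothetical tracer mixtures (the sensitivities are invented; in real (13)C-MFA they come from the isotopomer model):

    ```python
    import numpy as np

    # D-criterion for a candidate design: det of the Fisher information matrix.
    def d_criterion(J, sigma=0.01):
        F = J.T @ J / sigma**2
        return np.linalg.det(F)

    # Hypothetical sensitivities of 4 measured labeling fractions to 2 fluxes,
    # for two candidate input mixtures:
    J_mix_a = np.array([[0.8, 0.1], [0.2, 0.7], [0.5, 0.4], [0.1, 0.3]])
    J_mix_b = np.array([[0.4, 0.3], [0.3, 0.4], [0.2, 0.2], [0.3, 0.1]])
    for name, J in [("mix A", J_mix_a), ("mix B", J_mix_b)]:
        print(name, f"det(F) = {d_criterion(J):.3g}")  # larger is better
    ```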

  16. Fast Quantitative Analysis Of Museum Objects Using Laser-Induced Breakdown Spectroscopy And Multiple Regression Algorithms

    NASA Astrophysics Data System (ADS)

    Lorenzetti, G.; Foresta, A.; Palleschi, V.; Legnaioli, S.

    2009-09-01

    The recent development of mobile instrumentation, specifically devoted to in situ analysis and study of museum objects, allows the acquisition of many LIBS spectra in a very short time. However, such a large amount of data calls for new analytical approaches which would guarantee a prompt analysis of the results obtained. In this communication, we present and discuss the advantages of statistical analytical methods, such as Partial Least Squares multiple regression algorithms, versus the classical calibration curve approach. PLS algorithms allow the information on the composition of the objects under study to be obtained in real time; this feature of the method, compared to the traditional off-line analysis of the data, is extremely useful for the optimization of the measurement times and the number of points associated with the analysis. In fact, the real-time availability of the compositional information gives the possibility of concentrating the attention on the most 'interesting' parts of the object, without over-sampling the zones which would not provide useful information for scholars or conservators. Some examples of the applications of this method will be presented, including studies recently performed by the researchers of the Applied Laser Spectroscopy Laboratory on museum bronze objects.
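
    The PLS approach described above maps whole spectra directly to composition. A minimal scikit-learn sketch on synthetic spectra (two Gaussian "emission lines" whose heights track two element concentrations; wavelengths and data are invented purely to make the example runnable):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Synthetic LIBS-like spectra: linear mix of two element line profiles + noise.
    rng = np.random.default_rng(5)
    wl = np.linspace(300, 600, 400)               # wavelength axis (nm)
    conc = rng.uniform(0, 1, size=(60, 2))        # hypothetical element fractions
    lines = np.exp(-((wl - 327.4) / 1.5) ** 2), np.exp(-((wl - 380.1) / 1.5) ** 2)
    X = conc @ np.vstack(lines) + 0.01 * rng.normal(size=(60, wl.size))

    # Fit PLS on 50 "calibration" spectra, predict composition of 10 held out.
    pls = PLSRegression(n_components=4).fit(X[:50], conc[:50])
    print(np.abs(pls.predict(X[50:]) - conc[50:]).mean())  # held-out mean abs error
    ```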

  17. Precision manipulation with a dextrous robot hand

    NASA Astrophysics Data System (ADS)

    Michelman, Paul

    1994-01-01

    In this thesis, we discuss a framework for describing and synthesizing precision manipulation tasks with a robot hand. Precision manipulations are those in which the motions of grasped objects are caused by finger motions alone (as distinct from arm or wrist motion). Experiments demonstrating the capabilities of the Utah-MIT hand are presented. This work begins by examining current research on biological motor control to raise a number of questions. For example, is the control centralized and organized by a central processor? Or is the control distributed throughout the nervous system? Motor control research on manipulation has focused on developing classifications of hand motions, concentrating solely on finger motions, while neglecting grasp stability and interaction forces that occur in manipulation. In addition, these taxonomies have not been explicitly functional. This thesis defines and analyzes a basic set of manipulation strategies that includes both position and force trajectories. The fundamental purposes of the manipulations are: (1) rectilinear and rotational motion of grasped objects of different geometries; and (2) the application of forces and moments against the environment by the grasped objects. First, task partitioning is described to allocate the fingers their roles in the task. Second, for each strategy, the mechanics and workspace of the tasks are analyzed geometrically to determine the gross finger trajectories required to achieve the tasks. Techniques illustrating the combination of simple manipulations into complex, multiple degree-of-freedom tasks are presented. There is a discussion of several tasks that use multiple elementary strategies. The tasks described are removing the top of a childproof medicine bottle, putting the top back on, rotating and regrasping a block and a cylinder within the grasp. Finally, experimental results are presented. The experimental setup at Columbia University's Center for Research in Intelligent Systems and

  18. Precision cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Fendt, William Ashton, Jr.

    2009-09-01

    Experimental efforts of the last few decades have brought a golden age to mankind's endeavor to understand the physical properties of the Universe throughout its history. Recent measurements of the cosmic microwave background (CMB) provide strong confirmation of the standard big bang paradigm, as well as introducing new mysteries yet unexplained by current physical models. In the following decades, even more ambitious scientific endeavours will begin to shed light on the new physics by looking at the detailed structure of the Universe at both very early and recent times. Modern data has allowed us to begin to test inflationary models of the early Universe, and the near future will bring higher precision data and much stronger tests. Cracking the codes hidden in these cosmological observables is a difficult and computationally intensive problem. The challenges will continue to increase as future experiments bring larger and more precise data sets. Because of the complexity of the problem, we are forced to use approximate techniques and make simplifying assumptions to ease the computational workload. While this has been reasonably sufficient until now, hints of the limitations of our techniques have begun to come to light. For example, the likelihood approximation used for analysis of CMB data from the Wilkinson Microwave Anisotropy Probe (WMAP) satellite was shown to have shortfalls, leading to premature conclusions about current cosmological theories. Also, it can be shown that an approximate method used by all current analysis codes to describe the recombination history of the Universe will not be sufficiently accurate for future experiments. With a new CMB satellite scheduled for launch in the coming months, it is vital that we develop techniques to improve the analysis of cosmological data. This work develops a novel technique both to avoid the use of approximate computational codes and to allow the application of new, more precise analysis

  19. Combination of optically measured coordinates and displacements for quantitative investigation of complex objects

    NASA Astrophysics Data System (ADS)

    Andrae, Peter; Beeck, Manfred-Andreas; Jueptner, Werner P. O.; Nadeborn, Werner; Osten, Wolfgang

    1996-09-01

    Holographic interferometry makes it possible to measure high precision displacement data in the range of the wavelength of the used laser light. However, the determination of 3D- displacement vectors of objects with complex surfaces requires the measurement of 3D-object coordinates not only to consider local sensitivities but to distinguish between in-plane deformation, i.e. strains, and out-of-plane components, i.e. shears, too. To this purpose both the surface displacement and coordinates have to be combined and it is advantageous to make the data available for CAE- systems. The object surface has to be approximated analytically from the measured point cloud to generate a surface mesh. The displacement vectors can be assigned to the nodes of this surface mesh for visualization of the deformation of the object under test. They also can be compared to the results of FEM-calculations or can be used as boundary conditions for further numerical investigations. Here the 3D-object coordinates are measured in a separate topometric set-up using a modified fringe projection technique to acquire absolute phase values and a sophisticated geometrical model to map these phase data onto coordinates precisely. The determination of 3D-displacement vectors requires the measurement of several interference phase distributions for at least three independent sensitivity directions depending on the observation and illumination directions as well as the 3D-position of each measuring point. These geometric quantities have to be transformed into a reference coordinate system of the interferometric set-up in order to calculate the geometric matrix. The necessary transformation can be realized by means of a detection of object features in both data sets and a subsequent determination of the external camera orientation. This paper presents a consistent solution for the measurement and combination of shape and displacement data including their transformation into simulation systems. The

  20. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
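
    The heart of the approach, probabilistic branching sampled by Monte Carlo so that every generated scenario carries a likelihood, can be sketched in a few lines. The events, ordering, and probabilities below are invented for illustration and are not the OBEST aviation model:

    ```python
    import random

    # Monte Carlo sampling of a tiny scenario tree with probabilistic branching:
    # each run walks one scenario; branch probabilities are invented.
    def simulate_runway_scenario(rng):
        events = []
        if rng.random() < 0.02:            # controller issues a conflicting clearance
            events.append("clearance_error")
            if rng.random() < 0.30:        # flight crew fails to catch it
                events.append("crew_miss")
                if rng.random() < 0.50:    # ground surveillance alert also fails
                    events.append("incursion")
        return events

    rng = random.Random(6)
    n = 200_000
    hits = sum("incursion" in simulate_runway_scenario(rng) for _ in range(n))
    # Analytic value for this toy tree: 0.02 * 0.30 * 0.50 = 3.0e-3
    print(f"estimated incursion probability ~ {hits / n:.2e}")
    ```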

  1. The Use of Instructional Objectives: A Model for Second-Year Podiatric Surgical Residency.

    ERIC Educational Resources Information Center

    Lepow, Gary M.; Levy, Leonard A.

    1980-01-01

    The use of highly specific objectives can be the basis for a second-year podiatric surgical residency program. They show both residents and attending staff precisely the knowledge and skills to be achieved and aid evaluation of students. A series of objectives is provided. (MSE)

  2. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE PAGES

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim; ...

    2017-04-28

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.
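
    As a reminder of how such precision figures are derived, the sketch below computes an external precision quoted as "% RSD (2σ)" from replicate ratio measurements; the replicate values are invented for illustration.

        import statistics

        # Hypothetical replicate 240Pu/239Pu ratio measurements of one standard
        ratios = [0.0240, 0.0244, 0.0238, 0.0242, 0.0241]

        mean = statistics.mean(ratios)
        rsd_2sigma = 2 * statistics.stdev(ratios) / mean * 100  # relative std. dev., 2-sigma, in %
        print(f"external precision: {rsd_2sigma:.2f}% RSD (2 sigma)")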

  4. Multi-Scale and Object-Oriented Analysis for Mountain Terrain Segmentation and Geomorphological Assessment

    NASA Astrophysics Data System (ADS)

    Marston, B. K.; Bishop, M. P.; Shroder, J. F.

    2009-12-01

    Digital terrain analysis of mountain topography is widely utilized for mapping landforms, assessing the role of surface processes in landscape evolution, and estimating the spatial variation of erosion. Numerous geomorphometry techniques exist to characterize terrain surface parameters, although their utility to characterize the spatial hierarchical structure of the topography and permit an assessment of the erosion/tectonic impact on the landscape is very limited due to scale and data integration issues. To address this problem, we apply scale-dependent geomorphometric and object-oriented analyses to characterize the hierarchical spatial structure of mountain topography. Specifically, we utilized a high-resolution digital elevation model to characterize complex topography in the Shimshal Valley in the Western Himalaya of Pakistan. To accomplish this, we generate terrain objects (geomorphological features and landforms) including valley floors and walls, drainage basins, drainage network, ridge network, slope facets, and elemental forms based upon curvature. Object-oriented analysis was used to characterize object properties accounting for object size, shape, and morphometry. The spatial overlay and integration of terrain objects at various scales defines the nature of the hierarchical organization. Our results indicate that variations in the spatial complexity of the terrain hierarchical organization are related to the spatio-temporal influence of surface processes and landscape evolution dynamics. Terrain segmentation and the integration of multi-scale terrain information permit further assessment of process domains and erosion, tectonic impact potential, and natural hazard potential. We demonstrate this with landform mapping and geomorphological assessment examples.

  5. Optimetrics for Precise Navigation

    NASA Technical Reports Server (NTRS)

    Yang, Guangning; Heckler, Gregory; Gramling, Cheryl

    2017-01-01

    Optimetrics for Precise Navigation will be implemented on existing optical communication links. The ranging and Doppler measurements are conducted over the communication data frame and clock, with a measurement accuracy two orders of magnitude better than TDRSS. The high optical carrier frequency provides (1) immunity from the ionospheric and interplanetary plasma noise floor, which is a performance limitation for RF tracking, and (2) high antenna gain, which reduces terminal size and volume and enables high-precision tracking on CubeSats and deep-space smallsats. High optical pointing precision provides spacecraft orientation, and minimal additional hardware is needed to implement precise optimetrics over the optical comm link. Continuous optical carrier phase measurement will enable the system presented here to accept future optical frequency standards with much higher clock accuracy.

  6. Comparative study of 2-DOF micromirrors for precision light manipulation

    NASA Astrophysics Data System (ADS)

    Young, Johanna I.; Shkel, Andrei M.

    2001-08-01

    Many industry experts predict that the future of fiber optic telecommunications depends on the development of all-optical components for switching of photonic signals from fiber to fiber throughout the networks. MEMS is a promising technology for providing all-optical switching at high speeds with significant cost reductions. This paper reports on the analysis of two designs for 2-DOF electrostatically actuated MEMS micromirrors for precision-controllable large optical switching arrays. The behavior of the micromirror designs is predicted by coupled-field electrostatic and modal analysis using finite element analysis (FEA) multi-physics modeling software. The analysis indicates that the commonly used gimbal-type mirror design experiences electrostatic interference and would therefore be difficult to precisely control for 2-DOF motion. We propose a new design approach which preserves 2-DOF actuation while minimizing electrostatic interference between the drive electrodes and the mirror. Instead of using two torsional axes, we use one actuator which combines torsional and flexural DOFs. A comparative analysis of the conventional gimbal design and the one proposed in this paper is performed.

  7. Detection and laser ranging of orbital objects using optical methods

    NASA Astrophysics Data System (ADS)

    Wagner, P.; Hampf, D.; Sproll, F.; Hasenohr, T.; Humbert, L.; Rodmann, J.; Riede, W.

    2016-09-01

    Laser ranging to satellites (SLR) in earth orbit is an established technology used for geodesy, fundamental science and precise orbit determination. A combined active and passive optical measurement system using a single telescope mount is presented which performs precise ranging measurements of retro-reflector-equipped objects in low earth orbit (LEO). The German Aerospace Center (DLR) runs an observatory in Stuttgart where a system has been assembled completely from commercial off-the-shelf (COTS) components. The visible light directed to the tracking camera is used to perform angular measurements of objects under investigation. This is done astrometrically by comparing the apparent target position with cataloged star positions. First successful satellite laser ranging was demonstrated recently using an optical fiber directing laser pulses onto the astronomical mount. The transmitter operates at a wavelength of 1064 nm with a repetition rate of 3 kHz and pulse energy of 25 μJ. A motorized tip/tilt mount allows beam steering of the collimated beam with μrad accuracy. The returning photons reflected from the object in space are captured with the tracking telescope. A special low-aberration beam splitter unit was designed to separate the infrared from visible light. This allows passive optical closed-loop tracking and operation of a single photon detector for time-of-flight measurements at a single telescope simultaneously. The presented innovative design yields a compact and cost-effective yet very precise ranging system that allows orbit determination.

  8. Deficits in Coordinative Bimanual Timing Precision in Children with Specific Language Impairment

    ERIC Educational Resources Information Center

    Vuolo, Janet; Goffman, Lisa; Zelaznik, Howard N.

    2017-01-01

    Purpose: Our objective was to delineate components of motor performance in specific language impairment (SLI); specifically, whether deficits in timing precision in one effector (unimanual tapping) and in two effectors (bimanual clapping) are observed in young children with SLI. Method: Twenty-seven 4- to 5-year-old children with SLI and 21…

  9. Relative recency influences object-in-context memory

    PubMed Central

    Tam, Shu K.E.; Bonardi, Charlotte; Robinson, Jasper

    2015-01-01

    In two experiments rats received training on an object-in-context (OIC) task, in which they received preexposure to object A in context x, followed by exposure to object B in context y. In a subsequent test both A and B are presented in either context x or context y. Usually more exploration is seen of the object that has not previously been paired with the test context, an effect attributed to the ability to remember where an object was encountered. However, in the typical version of this task, object A has also been encountered less recently than object B at test. This is precisely the arrangement in tests of 'relative recency' (RR), in which more remotely presented objects are explored more than objects experienced more recently. RR could contaminate performance on the OIC task, by enhancing the OIC effect when animals are tested in context y, and masking it when the test is in context x. This possibility was examined in two experiments, and evidence for superior performance in context y was obtained. The implications of this for theoretical interpretations of recognition memory and the procedures used to explore it are discussed. PMID:25546721

  10. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first-guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
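
    For readers unfamiliar with optimal interpolation, the analysis step the abstract refers to can be written in a few lines: the analysis equals the first guess plus a gain matrix applied to the innovations, with the gain built from background (B) and observation (R) error covariances. The toy 1-D example below is a sketch under assumed covariances, not the operational system's code.

        import numpy as np

        def oi_analysis(xb, y, H, B, R):
            """xb: first guess; y: observations; H: observation operator."""
            K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
            return xb + K @ (y - H @ xb)                  # first guess + weighted innovations

        n, m = 5, 2
        xb = np.zeros(n)                                  # first-guess state on a tiny grid
        H = np.zeros((m, n)); H[0, 1] = H[1, 3] = 1.0     # observations at grid points 1 and 3
        dist = np.subtract.outer(np.arange(n), np.arange(n))
        B = np.exp(-0.5 * (dist / 1.5) ** 2)              # Gaussian background error correlations
        R = 0.25 * np.eye(m)                              # uncorrelated observation errors
        print(oi_analysis(xb, np.array([1.0, -0.5]), H, B, R))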

  11. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, James G.

    1999-01-01

    A new objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 2 x 2.5 lat-lon grid with 20 levels of heights and winds and 10 levels of moisture) using 120,000 observations in less than 3 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first-guess error statistics to produce the best estimate of the atmospheric state. It also includes a new quality control (buddy check) system. Static tests with the system showed its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from a 2-month cycling test in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) throughout the entire two months.

  12. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy

    PubMed Central

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2006-01-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. [Figure: experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.] PMID:16953310

  14. Using object-oriented analysis to design a multi-mission ground data system

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1995-01-01

    This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.

  15. Precision Airdrop (Largage de precision)

    DTIC Science & Technology

    2005-12-01

    …the point from various compass headings. As the tests are conducted, the resultant…rate. This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field…

  16. High resolution melting analysis: rapid and precise characterisation of recombinant influenza A genomes

    PubMed Central

    2013-01-01

    Background High resolution melting analysis (HRM) is a rapid and cost-effective technique for the characterisation of PCR amplicons. Because the reverse genetics of segmented influenza A viruses allows the generation of numerous influenza A virus reassortants within a short time, methods for the rapid selection of the correct recombinants are very useful. Methods PCR primer pairs covering the single nucleotide polymorphism (SNP) positions of two different influenza A H5N1 strains were designed. Reassortants of the two different H5N1 isolates were used as a model to prove the suitability of HRM for the selection of the correct recombinants. Furthermore, two different cycler instruments were compared. Results Both cycler instruments generated comparable average melting peaks, which allowed the easy identification and selection of the correct cloned segments or reassorted viruses. Conclusions HRM is a highly suitable method for the rapid and precise characterisation of cloned influenza A genomes. PMID:24028349

  17. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
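
    For reference, the Rényi divergence of order α between discrete distributions P and Q is D_α(P‖Q) = (α − 1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α). The snippet below is a minimal sketch of that formula only; the paper's full cluster-radius estimation pipeline is not reproduced here.

        import numpy as np

        def renyi_divergence(p, q, alpha=2.0):
            """D_alpha(P||Q) for discrete distributions; assumes q > 0 wherever p > 0."""
            p = np.asarray(p, float); q = np.asarray(q, float)
            p, q = p / p.sum(), q / q.sum()
            return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

        print(renyi_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2], alpha=2.0))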

  18. Precision medicine for nurses: 101.

    PubMed

    Lemoine, Colleen

    2014-05-01

    To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Current nursing, medical and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Automation of Precise Time Reference Stations (PTRS)

    NASA Astrophysics Data System (ADS)

    Wheeler, P. J.

    1985-04-01

    The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile mini-computer controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.

  20. Autonomous Space Object Catalogue Construction and Upkeep Using Sensor Control Theory

    NASA Astrophysics Data System (ADS)

    Moretti, N.; Rutten, M.; Bessell, T.; Morreale, B.

    The capability to track objects in space is critical to safeguard domestic and international space assets. Infrequent measurement opportunities, complex dynamics and partial observability of orbital state make the tracking of resident space objects nontrivial. It is not uncommon for human operators to intervene with space tracking systems, particularly in scheduling sensors. This paper details the development of a system that maintains a catalogue of geostationary objects by dynamically tasking sensors in real time to manage the uncertainty of object states. As the number of objects in space grows, the potential for collision grows exponentially. Being able to provide accurate assessments to operators regarding costly collision avoidance manoeuvres is paramount, and their accuracy is highly dependent on how object states are estimated. The system represents object state and uncertainty using particles and utilises a particle filter for state estimation. Particle filters capture the model and measurement uncertainty accurately, allowing for a more comprehensive representation of the state's probability density function. Additionally, the number of objects in space is growing disproportionately to the number of sensors used to track them. Maintaining precise positions for all objects places large loads on sensors, limiting the time available to search for new objects or track high-priority objects. Rather than precisely track all objects, our system manages the uncertainty in orbital state for each object independently. The uncertainty is allowed to grow, and sensor data is only requested when the uncertainty must be reduced, for example when object uncertainties overlap, leading to data association issues, or when the uncertainty grows beyond a field of view. These control laws are formulated into a cost function, which is optimised in real time to task sensors. By controlling an optical telescope the system has been able to construct and maintain a catalogue
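
    A bootstrap particle filter of the kind described reduces to three steps: propagate particles through the dynamics with process noise, weight them by the measurement likelihood, and resample. The sketch below uses a toy 1-D constant-velocity model with noisy position measurements; the orbital dynamics and sensor models of the actual system are not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1000
        particles = rng.normal([0.0, 1.0], [5.0, 0.5], size=(N, 2))  # columns: position, velocity

        def pf_step(particles, z, dt=1.0, q=0.05, r=2.0):
            # predict: propagate dynamics, inflate uncertainty with process noise
            particles[:, 0] += particles[:, 1] * dt
            particles += rng.normal(0.0, q, particles.shape)
            # update: weight by measurement likelihood, then resample
            w = np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
            idx = rng.choice(N, size=N, p=w / w.sum())
            return particles[idx]

        for z in [1.2, 2.1, 2.9, 4.2]:                # simulated position measurements
            particles = pf_step(particles, z)
        print("mean state:", particles.mean(axis=0))  # uncertainty is the particle spread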

  1. Process for the reconstruction of three-dimensional images of an area of interest of an object comprising the combination of measurements over the entire object with measurements of an area of interest of said object, and appropriate installation

    DOEpatents

    Azevedo, Stephen; Grangeat, Pierre; Rizo, Philippe

    1995-01-01

    Process and installation making it possible to reconstitute precise images of an area of interest (2) of an object (1) by reducing the errors produced by the contribution of the complement of the object. A first series of measurements is carried out, where a conical beam (10) only takes in the area of interest of the object (2), and this is followed by a second series of measurements in which the beam takes in the entire object. A combination of the measurements of the two series is carried out in order to make them compatible and obtain a more accurate image of the area of interest (2).

  2. Precision Cosmology

    NASA Astrophysics Data System (ADS)

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  3. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, cross-validation is commonly used; however, we show that it is not only computationally intensive but can also be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.

  4. High-numerical-aperture cryogenic light microscopy for increased precision of superresolution reconstructions

    PubMed Central

    Nahmani, Marc; Lanahan, Conor; DeRosier, David; Turrigiano, Gina G.

    2017-01-01

    Superresolution microscopy has fundamentally altered our ability to resolve subcellular proteins, but improving on these techniques to study dense structures composed of single-molecule-sized elements has been a challenge. One possible approach to enhance superresolution precision is to use cryogenic fluorescent imaging, reported to reduce fluorescent protein bleaching rates, thereby increasing the precision of superresolution imaging. Here, we describe an approach to cryogenic photoactivated localization microscopy (cPALM) that permits the use of a room-temperature high-numerical-aperture objective lens to image frozen samples in their native state. We find that cPALM increases photon yields and show that this approach can be used to enhance the effective resolution of two photoactivatable/switchable fluorophore-labeled structures in the same frozen sample. This higher resolution, two-color extension of the cPALM technique will expand the accessibility of this approach to a range of laboratories interested in more precise reconstructions of complex subcellular targets. PMID:28348224

  5. Isolation and genetic analysis of pure cells from forensic biological mixtures: The precision of a digital approach.

    PubMed

    Fontana, F; Rapone, C; Bregola, G; Aversa, R; de Meo, A; Signorini, G; Sergio, M; Ferrarini, A; Lanzellotto, R; Medoro, G; Giorgini, G; Manaresi, N; Berti, A

    2017-07-01

    The latest genotyping technologies make it possible to achieve a reliable genetic profile for offender identification even from extremely minute biological evidence. The ultimate challenge occurs when genetic profiles need to be retrieved from a mixture, which is composed of biological material from two or more individuals. In this case, DNA profiling will often result in a complex genetic profile, which then becomes the subject of statistical analysis. In principle, when several individuals contribute to a mixture with different biological fluids, their single genetic profiles can be obtained by separating the distinct cell types (e.g. epithelial cells, blood cells, sperm) prior to genotyping. Different approaches have been investigated for this purpose, such as fluorescence-activated cell sorting (FACS) or laser capture microdissection (LCM), but currently none of these methods can guarantee the complete separation of the different types of cells present in a mixture. In other fields of application, such as oncology, DEPArray™ technology, an image-based, microfluidic digital sorter, has been widely proven to enable the separation of pure cells, with single-cell precision. This study investigates the applicability of DEPArray™ technology to forensic sample analysis, focusing on the resolution of the forensic mixture problem. For the first time, we report here the development of an application-specific DEPArray™ workflow enabling the detection and recovery of pure homogeneous cell pools from simulated blood/saliva and semen/saliva mixtures, providing a full genetic match with the genetic profiles of the corresponding donors. In addition, we assess the performance of standard forensic methods for DNA quantitation and genotyping on low-count, DEPArray™-isolated cells, showing that pure, almost complete profiles can be obtained from as few as ten haploid cells. Finally, we explore the applicability in real casework samples, demonstrating that the described approach provides complete

  6. High resolution and high precision on line isotopic analysis of Holocene and glacial ice performed in the field

    NASA Astrophysics Data System (ADS)

    Gkinis, V.; Popp, T. J.; Johnsen, S. J.; Blunier, T.; Bigler, M.; Stowasser, C.; Schüpbach, S.; Leuenberger, D.

    2010-12-01

    Ice core records as obtained from polar ice caps provide a wealth of paleoclimatic information. One of the main features of ice cores is their potential for high temporal resolution. The isotopic signature of the ice, expressed through the relative abundances of the two heavy isotopologues H218O and HD16O, is a widely used proxy for the reconstruction of past temperature and accumulation. One step further, the combined information obtained from these two isotopologues, commonly referred to as the deuterium excess, can be utilized to infer additional information about the source of the precipitated moisture. Until very recently, isotopic analysis of polar ice was performed with Isotope Ratio Mass Spectrometry (IRMS) in a discrete fashion, resulting in a high workload related to the preparation of samples. More importantly, the available temporal resolution of the ice core was in many cases not fully exploited. In order to overcome these limitations we have developed a system that interfaces a commercially available IR laser cavity ring-down spectrometer tailored for water isotope analysis to a stream of liquid water as extracted from a continuously melted ice rod. The system offers the possibility of simultaneous δ18O and δD analysis with a sample requirement of approximately 0.1 ml/min. The system has been deployed in the field during the NEEM ice core drilling project in 2009 and 2010. In this study we present actual on-line measurements of Holocene and glacial ice. We also discuss how parameters such as the melt rate, acquisition rate and integration time affect the obtained precision and resolution, and we describe data analysis techniques that can improve these last two parameters. By applying spectral methods we are able to quantify the smoothing effects imposed by diffusion of the sample in the sample transfer lines and the optical cavity of the instrument. We demonstrate that with an acquisition rate of 0.2 Hz we are able to obtain a precision of 0.5‰ and 0
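
    The δ values mentioned above follow the standard delta notation: a measured isotope ratio R is expressed relative to a reference standard in per mil, δ = (R_sample/R_standard − 1) × 1000. A minimal sketch with an illustrative sample value:

        # Commonly cited 18O/16O ratio of the VSMOW reference standard
        R_VSMOW_18O = 2005.2e-6

        def delta_permil(r_sample, r_standard):
            return (r_sample / r_standard - 1.0) * 1000.0

        # hypothetical depleted polar-ice sample ratio
        print(delta_permil(1.935e-3, R_VSMOW_18O))  # about -35 per mil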

  7. Precision medicine and precision therapeutics: hedgehog signaling pathway, basal cell carcinoma and beyond.

    PubMed

    Mohan, Shalini V; Chang, Anne Lynn S

    2014-06-01

    Precision medicine and precision therapeutics are currently in their infancy, with tremendous potential to improve patient care by better identifying individuals at risk for skin cancer and predicting tumor responses to treatment. This review focuses on the Hedgehog signaling pathway, its critical role in the pathogenesis of basal cell carcinoma, and the emergence of targeted treatments for advanced basal cell carcinoma. Opportunities to utilize precision medicine are outlined, such as molecular profiling to predict basal cell carcinoma response to targeted therapy and to inform therapeutic decisions.

  8. Precision and accuracy of suggested maxillary and mandibular landmarks with cone-beam computed tomography for regional superimpositions: An in vitro study.

    PubMed

    Lemieux, Genevieve; Carey, Jason P; Flores-Mir, Carlos; Secanell, Marc; Hart, Adam; Lagravère, Manuel O

    2016-01-01

    Our objective was to identify and evaluate the accuracy and precision (intrarater and interrater reliabilities) of various anatomic landmarks for use in 3-dimensional maxillary and mandibular regional superimpositions. We used cone-beam computed tomography reconstructions of 10 human dried skulls to locate 10 landmarks in the maxilla and the mandible. Precision and accuracy were assessed with intrarater and interrater readings. Three examiners located these landmarks in the cone-beam computed tomography images 3 times with readings scheduled at 1-week intervals. Three-dimensional coordinates were determined (x, y, and z coordinates), and the intraclass correlation coefficient was computed to determine intrarater and interrater reliabilities, as well as the mean error difference and confidence intervals for each measurement. Bilateral mental foramina, bilateral infraorbital foramina, anterior nasal spine, incisive canal, and nasion showed the highest precision and accuracy in both intrarater and interrater reliabilities. Subspinale and bilateral lingulae had the lowest precision and accuracy in both intrarater and interrater reliabilities. When choosing the most accurate and precise landmarks for 3-dimensional cephalometric analysis or plane-derived maxillary and mandibular superimpositions, bilateral mental and infraorbital foramina, landmarks in the anterior region of the maxilla, and nasion appeared to be the best options of the analyzed landmarks. Caution is needed when using subspinale and bilateral lingulae because of their higher mean errors in location. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  9. Patient-Centered Precision Health In A Learning Health Care System: Geisinger's Genomic Medicine Experience.

    PubMed

    Williams, Marc S; Buchanan, Adam H; Davis, F Daniel; Faucett, W Andrew; Hallquist, Miranda L G; Leader, Joseph B; Martin, Christa L; McCormick, Cara Z; Meyer, Michelle N; Murray, Michael F; Rahm, Alanna K; Schwartz, Marci L B; Sturm, Amy C; Wagner, Jennifer K; Williams, Janet L; Willard, Huntington F; Ledbetter, David H

    2018-05-01

    Health care delivery is increasingly influenced by the emerging concepts of precision health and the learning health care system. Although not synonymous with precision health, genomics is a key enabler of individualized care. Delivering patient-centered, genomics-informed care based on individual-level data in the current national landscape of health care delivery is a daunting challenge. Problems to overcome include data generation, analysis, storage, and transfer; knowledge management and representation for patients and providers at the point of care; process management; and outcomes definition, collection, and analysis. Development, testing, and implementation of a genomics-informed program requires multidisciplinary collaboration and building the concepts of precision health into a multilevel implementation framework. Using the principles of a learning health care system provides a promising solution. This article describes the implementation of population-based genomic medicine in an integrated learning health care system-a working example of a precision health program.

  10. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising the importance of just-in-time and agile manufacturing much more than before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-product production. Since such a line produces various models on the same assembly line, rational management is of special importance. With this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. By taking the paint line into account as well, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We formulated the sequencing problem as a bi-objective optimization problem that simultaneously prevents various line stoppages and reduces the volume of WIP inventory, and we propose a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front; the resulting scalarized problem is solved by a meta-heuristic method such as simulated annealing (SA). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
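
    A minimal sketch of the weighted-sum-plus-SA scheme follows: the two objectives are scalarized as w·f1 + (1 − w)·f2 and the weight is swept to trace an approximate Pareto front. The objective functions below are simple stand-ins, not the paper's line-stoppage and WIP models.

        import math, random

        def anneal(seq, f1, f2, w, T=1.0, cool=0.995, iters=5000):
            cost = lambda s: w * f1(s) + (1 - w) * f2(s)
            cur, best = seq[:], seq[:]
            for _ in range(iters):
                cand = cur[:]
                i, j = random.sample(range(len(cand)), 2)
                cand[i], cand[j] = cand[j], cand[i]   # swap two positions in the sequence
                d = cost(cand) - cost(cur)
                if d < 0 or random.random() < math.exp(-d / T):
                    cur = cand
                    if cost(cur) < cost(best):
                        best = cur[:]
                T *= cool
            return best

        # stand-in objectives over a two-model sequence (0 = model A, 1 = model B)
        f1 = lambda s: sum(a == b for a, b in zip(s, s[1:]))  # runs of identical models
        f2 = lambda s: sum(abs(sum(s[:k]) - 0.5 * k) for k in range(1, len(s) + 1))  # imbalance
        for w in (0.2, 0.5, 0.8):
            best = anneal([0, 1] * 8, f1, f2, w)
            print(w, f1(best), f2(best))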

  11. Assessing accuracy and precision for field and laboratory data: a perspective in ecosystem restoration

    USGS Publications Warehouse

    Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly

    2016-01-01

    Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.

  12. Parallel Flux Tensor Analysis for Efficient Moving Object Detection

    DTIC Science & Technology

    2011-07-01

    computing as well as parallelization to enable real-time performance in analyzing complex video [3, 4]. There are a number of challenging computer vision… We use the trace of the flux tensor matrix, referred to as Tr J_F, defined as Tr J_F = ∫_Ω W(x − y) (I_xt²(y) + I_yt²(y) + I_tt²(y)) dy (4), where W is a spatial weighting window and subscripts denote partial derivatives.
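
    A sketch of that computation in Python: the mixed spatio-temporal derivatives are squared, summed, and averaged over a spatial window, so moving pixels produce large responses. The window size and the synthetic test volume are illustrative.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def flux_tensor_trace(video, window=5):
            """video: (T, H, W) float array; returns the Tr J_F response volume."""
            Ix = np.gradient(video, axis=2)                 # dI/dx
            Iy = np.gradient(video, axis=1)                 # dI/dy
            Ixt = np.gradient(Ix, axis=0)                   # d2I/dxdt
            Iyt = np.gradient(Iy, axis=0)                   # d2I/dydt
            Itt = np.gradient(np.gradient(video, axis=0), axis=0)  # d2I/dt2
            resp = Ixt**2 + Iyt**2 + Itt**2
            return uniform_filter(resp, size=(1, window, window))  # spatial window W

        video = np.zeros((8, 32, 32))
        video[np.arange(8), 10, 5 + np.arange(8)] = 1.0     # a point moving left to right
        print(flux_tensor_trace(video).max())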

  13. COGNITION, ACTION, AND OBJECT MANIPULATION

    PubMed Central

    Rosenbaum, David A.; Chapman, Kate M.; Weigelt, Matthias; Weiss, Daniel J.; van der Wel, Robrecht

    2012-01-01

    Although psychology is the science of mental life and behavior, it has paid little attention to the means by which mental life is translated into behavior. One domain where links between cognition and action have been explored is the manipulation of objects. This article reviews psychological research on this topic, with special emphasis on the tendency to grasp objects differently depending on what one plans to do with the objects. Such differential grasping has been demonstrated in a wide range of object manipulation tasks, including grasping an object in a way that reveals anticipation of the object's future orientation, height, and required placement precision. Differential grasping has also been demonstrated in a wide range of behaviors, including one-hand grasps, two-hand grasps, walking, and transferring objects from place to place as well as from person to person. The populations in whom the tendency has been shown are also diverse, including nonhuman primates as well as human adults, children, and babies. Meanwhile, the tendency is compromised in a variety of clinical populations and in children of a surprisingly advanced age. Verbal working memory is compromised as well if words are memorized while object manipulation tasks are performed; the recency portion of the serial position curve is reduced in this circumstance. In general, the research reviewed here points to rich connections between cognition and action as revealed through the study of object manipulation. Other implications concern affordances, Donders' Law, and naturalistic observation and the teaching of psychology. PMID:22448912

  14. Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent

    PubMed Central

    De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle

    2018-01-01

    Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
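
    The core low-precision primitive in such schemes is quantization with stochastic (rather than nearest) rounding, which keeps the quantizer unbiased so gradient steps remain correct in expectation. A minimal sketch, assuming a fixed-point grid of spacing 2^-8; this illustrates the rounding mode only, not the Buckwild! system itself.

        import numpy as np

        rng = np.random.default_rng(0)

        def stochastic_round(x, scale=2**-8):
            """Quantize x to multiples of `scale`; unbiased, i.e. E[result] == x."""
            y = np.asarray(x, float) / scale
            lo = np.floor(y)
            round_up = rng.random(y.shape) < (y - lo)  # round up with prob. equal to the remainder
            return (lo + round_up) * scale

        g = np.array([0.00123, -0.0007, 0.0031])       # a small gradient vector
        print(stochastic_round(g))                     # one random draw
        print(np.mean([stochastic_round(g) for _ in range(20000)], axis=0))  # close to g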

  15. Estimating effective data density in a satellite retrieval or an objective analysis

    NASA Technical Reports Server (NTRS)

    Purser, R. J.; Huang, H.-L.

    1993-01-01

    An attempt is made to formulate consistent objective definitions of the concept of 'effective data density' applicable both in the context of satellite soundings and more generally in objective data analysis. The definitions based upon various forms of Backus-Gilbert 'spread' functions are found to be seriously misleading in satellite soundings where the model resolution function (expressing the sensitivity of retrieval or analysis to changes in the background error) features sidelobes. Instead, estimates derived by smoothing the trace components of the model resolution function are proposed. The new estimates are found to be more reliable and informative in simulated satellite retrieval problems and, for the special case of uniformly spaced perfect observations, agree exactly with their actual density. The new estimates integrate to the 'degrees of freedom for signal', a diagnostic that is invariant to changes of units or coordinates used.
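
    In the linear (optimal-estimation) setting, the model resolution matrix and the 'degrees of freedom for signal' it integrates to can be written down directly. A minimal sketch with toy matrices, using standard retrieval notation (Jacobian K, observation-error covariance Se, background-error covariance Sa); the matrices are illustrative, not from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 10, 4
        K = rng.normal(size=(m, n))                 # forward-model Jacobian (toy)
        Sa = np.eye(n)                              # background error covariance
        Se = 0.1 * np.eye(m)                        # observation error covariance

        Se_inv = np.linalg.inv(Se)
        G = np.linalg.solve(K.T @ Se_inv @ K + np.linalg.inv(Sa), K.T @ Se_inv)  # gain
        A = G @ K                                   # model resolution matrix
        print("degrees of freedom for signal:", np.trace(A))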

  16. Frontiers of QC Laser spectroscopy for high precision isotope ratio analysis of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Emmenegger, Lukas; Mohn, Joachim; Harris, Eliza; Eyer, Simon; Ibraim, Erkan; Tuzson, Béla

    2016-04-01

    An important milestone for laser spectroscopy was achieved when isotope ratios of greenhouse gases were reported at precision levels that allow addressing research questions in environmental sciences. Real-time data with high temporal resolution at moderate cost and instrument size make the optical approach highly attractive, complementary to the well-established isotope-ratio mass-spectrometry (IRMS) method. Especially appealing, in comparison to IRMS, is the inherent specificity to structural isomers having the same molecular mass. Direct absorption in the MIR in single or dual QCL configuration has proven highly reliable for the stable isotopes of CO2, N2O and CH4. The longest time series of real-time measurements is currently available for δ13C and δ18O in CO2 at the high-alpine station Jungfraujoch. At this well-equipped site, QCL-based direct absorption spectroscopy (QCLAS) measurements have been ongoing since 2008 [1,2]. Applications of QCLAS for N2O and CH4 stable isotopes are considerably more challenging because of the lower atmospheric mixing ratios, especially for the less abundant species, such as N218O and CH3D. For high precision (< 0.1 ‰) measurements in ambient air, QCLAS may be combined with a fully automated preconcentration unit yielding an up to 500 times concentration increase and the capability to separate the target gas from spectral interferants by sequential desorption [3]. Here, we review our recent developments on high precision isotope ratio analysis of greenhouse gases, with special focus on the isotopic species of N2O and CH4. Furthermore, we show environmental applications illustrating the highly valuable information that isotope ratios of atmospheric trace gases can carry. For example, the intramolecular distribution of 15N in N2O gives important information on the geochemical cycle of N2O [4-6], while the analysis of δ13C and δD in CH4 may be applied to disentangle microbial, fossil and landfill sources [7]. [1] Sturm, P., Tuzson, B

  17. Graph-Based Object Class Discovery

    NASA Astrophysics Data System (ADS)

    Xia, Shengping; Hancock, Edwin R.

    We are interested in the problem of discovering the set of object classes present in a database of images using a weakly supervised graph-based framework. Rather than making use of the "Bag-of-Features (BoF)" approach widely used in current work on object recognition, we represent each image by a graph using a group of selected local invariant features. Using local feature matching and iterative Procrustes alignment, we perform graph matching and compute a similarity measure. Borrowing the idea of query expansion, we develop a similarity propagation based graph clustering (SPGC) method. Using this method, class-specific clusters of the graphs can be obtained. Such a cluster can be generally represented by a higher-level graph model whose vertices are the clustered graphs, and whose edge weights are determined by the pairwise similarity measure. Experiments are performed on a dataset in which the number of images increases from 1 to 50K and the number of objects increases from 1 to over 500. Some objects have been discovered with total recall and a precision of 1 in a single cluster.

  18. Impact of PET/CT system, reconstruction protocol, data analysis method, and repositioning on PET/CT precision: An experimental evaluation using an oncology and brain phantom.

    PubMed

    Mansor, Syahir; Pfaehler, Elisabeth; Heijtel, Dennis; Lodge, Martin A; Boellaard, Ronald; Yaqub, Maqsood

    2017-12-01

    In longitudinal oncological and brain PET/CT studies, it is important to understand the repeatability of quantitative PET metrics in order to assess change in tracer uptake. The present studies were performed in order to assess precision as a function of PET/CT system, reconstruction protocol, analysis method, scan duration (or image noise), and repositioning in the field of view. Multiple (repeated) scans have been performed using a NEMA image quality (IQ) phantom and a 3D Hoffman brain phantom filled with 18F solutions on two systems. Studies were performed with and without randomly (< 2 cm) repositioning the phantom, and all scans (12 replicates for the IQ phantom and 10 replicates for the Hoffman brain phantom) were performed at equal count statistics. For the NEMA IQ phantom, we studied the recovery coefficients (RC) of the maximum (SUVmax), peak (SUVpeak), and mean (SUVmean) uptake in each sphere as a function of experimental conditions (noise level, reconstruction settings, and phantom repositioning). For the 3D Hoffman phantom, the mean activity concentration was determined within several volumes of interest, and activity recovery and its precision were studied as a function of experimental conditions. The impact of phantom repositioning on RC precision was mainly seen on the Philips Ingenuity PET/CT, especially in the case of smaller spheres (< 17 mm diameter, P < 0.05). This effect was much smaller for the Siemens Biograph system. When exploring SUVmax, SUVpeak, or SUVmean of the spheres in the NEMA IQ phantom, it was observed that precision depended on phantom repositioning, reconstruction algorithm, and scan duration, with SUVmax being most and SUVpeak least sensitive to phantom repositioning. For the brain phantom, regional averaged SUVs were only minimally affected by phantom repositioning (< 2 cm). The precision of quantitative PET metrics depends on the combination of reconstruction protocol, data analysis methods and scan duration (scan
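
    All three uptake metrics compared here (SUVmax, SUVpeak, SUVmean) build on the same standardized uptake value normalization: tissue activity concentration divided by injected dose per unit body weight. A minimal sketch with illustrative numbers (decay correction omitted):

        def suv(activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
            # With tissue density taken as 1 g/mL, kBq/mL divided by (kBq of dose
            # per gram of body weight) is dimensionless.
            dose_per_gram = injected_dose_mbq * 1000.0 / (body_weight_kg * 1000.0)
            return activity_kbq_per_ml / dose_per_gram

        print(suv(activity_kbq_per_ml=12.0, injected_dose_mbq=250.0, body_weight_kg=75.0))  # ~3.6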

  19. Extracting contours of oval-shaped objects by Hough transform and minimal path algorithms

    NASA Astrophysics Data System (ADS)

    Tleis, Mohamed; Verbeek, Fons J.

    2014-04-01

    Circular and oval-like objects are very common in cell biology and microbiology. These objects need to be analyzed, and to that end, digitized images from the microscope are used in an automated analysis pipeline. It is essential to detect all the objects in an image as well as to extract the exact contour of each individual object. In this manner it becomes possible to perform measurements on these objects, i.e. shape and texture features. Our measurement objective is achieved by probing contour detection through dynamic programming. In this paper we describe a method that uses the Hough transform and two minimal path algorithms to detect contours of (ovoid-like) objects. These algorithms are based on an existing grey-weighted distance transform and a new algorithm to extract the circular shortest path in an image. The methods are tested on an artificial dataset of 1000 images, with an F1-score of 0.972. In a case study with yeast cells, contours from our methods were compared with another solution using Pratt's figure of merit. Results indicate that our methods were more precise based on a comparison with a ground-truth dataset. As far as yeast cells are concerned, the segmentation and measurement results enable, in future work, retrieval of information from different developmental stages of the cell using complex features.
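
    The Hough-transform stage for locating circular object candidates is available off the shelf in OpenCV; the sketch below detects a synthetic circle and stands in for the detection step only (the grey-weighted distance transform and circular-shortest-path refinement are not reproduced, and all parameters are illustrative).

        import cv2
        import numpy as np

        img = np.zeros((200, 200), np.uint8)
        cv2.circle(img, (100, 100), 40, 255, 2)       # synthetic test object
        img = cv2.GaussianBlur(img, (5, 5), 0)

        circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                                   param1=100, param2=20, minRadius=20, maxRadius=80)
        if circles is not None:
            for x, y, r in np.round(circles[0]).astype(int):
                print(f"circle candidate at ({x}, {y}), radius {r}")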

  20. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model leads to low computational efficiency, which is not appropriate for high-frequency (e.g., 1 Hz) real-time GNSS clock estimation. A mixed-differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed-differenced model to realize high-frequency multi-GNSS real-time clock updating, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed-differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The statistical analysis of the real-time augmentation message SISRE is about 4-7 cm for GPS, while about 10 cm for BeiDou IGSO/MEO and Galileo and about 30 cm
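
    A commonly used global-average form of SISRE combines the radial orbit error and clock error coherently, and down-weights the along- and cross-track errors; the weights are constellation- and altitude-dependent (w_r ≈ 0.98 and w_ac ≈ 1/7 are typical GPS MEO values, taken here as assumptions):

        import math

        def sisre(dr, da, dc, dclk, w_r=0.98, w_ac=1.0 / 7.0):
            """dr/da/dc: radial/along/cross orbit errors (m); dclk: clock error (m)."""
            return math.sqrt((w_r * dr - dclk) ** 2 + w_ac ** 2 * (da ** 2 + dc ** 2))

        print(sisre(dr=0.03, da=0.10, dc=0.08, dclk=0.02))  # on the order of a few cm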

  1. Development and preliminary testing of an instrumented object for force analysis during grasping.

    PubMed

    Romeo, R A; Cordella, F; Zollo, L; Formica, D; Saccomandi, P; Schena, E; Carpino, G; Davalli, A; Sacchetti, R; Guglielmelli, E

    2015-01-01

    This paper presents the design and realization of an instrumented object for force analysis during grasping. The spherical object has been constructed with three contact areas in order to allow a tripod grasp. Force Sensing Resistor (FSR) sensors have been employed for normal force measurements, while an accelerometer has been used for slip detection. An electronic board for data acquisition has been embedded into the object, so that only the power supply cables exit from it. Validation tests have been carried out for: (i) comparing the force measurements with a ground truth; (ii) assessing the capability of the accelerometer to detect slippage for different roughness values; (iii) evaluating object performance in grasp trials performed by a human subject.
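    A common way to detect slip with an accelerometer is to threshold the high-frequency content of the acceleration magnitude, since incipient slip excites vibrations that voluntary motion does not. The sketch below illustrates that generic approach and is not the authors' algorithm; the cutoff frequency and threshold are placeholder values.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def detect_slip(acc_xyz, fs, cutoff_hz=50.0, threshold=0.5):
            """Return sample indices where high-frequency acceleration suggests slip.

            acc_xyz: (N, 3) accelerometer samples; fs: sampling rate in Hz.
            """
            mag = np.linalg.norm(acc_xyz, axis=1)
            b, a = butter(4, cutoff_hz / (fs / 2.0), btype="high")
            hf = filtfilt(b, a, mag)  # removes gravity and slow voluntary motion
            return np.flatnonzero(np.abs(hf) > threshold)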

  2. Objective determination of image end-members in spectral mixture analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Tompkins, Stefanie; Mustard, John F.; Pieters, Carle M.; Forsyth, Donald W.

    1993-01-01

    Spectral mixture analysis has been shown to be a powerful, multifaceted tool for the analysis of multi- and hyper-spectral data. Applications of AVIRIS data have ranged from mapping soils and bedrock to ecosystem studies. During the first phase of the approach, a set of end-members is selected from an image cube (image end-members) that best accounts for its spectral variance within a constrained, linear least-squares mixing model. These image end-members are usually selected using a priori knowledge, with successive trial-and-error solutions used to refine the total number and physical location of the end-members. However, in many situations a more objective method of determining these essential components is desired. We approach the problem of image end-member determination objectively by using the inherent variance of the data. Unlike purely statistical methods such as factor analysis, this approach derives solutions that conform to a physically realistic model.
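    Once image end-members are fixed, each pixel is unmixed with a constrained linear least-squares model. The snippet below sketches one standard formulation: non-negative abundances with the sum-to-one constraint approximated by a heavily weighted row of ones. It is a generic illustration, not the authors' exact solver.

        import numpy as np
        from scipy.optimize import nnls

        def unmix(pixel_spectrum, endmembers, w=1e3):
            """Non-negative, approximately sum-to-one abundances for one pixel.

            endmembers: (n_bands, n_endmembers) matrix of end-member spectra;
            w weights the appended sum-to-one constraint row.
            """
            A = np.vstack([endmembers, w * np.ones(endmembers.shape[1])])
            b = np.append(pixel_spectrum, w)
            abundances, residual = nnls(A, b)
            return abundances, residual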

  3. THE PRISM MULTI-OBJECT SURVEY (PRIMUS). II. DATA REDUCTION AND REDSHIFT FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cool, Richard J.; Moustakas, John; Blanton, Michael R.

    2013-04-20

    The PRIsm MUlti-object Survey (PRIMUS) is a spectroscopic galaxy redshift survey to z ~ 1 completed with a low-dispersion prism and slitmasks allowing for simultaneous observations of ~2500 objects over 0.18 deg². The final PRIMUS catalog includes ~130,000 robust redshifts over 9.1 deg². In this paper, we summarize the PRIMUS observational strategy and present the data reduction details used to measure redshifts, redshift precision, and survey completeness. The survey motivation, observational techniques, fields, target selection, slitmask design, and observations are presented in Coil et al. Comparisons to existing higher-resolution spectroscopic measurements show a typical precision of σ_z/(1 + z) = 0.005. PRIMUS, both in area and number of redshifts, is the largest faint galaxy redshift survey completed to date and is allowing for precise measurements of the relationship between active galactic nuclei and their hosts, the effects of environment on galaxy evolution, and the build-up of galactic systems over the latter half of cosmic history.

  4. Object-Based Dense Matching Method for Maintaining Structure Characteristics of Linear Buildings

    PubMed Central

    Yan, Yiming; Qiu, Mingjie; Zhao, Chunhui; Wang, Liguo

    2018-01-01

    In this paper, we propose a novel object-based dense matching method designed specifically for high-precision disparity maps of building objects in urban areas, which can maintain accurate object structure characteristics. The proposed framework mainly includes three stages. Firstly, an improved edge line extraction method is proposed so that edge segments fit closely to building outlines. Secondly, a fusion method is proposed for the outlines under the constraint of straight lines; it maintains the structural attributes of buildings with parallel or vertical edges, which is very useful for dense matching. Finally, we propose an edge constraint and outline compensation (ECAOC) dense matching method to maintain building object structural characteristics in the disparity map. In the proposed method, the improved edge lines are used to optimize the matching search scope and matching template window, and the high-precision building outlines are used to compensate the shape features of building objects. Our method can greatly increase the matching accuracy of building objects in urban areas, especially at building edges. For the outline extraction experiments, our fusion method demonstrates superiority and robustness on panchromatic images from different satellites at different resolutions. For the dense matching experiments, our ECAOC method shows great advantages in matching accuracy for building objects in urban areas compared with three other methods. PMID:29596393

  5. PROSPECT - A precision oscillation and spectrum experiment

    NASA Astrophysics Data System (ADS)

    Langford, T. J.; PROSPECT Collaboration

    2015-08-01

    Segmented antineutrino detectors placed near a compact research reactor provide an excellent opportunity to probe short-baseline neutrino oscillations and precisely measure the reactor antineutrino spectrum. Close proximity to a reactor combined with minimal overburden yield a high background environment that must be managed through shielding and detector technology. PROSPECT is a new experimental effort to detect reactor antineutrinos from the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory, managed by UT Battelle for the U.S. Department of Energy. The detector will use novel lithium-loaded liquid scintillator capable of neutron/gamma pulse shape discrimination and neutron capture tagging. These enhancements improve the ability to identify neutrino inverse-beta decays (IBD) and reject background events in analysis. Results from these efforts will be covered along with their implications for an oscillation search and a precision spectrum measurement.

  6. Analysis of 14C and 13C in teeth provides precise birth dating and clues to geographical origin

    PubMed Central

    K, Alkass; BA, Buchholz; H, Druid; KL, Spalding

    2011-01-01

    The identification of human bodies in situations where there are no clues as to the person's identity from circumstantial data poses a difficult problem to investigators. The determination of the age and sex of the body can be crucial in order to limit the search to individuals that are a possible match. We analyzed the proportion of bomb-pulse-derived carbon-14 (14C) incorporated in the enamel of teeth from individuals from different geographical locations. The 'bomb pulse' refers to a significant increase in 14C levels in the atmosphere caused by above-ground test detonations of nuclear weapons during the cold war (1955-1963). By comparing 14C levels in enamel with 14C atmospheric levels systematically recorded over time, high-precision birth dating of modern biological material is possible. Above-ground nuclear bomb testing was largely restricted to a couple of locations in the northern hemisphere, producing differences in atmospheric 14C levels at various geographical regions, particularly in the early phase. Therefore, we examined the precision of 14C birth dating of enamel as a function of time of formation and geographical location. We also investigated the use of the stable isotope 13C as an indicator of the geographical origin of an individual. Dental enamel was isolated from 95 teeth extracted from 84 individuals to study the precision of the 14C method along the bomb spike. For teeth formed before 1955 (N = 17), all but one tooth showed negative Δ14C values. Analysis of enamel from teeth formed during the rising part of the bomb spike (1955-1963, N = 12) and after the peak (>1963, N = 66) resulted in an average absolute date-of-birth estimation error of 1.9 ± 1.4 and 1.3 ± 1.0 years, respectively. Geographical location of an individual had no adverse effect on the precision of year-of-birth estimation using radiocarbon dating. In 46 teeth, measurement of 13C was also performed. Scandinavian teeth showed a substantially greater depression in average δ13C

  7. Analysis of 14C and 13C in teeth provides precise birth dating and clues to geographical origin.

    PubMed

    Alkass, K; Buchholz, B A; Druid, H; Spalding, K L

    2011-06-15

    The identification of human bodies in situations where there are no clues as to the person's identity from circumstantial data poses a difficult problem to investigators. The determination of the age and sex of the body can be crucial in order to limit the search to individuals that are a possible match. We analyzed the proportion of bomb-pulse-derived carbon-14 ((14)C) incorporated in the enamel of teeth from individuals from different geographical locations. The 'bomb pulse' refers to a significant increase in (14)C levels in the atmosphere caused by above-ground test detonations of nuclear weapons during the cold war (1955-1963). By comparing (14)C levels in enamel with (14)C atmospheric levels systematically recorded over time, high-precision birth dating of modern biological material is possible. Above-ground nuclear bomb testing was largely restricted to a couple of locations in the northern hemisphere, producing differences in atmospheric (14)C levels at various geographical regions, particularly in the early phase. Therefore, we examined the precision of (14)C birth dating of enamel as a function of time of formation and geographical location. We also investigated the use of the stable isotope (13)C as an indicator of the geographical origin of an individual. Dental enamel was isolated from 95 teeth extracted from 84 individuals to study the precision of the (14)C method along the bomb spike. For teeth formed before 1955 (N=17), all but one tooth showed negative Δ(14)C values. Analysis of enamel from teeth formed during the rising part of the bomb spike (1955-1963, N=12) and after the peak (>1963, N=66) resulted in an average absolute date-of-birth estimation error of 1.9±1.4 and 1.3±1.0 years, respectively. Geographical location of an individual had no adverse effect on the precision of year-of-birth estimation using radiocarbon dating. In 46 teeth, measurement of (13)C was also performed. Scandinavian teeth showed a substantially greater depression in

  8. Multi-object segmentation framework using deformable models for medical imaging analysis.

    PubMed

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for the extraction of anatomical or functional structures in medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows several deformable models to be integrated to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected for different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed
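    As a rough illustration of the building block that DMA coordinates, the sketch below evolves one classical snake per seed point with scikit-image's active_contour. It is a much-simplified stand-in: DMA's control module, model interactions, and 3D support are not represented, and all parameter values are arbitrary.

        import numpy as np
        from skimage.filters import gaussian
        from skimage.segmentation import active_contour

        def evolve_snakes(image, seeds, radius=30, n_points=200):
            """Evolve one independent circular snake around each (row, col) seed."""
            smoothed = gaussian(image, sigma=3, preserve_range=True)
            t = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
            contours = []
            for (r, c) in seeds:
                init = np.column_stack([r + radius * np.sin(t),
                                        c + radius * np.cos(t)])
                contours.append(active_contour(smoothed, init,
                                               alpha=0.015, beta=10, gamma=0.001))
            return contours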

  9. Wide-field Precision Kinematics of the M87 Globular Cluster System

    NASA Astrophysics Data System (ADS)

    Strader, Jay; Romanowsky, Aaron J.; Brodie, Jean P.; Spitler, Lee R.; Beasley, Michael A.; Arnold, Jacob A.; Tamura, Naoyuki; Sharples, Ray M.; Arimoto, Nobuo

    2011-12-01

    We present the most extensive combined photometric and spectroscopic study to date of the enormous globular cluster (GC) system around M87, the central giant elliptical galaxy in the nearby Virgo Cluster. Using observations from DEIMOS and the Low Resolution Imaging Spectrometer at Keck, and Hectospec on the Multiple Mirror Telescope, we derive new, precise radial velocities for 451 GCs around M87, with projected radii from ~5 to 185 kpc. We combine these measurements with literature data for a total sample of 737 objects, which we use for a re-examination of the kinematics of the GC system of M87. The velocities are analyzed in the context of archival wide-field photometry and a novel Hubble Space Telescope catalog of half-light radii, which includes sizes for 344 spectroscopically confirmed clusters. We use this unique catalog to identify 18 new candidate ultracompact dwarfs and to help clarify the relationship between these objects and true GCs. We find much lower values for the outer velocity dispersion and rotation of the GC system than in earlier papers and also differ from previous work in seeing no evidence for a transition in the inner halo to a potential dominated by the Virgo Cluster, nor for a truncation of the stellar halo. We find little kinematical evidence for an intergalactic GC population. Aided by the precision of the new velocity measurements, we see significant evidence for kinematical substructure over a wide range of radii, indicating that M87 is in active assembly. A simple, scale-free analysis finds less dark matter within ~85 kpc than in other recent work, reducing the tension between X-ray and optical results. In general, out to a projected radius of ~150 kpc, our data are consistent with the notion that M87 is not dynamically coupled to the Virgo Cluster; the core of Virgo may be in the earliest stages of assembly.

  10. In pursuit of precision: the calibration of minds and machines in late nineteenth-century psychology.

    PubMed

    Benschop, R; Draaisma, D

    2000-01-01

    A prominent feature of late nineteenth-century psychology was its intense preoccupation with precision. Precision was at once an ideal and an argument: the quest for precision helped psychology to establish its status as a mature science, sharing a characteristic concern with the natural sciences. We will analyse how psychologists set out to produce precision in 'mental chronometry', the measurement of the duration of psychological processes. In his Leipzig laboratory, Wundt inaugurated an elaborate research programme on mental chronometry. We will look at the problem of calibration of experimental apparatus and will describe the intricate material, literary, and social technologies involved in the manufacture of precision. First, we shall discuss some of the technical problems involved in the measurement of ever shorter time-spans. Next, the Cattell-Berger experiments will help us to argue against the received view that all the precision went into the hardware, and practically none into the social organization of experimentation. Experimenters made deliberate efforts to bring themselves and their subjects under a regime of control and calibration similar to that which reigned over the experimental machinery. In Leipzig psychology, the particular blend of material and social technology resulted in a specific object of study: the generalized mind. We will then show that the distribution of precision in experimental psychology outside Leipzig demanded a concerted effort of instruments, texts, and people. It will appear that the forceful attempts to produce precision and uniformity had some rather paradoxical consequences.

  11. Mechanism and experimental research on ultra-precision grinding of ferrite

    NASA Astrophysics Data System (ADS)

    Ban, Xinxing; Zhao, Huiying; Dong, Longchao; Zhu, Xueliang; Zhang, Chupeng; Gu, Yawen

    2017-02-01

    Ultra-precision grinding of ferrite is conducted to investigate the removal mechanism. The effect of the accuracy of key machine tool components on grinding surface quality is analyzed, and a surface generation model for ferrite ultra-precision grinding is established. Furthermore, to reveal the surface formation mechanism of ferrite during ultra-precision grinding, a calculation model of the grinding surface roughness is proposed and its soundness and accuracy verified. An orthogonal experiment is designed for ferrite, a typical hard and brittle material, using a high-precision aerostatic turntable and aerostatic spindle. Based on the experimental results, the factors influencing the ultra-precision ground surface of ferrite and their governing laws are discussed through analysis of the surface roughness. The results show that the grinding surface quality is optimal at a wheel speed of 20000 r/min, a feed rate of 10 mm/min, a grinding depth of 0.005 mm, and a turntable rotary speed of 5 r/min, where the surface roughness Ra reaches 75 nm.

  12. Study on manufacturing method of optical surface with high precision in angle and surface

    NASA Astrophysics Data System (ADS)

    Yu, Xin; Li, Xin; Yu, Ze; Zhao, Bin; Zhang, Xuebin; Sun, Lipeng; Tong, Yi

    2016-10-01

    This paper studies a manufacturing process for optical surfaces with high precision in both angle and surface figure. Through theoretical analysis of the relationship between angle precision and surface figure, measurement conversion of the technical indicators, application of the optical-cement method, and design of the optical-cement tooling, the experiment was finished successfully and the processing method verified; the method can also be used in the manufacture of optical surfaces with similarly high precision in angle and surface figure.

  13. Microfluidics cell sample preparation for analysis: Advances in efficient cell enrichment and precise single cell capture

    PubMed Central

    Bian, Shengtai; Cheng, Yinuo; Shi, Guanya; Liu, Peng; Ye, Xiongying

    2017-01-01

    Single cell analysis has received increasing attention recently in both academia and clinics, and there is an urgent need for effective upstream cell sample preparation. Two extremely challenging tasks in cell sample preparation—high-efficiency cell enrichment and precise single cell capture—have now entered into an era full of exciting technological advances, which are mostly enabled by microfluidics. In this review, we summarize the categories of technologies that provide new solutions and creative insights into the two tasks of cell manipulation, with a focus on the latest developments of the recent five years, highlighting representative works. By doing so, we aim both to outline the framework and to showcase example applications of each task. In most cases for cell enrichment, we take circulating tumor cells (CTCs) as the target cells because of their research and clinical importance in cancer. For single cell capture, we review related technologies for many kinds of target cells, because these technologies are supposed to be more universal to all cells rather than CTCs alone. Most of the mentioned technologies can be used for both cell enrichment and precise single cell capture. Each technology has its own advantages and specific challenges, which provide opportunities for researchers in their own areas. Overall, these technologies have shown great promise and are now evolving into real clinical applications. PMID:28217240

  14. In situ sulfur isotope analysis of sulfide minerals by SIMS: Precision and accuracy, with application to thermometry of ~3.5Ga Pilbara cherts

    USGS Publications Warehouse

    Kozdon, R.; Kita, N.T.; Huberty, J.M.; Fournelle, J.H.; Johnson, C.A.; Valley, J.W.

    2010-01-01

    Secondary ion mass spectrometry (SIMS) measurement of sulfur isotope ratios is a potentially powerful technique for in situ studies in many areas of Earth and planetary science. Tests were performed to evaluate the accuracy and precision of sulfur isotope analysis by SIMS in a set of seven well-characterized, isotopically homogeneous natural sulfide standards. The spot-to-spot and grain-to-grain precision for δ34S is ± 0.3‰ for chalcopyrite and pyrrhotite, and ± 0.2‰ for pyrite (2SD) using a 1.6 nA primary beam that was focused to 10 µm diameter with a Gaussian-beam density distribution. Likewise, multiple δ34S measurements within single grains of sphalerite are within ± 0.3‰. However, between individual sphalerite grains, δ34S varies by up to 3.4‰ and the grain-to-grain precision is poor (± 1.7‰, n = 20). Measured values of δ34S correspond with analysis pit microstructures, ranging from smooth surfaces for grains with high δ34S values to pronounced ripples and terraces in analysis pits from grains featuring low δ34S values. Electron backscatter diffraction (EBSD) shows that individual sphalerite grains are single crystals, whereas crystal orientation varies from grain to grain. The 3.4‰ variation in measured δ34S between individual grains of sphalerite is attributed to changes in instrumental bias caused by different crystal orientations with respect to the incident primary Cs+ beam. High δ34S values in sphalerite correlate with orientations in which the Cs+ beam is parallel to the set of directions from [111] to [110], which are preferred directions for channeling and focusing in diamond-centered cubic crystals. Crystal orientation effects on instrumental bias were further detected in galena. However, as a result of the perfect cleavage along {100}, crushed chips of galena are typically cube-shaped and likely to be preferentially oriented, thus crystal orientation effects on instrumental bias may be obscured. Tests were made to improve the analytical

  15. Image Tiling for Profiling Large Objects

    NASA Technical Reports Server (NTRS)

    Venkataraman, Ajit; Schock, Harold; Mercer, Carolyn R.

    1992-01-01

    Three-dimensional surface measurements of large objects are required in a variety of industrial processes. The nature of these measurements is changing as optical instruments are beginning to replace conventional contact probes scanned over the objects. A common characteristic of optical surface profilers is the trade-off between measurement accuracy and field of view. In order to measure a large object with high accuracy, multiple views are required, and an accurate transformation between the different views is needed to bring about their registration. In this paper, we demonstrate how the transformation parameters can be obtained precisely by choosing control points which lie in the overlapping regions of the images. A good starting point for the transformation parameters is obtained from knowledge of the scanner position. The selection of the control points is independent of the object geometry. By successively recording multiple views and obtaining transformations with respect to a single coordinate system, a complete physical model of an object can be obtained. Since all data are in the same coordinate system, they can be used for building automatic models of free-form surfaces.
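    Given matched control points in the overlap region, the view-to-view transformation has a closed-form least-squares solution. The sketch below shows the standard Kabsch/Procrustes estimate of a rigid rotation and translation; it illustrates the general idea rather than the paper's exact procedure.

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares R, t with dst ~ R @ src + t; src, dst are (N, 2) or (N, 3)."""
            sc, dc = src.mean(axis=0), dst.mean(axis=0)
            H = (src - sc).T @ (dst - dc)            # cross-covariance of centred points
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
            D = np.diag([1.0] * (src.shape[1] - 1) + [d])
            R = Vt.T @ D @ U.T
            t = dc - R @ sc
            return R, t                              # apply as: src @ R.T + t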

  16. Analysis of HY2A precise orbit determination using DORIS

    NASA Astrophysics Data System (ADS)

    Gao, Fan; Peng, Bibo; Zhang, Yu; Evariste, Ngatchou Heutchi; Liu, Jihua; Wang, Xiaohui; Zhong, Min; Lin, Mingsen; Wang, Nazi; Chen, Runjing; Xu, Houze

    2015-03-01

    HY2A is the first Chinese marine dynamic environment satellite. Its payloads include a radar altimeter to measure the sea surface height, in combination with a high-precision orbit to be determined from tracking data. Onboard satellite tracking includes GPS, SLR, and the DORIS DGXX receiver, which delivers phase and pseudo-range measurements. CNES releases raw phase and pseudo-range measurements in the RINEX DORIS 3.0 format and pre-processed Doppler range-rate data in the DORIS 2.2 format. However, the VMSI software package developed by Van Martin Systems, Inc., which is used to estimate HY2A DORIS orbits, can only process Doppler range-rate data and not the DORIS phase data, which are available with much shorter latency. We have proposed a method of constructing phase increment data, which are similar to range-rate data, from RINEX DORIS 3.0 phase data. We compute the HY2A orbits from June 2013 to August 2013 using the POD strategy described in this paper, based on DORIS 2.2 range-rate data and our reconstructed phase increment data. The estimated orbits are evaluated by comparison with the CNES precise orbits and by SLR residuals. Our DORIS-only orbits agree with the precise GPS + SLR + DORIS CNES orbits at 1 cm radially and at about 3 cm in the other two directions. An SLR test with a 50° cutoff elevation shows that the CNES orbit achieves about 1.1 cm accuracy in the radial direction, and our DORIS-only POD solutions are slightly worse. In addition, other HY2A DORIS POD concerns are discussed in this paper. Firstly, we discuss the frequency offset values provided with the RINEX data and find that orbit accuracy is worse when the frequency offset is applied than when it is not. Secondly, HY2A DORIS antenna z-offsets are estimated using two kinds of measurements from June 2013 to August 2013; the results show that measurement errors contribute a total of about 2 cm difference in the estimated z-offset. Finally, we estimate HY2A orbits selecting 3 days with
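    The phase-increment construction can be illustrated in a few lines: differencing consecutive carrier-phase epochs cancels the constant ambiguity and yields range-rate-like observables. The sketch below is a bare-bones illustration that ignores cycle slips, media corrections, and the frequency-offset handling discussed above; the function name and arguments are hypothetical.

        import numpy as np

        def phase_increments(phase_cycles, epochs_s, wavelength_m):
            """Range-rate-like data (m/s) from epoch-differenced carrier phase."""
            d_range = np.diff(phase_cycles) * wavelength_m  # metres of range change
            dt = np.diff(epochs_s)
            return d_range / dt  # comparable to pre-processed Doppler range-rate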

  17. Precise monitoring of global temperature trends from satellites

    NASA Technical Reports Server (NTRS)

    Spencer, Roy W.; Christy, John R.

    1990-01-01

    Passive microwave radiometry from satellites provides more precise atmospheric temperature information than that obtained from the relatively sparse distribution of thermometers over the earth's surface. Accurate global atmospheric temperature estimates are needed for detection of possible greenhouse warming, evaluation of computer models of climate change, and for understanding important factors in the climate system. Analysis of the first 10 years (1979 to 1988) of satellite measurements of lower atmospheric temperature changes reveals a monthly precision of 0.01 C, large temperature variability on time scales from weeks to several years, but no obvious trend for the 10-year period. The warmest years, in descending order, were 1987, 1988, 1983, and 1980. The years 1984, 1985, and 1986 were the coolest.

  18. Knock-Outs, Stick-Outs, Cut-Outs: Clipping Paths Separate Objects from Background.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    1998-01-01

    Outlines a six-step process that allows computer operators, using Photoshop software, to create "knock-outs" to precisely define the path that will serve to separate the object from the background. (SR)

  19. Object-oriented productivity metrics

    NASA Technical Reports Server (NTRS)

    Connell, John L.; Eller, Nancy

    1992-01-01

    Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.

  20. Designing concept maps for a precise and objective description of pharmaceutical innovations

    PubMed Central

    2013-01-01

    Background When a new drug is launched onto the market, information about the new manufactured product is contained in its monograph and in the evaluation report published by national drug agencies. Health professionals need to be able to determine rapidly and easily whether the new manufactured product is potentially useful for their practice. There is therefore a need to identify the best way to group together and visualize the main items of information describing the nature and potential impact of the new drug. The objective of this study was to identify these items of information and to bring them together in a model that could serve as the standard for presenting the main features of a new manufactured product. Methods We developed a preliminary conceptual model of pharmaceutical innovations, based on the knowledge of the authors. We then refined this model using a random sample of 40 new manufactured drugs recently approved by the national drug regulatory authorities in France and covering a broad spectrum of innovations and therapeutic areas. Finally, we used another sample of 20 new manufactured drugs to determine whether the model was sufficiently comprehensive. Results The results of our modeling led to three sub-models described as conceptual maps, representing: i) the medical context for use of the new drug (indications, type of effect, therapeutic arsenal for the same indications); ii) the nature of the novelty of the new drug (new molecule, new mechanism of action, new combination, new dosage, etc.); and iii) the impact of the drug in terms of efficacy, safety and ease of use, compared with other drugs with the same indications. Conclusions Our model can help to standardize information about new drugs released onto the market. It is potentially useful to the pharmaceutical industry, medical journals, editors of drug databases and medical software, and national or international drug regulation agencies, as a means of describing the main properties of new

  1. Genetic diversity of currently circulating rubella viruses: a need to define more precise viral groups.

    PubMed

    Rivailler, P; Abernathy, E; Icenogle, J

    2017-03-01

    Recent studies have shown that the currently circulating rubella viruses are mostly members of two genotypes, 1E and 2B. Also, genetically distinct viruses of genotype 1G have been found in East and West Africa. This study used a Mantel test to objectively include both genetic diversity and geographic location in the definition of lineages, and identified statistically justified lineages (n=13) and sub-lineages (n=9) of viruses within genotypes 1G, 1E and 2B. Genotype 2B viruses were widely distributed, while viruses of genotype 1E as well as 1G and 1J were much more geographically restricted. This analysis showed that more precise groupings for rubella viruses are possible, which should improve the ability to track rubella viruses worldwide. A year-by-year analysis revealed gaps in surveillance that need to be resolved in order to support the surveillance needed for enhanced control and elimination goals for rubella.

  2. Genetic diversity of currently circulating rubella viruses: a need to define more precise viral groups

    PubMed Central

    Rivailler, P

    2017-01-01

    Recent studies have shown that the currently circulating rubella viruses are mostly members of two genotypes, 1E and 2B. Also, genetically distinct viruses of genotype 1G have been found in East and West Africa. This study used a Mantel test to objectively include both genetic diversity and geographic location in the definition of lineages, and identified statistically justified lineages (n=13) and sub-lineages (n=9) of viruses within genotypes 1G, 1E and 2B. Genotype 2B viruses were widely distributed, while viruses of genotype 1E as well as 1G and 1J were much more geographically restricted. This analysis showed that more precise groupings for rubella viruses are possible, which should improve the ability to track rubella viruses worldwide. A year-by-year analysis revealed gaps in surveillance that need to be resolved in order to support the surveillance needed for enhanced control and elimination goals for rubella. PMID:27959771

  3. THE PRISM MULTI-OBJECT SURVEY (PRIMUS). I. SURVEY OVERVIEW AND CHARACTERISTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coil, Alison L.; Moustakas, John; Aird, James

    2011-11-01

    We present the PRIsm MUlti-object Survey (PRIMUS), a spectroscopic faint galaxy redshift survey to z ~ 1. PRIMUS uses a low-dispersion prism and slitmasks to observe ~2500 objects at once in a 0.18 deg² field of view, using the Inamori Magellan Areal Camera and Spectrograph camera on the Magellan I Baade 6.5 m telescope at Las Campanas Observatory. PRIMUS covers a total of 9.1 deg² of sky to a depth of i_AB ~ 23.5 in seven different deep, multi-wavelength fields that have coverage from the Galaxy Evolution Explorer, Spitzer, and either XMM or Chandra, as well as multiple-band optical and near-IR coverage. PRIMUS includes ~130,000 robust redshifts of unique objects with a redshift precision of σ_z/(1 + z) ~ 0.005. The redshift distribution peaks at z ~ 0.6 and extends to z = 1.2 for galaxies and z = 5 for broad-line active galactic nuclei. The motivation, observational techniques, fields, target selection, slitmask design, and observations are presented here, with a brief summary of the redshift precision; a forthcoming paper presents the data reduction, redshift fitting, redshift confidence, and survey completeness. PRIMUS is the largest faint galaxy survey undertaken to date. The high targeting fraction (~80%) and large survey size will allow for precise measures of galaxy properties and large-scale structure to z ~ 1.

  4. Towards the GEOSAT Follow-On Precise Orbit Determination Goals of High Accuracy and Near-Real-Time Processing

    NASA Technical Reports Server (NTRS)

    Lemoine, Frank G.; Zelensky, Nikita P.; Chinn, Douglas S.; Beckley, Brian D.; Lillibridge, John L.

    2006-01-01

    The primary mission objective of the US Navy's GEOSAT Follow-On (GFO) spacecraft is to map the oceans using a radar altimeter. Satellite laser ranging data, especially in combination with altimeter crossover data, offer the only means of determining high-quality precise orbits. Two tuned gravity models, PGS7727 and PGS7777b, were created at NASA GSFC for GFO that reduce the predicted radial orbit error through degree 70 to 13.7 and 10.0 mm, respectively. A macromodel was developed to model the nonconservative forces, and the SLR spacecraft measurement offset was adjusted to remove a mean bias. Using these improved models, satellite ranging data, altimeter crossover data, and Doppler data are used to compute daily medium-precision orbits with a latency of less than 24 hours. Final precise orbits are also computed using these tracking data and exported with a latency of three to four weeks to NOAA for use on the GFO Geophysical Data Records (GDRs). The estimated orbit precision of the daily orbits is between 10 and 20 cm, whereas the precise orbits have a precision of 5 cm.

  5. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    PubMed

    Green, Adam W; Bailey, Larissa L

    2015-01-01

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
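    The quasi-extinction probability at the heart of such an analysis can be approximated with a simple Monte Carlo occupancy simulation. The sketch below is a toy stand-in for the Bayesian occupancy PVA used in the paper: the dynamics, parameter values, and initial occupancy are invented for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)

        def quasi_extinction_prob(n_pools, p_colonize, p_persist,
                                  years=50, threshold=1, n_sims=10_000):
            """Fraction of simulations in which occupied pools drop below threshold."""
            extinct = 0
            for _ in range(n_sims):
                occ = rng.random(n_pools) < 0.5  # arbitrary initial occupancy
                for _ in range(years):
                    survive = occ & (rng.random(n_pools) < p_persist)
                    colonize = ~occ & (rng.random(n_pools) < p_colonize * occ.mean())
                    occ = survive | colonize
                    if occ.sum() < threshold:
                        extinct += 1
                        break
            return extinct / n_sims

        print(quasi_extinction_prob(n_pools=50, p_colonize=0.3, p_persist=0.8))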

  6. Precision Medicine and Men's Health.

    PubMed

    Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith

    2017-07-01

    Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.

  7. Influence of speckle image reconstruction on photometric precision for large solar telescopes

    NASA Astrophysics Data System (ADS)

    Peck, C. L.; Wöger, F.; Marino, J.

    2017-11-01

    Context. High-resolution observations from large solar telescopes require adaptive optics (AO) systems to overcome image degradation caused by Earth's turbulent atmosphere. AO corrections are, however, only partial. Achieving near-diffraction limited resolution over a large field of view typically requires post-facto image reconstruction techniques to reconstruct the source image. Aims: This study aims to examine the expected photometric precision of amplitude reconstructed solar images calibrated using models for the on-axis speckle transfer functions and input parameters derived from AO control data. We perform a sensitivity analysis of the photometric precision under variations in the model input parameters for high-resolution solar images consistent with four-meter class solar telescopes. Methods: Using simulations of both atmospheric turbulence and partial compensation by an AO system, we computed the speckle transfer function under variations in the input parameters. We then convolved high-resolution numerical simulations of the solar photosphere with the simulated atmospheric transfer function, and subsequently deconvolved them with the model speckle transfer function to obtain a reconstructed image. To compute the resulting photometric precision, we compared the intensity of the original image with the reconstructed image. Results: The analysis demonstrates that high photometric precision can be obtained for speckle amplitude reconstruction using speckle transfer function models combined with AO-derived input parameters. Additionally, it shows that the reconstruction is most sensitive to the input parameter that characterizes the atmospheric distortion, and sub-2% photometric precision is readily obtained when it is well estimated.
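    At its core, amplitude reconstruction divides the observed image spectrum by a model speckle transfer function. The sketch below shows that Fourier division for a single frame with a crude regularisation floor; actual speckle reconstruction averages over many short-exposure frames and uses calibrated STF models, so this illustrates the principle only.

        import numpy as np

        def amplitude_reconstruct(observed, stf, eps=1e-3):
            """Deconvolve an image given a model transfer function on the same grid.

            observed: degraded image; stf: model speckle transfer function sampled
            on the image's 2-D frequency grid; eps: floor against noise blow-up.
            """
            O = np.fft.fft2(observed)
            S = np.where(np.abs(stf) < eps, eps, stf)  # clamp poorly transferred modes
            return np.real(np.fft.ifft2(O / S))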

  8. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically oriented gray-level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from landslide training data. Six fuzzy operators were used for the final classification, and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best, with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
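    In the common eCognition-style definitions, the fuzzy operators compared here reduce to simple aggregations of per-criterion membership values. The snippet below sketches those aggregations under that assumption; the membership numbers are invented.

        import numpy as np

        # Membership values of one image object under several fuzzy criteria
        memberships = np.array([0.9, 0.7, 0.85, 0.6])

        fuzzy_and   = memberships.min()   # AND: limited by the weakest evidence
        fuzzy_or    = memberships.max()   # OR: driven by the strongest evidence
        fuzzy_mean  = memberships.mean()  # MEAN arithmetic operator
        fuzzy_and_p = memberships.prod()  # AND(*): product form, penalises many criteria

        print(fuzzy_and, fuzzy_or, fuzzy_mean, fuzzy_and_p)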

  9. Precision Muonium Spectroscopy

    NASA Astrophysics Data System (ADS)

    Jungmann, Klaus P.

    2016-09-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In particular ground state hyperfine structure transitions can be measured by microwave spectroscopy to deliver the muon magnetic moment. The frequency of the 1s-2s transition in the hydrogen-like atom can be determined with laser spectroscopy to obtain the muon mass. With such measurements fundamental physical interactions, in particular quantum electrodynamics, can also be tested at highest precision. The results are important input parameters for experiments on the muon magnetic anomaly. The simplicity of the atom enables further precise experiments, such as a search for muonium-antimuonium conversion for testing charged lepton number conservation and searches for possible antigravity of muons and dark matter.

  10. Very high precision and accuracy analysis of triple isotopic ratios of water. A critical instrumentation comparison study.

    NASA Astrophysics Data System (ADS)

    Gkinis, Vasileios; Holme, Christian; Morris, Valerie; Thayer, Abigail Grace; Vaughn, Bruce; Kjaer, Helle Astrid; Vallelonga, Paul; Simonsen, Marius; Jensen, Camilla Marie; Svensson, Anders; Maffrezzoli, Niccolo; Vinther, Bo; Dallmayr, Remi

    2017-04-01

    We present a performance comparison study between two state-of-the-art Cavity Ring-Down Spectrometers (Picarro L2310-i, L2140-i). The comparison took place during the Continuous Flow Analysis (CFA) campaign for the measurement of the Renland ice core, over a period of three months. Instant and complete vaporisation of the ice core melt stream, as well as of in-house water reference materials, is achieved by accurate control of microflows of liquid into a homemade calibration system following simple principles of the Hagen-Poiseuille law. Both instruments share the same vaporisation unit in a configuration that minimises sample preparation discrepancies between the two analyses. We describe our SMOW-SLAP calibration and measurement protocols for such a CFA application and present quality control metrics acquired daily during the full period of the campaign. The results indicate an unprecedented performance for all three isotopic ratios (δ2H, δ17O, δ18O) in terms of precision, accuracy and resolution. We also comment on the precision and accuracy of the second-order excess parameters of HD16O and H217O over H218O (Dxs, Δ17O). To our knowledge these are the first reported CFA measurements at this level of precision and accuracy for all three isotopic ratios. Differences in the performance of the two instruments are carefully assessed during the measurement and reported here. Our quality control protocols extend to the regime of low water mixing ratios, in which atmospheric vapour measurements often take place and Cavity Ring-Down Analysers show poorer performance due to lower signal-to-noise ratios. We address such issues and propose calibration protocols from which water vapour isotopic analyses can benefit.
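    The second-order excess parameters follow from the standard delta notation. The sketch below collects the conventional definitions (delta values in permil, Dxs as a linear excess, Δ17O in per meg with the usual 0.528 reference slope); these are textbook formulas, not the instruments' internal processing.

        import numpy as np

        def delta_permil(r_sample, r_reference):
            """Delta value in permil relative to a reference isotope ratio."""
            return (r_sample / r_reference - 1.0) * 1000.0

        def d_excess(d2h, d18o):
            """Deuterium excess: Dxs = delta2H - 8 * delta18O (permil)."""
            return d2h - 8.0 * d18o

        def cap_delta_17o(d17o, d18o, slope=0.528):
            """17O excess in per meg, using the logarithmic definition."""
            return (np.log(d17o / 1000.0 + 1.0)
                    - slope * np.log(d18o / 1000.0 + 1.0)) * 1e6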

  11. Precision engineering: an evolutionary perspective.

    PubMed

    Evans, Chris J

    2012-08-28

    Precision engineering is a relatively new name for a technology with roots going back over a thousand years; those roots span astronomy, metrology, fundamental standards, manufacturing and money-making (literally). Throughout that history, precision engineers have created links across disparate disciplines to generate innovative responses to society's needs and wants. This review combines historical and technological perspectives to illuminate precision engineering's current character and directions. It first provides a working definition of precision engineering and then reviews the subject's roots. Examples are given showing the contributions of the technology to society, while simultaneously showing the creative tension between the technological convergence that spurs new directions and the vertical disintegration that optimizes manufacturing economics.

  12. Toward precision medicine in Alzheimer's disease.

    PubMed

    Reitz, Christiane

    2016-03-01

    In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach for disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenetic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is only in its beginning. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into every day clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles in its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.

  13. Precision manometer gauge

    DOEpatents

    McPherson, Malcolm J.; Bellman, Robert A.

    1984-01-01

    A precision manometer gauge which locates a zero height and a measured height of liquid using an open tube in communication with a reservoir adapted to receive the pressure to be measured. The open tube has a reference section carried on a positioning plate which is moved vertically with machine tool precision. Double scales are provided to read the height of the positioning plate accurately, the reference section being inclined for accurate meniscus adjustment, and means being provided to accurately locate a zero or reference position.

  14. Precision manometer gauge

    DOEpatents

    McPherson, M.J.; Bellman, R.A.

    1982-09-27

    A precision manometer gauge which locates a zero height and a measured height of liquid using an open tube in communication with a reservoir adapted to receive the pressure to be measured. The open tube has a reference section carried on a positioning plate which is moved vertically with machine tool precision. Double scales are provided to read the height of the positioning plate accurately, the reference section being inclined for accurate meniscus adjustment, and means being provided to accurately locate a zero or reference position.

  15. The precision problem in conservation and restoration

    USGS Publications Warehouse

    Hiers, J. Kevin; Jackson, Stephen T.; Hobbs, Richard J.; Bernhardt, Emily S.; Valentine, Leonie E.

    2016-01-01

    Within the varied contexts of environmental policy, conservation of imperilled species populations, and restoration of damaged habitats, an emphasis on idealized optimal conditions has led to increasingly specific targets for management. Overly-precise conservation targets can reduce habitat variability at multiple scales, with unintended consequences for future ecological resilience. We describe this dilemma in the context of endangered species management, stream restoration, and climate-change adaptation. Inappropriate application of conservation targets can be expensive, with marginal conservation benefit. Reduced habitat variability can limit options for managers trying to balance competing objectives with limited resources. Conservation policies should embrace habitat variability, expand decision-space appropriately, and support adaptation to local circumstances to increase ecological resilience in a rapidly changing world.

  16. Spatial analysis of NDVI readings with difference sampling density

    USDA-ARS?s Scientific Manuscript database

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  17. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    PubMed

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2017-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, collections of large study cohorts, and the development of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, the petabytes of biomedical data generated by multiple measurement modalities pose a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  18. A farm-level precision land management framework based on integer programming

    PubMed Central

    Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar

    2017-01-01

    Farmland management involves several planning and decision-making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study of a farm in the state of California in the U.S. and show that the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit could be achieved by carefully choosing irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture. PMID:28346499
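    A miniature version of such a model can be written directly as a mixed integer program. The sketch below uses the PuLP library with invented data: binary variables assign one seed variety per field and a shared irrigation budget couples the fields. It is a toy formulation under stated assumptions, not the authors' model.

        from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

        fields, seeds = range(4), ["A", "B"]
        profit = {("A", f): 100 + 10 * f for f in fields}          # hypothetical $/field
        profit.update({("B", f): 120 - 5 * f for f in fields})
        water_need = {"A": 2.0, "B": 3.0}                          # irrigation units/field
        water_cap = 9.0

        prob = LpProblem("farm_plan", LpMaximize)
        x = LpVariable.dicts("plant", [(s, f) for s in seeds for f in fields], cat=LpBinary)

        prob += lpSum(profit[s, f] * x[s, f] for s in seeds for f in fields)
        for f in fields:  # exactly one seed variety per field
            prob += lpSum(x[s, f] for s in seeds) == 1
        prob += lpSum(water_need[s] * x[s, f] for s in seeds for f in fields) <= water_cap

        prob.solve()
        print({f: s for s in seeds for f in fields if x[s, f].value() == 1})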

  19. Variability Analysis: Detection and Classification

    NASA Astrophysics Data System (ADS)

    Eyer, L.

    2005-01-01

    The Gaia mission will offer an exceptional opportunity to perform variability studies. The data homogeneity, its optimised photometric systems, composed of 11 medium and 4-5 broad bands, the high photometric precision in G band of one milli-mag for V = 13-15, the radial velocity measurements and the exquisite astrometric precision for one billion stars will permit a detailed description of variable objects like stars, quasars and asteroids. However the time sampling and the total number of measurements change from one object to another because of the satellite scanning law. The data analysis is a challenge because of the huge amount of data, the complexity of the observed objects and the peculiarities of the satellite, and needs thorough preparation. Experience can be gained by the study of past and present survey analyses and results, and Gaia should be put in perspective with the future large scale surveys, like PanSTARRS or LSST. We present the activities of the Variable Star Working Group and a general plan to digest this unprecedented data set, focusing here on the photometry.

  20. Effects of experimental design on calibration curve precision in routine analysis

    PubMed Central

    Pimentel, Maria Fernanda; Neto, Benício de Barros; Saldanha, Teresa Cristina B.

    1998-01-01

    A computational program which compares the efficiencies of different experimental designs with those of maximum precision (D-optimized designs) is described. The program produces confidence interval plots for a calibration curve and provides information about the number of standard solutions, concentration levels and suitable concentration ranges required to achieve an optimum calibration. Some examples of the application of this novel computational program are given, using both simulated and real data. PMID:18924816
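    The design effect follows from the textbook variance of a fitted straight line: the confidence band half-width at x0 scales with sqrt(1/n + (x0 - xbar)^2 / Sxx), so designs that maximise Sxx shrink the band. The sketch below compares an evenly spaced design with the extreme-point design (D-optimal for a straight line); sigma and the critical t-value are placeholders.

        import numpy as np

        def ci_halfwidth(x_design, x0, sigma=1.0, t_crit=2.0):
            """Half-width of the calibration-line confidence band at x0."""
            n, xbar = len(x_design), np.mean(x_design)
            sxx = np.sum((x_design - xbar) ** 2)
            return t_crit * sigma * np.sqrt(1.0 / n + (x0 - xbar) ** 2 / sxx)

        x0 = np.linspace(0, 10, 5)
        uniform = np.linspace(0, 10, 6)              # six evenly spaced standards
        d_optimal = np.array([0, 0, 0, 10, 10, 10])  # standards at the range extremes
        print(ci_halfwidth(uniform, x0))
        print(ci_halfwidth(d_optimal, x0))           # narrower band for the same effort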

  1. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

    DOE PAGES

    Worcester, Elizabeth

    2015-08-06

    In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

  2. Objective Amplitude of Accommodation Computed from Optical Quality Metrics Applied to Wavefront Outcomes

    PubMed Central

    López-Gil, Norberto; Fernández-Sánchez, Vicente; Thibos, Larry N.; Montés-Micó, Robert

    2010-01-01

    Purpose: We studied the accuracy and precision of 32 objective wavefront methods for finding the amplitude of accommodation in 180 eyes. Methods: Ocular accommodation was stimulated with 0.5 D steps in target vergence spanning the full range of accommodation for each subject. Subjective monocular amplitude of accommodation was measured using two clinical methods: with negative lenses and with a custom Badal optometer. Results: Both subjective methods gave similar results. Results obtained from the Badal optometer were used to test the accuracy of the objective methods. All objective methods showed a lower amplitude of accommodation than the subjective ones, by an amount that varied from 0.2 to 1.1 D depending on the method. The precision of this prediction also varied between subjects, with an average standard error of the mean of 0.1 D that decreased with age. Conclusions: Depth of field increases the subjective amplitude of accommodation, which therefore overestimates the objective amplitude obtained with all the metrics used. The change of spherical aberration in the negative direction during accommodation increases the amplitude of accommodation by an amount that varies with age.

  3. Dynamical Coordination of Hand Intrinsic Muscles for Precision Grip in Diabetes Mellitus.

    PubMed

    Li, Ke; Wei, Na; Cheng, Mei; Hou, Xingguo; Song, Jun

    2018-03-12

    This study investigated the effects of diabetes mellitus (DM) on the dynamical coordination of hand intrinsic muscles during precision grip. Precision grip was tested using a custom designed apparatus with stable and unstable loads, during which the surface electromyographic (sEMG) signals of the abductor pollicis brevis (APB) and first dorsal interosseous (FDI) were recorded simultaneously. Recurrence quantification analysis (RQA) was applied to quantify the dynamical structure of the sEMG signals of the APB and FDI, and cross recurrence quantification analysis (CRQA) was used to assess the intermuscular coupling between the two intrinsic muscles. This study revealed that DM altered the dynamical structure of muscle activation for the FDI and the dynamical intermuscular coordination between the APB and FDI during precision grip. A reinforced feedforward mechanism that compensates for the loss of sensory feedback in DM may be responsible for the stronger intermuscular coupling between the APB and FDI muscles. Sensory deficits in DM remarkably decreased the capacity for online motor adjustment based on sensory feedback, rendering a lower adaptability to environmental uncertainty. This study sheds light on the inherent dynamical properties underlying intrinsic muscle activation and intermuscular coordination for precision grip and the effects of DM on hand sensorimotor function.
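
    A minimal sketch of the recurrence-matrix computation at the core of RQA (numpy only; the embedding dimension, delay, and radius are illustrative, and the study's RQA/CRQA of sEMG involves more careful parameter selection):

        import numpy as np

        def embed(x, dim=3, tau=2):
            # time-delay embedding of a 1-D signal into state vectors
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        def recurrence_rate(x, dim=3, tau=2, eps=0.5):
            X = embed(np.asarray(x, float), dim, tau)
            # pairwise distances between embedded state vectors
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            R = d < eps               # recurrence matrix
            return R.mean()           # recurrence rate (RR)

        rng = np.random.default_rng(0)
        sig = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)
        print("RR =", recurrence_rate(sig).round(3))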

  4. Precise strong lensing mass profile of the CLASH galaxy cluster MACS 2129

    NASA Astrophysics Data System (ADS)

    Monna, A.; Seitz, S.; Balestra, I.; Rosati, P.; Grillo, C.; Halkola, A.; Suyu, S. H.; Coe, D.; Caminha, G. B.; Frye, B.; Koekemoer, A.; Mercurio, A.; Nonino, M.; Postman, M.; Zitrin, A.

    2017-04-01

    We present a detailed strong lensing (SL) mass reconstruction of the core of the galaxy cluster MACS J2129.4-0741 (zcl = 0.589) obtained by combining high-resolution Hubble Space Telescope photometry from the CLASH (Cluster Lensing And Supernova survey with Hubble) survey with new spectroscopic observations from the CLASH-VLT (Very Large Telescope) survey. A bright red passive background galaxy at zsp = 1.36, sextuply lensed in the cluster core, has four radial lensed images located over the three central cluster members. A further 19 background lensed galaxies are spectroscopically confirmed by our VLT survey, including 3 additional multiple systems. A total of 31 multiple images are used in the lensing analysis. This allows us to trace with high precision the total mass profile of the cluster in its very inner region (R < 100 kpc). Our final lensing mass model reproduces the multiple-image systems identified in the cluster core with a high accuracy of 0.4 arcsec. This translates into a high-precision mass reconstruction of MACS 2129, which is constrained at a level of 2 per cent. The cluster has Einstein parameter ΘE = (29 ± 4) arcsec and a projected total mass of Mtot(<ΘE) = (1.35 ± 0.03) × 1014 M⊙ within this radius. Together with the cluster mass profile, we also provide the complete spectroscopic data set for the cluster members and lensed images measured with VLT/Visible Multi-Object Spectrograph within the CLASH-VLT survey.
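
    For orientation, the standard strong-lensing relations connect the Einstein radius quoted above to a projected mass (a generic textbook identity, not this paper's parametric model; D_l, D_s, and D_ls denote angular-diameter distances to the lens, to the source, and between them):

        M(<\theta_E) = \Sigma_{\rm cr}\,\pi\,(D_l\,\theta_E)^2,
        \qquad
        \Sigma_{\rm cr} = \frac{c^2}{4\pi G}\,\frac{D_s}{D_l\,D_{ls}}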

  5. Identification of Child Pedestrian Training Objectives: The Role of Task Analysis and Empirical Research.

    ERIC Educational Resources Information Center

    van der Molen, Hugo H.

    1984-01-01

    Describes a study designed to demonstrate that child pedestrian training objectives may be identified systematically through various task analysis methods, making use of different types of empirical information. Early approaches to analysis of pedestrian tasks are reviewed, and an outline of the Traffic Research Centre's pedestrian task analysis…

  6. Precision Heating Process

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A heat sealing process was developed by SEBRA based on technology that originated in work with NASA's Jet Propulsion Laboratory. The project involved connecting and transferring blood and fluids between sterile plastic containers while maintaining a closed system. SEBRA markets the PIRF Process to manufacturers of medical catheters. It is a precisely controlled method of heating thermoplastic materials in a mold to form or weld catheters and other products. The process offers advantages in fast, precise welding or shape forming of catheters as well as applications in a variety of other industries.

  7. Manufacturing Precise, Lightweight Paraboloidal Mirrors

    NASA Technical Reports Server (NTRS)

    Hermann, Frederick Thomas

    2006-01-01

    A process for fabricating a precise, diffraction-limited, ultra-lightweight, composite-material (matrix/fiber) paraboloidal telescope mirror has been devised. Unlike the traditional process of fabricating heavier glass-based mirrors, this process involves a minimum of manual steps and subjective judgment. Instead, it involves objectively controllable, repeatable steps; hence, it is better suited for mass production. Other processes that have been investigated for the fabrication of precise composite-material lightweight mirrors have resulted in print-through of fiber patterns onto reflecting surfaces, and have not provided adequate structural support for maintaining stable, diffraction-limited surface figures. In contrast, this process does not result in print-through of the fiber pattern onto the reflecting surface and does provide a lightweight, rigid structure capable of maintaining a diffraction-limited surface figure in the face of changing temperature, humidity, and air pressure. The process consists mainly of the following steps: 1. A precise glass mandrel is fabricated by conventional optical grinding and polishing. 2. The mandrel is coated with a release agent and covered with layers of a carbon-fiber composite material. 3. The outer surface of the outer layer of the carbon-fiber composite material is coated with a surfactant chosen to provide for the proper flow of an epoxy resin to be applied subsequently. 4. The mandrel as thus covered is mounted on a temperature-controlled spin table. 5. The table is heated to a suitable temperature and spun at a suitable speed as the epoxy resin is poured onto the coated carbon-fiber composite material. 6. The surface figure of the optic is monitored and adjusted by use of traditional Ronchi, Foucault, and interferometric optical measurement techniques while the speed of rotation and the temperature are adjusted to obtain the desired figure. The proper selection of surfactant, speed of rotation

  8. Precision of Multiple Reaction Monitoring Mass Spectrometry Analysis of Formalin-Fixed, Paraffin-Embedded Tissue

    PubMed Central

    2012-01-01

    We compared the reproducibility of multiple reaction monitoring (MRM) mass spectrometry-based peptide quantitation in tryptic digests from formalin-fixed, paraffin-embedded (FFPE) and frozen clear cell renal cell carcinoma tissues. The analyses targeted a candidate set of 114 peptides previously identified in shotgun proteomic analyses, of which 104 were detectable in FFPE and frozen tissue. Although signal intensities for MRM of peptides from FFPE tissue were on average 66% of those in frozen tissue, median coefficients of variation (CV) for measurements in FFPE and frozen tissues were nearly identical (18–20%). Measurements of lysine C-terminal peptides and arginine C-terminal peptides from FFPE tissue were similarly reproducible (19.5% and 18.3% median CV, respectively). We further evaluated the precision of MRM-based quantitation by analysis of peptides from the HER2 receptor in FFPE and frozen tissues from a HER2-overexpressing mouse xenograft model of breast cancer and in human FFPE breast cancer specimens. We obtained equivalent MRM measurements of HER2 receptor levels in FFPE and frozen mouse xenografts derived from HER2-overexpressing BT474 cells and HER2-negative Sum159 cells. MRM analyses of 5 HER2-positive and 5 HER2-negative human FFPE breast tumors confirmed the results of immunohistochemical analyses, thus demonstrating the feasibility of HER2 protein quantification in FFPE tissue specimens. The data demonstrate that MRM analyses can be performed with equal precision on FFPE and frozen tissues and that lysine-containing peptides can be selected for quantitative comparisons, despite the greater impact of formalin fixation on lysine residues. The data further illustrate the feasibility of applying MRM to quantify clinically important tissue biomarkers in FFPE specimens. PMID:22530795
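
    The headline reproducibility numbers reduce to a per-peptide coefficient of variation across replicate runs, summarized by the median. A minimal sketch with synthetic peak areas (numpy only; the replicate matrix is invented):

        import numpy as np

        rng = np.random.default_rng(1)
        # rows = 104 detectable peptides, columns = 5 replicate injections
        peak_areas = rng.lognormal(mean=10, sigma=0.2, size=(104, 5))

        cv = peak_areas.std(axis=1, ddof=1) / peak_areas.mean(axis=1) * 100
        print(f"median CV = {np.median(cv):.1f}%")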

  9. Precision medicine in pediatric oncology: Lessons learned and next steps

    PubMed Central

    Mody, Rajen J.; Prensner, John R.; Everett, Jessica; Parsons, D. Williams; Chinnaiyan, Arul M.

    2017-01-01

    The maturation of genomic technologies has enabled new discoveries in disease pathogenesis as well as new approaches to patient care. In pediatric oncology, patients may now receive individualized genomic analysis to identify molecular aberrations of relevance for diagnosis and/or treatment. In this context, several recent clinical studies have begun to explore the feasibility and utility of genomics-driven precision medicine. Here, we review the major developments in this field, discuss current limitations, and explore aspects of the clinical implementation of precision medicine, which lack consensus. Lastly, we discuss ongoing scientific efforts in this arena, which may yield future clinical applications. PMID:27748023

  10. Development and Beam-Shape Analysis of an Integrated Fiber-Optic Confocal Probe for High-Precision Central Thickness Measurement of Small-Radius Lenses

    PubMed Central

    Sutapun, Boonsong; Somboonkaew, Armote; Amarit, Ratthasart; Chanhorm, Sataporn

    2015-01-01

    This work describes a new design of a fiber-optic confocal probe suitable for measuring the central thicknesses of small-radius optical lenses or similar objects. The proposed confocal probe utilizes an integrated camera that functions as a shape-encoded position-sensing device. The confocal signal for thickness measurement and beam-shape data for off-axis measurement can be simultaneously acquired using the proposed probe. Placing the probe’s focal point off-center relative to a sample’s vertex produces a non-circular image at the camera’s image plane that closely resembles an ellipse for small displacements. We were able to precisely position the confocal probe’s focal point relative to the vertex point of a ball lens with a radius of 2.5 mm, with a lateral resolution of 1.2 µm. The reflected beam shape based on partial blocking by an aperture was analyzed and verified experimentally. The proposed confocal probe offers a low-cost, high-precision technique, an alternative to a high-cost three-dimensional surface profiler, for tight quality control of small optical lenses during the manufacturing process. PMID:25871720

  11. High-precision shape representation using a neuromorphic vision sensor with synchronous address-event communication interface

    NASA Astrophysics Data System (ADS)

    Belbachir, A. N.; Hofstätter, M.; Litzenberger, M.; Schön, P.

    2009-10-01

    A synchronous communication interface for neuromorphic temporal contrast vision sensors is described and evaluated in this paper. This interface has been designed for ultra-high-speed synchronous arbitration of a temporal contrast image sensor's pixel data. By enabling high-precision timestamping, the system demonstrates its uniqueness in handling peak data rates while preserving the main advantage of neuromorphic electronic systems, namely high and accurate temporal resolution. Based on a synchronous arbitration concept, the timestamping has a resolution of 100 ns. Both synchronous and (state-of-the-art) asynchronous arbiters have been implemented in a neuromorphic dual-line vision sensor chip in a standard 0.35 µm CMOS process. The performance analysis of both arbiters and the advantages of synchronous arbitration over asynchronous arbitration in capturing high-speed objects are discussed in detail.

  12. Precision of DVC approaches for strain analysis in bone imaged with μCT at different dimensional levels.

    NASA Astrophysics Data System (ADS)

    Dall'Ara, Enrico; Peña-Fernández, Marta; Palanca, Marco; Giorgi, Mario; Cristofolini, Luca; Tozzi, Gianluca

    2017-11-01

    Accurate measurement of local strain in heterogeneous and anisotropic bone tissue is fundamental to understand the pathophysiology of musculoskeletal diseases, to evaluate the effect of interventions in preclinical studies, and to optimize the design and delivery of biomaterials. Digital volume correlation (DVC) can be used to measure the three-dimensional displacement and strain fields from micro-computed tomography (µCT) images of loaded specimens. However, this approach is affected by the quality of the input images, by the morphology and density of the tissue under investigation, by the correlation scheme, and by the operational parameters used in the computation. Therefore, for each application the precision of the method should be evaluated. In this paper we present results collected from datasets analyzed in previous studies, as well as new data from a recent experimental campaign, to characterize the relationship between the precision of two different DVC approaches and the spatial resolution of the outputs. Different bone structures scanned with laboratory-source µCT or synchrotron-light µCT (SRµCT) were processed in zero-strain tests to evaluate the precision of the DVC methods as a function of the subvolume size, which ranged from 8 to 2500 micrometers. The results confirmed that for every microstructure the precision of DVC improves for larger subvolume sizes, following power laws. However, for the first time, large differences in the precision of both local and global DVC approaches have been highlighted when SRµCT or in vivo µCT images were used instead of conventional ex vivo µCT. These findings suggest that in situ mechanical testing protocols applied in SRµCT facilities should be optimized to allow DVC analyses of localized strain measurements. Moreover, for in vivo µCT applications, DVC analyses should be performed only with relatively coarse spatial resolution to achieve a reasonable precision of the method. In conclusion
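
    The reported precision-versus-subvolume relationship is a power law, which can be characterized with a log-log linear fit. A minimal sketch (numpy only; the data points are invented for illustration):

        import numpy as np

        size = np.array([8, 50, 150, 500, 1000, 2500], float)   # micrometers
        err = np.array([4000, 900, 350, 120, 70, 30], float)    # microstrain (synthetic)

        # fit log(err) = b * log(size) + log(a), i.e. err = a * size^b
        b, log_a = np.polyfit(np.log(size), np.log(err), 1)
        print(f"precision ~ {np.exp(log_a):.0f} * size^{b:.2f}")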

  13. Multi-object investigation using two-wavelength phase-shift interferometry guided by an optical frequency comb

    NASA Astrophysics Data System (ADS)

    Ibrahim, Dahi Ghareab Abdelsalam; Yasui, Takeshi

    2018-04-01

    Two-wavelength phase-shift interferometry guided by optical frequency combs is presented. We demonstrate the operation of the setup with a large-step sample measured simultaneously with a resolution test target bearing a negative pattern. The technique can investigate multiple objects simultaneously with high precision. Using this technique, several important applications in metrology that require high speed and precision are demonstrated.
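
    The reason two wavelengths extend the measurable step height is the synthetic (beat) wavelength: phase measured on the difference of the two phase maps is unambiguous over a much longer scale. A minimal worked example (the wavelength values are illustrative, not those of the reported setup):

        # synthetic wavelength Lambda = l1 * l2 / |l1 - l2|
        l1, l2 = 632.8e-9, 638.2e-9        # two source wavelengths (m)
        synthetic = l1 * l2 / abs(l1 - l2)
        print(f"synthetic wavelength = {synthetic * 1e6:.1f} um")
        # height steps up to ~synthetic/2 per fringe can be resolved without
        # the 2*pi ambiguity of either single wavelength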

  14. Analysis of the Image Quality of no Ground Controlled Positioning Precision about Surveying and Mapping Satellite

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Hu, X.; Yang, X.; Xie, G.

    2018-04-01

    The image quality of a surveying camera affects the stereoscopic positioning accuracy of a remote sensing satellite. The key factors closely related to image quality are the Modulation Transfer Function (MTF), the Signal-to-Noise Ratio (SNR), and the Quantization Bits (QB). Using "Mapping Satellite-1" imagery as the background, we study the effect of image quality on positioning precision under ground-control-free conditions and evaluate the quantitative relationship between image quality and positioning precision. Finally, we verify the validity of the experimental results by simulating on-orbit degradation of the three factors and counting the number of matching points, the mismatch rate, and the matching residuals of the degraded data. The reasons for the variation in positioning precision are analyzed.

  15. Object-oriented requirements analysis: A quick tour

    NASA Technical Reports Server (NTRS)

    Berard, Edward V.

    1990-01-01

    Of all the approaches to software development, an object-oriented approach appears to be both the most beneficial and the most popular. A description of the object-oriented approach is presented in the form of view graphs.

  16. Conscientious objection to referrals for abortion: pragmatic solution or threat to women’s rights?

    PubMed Central

    2014-01-01

    Background: Conscientious objection has spurred impassioned debate in many Western countries. Some Norwegian general practitioners (GPs) refuse to refer for abortion. Little is known about how GPs carry out their refusals in practice, how they perceive their refusal to fit with their role as professionals, and how refusals impact patients. Empirical data can inform subsequent normative analysis. Methods: Qualitative research interviews were conducted with seven GPs, all Christians. Transcripts were analysed using systematic text condensation. Results: Informants displayed a marked ambivalence towards their own refusal practices. Five main topics emerged in the interviews: 1) carrying out conscientious objection in practice, 2) justification for conscientious objection, 3) challenges when relating to colleagues, 4) ambivalence and consistency, 5) effects on the doctor-patient relationship. Conclusions: Norwegian GP conscientious objectors considered both pros and cons when evaluating their refusal practices. They had settled on a practical compromise, the precise form of which would vary, and which was deemed an acceptable middle way between competing interests. PMID:24571955

  17. Genetic Particle Swarm Optimization–Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection

    PubMed Central

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-01-01

    In the field of multiple-feature Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features, and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple-feature OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO and apply the proposed algorithm to the object-based hybrid multivariate alteration detection model. Two experimental cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO achieves higher overall accuracy (84.17% and 83.59%) and Kappa coefficients (0.6771 and 0.6314) than the other algorithms. Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm do affect the algorithm. The comparison experiments reveal that RMV is more suitable than other functions as the fitness function of a GPSO-based feature selection algorithm. PMID:27483285

  19. Inertial parameter identification using contact force information for an unknown object captured by a space manipulator

    NASA Astrophysics Data System (ADS)

    Chu, Zhongyi; Ma, Ye; Hou, Yueyang; Wang, Fengwen

    2017-02-01

    This paper presents a novel identification method for the complete inertial parameters of an unknown object in space captured by a manipulator in a space robotic system. With strong dynamic and kinematic coupling existing in the robotic system, inertial parameter identification of the unknown object is essential for an effective control strategy based on changes in the attitude and trajectory of the space robot during capturing operations. Conventional studies address only the principle and theory of identification and lack an error analysis for practical scenarios. To resolve this issue, the effect of errors on identification is analyzed first, demonstrating that the accumulation of measurement and estimation errors causes poor identification precision. Meanwhile, a modified identification equation incorporating the contact force, as well as the force/torque of the end-effector, is proposed to weaken the accumulation of errors and improve the identification accuracy. Furthermore, considering severe disturbance conditions caused by various measurement noises, a hybrid algorithm combining Recursive Least Squares with the Affine Projection Sign Algorithm (RLS-APSA) is employed to solve the modified identification equation and ensure a stable identification property. Finally, to verify the validity of the proposed identification method, an ADAMS-MATLAB co-simulation is implemented with multi-degree-of-freedom models of a space robotic system; the numerical results show precise and stable identification performance, which is able to guarantee the execution of aerospace operations and prevent failed control strategies.
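
    A minimal recursive least squares (RLS) sketch for linear-in-parameters identification, the kind of recursion on which a hybrid RLS-APSA estimator builds (numpy only; the regressors and noise are synthetic stand-ins for the measured kinematics and contact forces):

        import numpy as np

        rng = np.random.default_rng(2)
        theta_true = np.array([2.0, -1.0, 0.5])   # unknown inertial parameters
        theta = np.zeros(3)                       # initial estimate
        P = np.eye(3) * 1000.0                    # initial covariance
        lam = 0.99                                # forgetting factor

        for _ in range(500):
            phi = rng.standard_normal(3)          # regressor vector
            y = phi @ theta_true + 0.01 * rng.standard_normal()
            k = P @ phi / (lam + phi @ P @ phi)   # gain
            theta = theta + k * (y - phi @ theta) # estimate update
            P = (P - np.outer(k, phi) @ P) / lam  # covariance update

        print(theta.round(3))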

  20. The International GPS Service (IGS) as a Continuous Reference System for Precise GPS Positioning

    NASA Technical Reports Server (NTRS)

    Neilan, Ruth; Heflin, Michael; Watkins, Michael; Zumberge, James

    1996-01-01

    The International GPS Service for Geodynamics (IGS) is an organization which operates under the auspices of the International Association of Geodesy (IAG) and has been operational since January 1994. The primary objective of the IGS is to provide precise GPS data and data products to support geodetic and geophysical research activities.

  1. Precision gap particle separator

    DOEpatents

    Benett, William J.; Miles, Robin; Jones, II., Leslie M.; Stockton, Cheryl

    2004-06-08

    A system for separating particles entrained in a fluid includes a base with a first channel and a second channel. A precision gap connects the first channel and the second channel. The precision gap is of a size that allows small particles to pass from the first channel into the second channel and prevents large particles from passing from the first channel into the second channel. A cover is positioned over the base unit, the first channel, the precision gap, and the second channel. An input port directs the fluid containing the entrained particles into the first channel. An output port directs the large particles out of the first channel. A port connected to the second channel directs the small particles out of the second channel.

  2. McDonald Observatory Planetary Search - A high precision stellar radial velocity survey for other planetary systems

    NASA Technical Reports Server (NTRS)

    Cochran, William D.; Hatzes, Artie P.

    1993-01-01

    The McDonald Observatory Planetary Search program has surveyed a sample of 33 nearby F, G, and K stars since September 1987 to search for substellar companion objects. Measurements of stellar radial velocity variations to a precision of better than 10 m/s were performed as routine observations to detect Jovian planets in orbit around solar-type stars. Results confirm the detection of a companion object to HD114762.

  3. High precision radial velocities with GIANO spectra

    NASA Astrophysics Data System (ADS)

    Carleo, I.; Sanna, N.; Gratton, R.; Benatti, S.; Bonavita, M.; Oliva, E.; Origlia, L.; Desidera, S.; Claudi, R.; Sissa, E.

    2016-06-01

    Radial velocities (RV) measured from near-infrared (NIR) spectra are a potentially excellent tool to search for extrasolar planets around cool or active stars. High-resolution infrared (IR) spectrographs now available are reaching the high precision of visible instruments, with constant improvement over time. GIANO is an infrared echelle spectrograph at the Telescopio Nazionale Galileo (TNG) and is a powerful tool for providing high-resolution spectra for accurate RV measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other high-spectral-resolution IR instrument has GIANO's capability to cover the entire NIR wavelength range (0.95-2.45 μm) in a single exposure. In this paper we describe the ensemble of procedures that we have developed to measure high-precision RVs on GIANO spectra acquired during the Science Verification (SV) run, using the telluric lines as a wavelength reference. We used the Cross Correlation Function (CCF) method to determine the velocity for both the star and the telluric lines. For this purpose, we constructed two suitable digital masks that include about 2000 stellar lines and a similar number of telluric lines. The method is applied to various targets with different spectral types, from K2V to M8 stars. We reached different precisions depending mainly on the H magnitude: for H ~ 5 we obtain an rms scatter of ~10 m s-1, while for H ~ 9 the standard deviation increases to ~50-80 m s-1. The corresponding theoretical error expectations are ~4 m s-1 and ~30 m s-1, respectively. Finally we provide the RVs measured with our procedure for the targets observed during the GIANO Science Verification.
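
    A toy version of the CCF step (numpy only; the spectrum, mask lines, and +7 km/s shift are synthetic, unlike the ~2000-line masks described above):

        import numpy as np

        c = 299792.458                                  # km/s
        wave = np.linspace(15000, 15100, 5000)          # wavelength grid (Angstrom)
        lines = np.array([15020.0, 15055.0, 15080.0])   # mask line centers
        flux = np.ones_like(wave)
        for l0 in lines:                                # synthetic star, RV = +7 km/s
            flux -= 0.5 * np.exp(-0.5 * ((wave - l0 * (1 + 7.0 / c)) / 0.15) ** 2)

        v_grid = np.linspace(-30, 30, 241)
        # CCF: line depth summed at each Doppler-shifted mask position
        ccf = np.array([np.interp(lines * (1 + v / c), wave, 1 - flux).sum()
                        for v in v_grid])
        print("RV =", v_grid[ccf.argmax()], "km/s")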

  4. On the Concept of Varying Influence Radii for a Successive Corrections Objective Analysis

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.

    1991-01-01

    Users of successive corrections objective analysis have long held that the most accurate analysis is obtained by first analyzing the long wavelengths and then building in the details of the shorter wavelengths by successively decreasing the influence of the more distant observations upon the interpolated values. Using the Barnes method, the filter characteristics were compared for families of response curves that pass through a common point at a reference wavelength. It was found that the filter cutoff is a maximum if the filter parameter that determines the influence of observations is unchanged for both the initial and correction passes. This information was used to define and test the following hypothesis: if accuracy is defined by how well the method retains desired wavelengths and removes undesired wavelengths, then the Barnes method gives the most accurate analyses if the filter parameter on the initial and correction passes is the same. This hypothesis does not follow the usual conceptual approach to successive corrections analysis.
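
    A minimal one-dimensional Barnes sketch with an identical filter parameter on both passes, as the hypothesis prescribes (numpy only; the observations and kappa value are synthetic):

        import numpy as np

        rng = np.random.default_rng(3)
        xo = rng.uniform(0, 10, 80)              # observation locations
        fo = np.sin(xo)                          # observed values
        xg = np.linspace(0, 10, 101)             # analysis grid
        kappa = 1.0                              # Gaussian falloff parameter

        def barnes_pass(xg, xo, values, kappa):
            # Gaussian-weighted average of observation values onto the grid
            w = np.exp(-((xg[:, None] - xo[None, :]) ** 2) / kappa)
            return (w * values).sum(axis=1) / w.sum(axis=1)

        g0 = barnes_pass(xg, xo, fo, kappa)          # initial pass
        resid = fo - np.interp(xo, xg, g0)           # observation-space residuals
        g1 = g0 + barnes_pass(xg, xo, resid, kappa)  # correction pass, same kappa
        print("max error:", np.abs(g1 - np.sin(xg)).max().round(4))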

  5. Study of nanometer-level precise phase-shift system used in electronic speckle shearography and phase-shift pattern interferometry

    NASA Astrophysics Data System (ADS)

    Jing, Chao; Liu, Zhongling; Zhou, Ge; Zhang, Yimo

    2011-11-01

    A nanometer-level precise phase-shift system is designed to realize phase-shift interferometry in electronic speckle shearography pattern interferometry. A PZT is used as the driving component of the phase-shift system, and a flexure-hinge translation component is developed to realize friction-free, clearance-free micro-displacement. A closed-loop control system is designed for high-precision micro-displacement, in which an embedded digital control system executes the control algorithm and a capacitive sensor serves as the feedback element, measuring the micro-displacement in real time. The dynamic model and control model of the nanometer-level precise phase-shift system are analyzed, and on this basis high-precision micro-displacement is realized with a digital PID control algorithm. Experiments prove that the positioning precision of the phase-shift system is better than 2 nm for step displacement signals and better than 5 nm for continuous displacement signals, which satisfies the requirements of electronic speckle shearography and phase-shift pattern interferometry. The fringe images from four-step phase-shift interferometry and the final phase-distribution image, correlated with the deformation of the objects, are presented to demonstrate the validity of the nanometer-level precise phase-shift system.
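
    The reconstruction such a phase-shifter enables is compact: with four frames at 0, 90, 180 and 270 degrees, the wrapped phase follows from an arctangent. A minimal sketch with simulated intensities (numpy only):

        import numpy as np

        phi_true = np.linspace(-np.pi, np.pi, 256)          # test phase ramp
        # I_k = A + B*cos(phi + k*pi/2): the four phase-shifted frames
        I = [1 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
        phi = np.arctan2(I[3] - I[1], I[0] - I[2])          # wrapped phase estimate
        resid = np.angle(np.exp(1j * (phi - phi_true)))     # wrapped residual
        print("max residual:", np.abs(resid).max())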

  6. The Nature of the Nodes, Weights and Degree of Precision in Gaussian Quadrature Rules

    ERIC Educational Resources Information Center

    Prentice, J. S. C.

    2011-01-01

    We present a comprehensive proof of the theorem that relates the weights and nodes of a Gaussian quadrature rule to its degree of precision. This level of detail is often absent in modern texts on numerical analysis. We show that the degree of precision is maximal, and that the approximation error in Gaussian quadrature is minimal, in a…
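
    The theorem is easy to check numerically: an n-point Gauss-Legendre rule integrates monomials exactly up to degree 2n - 1 and fails at degree 2n. A small verification sketch (numpy only):

        import numpy as np

        n = 4
        x, w = np.polynomial.legendre.leggauss(n)
        for k in range(2 * n + 1):
            quad = (w * x**k).sum()
            exact = 0.0 if k % 2 else 2.0 / (k + 1)  # integral of x^k on [-1, 1]
            print(k, abs(quad - exact) < 1e-12)
        # prints True for k = 0..7 (= 2n - 1) and False for k = 8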

  7. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
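
    A minimal sketch of the envelope-spectrum feature extraction (assuming scipy; the footstep-like burst train is synthetic, and the 2 Hz cadence, carrier frequency, and decay constant are invented):

        import numpy as np
        from scipy.signal import hilbert

        fs = 500.0                                   # sampling rate (Hz)
        t = np.arange(0, 8, 1 / fs)
        impacts = (np.sin(2 * np.pi * 2.0 * t) > 0.999).astype(float)  # ~2 Hz steps
        burst = np.convolve(impacts, np.exp(-np.arange(100) / 15.0), mode="same")
        sig = burst * np.sin(2 * np.pi * 50.0 * t)   # modulated ground vibration
        sig += 0.05 * np.random.default_rng(4).standard_normal(t.size)

        envelope = np.abs(hilbert(sig))              # instantaneous amplitude
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
        print("dominant envelope frequency:", freqs[spectrum.argmax()], "Hz")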

  8. A Study on Improvement of Machining Precision in a Medical Milling Robot

    NASA Astrophysics Data System (ADS)

    Sugita, Naohiko; Osa, Takayuki; Nakajima, Yoshikazu; Mori, Masahiko; Saraie, Hidenori; Mitsuishi, Mamoru

    Minimal invasiveness and increased precision have recently become important issues in orthopedic surgery. The femur and tibia must be cut precisely for successful knee arthroplasty. The recent trend towards Minimally Invasive Surgery (MIS) has increased surgical difficulty, since the incision length and open access area are small. In this paper, a deformation analysis of the robot and an active compensation method for robot deformation, based on an error map, are proposed and evaluated.

  9. Precision medicine driven by cancer systems biology.

    PubMed

    Filipp, Fabian V

    2017-03-01

    Molecular insights from genome and systems biology are influencing how cancer is diagnosed and treated. We critically evaluate big data challenges in precision medicine. The melanoma research community has identified distinct subtypes involving chronic sun-induced damage and the mitogen-activated protein kinase driver pathway. In addition, despite a low mutation burden, non-genomic mitogen-activated protein kinase melanoma drivers are found in membrane receptors, metabolism, or epigenetic signaling, with the ability to bypass central mitogen-activated protein kinase molecules and activate a similar program of mitogenic effectors. Mutation hotspots, structural modeling, UV signature, and genomic as well as non-genomic mechanisms of disease initiation and progression are taken into consideration to identify resistance mutations and novel drug targets. A comprehensive precision medicine profile of a malignant melanoma patient illustrates future rational drug targeting strategies. Network analysis emphasizes an important role of epigenetic and metabolic master regulators in oncogenesis. Co-occurrence of driver mutations in signaling, metabolic, and epigenetic factors highlights how cumulative alterations of our genomes and epigenomes progressively lead to uncontrolled cell proliferation. Precision insights can identify independent molecular pathways suitable for drug targeting. Synergistic treatment combinations of orthogonal modalities including immunotherapy, mitogen-activated protein kinase inhibitors, epigenetic inhibitors, and metabolic inhibitors have the potential to overcome immune evasion, side effects, and drug resistance.

  10. Observing exoplanet populations with high-precision astrometry

    NASA Astrophysics Data System (ADS)

    Sahlmann, Johannes

    2012-06-01

    This thesis deals with the application of the astrometry technique, which consists in measuring the position of a star in the plane of the sky, to the discovery and characterisation of extra-solar planets. It is feasible only with very high measurement precision, which motivates the use of space observatories, the development of new ground-based astronomical instrumentation, and innovative data analysis methods: The study of Sun-like stars with substellar companions using CORALIE radial velocities and HIPPARCOS astrometry leads to the determination of the frequency of close brown dwarf companions and to the discovery of a dividing line between massive planets and brown dwarf companions; An observation campaign employing optical imaging with a very large telescope demonstrates sufficient astrometric precision to detect planets around ultra-cool dwarf stars, and the first results of the survey are presented; Finally, the design and initial astrometric performance of PRIMA, a new dual-feed near-infrared interferometric observing facility for relative astrometry, is presented.

  11. Precision Medicine in Cancer Treatment

    Cancer.gov

    Precision medicine helps doctors select cancer treatments that are most likely to help patients based on a genetic understanding of their disease. Learn about the promise of precision medicine and the role it plays in cancer treatment.

  12. Precision Engineering - SRO 154.

    DTIC Science & Technology

    1986-01-01

    Operation: The principle of interferometric displacement measurement devices is that if two identical, coherent, monochromatic light beams are directed...laser interferometric feedback to enhance the accuracy and precision of a lead screw stage. The precision translation stage was designed to produce...and the deepest was 22 micrometers (875 microinches). Figures 5, 6 and 7 are Nomarski photomicrographs showing the beginning, middle and end of a

  13. Precise Estimation of Allele Frequencies of Single-Nucleotide Polymorphisms by a Quantitative SSCP Analysis of Pooled DNA

    PubMed Central

    Sasaki, Tomonari; Tahira, Tomoko; Suzuki, Akari; Higasa, Koichiro; Kukita, Yoji; Baba, Shingo; Hayashi, Kenshi

    2001-01-01

    We show that single-nucleotide polymorphisms (SNPs) of moderate to high heterozygosity (minor allele frequencies >10%) can be efficiently detected, and their allele frequencies accurately estimated, by pooling the DNA samples and applying a capillary-based SSCP analysis. In this method, alleles are separated into peaks, and their frequencies can be reliably and accurately quantified from their peak heights (SD <1.8%). We found that as many as 40% of publicly available SNPs that were analyzed by this method have widely differing allele frequency distributions among groups of different ethnicity (parents of Centre d'Etude Polymorphisme Humaine families vs. Japanese individuals). These results demonstrate the effectiveness of the present pooling method in the reevaluation of candidate SNPs that have been collected by examination of limited numbers of individuals. The method should also serve as a robust quantitative technique for studies in which a precise estimate of SNP allele frequencies is essential—for example, in linkage disequilibrium analysis. PMID:11083945
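
    The quantification step reduces to relative peak heights. A minimal sketch of estimating a minor-allele frequency from a pooled trace, with a ratio correction calibrated on a known heterozygote (a simplified reading of the method above; all numbers are invented):

        h_major, h_minor = 8200.0, 1400.0   # allele peak heights in the pooled trace
        r = 1.05                            # height ratio of the two alleles,
                                            # measured in a known heterozygote
        freq_minor = h_minor / (h_minor + r * h_major)
        print(f"estimated minor allele frequency = {freq_minor:.3f}")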

  14. Three-dimensional localization and optical imaging of objects in turbid media with independent component analysis.

    PubMed

    Xu, M; Alrubaiee, M; Gayen, S K; Alfano, R R

    2005-04-01

    A new approach for optical imaging and localization of objects in turbid media that makes use of the independent component analysis (ICA) from information theory is demonstrated. Experimental arrangement realizes a multisource illumination of a turbid medium with embedded objects and a multidetector acquisition of transmitted light on the medium boundary. The resulting spatial diversity and multiple angular observations provide robust data for three-dimensional localization and characterization of absorbing and scattering inhomogeneities embedded in a turbid medium. ICA of the perturbations in the spatial intensity distribution on the medium boundary sorts out the embedded objects, and their locations are obtained from Green's function analysis based on any appropriate light propagation model. Imaging experiments were carried out on two highly scattering samples of thickness approximately 50 times the transport mean-free path of the respective medium. One turbid medium had two embedded absorptive objects, and the other had four scattering objects. An independent component separation of the signal, in conjunction with diffusive photon migration theory, was used to locate the embedded inhomogeneities. In both cases, improved lateral and axial localizations of the objects over the result obtained by use of common photon migration reconstruction algorithms were achieved. The approach is applicable to different medium geometries, can be used with any suitable photon propagation model, and is amenable to near-real-time imaging applications.
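
    A minimal sketch of the unmixing step, assuming scikit-learn's FastICA; the detector data here are synthetic mixtures of two spatial perturbation patterns, standing in for the measured boundary intensities:

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(5)
        n_scans, n_detectors, n_objects = 200, 64, 2
        # each embedded object contributes one detector-space pattern
        patterns = rng.standard_normal((n_objects, n_detectors))
        mixing = rng.uniform(0.2, 1.0, (n_scans, n_objects))
        data = mixing @ patterns + 0.01 * rng.standard_normal((n_scans, n_detectors))

        ica = FastICA(n_components=n_objects, random_state=0)
        activations = ica.fit_transform(data)    # per-scan source strengths
        est_patterns = ica.mixing_.T              # detector-space patterns
        print(activations.shape, est_patterns.shape)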

  15. Precision Medicine in Gastrointestinal Pathology.

    PubMed

    Wang, David H; Park, Jason Y

    2016-05-01

    Context: Precision medicine is the promise of individualized therapy and management of patients based on their personal biology. There are now multiple global initiatives to perform whole-genome sequencing on millions of individuals. In the United States, an early program was the Million Veteran Program, and a more recent proposal in 2015 by the president of the United States is the Precision Medicine Initiative. To implement precision medicine in routine oncology care, genetic variants present in tumors need to be matched with effective clinical therapeutics. When we focus on the current state of precision medicine for gastrointestinal malignancies, it becomes apparent that there is a mixed history of success and failure. Objective: To present the current state of precision medicine using gastrointestinal oncology as a model. We will present currently available targeted therapeutics, promising new findings in clinical genomic oncology, remaining quality issues in genomic testing, and emerging oncology clinical trial designs. Data Sources: Review of the literature including clinical genomic studies on gastrointestinal malignancies, clinical oncology trials on therapeutics targeted to molecular alterations, and emerging clinical oncology study designs. Conclusions: Translating our ability to sequence thousands of genes into meaningful improvements in patient survival will be the challenge for the next decade.

  16. Robust Optimization and Sensitivity Analysis with Multi-Objective Genetic Algorithms: Single- and Multi-Disciplinary Applications

    DTIC Science & Technology

    2007-01-01

    multi-disciplinary optimization with uncertainty. Robust optimization and sensitivity analysis are usually used when an optimization model has...formulation is introduced in Section 2.3. We briefly discuss several definitions used in the sensitivity analysis in Section 2.4. Following in...2.5. 2.4 SENSITIVITY ANALYSIS: In this section, we discuss several definitions used in Chapter 5 for Multi-Objective Sensitivity Analysis. Inner

  17. The Precision Problem in Conservation and Restoration.

    PubMed

    Hiers, J Kevin; Jackson, Stephen T; Hobbs, Richard J; Bernhardt, Emily S; Valentine, Leonie E

    2016-11-01

    Within the varied contexts of environmental policy, conservation of imperilled species populations, and restoration of damaged habitats, an emphasis on idealized optimal conditions has led to increasingly specific targets for management. Overly-precise conservation targets can reduce habitat variability at multiple scales, with unintended consequences for future ecological resilience. We describe this dilemma in the context of endangered species management, stream restoration, and climate-change adaptation. Inappropriate application of conservation targets can be expensive, with marginal conservation benefit. Reduced habitat variability can limit options for managers trying to balance competing objectives with limited resources. Conservation policies should embrace habitat variability, expand decision-space appropriately, and support adaptation to local circumstances to increase ecological resilience in a rapidly changing world. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Precision Photometry and Astrometry from Pan-STARRS

    NASA Astrophysics Data System (ADS)

    Magnier, Eugene A.; Pan-STARRS Team

    2018-01-01

    The Pan-STARRS 3pi Survey has been calibrated with excellent precision for both astrometry and photometry. The Pan-STARRS Data Release 1, opened to the public on 2016 Dec 16, provides photometry in 5 well-calibrated, well-defined bandpasses (grizy) astrometrically registered to the Gaia frame. Comparisons with other surveys illustrate the high quality of the calibration and provide tests of remaining systematic errors in both Pan-STARRS and those external surveys. With photometry and astrometry of roughly 3 billion astronomical objects, the Pan-STARRS DR1 has substantial overlap with Gaia, SDSS, 2MASS and other surveys. I will discuss the astrometric tie between Pan-STARRS DR1 and Gaia and show comparisons between Pan-STARRS and other large-scale surveys.

  19. Real-time analysis of δ13C- and δD-CH4 by high precision laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Eyer, Simon; Emmenegger, Lukas; Tuzson, Béla; Fischer, Hubertus; Mohn, Joachim

    2014-05-01

    Methane (CH4) is the most important non-CO2 greenhouse gas (GHG), contributing 18% to total radiative forcing. Anthropogenic sources (e.g. ruminants, landfills) contribute 60% of total emissions and led to an increase in its atmospheric mixing ratio from 700 ppb in pre-industrial times to 1819 ± 1 ppb in 2012 [1]. Analysis of the most abundant methane isotopologues 12CH4, 13CH4 and 12CH3D can be used to disentangle the various source/sink processes [2] and to develop target-oriented reduction strategies. High-precision isotopic analysis of CH4 can be accomplished by isotope-ratio mass spectrometry (IRMS) [2] and, more recently, by mid-infrared laser-based spectroscopic techniques. For high-precision measurements in ambient air, however, both techniques rely on preconcentration of the target gas [3]. In an ongoing project, we developed a fully automated, field-deployable CH4 preconcentration unit coupled to a dual quantum cascade laser absorption spectrometer (QCLAS) for real-time analysis of CH4 isotopologues. The core part of the rack-mounted (19 inch) device is a highly efficient adsorbent trap attached to a motorized linear drive system and enclosed in a vacuum chamber. Thereby, the adsorbent trap can be decoupled from the Stirling cooler during desorption, for fast desorption and optimal heat management. A wide variety of adsorbents, including HayeSep D, molecular sieves, and the novel metal-organic frameworks and carbon nanotubes, were characterized regarding their surface area, isosteric enthalpy of adsorption and selectivity for methane over nitrogen. The most promising candidates were tested on the preconcentration device, and preconcentration by a factor > 500 was obtained. Furthermore, analytical interferents (e.g. N2O, CO2) are separated by step-wise desorption of trace gases. A QCL absorption spectrometer previously described by Tuzson et al. (2010) for CH4 flux measurements was modified to obtain a platform for high precision and simultaneous

  20. [Progress in precision medicine: a scientific perspective].

    PubMed

    Wang, B; Li, L M

    2017-01-10

    Precision medicine is a new strategy for disease prevention and treatment by taking into account differences in genetics, environment and lifestyles among individuals and making precise diseases classification and diagnosis, which can provide patients with personalized, targeted prevention and treatment. Large-scale population cohort studies are fundamental for precision medicine research, and could produce best evidence for precision medicine practices. Current criticisms on precision medicine mainly focus on the very small proportion of benefited patients, the neglect of social determinants for health, and the possible waste of limited medical resources. In spite of this, precision medicine is still a most hopeful research area, and would become a health care practice model in the future.

  1. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture.

    PubMed

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-07-22

    The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, different barriers have delayed its wide adoption. Some of the main barriers are expensive equipment, difficulty of operation and maintenance, and the fact that standards for sensor networks are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes in Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control on the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture. They demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.
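
    A minimal sketch of the machine-to-machine layer such a platform could use, assuming the paho-mqtt client library; the broker address, topic, node id, and JSON payload schema are all hypothetical:

        import json, time, random
        import paho.mqtt.publish as publish

        BROKER = "192.168.1.10"          # hypothetical edge-gateway broker

        for _ in range(3):               # a few samples; a real node loops forever
            reading = {
                "node": "greenhouse/row3/node7",            # hypothetical node id
                "ts": time.time(),
                "soil_moisture": round(random.uniform(0.25, 0.45), 3),  # stub sensor
                "air_temp_c": round(random.uniform(18, 32), 1),
            }
            # a retained message keeps the latest node state on the broker
            publish.single("farm/sensors/node7", json.dumps(reading),
                           hostname=BROKER, retain=True)
            time.sleep(60)               # one sample per minute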

  3. Beyond precision surgery: Molecularly motivated precision care for gastric cancer.

    PubMed

    Choi, Y Y; Cheong, J-H

    2017-05-01

    Gastric cancer is one of the leading causes of cancer-related deaths worldwide. Despite the high disease prevalence, gastric cancer research has not gained much attention. Recently, genome-scale technology has made it possible to explore the characteristics of gastric cancer at the molecular level. Accordingly, gastric cancer can be classified into molecular subtypes that convey more detailed information of tumor than histopathological characteristics, and these subtypes are associated with clinical outcomes. Furthermore, this molecular knowledge helps to identify new actionable targets and develop novel therapeutic strategies. To advance the concept of precision patient care in the clinic, patient-derived xenograft (PDX) models have recently been developed. PDX models not only represent histology and genomic features, but also predict responsiveness to investigational drugs in patient tumors. Molecularly curated PDX cohorts will be instrumental in hypothesis generation, biomarker discovery, and drug screening and testing in proof-of-concept preclinical trials for precision therapy. In the era of precision medicine, molecularly tailored therapeutic strategies should be individualized for cancer patients. To improve the overall clinical outcome, a multimodal approach is indispensable for advanced cancer patients. Careful, oncological principle-based surgery, combined with a molecularly guided multidisciplinary approach, will open new horizons in surgical oncology. Copyright © 2017. Published by Elsevier Ltd.

  4. Demonstrating Change with Astronaut Photography Using Object Based Image Analysis

    NASA Technical Reports Server (NTRS)

    Hollier, Andi; Jagge, Amy

    2017-01-01

    Every day, hundreds of images of Earth flood the Crew Earth Observations database as astronauts use hand held digital cameras to capture spectacular frames from the International Space Station. The variety of resolutions and perspectives provide a template for assessing land cover change over decades. We will focus on urban growth in the second fastest growing city in the nation, Houston, TX, using Object-Based Image Analysis. This research will contribute to the land change science community, integrated resource planning, and monitoring of the rapid rate of urban sprawl.

  5. Precision medicine in pediatric oncology: Lessons learned and next steps.

    PubMed

    Mody, Rajen J; Prensner, John R; Everett, Jessica; Parsons, D Williams; Chinnaiyan, Arul M

    2017-03-01

    The maturation of genomic technologies has enabled new discoveries in disease pathogenesis as well as new approaches to patient care. In pediatric oncology, patients may now receive individualized genomic analysis to identify molecular aberrations of relevance for diagnosis and/or treatment. In this context, several recent clinical studies have begun to explore the feasibility and utility of genomics-driven precision medicine. Here, we review the major developments in this field, discuss current limitations, and explore aspects of the clinical implementation of precision medicine, which lack consensus. Lastly, we discuss ongoing scientific efforts in this arena, which may yield future clinical applications. © 2016 Wiley Periodicals, Inc.

  6. Causal Video Object Segmentation From Persistence of Occlusions

    DTIC Science & Technology

    2015-05-01

    Precision, recall, and F-measure are reported on the ground-truth annotations converted to binary masks. Note we cannot evaluate “number of...to lack of occlusions.

  7. Precision displacement reference system

    DOEpatents

    Bieg, Lothar F.; Dubois, Robert R.; Strother, Jerry D.

    2000-02-22

    A precision displacement reference system is described that enables real-time accountability of the displacement feedback systems applied to precision machine tools, positioning mechanisms, motion devices, and related operations. As independent measurements of tool location are taken by a displacement feedback system, a rotating reference disk compares feedback counts with the performed motion. These measurements are compared to characterize and analyze real-time mechanical and control performance during operation.

  8. System for Dispensing a Precise Amount of Fluid

    DOEpatents

    Benett, William J.; Krulevitch, Peter A.; Visuri, Steven R.; Dzenitis, John M.; Ness, Kevin D.

    2008-08-12

    A dispensing system delivers a precise amount of fluid for biological or chemical processing and/or analysis. Dispensing means moves the fluid. The dispensing means is operated by a pneumatic force. Connection means delivers the fluid to the desired location. An actuator means provides the pneumatic force to the dispensing means. Valving means transmits the pneumatic force from the actuator means to the dispensing means.

  9. Precision injection molding of freeform optics

    NASA Astrophysics Data System (ADS)

    Fang, Fengzhou; Zhang, Nan; Zhang, Xiaodong

    2016-08-01

    Precision injection molding is the most efficient mass-production technology for manufacturing plastic optics. Applications of plastic optics in the fields of imaging, illumination, and concentration involve a variety of complex surface forms, evolving from conventional plano and spherical surfaces to aspheric and freeform surfaces. These applications require high optical quality, with high form accuracy and low residual stresses, which challenges both the machining of optical tool inserts and the precision injection molding process. The present paper reviews recent progress in mold tool machining and precision injection molding, with emphasis on the latter. The challenges and future development trends are also discussed.

  10. Data Sharing For Precision Medicine: Policy Lessons And Future Directions.

    PubMed

    Blasimme, Alessandro; Fadda, Marta; Schneider, Manuel; Vayena, Effy

    2018-05-01

    Data sharing is a precondition of precision medicine. Numerous organizations have produced abundant guidance on data sharing. Despite such efforts, data are not being shared to a degree that can trigger the expected data-driven revolution in precision medicine. We set out to explore why. Here we report the results of a comprehensive analysis of data-sharing guidelines issued over the past two decades by multiple organizations. We found that the guidelines overlap on a restricted set of policy themes. However, we observed substantial fragmentation in the policy landscape across specific organizations and data types. This may have contributed to the current stalemate in data sharing. To move toward a more efficient data-sharing ecosystem for precision medicine, policy makers should explore innovative ways to cope with central policy themes such as privacy, consent, and data quality; focus guidance on interoperability, attribution, and public engagement; and promote data-sharing policies that can be adapted to multiple data types.

  11. Multi-spectral image analysis for improved space object characterization

    NASA Astrophysics Data System (ADS)

    Glass, William; Duggin, Michael J.; Motes, Raymond A.; Bush, Keith A.; Klein, Meiling

    2009-08-01

    The Air Force Research Laboratory (AFRL) is studying the application and utility of various ground-based and space-based optical sensors for improving surveillance of space objects in both Low Earth Orbit (LEO) and Geosynchronous Earth Orbit (GEO). This information can be used to improve our catalog of space objects and will be helpful in the resolution of satellite anomalies. At present, ground-based optical and radar sensors provide the bulk of remotely sensed information on satellites and space debris, and will continue to do so into the foreseeable future. However, in recent years, the Space-Based Visible (SBV) sensor was used to demonstrate that a synthesis of space-based visible data with ground-based sensor data could provide enhancements to information obtained from any one source in isolation. The incentives for space-based sensing include improved spatial resolution due to the absence of atmospheric effects and cloud cover and increased flexibility for observations. Though ground-based optical sensors can use adaptive optics to somewhat compensate for atmospheric turbulence, cloud cover and absorption are unavoidable. With recent advances in technology, we are in a far better position to consider what might constitute an ideal system to monitor our surroundings in space. This work has begun at the AFRL using detailed optical sensor simulations and analysis techniques to explore the trade space involved in acquiring and processing data from a variety of hypothetical space-based and ground-based sensor systems. In this paper, we briefly review the phenomenology and trade space aspects of what might be required in order to use multiple band-passes, sensor characteristics, and observation and illumination geometries to increase our awareness of objects in space.

  12. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, as is the case for rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been increasing rapidly. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, GeoEye and Worldview, are increasingly considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events in VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis with a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, acquired 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km² study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance in the internal mapping of landslide source and transport areas (true positive rate above 60% and false positive rate below 36% in both validation regions), in particular on the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.
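
    A minimal sketch of the object-based classification step described above, assuming per-object features (e.g., mean band reflectances and mean slope from the pre-event DEM) have already been extracted; scikit-learn's SVC stands in for the supervised learner, and all data here are placeholders:

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Placeholder object-level data: one row per image segment, with columns
    # standing in for mean band reflectances, NDVI, and DEM-derived slope.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.5 * X[:, 3] > 0.8).astype(int)   # 1 = landslide object

    # Scale features, then fit an RBF-kernel SVM on the labeled objects.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X, y)
    predicted = clf.predict(X)   # per-object landslide / background labels
    ```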

  13. Clinical proteomics-driven precision medicine for targeted cancer therapy: current overview and future perspectives.

    PubMed

    Zhou, Li; Wang, Kui; Li, Qifu; Nice, Edouard C; Zhang, Haiyuan; Huang, Canhua

    2016-01-01

    Cancer is a common disease and a leading cause of death worldwide. Currently, early detection and novel therapeutic strategies are urgently needed for more effective management of cancer. Importantly, protein profiling using clinical proteomic strategies, with spectacular sensitivity and precision, offers excellent promise for the identification of potential biomarkers that would direct the development of targeted therapeutic anticancer drugs for precision medicine. In particular, clinical sample sources, including tumor tissues and body fluids (blood, feces, urine and saliva), have been widely investigated using modern high-throughput mass spectrometry-based proteomic approaches combined with bioinformatic analysis, to pursue the possibilities of precision medicine for targeted cancer therapy. Discussed in this review are the current advantages and limitations of clinical proteomics, the available strategies of clinical proteomics for the management of precision medicine, as well as the challenges and future perspectives of clinical proteomics-driven precision medicine for targeted cancer therapy.

  14. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    NASA Astrophysics Data System (ADS)

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses an object's color (spectral information), size, texture, shape and spatial relation to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels together into objects, and then uses a wide range of object properties to classify the objects or to extract object properties from the image. Thanks to GEOBIA, significant advances and improvements in image analysis and interpretation have been made. In June 2010 the third conference on GEOBIA took place at Ghent University, after successful previous meetings in Salzburg (2006) and Calgary (2008). This special issue presents a selection of the 2010 conference papers that were developed into full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping and land cover change to feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected on the basis of quality and topic and are presented in this special issue. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012, where we hope to welcome even more scientists working in the field of GEOBIA.
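
    The segment-then-classify workflow sketched in this introduction can be illustrated with scikit-image; the following is a minimal, hypothetical example in which SLIC stands in for whichever segmentation algorithm a GEOBIA system actually uses:

    ```python
    import numpy as np
    from skimage.data import astronaut          # stand-in for an EO image tile
    from skimage.measure import regionprops
    from skimage.segmentation import slic

    image = astronaut()

    # Step 1: segment the image, grouping pixels together into objects.
    segments = slic(image, n_segments=400, compactness=10, start_label=1)

    # Step 2: per-object properties echoing the eye-brain cues named above:
    # color (spectral mean), size (area), and shape (eccentricity).
    features = np.array([
        [*region.mean_intensity, region.area, region.eccentricity]
        for region in regionprops(segments, intensity_image=image)
    ])
    # 'features' can now feed any per-object classifier.
    ```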

  15. Commercial objectives, technology transfer, and systems analysis for fusion power development

    NASA Astrophysics Data System (ADS)

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other, are the two primary criteria for setting long range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long range objective of commercial fusion applications.

  17. The landscape of precision cancer medicine clinical trials in the United States.

    PubMed

    Roper, Nitin; Stensland, Kristian D; Hendricks, Ryan; Galsky, Matthew D

    2015-05-01

    Advances in tumor biology and multiplex genomic analysis have ushered in the era of precision cancer medicine. Little is currently known, however, about the landscape of prospective "precision cancer medicine" clinical trials in the U.S. We identified all adult interventional cancer trials registered on ClinicalTrials.gov between September 2005 and May 2013. Trials were classified as "precision cancer medicine" if a genomic alteration in a predefined set of 88 genes was required for enrollment. Baseline characteristics were ascertained for each trial. Of the initial 18,797 trials identified, 9094 (48%) were eligible for inclusion: 684 (8%) were classified as precision cancer medicine trials and 8410 (92%) were non-precision cancer medicine trials. Compared with non-precision cancer medicine trials, precision cancer medicine trials were significantly more likely to be phase II [RR 1.19 (1.10-1.29), p<0.001], multi-center [RR 1.18 (1.11-1.26), p<0.001], open-label [RR 1.04 (1.02-1.07), p=0.005] and involve breast [RR 4.03 (3.49-4.52), p<0.001], colorectal [RR 1.62 (1.22-2.14), p=0.002] and skin [RR 1.98 (1.55-2.54), p<0.001] cancers. Precision medicine trials required 38 unique genomic alterations for enrollment. The proportion of precision cancer medicine trials compared to the total number of trials increased from 3% in 2006 to 16% in 2013. The proportion of adult cancer clinical trials in the U.S. requiring a genomic alteration for enrollment has increased substantially over the past several years. However, such trials still represent a small minority of studies performed within the cancer clinical trials enterprise and include a small subset of putatively "actionable" alterations. Copyright © 2015 Elsevier Ltd. All rights reserved.
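
    For readers checking the bracketed statistics, a relative risk and its 95% confidence interval follow from raw counts via the usual log-normal approximation; a small sketch with made-up numbers, not the study's actual data:

    ```python
    import math

    def relative_risk(a, n1, b, n2):
        """RR of an outcome in group 1 (a of n1) vs group 2 (b of n2),
        with a 95% CI from the log-normal approximation."""
        rr = (a / n1) / (b / n2)
        se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
        lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
        return rr, lo, hi

    # Hypothetical counts of phase II trials in each cohort, illustration only.
    print(relative_risk(400, 684, 4100, 8410))
    ```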

  18. Quantifying precision and availability of location memory in everyday pictures and some implications for picture database design.

    PubMed

    Lansdale, Mark W; Oliff, Lynda; Baguley, Thom S

    2005-06-01

    The authors investigated whether memory for object locations in pictures could be exploited to address known difficulties of designing query languages for picture databases. M. W. Lansdale's (1998) model of location memory was adapted to 4 experiments observing memory for everyday pictures. These experiments showed that location memory is quantified by 2 parameters: a probability that memory is available and a measure of its precision. Availability is determined by controlled attentional processes, whereas precision is mostly governed by picture composition beyond the viewer's control. Additionally, participants' confidence judgments were good predictors of availability but were insensitive to precision. This research suggests that databases using location memory are feasible. The implications of these findings for database design and for further research and development are discussed. © 2005 APA
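
    The two parameters map naturally onto a mixture model: with some probability a location is recalled with Gaussian precision, otherwise the response is a uniform guess. A hedged sketch of such a likelihood, not Lansdale's exact formulation:

    ```python
    import numpy as np
    from scipy.stats import norm

    def loc_memory_loglik(errors, p_avail, sigma, extent=1.0):
        """Log-likelihood of placement errors (in normalized picture units)
        under a two-parameter mixture: with probability p_avail the location
        is recalled with Gaussian precision sigma; otherwise the response is
        a uniform guess over the picture extent. Illustrative only."""
        dens = (p_avail * norm.pdf(errors, loc=0.0, scale=sigma)
                + (1.0 - p_avail) / extent)
        return float(np.sum(np.log(dens)))
    ```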

  19. NASA TSRV essential flight control system requirements via object oriented analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Keith S.; Hoza, Bradley J.

    1992-01-01

    The objective was to analyze the baseline flight control system of the Transport Systems Research Vehicle (TSRV) and to develop a system specification that offers high visibility of the essential system requirements, in order to facilitate the future development of alternate, more advanced software architectures. The flight control system is defined as the baseline software for the TSRV research flight deck, including all navigation, guidance, and control functions, and the primary pilot displays. An Object-Oriented Analysis (OOA) methodology is used to develop a system requirements definition. The scope of the requirements definition contained herein is limited to a portion of the Flight Management/Flight Control computer functionality. The development of a partial system requirements definition is documented, including a discussion of the tasks required to increase the scope of the requirements definition and recommendations for follow-on research.

  20. Global gray-level thresholding based on object size.

    PubMed

    Ranefall, Petter; Wählby, Carolina

    2016-04-01

    In this article, we propose a fast and robust global gray-level thresholding method based on object size, where the threshold level is selected for maximum precision and recall with regard to objects within a given size interval. The method relies on the component tree representation, which can be computed in quasi-linear time. Feature-based segmentation is especially suitable for biomedical microscopy applications, where objects often vary in number but have limited variation in size. We show that for real images of cell nuclei and for synthetic data sets mimicking fluorescent spots, the proposed method is more robust than all standard global thresholding methods available for microscopy applications in ImageJ and CellProfiler. The proposed method, provided as ImageJ and CellProfiler plugins, is simple to use, and the only required input is an interval of the expected object sizes. © 2016 International Society for Advancement of Cytometry.
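
    The size-interval idea can be imitated without the component tree by brute force: scan candidate gray levels and keep the one maximizing the fraction of connected components whose area falls in the expected interval. A simplified stand-in for the authors' quasi-linear algorithm, not their implementation:

    ```python
    import numpy as np
    from scipy import ndimage

    def size_based_threshold(img, size_min, size_max, levels=64):
        """Pick a global threshold maximizing the fraction of connected
        components whose area lies in [size_min, size_max]."""
        best_t, best_score = None, -1.0
        for t in np.linspace(img.min(), img.max(), levels):
            labels, n = ndimage.label(img > t)
            if n == 0:
                continue
            sizes = np.bincount(labels.ravel())[1:]   # skip background
            score = np.mean((sizes >= size_min) & (sizes <= size_max))
            if score > best_score:
                best_t, best_score = t, score
        return best_t
    ```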