Sample records for high precision analysis

  1. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    NASA Astrophysics Data System (ADS)

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    To investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was simulated numerically. The distributions of dynamic pressure and turbulent viscosity of the abrasive flow field inside the high-precision dispensing needle were analyzed under different volume-fraction conditions. Comparative analysis confirmed the effectiveness of abrasive-grain polishing of high-precision dispensing needles and showed that controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.

  2. Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis

    PubMed Central

    Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.

    2015-01-01

    Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505

  3. Error analysis of high-rate GNSS precise point positioning for seismic wave measurement

    NASA Astrophysics Data System (ADS)

    Shu, Yuanming; Shi, Yun; Xu, Peiliang; Niu, Xiaoji; Liu, Jingnan

    2017-06-01

    High-rate GNSS precise point positioning (PPP) has been playing an increasingly important role in providing precise positioning information in fast time-varying environments. Although kinematic PPP is commonly known to have a precision of a few centimeters, the precision of high-rate PPP over a short period of time has recently been reported, on the basis of experiments, to reach a few millimeters in the horizontal components and sub-centimeter level in the vertical component when measuring seismic motion, which is several times better than conventional kinematic PPP practice. To fully understand the mechanism behind this unexpectedly good short-term performance of high-rate PPP, we have carried out a theoretical error analysis of PPP and conducted the corresponding short-term simulations. The theoretical analysis clearly indicates that high-rate PPP errors consist of two types: the residual systematic errors at the starting epoch, which affect high-rate PPP through the change of satellite geometry, and the time-varying systematic errors between the starting epoch and the current epoch. Both the theoretical error analysis and the simulated results are fully consistent with, and thus unambiguously confirm, the reported high precision of high-rate PPP, which is further affirmed here by real-data experiments, indicating that high-rate PPP can indeed achieve millimeter-level precision in the horizontal components and sub-centimeter-level precision in the vertical component when measuring motion over a short period of time. The simulation results clearly show that the random noise of carrier phases and higher-order ionospheric errors are the two major factors affecting the precision of high-rate PPP over a short period of time. The real-data experiments also indicate that the precision of PPP solutions can degrade to the centimeter level in both the horizontal and vertical components if the satellite geometry is poor, with a large DOP value.
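
    Since the abstract attributes the centimeter-level degradation to poor satellite geometry (a large DOP value), a minimal Python sketch of the standard DOP computation may help make that dependence concrete; the function name and example geometry below are illustrative assumptions, not material from the paper.

    import numpy as np

    def position_dop(unit_vectors):
        """PDOP from receiver-to-satellite unit line-of-sight vectors.

        Each row of the design matrix is [e_x, e_y, e_z, 1] (the 1 is the
        receiver clock term); PDOP is the root trace of the position block
        of (A^T A)^{-1}. A standard textbook relation, shown for illustration.
        """
        E = np.atleast_2d(unit_vectors)
        A = np.hstack([E, np.ones((E.shape[0], 1))])
        Q = np.linalg.inv(A.T @ A)
        return float(np.sqrt(np.trace(Q[:3, :3])))

    # Illustrative geometry: four well-spread satellites give a modest PDOP.
    los = np.array([[0.0, 0.0, 1.0],
                    [0.8, 0.0, 0.6],
                    [-0.4, 0.7, 0.6],
                    [-0.4, -0.7, 0.6]])
    los = los / np.linalg.norm(los, axis=1, keepdims=True)
    print(f"PDOP = {position_dop(los):.2f}")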

  4. System and method for high precision isotope ratio destructive analysis

    DOEpatents

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  5. Precision and Accuracy of Analysis for Boron in ITP Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tovo, L.L.

    Inductively Coupled Plasma Emission Spectroscopy (ICPES) has been used by the Analytical Development Section (ADS) to measure boron in catalytic tetraphenylboron decomposition studies performed by the Waste Processing Technology (WPT) section. Analysis of these samples is complicated due to the presence of high concentrations of sodium and organic compounds. Previously, we found signal suppression in samples analyzed "as received". We suspected that the suppression was due to the high organic concentration (up to 0.01 molar organic decomposition products) in the samples. When the samples were acid digested prior to analysis, the suppression was eliminated. The precision of the reported boron concentration was estimated as 10 percent based on the known precision of the inorganic boron standard used for calibration and quality control check of the ICPES analysis. However, a precision better than 10 percent was needed to evaluate ITP process operating parameters. Therefore, the purpose of this work was (1) to measure, instead of estimating, the precision of the boron measurement on ITP samples and (2) to determine the optimum precision attainable with current instrumentation.

  6. Study on manufacturing method of optical surface with high precision in angle and surface

    NASA Astrophysics Data System (ADS)

    Yu, Xin; Li, Xin; Yu, Ze; Zhao, Bin; Zhang, Xuebin; Sun, Lipeng; Tong, Yi

    2016-10-01

    This paper studied a manufacturing process for optical surfaces with high precision in both angle and surface figure. Through theoretical analysis of the relationship between angle precision and surface precision, measurement conversion of the technical indicators, application of the optical-cement method, and design of the optical-cement tooling, the experiment was completed successfully and the processing method was verified. The method can also be used in the manufacture of optical surfaces with similarly high precision requirements in angle and surface.

  7. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms.

    PubMed

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-08-14

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.
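
    The specific configuration and noise-minimising formulas are not given in the abstract; the Python sketch below (with assumed names and values) only illustrates the underlying finite-difference relation by which angular acceleration is recovered from spatially separated linear accelerometers, and how uncorrelated sensor noise propagates into it.

    import numpy as np

    def angular_acceleration(a_1, a_2, baseline):
        """Angular acceleration about the axis perpendicular to the baseline,
        estimated from the tangential readings of two linear accelerometers
        mounted a distance `baseline` apart: alpha ~ (a_2 - a_1) / d.
        Centripetal terms are neglected in this illustration."""
        return (np.asarray(a_2) - np.asarray(a_1)) / baseline

    # Noise propagation for uncorrelated accelerometer noise (assumed values):
    sigma_a = 1e-4      # m/s^2, assumed sensor noise level
    d = 0.2             # m, assumed mounting separation
    sigma_alpha = np.sqrt(2.0) * sigma_a / d
    print(f"angular-acceleration noise ~ {sigma_alpha:.1e} rad/s^2")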

  8. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms

    PubMed Central

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-01-01

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms. PMID:26287203

  9. Ion microprobe measurement of strontium isotopes in calcium carbonate with application to salmon otoliths

    USGS Publications Warehouse

    Weber, P.K.; Bacon, C.R.; Hutcheon, I.D.; Ingram, B.L.; Wooden, J.L.

    2005-01-01

    The ion microprobe has the capability to generate high resolution, high precision isotopic measurements, but analysis of the isotopic composition of strontium, as measured by the 87Sr/86Sr ratio, has been hindered by isobaric interferences. Here we report the first high precision measurements of 87Sr/86Sr by ion microprobe in calcium carbonate samples with moderate Sr concentrations. We use the high mass resolving power (7000 to 9000 M.R.P.) of the SHRIMP-RG ion microprobe in combination with its high transmission to reduce the number of interfering species while maintaining sufficiently high count rates for precise isotopic measurements. The isobaric interferences are characterized by peak modeling and repeated analyses of standards. We demonstrate that by sample-standard bracketing, 87Sr/86Sr ratios can be measured in inorganic and biogenic carbonates with Sr concentrations between 400 and 1500 ppm with ~2‰ external precision (2σ) for a single analysis, and subpermil external precision with repeated analyses. Explicit correction for isobaric interferences (peak-stripping) is found to be less accurate and precise than sample-standard bracketing. Spatial resolution is ~25 μm laterally and 2 μm deep for a single analysis, consuming on the order of 2 ng of material. The method is tested on otoliths from salmon to demonstrate its accuracy and utility. In these growth-banded aragonitic structures, one-week temporal resolution can be achieved. The analytical method should be applicable to other calcium carbonate samples with similar Sr concentrations. Copyright © 2005 Elsevier Ltd.
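
    As a concrete illustration of the sample-standard bracketing step described above, here is a minimal Python sketch; the correction is the generic one (measured sample ratio divided by the mean bias of the bracketing standards), and the function name and numbers are made up for the example, not taken from the paper.

    def bracket_correct(r_sample, r_std_before, r_std_after, r_std_true):
        """Correct a measured 87Sr/86Sr using standards run before and after
        the unknown; assumes the instrumental bias drifts linearly in time."""
        bias = 0.5 * (r_std_before + r_std_after) / r_std_true
        return r_sample / bias

    # Illustrative values only (not measurements from the study):
    print(bracket_correct(r_sample=0.70934,
                          r_std_before=0.70921,
                          r_std_after=0.70927,
                          r_std_true=0.70918))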

  10. High precision spectroscopy and imaging in THz frequency range

    NASA Astrophysics Data System (ADS)

    Vaks, Vladimir L.

    2014-03-01

    Application of microwave methods to the THz frequency range has resulted in the development of high-precision THz spectrometers based on nonstationary effects. The spectrometer characteristics (spectral resolution and sensitivity) meet the requirements for high-precision analysis. Gas analyzers based on these high-precision spectrometers have been successfully applied to analytical investigations of gas impurities in high-purity substances. These investigations can be carried out both in an absorption cell and in a reactor. The devices can be used for ecological monitoring and for detecting components of chemical weapons and explosives in the atmosphere. Another major field for THz investigation is medical application: using the THz spectrometers developed, one can detect markers of some diseases in exhaled air.

  11. High-Precision Image Aided Inertial Navigation with Known Features: Observability Analysis and Performance Evaluation

    PubMed Central

    Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun

    2014-01-01

    A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision position (centimeter-level) and attitude (half-degree-level)-integrated solutions can be achieved in a global reference. PMID:25330046

  12. Hypothesis testing for band size detection of high-dimensional banded precision matrices.

    PubMed

    An, Baiguo; Guo, Jianhua; Liu, Yufeng

    2014-06-01

    Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
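
    The abstract does not spell out the estimator, so the Python sketch below shows only the standard Cholesky-based (nested-regression) construction of a banded precision-matrix estimate for a candidate band size k, the quantity whose selection the paper's test addresses; the test statistic itself is not reproduced, and all names here are illustrative.

    import numpy as np

    def banded_precision_estimate(X, k):
        """Cholesky-based estimate of a precision matrix with band size k.

        Each variable j is regressed on its k immediate predecessors; the
        regression coefficients and residual variances give Omega = T' D^{-1} T.
        A sketch of the standard banded/modified-Cholesky estimator."""
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        T = np.eye(p)
        d = np.empty(p)
        d[0] = Xc[:, 0].var()
        for j in range(1, p):
            lo = max(0, j - k)
            Z = Xc[:, lo:j]
            beta, *_ = np.linalg.lstsq(Z, Xc[:, j], rcond=None)
            T[j, lo:j] = -beta
            d[j] = (Xc[:, j] - Z @ beta).var()
        return T.T @ np.diag(1.0 / d) @ T

    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 20))          # toy data, true precision = identity
    Omega_hat = banded_precision_estimate(X, k=2)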

  13. Towards Precision Spectroscopy of Baryonic Resonances

    NASA Astrophysics Data System (ADS)

    Döring, Michael; Mai, Maxim; Rönchen, Deborah

    2017-01-01

    Recent progress in baryon spectroscopy is reviewed. In a common effort, various groups have analyzed a set of new high-precision polarization observables from ELSA. The Jülich-Bonn group has finalized the analysis of pion-induced meson-baryon production, the photoproduction of pions and eta mesons, and (almost) the KΛ final state. As the data become more precise, statistical aspects in the analysis of excited baryons become increasingly relevant, and several advances in this direction are proposed.

  14. Towards precision spectroscopy of baryonic resonances

    DOE PAGES

    Doring, Michael; Mai, Maxim; Ronchen, Deborah

    2017-01-26

    Recent progress in baryon spectroscopy is reviewed. In a common effort, various groups have analyzed a set of new high-precision polarization observables from ELSA. The Julich-Bonn group has finalized the analysis of pion-induced meson-baryon production, the photoproduction of pions and eta mesons, and (almost) the KΛ final state. Lastly, as the data become more precise, statistical aspects in the analysis of excited baryons become increasingly relevant, and several advances in this direction are proposed.

  15. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    PubMed Central

    Sun, Ting; Xing, Fei; You, Zheng

    2013-01-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Up to now, research in this field has lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without complicated theoretical derivation. This approach can determine the error propagation relationships of the star tracker and can build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. In summary, the error analysis approach and the calibration method are shown to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527

  16. Refining FIA plot locations using LiDAR point clouds

    Treesearch

    Charlie Schrader-Patton; Greg C. Liknes; Demetrios Gatziolis; Brian M. Wing; Mark D. Nelson; Patrick D. Miles; Josh Bixby; Daniel G. Wendt; Dennis Kepler; Abbey Schaaf

    2015-01-01

    Forest Inventory and Analysis (FIA) plot location coordinate precision is often insufficient for use with high resolution remotely sensed data, thereby limiting the use of these plots for geospatial applications and reducing the validity of models that assume the locations are precise. A practical and efficient method is needed to improve coordinate precision. To...

  17. Precision Medicine: Functional Advancements.

    PubMed

    Caskey, Thomas

    2018-01-29

    Precision medicine was conceptualized on the strength of genomic sequence analysis. High-throughput functional metrics have enhanced sequence interpretation and clinical precision. These technologies include metabolomics, magnetic resonance imaging, and iRhythm (cardiac monitoring), among others. These technologies are discussed and placed in clinical context for the medical specialties of internal medicine, pediatrics, obstetrics, and gynecology. Publications in these fields support the concept of a higher level of precision in identifying disease risk. Precise disease risk identification has the potential to enable intervention with greater specificity, resulting in disease prevention, an important goal of precision medicine.

  18. Method of high precision interval measurement in pulse laser ranging system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong

    2013-09-01

    Laser ranging has the advantages of high measurement precision, fast measurement speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time-interval measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of its time-interval measurement. The principle and structure of the laser ranging system are introduced, and a method of high-precision time-interval measurement for a pulsed laser ranging system is established in this paper. Based on an analysis of the factors that affect range-measurement precision, a pulse rising-edge discriminator is adopted to produce the timing mark for start-stop time discrimination, and a TDC-GP2 high-precision interval measurement system based on a TMS320F2812 DSP is designed to improve measurement precision. Experimental results indicate that the proposed time-interval measurement method can achieve higher range accuracy. Compared with traditional time-interval measurement systems, the method simplifies the system design, reduces the influence of bad weather conditions, and satisfies the requirements of low cost and miniaturization.
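
    As a worked illustration of why the time-interval measurement dominates ranging precision, the conversion from round-trip time to range can be written in a few lines of Python; the 65 ps figure is an assumed timing resolution for a TDC-GP2-class converter, not a value reported in the paper.

    C = 299_792_458.0                       # speed of light, m/s

    def one_way_range(round_trip_time):
        """Pulsed laser ranging: R = c * t / 2."""
        return 0.5 * C * round_trip_time

    t_resolution = 65e-12                   # s, assumed TDC timing resolution
    print(f"range resolution ~ {one_way_range(t_resolution) * 100:.2f} cm")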

  19. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on the measurement precision of conventional headspace analysis (i.e., the single sealing technique). The results (using an ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Influence of speckle image reconstruction on photometric precision for large solar telescopes

    NASA Astrophysics Data System (ADS)

    Peck, C. L.; Wöger, F.; Marino, J.

    2017-11-01

    Context. High-resolution observations from large solar telescopes require adaptive optics (AO) systems to overcome image degradation caused by Earth's turbulent atmosphere. AO corrections are, however, only partial. Achieving near-diffraction limited resolution over a large field of view typically requires post-facto image reconstruction techniques to reconstruct the source image. Aims: This study aims to examine the expected photometric precision of amplitude reconstructed solar images calibrated using models for the on-axis speckle transfer functions and input parameters derived from AO control data. We perform a sensitivity analysis of the photometric precision under variations in the model input parameters for high-resolution solar images consistent with four-meter class solar telescopes. Methods: Using simulations of both atmospheric turbulence and partial compensation by an AO system, we computed the speckle transfer function under variations in the input parameters. We then convolved high-resolution numerical simulations of the solar photosphere with the simulated atmospheric transfer function, and subsequently deconvolved them with the model speckle transfer function to obtain a reconstructed image. To compute the resulting photometric precision, we compared the intensity of the original image with the reconstructed image. Results: The analysis demonstrates that high photometric precision can be obtained for speckle amplitude reconstruction using speckle transfer function models combined with AO-derived input parameters. Additionally, it shows that the reconstruction is most sensitive to the input parameter that characterizes the atmospheric distortion, and sub-2% photometric precision is readily obtained when it is well estimated.
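
    A minimal Python sketch of the reconstruction-and-comparison step described above is given below, assuming the model speckle transfer function is available as an array on the same Fourier grid as the image; the regularisation floor and function names are illustrative, not the authors' pipeline.

    import numpy as np

    def amplitude_reconstruct(observed, stf_model, eps=1e-3):
        """Divide the observed image by the model speckle transfer function in
        the Fourier domain (floored at eps to avoid amplifying dead frequencies)."""
        O = np.fft.fft2(observed)
        return np.real(np.fft.ifft2(O / np.maximum(stf_model, eps)))

    def photometric_rms_error(truth, reconstructed):
        """RMS relative intensity difference, used here as a precision proxy."""
        return np.sqrt(np.mean(((reconstructed - truth) / truth) ** 2))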

  1. Concomitant ion effects on isotope ratio measurements with liquid sampling – atmospheric pressure glow discharge ion source Orbitrap mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoegg, Edward D.; Marcus, R. Kenneth; Hager, George J.

    RATIONALE: The field of highly accurate and precise isotope ratio (IR) analysis has been dominated by inductively coupled plasma and thermal ionization mass spectrometers. While these instruments are considered the gold standard for IR analysis, the International Atomic Energy Agency desires a field-deployable instrument capable of accurately and precisely measuring U isotope ratios. METHODS: The proposed system interfaces the liquid sampling – atmospheric pressure glow discharge (LS-APGD) ion source with a high resolution Exactive Orbitrap mass spectrometer. With this experimental setup, certified U isotope standards and unknown samples were analyzed, and the accuracy and precision of the system were then determined. RESULTS: The LS-APGD/Exactive instrument measures a certified reference material of natural U (235U/238U = 0.007258) as 0.007041 with a relative standard deviation of 0.158%, meeting the International Target Values for Uncertainty for the destructive analysis of U. Additionally, when three unknowns were measured and compared with the results from an ICP multi-collector instrument, there was no statistical difference between the two instruments. CONCLUSIONS: The LS-APGD/Orbitrap system, while still in the preliminary stages of development, offers highly accurate and precise IR analysis, suggesting a paradigm shift in the world of IR analysis. Furthermore, the portability of the LS-APGD as an elemental ion source, combined with the low overhead and small size of the Orbitrap, suggests that the instrumentation is capable of being field deployable. With liquid sampling glow discharge-Orbitrap MS, isotope ratio and precision performance improves with rejection of concomitant ion species.

  2. Terrain matching image pre-process and its format transform in autonomous underwater navigation

    NASA Astrophysics Data System (ADS)

    Cao, Xuejun; Zhang, Feizhou; Yang, Dongkai; Yang, Bogang

    2007-06-01

    Underwater passive navigation technology is one of the important development directions in the field of modern navigation. With the advantages of high autonomy, stealth at sea, anti-jamming capability, and high precision, passive navigation fully meets practical navigation requirements and has therefore become a standard navigating method for underwater vehicles, attracting increasing attention from researchers in the navigation field. Underwater passive navigation can provide accurate navigation information, such as position and velocity, alongside a main Inertial Navigation System (INS) over long periods. With the development of micro-electronics technology, AUV navigation is based primarily on INS aided by other methods such as terrain matching navigation, which can provide long-duration navigation capability, correct the errors of the INS, and avoid the need for the AUV to surface periodically. With terrain matching navigation, assisted by digital charts and ocean geographic characteristic sensors, underwater image matching assisted navigation is carried out to obtain higher positioning precision, satisfying the requirements of an underwater, long-duration, high-precision, all-weather navigation system for Autonomous Underwater Vehicles. Terrain-aided navigation (TAN) relies directly on image (map) information to assist the primary navigation system along a path appointed in advance. In TAN, a factor as important as system operation is the precision and practicality of the stored images and of the database that produces the image data; if the data used for feature extraction are unsuitable, navigation precision will be low. Compared with terrain matching assisted navigation, image matching navigation is a high-precision, low-cost assisted navigation approach, and its matching precision directly influences the final precision of the integrated navigation system. Image matching assisted navigation spatially registers two underwater scene images of the same area acquired by two different sensors in order to determine the relative displacement between the two images. In this way, the vehicle's location can be obtained in a reference image with known geographic relations, and the precise location information from image matching is transmitted to the INS to eliminate its location error and greatly enhance the navigation precision of the vehicle. Digital image data analysis and processing for image matching in underwater passive navigation are therefore important. For underwater geographic data analysis, we focus on the acquisition, processing, analysis, expression, and measurement of database information. These analyses form an important part of underwater terrain matching and help characterize the seabed terrain of the navigation area, so that the most favorable seabed terrain district and a dependable navigation algorithm can be selected, improving the precision and reliability of the terrain assisted navigation system. The pre-processing and format transformation of digital images for underwater image matching are expounded in this paper. The terrain information of the navigation area requires further study to provide reliable terrain characteristics and underwater cover data for navigation. Through the choice of sea route, danger-area prediction, and navigation algorithm analysis, TAN can obtain higher location precision and probability, thereby providing technological support for image matching in underwater passive navigation.
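
    The displacement-estimation step at the heart of image matching assisted navigation can be illustrated with a generic zero-mean cross-correlation in Python; this is a minimal sketch of the general technique, not the pre-processing or matching algorithm used in the paper.

    import numpy as np
    from scipy.signal import fftconvolve

    def match_offset(reference, template):
        """Locate `template` inside `reference` by the peak of the zero-mean
        cross-correlation surface; returns the (row, col) of the best match."""
        ref = reference - reference.mean()
        tpl = template - template.mean()
        corr = fftconvolve(ref, tpl[::-1, ::-1], mode="valid")
        return np.unravel_index(np.argmax(corr), corr.shape)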

  3. Parametric geometric model and hydrodynamic shape optimization of a flying-wing structure underwater glider

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-yu; Yu, Jian-cheng; Zhang, Ai-qun; Wang, Ya-xing; Zhao, Wen-tao

    2017-12-01

    Combining high-precision numerical analysis methods with optimization algorithms to systematically explore a design space has become an important topic in modern design methods. During the design process of an underwater glider's flying-wing structure, a surrogate model is introduced to decrease the computation time of the high-precision analysis. By these means, the contradiction between precision and efficiency is resolved effectively. Based on parametric geometry modeling, mesh generation, and computational fluid dynamics analysis, a surrogate model is constructed by adopting design of experiment (DOE) theory to solve the multi-objective design optimization problem of the underwater glider. The procedure for constructing the surrogate model is presented, and the Gaussian kernel function is specifically discussed. The Particle Swarm Optimization (PSO) algorithm is applied to the hydrodynamic design optimization. The hydrodynamic performance of the optimized flying-wing structure underwater glider increases by 9.1%.
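
    To make the surrogate-model idea concrete, here is a minimal Python sketch of a Gaussian (RBF) kernel interpolant fitted to DOE samples; the length scale, jitter, and interface are assumptions for illustration, and the paper's CFD coupling and PSO loop are not reproduced.

    import numpy as np

    def gaussian_kernel_surrogate(X, y, length_scale=1.0):
        """Fit a Gaussian (RBF) kernel interpolant to DOE samples (X, y)
        and return a callable surrogate f(x_new)."""
        def k(a, b):
            d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return np.exp(-0.5 * d2 / length_scale**2)
        K = k(X, X) + 1e-10 * np.eye(len(X))   # jitter for numerical stability
        w = np.linalg.solve(K, y)
        return lambda x_new: k(np.atleast_2d(x_new), X) @ w

    # Usage: evaluate the cheap surrogate in place of the CFD model inside the
    # optimizer's inner loop (here it would be queried by PSO candidates).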

  4. Optimization design about gimbal structure of high-precision autonomous celestial navigation tracking mirror system

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Yang, Xiao-xu; Han, Jun-feng; Wei, Yu; Zhang, Jing; Xie, Mei-lin; Yue, Peng

    2016-01-01

    A high-precision celestial navigation tracking platform adopts a mirror-servo structure to overcome the disadvantages of large volume, large rotational inertia, slow response speed, and so on, improving the stability and tracking accuracy of the platform. Because the optical sensor and the mirror are installed on the middle gimbal, high stiffness and a high resonant frequency are required. Based on finite element modal analysis theory, the dynamic characteristics of the middle gimbal were studied, and ANSYS was used for the finite element dynamic simulation analysis. According to the computed results, the weak links of the structure were identified, improvement suggestions were put forward, and the structure was re-analyzed. The lowest resonant frequency of the optimized middle gimbal avoids the bandwidth of the platform servo mechanism and is much higher than the disturbance frequency of the carrier aircraft, reducing mechanical resonance of the framework. This provides a theoretical basis for the overall structural optimization design of the high-precision autonomous celestial navigation tracking mirror system.

  5. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value.

    PubMed

    Chen, Yixi; Guzauskas, Gregory F; Gu, Chengming; Wang, Bruce C M; Furnback, Wesley E; Xie, Guotong; Dong, Peng; Garrison, Louis P

    2016-11-02

    The "big data" era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient-level HEOR analyses. We propose the concept of "precision HEOR", which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient.

  6. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value

    PubMed Central

    Chen, Yixi; Guzauskas, Gregory F.; Gu, Chengming; Wang, Bruce C. M.; Furnback, Wesley E.; Xie, Guotong; Dong, Peng; Garrison, Louis P.

    2016-01-01

    The “big data” era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient–level HEOR analyses. We propose the concept of “precision HEOR”, which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient. PMID:27827859

  7. Omics Profiling in Precision Oncology*

    PubMed Central

    Yu, Kun-Hsing; Snyder, Michael

    2016-01-01

    Cancer causes significant morbidity and mortality worldwide, and is the area most targeted in precision medicine. Recent development of high-throughput methods enables detailed omics analysis of the molecular mechanisms underpinning tumor biology. These studies have identified clinically actionable mutations, gene and protein expression patterns associated with prognosis, and provided further insights into the molecular mechanisms indicative of cancer biology and new therapeutics strategies such as immunotherapy. In this review, we summarize the techniques used for tumor omics analysis, recapitulate the key findings in cancer omics studies, and point to areas requiring further research on precision oncology. PMID:27099341

  8. Precision of coherence analysis to detect cerebral autoregulation by near-infrared spectroscopy in preterm infants

    NASA Astrophysics Data System (ADS)

    Hahn, Gitte Holst; Christensen, Karl Bang; Leung, Terence S.; Greisen, Gorm

    2010-05-01

    Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with high signal-to-noise ratio, we hypothesized that coherence is more precisely determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants. We also examined total monitoring time required to discriminate among infants with a simulation study. We studied 22 preterm infants (GA<30) yielding 215 10-min measurements. Surprisingly, adjusting for variabilityABP within the power spectrum did not improve the precision. However, adjusting for the variabilityABP among repeated measurements (i.e., weighting measurements with high variabilityABP in favor of those with low) improved the precision. The evidence of drift in individual infants was weak. Minimum monitoring time needed to discriminate among infants was 1.3-3.7 h. Coherence analysis in low frequencies (0.04-0.1 Hz) had higher precision and statistically more power than in very low frequencies (0.003-0.04 Hz). In conclusion, a reliable detection of cerebral autoregulation takes hours and the precision is improved by adjusting for variabilityABP between repeated measurements.
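
    A generic band-averaged coherence calculation of the kind described above can be sketched in a few lines of Python with SciPy's Welch-based coherence; the segment length and band edges below are assumptions for illustration, not the authors' exact settings.

    import numpy as np
    from scipy.signal import coherence

    def band_coherence(abp, nirs, fs, band=(0.04, 0.1)):
        """Mean magnitude-squared coherence between ABP and the NIRS signal
        within a frequency band (default: the low-frequency band above)."""
        nperseg = int(300 * fs)                # ~5-minute segments (assumed)
        f, Cxy = coherence(abp, nirs, fs=fs, nperseg=nperseg)
        mask = (f >= band[0]) & (f <= band[1])
        return Cxy[mask].mean()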

  9. Green and Fast Laser Fusion Technique for Bulk Silicate Rock Analysis by Laser Ablation-Inductively Coupled Plasma Mass Spectrometry.

    PubMed

    Zhang, Chenxi; Hu, Zhaochu; Zhang, Wen; Liu, Yongsheng; Zong, Keqing; Li, Ming; Chen, Haihong; Hu, Shenghong

    2016-10-18

    Sample preparation of whole-rock powders is the major limitation for their accurate and precise elemental analysis by laser ablation inductively-coupled plasma mass spectrometry (ICPMS). In this study, a green, efficient, and simplified fusion technique using a high energy infrared laser was developed for major and trace elemental analysis. Fusion takes only tens of milliseconds for each sample. Compared to the pressed pellet sample preparation, the analytical precision of the developed laser fusion technique is higher by an order of magnitude for most elements in granodiorite GSP-2. Analytical results obtained for five USGS reference materials (ranging from mafic to intermediate to felsic) using the laser fusion technique generally agree with recommended values with discrepancies of less than 10% for most elements. However, high losses (20-70%) of highly volatile elements (Zn and Pb) and the transition metal Cu are observed. The achieved precision is within 5% for major elements and within 15% for most trace elements. Direct laser fusion of rock powders is a green and notably simple method to obtain homogeneous samples, which will significantly accelerate the application of laser ablation ICPMS for whole-rock sample analysis.

  10. Analysis of polonium-210 in food products and bioassay samples by isotope-dilution alpha spectrometry.

    PubMed

    Lin, Zhichao; Wu, Zhongyu

    2009-05-01

    A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10%, at the 95% confidence level.

  11. Direct analysis of δ13C and concentration of dissolved organic carbon (DOC) in environmental samples by TOC-IRMS

    NASA Astrophysics Data System (ADS)

    Kirkels, Frédérique; Cerli, Chiara; Federherr, Eugen; Kalbitz, Karsten

    2014-05-01

    Dissolved organic carbon (DOC) plays an important role in carbon cycling in terrestrial and aquatic systems. Stable isotope analysis (δ13C) of DOC could provide valuable insights into its origin, fluxes, and environmental fate. Precise and routine analysis of δ13C and DOC concentration is therefore highly desirable. A promising new system has been developed for this purpose, linking a high-temperature combustion TOC analyzer through an interface to a continuous flow isotope ratio mass spectrometer (Elementar group, Hanau, Germany). This TOC-IRMS system enables simultaneous stable isotope (bulk δ13C) and concentration analysis of DOC, with high oxidation efficiency by high-temperature combustion for complex mixtures such as natural DOC. To give δ13C analysis by TOC-IRMS the necessary impulse for broad-scale application, we present a detailed evaluation of its analytical performance under realistic and challenging conditions, including low DOC concentrations and environmental samples. High precision (standard deviation, SD, predominantly < 0.15 permil) and accuracy (R2 = 0.9997 for the comparison of TOC-IRMS with conventional EA-IRMS) were achieved by TOC-IRMS for a broad diversity of DOC solutions. This precision is comparable to, or even slightly better than, that typically reported for EA-IRMS systems, and improves on previous techniques for δ13C analysis of DOC. Simultaneously, very good precision was obtained for DOC concentration measurements. Assessment of natural-abundance and slightly 13C-enriched DOC, over a wide range of concentrations (0.2-150 mgC/L) and injection volumes (0.05-3 ml), demonstrated good analytical performance with negligible memory effects, no concentration/volume effects, and a wide linearity. Low DOC concentrations (< 2 mgC/L) were correctly analyzed without any pre-concentration. Moreover, TOC-IRMS was successfully applied to analyze DOC from diverse terrestrial, freshwater, and marine environments (SD < 0.23 permil). In summary, TOC-IRMS performs fast and reliable analysis of DOC concentration and δ13C in aqueous samples, without any pre-concentration or freeze-drying. Flexible usage is highlighted by automated, online analysis, a variable injection volume, high throughput, and no extensive maintenance. Sample analysis is simple, using small aliquots and minimal sample preparation. Further investigations should focus on complex, saline matrices and very low DOC concentrations, to achieve a potential lower limit of 0.2 mgC/L. High-resolution, routine δ13C analysis of DOC by TOC-IRMS offers opportunities for wide-scale application in terrestrial, freshwater, and marine research to elucidate the role of DOC in biogeochemical processes and ecosystem functioning.
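
    For readers unfamiliar with the delta notation used throughout, the conversion from a measured 13C/12C ratio to δ13C (permil versus VPDB) is a one-line calculation, sketched in Python below; the VPDB ratio is a commonly quoted value included only for illustration, and the example ratio is invented.

    R_VPDB = 0.0111802   # commonly quoted 13C/12C of the VPDB standard (assumed here)

    def delta_13c(r_sample, r_reference=R_VPDB):
        """delta-13C in permil: (R_sample / R_reference - 1) * 1000."""
        return (r_sample / r_reference - 1.0) * 1000.0

    print(f"{delta_13c(0.0108900):.1f} permil")   # illustrative ratio only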

  12. High-Precision Al-Mg Isotopic Systematics in USNM 3898 — The Benchmark "ALL" for Initial 87Sr/86Sr in the Earliest Solar System

    NASA Astrophysics Data System (ADS)

    MacPherson, G. J.; Defouilloy, C.; Kita, N. T.

    2017-07-01

    High-precision SIMS analysis of Al-Mg isotopes in USNM 3898, the CAI on which ALL is based, yields 26Al/27Al = (4.88 ± 0.14) × 10-5 in its interior vs. 26Al/27Al = (4.56 ± 0.11) × 10-5 in its outer mantle, suggesting later partial re-melting.

  13. Number-Density Measurements of CO2 in Real Time with an Optical Frequency Comb for High Accuracy and Precision

    NASA Astrophysics Data System (ADS)

    Scholten, Sarah K.; Perrella, Christopher; Anstie, James D.; White, Richard T.; Al-Ashwal, Waddah; Hébert, Nicolas Bourbeau; Genest, Jérôme; Luiten, Andre N.

    2018-05-01

    Real-time and accurate measurements of gas properties are highly desirable for numerous real-world applications. Here, we use an optical-frequency comb to demonstrate absolute number-density and temperature measurements of a sample gas with state-of-the-art precision and accuracy. The technique is demonstrated by measuring the number density of 12C16O2 with an accuracy of better than 1% and a precision of 0.04% in a measurement and analysis cycle of less than 1 s. This technique is transferable to numerous molecular species, thus offering an avenue for near-universal gas concentration measurements.
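
    The comb-specific fitting procedure is beyond the scope of a short sketch, but the basic absorption relation linking measured transmission to number density is the Beer-Lambert law, illustrated below in Python; the symbols and example values are assumptions, not values from the paper.

    import numpy as np

    def number_density(transmission, cross_section_cm2, path_length_cm):
        """Beer-Lambert: n = -ln(I/I0) / (sigma * L), in molecules per cm^3."""
        return -np.log(transmission) / (cross_section_cm2 * path_length_cm)

    # Illustrative numbers only:
    print(f"{number_density(0.975, 5e-21, 100.0):.3e} cm^-3")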

  14. Three-axis lever actuator with flexure hinges for an optical disk system

    NASA Astrophysics Data System (ADS)

    Han, Chang-Soo; Kim, Soo-Hyun

    2002-10-01

    A three-axis lever actuator with a flexure hinge has been designed and fabricated. The actuator is driven by electromagnetic force based on a coil-magnet system and can be used as a high-precision actuator, in particular as a pickup head actuator in optical disk systems. High precision and low sensitivity to external vibration are the major advantages of this lever actuator. An analytical model was developed and compared with the finite element method. The dynamic characteristics of the three-axis lever actuator were measured, and the results are in very close agreement with those predicted by the model and by finite element analysis.

  15. HiCTMap: Detection and analysis of chromosome territory structure and position by high-throughput imaging.

    PubMed

    Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom

    2018-06-01

    The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.

  16. The structure of the proton in the LHC precision era

    NASA Astrophysics Data System (ADS)

    Gao, Jun; Harland-Lang, Lucian; Rojo, Juan

    2018-05-01

    We review recent progress in the determination of the parton distribution functions (PDFs) of the proton, with emphasis on the applications for precision phenomenology at the Large Hadron Collider (LHC). First of all, we introduce the general theoretical framework underlying the global QCD analysis of the quark and gluon internal structure of protons. We then present a detailed overview of the hard-scattering measurements, and the corresponding theory predictions, that are used in state-of-the-art PDF fits. We emphasize here the role that higher-order QCD and electroweak corrections play in the description of recent high-precision collider data. We present the methodology used to extract PDFs in global analyses, including the PDF parametrization strategy and the definition and propagation of PDF uncertainties. Then we review and compare the most recent releases from the various PDF fitting collaborations, highlighting their differences and similarities. We discuss the role that QED corrections and photon-initiated contributions play in modern PDF analysis. We provide representative examples of the implications of PDF fits for high-precision LHC phenomenological applications, such as Higgs coupling measurements and searches for high-mass New Physics resonances. We conclude this report by discussing some selected topics relevant for the future of PDF determinations, including the treatment of theoretical uncertainties, the connection with lattice QCD calculations, and the role of PDFs at future high-energy colliders beyond the LHC.

  17. Critical Steps in Data Analysis for Precision Casimir Force Measurements with Semiconducting Films

    NASA Astrophysics Data System (ADS)

    Banishev, A. A.; Chang, Chia-Cheng; Mohideen, U.

    2011-06-01

    Some experimental procedures and corresponding results of the precision measurement of the Casimir force between low doped Indium Tin Oxide (ITO) film and gold sphere are described. Measurements were performed using an Atomic Force Microscope in high vacuum. It is shown that the magnitude of the Casimir force decreases after prolonged UV treatment of the ITO film. Some critical data analysis steps such as the correction for the mechanical drift of the sphere-plate system and photodiodes are discussed.

  18. Critical Steps in Data Analysis for Precision Casimir Force Measurements with Semiconducting Films

    NASA Astrophysics Data System (ADS)

    Banishev, A. A.; Chang, Chia-Cheng; Mohideen, U.

    Some experimental procedures and corresponding results of the precision measurement of the Casimir force between low doped Indium Tin Oxide (ITO) film and gold sphere are described. Measurements were performed using an Atomic Force Microscope in high vacuum. It is shown that the magnitude of the Casimir force decreases after prolonged UV treatment of the ITO film. Some critical data analysis steps such as the correction for the mechanical drift of the sphere-plate system and photodiodes are discussed.

  19. Methods for Multiplex Template Sampling in Digital PCR Assays

    PubMed Central

    Petriv, Oleh I.; Heyries, Kevin A.; VanInsberghe, Michael; Walker, David; Hansen, Carl L.

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision. PMID:24854517

  20. Methods for multiplex template sampling in digital PCR assays.

    PubMed

    Petriv, Oleh I; Heyries, Kevin A; VanInsberghe, Michael; Walker, David; Hansen, Carl L

    2014-01-01

    The efficient use of digital PCR (dPCR) for precision copy number analysis requires high concentrations of target molecules that may be difficult or impossible to obtain from clinical samples. To solve this problem we present a strategy, called Multiplex Template Sampling (MTS), that effectively increases template concentrations by detecting multiple regions of fragmented target molecules. Three alternative assay approaches are presented for implementing MTS analysis of chromosome 21, providing a 10-fold concentration enhancement while preserving assay precision.
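
    The concentration enhancement can be understood through the standard Poisson arithmetic of digital PCR, sketched below in Python; treating each detected region as an independently countable event is an illustrative simplification, not the paper's exact estimator, and the partition counts are invented.

    import numpy as np

    def copies_per_partition(n_partitions, n_negative):
        """Standard dPCR Poisson estimate: lambda = -ln(fraction negative)."""
        return -np.log(n_negative / n_partitions)

    # Detecting k independent regions of a fragmented target yields roughly
    # k times as many countable events per genome (illustrative view of MTS):
    lam_single = copies_per_partition(n_partitions=20000, n_negative=18000)
    print(lam_single, 10 * lam_single)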

  1. What Friends Are For: Collaborative Intelligence Analysis and Search

    DTIC Science & Technology

    2014-06-01

    SUBJECT TERMS: Intelligence Community, information retrieval, recommender systems, search engines, social networks, user profiling, Lucene. Report excerpt: "...improvements over existing search systems. The improvements are shown to be robust to high levels of human error and low similarity between users..."

  2. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    NASA Technical Reports Server (NTRS)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, is described. Preliminary 2D experiments that are motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive, stock hardware and low-precision welding are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place, while an external low precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low precision tools), to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work, to extend the IPJR paradigm to building in 3D structures at micron precision are also summarized.

  3. Fully Nonlinear Modeling and Analysis of Precision Membranes

    NASA Technical Reports Server (NTRS)

    Pai, P. Frank; Young, Leyland G.

    2003-01-01

    High precision membranes are used in many current space applications. This paper presents a fully nonlinear membrane theory with forward and inverse analyses of high precision membrane structures. The fully nonlinear membrane theory is derived from Jaumann strains and stresses, exact coordinate transformations, the concept of local relative displacements, and orthogonal virtual rotations. In this theory, energy and Newtonian formulations are fully correlated, and every structural term can be interpreted in terms of vectors. Fully nonlinear ordinary differential equations (ODEs) governing the large static deformations of known axisymmetric membranes under known axisymmetric loading (i.e., forward problems) are presented as first-order ODEs, and a method for obtaining numerically exact solutions using the multiple shooting procedure is shown. A method for obtaining the undeformed geometry of any axisymmetric membrane with a known inflated geometry and a known internal pressure (i.e., inverse problems) is also derived. Numerical results from forward analysis are verified using results in the literature, and results from inverse analysis are verified using known exact solutions and solutions from the forward analysis. Results show that the membrane theory and the proposed numerical methods for solving nonlinear forward and inverse membrane problems are accurate.
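
    The paper applies a multiple-shooting procedure to its first-order ODEs; as a compact illustration of the shooting idea (single shooting on a toy two-point boundary-value problem, not the membrane equations themselves), the following Python sketch recovers the unknown initial slope.

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import brentq

    # Toy BVP: y'' = -y, y(0) = 0, y(pi/2) = 1  (exact solution y = sin x).
    def rhs(x, z):                       # z = [y, y']
        return [z[1], -z[0]]

    def boundary_miss(slope):
        """Integrate from x=0 with trial slope and return the far-end mismatch."""
        sol = solve_ivp(rhs, (0.0, np.pi / 2), [0.0, slope], rtol=1e-10, atol=1e-12)
        return sol.y[0, -1] - 1.0

    slope0 = brentq(boundary_miss, 0.1, 3.0)    # exact answer is y'(0) = 1
    print(f"recovered initial slope: {slope0:.6f}")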

  4. Validating data analysis of broadband laser ranging

    NASA Astrophysics Data System (ADS)

    Rhodes, M.; Catenacci, J.; Howard, M.; La Lone, B.; Kostinski, N.; Perry, D.; Bennett, C.; Patterson, J.

    2018-03-01

    Broadband laser ranging combines spectral interferometry and a dispersive Fourier transform to achieve high-repetition-rate measurements of the position of a moving surface. Telecommunications fiber is a convenient tool for generating the large linear dispersions required for a dispersive Fourier transform, but standard fiber also has higher-order dispersion that distorts the Fourier transform. Imperfections in the dispersive Fourier transform significantly complicate the ranging signal and must be dealt with to make high-precision measurements. We describe in detail an analysis process for interpreting ranging data when standard telecommunications fiber is used to perform an imperfect dispersive Fourier transform. This analysis process is experimentally validated over a 27-cm scan of static positions, showing an accuracy of 50 μm and a root-mean-square precision of 4.7 μm.

  5. AMMI adjustment for statistical analysis of an international wheat yield trial.

    PubMed

    Crossa, J; Fox, P N; Pfeiffer, W H; Rajaram, S; Gauch, H G

    1991-01-01

    Multilocation trials are important for the CIMMYT Bread Wheat Program in producing high-yielding, adapted lines for a wide range of environments. This study investigated procedures for improving predictive success of a yield trial, grouping environments and genotypes into homogeneous subsets, and determining the yield stability of 18 CIMMYT bread wheats evaluated at 25 locations. Additive Main effects and Multiplicative Interaction (AMMI) analysis gave more precise estimates of genotypic yields within locations than means across replicates. This precision facilitated formation by cluster analysis of more cohesive groups of genotypes and locations for biological interpretation of interactions than occurred with unadjusted means. Locations were clustered into two subsets for which genotypes with positive interactions manifested in high, stable yields were identified. The analyses highlighted superior selections with both broad and specific adaptation.
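
    For readers unfamiliar with the AMMI adjustment mentioned above, the sketch below shows the usual construction: additive genotype and environment main effects plus the leading multiplicative terms from a singular value decomposition of the interaction residuals. The 18 x 25 table is random placeholder data, not the CIMMYT trial.

```python
import numpy as np

def ammi_adjusted_means(Y, n_terms=2):
    """AMMI estimate of a genotype-by-environment table of means Y:
    additive main effects plus the leading multiplicative (SVD) terms
    of the interaction residuals."""
    grand = Y.mean()
    g_eff = Y.mean(axis=1) - grand            # genotype main effects
    e_eff = Y.mean(axis=0) - grand            # environment main effects
    additive = grand + g_eff[:, None] + e_eff[None, :]
    U, s, Vt = np.linalg.svd(Y - additive, full_matrices=False)
    interaction = (U[:, :n_terms] * s[:n_terms]) @ Vt[:n_terms, :]
    return additive + interaction

# Placeholder 18-genotype x 25-location table of yield means (t/ha).
rng = np.random.default_rng(0)
Y = 3.0 + rng.normal(scale=0.5, size=(18, 25))
print(ammi_adjusted_means(Y).shape)   # (18, 25) table of AMMI-adjusted yields
```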

  6. Tunable laser techniques for improving the precision of observational astronomy

    NASA Astrophysics Data System (ADS)

    Cramer, Claire E.; Brown, Steven W.; Lykke, Keith R.; Woodward, John T.; Bailey, Stephen; Schlegel, David J.; Bolton, Adam S.; Brownstein, Joel; Doherty, Peter E.; Stubbs, Christopher W.; Vaz, Amali; Szentgyorgyi, Andrew

    2012-09-01

    Improving the precision of observational astronomy requires not only new telescopes and instrumentation, but also advances in observing protocols, calibrations and data analysis. The Laser Applications Group at the National Institute of Standards and Technology in Gaithersburg, Maryland has been applying advances in detector metrology and tunable laser calibrations to problems in astronomy since 2007. Using similar measurement techniques, we have addressed a number of seemingly disparate issues: precision flux calibration for broad-band imaging, precision wavelength calibration for high-resolution spectroscopy, and precision PSF mapping for fiber spectrographs of any resolution. In each case, we rely on robust, commercially-available laboratory technology that is readily adapted to use at an observatory. In this paper, we give an overview of these techniques.

  7. Measurement of whole tire profile

    NASA Astrophysics Data System (ADS)

    Yang, Yongyue; Jiao, Wenguang

    2010-08-01

    In this paper, a precision measuring device is developed for obtaining the characteristic curve of a tire profile and its geometric parameters. It consists of a laser displacement measurement unit, a closed-loop precision two-dimensional coordinate table, a step motor control system, and a fast data acquisition and analysis system. Based on laser trigonometry, a data map of the tire profile and the coordinate values of all points can be obtained through the corresponding data transformation. The device has a compact structure, convenient control, a simple hardware circuit design, and high measurement precision. Experimental results indicate that the measurement precision meets the customer accuracy requirement of ±0.02 mm.

  8. Fundamental differences between optimization code test problems in engineering applications

    NASA Technical Reports Server (NTRS)

    Eason, E. D.

    1984-01-01

    The purpose here is to suggest that there is at least one fundamental difference between the problems used for testing optimization codes and the problems that engineers often need to solve; in particular, the level of precision that can be practically achieved in the numerical evaluation of the objective function, derivatives, and constraints. This difference affects the performance of optimization codes, as illustrated by two examples. Two classes of optimization problem were defined. Class One functions and constraints can be evaluated to a high precision that depends primarily on the word length of the computer. Class Two functions and/or constraints can only be evaluated to a moderate or a low level of precision for economic or modeling reasons, regardless of the computer word length. Optimization codes have not been adequately tested on Class Two problems. There are very few Class Two test problems in the literature, while there are literally hundreds of Class One test problems. The relative performance of two codes may be markedly different for Class One and Class Two problems. Less sophisticated direct search type codes may be less likely to be confused or to waste many function evaluations on Class Two problems. The analysis accuracy and minimization performance are related in a complex way that probably varies from code to code. On a problem where the analysis precision was varied over a range, the simple Hooke and Jeeves code was more efficient at low precision while the Powell code was more efficient at high precision.
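
    The contrast between Class One and Class Two problems can be reproduced with a small numerical experiment. The sketch below is a loose illustration rather than the paper's study: it compares two direct-search methods with a gradient-based method on a quadratic whose evaluations are deliberately corrupted by noise, with Nelder-Mead and Powell standing in for the Hooke and Jeeves code, which SciPy does not provide.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def make_objective(noise):
    """A smooth quadratic that can only be evaluated to a limited
    precision set by `noise` (a 'Class Two' objective when noise > 0)."""
    def f(x):
        return float(np.sum((x - 1.0) ** 2) + noise * rng.standard_normal())
    return f

x0 = np.zeros(4)
for noise in (0.0, 1e-3):                       # Class One vs Class Two evaluation precision
    for method in ("Nelder-Mead", "Powell", "BFGS"):
        res = minimize(make_objective(noise), x0, method=method)
        err = np.linalg.norm(res.x - 1.0)
        print(f"noise={noise:g}  {method:12s}  error={err:.1e}  nfev={res.nfev}")
```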

  9. Multinode acoustic focusing for parallel flow cytometry

    PubMed Central

    Piyasena, Menake E.; Suthanthiraraj, Pearlson P. Austin; Applegate, Robert W.; Goumas, Andrew M.; Woods, Travis A.; López, Gabriel P.; Graves, Steven W.

    2012-01-01

    Flow cytometry can simultaneously measure and analyze multiple properties of single cells or particles with high sensitivity and precision. Yet, conventional flow cytometers have fundamental limitations with regard to analyzing particles larger than about 70 microns, analyzing at flow rates greater than a few hundred microliters per minute, and providing analysis rates greater than 50,000 per second. To overcome these limits, we have developed multi-node acoustic focusing flow cells that can position particles (as small as a red blood cell and as large as 107 microns in diameter) into as many as 37 parallel flow streams. We demonstrate the potential of such flow cells for the development of high throughput, parallel flow cytometers by precision focusing of flow cytometry alignment microspheres and red blood cells, and by the analysis of a CD4+ cellular immunophenotyping assay. This approach will have significant impact on the creation of high throughput flow cytometers for rare cell detection applications (e.g., circulating tumor cells), applications requiring large particle analysis, and high volume flow cytometry. PMID:22239072

  10. A Police and Insurance Joint Management System Based on High Precision BDS/GPS Positioning

    PubMed Central

    Zuo, Wenwei; Guo, Chi; Liu, Jingnan; Peng, Xuan; Yang, Min

    2018-01-01

    Car ownership in China reached 194 million vehicles at the end of 2016, and the traffic congestion index (TCI) exceeds 2.0 during rush hour in some cities. Inefficient processing of minor traffic accidents is considered one of the leading causes of road traffic jams, and the process after an accident is quite troublesome, mainly because it is almost always impossible to obtain a complete chain of evidence when the accident happens. Accordingly, a police and insurance joint management system based on high precision BeiDou Navigation Satellite System (BDS)/Global Positioning System (GPS) positioning is developed to process traffic accidents. First, an intelligent vehicle rearview mirror terminal is developed. The terminal uses a commonly available consumer electronic device with single-frequency navigation; based on the high precision BDS/GPS positioning algorithm, its accuracy can reach the sub-meter level in urban areas. More specifically, a kernel driver is built to realize the high precision positioning algorithm in the Android HAL layer, so third-party application developers can call the general location Application Programming Interface (API) of the original standard Global Navigation Satellite System (GNSS) to obtain high precision positioning results. The terminal can therefore provide lane-level positioning service for car users. Next, a remote traffic accident processing platform is built to provide big data analysis and management. Through big data analysis of the information collected by the BDS high precision intelligent sensing service, vehicle behaviors can be obtained. The platform can also automatically match and screen the data uploaded after an accident to achieve an accurate reproduction of the scene, helping traffic police and insurance personnel complete remote responsibility identification and accident surveys. Third, a rapid processing flow is established to meet the requirement of handling traffic accidents quickly: the traffic police can remotely identify accident responsibility and the insurance personnel can remotely survey an accident. The police and insurance joint management system has been deployed in Wuhan, Central China’s Hubei Province, and Wuxi, Eastern China’s Jiangsu Province. In summary, a system is developed to obtain and analyze multisource data, including precise positioning and visual information, and a solution is proposed for the efficient processing of traffic accidents. PMID:29320406

  11. A Police and Insurance Joint Management System Based on High Precision BDS/GPS Positioning.

    PubMed

    Zuo, Wenwei; Guo, Chi; Liu, Jingnan; Peng, Xuan; Yang, Min

    2018-01-10

    Car ownership in China reached 194 million vehicles at the end of 2016, and the traffic congestion index (TCI) exceeds 2.0 during rush hour in some cities. Inefficient processing of minor traffic accidents is considered one of the leading causes of road traffic jams, and the process after an accident is quite troublesome, mainly because it is almost always impossible to obtain a complete chain of evidence when the accident happens. Accordingly, a police and insurance joint management system based on high precision BeiDou Navigation Satellite System (BDS)/Global Positioning System (GPS) positioning is developed to process traffic accidents. First, an intelligent vehicle rearview mirror terminal is developed. The terminal uses a commonly available consumer electronic device with single-frequency navigation; based on the high precision BDS/GPS positioning algorithm, its accuracy can reach the sub-meter level in urban areas. More specifically, a kernel driver is built to realize the high precision positioning algorithm in the Android HAL layer, so third-party application developers can call the general location Application Programming Interface (API) of the original standard Global Navigation Satellite System (GNSS) to obtain high precision positioning results. The terminal can therefore provide lane-level positioning service for car users. Next, a remote traffic accident processing platform is built to provide big data analysis and management. Through big data analysis of the information collected by the BDS high precision intelligent sensing service, vehicle behaviors can be obtained. The platform can also automatically match and screen the data uploaded after an accident to achieve an accurate reproduction of the scene, helping traffic police and insurance personnel complete remote responsibility identification and accident surveys. Third, a rapid processing flow is established to meet the requirement of handling traffic accidents quickly: the traffic police can remotely identify accident responsibility and the insurance personnel can remotely survey an accident. The police and insurance joint management system has been deployed in Wuhan, Central China's Hubei Province, and Wuxi, Eastern China's Jiangsu Province. In summary, a system is developed to obtain and analyze multisource data, including precise positioning and visual information, and a solution is proposed for the efficient processing of traffic accidents.

  12. Composite adaptive control of belt polishing force for aero-engine blade

    NASA Astrophysics Data System (ADS)

    Zhsao, Pengbing; Shi, Yaoyao

    2013-09-01

    The existing methods for blade polishing mainly focus on robot polishing and manual grinding. Due to the difficulty of high-precision control of the polishing force, the blade surface precision is very low in robot polishing; in particular, the quality of the inlet and exhaust edges cannot satisfy the processing requirements. Manual grinding has low efficiency, high labor intensity and unstable processing quality; moreover, the polished surface is vulnerable to burn, and the surface precision and integrity are difficult to ensure. In order to further improve the profile accuracy and surface quality, a pneumatic flexible polishing force-exerting mechanism is designed and a dual-mode switching composite adaptive control (DSCAC) strategy is proposed, which combines Bang-Bang control and model reference adaptive control based on a fuzzy neural network (MRACFNN). Through the mode decision-making mechanism, Bang-Bang control is used to track the command signal quickly when the actual polishing force is far from the target value, and MRACFNN is used in smaller error ranges to improve the system robustness and control precision. Based on the mathematical model of the force-exerting mechanism, simulation analysis of DSCAC is carried out. Simulation results show that the output polishing force tracks the given signal well. Finally, blade polishing experiments are carried out on the designed polishing equipment. Experimental results show that DSCAC can effectively mitigate the influence of gas compressibility, valve dead-time effects, nonlinear valve flow, cylinder friction, measurement noise and other interference on the control precision of the polishing force, offering higher control precision, stronger robustness and better anti-interference ability than MRACFNN alone. The proposed research achieves high-precision control of the polishing force, effectively improves the blade machining precision and surface consistency, and significantly reduces the surface roughness.
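
    A minimal sketch of the dual-mode switching idea is given below, under several simplifying assumptions: the pneumatic force loop is reduced to a first-order lag, and the MRACFNN branch is replaced by a crude adaptive proportional-integral law, since only the Bang-Bang/fine-mode switching logic is being illustrated. The gains, threshold, and plant constants are invented.

```python
def simulate(force_ref=20.0, dt=0.001, steps=5000, threshold=2.0):
    """Dual-mode force regulation: Bang-Bang far from the set point,
    a crude adaptive P-I law (stand-in for MRACFNN) near it."""
    force, integ, kp = 0.0, 0.0, 0.5
    for _ in range(steps):
        err = force_ref - force
        if abs(err) > threshold:              # Bang-Bang mode: drive hard toward the target
            u = 40.0 if err > 0 else 0.0
        else:                                 # fine-regulation mode near the set point
            integ += err * dt
            kp += 0.05 * err * err * dt       # toy parameter adaptation
            u = force_ref + kp * err + 5.0 * integ
        force += dt * (u - force) / 0.05      # first-order pneumatic actuator model (tau = 50 ms)
    return force

print(simulate())   # settles close to the 20 N set point
```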

  13. High-precision measurement of chlorine stable isotope ratios

    USGS Publications Warehouse

    Long, A.; Eastoe, C.J.; Kaufmann, R.S.; Martin, J.G.; Wirt, L.; Finley, J.B.

    1993-01-01

    We present an analysis procedure that allows stable isotopes of chlorine to be analyzed with precision sufficient for geological and hydrological studies. The total analytical precision is ±0.09‰, and the presently known range of δ37Cl in the surface and near-surface environment is 3.5‰. As Cl- is essentially nonreactive in natural aquatic environments, it is a conservative tracer and its δ37Cl is also conservative. Thus, the δ37Cl parameter is valuable for quantitative evaluation of the mixing of different sources of chloride in brines and aquifers.

  14. Laser technology for high precision satellite tracking

    NASA Technical Reports Server (NTRS)

    Plotkin, H. H.

    1974-01-01

    Fixed and mobile laser ranging stations have been developed to track satellites equipped with retro-reflector arrays. These have operated consistently at data rates of once per second with range precision better than 50 cm, using Q-switched ruby lasers with pulse durations of 20 to 40 nanoseconds. Improvements are being incorporated to improve the precision to 10 cm, and to permit ranging to more distant satellites. These include improved reflector array designs, processing and analysis of the received reflection pulses, and use of sub-nanosecond pulse duration lasers.

  15. Multiple-objective optimization in precision laser cutting of different thermoplastics

    NASA Astrophysics Data System (ADS)

    Tamrin, K. F.; Nukman, Y.; Choudhury, I. A.; Shirley, S.

    2015-04-01

    Thermoplastics are increasingly used in the biomedical, automotive and electronics industries due to their excellent physical and chemical properties. Because laser cutting is a localized, non-contact process, it can produce precise cuts with a small heat-affected zone (HAZ). Precision laser cutting of various materials is important in high-volume manufacturing processes to minimize operational cost, reduce errors and improve product quality. This study uses grey relational analysis to determine a single optimized set of cutting parameters for three different thermoplastics. The optimized set of processing parameters is determined from the highest relational grade and was found at low laser power (200 W), high cutting speed (0.4 m/min) and low compressed air pressure (2.5 bar), which matches the objective set in the present study. Analysis of variance (ANOVA) is then carried out to ascertain the relative influence of the process parameters on the cutting characteristics. It was found that the laser power has the dominant effect on the HAZ for all thermoplastics.
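
    The grey relational analysis step described above reduces to a short calculation: normalize each quality characteristic, form grey relational coefficients with a distinguishing coefficient (commonly 0.5), and average them into a grade per experimental run. The sketch below follows that standard recipe; the response values are made up, not the measured HAZ data.

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5):
    """Grey relational grade per experimental run.
    `responses` is a runs x quality-characteristics matrix."""
    X = np.asarray(responses, dtype=float)
    norm = np.empty_like(X)
    for j, larger in enumerate(larger_is_better):
        lo, hi = X[:, j].min(), X[:, j].max()
        norm[:, j] = (X[:, j] - lo) / (hi - lo) if larger else (hi - X[:, j]) / (hi - lo)
    delta = 1.0 - norm                                   # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                            # grade = mean coefficient per run

# Made-up runs: columns = HAZ width in mm (smaller better), cut-quality score (larger better).
runs = [[0.30, 7.2], [0.22, 8.1], [0.41, 6.5], [0.25, 7.9]]
grades = grey_relational_grade(runs, larger_is_better=[False, True])
print(grades, "best run:", int(np.argmax(grades)))
```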

  16. Clinical proteomics-driven precision medicine for targeted cancer therapy: current overview and future perspectives.

    PubMed

    Zhou, Li; Wang, Kui; Li, Qifu; Nice, Edouard C; Zhang, Haiyuan; Huang, Canhua

    2016-01-01

    Cancer is a common disease and a leading cause of death worldwide. Currently, early detection and novel therapeutic strategies are urgently needed for more effective management of cancer. Importantly, protein profiling using clinical proteomic strategies, with spectacular sensitivity and precision, offers excellent promise for the identification of potential biomarkers that would direct the development of targeted therapeutic anticancer drugs for precision medicine. In particular, clinical sample sources, including tumor tissues and body fluids (blood, feces, urine and saliva), have been widely investigated using modern high-throughput mass spectrometry-based proteomic approaches combined with bioinformatic analysis to pursue the possibilities of precision medicine for targeted cancer therapy. Discussed in this review are the current advantages and limitations of clinical proteomics, the available clinical proteomic strategies for the management of precision medicine, and the challenges and future perspectives of clinical proteomics-driven precision medicine for targeted cancer therapy.

  17. Closed tubes preparation of graphite for high-precision AMS radiocarbon analysis

    NASA Astrophysics Data System (ADS)

    Hajdas, I.; Michczynska, D.; Bonani, G.; Maurer, M.; Wacker, L.

    2009-04-01

    Radiocarbon dating is an established tool in geochronology. Technical developments in accelerator mass spectrometry (AMS), which allow measurements on samples containing less than 1 mg of carbon, have opened opportunities for new applications. Moreover, high-resolution records of past changes require high-resolution chronologies, i.e., dense sampling for 14C dating. As a result, the field of applications is expanding and the number of radiocarbon analyses is growing rapidly. Dedicated 14C AMS machines now have great capacity for analysis, but in order to keep up with the demand and provide results as quickly as possible, a very efficient sample preparation procedure is required. Sample preparation for 14C AMS analysis consists of two steps: separation of the relevant carbon from the sample material (removing contamination) and preparation of graphite for AMS analysis. The last step usually involves reaction of CO2 with H2 in the presence of a metal catalyst (Fe or Co) of specific mesh size heated to 550-625°C, as originally suggested by Vogel et al. (1984). Various graphitization systems have been built to meet the sample quality required for high-precision radiocarbon data. In the early 1990s, another method was proposed (Vogel 1992) and applied by a few laboratories, mainly for environmental or biomedical samples. This method uses TiH2 as a source of H2 and can be applied easily and flexibly to produce graphite. A sample of CO2 is frozen into a tube containing pre-conditioned Zn/TiH2 and Fe catalyst. The torch-sealed tubes are then placed in a stepwise-heated oven at 500/550°C and left to react for several hours. The greatest problems are the lack of control over reaction completeness and considerable fractionation. However, recently reported results (Xu et al. 2007) suggest that high-precision dating using graphite produced in closed tubes may be possible. We will present radiocarbon dating results for a set of standards and secondary IAEA standards to demonstrate to what level this method can be used for high-precision radiocarbon dating. References: Vogel JS. 1992. Rapid production of graphite without contamination for biomedical AMS. Radiocarbon 34: 344-350. Vogel JS, Southon JR, Nelson DE, and Brown TA. 1984. Performance of catalytically condensed carbon for use in accelerator mass spectrometry. Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms 233: 289-293. Xu X, Trumbore SE, Zheng S, Southon JR, McDuffee KE, Luttgen M, and Liu JC. 2007. Modifying a sealed tube zinc reduction method for preparation of AMS graphite targets: Reducing background and attaining high precision. Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms 259: 320-329.

  18. Motion Simulation Analysis of Rail Weld CNC Fine Milling Machine

    NASA Astrophysics Data System (ADS)

    Mao, Huajie; Shu, Min; Li, Chao; Zhang, Baojun

    The CNC fine milling machine is a new advanced piece of equipment for rail weld precision machining, offering high precision, high efficiency, low environmental pollution and other technical advantages. The motion performance of this machine directly affects its machining accuracy and stability, which makes it an important consideration in its design. Based on the design drawings, a 3D model of the 60 kg/m rail weld CNC fine milling machine was built in SolidWorks. The geometry was then imported into Adams to carry out the motion simulation analysis. The displacement, velocity, angular velocity and other kinematic parameter curves of the main components were obtained in post-processing, providing a scientific basis for the design and development of this machine.

  19. International Conference on the Mechanical Technology of Inertial Devices, University of Newcastle-upon-Tyne, England, Apr. 7-9, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Various papers on the mechanical technology of inertial devices are presented. The topics addressed include: development of a directional gyroscope for remotely piloted vehicles and similar applications; a two-degree-of-freedom gyroscope with frictionless inner and outer gimbal pickoffs; oscillogyro design, manufacture, and performance; development of miniature two-axis rate gyroscope; mechanical design aspects of the electrostatically suspended gyroscope; role of gas-lubricated bearings in current and future sensors; development of a new microporous retainer material for precision ball bearings; design study for a high-stability, large-centrifuge test bed; evaluation of a two-axis rate gyro; operating principles of a two-axis angular rate transducer; and nutation frequency analysis. Also considered are: triaxial laser gyro; mechanical design considerations for a ring laser gyro dither mechanism; environmental considerations in the design of fiberoptic gyroscopes; manufacturing aspects of some critical high-precision mechanical components of inertial devices; dynamics and control of a gyroscopic force measurement system; high precision and high performance motion systems; use of multiple acceleration references to obtain high precision centrifuge data at low cost; gyro testing and evaluation at the Communications Research Centre; review of the mechanical design and development of a high-performance accelerometer; and silicon microengineering for accelerometers.

  20. Quantitative analysis of three chiral pesticide enantiomers by high-performance column liquid chromatography.

    PubMed

    Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang

    2008-01-01

    Methods for the enantiomeric quantitative determination of three chiral pesticides, paclobutrazol, myclobutanil, and uniconazole, and their residues in soil and water are reported. An effective chiral high-performance liquid chromatography (HPLC)-UV method using an amylose tris(3,5-dimethylphenylcarbamate) (AD) column was developed for resolving the enantiomers and for quantitative determination. The enantiomers were identified with a circular dichroism detector. Validation involved complete resolution of each pair of enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and by C18 solid-phase extraction from water. The two enantiomers of each of the three pesticides could be completely separated on the AD column using an n-hexane/isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for the quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and for the analysis of environmental residues.

  1. In Vivo Precision of Digital Topological Skeletonization Based Individual Trabecula Segmentation (ITS) Analysis of Trabecular Microstructure at the Distal Radius and Tibia by HR-pQCT.

    PubMed

    Zhou, Bin; Zhang, Zhendong; Wang, Ji; Yu, Y Eric; Liu, Xiaowei Sherry; Nishiyama, Kyle K; Rubin, Mishaela R; Shane, Elizabeth; Bilezikian, John P; Guo, X Edward

    2016-06-01

    Trabecular plate and rod microstructure plays a dominant role in the apparent mechanical properties of trabecular bone. With high-resolution computed tomography (CT) images, digital topological analysis (DTA) including skeletonization and topological classification was applied to transform the trabecular three-dimensional (3D) network into surface and curve skeletons. Using the DTA-based topological analysis and a new reconstruction/recovery scheme, individual trabecula segmentation (ITS) was developed to segment individual trabecular plates and rods and quantify the trabecular plate- and rod-related morphological parameters. High-resolution peripheral quantitative computed tomography (HR-pQCT) is an emerging in vivo imaging technique to visualize 3D bone microstructure. Based on HR-pQCT images, ITS was applied to various HR-pQCT datasets to examine trabecular plate- and rod-related microstructure and has demonstrated great potential in cross-sectional and longitudinal clinical applications. However, the reproducibility of ITS has not been fully determined. The aim of the current study is to quantify the precision errors of ITS plate-rod microstructural parameters. In addition, we utilized three different frequently used contour techniques to separate trabecular and cortical bone and to evaluate their effect on ITS measurements. Overall, good reproducibility was found for the standard HR-pQCT parameters with precision errors for volumetric BMD and bone size between 0.2%-2.0%, and trabecular bone microstructure between 4.9%-6.7% at the radius and tibia. High reproducibility was also achieved for ITS measurements using all three different contour techniques. For example, using automatic contour technology, low precision errors were found for plate and rod trabecular number (pTb.N, rTb.N, 0.9% and 3.6%), plate and rod trabecular thickness (pTb.Th, rTb.Th, 0.6% and 1.7%), plate trabecular surface (pTb.S, 3.4%), rod trabecular length (rTb.ℓ, 0.8%), and plate-plate junction density (P-P Junc.D, 2.3%) at the tibia. The precision errors at the radius were similar to those at the tibia. In addition, precision errors were affected by the contour technique. At the tibia, precision error by the manual contour method was significantly different from automatic and standard contour methods for pTb.N, rTb.N and rTb.Th. Precision error using the manual contour method was also significantly different from the standard contour method for rod trabecular number (rTb.N), rod trabecular thickness (rTb.Th), rod-rod and plate-rod junction densities (R-R Junc.D and P-R Junc.D) at the tibia. At the radius, the precision error was similar between the three different contour methods. Image quality was also found to significantly affect the ITS reproducibility. We concluded that ITS parameters are highly reproducible, giving assurance that future cross-sectional and longitudinal clinical HR-pQCT studies are feasible in the context of limited sample sizes.

  2. Quantitative Determination of Isotope Ratios from Experimental Isotopic Distributions

    PubMed Central

    Kaur, Parminder; O’Connor, Peter B.

    2008-01-01

    Isotope variability due to natural processes provides important information for studying a variety of complex natural phenomena from the origins of a particular sample to the traces of biochemical reaction mechanisms. These measurements require high-precision determination of isotope ratios of a particular element involved. Isotope Ratio Mass Spectrometers (IRMS) are widely employed tools for such a high-precision analysis, which have some limitations. This work aims at overcoming the limitations inherent to IRMS by estimating the elemental isotopic abundance from the experimental isotopic distribution. In particular, a computational method has been derived which allows the calculation of 13C/12C ratios from the whole isotopic distributions, given certain caveats, and these calculations are applied to several cases to demonstrate their utility. The limitations of the method in terms of the required number of ions and S/N ratio are discussed. For high-precision estimates of the isotope ratios, this method requires very precise measurement of the experimental isotopic distribution abundances, free from any artifacts introduced by noise, sample heterogeneity, or other experimental sources. PMID:17263354
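
    One way to read the computational method described above: if carbon is treated as the only polyisotopic element, the isotopologue envelope of a molecule with n carbons is binomial in the 13C probability p, so p, and hence the 13C/12C ratio, can be fitted from the whole measured distribution. The sketch below implements that simplified reading with an invented 60-carbon example; it is not the authors' code.

```python
import numpy as np
from math import comb
from scipy.optimize import minimize_scalar

def binomial_pattern(n_carbons, p):
    """Relative isotopologue abundances (M, M+1, ..., M+n) assuming
    carbon is the only polyisotopic element present."""
    return np.array([comb(n_carbons, k) * p**k * (1 - p)**(n_carbons - k)
                     for k in range(n_carbons + 1)])

def fit_c13_ratio(measured, n_carbons):
    """Least-squares fit of the 13C probability p to a measured envelope;
    returns the 13C/12C ratio p/(1-p)."""
    measured = np.asarray(measured, dtype=float)
    measured = measured / measured.sum()
    def cost(p):
        pred = binomial_pattern(n_carbons, p)[:measured.size]
        pred = pred / pred.sum()
        return float(np.sum((pred - measured) ** 2))
    p = minimize_scalar(cost, bounds=(1e-4, 0.05), method="bounded").x
    return p / (1 - p)

# Invented 60-carbon species with a measured M..M+3 envelope (arbitrary units).
print(fit_c13_ratio([1.00, 0.655, 0.212, 0.045], n_carbons=60))   # ~0.0108, i.e. ~1.07% 13C
```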

  3. Comparative study of 2-DOF micromirrors for precision light manipulation

    NASA Astrophysics Data System (ADS)

    Young, Johanna I.; Shkel, Andrei M.

    2001-08-01

    Many industry experts predict that the future of fiber optic telecommunications depends on the development of all-optical components for switching photonic signals from fiber to fiber throughout the networks. MEMS is a promising technology for providing all-optical switching at high speeds with significant cost reductions. This paper reports on the analysis of two designs for 2-DOF electrostatically actuated MEMS micromirrors for precisely controllable, large optical switching arrays. The behavior of the micromirror designs is predicted by coupled-field electrostatic and modal analysis using finite element analysis (FEA) multi-physics modeling software. The analysis indicates that the commonly used gimbal-type mirror design experiences electrostatic interference and would therefore be difficult to control precisely in 2-DOF motion. We propose a new design approach that preserves 2-DOF actuation while minimizing electrostatic interference between the drive electrodes and the mirror. Instead of using two torsional axes, we use one actuator that combines torsional and flexural DOFs. A comparative analysis of the conventional gimbal design and the one proposed in this paper is performed.

  4. Note: Tandem Kirkpatrick-Baez microscope with sixteen channels for high-resolution laser-plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Yi, Shengzhen; Zhang, Zhe; Huang, Qiushi; Zhang, Zhong; Wang, Zhanshan; Wei, Lai; Liu, Dongxiao; Cao, Leifeng; Gu, Yuqiu

    2018-03-01

    Multi-channel Kirkpatrick-Baez (KB) microscopes, which have better resolution and collection efficiency than pinhole cameras, have been widely used in laser inertial confinement fusion to diagnose time evolution of the target implosion. In this study, a tandem multi-channel KB microscope was developed to have sixteen imaging channels with the precise control of spatial resolution and image intervals. This precise control was created using a coarse assembly of mirror pairs with high-accuracy optical prisms, followed by precise adjustment in real-time x-ray imaging experiments. The multilayers coated on the KB mirrors were designed to have substantially the same reflectivity to obtain a uniform brightness of different images for laser-plasma temperature analysis. The study provides a practicable method to achieve the optimum performance of the microscope for future high-resolution applications in inertial confinement fusion experiments.

  5. In situ precision electrospinning as an effective delivery technique for cyanoacrylate medical glue with high efficiency and low toxicity.

    PubMed

    Dong, R H; Qin, C C; Qiu, X; Yan, X; Yu, M; Cui, L; Zhou, Y; Zhang, H D; Jiang, X Y; Long, Y Z

    2015-12-14

    The side effects and toxicity of cyanoacrylate used in vivo have been debated since its first application in wound closure. We propose an airflow-assisted in situ precision electrospinning apparatus as an applicator and make a detailed comparison with traditional spraying via in vitro and in vivo experiments. This novel method not only improves operational performance and safety by precisely depositing cyanoacrylate fibers onto a wound, but also significantly reduces the dosage of cyanoacrylate, by almost 80%. A white blood cell count, liver function tests and histological analysis show that the in situ precision electrospinning applicator produces a better postoperative outcome, e.g., minor hepatocyte injury, moderate inflammation and a significant capacity for liver regeneration. This in situ precision electrospinning method may thus dramatically broaden both civilian and military applications of cyanoacrylates.

  6. Study of the one-way speed of light anisotropy with particle beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojtsekhowski, Bogdan B.

    Concepts of high precision studies of the one-way speed of light anisotropy are discussed. The high energy particle beam allows measurement of a one-way speed of light anisotropy (SOLA) via analysis of the beam momentum variation with sidereal phase without the use of synchronized clocks. High precision beam position monitors could provide accurate monitoring of the beam orbit and determination of the particle beam momentum with relative accuracy on the level of 10^-10, which corresponds to a limit on SOLA of 10^-18 with existing storage rings. A few additional versions of the experiment are also presented.

  7. Study of the one-way speed of light anisotropy with particle beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojtsekhowski, Bogdan

    2017-04-01

    Concepts of high precision studies of the one-way speed of light anisotropy are discussed. The high energy particle beam allows measurement of a one-way speed of light anisotropy (SOLA) via analysis of the beam momentum variation with sidereal phase without the use of synchronized clocks. High precision beam position monitors could provide accurate monitoring of the beam orbit and determination of the particle beam momentum with relative accuracy on the level of 10^-10, which corresponds to a limit on SOLA of 10^-18 with existing storage rings. A few additional versions of the experiment are also presented.

  8. Phasemeter core for intersatellite laser heterodyne interferometry: modelling, simulations and experiments

    NASA Astrophysics Data System (ADS)

    Gerberding, Oliver; Sheard, Benjamin; Bykov, Iouri; Kullmann, Joachim; Esteban Delgado, Juan Jose; Danzmann, Karsten; Heinzel, Gerhard

    2013-12-01

    Intersatellite laser interferometry is a central component of future space-borne gravity instruments like Laser Interferometer Space Antenna (LISA), evolved LISA, NGO and future geodesy missions. The inherently small laser wavelength allows us to measure distance variations with extremely high precision by interfering a reference beam with a measurement beam. The readout of such interferometers is often based on tracking phasemeters, which are able to measure the phase of an incoming beatnote with high precision over a wide range of frequencies. The implementation of such phasemeters is based on all digital phase-locked loops (ADPLL), hosted in FPGAs. Here, we present a precise model of an ADPLL that allows us to design such a readout algorithm and we support our analysis by numerical performance measurements and experiments with analogue signals.
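
    As a rough illustration of the tracking-phasemeter principle, the sketch below implements a toy all-digital PLL: a numerically controlled oscillator is mixed with the incoming beatnote and a proportional-integral filter steers its phase increment. The gains, sampling rate and beatnote are arbitrary choices, and the model omits the decimation and readout stages of a real phasemeter.

```python
import numpy as np

def adpll_track(x, f0, fs, kp=0.05, ki=0.002):
    """Toy all-digital PLL: an NCO is mixed with the input beatnote, the
    mixer output (~sin of the phase error once the loop averages the
    double-frequency ripple) drives a PI filter that steers the NCO
    phase increment. Returns the NCO phase in cycles at every sample."""
    phase = 0.0                 # NCO phase, in cycles
    freq = f0 / fs              # nominal phase increment per sample
    integ = 0.0
    out = np.empty(len(x))
    for n, s in enumerate(x):
        err = s * np.cos(2 * np.pi * phase)   # phase detector output
        integ += ki * err
        phase += freq + kp * err + integ
        out[n] = phase
    return out

# Hypothetical 5 kHz beatnote sampled at 200 kHz; the NCO starts 200 Hz off.
fs = 200e3
t = np.arange(10000) / fs
beat = np.sin(2 * np.pi * 5e3 * t)
nco_phase = adpll_track(beat, f0=5.2e3, fs=fs)
print(np.diff(nco_phase[-200:]).mean() * fs)   # tracked frequency, ~5000 Hz after lock
```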

  9. High-speed Continuous-wave Stimulated Brillouin Scattering Spectrometer for Material Analysis.

    PubMed

    Remer, Itay; Cohen, Lear; Bilenca, Alberto

    2017-09-22

    Recent years have witnessed a significant increase in the use of spontaneous Brillouin spectrometers for non-contact analysis of soft matter, such as aqueous solutions and biomaterials, with fast acquisition times. Here, we discuss the assembly and operation of a Brillouin spectrometer that uses stimulated Brillouin scattering (SBS) to measure stimulated Brillouin gain (SBG) spectra of water and lipid emulsion-based tissue-like samples in transmission mode with <10 MHz spectral-resolution and <35 MHz Brillouin-shift measurement precision at <100 ms. The spectrometer consists of two nearly counter-propagating continuous-wave (CW) narrow-linewidth lasers at 780 nm whose frequency detuning is scanned through the material Brillouin shift. By using an ultra-narrowband hot rubidium-85 vapor notch filter and a phase-sensitive detector, the signal-to-noise-ratio of the SBG signal is significantly enhanced compared to that obtained with existing CW-SBS spectrometers. This improvement enables measurement of SBG spectra with up to 100-fold faster acquisition times, thereby facilitating high spectral-resolution and high-precision Brillouin analysis of soft materials at high speed.

  10. Note: Eddy current displacement sensors independent of target conductivity.

    PubMed

    Wang, Hongbo; Li, Wei; Feng, Zhihua

    2015-01-01

    Eddy current sensors (ECSs) are widely used for non-contact displacement measurement. In this note, the quantitative error of an ECS caused by target conductivity was analyzed using a complex image method. The response curves (L-x) of the ECS with different targets were similar and could be made to overlap by shifting the curves along the x direction by √2δ/2. Both finite element analysis and experiments match the theoretical analysis well, which indicates that the measurement error of high precision ECSs caused by target conductivity can be completely eliminated and that ECSs can measure different materials precisely without calibration.

  11. High Precision Edge Detection Algorithm for Mechanical Parts

    NASA Astrophysics Data System (ADS)

    Duan, Zhenyun; Wang, Ning; Fu, Jingshun; Zhao, Wenhui; Duan, Boqiang; Zhao, Jungui

    2018-04-01

    High precision and high efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on the Gaussian integral model is proposed. For this purpose, the Gaussian integral model of the step edge along the normal section line of the backlight image is constructed, combining the point spread function and the single-step model. The gray values of discrete points on the normal section line of the pixel edge are then calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted to the Gaussian integral model. A precise subpixel edge location is then determined by searching for the mean point. Finally, a gear tooth was measured with an M&M 3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and that the subpixel edge location accuracy and computation speed are improved. The maximum error of the gear tooth profile total deviation is 1.9 μm compared with the measurement result from the gear measurement center, indicating that the method is sufficiently reliable to meet the requirements of high precision measurement.
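
    The fitting step described above amounts to regressing sampled gray values onto a Gaussian-integral (error-function) edge profile and taking the fitted center as the subpixel location. A minimal sketch with synthetic profile data is shown below; the parameters are illustrative and the surface-interpolation stage is omitted.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def edge_model(x, a, b, x0, sigma):
    """Ideal step edge convolved with a Gaussian PSF (the Gaussian-integral
    edge model): background b, step height a, subpixel center x0."""
    return b + 0.5 * a * (1.0 + erf((x - x0) / (np.sqrt(2) * sigma)))

# Synthetic gray values sampled along the edge-normal section line.
x = np.arange(-5.0, 6.0, 1.0)
true_center = 0.37
gray = edge_model(x, a=180.0, b=30.0, x0=true_center, sigma=1.1)
gray += np.random.default_rng(2).normal(scale=1.5, size=x.size)   # sensor noise

popt, _ = curve_fit(edge_model, x, gray, p0=[np.ptp(gray), gray.min(), 0.0, 1.0])
print("subpixel edge location: %.3f px (true %.2f)" % (popt[2], true_center))
```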

  12. Precise, High-throughput Analysis of Bacterial Growth.

    PubMed

    Kurokawa, Masaomi; Ying, Bei-Wen

    2017-09-19

    Bacterial growth is a central concept in the development of modern microbial physiology, as well as in the investigation of cellular dynamics at the systems level. Recent studies have reported correlations between bacterial growth and genome-wide events, such as genome reduction and transcriptome reorganization. Correctly analyzing bacterial growth is crucial for understanding the growth-dependent coordination of gene functions and cellular components. Accordingly, the precise quantitative evaluation of bacterial growth in a high-throughput manner is required. Emerging technological developments offer new experimental tools that allow updates of the methods used for studying bacterial growth. The protocol introduced here employs a microplate reader with a highly optimized experimental procedure for the reproducible and precise evaluation of bacterial growth. This protocol was used to evaluate the growth of several previously described Escherichia coli strains. The main steps of the protocol are as follows: the preparation of a large number of cell stocks in small vials for repeated tests with reproducible results, the use of 96-well plates for high-throughput growth evaluation, and the manual calculation of two major parameters (i.e., maximal growth rate and population density) representing the growth dynamics. In comparison to the traditional colony-forming unit (CFU) assay, which counts the cells that are cultured in glass tubes over time on agar plates, the present method is more efficient and provides more detailed temporal records of growth changes, but has a stricter detection limit at low population densities. In summary, the described method is advantageous for the precise and reproducible high-throughput analysis of bacterial growth, which can be used to draw conceptual conclusions or to make theoretical observations.
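
    The two growth parameters named above are commonly computed as the steepest sliding-window slope of ln(OD) (maximal specific growth rate) and the saturated OD (population density). The sketch below does exactly that on a synthetic logistic growth curve; it is a generic illustration, not the protocol's own analysis script.

```python
import numpy as np

def growth_parameters(time_h, od, window=5):
    """Maximal specific growth rate (steepest sliding-window slope of
    ln OD, per hour) and saturated population density (maximum OD)."""
    ln_od = np.log(od)
    rates = [np.polyfit(time_h[i:i + window], ln_od[i:i + window], 1)[0]
             for i in range(len(od) - window)]
    return max(rates), od.max()

# Synthetic plate-reader trace: logistic growth read every 30 min for 24 h.
t = np.arange(0.0, 24.0, 0.5)
od = 1.2 / (1.0 + np.exp(-0.6 * (t - 10.0))) + 0.02   # plus a small blank offset
mu_max, density = growth_parameters(t, od)
print(f"maximal growth rate ~{mu_max:.2f} /h, population density ~{density:.2f} OD")
```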

  13. The development of a novel high-precision major depressive disorder screening system using transient autonomic responses induced by dual mental tasks.

    PubMed

    Matsui, Takemi; Shinba, Toshikazu; Sun, Guanghao

    2018-02-01

    Because 12.6% of major depressive disorder (MDD) patients have suicidal intent, while a reported 43% of patients do not consult their doctors about MDD, automated MDD screening is eagerly anticipated. Recently, in order to achieve automated screening of MDD, biomarkers such as multiplex DNA methylation profiles and physiological methods using near-infrared spectroscopy (NIRS) have been studied; however, they require inspection with a 96-well DNA ELISA kit after blood sampling or come at significant cost. Using a single-lead electrocardiogram (ECG), we developed a high-precision MDD screening system based on transient autonomic responses induced by dual mental tasks. The system is composed of a single-lead ECG monitor, an analogue-to-digital (AD) converter, and a personal computer running a measurement and analysis program written in the LabVIEW programming language. It discriminates MDD patients from normal subjects using heart rate variability (HRV)-derived transient autonomic responses induced by dual mental tasks, i.e., a verbal fluency task and a random number generation task, via linear discriminant analysis (LDA) adopting HRV-related predictor variables (heart rate (HR), high frequency (HF), and low frequency (LF)/HF). The proposed system was tested on 12 MDD patients (32 ± 15 years) under antidepressant treatment from the Shizuoka Saiseikai General Hospital outpatient unit and 30 normal volunteers (37 ± 17 years) from Tokyo Metropolitan University. The system achieved 100% sensitivity and 100% specificity in classifying the 42 examinees into 12 MDD patients and 30 normal subjects. The proposed system appears promising for future HRV-based, high-precision and low-cost screening of MDD using only a single-lead ECG.
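
    A sketch of the discriminant-analysis step is given below using scikit-learn. The three features mimic task-induced changes in HR, HF and LF/HF, but the numbers are synthetic and tuned only to make the illustration run; they do not reproduce the study's data or its 100% accuracy.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic task-induced autonomic responses (delta HR, delta HF, delta LF/HF)
# for 30 "control" and 12 "patient" subjects; the separation is invented to
# illustrate the LDA step only.
controls = rng.normal([8.0, -120.0, 0.9], [3.0, 60.0, 0.4], size=(30, 3))
patients = rng.normal([2.0, -30.0, 0.2], [3.0, 60.0, 0.4], size=(12, 3))
X = np.vstack([controls, patients])
y = np.array([0] * 30 + [1] * 12)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # cross-validated screening accuracy
print("per-fold accuracy:", scores.round(2), "mean:", scores.mean())
```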

  14. System identification of the JPL micro-precision interferometer truss - Test-analysis reconciliation

    NASA Technical Reports Server (NTRS)

    Red-Horse, J. R.; Marek, E. L.; Levine-West, M.

    1993-01-01

    The JPL Micro-Precision Interferometer (MPI) is a testbed for studying the use of control-structure interaction technology in the design of space-based interferometers. A layered control architecture will be employed to regulate the interferometer optical system to tolerances in the nanometer range. An important aspect of designing and implementing the control schemes for such a system is the need for high fidelity, test-verified analytical structural models. This paper focuses on one aspect of the effort to produce such a model for the MPI structure, test-analysis model reconciliation. Pretest analysis, modal testing, and model refinement results are summarized for a series of tests at both the component and full system levels.

  15. Precision Timing of PSR J0437-4715: An Accurate Pulsar Distance, a High Pulsar Mass, and a Limit on the Variation of Newton's Gravitational Constant

    NASA Astrophysics Data System (ADS)

    Verbiest, J. P. W.; Bailes, M.; van Straten, W.; Hobbs, G. B.; Edwards, R. T.; Manchester, R. N.; Bhat, N. D. R.; Sarkissian, J. M.; Jacoby, B. A.; Kulkarni, S. R.

    2008-05-01

    Analysis of 10 years of high-precision timing data on the millisecond pulsar PSR J0437-4715 has resulted in a model-independent kinematic distance based on an apparent orbital period derivative, Ṗb, determined at the 1.5% level of precision (Dk = 157.0 ± 2.4 pc), making it one of the most accurate stellar distance estimates published to date. The discrepancy between this measurement and a previously published parallax distance estimate is attributed to errors in the DE200 solar system ephemerides. The precise measurement of Ṗb allows a limit on the variation of Newton's gravitational constant, |Ġ/G| ≤ 23 × 10^-12 yr^-1. We also constrain any anomalous acceleration along the line of sight to the pulsar to |a⊙/c| ≤ 1.5 × 10^-18 s^-1 at 95% confidence, and derive a pulsar mass, m_psr = 1.76 ± 0.20 M⊙, one of the highest estimates so far obtained.

  16. National and International Security Applications of Cryogenic Detectors—Mostly Nuclear Safeguards

    NASA Astrophysics Data System (ADS)

    Rabin, Michael W.

    2009-12-01

    As with science, so with security—in both arenas, the extraordinary sensitivity of cryogenic sensors enables high-confidence detection and high-precision measurement even of the faintest signals. Science applications are more mature, but several national and international security applications have been identified where cryogenic detectors have high potential payoff. International safeguards and nuclear forensics are areas needing new technology and methods to boost speed, sensitivity, precision and accuracy. Successfully applied, improved nuclear materials analysis will help constrain nuclear materials diversion pathways and contribute to treaty verification. Cryogenic microcalorimeter detectors for X-ray, gamma-ray, neutron, and alpha-particle spectrometry are under development with these aims in mind. In each case the unsurpassed energy resolution of microcalorimeters reveals previously invisible spectral features of nuclear materials. Preliminary results of quantitative analysis indicate substantial improvements are still possible, but significant work will be required to fully understand the ultimate performance limits.

  17. Determination of dasatinib in the tablet dosage form by ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis.

    PubMed

    Gonzalez, Aroa Garcia; Taraba, Lukáš; Hraníček, Jakub; Kozlík, Petr; Coufal, Pavel

    2017-01-01

    Dasatinib is a novel oral prescription drug proposed for treating adult patients with chronic myeloid leukemia. Three analytical methods, namely ultra high performance liquid chromatography, capillary zone electrophoresis, and sequential injection analysis, were developed, validated, and compared for determination of the drug in the tablet dosage form. The total analysis time of the optimized ultra high performance liquid chromatography and capillary zone electrophoresis methods was 2.0 and 2.2 min, respectively. Direct ultraviolet detection with a detection wavelength of 322 nm was employed in both cases. The optimized sequential injection analysis method was based on spectrophotometric detection of dasatinib after a simple colorimetric reaction with Folin-Ciocalteu reagent, forming a blue-colored complex with an absorbance maximum at 745 nm; the total analysis time was 2.5 min. The ultra high performance liquid chromatography method provided the lowest detection and quantitation limits and the most precise and accurate results. All three newly developed methods were demonstrated to be specific, linear, sensitive, precise, and accurate, providing results that satisfactorily meet the requirements of the pharmaceutical industry, and can be employed for the routine determination of the active pharmaceutical ingredient in the tablet dosage form. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Reconciling evidence-based medicine and precision medicine in the era of big data: challenges and opportunities.

    PubMed

    Beckmann, Jacques S; Lew, Daniel

    2016-12-19

    This era of groundbreaking scientific developments in high-resolution, high-throughput technologies is allowing the cost-effective collection and analysis of huge, disparate datasets on individual health. Proper data mining and translation of the vast datasets into clinically actionable knowledge will require the application of clinical bioinformatics. These developments have triggered multiple national initiatives in precision medicine-a data-driven approach centering on the individual. However, clinical implementation of precision medicine poses numerous challenges. Foremost, precision medicine needs to be contrasted with the powerful and widely used practice of evidence-based medicine, which is informed by meta-analyses or group-centered studies from which mean recommendations are derived. This "one size fits all" approach can provide inadequate solutions for outliers. Such outliers, which are far from an oddity as all of us fall into this category for some traits, can be better managed using precision medicine. Here, we argue that it is necessary and possible to bridge between precision medicine and evidence-based medicine. This will require worldwide and responsible data sharing, as well as regularly updated training programs. We also discuss the challenges and opportunities for achieving clinical utility in precision medicine. We project that, through collection, analyses and sharing of standardized medically relevant data globally, evidence-based precision medicine will shift progressively from therapy to prevention, thus leading eventually to improved, clinician-to-patient communication, citizen-centered healthcare and sustained well-being.

  19. New real-time algorithms for arbitrary, high precision function generation with applications to acoustic transducer excitation

    NASA Astrophysics Data System (ADS)

    Gaydecki, P.

    2009-07-01

    A system is described for the design, downloading and execution of arbitrary functions, intended for use with acoustic and low-frequency ultrasonic transducers in condition monitoring and materials testing applications. The instrumentation comprises a software design tool and a powerful real-time digital signal processor unit operating at 580 million multiply-accumulate operations per second (MMACs). The embedded firmware employs both an established look-up table approach and a new function interpolation technique to generate the real-time signals with very high precision and flexibility. Using total harmonic distortion (THD) analysis, the purity of the waveforms has been compared with that of waveforms generated by traditional analogue function generators; this analysis has confirmed that the new instrument has a consistently superior signal-to-noise ratio.
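
    The look-up table plus interpolation approach mentioned above is essentially direct digital synthesis: a phase accumulator indexes a sine table and linear interpolation refines values between entries. The sketch below shows that combination with an arbitrary table size and output rate; it is an illustration, not the instrument's firmware.

```python
import numpy as np

def dds_generate(freq_hz, fs_hz, n_samples, table_size=1024):
    """Direct digital synthesis: a phase accumulator indexes a sine
    look-up table, and linear interpolation between adjacent entries
    reduces the quantization of the generated waveform."""
    table = np.sin(2 * np.pi * np.arange(table_size) / table_size)
    phase = (freq_hz / fs_hz * np.arange(n_samples)) % 1.0   # phase in cycles
    pos = phase * table_size
    idx = pos.astype(int)
    frac = pos - idx
    nxt = (idx + 1) % table_size
    return (1.0 - frac) * table[idx] + frac * table[nxt]

# Arbitrary example: a 1.7 kHz excitation tone at a 96 kHz output rate.
y = dds_generate(1700.0, 96000.0, 4096)
print(y[:4])
```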

  20. Sub-sampling genetic data to estimate black bear population size: A case study

    USGS Publications Warehouse

    Tredick, C.A.; Vaughan, M.R.; Stauffer, D.F.; Simek, S.L.; Eason, T.

    2007-01-01

    Costs for genetic analysis of hair samples collected for individual identification of bears average approximately US$50 [2004] per sample. This can easily exceed budgetary allowances for large-scale studies or studies of high-density bear populations. We used 2 genetic datasets from 2 areas in the southeastern United States to explore how reducing costs of analysis by sub-sampling affected precision and accuracy of resulting population estimates. We used several sub-sampling scenarios to create subsets of the full datasets and compared summary statistics, population estimates, and precision of estimates generated from these subsets to estimates generated from the complete datasets. Our results suggested that bias and precision of estimates improved as the proportion of total samples used increased, and heterogeneity models (e.g., Mh[CHAO]) were more robust to reduced sample sizes than other models (e.g., behavior models). We recommend that only high-quality samples (>5 hair follicles) be used when budgets are constrained, and efforts should be made to maximize capture and recapture rates in the field.
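
    The effect of sub-sampling on such estimates can be mimicked with a small simulation: thin each individual's capture count binomially and recompute a Chao-type heterogeneity estimator. The sketch below does this with invented capture probabilities and the simple closed-form Chao estimator, which only approximates the Mh(CHAO) model fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(4)

def chao_estimate(capture_counts):
    """Closed-form Chao-type abundance estimate from per-individual capture
    counts: N_hat = S_obs + f1*(f1 - 1) / (2*(f2 + 1)) (bias-corrected form)."""
    counts = capture_counts[capture_counts > 0]
    f1 = int(np.sum(counts == 1))
    f2 = int(np.sum(counts == 2))
    return counts.size + f1 * (f1 - 1) / (2.0 * (f2 + 1))

# Simulated hair-snare season: 150 bears, 8 occasions, heterogeneous detection.
n_bears, occasions = 150, 8
p = rng.beta(2, 8, size=n_bears)            # individual capture probabilities (illustrative)
captures = rng.binomial(occasions, p)       # captures per bear over the season

full = chao_estimate(captures)
sub = chao_estimate(rng.binomial(captures, 0.6))   # genotype only ~60% of collected samples
print(f"full-data estimate {full:.0f}, 60% sub-sample estimate {sub:.0f} (true 150)")
```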

  1. Quantitative gel electrophoresis: new records in precision by elaborated staining and detection protocols.

    PubMed

    Deng, Xi; Schröder, Simone; Redweik, Sabine; Wätzig, Hermann

    2011-06-01

    Gel electrophoresis (GE) is a very common analytical technique for proteome research and protein analysis. Despite being developed decades ago, there is still a considerable need to improve its precision. Using the near-infrared (NIR) fluorescence of Colloidal Coomassie Blue-stained proteins, the major error source caused by unpredictable background staining is strongly reduced. This result was generalized for various types of detectors. Since GE is a multi-step procedure, standardization of every single step is required. After detailed analysis of all steps, staining and destaining were identified as the major sources of the remaining variation. By employing standardized protocols, pooled percent relative standard deviations of 1.2-3.1% for band intensities were achieved for one-dimensional separations in repetitive experiments. The analysis of variance suggests that the same batch of staining solution should be used for all gels of one experimental series to minimize day-to-day variation and obtain high precision. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. 1998 Conference on Precision Electromagnetic Measurements Digest. Proceedings.

    NASA Astrophysics Data System (ADS)

    Nelson, T. L.

    The following topics were dealt with: fundamental constants; caesium standards; AC-DC transfer; impedance measurement; length measurement; units; statistics; cryogenic resonators; time transfer; QED; resistance scaling and bridges; mass measurement; atomic fountains and clocks; single electron transport; Newtonian constant of gravitation; stabilised lasers and frequency measurements; cryogenic current comparators; optical frequency standards; high voltage devices and systems; international compatibility; magnetic measurement; precision power measurement; high resolution spectroscopy; DC transport standards; waveform acquisition and analysis; ion trap standards; optical metrology; quantised Hall effect; Josephson array comparisons; signal generation and measurement; Avogadro constant; microwave networks; wideband power standards; antennas, fields and EMC; quantum-based standards.

  3. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    NASA Astrophysics Data System (ADS)

    Wittig, Alexander N.

    The well-established concept of Taylor Models is introduced, which offers highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions with rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period-15 fixed point in a near-standard Hénon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. Previous work done by Johannes Grote is extended to compute very accurate polynomial approximations to invariant manifolds of discrete maps of arbitrary dimension around hyperbolic fixed points. The algorithm presented allows for automatic removal of resonances occurring during construction. A method for the rigorous enclosure of invariant manifolds of continuous systems is introduced. Using methods developed for discrete maps, polynomial approximations of invariant manifolds of hyperbolic fixed points of ODEs are obtained. These approximations are outfitted with a sharp error bound which is verified to rigorously contain the manifolds. While we focus on the three-dimensional case, verification in higher dimensions is possible using similar techniques. Integrating the resulting enclosures using the verified COSY VI integrator, the initial manifold enclosures are expanded to yield sharp enclosures of large parts of the stable and unstable manifolds. To demonstrate the effectiveness of this method, we construct enclosures of the invariant manifolds of the Lorenz system and show pictures of the resulting manifold enclosures. To the best of our knowledge, these enclosures are the largest verified enclosures of manifolds in the Lorenz system in existence.
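
    As a much simplified illustration of verified fixed-point arguments (not COSY INFINITY's Taylor Model implementation), the sketch below applies one interval Newton step to enclose the fixed point of cos(x); the toy interval type ignores directed rounding, so it conveys the idea rather than a fully rigorous computation.

        import math

        class Interval:
            """Toy interval type; outward rounding is omitted, so this is only illustrative."""
            def __init__(self, lo, hi):
                self.lo, self.hi = lo, hi
            def __contains__(self, other):   # True if the interval `other` lies inside self
                return self.lo <= other.lo and other.hi <= self.hi
            def __sub__(self, other):
                return Interval(self.lo - other.hi, self.hi - other.lo)
            def __truediv__(self, other):    # `other` must not contain zero
                c = [self.lo / other.lo, self.lo / other.hi, self.hi / other.lo, self.hi / other.hi]
                return Interval(min(c), max(c))

        # Verify the fixed point of cos, i.e. the zero of f(x) = cos(x) - x, inside X = [0.7, 0.8].
        X = Interval(0.7, 0.8)
        m = 0.5 * (X.lo + X.hi)
        f_m = Interval(math.cos(m) - m, math.cos(m) - m)               # f at the midpoint
        df_X = Interval(-math.sin(X.hi) - 1.0, -math.sin(X.lo) - 1.0)  # enclosure of f'(X) = -sin(X) - 1
        N = Interval(m, m) - f_m / df_X                                # interval Newton operator N(X)
        print(f"N(X) = [{N.lo:.6f}, {N.hi:.6f}]")
        if N in X:
            print("N(X) is contained in X, so a unique fixed point of cos lies in [0.7, 0.8]")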

  4. First clinical results on the feasibility, quality and reproducibility of aberrometry-based intraoperative refraction during cataract surgery.

    PubMed

    Huelle, Jan O; Katz, Toam; Druchkiv, Vasyl; Pahlitzsch, Milena; Steinberg, Johannes; Richard, Gisbert; Linke, Stephan J

    2014-11-01

    To provide the first clinical data on the feasibility, quality and precision of intraoperative wavefront aberrometry (IWA)-based refraction in patients with cataract. IWA refraction was recorded at 7 defined measurement points during standardised cataract surgery in 74 eyes of 74 consecutive patients (mean age 69±11.3 years). Precision and measurement quality were evaluated by the 'limits of agreement' approach, regression analysis, correlation analysis, analysis of variance (ANOVA) and ORs for predicting measurement failure. Wavefront map (WFM) quality was objectivised and compared with the Pentacam Nuclear Staging analysis. Out of 814 IWA measurement attempts, 462 WFMs could be obtained. The most successful readings (n=63) were achieved in aphakia with viscoelastic. The highest (50.63%, SD 20.23) and lowest (29.19%, SD 13.94) quality of WFMs across all measurement points were found after clear corneal incision and in pseudophakia with viscoelastic, respectively. High consistency across repeated measures was found for mean spherical equivalent (SE) differences in aphakia with -0.01 D and pseudophakia with -0.01 D, but ranges were high (limits of agreement +0.69 D and -0.72 D; +1.53 D and -1.54 D, respectively). With increasing WFM quality, higher precision in measurements was observed. This is the first report addressing the quality and reproducibility of IWA in a large sample. IWA refraction in aphakia, for instance, appears to be reliable once stable and pressurised anterior chamber conditions are achieved. More efforts are required to improve the precision and quality of measurements before IWA can be used to guide the surgical refractive plan in cataract surgery. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
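
    A minimal sketch of the 'limits of agreement' calculation used to assess the precision of repeated refraction readings; the paired spherical-equivalent values below are hypothetical.

        import statistics

        # Hypothetical paired spherical equivalent (SE) readings, in dioptres,
        # from two repeated aphakic measurements in the same eyes.
        se_first  = [0.25, -0.50, 1.00, 0.75, -0.25, 0.00, 0.50, -1.00]
        se_second = [0.50, -0.75, 0.75, 1.00, -0.50, 0.25, 0.25, -0.75]

        diffs = [a - b for a, b in zip(se_first, se_second)]
        bias = statistics.mean(diffs)        # mean difference
        sd = statistics.stdev(diffs)         # SD of the differences
        loa_upper = bias + 1.96 * sd         # Bland-Altman limits of agreement
        loa_lower = bias - 1.96 * sd
        print(f"mean difference {bias:+.2f} D, limits of agreement "
              f"[{loa_lower:+.2f} D, {loa_upper:+.2f} D]")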

  5. Precision digital control systems

    NASA Astrophysics Data System (ADS)

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  6. A Concept for Airborne Precision Spacing for Dependent Parallel Approaches

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay

    2012-01-01

    The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision and consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrival routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension of the Airborne Precision Spacing concept to enable dependent parallel approach operations, where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway and their spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed, as are procedures for pilots and controllers. An analysis is performed to identify the required information, and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.

  7. High Precision Isotope Analyses Using Multi-Collector SIMS: Applications to Earth and Planetary Science.

    NASA Astrophysics Data System (ADS)

    Kita, N. T.; Ushikubo, T.; Valley, J. W.

    2008-05-01

    The CAMECA IMS-1280 large-radius, multicollector ion microprobe at the Wisc-SIMS National Facility is capable of high accuracy and precision for in situ analysis of isotope ratios. With improved hardware stability and software capability, high precision isotope analyses are routinely performed, typically 5 min per spot. We have developed analytical protocols for stable isotope analyses of oxygen, carbon, magnesium, silicon and sulfur using multi-collector Faraday cups (MCFC) and achieved precisions of 0.1-0.2‰ (1SD) from typical 10 μm spot analyses. A number of isotopically homogeneous mineral standards have been prepared and calibrated in order to certify the accuracy of analyses at the same level. When spatial resolution is critical, spot size is reduced down to sub-μm for δ18O to obtain better than 0.5‰ (1SD) precision by using an electron multiplier (EM) on the multi-collection system. Multi-collection EM analysis is also applied at the 10 ppm level to Li isotope ratios in zircon with precision better than 2‰ (1SD). A few applications will be presented. (1) Oxygen three-isotope analyses of chondrules in ordinary chondrites revealed both mass-dependent and mass-independent oxygen isotope fractionations among chondrules as well as within individual chondrules. The results give constraints on the process of chondrule formation and the origin of isotope reservoirs in the early solar system. (2) High precision 26Al-26Mg (half-life of 0.73 Ma) chronology is applied to zoned melilite and anorthite from Ca,Al-rich inclusions (CAI) in the Leoville meteorite, and a well-defined internal isochron is obtained. The results indicate the Al-Mg system remained closed within 40 kyr of the crystallization of melilite and anorthite in this CAI. (3) Sub-μm spot analyses of δ18O in isotopically zoned zircon from high-grade metamorphism reveal a diffusion profile of ~6‰ over 2 μm, indicating slow diffusion of oxygen in zircon. This result also implies that old Archean detrital zircons (>4 Ga) might preserve their primary oxygen isotopic records, which allows us to trace the geological processes of the early Earth [1]. Lithium isotope analyses of pre-4 Ga zircon from Jack Hills show high Li abundance and low δ7Li, indicating the existence of highly weathered crustal material as early as 4.3 Ga. In conclusion, these new techniques allow us to study small natural variations of stable isotopes at the μm scale that permit exciting and fundamental research where samples are small, precious, or zoned. [1] Page FZ et al. (2007) Am Min 92, 1772-1775.
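
    The quoted per-mil precisions refer to delta values; a minimal sketch of the standard delta calculation and the 1SD spot-to-spot precision, using hypothetical 18O/16O ratios and VSMOW as the reference.

        import statistics

        R_VSMOW = 0.0020052   # 18O/16O of the VSMOW reference (commonly used value)

        # Hypothetical measured 18O/16O ratios from repeated ~10 um spots on one standard grain.
        ratios = [0.0020071, 0.0020073, 0.0020069, 0.0020074, 0.0020070]

        deltas = [(r / R_VSMOW - 1.0) * 1000.0 for r in ratios]   # delta18O in per mil
        mean_d = statistics.mean(deltas)
        sd_d = statistics.stdev(deltas)                           # 1SD spot-to-spot precision
        print(f"delta18O = {mean_d:.2f} permil, 1SD = {sd_d:.2f} permil")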

  8. PROSPECT - A precision oscillation and spectrum experiment

    NASA Astrophysics Data System (ADS)

    Langford, T. J.; PROSPECT Collaboration

    2015-08-01

    Segmented antineutrino detectors placed near a compact research reactor provide an excellent opportunity to probe short-baseline neutrino oscillations and precisely measure the reactor antineutrino spectrum. Close proximity to a reactor combined with minimal overburden yield a high background environment that must be managed through shielding and detector technology. PROSPECT is a new experimental effort to detect reactor antineutrinos from the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory, managed by UT Battelle for the U.S. Department of Energy. The detector will use novel lithium-loaded liquid scintillator capable of neutron/gamma pulse shape discrimination and neutron capture tagging. These enhancements improve the ability to identify neutrino inverse-beta decays (IBD) and reject background events in analysis. Results from these efforts will be covered along with their implications for an oscillation search and a precision spectrum measurement.

  9. [Possible application of X-ray and high resolution CT in pneumoconiosis management].

    PubMed

    Vlasov, V G; Laptev, V Ia; Logvinenko, I I; Smirnova, E L; Brovchenko, E P; Mironova, M V

    2011-01-01

    The article covers the results of an analysis of clinical and roentgenologic data. The data were obtained in a study that covered 447 pneumoconiosis patients, 75 of whom underwent high resolution CT. Compared with chest X-ray, high resolution CT allows a more precise prognosis of the further course of pneumoconiosis.

  10. [High Precision Identification of Igneous Rock Lithology by Laser Induced Breakdown Spectroscopy].

    PubMed

    Wang, Chao; Zhang, Wei-gang; Yan, Zhi-quan

    2015-09-01

    In the field of petroleum exploration, lithology identification of fine cuttings samples, and especially high precision identification of igneous rocks with similar properties, remains a difficult geological problem. In order to solve this problem, a new method is proposed based on elemental analysis by Laser-Induced Breakdown Spectroscopy (LIBS) combined with the Total Alkali versus Silica (TAS) diagram. Using an independent LIBS system, factors influencing the spectral signal, such as pulse energy, acquisition time delay, spectrum acquisition method and pre-ablation, were investigated systematically through comparative experiments. The best analysis conditions for igneous rock were determined: a pulse energy of 50 mJ, an acquisition time delay of 2 μs, and each result taken as the average over 20 different points on the sample surface; pre-ablation proved unsuitable for igneous rock samples. The repeatability of the spectral data was thereby improved effectively. Characteristic lines of 7 elements (Na, Mg, Al, Si, K, Ca, Fe) commonly used for lithology identification of igneous rock were determined, and igneous rock samples of different lithology were analyzed and compared. Calibration curves for Na, K and Si were generated using a national standard series of rock samples, and all linear correlation coefficients were greater than 0.9. The accuracy of the quantitative analysis was verified with national standard samples. The element content of igneous rock was analyzed quantitatively from the calibration curves, and its lithology was identified accurately by the TAS diagram method, with an accuracy rate of 90.7%. The study indicates that LIBS can effectively achieve high precision identification of the lithology of igneous rock.
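
    A minimal sketch of the two steps described: a linear calibration curve from standards followed by a coarse, silica-only TAS-style classification; the intensities, standards and class boundaries are simplified placeholders rather than the paper's values (a full TAS classification uses both silica and total alkali).

        import numpy as np

        # (i) Calibration: net line intensity vs. certified concentration for Si in
        # hypothetical rock standards (arbitrary units / wt%).
        intensity = np.array([120.0, 260.0, 410.0, 560.0, 700.0])
        conc_si   = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        slope, intercept = np.polyfit(intensity, conc_si, 1)
        r = np.corrcoef(intensity, conc_si)[0, 1]
        print(f"Si calibration: conc = {slope:.4f}*I + {intercept:.2f}, r = {r:.3f}")

        # (ii) Quantify an unknown and apply a coarse silica-only cut
        # (boundaries at 45/52/63 wt% SiO2; a full TAS plot also uses Na2O + K2O).
        sio2 = slope * 690.0 + intercept          # wt% SiO2 from a measured intensity
        total_alkali = 5.1                        # wt% Na2O + K2O from their own calibrations
        if sio2 < 45:
            lithology = "ultrabasic"
        elif sio2 < 52:
            lithology = "basic (e.g., basalt/gabbro)"
        elif sio2 < 63:
            lithology = "intermediate"
        else:
            lithology = "acidic (e.g., rhyolite/granite)"
        print(f"SiO2 = {sio2:.1f} wt%, Na2O+K2O = {total_alkali:.1f} wt% -> {lithology}")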

  11. Accuracy and precision of computer-assisted analysis of bone density via conventional and digital radiography in relation to dual-energy x-ray absorptiometry.

    PubMed

    Vaccaro, Calogero; Busetto, Roberto; Bernardini, Daniele; Anselmi, Carlo; Zotti, Alessandro

    2012-03-01

    To evaluate the precision and accuracy of assessing bone mineral density (BMD) by use of mean gray value (MGV) on digitalized and digital images of conventional and digital radiographs, respectively, of ex vivo bovine and equine bone specimens in relation to the gold-standard technique of dual-energy x-ray absorptiometry (DEXA). Left and right metatarsal bones from 11 beef cattle and right femurs from 2 horses. Bovine specimens were imaged by use of conventional radiography, whereas equine specimens were imaged by use of computed radiography (digital radiography). Each specimen was subsequently scanned by use of the same DEXA equipment. The BMD values resulting from each DEXA scan were paired with the MGVs obtained by use of software on the corresponding digitalized or digital radiographic image. The MGV analysis of digitalized and digital x-ray images was a precise (coefficient of variation, 0.1 and 0.09, respectively) and highly accurate method for assessing BMD, compared with DEXA (correlation coefficient, 0.910 and 0.937 for conventional and digital radiography, respectively). The high correlation between MGV and BMD indicated that MGV analysis may be a reliable alternative to DEXA in assessing radiographic bone density. This may provide a new, inexpensive, and readily available estimate of BMD.
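
    A minimal sketch of the two summary statistics reported: the coefficient of variation of repeated MGV readings (precision) and the correlation of MGV with DEXA-derived BMD (accuracy); all values are hypothetical, and statistics.correlation requires Python 3.10 or later.

        import statistics

        # Precision: repeated mean-gray-value readings of the same region of interest.
        repeated_mgv = [131.2, 130.8, 131.5, 130.9, 131.1]
        cv = statistics.stdev(repeated_mgv) / statistics.mean(repeated_mgv)
        print(f"coefficient of variation = {cv:.3f}")

        # Accuracy: correlation of MGV with DEXA-derived BMD across specimens (hypothetical pairs).
        mgv = [98.0, 112.0, 125.0, 131.0, 144.0, 158.0]
        bmd = [0.71, 0.84, 0.95, 1.01, 1.12, 1.24]      # g/cm^2
        r = statistics.correlation(mgv, bmd)            # Pearson r
        print(f"correlation coefficient r = {r:.3f}")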

  12. Thermo-mechanical performance of precision C/SiC mounts

    NASA Astrophysics Data System (ADS)

    Goodman, William A.; Mueller, Claus E.; Jacoby, Marc T.; Wells, Jim D.

    2001-12-01

    For complex shaped, lightweight, high precision opto-mechanical structures that must operate in adverse environments and over wide ranges of temperature, we consider IABG's optical grade silicon carbide composite ceramic (C/SiC) as the material of choice. C/SiC employs conventional NC machining/milling equipment to rapidly fabricate near-net shape parts, providing substantial schedule, cost, and risk savings for high precision components. Unlike powder based SiC ceramics, C/SiC does not experience significant shrinkage during processing, nor does it suffer from incomplete densification. If required, e.g. for large-size components, a fully-monolithic ceramic joining technique can be applied. Generally, the thermal and mechanical properties of C/SiC are tunable in certain ranges by modifying certain process steps. This paper focuses on the thermo-mechanical performance of new, high precision mounts designed by Schafer Corporation and manufactured by IABG. The mounts were manufactured using standard optical grade C/SiC (formulation internally called A-3). The A-3 formulation has a near-perfect CTE match with silicon, making it the ideal material to athermally support Schafer-produced Silicon Lightweight Mirrors (SLMs) that will operate in a cryogenic environment. Corresponding thermo-mechanical testing and analysis is presented in this manuscript.

  13. Progress toward accurate high spatial resolution actinide analysis by EPMA

    NASA Astrophysics Data System (ADS)

    Jercinovic, M. J.; Allaz, J. M.; Williams, M. L.

    2010-12-01

    High precision, high spatial resolution EPMA of actinides is a significant issue for geochronology, resource geochemistry, and studies involving the nuclear fuel cycle. Particular interest focuses on understanding the behavior of Th and U in the growth and breakdown reactions relevant to actinide-bearing phases (monazite, zircon, thorite, allanite, etc.), and geochemical fractionation processes involving Th and U in fluid interactions. Unfortunately, the measurement of minor and trace concentrations of U in the presence of major concentrations of Th and/or REEs is particularly problematic, especially in complexly zoned phases with large compositional variation on the micro or nanoscale - spatial resolutions now accessible with modern instruments. Sub-micron, high precision compositional analysis of minor components is feasible in very high Z phases where scattering is limited at lower kV (15 kV or less) and where the beam diameter can be kept below 400 nm at high current (e.g., 200-500 nA). High collection efficiency spectrometers and high performance electron optics in EPMA now allow the use of lower overvoltage through an exceptional range in beam current, facilitating higher spatial resolution quantitative analysis. The U LIII edge at 17.2 keV precludes L-series analysis at low kV (high spatial resolution), requiring careful measurements of the actinide M series. Also, U-La detection (wavelength = 0.9 Å) requires the use of LiF (220) or (420), not generally available on most instruments. Strong peak overlaps of Th on U make highly accurate interference correction mandatory, with problems compounded by the Th MIV and MV absorption edges affecting peak, background, and interference calibration measurements (especially the interference of the Th M line family on U Mβ). Complex REE-bearing phases such as monazite, zircon, and allanite have particularly complex interference issues due to multiple peak and background overlaps from elements present in the activation volume, as well as interferences from fluorescence at a distance from adjacent phases or distinct compositional domains in the same phase. Interference corrections for elements detected during boundary fluorescence are further complicated by X-ray focusing geometry considerations. Additional complications arise from the high current densities required for high spatial resolution and high count precision, such as fluctuations in internal charge distribution and peak shape changes as satellite production efficiency varies from calibration to analysis. No flawless method has yet emerged. Extreme care in interference corrections, especially where multiple and sometimes mutual overlaps are present, and maximum care (and precision) in background characterization to account for interferences and curvature (e.g., WDS scan or multipoint regression), are crucial developments. Calibration curves from multiple peak and interference calibration measurements at different concentrations, and iterative software methodologies for incorporating absorption edge effects, and non-linearities in interference corrections due to peak shape changes and off-axis X-ray defocussing during boundary fluorescence at a distance, are directions with significant potential.

  14. Homogeneous Characterization of Transiting Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Gomez Maqueo Chew, Yilen; Faedi, Francesca; Hebb, Leslie; Pollacco, Don; Stassun, Keivan; Ghezzi, Luan; Cargile, Phillip; Barros, Susana; Smalley, Barry; Mack, Claude

    2012-02-01

    We aim to obtain a homogeneous set of high resolution, high signal-to-noise (S/N) spectra for a large and diverse sample of stars with transiting planets, using the Kitt Peak 4-m echelle spectrograph for bright Northern targets (7.7150) in combination with high precision light curves shows an improvement in the precision of the stellar parameters of 60% in Teff, 75% in [Fe/H], 82% in M*, and 73% in R*, which translates into a 64% improvement in the precision of Rpl, and more than 2% in Mpl, relative to the discovery paper's values.

  15. High-throughput spectral and lifetime-based FRET screening in living cells to identify small-molecule effectors of SERCA

    PubMed Central

    Schaaf, Tory M.; Peterson, Kurt C.; Grant, Benjamin D.; Bawaskar, Prachi; Yuen, Samantha; Li, Ji; Muretta, Joseph M.; Gillispie, Gregory D.; Thomas, David D.

    2017-01-01

    A robust high-throughput screening (HTS) strategy has been developed to discover small-molecule effectors targeting the sarco/endoplasmic reticulum calcium ATPase (SERCA), based on a fluorescence microplate reader that records both the nanosecond decay waveform (lifetime mode) and the complete emission spectrum (spectral mode), with high precision and speed. This spectral unmixing plate reader (SUPR) was used to screen libraries of small molecules with a fluorescence resonance energy transfer (FRET) biosensor expressed in living cells. Ligand binding was detected by FRET associated with structural rearrangements of green (GFP, donor) and red (RFP, acceptor) fluorescent proteins fused to the cardiac-specific SERCA2a isoform. The results demonstrate accurate quantitation of FRET along with high precision of hit identification. Fluorescence lifetime analysis resolved SERCA’s distinct structural states, providing a method to classify small-molecule chemotypes on the basis of their structural effect on the target. The spectral analysis was also applied to flag interference by fluorescent compounds. FRET hits were further evaluated for functional effects on SERCA’s ATPase activity via both a coupled-enzyme assay and a FRET-based calcium sensor. Concentration-response curves indicated excellent correlation between FRET and function. These complementary spectral and lifetime FRET detection methods offer an attractive combination of precision, speed, and resolution for HTS. PMID:27899691
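
    A minimal sketch of the two readout modes mentioned: FRET efficiency from donor lifetimes, and a least-squares spectral unmixing into donor and acceptor components; the lifetimes and basis spectra are hypothetical, and this is not the SUPR instrument's algorithm.

        import numpy as np

        # Lifetime mode: FRET efficiency from the donor lifetime with (tau_da) and
        # without (tau_d) acceptor; values in nanoseconds are hypothetical.
        tau_d, tau_da = 2.6, 1.9
        E = 1.0 - tau_da / tau_d
        print(f"FRET efficiency from lifetimes: E = {E:.2f}")

        # Spectral mode: unmix a measured emission spectrum into hypothetical donor (GFP-like)
        # and acceptor (RFP-like) basis spectra by linear least squares.
        donor    = np.array([1.00, 0.80, 0.45, 0.20, 0.08, 0.03])
        acceptor = np.array([0.02, 0.05, 0.15, 0.60, 1.00, 0.70])
        measured = 0.7 * donor + 0.4 * acceptor + np.random.default_rng(0).normal(0, 0.01, 6)
        coeffs, *_ = np.linalg.lstsq(np.column_stack([donor, acceptor]), measured, rcond=None)
        print(f"unmixed donor/acceptor amplitudes: {coeffs[0]:.2f}, {coeffs[1]:.2f}")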

  16. Instantaneous Real-Time Kinematic Decimeter-Level Positioning with BeiDou Triple-Frequency Signals over Medium Baselines.

    PubMed

    He, Xiyang; Zhang, Xiaohong; Tang, Long; Liu, Wanke

    2015-12-22

    Many applications, such as marine navigation, land vehicle location, etc., require real-time precise positioning under medium or long baseline conditions. In this contribution, we develop a model of real-time kinematic decimeter-level positioning with BeiDou Navigation Satellite System (BDS) triple-frequency signals over medium distances. The ambiguities of two extra-wide-lane (EWL) combinations are fixed first, and then a wide-lane (WL) combination is reformed based on the two EWL combinations for positioning. Theoretical and empirical analyses of the ambiguity fixing rate and the positioning accuracy of the presented method are given. The results indicate that the ambiguity fixing rate can be up to more than 98% when using BDS medium baseline observations, which is much higher than that of the dual-frequency Hatch-Melbourne-Wübbena (HMW) method. As for positioning accuracy, decimeter-level accuracy can be achieved with this method, which is comparable to that of the carrier-smoothed code differential positioning method. A signal interruption simulation experiment indicates that the proposed method can realize fast high-precision positioning, whereas the carrier-smoothed code differential positioning method needs several hundred seconds to obtain high precision results. We conclude that a relatively high accuracy and high fixing rate can be achieved for the triple-frequency WL method with single-epoch observations, a significant advantage compared with the traditional carrier-smoothed code differential positioning method.
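
    A minimal sketch of the wavelengths behind the extra-wide-lane and wide-lane combinations; the BDS-2 B1I/B2I/B3I carrier frequencies are standard published values, while the integer coefficients shown are common choices and not necessarily those adopted in the paper.

        C = 299_792_458.0                      # speed of light, m/s

        # BDS-2 carrier frequencies (Hz): B1I, B2I, B3I
        f = {"B1": 1561.098e6, "B2": 1207.14e6, "B3": 1268.52e6}

        def combo_wavelength(i, j, k):
            """Wavelength of the linear carrier combination i*B1 + j*B2 + k*B3."""
            f_comb = i * f["B1"] + j * f["B2"] + k * f["B3"]
            return C / f_comb

        print(f"EWL (0,-1, 1): {combo_wavelength(0, -1, 1):6.3f} m")   # ~4.88 m, easiest to fix
        print(f"EWL (1, 0,-1): {combo_wavelength(1, 0, -1):6.3f} m")   # ~1.02 m
        print(f"WL  (1,-1, 0): {combo_wavelength(1, -1, 0):6.3f} m")   # ~0.85 m, used for positioning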

  17. [Precision Oncology and "Molecular Tumor Boards" - Concepts, Chances and Challenges].

    PubMed

    Holch, Julian Walter; Westphalen, Christoph Benedikt; Hiddemann, Wolfgang; Heinemann, Volker; Jung, Andreas; Metzeler, Klaus Hans

    2017-11-01

    Recent developments in genomics allow an increasingly comprehensive genetic analysis of human malignancies, and have sparked hopes that this will contribute to the development of novel targeted, effective and well-tolerated therapies. While targeted therapies have improved the prognosis of many cancer patients with certain tumor types, "precision oncology" also brings along new challenges. Highly personalized treatment strategies require new strategies for clinical trials and translation into routine clinical practice. We review the current technical approaches for "universal genetic testing" in cancer, and potential pitfalls in the interpretation of such data. We then provide an overview of the available evidence supporting treatment strategies based on extended genetic analysis. Based on the available data, we conclude that "precision oncology" approaches that go beyond the current standard of care should be pursued within the framework of an interdisciplinary "molecular tumor board", and preferably within clinical trials. © Georg Thieme Verlag KG Stuttgart · New York.

  18. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    PubMed

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2017-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, collections of large study cohorts, and the developments of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, the petabytes of biomedical data generated by multiple measurement modalities pose a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  19. Yale High Energy Physics Research: Precision Studies of Reactor Antineutrinos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heeger, Karsten M.

    2014-09-13

    This report presents experimental research at the intensity frontier of particle physics with particular focus on the study of reactor antineutrinos and the precision measurement of neutrino oscillations. The experimental neutrino physics group of Professor Heeger and Senior Scientist Band at Yale University has had leading responsibilities in the construction and operation of the Daya Bay Reactor Antineutrino Experiment and made critical contributions to the discovery of non-zero θ13. Heeger and Band led the Daya Bay detector management team and are now overseeing the operations of the antineutrino detectors. Postdoctoral researchers and students in this group have made leading contributions to the Daya Bay analysis including the prediction of the reactor antineutrino flux and spectrum, the analysis of the oscillation signal, and the precision determination of the target mass yielding unprecedented precision in the relative detector uncertainty. Heeger's group is now leading an R&D effort towards a short-baseline oscillation experiment, called PROSPECT, at a US research reactor and the development of antineutrino detectors with advanced background discrimination.

  20. Estimation of L-dopa from Mucuna pruriens LINN and formulations containing M. pruriens by HPTLC method.

    PubMed

    Modi, Ketan Pravinbhai; Patel, Natvarlal Manilal; Goyal, Ramesh Kishorilal

    2008-03-01

    A selective, precise, and accurate high-performance thin-layer chromatographic (HPTLC) method has been developed for the analysis of L-dopa in Mucuna pruriens seed extract and its formulations. The method involves densitometric evaluation of L-dopa after resolving it by HPTLC on silica gel plates with n-butanol-acetic acid-water (4.0+1.0+1.0, v/v) as the mobile phase. Densitometric analysis of L-dopa was carried out in the absorbance mode at 280 nm. The relationship between the concentration of L-dopa and corresponding peak areas was found to be linear in the range of 100 to 1200 ng/spot. The method was validated for precision (inter and intraday), repeatability, and accuracy. Mean recovery was 100.30%. The relative standard deviation (RSD) values of the precision were found to be in the range 0.64-1.52%. In conclusion, the proposed TLC method was found to be precise, specific and accurate and can be used for identification and quantitative determination of L-dopa in herbal extract and its formulations.

  1. High precision gas hydrate imaging of small-scale and high-resolution marine sparker multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Luo, D.; Cai, F.

    2017-12-01

    Small-scale, high-resolution marine sparker multi-channel seismic surveys using large-energy sparkers are characterized by a high dominant frequency of the seismic source, wide bandwidth, and high resolution. The technology was designed to improve the imaging quality of shallow sediments with high resolution and high detection precision. In this study, a 20 kJ sparker and a 24-channel streamer cable with a 6.25 m group interval were used as the seismic source and receiver system, respectively. Key factors for seismic imaging of gas hydrate are enhancement of the S/N ratio, amplitude compensation and detailed velocity analysis. However, the data in this study have several limiting characteristics: (1) small maximum offsets, which are adverse to velocity analysis and multiple attenuation; (2) a lack of low-frequency information, i.e., components below 100 Hz are invisible; and (3) a low S/N ratio owing to low fold (only 12). These characteristics make it difficult to reach the targets of seismic imaging. In this study, targeted processing methods are used to improve the seismic imaging quality of gas hydrate. First, several noise-suppression technologies, including a spectrum-sharing noise elimination method, median filtering and an exogenous-interference suppression method, are applied in combination to the pre-stack seismic data to suppress noise and improve the S/N ratio. Second, a combination of three technologies, SRME, τ-p deconvolution and high precision Radon transformation, is used to remove multiples. Third, an accurate velocity field is used in amplitude energy compensation to highlight the Bottom Simulating Reflector (BSR, the indicator of gas hydrate) and gas migration pathways (such as gas chimneys and hot spots). Fourth, fine velocity analysis is used to improve the accuracy of the velocity field. Fifth, pre-stack deconvolution is used to compensate for low-frequency energy and suppress the ghost, so that formation reflection characteristics are highlighted. The result shows that small-scale, high-resolution marine sparker multi-channel seismic surveys are much more effective in improving the resolution and quality of gas hydrate imaging than conventional seismic acquisition technology.
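
    As a small illustration of why the short streamer hampers velocity analysis, the sketch below evaluates the hyperbolic normal-moveout relation t(x) = sqrt(t0^2 + (x/v)^2) at roughly the maximum offset of this geometry for two candidate velocities; the sub-millisecond difference in moveout is what makes velocity discrimination difficult. Numbers are illustrative only.

        import math

        def nmo_time(t0, x, v):
            """Hyperbolic normal moveout: two-way time at offset x for rms velocity v."""
            return math.sqrt(t0 ** 2 + (x / v) ** 2)

        t0 = 2.0                        # s, two-way time of a reflector near the BSR
        x_max = 24 * 6.25               # m, roughly the maximum offset of the 24-channel streamer
        for v in (1500.0, 1700.0):      # m/s, two candidate rms velocities
            dt = nmo_time(t0, x_max, v) - t0
            print(f"v = {v:.0f} m/s: moveout at {x_max:.0f} m offset = {dt*1000:.2f} ms")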

  2. Segmentation of the Knee for Analysis of Osteoarthritis

    NASA Astrophysics Data System (ADS)

    Zerfass, Peter; Museyko, Oleg; Bousson, Valérie; Laredo, Jean-Denis; Kalender, Willi A.; Engelke, Klaus

    Osteoarthritis changes the load distribution within joints and also changes bone density and structure. Within the typical timelines of clinical studies these changes can be very small. Therefore, precise definition of evaluation regions that are highly robust and show little to no inter- and intra-operator variance is essential for high quality quantitative analysis. To achieve this goal we have developed a system for the definition of such regions with minimal user input.

  3. Do We Know Who Will Drop out?: A Review of the Predictors of Dropping out of High School--Precision, Sensitivity, and Specificity

    ERIC Educational Resources Information Center

    Bowers, Alex J.; Sprott, Ryan; Taff, Sherry A.

    2013-01-01

    The purpose of this study is to review the literature on the most accurate indicators of students at risk of dropping out of high school. We used Relative Operating Characteristic (ROC) analysis to compare the sensitivity and specificity of 110 dropout flags across 36 studies. Our results indicate that 1) ROC analysis provides a means to compare…
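
    A minimal sketch of the sensitivity, specificity and precision calculations that underlie such a ROC comparison of dropout flags; the confusion-matrix counts are invented for illustration.

        # Hypothetical confusion-matrix counts for one dropout flag (numbers are invented).
        tp, fn = 62, 38     # flagged / not flagged among students who later dropped out
        fp, tn = 140, 760   # flagged / not flagged among students who graduated

        sensitivity = tp / (tp + fn)          # true-positive rate
        specificity = tn / (tn + fp)          # true-negative rate
        precision   = tp / (tp + fp)          # share of flagged students who actually drop out

        print(f"sensitivity = {sensitivity:.2f}")
        print(f"specificity = {specificity:.2f}")
        print(f"precision   = {precision:.2f}")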

  4. A Study of Particle Beam Spin Dynamics for High Precision Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiedler, Andrew J.

    In the search for physics beyond the Standard Model, high precision experiments to measure fundamental properties of particles are an important frontier. One group of such measurements involves magnetic dipole moment (MDM) values as well as searching for an electric dipole moment (EDM), both of which could provide insights about how particles interact with their environment at the quantum level and if there are undiscovered new particles. For these types of high precision experiments, minimizing statistical uncertainties in the measurements plays a critical role. This work leverages computer simulations to quantify the effects of statistical uncertainty for experiments investigating spin dynamics. In it, analysis of beam properties and lattice design effects on the polarization of the beam is performed. As a case study, the beam lines that will provide polarized muon beams to the Fermilab Muon g-2 experiment are analyzed to determine the effects of correlations between the phase space variables and the overall polarization of the muon beam.

  5. Parallel algorithm for solving Kepler’s equation on Graphics Processing Units: Application to analysis of Doppler exoplanet searches

    NASA Astrophysics Data System (ADS)

    Ford, Eric B.

    2009-05-01

    We present the results of a highly parallel Kepler equation solver using the Graphics Processing Unit (GPU) on a commercial nVidia GeForce 280GTX and the "Compute Unified Device Architecture" (CUDA) programming environment. We apply this to evaluate a goodness-of-fit statistic (e.g., χ2) for Doppler observations of stars potentially harboring multiple planetary companions (assuming negligible planet-planet interactions). Given the high dimensionality of the model parameter space (at least five dimensions per planet), a global search is extremely computationally demanding. We expect that the underlying Kepler solver and model evaluator will be combined with a wide variety of more sophisticated algorithms to provide efficient global search, parameter estimation, model comparison, and adaptive experimental design for radial velocity and/or astrometric planet searches. We tested multiple implementations using single precision, double precision, pairs of single precision, and mixed precision arithmetic. We find that the vast majority of computations can be performed using single precision arithmetic, with selective use of compensated summation for increased precision. However, standard single precision is not adequate for calculating the mean anomaly from the time of observation and orbital period when evaluating the goodness-of-fit for real planetary systems and observational data sets. Using all double precision, our GPU code outperforms a similar code using a modern CPU by a factor of over 60. Using mixed precision, our GPU code provides a speed-up factor of over 600, when evaluating nsys > 1024 model planetary systems, each containing npl = 4 planets and assuming nobs = 256 observations of each system. We conclude that modern GPUs also offer a powerful tool for repeatedly evaluating Kepler's equation and a goodness-of-fit statistic for orbital models when presented with a large parameter space.
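
    A minimal CPU-side sketch (not the paper's CUDA kernels) of a Newton iteration for Kepler's equation M = E - e*sin(E), run in float32 and float64 with NumPy to illustrate the precision trade-off discussed above.

        import numpy as np

        def solve_kepler(M, e, dtype, iters=10):
            """Newton iteration for the eccentric anomaly E in M = E - e*sin(E)."""
            M = np.asarray(M, dtype=dtype)
            e = dtype(e)
            E = M.copy()                     # initial guess E0 = M
            for _ in range(iters):
                E = E - (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
            return E

        M = np.linspace(0.1, 6.2, 5)         # mean anomalies, radians
        e = 0.3                              # orbital eccentricity
        E32 = solve_kepler(M, e, np.float32)
        E64 = solve_kepler(M, e, np.float64)
        print("max |E32 - E64| =", np.max(np.abs(E32.astype(np.float64) - E64)))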

  6. High-pressure liquid chromatography analysis of antibiotic susceptibility disks.

    PubMed Central

    Hagel, R B; Waysek, E H; Cort, W M

    1979-01-01

    The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations with recoveries of greater than 95%. Relative standard deviations of lower than 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793

  7. Processing of high-precision ceramic balls with a spiral V-groove plate

    NASA Astrophysics Data System (ADS)

    Feng, Ming; Wu, Yongbo; Yuan, Julong; Ping, Zhao

    2017-03-01

    As the demand for high-performance bearings gradually increases, ceramic balls with excellent properties, such as high accuracy, high reliability, and high chemical durability, are extensively used for high-performance bearings. In this study, a spiral V-groove plate method is employed in processing high-precision ceramic balls. After kinematic analysis of the ball-spin angle and the enveloped lapping trajectories, an experimental rig is constructed and experiments are conducted to confirm the feasibility of this method. Kinematic analysis results indicate that the method not only allows for control of the ball-spin angle but also uniformly distributes the enveloped lapping trajectories over the entire ball surface. Experimental results demonstrate that the novel spiral V-groove plate method performs better than the conventional concentric V-groove plate method in terms of roundness, surface roughness, diameter difference, and diameter decrease rate. Ceramic balls with G3-level accuracy are achieved, and their typical roundness, minimum surface roughness, and diameter difference are 0.05, 0.0045, and 0.105 μm, respectively. These findings confirm that the proposed method can be applied to high-accuracy and high-consistency ceramic ball processing.

  8. Simple, fast and reliable liquid chromatographic and spectrophotometric methods for the determination of theophylline in urine, saliva and plasma samples.

    PubMed

    Charehsaz, Mohammad; Gürbay, Aylin; Aydin, Ahmet; Sahin, Gönül

    2014-01-01

    In this study, a high-performance liquid chromatographic (HPLC) method and a UV spectrophotometric method were developed, validated and applied for the determination of theophylline in biological fluids. Liquid-liquid extraction is performed for isolation of the drug and elimination of plasma and saliva interferences. Urine samples were applied without any extraction. In the HPLC method, chromatographic separation was achieved on a C18 column using 60:40 methanol:water as mobile phase under isocratic conditions at a flow rate of 0.75 mL/min with UV detection at 280 nm. UV spectrophotometric analysis was performed at 275 nm. For the HPLC method, the limit of quantification was 1.1 µg/mL for urine, 1.9 µg/mL for saliva and 3.1 µg/mL for plasma; recovery was 94.85% for plasma, 100.45% for saliva and 101.39% for urine; intra-day precision was 0.22-2.33% and inter-day precision 3.17-13.12%. Spectrophotometric analysis results were as follows: limit of quantitation, 5.23 µg/mL for plasma and 8.7 µg/mL for urine; recovery, 98.27% for plasma and 95.25% for urine; intra-day precision, 2.37-3.00%; inter-day precision, 5.43-7.91%. It can be concluded that this validated HPLC method is easy, precise, accurate, sensitive and selective for the determination of theophylline in biological samples. The spectrophotometric analysis can also be used where applicable.

  9. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  10. Underresolved absorption spectroscopy of OH radicals in flames using broadband UV LEDs

    NASA Astrophysics Data System (ADS)

    White, Logan; Gamba, Mirko

    2018-04-01

    A broadband absorption spectroscopy diagnostic based on underresolution of the spectral absorption lines is evaluated for the inference of species mole fraction and temperature in combustion systems from spectral fitting. The approach uses spectrally broadband UV light emitting diodes and leverages low resolution, small form factor spectrometers. Through this combination, the method can be used to develop high precision measurement sensors. The challenges of underresolved spectroscopy are explored and addressed using spectral derivative fitting, which is found to generate measurements with high precision and accuracy. The diagnostic is demonstrated with experimental measurements of gas temperature and OH mole fraction in atmospheric air/methane premixed laminar flat flames. Measurements exhibit high precision, good agreement with 1-D flame simulations, and high repeatability. A newly developed model of uncertainty in underresolved spectroscopy is applied to estimate two-dimensional confidence regions for the measurements. The results of the uncertainty analysis indicate that the errors in the outputs of the spectral fitting procedure are correlated. The implications of the correlation between uncertainties for measurement interpretation are discussed.

  11. A new high-precision borehole-temperature logging system used at GISP2, Greenland, and Taylor Dome, Antarctica

    USGS Publications Warehouse

    Clow, G.D.; Saltus, R.W.; Waddington, E.D.

    1996-01-01

    We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.

  12. Ion microscopy with resonant ionization mass spectrometry : time-of-flight depth profiling with improved isotopic precision.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pellin, M. J.; Veryovkin, I. V.; Levine, J.

    2010-01-01

    There are four generally mutually exclusive requirements that plague many mass spectrometric measurements of trace constituents: (1) the small size (limited by the depth probed) of many interesting materials requires high useful yields to simply detect some trace elements, (2) the low concentrations of interesting elements require efficient discrimination from isobaric interferences, (3) it is often necessary to measure the depth distribution of elements with high surface and low bulk contributions, and (4) many applications require precise isotopic analysis. Resonant ionization mass spectrometry has made dramatic progress in addressing these difficulties over the past five years.

  13. Frequency scanning interferometry in ATLAS: remote, multiple, simultaneous and precise distance measurements in a hostile environment

    NASA Astrophysics Data System (ADS)

    Coe, P. A.; Howell, D. F.; Nickerson, R. B.

    2004-11-01

    ATLAS is the largest particle detector under construction at CERN Geneva. Frequency scanning interferometry (FSI), also known as absolute distance interferometry, will be used to monitor shape changes of the SCT (semiconductor tracker), a particle tracker in the inaccessible, high radiation environment at the centre of ATLAS. Geodetic grids with several hundred fibre-coupled interferometers (30 mm to 1.5 m long) will be measured simultaneously. These lengths will be measured by tuning two lasers and comparing the resulting phase shifts in grid line interferometers (GLIs) with phase shifts in a reference interferometer. The novel inexpensive GLI design uses diverging beams to reduce sensitivity to misalignment, albeit with weaker signals. One micrometre precision length measurements of grid lines will allow 10 µm precision tracker shape corrections to be fed into ATLAS particle tracking analysis. The technique was demonstrated by measuring a 400 mm interferometer to better than 400 nm and a 1195 mm interferometer to better than 250 nm. Precise measurements were possible, even with poor quality signals, using numerical analysis of thousands of intensity samples. Errors due to drifts in interferometer length were substantially reduced using two lasers tuned in opposite directions and the precision was further improved by linking measurements made at widely separated laser frequencies.
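
    A minimal sketch of the core FSI relation: during a frequency sweep each interferometer's phase changes by 2*pi*L*delta_nu/c, so an unknown grid-line length follows from the ratio of its phase change to that of the reference interferometer. The lengths and sweep range are hypothetical, and refractive-index and drift corrections (handled in practice by the dual opposite-direction sweeps and averaging of thousands of intensity samples) are ignored here.

        import math

        C = 299_792_458.0                  # speed of light in vacuum, m/s

        delta_nu = 60.0e9                  # Hz, optical frequency tuning range of the sweep
        L_ref = 0.500000                   # m, known reference interferometer length

        # Phase change of an interferometer of length L during the sweep: dphi = 2*pi*L*dnu/c.
        L_true = 0.400000                  # m, the grid line we pretend to measure
        dphi_ref = 2 * math.pi * L_ref * delta_nu / C
        dphi_gli = 2 * math.pi * L_true * delta_nu / C

        # FSI length estimate from the ratio of phase changes.
        L_est = L_ref * dphi_gli / dphi_ref
        print(f"estimated grid-line length = {L_est*1000:.3f} mm")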

  14. Demonstration of a Fast, Precise Propane Measurement Using Infrared Spectroscopy

    NASA Astrophysics Data System (ADS)

    Zahniser, M. S.; Roscioli, J. R.; Nelson, D. D.; Herndon, S. C.

    2016-12-01

    Propane is one of the primary components of emissions from natural gas extraction and processing activities. In addition to being an air pollutant, its ratio to other hydrocarbons such as methane and ethane can serve as a "fingerprint" of a particular facility or process, aiding in identifying emission sources. Quantifying propane has typically required laboratory analysis of flask samples, resulting in low temporal resolution and making plume-based measurements infeasible. Here we demonstrate fast (1-second), high precision (<300 ppt) measurements of propane using high resolution mid-infrared spectroscopy at 2967 wavenumbers. In addition, we explore the impact of nearby water and ethane absorption lines on the accuracy and precision of the propane measurement. Finally, we discuss development of a dual-laser instrument capable of simultaneous measurements of methane, ethane, and propane (the C1-C3 compounds), all within a small spatial package that can be easily deployed aboard a mobile platform.

  15. The importance of precision radar tracking data for the determination of density and winds from the high-altitude inflatable sphere

    NASA Technical Reports Server (NTRS)

    Schmidlin, F. J.; Michel, W. R.

    1985-01-01

    Analysis of inflatable sphere measurements obtained during the Energy Budget and MAP/WINE campaigns led to questions concerning the precision of the MPS-36 radar used for tracking the spheres; the compatibility of the sphere program with the MPS-36 radar tracking data; and the oversmoothing of derived parameters at high altitudes. Simulations, with winds having sinusoidal vertical wavelengths, were done with the sphere program (HIROBIN) to determine the resolving capability of various filters. It is concluded that given a precision radar and a perfectly performing sphere, the HIROBIN filters can be adjusted to provide small-scale perturbation information to 70 km (i.e., sinusoidal wavelengths of 2 km). It is recommended that the HIROBIN program be modified to enable it to use a variable length filter, that adjusts to fall velocity and accelerations to provide wind data with small perturbations.

  16. Proposal for the determination of nuclear masses by high-precision spectroscopy of Rydberg states

    NASA Astrophysics Data System (ADS)

    Wundt, B. J.; Jentschura, U. D.

    2010-06-01

    The theoretical treatment of Rydberg states in one-electron ions is facilitated by the virtual absence of the nuclear-size correction, and fundamental constants like the Rydberg constant may be in the reach of planned high-precision spectroscopic experiments. The dominant nuclear effect that shifts transition energies among Rydberg states therefore is due to the nuclear mass. As a consequence, spectroscopic measurements of Rydberg transitions can be used in order to precisely deduce nuclear masses. A possible application of this approach to hydrogen and deuterium, and hydrogen-like lithium and carbon is explored in detail. In order to complete the analysis, numerical and analytic calculations of the quantum electrodynamic self-energy remainder function are described for states with principal quantum number n = 5, ..., 8 and with angular momentum ℓ = n - 1 and ℓ = n - 2 (total angular momentum j = ℓ ± 1/2).

  17. The Design & Development of the Ocean Color Instrument Precision Superduplex Hybrid Bearing Cartridge

    NASA Technical Reports Server (NTRS)

    Schepis, Joseph; Woodard, Timothy; Hakun, Claef; Bergandy, Konrad; Church, Joseph; Ward, Peter; Lee, Michael; Conti, Alfred; Guzek, Jeffrey

    2018-01-01

    A high precision, high-resolution Ocean Color Imaging (OCI) instrument is under development for the Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission, which requires a pair of medium-speed mechanisms to scan the ocean surface continuously. The design of the rotating telescope (RT) mechanism operating at 360 RPM and the half-angle mirror (HAM) mechanism synchronized at 180 RPM raised concerns about maintaining pointing precision over the required life and continuous operation. An effort was undertaken with the manufacturer to design and analyze a special bearing configuration to minimize axial and radial runout, minimize torque, maintain nominal contact stresses and stiffness over the operating temperature range, and maximize life. The bearing design, development effort, analysis and testing will be discussed, as will the technical challenges that this specific design imposed upon the mechanism engineers. Bearing performance, runout as achieved and verified during encoder installation, and operating torque will be described.

  18. A high-precision velocity measuring system design for projectiles based on S-shaped laser screen

    NASA Astrophysics Data System (ADS)

    Liu, Huayi; Qian, Zheng; Yu, Hao; Li, Yutao

    2018-03-01

    High-precision measurement of the velocity of a high-speed flying projectile is of great significance for the evaluation and development of modern weapons. The velocity of a high-speed flying projectile is usually measured by a laser screen velocity measuring system, but this method cannot provide repeated measurements, so an in-depth evaluation of the uncertainty of the measuring system is not possible. This paper presents a velocity measuring system based on an S-shaped laser screen. This design can achieve repeated measurements and therefore effectively reduce the uncertainty of the velocity measurement. In addition, we made a detailed analysis of the uncertainty of the measuring system. The measurement uncertainty is 0.2% when the velocity of the projectile is about 200 m/s.
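
    A minimal sketch of the basic velocity calculation and of how averaging over the repeated intervals provided by the S-shaped screen reduces the random part of the relative uncertainty; the spacing, timing jitter and interval count are illustrative, not the paper's values.

        import math

        d = 2.000            # m, spacing between successive light screens (illustrative)
        sigma_d = 0.5e-3     # m, uncertainty of the spacing
        sigma_t = 10.0e-6    # s, timing uncertainty of one crossing interval
        t = 0.010            # s, measured flight time over d  ->  v ~ 200 m/s

        v = d / t
        rel_single = math.sqrt((sigma_d / d) ** 2 + (sigma_t / t) ** 2)
        print(f"single interval: v = {v:.1f} m/s, relative uncertainty = {rel_single*100:.3f}%")

        # The S-shaped screen yields n independent intervals along the trajectory; averaging
        # the n velocity estimates reduces the random timing part roughly as 1/sqrt(n).
        n = 4
        rel_avg = math.sqrt((sigma_d / d) ** 2 + (sigma_t / t) ** 2 / n)
        print(f"{n} repeated intervals: relative uncertainty ~ {rel_avg*100:.3f}%")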

  19. High-precision shape representation using a neuromorphic vision sensor with synchronous address-event communication interface

    NASA Astrophysics Data System (ADS)

    Belbachir, A. N.; Hofstätter, M.; Litzenberger, M.; Schön, P.

    2009-10-01

    A synchronous communication interface for neuromorphic temporal contrast vision sensors is described and evaluated in this paper. This interface has been designed for ultra-high-speed synchronous arbitration of a temporal contrast image sensor's pixel data. Enabling high-precision timestamping, this system demonstrates its unique suitability for handling peak data rates while preserving the main advantage of neuromorphic electronic systems, namely high and accurate temporal resolution. Based on a synchronous arbitration concept, the timestamping has a resolution of 100 ns. Both synchronous and (state-of-the-art) asynchronous arbiters have been implemented in a neuromorphic dual-line vision sensor chip in a standard 0.35 µm CMOS process. The performance analysis of both arbiters and the advantages of synchronous arbitration over asynchronous arbitration in capturing high-speed objects are discussed in detail.

  20. Design of c-band telecontrol transmitter local oscillator for UAV data link

    NASA Astrophysics Data System (ADS)

    Cao, Hui; Qu, Yu; Song, Zuxun

    2018-01-01

    A C-band local oscillator for an Unmanned Aerial Vehicle (UAV) data link radio frequency (RF) transmitter unit, with high stability, high precision and low weight, was designed in this paper. Based on the highly integrated broadband phase-locked loop (PLL) chip HMC834LP6GE, the system performs fractional-N control by programming its internal modules to achieve low phase noise and fine frequency resolution. Simulation and testing were combined to optimize and select the loop filter parameters, ensuring high precision and stability of the synthesized output frequency. Theoretical analysis and measurements on an engineering prototype showed that the local oscillator has a stable output frequency, accurate frequency steps, high spurious suppression and low phase noise, and meets the design requirements. The proposed design idea and research method have guiding significance for engineering practice.
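
    A minimal sketch of the generic fractional-N frequency relation and step size; the reference frequency and the 24-bit fractional modulus are assumptions for illustration and are not taken from the HMC834LP6GE datasheet.

        f_ref = 50.0e6            # Hz, assumed phase-detector (reference) frequency
        MOD = 2 ** 24             # assumed fractional modulus (24-bit accumulator)

        def fractional_n_output(n_int, n_frac):
            """Generic fractional-N relation: f_out = f_ref * (N_int + N_frac / MOD)."""
            return f_ref * (n_int + n_frac / MOD)

        f_target = 5.12e9                             # Hz, a C-band target frequency
        n_total = f_target / f_ref
        n_int, n_frac = int(n_total), round((n_total % 1) * MOD)
        print(f"N = {n_int} + {n_frac}/{MOD}  ->  f_out = {fractional_n_output(n_int, n_frac)/1e9:.6f} GHz")
        print(f"frequency resolution ~ {f_ref / MOD:.2f} Hz")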

  1. Dimensional Precision Research of Wax Molding Rapid Prototyping based on Droplet Injection

    NASA Astrophysics Data System (ADS)

    Mingji, Huang; Geng, Wu; Yan, Shan

    2017-11-01

    The traditional casting process is complex; the mold is an essential product, and mold quality directly affects the quality of the cast part. Rapid prototyping by 3D printing was used to produce the mold prototype. The wax model has the advantages of high speed, low cost and the ability to form complex structures. Using orthogonal experiments as the main method, each factor affecting dimensional precision was analyzed. The purpose is to obtain the optimal process parameters and to improve the dimensional accuracy of production based on droplet injection molding.

  2. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved with an increasing number of laser pulses accumulated per spectrum as well as with truncation of the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
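
    A minimal sketch of this kind of PLSR calibration workflow, assuming scikit-learn and random placeholder spectra rather than the authors' LIBS data; the per-spectrum "NORM"-style scaling and the range of latent variables tried are illustrative choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)
X = rng.random((60, 500))           # placeholder spectra (rows = accumulations)
y = rng.random(60)                  # placeholder reference concentrations

X_norm = normalize(X, norm="l2")    # "NORM"-style pretreatment of each spectrum

# Choose the number of latent variables by cross-validated prediction error.
errors = []
for n in range(1, 16):
    pls = PLSRegression(n_components=n)
    score = cross_val_score(pls, X_norm, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    errors.append((n, -score))

best_n, best_rmse = min(errors, key=lambda t: t[1])
print(f"optimal components: {best_n}, CV RMSE: {best_rmse:.3f}")
```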

  3. IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.

    PubMed

    Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis

    2018-04-01

    Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
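
    The server-side check such a system performs can be sketched as below; this is a hypothetical illustration of turning balance readings into a dispense-volume alert (the target volume, fluid density, and tolerance are assumed values), not the authors' Arduino firmware or software.

```python
def check_dispense(weight_before_g, weight_after_g,
                   target_volume_uL, density_g_per_mL=0.998, tolerance=0.05):
    """Return (measured_volume_uL, ok) for one plate dispense."""
    dispensed_g = weight_after_g - weight_before_g
    measured_uL = dispensed_g / density_g_per_mL * 1000.0     # grams -> microlitres
    deviation = abs(measured_uL - target_volume_uL) / target_volume_uL
    return measured_uL, deviation <= tolerance

# e.g. a 384-well plate filled at a nominal 5 uL per well (1920 uL total)
vol, ok = check_dispense(85.000, 86.918, target_volume_uL=1920.0)
print(f"measured {vol:.1f} uL, within tolerance: {ok}")
```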

  4. Design and analysis of a 3D Elliptical Micro-Displacement Motion Stage

    NASA Astrophysics Data System (ADS)

    Lin, Jieqiong; Zhao, Dongpo; Lu, Mingming; Zhou, Jiakang

    2017-12-01

    Micro-displacement motion stages driven by piezoelectric actuators have seen significant demand in the field of ultra-precision machining in recent years, and the design of the micro-displacement motion stage plays an important role in realizing large displacement output and high-precision control. Thus, a 3D elliptical micro-displacement motion stage driven by three PZT actuators has been developed. Firstly, the 3D elliptical trajectory of this motion stage can be adjusted through the form of the PZT actuators' input signals. Then, the desired trajectory was obtained by adjusting the micro-displacement of the motion stage in 3D elliptical space. Finally, trajectory simulation and finite element simulation were applied to this motion stage. The experimental results showed that the output displacements in the three directions under an input force of 1600 N were 14 μm, 16 μm and 74 μm, respectively, and that the first three modes were 1471.6 Hz, 2698.4 Hz and 2803.4 Hz, respectively. Analysis and experiments were carried out to verify the performance; the results proved that large output displacement and high-precision control could be obtained.

  5. Single-Cell Sequencing for Precise Cancer Research: Progress and Prospects.

    PubMed

    Zhang, Xiaoyan; Marjani, Sadie L; Hu, Zhaoyang; Weissman, Sherman M; Pan, Xinghua; Wu, Shixiu

    2016-03-15

    Advances in genomic technology have enabled the faithful detection and measurement of mutations and the gene expression profile of cancer cells at the single-cell level. Recently, several single-cell sequencing methods have been developed that permit the comprehensive and precise analysis of the cancer-cell genome, transcriptome, and epigenome. The use of these methods to analyze cancer cells has led to a series of unanticipated discoveries, such as the high heterogeneity and stochastic changes in cancer-cell populations, the new driver mutations and the complicated clonal evolution mechanisms, and the novel identification of biomarkers of variant tumors. These methods and the knowledge gained from their utilization could potentially improve the early detection and monitoring of rare cancer cells, such as circulating tumor cells and disseminated tumor cells, and promote the development of personalized and highly precise cancer therapy. Here, we discuss the current methods for single cancer-cell sequencing, with a strong focus on those practically used or potentially valuable in cancer research, including single-cell isolation, whole genome and transcriptome amplification, epigenome profiling, multi-dimensional sequencing, and next-generation sequencing and analysis. We also examine the current applications, challenges, and prospects of single cancer-cell sequencing. ©2016 American Association for Cancer Research.

  6. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  7. High resolution melting analysis: rapid and precise characterisation of recombinant influenza A genomes

    PubMed Central

    2013-01-01

    Background: High resolution melting analysis (HRM) is a rapid and cost-effective technique for the characterisation of PCR amplicons. Because the reverse genetics of segmented influenza A viruses allows the generation of numerous influenza A virus reassortants within a short time, methods for the rapid selection of the correct recombinants are very useful. Methods: PCR primer pairs covering the single nucleotide polymorphism (SNP) positions of two different influenza A H5N1 strains were designed. Reassortants of the two different H5N1 isolates were used as a model to prove the suitability of HRM for the selection of the correct recombinants. Furthermore, two different cycler instruments were compared. Results: Both cycler instruments generated comparable average melting peaks, which allowed the easy identification and selection of the correct cloned segments or reassorted viruses. Conclusions: HRM is a highly suitable method for the rapid and precise characterisation of cloned influenza A genomes. PMID:24028349

  8. High Precision Prediction of Functional Sites in Protein Structures

    PubMed Central

    Buturovic, Ljubomir; Wong, Mike; Tang, Grace W.; Altman, Russ B.; Petkovic, Dragutin

    2014-01-01

    We address the problem of assigning biological function to solved protein structures. Computational tools play a critical role in identifying potential active sites and informing screening decisions for further lab analysis. A critical parameter in the practical application of computational methods is the precision, or positive predictive value. Precision measures the level of confidence the user should have in a particular computed functional assignment. Low precision annotations lead to futile laboratory investigations and waste scarce research resources. In this paper we describe an advanced version of the protein function annotation system FEATURE, which achieved 99% precision and average recall of 95% across 20 representative functional sites. The system uses a Support Vector Machine classifier operating on the microenvironment of physicochemical features around an amino acid. We also compared performance of our method with state-of-the-art sequence-level annotator Pfam in terms of precision, recall and localization. To our knowledge, no other functional site annotator has been rigorously evaluated against these key criteria. The software and predictive models are incorporated into the WebFEATURE service at http://feature.stanford.edu/wf4.0-beta. PMID:24632601
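
    For reference, the precision (positive predictive value) and recall figures quoted above follow from simple confusion-matrix counts, as in the sketch below; the counts used here are illustrative, not outputs of the FEATURE system.

```python
def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)        # fraction of predicted sites that are real
    recall = tp / (tp + fn)           # fraction of real sites that are predicted
    return precision, recall

p, r = precision_recall(tp=198, fp=2, fn=10)
print(f"precision = {p:.2f}, recall = {r:.2f}")   # 0.99, 0.95
```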

  9. Progress Towards a High-Precision Infrared Spectroscopic Survey of the H_3^+ Ion

    NASA Astrophysics Data System (ADS)

    Perry, Adam J.; Hodges, James N.; Markus, Charles R.; Kocheril, G. Stephen; Jenkins, Paul A., II; McCall, Benjamin J.

    2015-06-01

    The trihydrogen cation, H_3^+, represents one of the most important and fundamental molecular systems. Having only two electrons and three nuclei, H_3^+ is the simplest polyatomic system and is a key testing ground for the development of new techniques for calculating potential energy surfaces and predicting molecular spectra. Corrections that go beyond the Born-Oppenheimer approximation, including adiabatic, non-adiabatic, relativistic, and quantum electrodynamic corrections are becoming more feasible to calculate. As a result, experimental measurements performed on the H_3^+ ion serve as important benchmarks which are used to test the predictive power of new computational methods. By measuring many infrared transitions with precision at the sub-MHz level it is possible to construct a list of the most highly precise experimental rovibrational energy levels for this molecule. Until recently, only a select handful of infrared transitions of this molecule have been measured with high precision (˜ 1 MHz). Using the technique of Noise Immune Cavity Enhanced Optical Heterodyne Velocity Modulation Spectroscopy, we are aiming to produce the largest high-precision spectroscopic dataset for this molecule to date. Presented here are the current results from our survey along with a discussion of the combination differences analysis used to extract the experimentally determined rovibrational energy levels. O. Polyansky, et al., Phil. Trans. R. Soc. A (2012), 370, 5014. M. Pavanello, et al., J. Chem. Phys. (2012), 136, 184303. L. Diniz, et al., Phys. Rev. A (2013), 88, 032506. L. Lodi, et al., Phys. Rev. A (2014), 89, 032505. J. Hodges, et al., J. Chem. Phys (2013), 139, 164201.
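
    A minimal sketch of the combination-differences idea mentioned above: two transitions that share a common upper level differ by exactly the spacing of their lower levels, so precisely measured line positions translate into precisely determined level spacings. The frequencies below are placeholders, not measured H_3^+ lines.

```python
def combination_difference(freq_a_mhz, freq_b_mhz):
    """Lower-state energy spacing (MHz) from two lines sharing an upper level."""
    return freq_a_mhz - freq_b_mhz

# transition A: lower level 1 -> common upper level
# transition B: lower level 2 -> common upper level
spacing = combination_difference(81_730_000.0, 81_640_000.0)
print(f"E(lower 2) - E(lower 1) = {spacing:.1f} MHz")
```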

  10. BDS Precise Point Positioning for Seismic Displacements Monitoring: Benefit from the High-Rate Satellite Clock Corrections

    PubMed Central

    Geng, Tao; Su, Xing; Fang, Rongxin; Xie, Xin; Zhao, Qile; Liu, Jingnan

    2016-01-01

    In order to satisfy the requirements of high-rate, high-precision applications, 1 Hz BeiDou Navigation Satellite System (BDS) satellite clock corrections are generated based on precise orbit products, and the quality of the generated clock products is assessed by comparison with those from other analysis centers. The comparisons show that the root mean square (RMS) of clock errors of geostationary Earth orbits (GEO) is about 0.63 ns, whereas those of inclined geosynchronous orbits (IGSO) and medium Earth orbits (MEO) are about 0.2–0.3 ns and 0.1 ns, respectively. Then, the 1 Hz clock products are used for BDS precise point positioning (PPP) to retrieve seismic displacements of the 2015 Mw 7.8 Gorkha, Nepal, earthquake. The derived seismic displacements from BDS PPP are consistent with those from the Global Positioning System (GPS) PPP, with RMS of 0.29, 0.38, and 1.08 cm in east, north, and vertical components, respectively. In addition, the BDS PPP solutions with different clock intervals of 1 s, 5 s, 30 s, and 300 s are processed and compared with each other. The results demonstrate that PPP with 300 s clock intervals is the worst and that with 1 s clock interval is the best. For the scenario of 5 s clock intervals, the precision of the PPP solutions is almost the same as that of the 1 s results. Considering the time consumption of clock estimates, we suggest that a 5 s clock interval is adequate for high-rate BDS solutions. PMID:27999384

  11. BDS Precise Point Positioning for Seismic Displacements Monitoring: Benefit from the High-Rate Satellite Clock Corrections.

    PubMed

    Geng, Tao; Su, Xing; Fang, Rongxin; Xie, Xin; Zhao, Qile; Liu, Jingnan

    2016-12-20

    In order to satisfy the requirements of high-rate, high-precision applications, 1 Hz BeiDou Navigation Satellite System (BDS) satellite clock corrections are generated based on precise orbit products, and the quality of the generated clock products is assessed by comparison with those from other analysis centers. The comparisons show that the root mean square (RMS) of clock errors of geostationary Earth orbits (GEO) is about 0.63 ns, whereas those of inclined geosynchronous orbits (IGSO) and medium Earth orbits (MEO) are about 0.2-0.3 ns and 0.1 ns, respectively. Then, the 1 Hz clock products are used for BDS precise point positioning (PPP) to retrieve seismic displacements of the 2015 Mw 7.8 Gorkha, Nepal, earthquake. The derived seismic displacements from BDS PPP are consistent with those from the Global Positioning System (GPS) PPP, with RMS of 0.29, 0.38, and 1.08 cm in east, north, and vertical components, respectively. In addition, the BDS PPP solutions with different clock intervals of 1 s, 5 s, 30 s, and 300 s are processed and compared with each other. The results demonstrate that PPP with 300 s clock intervals is the worst and that with 1 s clock interval is the best. For the scenario of 5 s clock intervals, the precision of the PPP solutions is almost the same as that of the 1 s results. Considering the time consumption of clock estimates, we suggest that a 5 s clock interval is adequate for high-rate BDS solutions.

  12. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.
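
    As a small illustration, the coefficient of variation used above to compare DIVA techniques is simply the relative standard deviation of repeated cover estimates; the percentages below are made-up values, not data from the study.

```python
import numpy as np

cover_pct = np.array([62.1, 63.4, 61.8, 64.0, 62.9])   # repeated estimates, one technique
cv = cover_pct.std(ddof=1) / cover_pct.mean() * 100.0
print(f"coefficient of variation = {cv:.1f}%")
```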

  13. High-Precision Measurement of 13C/12C Isotopic Ratio Using Gas Chromatography-Combustion-Cavity Ring-Down Spectroscopy

    NASA Astrophysics Data System (ADS)

    Saad, N.; Kuramoto, D. S.; Haase, C.; Crosson, E.; Tan, S.; Zare, R. N.

    2009-12-01

    Light stable isotope analysis, and in particular, compound specific isotopic analysis (CSIA), is a valuable tool to elucidate pathways and provide better insight into biological, ecological, and geological systems. We present here the results of compound-specific isotopic carbon analysis of short chain hydrocarbons using the world’s first combination of gas chromatography, combustion interface, and cavity ring-down spectroscopy (GC-C-CRDS). Cavity ring-down spectroscopy (CRDS) is a highly sensitive optical spectroscopy, one application of which is to measure the stable isotopic ratios in small molecules. Because it uses a highly reflective optical cavity with an effective path length of many kilometers, CRDS provides some of the most sensitive and precise optical absorption measurements. Most optical spectroscopic isotopic analyses measure the quantities of each isotopologue independently using their distinct ro-vibrational spectra. The most common isotopes measured with optical spectroscopy are 13C and 12C in carbon dioxide. However, the isotopes of hydrogen, oxygen, and sulfur have also been measured. Unlike isotope ratio mass spectrometry (IRMS), optical spectroscopy can distinguish among isobars, which have essentially identical m/z ratios. The combination of chemical separation, chemical conversion, and CRDS makes a nearly universal tool for isotopic analysis of mixtures. In addition, CRDS can tolerate a variety of compounds mixed with the target. For example, CRDS can measure carbon dioxide and its isotopic 13C/12C ratio in the presence of oxygen. Using the novel GC-C-CRDS system, we injected a 75-microliter mixture of approximately equal quantities of methane, ethane, and propane into a gas chromatograph using helium as the carrier gas. The methane, ethane, and propane were separated in time by 100 to 200 seconds after passing through the chromatograph. Oxygen gas was added, and the hydrocarbons were combusted in a catalytic combustor with platinum and nickel, held at 1150 °C. The combusted products were combined with dry nitrogen gas to provide sufficient gas flow for the CRDS analyzer, which measured the 13C/12C isotopic ratio of the separated methane, ethane, and propane, obtaining a precision of 0.95 permil or better. The calibration accuracy was within 3 permil of the values determined using IRMS. The current CRDS-based system is less expensive, does not require highly trained personnel to operate, and is portable, compared with IRMS. We anticipate that advances in spectroscopic analysis will improve the precision and accuracy of the CRDS isotopic measurement, making it comparable with IRMS.
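
    For context, 13C/12C results of this kind are conventionally expressed in delta notation relative to a reference standard, as in the sketch below; the sample ratio and the approximate VPDB reference ratio are illustrative assumptions, not values from the GC-C-CRDS experiment.

```python
def delta13c_permil(r_sample, r_standard):
    """Delta notation: deviation of the sample ratio from the standard, in permil."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_REF = 0.011180            # assumed 13C/12C reference ratio (approximate VPDB)
print(f"d13C = {delta13c_permil(0.010900, R_REF):+.2f} permil")
```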

  14. A quasi-spectral method for Cauchy problem of 2/D Laplace equation on an annulus

    NASA Astrophysics Data System (ADS)

    Saito, Katsuyoshi; Nakada, Manabu; Iijima, Kentaro; Onishi, Kazuei

    2005-01-01

    Real numbers are usually represented in the computer as hexadecimal floating-point numbers with a finite number of digits. Accordingly, numerical analysis often suffers from rounding errors. The rounding errors particularly deteriorate the precision of numerical solutions in inverse and ill-posed problems. We attempt to use multi-precision arithmetic to reduce the effect of rounding errors. The use of the multi-precision arithmetic system is by the courtesy of Dr Fujiwara of Kyoto University. In this paper we try to show the effectiveness of the multi-precision arithmetic by taking two typical examples: the Cauchy problem of the Laplace equation in two dimensions and the shape identification problem by inverse scattering in three dimensions. It is concluded from a few numerical examples that the multi-precision arithmetic works well for the resolution of those numerical solutions when combined with the high order finite difference method for the Cauchy problem and with the eigenfunction expansion method for the inverse scattering problem.
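
    A minimal sketch of the effect multi-precision arithmetic has on a cancellation-prone expression, using the mpmath library as an illustrative stand-in for the multi-precision system mentioned above (not the actual Cauchy-problem solver):

```python
import math
from mpmath import mp, mpf, exp

x = 1e-12
double_result = (math.exp(x) - 1.0) / x      # catastrophic cancellation in doubles

mp.dps = 50                                  # work with 50 significant decimal digits
xm = mpf("1e-12")
multi_result = (exp(xm) - 1) / xm            # ~1.0000000000005 to full precision

print("double precision :", double_result)
print("50-digit result  :", multi_result)
```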

  15. Precision electron-beam polarimetry at 1 GeV using diamond microstrip detectors

    DOE PAGES

    Narayan, A.; Jones, D.; Cornejo, J. C.; ...

    2016-02-16

    We report on the highest precision yet achieved in the measurement of the polarization of a low-energy, O(1 GeV), continuous-wave (CW) electron beam, accomplished using a new polarimeter based on electron-photon scattering, in Hall C at Jefferson Lab. A number of technical innovations were necessary, including a novel method for precise control of the laser polarization in a cavity and a novel diamond microstrip detector that was able to capture most of the spectrum of scattered electrons. The data analysis technique exploited track finding, the high granularity of the detector, and its large acceptance. The polarization of the 180-μA, 1.16-GeV electron beam was measured with a statistical precision of <1% per hour and a systematic uncertainty of 0.59%. This exceeds the level of precision required by the Qweak experiment, a measurement of the weak vector charge of the proton. Proposed future low-energy experiments require polarization uncertainty < 0.4%, and this result represents an important demonstration of that possibility. This measurement is the first use of diamond detectors for particle tracking in an experiment. As a result, it demonstrates the stable operation of a diamond-based tracking detector in a high radiation environment, for two years.

  16. Performance Analysis and Electronics Packaging of the Optical Communications Demonstrator

    NASA Technical Reports Server (NTRS)

    Jeganathan, M.; Monacos, S.

    1998-01-01

    The Optical Communications Demonstrator (OCD), under development at the Jet Propulsion Laboratory (JPL), is a laboratory-based lasercomm terminal designed to validate several key technologies, primarily precision beam pointing, high bandwidth tracking, and beacon acquisition.

  17. High-Precision Phenotyping of Grape Bunch Architecture Using Fast 3D Sensor and Automation.

    PubMed

    Rist, Florian; Herzog, Katja; Mack, Jenny; Richter, Robert; Steinhage, Volker; Töpfer, Reinhard

    2018-03-02

    Wine growers prefer cultivars with looser bunch architecture because of the decreased risk for bunch rot. As a consequence, grapevine breeders have to select seedlings and new cultivars with regard to appropriate bunch traits. Bunch architecture is a mosaic of different single traits, which makes phenotyping labor-intensive and time-consuming. In the present study, a fast and high-precision phenotyping pipeline was developed. The optical sensor Artec Spider 3D scanner (Artec 3D, L-1466, Luxembourg) was used to generate dense 3D point clouds of grapevine bunches under lab conditions, and an automated analysis software called 3D-Bunch-Tool was developed to extract different single 3D bunch traits, i.e., the number of berries, berry diameter, single berry volume, total volume of berries, convex hull volume of grapes, bunch width and bunch length. The method was validated on whole bunches of different grapevine cultivars and phenotypically variable breeding material. Reliable phenotypic data were obtained which show highly significant correlations (up to r² = 0.95 for berry number) compared to ground truth data. Moreover, it was shown that the Artec Spider can be used directly in the field, where the data obtained show precision comparable to the lab application. This non-invasive and non-contact field application facilitates the first high-precision phenotyping pipeline based on 3D bunch traits in large plant sets.
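
    One of the 3D bunch traits listed above, the convex hull volume of the bunch point cloud, can be sketched with SciPy as below; the point cloud here is a random placeholder, not an Artec Spider scan.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)
points = rng.random((5000, 3)) * [0.12, 0.08, 0.20]   # placeholder cloud, metres

hull = ConvexHull(points)
print(f"convex hull volume: {hull.volume * 1e6:.1f} cm^3")
print(f"hull surface area : {hull.area * 1e4:.1f} cm^2")
```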

  18. Technical note: Coupling infrared gas analysis and cavity ring down spectroscopy for autonomous, high-temporal-resolution measurements of DIC and δ13C-DIC

    NASA Astrophysics Data System (ADS)

    Call, Mitchell; Schulz, Kai G.; Carvalho, Matheus C.; Santos, Isaac R.; Maher, Damien T.

    2017-03-01

    A new approach to autonomously determine concentrations of dissolved inorganic carbon (DIC) and its carbon stable isotope ratio (δ13C-DIC) at high temporal resolution is presented. The simple method requires no customised design. Instead it uses two commercially available instruments currently used in aquatic carbon research. An inorganic carbon analyser utilising non-dispersive infrared detection (NDIR) is coupled to a Cavity Ring-down Spectrometer (CRDS) to determine DIC and δ13C-DIC based on the liberated CO2 from acidified aliquots of water. Using a small sample volume of 2 mL, the precision and accuracy of the new method was comparable to standard isotope ratio mass spectrometry (IRMS) methods. The system achieved a sampling resolution of 16 min, with a DIC precision of ±1.5 to 2 µmol kg-1 and δ13C-DIC precision of ±0.14 ‰ for concentrations spanning 1000 to 3600 µmol kg-1. Accuracy of 0.1 ± 0.06 ‰ for δ13C-DIC based on DIC concentrations ranging from 2000 to 2230 µmol kg-1 was achieved during a laboratory-based algal bloom experiment. The high precision data that can be autonomously obtained by the system should enable complex carbonate system questions to be explored in aquatic sciences using high-temporal-resolution observations.

  19. Nonflammable, Nonaqueous, Low Atmospheric Impact, High Performance Cleaning Solvents

    NASA Technical Reports Server (NTRS)

    Dhooge, P. M.; Glass, S. M.; Nimitz, J. S.

    2001-01-01

    For many years, chlorofluorocarbon (CFC) and chlorocarbon solvents have played an important part in aerospace operations. These solvents found extensive use as cleaning and analysis (EPA) solvents in precision and critical cleaning. However, CFCs and chlorocarbon solvents have deleterious effects on the ozone layer, are relatively strong greenhouse gases, and some are suspect or known carcinogens. Because of their ozone-depletion potential (ODP), the Montreal Protocol and its amendments, as well as other environmental regulations, have resulted in the phaseout of CFC-113 and 1,1,1-trichloroethane (TCA). Although alternatives have been recommended, they do not perform as well as the original solvents. In addition, some analyses, such as the infrared analysis of extracted hydrocarbons, cannot be performed with the substitute solvents that contain C-H bonds. CFC-113 solvent has been used for many critical aerospace applications. CFC-113, also known as Freon (registered) TF, has been used extensively in NASA's cleaning facilities for precision and critical cleaning, in particular the final rinsing in Class 100 areas, with gas chromatography analysis of rinse residue. While some cleaning can be accomplished by other processes, there are certain critical applications where CFC-113 or a similar solvent is highly cost-effective and ensures safety. Oxygen system components are one example where a solvent compatible with oxygen and capable of removing fluorocarbon grease is needed. Electronic components and precision mechanical components can also be damaged by aggressive cleaning solvents.

  20. Hybrid Network Architectures for the Next Generation NAS

    NASA Technical Reports Server (NTRS)

    Madubata, Christian

    2003-01-01

    To meet the needs of the 21st Century NAS, an integrated, network-centric infrastructure is essential, characterized by secure, high-bandwidth digital communication systems that support precision navigation capable of reducing position errors for all aircraft to within a few meters. This system will also require precision surveillance systems capable of accurately locating all aircraft and automatically detecting any deviations from an approved path within seconds, and it must be able to deliver high-resolution weather forecasts - critical to create 4-dimensional (space and time) profiles for up to 6 hours for all atmospheric conditions affecting aviation, including wake vortices. The 21st Century NAS will be characterized by highly accurate digital databases depicting terrain, obstacle, and airport information no matter what visibility conditions exist. This research task will be to perform a high-level requirements analysis of the applications, information and services required by the next generation National Airspace System. The investigation and analysis is expected to lead to the development and design of several national network-centric communications architectures that would be capable of supporting the Next Generation NAS.

  1. High-precision measurements of cementless acetabular components using model-based RSA: an experimental study.

    PubMed

    Baad-Hansen, Thomas; Kold, Søren; Kaptein, Bart L; Søballe, Kjeld

    2007-08-01

    In RSA, tantalum markers attached to metal-backed acetabular cups are often difficult to detect on stereo radiographs due to the high density of the metal shell. This results in occlusion of the prosthesis markers and may lead to inconclusive migration results. Within the last few years, new software systems have been developed to solve this problem. We compared the precision of 3 RSA systems in migration analysis of the acetabular component. A hemispherical and a non-hemispherical acetabular component were mounted in a phantom. Both acetabular components underwent migration analyses with 3 different RSA systems: conventional RSA using tantalum markers, an RSA system using a hemispherical cup algorithm, and a novel model-based RSA system. We found narrow confidence intervals, indicating high precision of the conventional marker system and model-based RSA with regard to migration and rotation. The confidence intervals of conventional RSA and model-based RSA were narrower than those of the hemispherical cup algorithm-based system regarding cup migration and rotation. The model-based RSA software combines the precision of the conventional RSA software with the convenience of the hemispherical cup algorithm-based system. Based on our findings, we believe that these new tools offer an improvement in the measurement of acetabular component migration.

  2. Technologies That Enable Accurate and Precise Nano- to Milliliter-Scale Liquid Dispensing of Aqueous Reagents Using Acoustic Droplet Ejection.

    PubMed

    Sackmann, Eric K; Majlof, Lars; Hahn-Windgassen, Annett; Eaton, Brent; Bandzava, Temo; Daulton, Jay; Vandenbroucke, Arne; Mock, Matthew; Stearns, Richard G; Hinkson, Stephen; Datwani, Sammy S

    2016-02-01

    Acoustic liquid handling uses high-frequency acoustic signals that are focused on the surface of a fluid to eject droplets with high accuracy and precision for various life science applications. Here we present a multiwell source plate, the Echo Qualified Reservoir (ER), which can acoustically transfer over 2.5 mL of fluid per well in 25-nL increments using an Echo 525 liquid handler. We demonstrate two Labcyte technologies-Dynamic Fluid Analysis (DFA) methods and a high-voltage (HV) grid-that are required to maintain accurate and precise fluid transfers from the ER at this volume scale. DFA methods were employed to dynamically assess the energy requirements of the fluid and adjust the acoustic ejection parameters to maintain a constant velocity droplet. Furthermore, we demonstrate that the HV grid enhances droplet velocity and coalescence at the destination plate. These technologies enabled 5-µL per destination well transfers to a 384-well plate, with accuracy and precision values better than 4%. Last, we used the ER and Echo 525 liquid handler to perform a quantitative polymerase chain reaction (qPCR) assay to demonstrate an application that benefits from the flexibility and larger volume capabilities of the ER. © 2015 Society for Laboratory Automation and Screening.

  3. Computational Calorimetry: High-Precision Calculation of Host–Guest Binding Thermodynamics

    PubMed Central

    2015-01-01

    We present a strategy for carrying out high-precision calculations of binding free energy and binding enthalpy values from molecular dynamics simulations with explicit solvent. The approach is used to calculate the thermodynamic profiles for binding of nine small molecule guests to either the cucurbit[7]uril (CB7) or β-cyclodextrin (βCD) host. For these systems, calculations using commodity hardware can yield binding free energy and binding enthalpy values with a precision of ∼0.5 kcal/mol (95% CI) in a matter of days. Crucially, the self-consistency of the approach is established by calculating the binding enthalpy directly, via end point potential energy calculations, and indirectly, via the temperature dependence of the binding free energy, i.e., by the van’t Hoff equation. Excellent agreement between the direct and van’t Hoff methods is demonstrated for both host–guest systems and an ion-pair model system for which particularly well-converged results are attainable. Additionally, we find that hydrogen mass repartitioning allows marked acceleration of the calculations with no discernible cost in precision or accuracy. Finally, we provide guidance for accurately assessing numerical uncertainty of the results in settings where complex correlations in the time series can pose challenges to statistical analysis. The routine nature and high precision of these binding calculations opens the possibility of including measured binding thermodynamics as target data in force field optimization so that simulations may be used to reliably interpret experimental data and guide molecular design. PMID:26523125
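
    A minimal sketch of the indirect (van't Hoff) route mentioned above, estimating the binding enthalpy from the temperature dependence of the binding free energy; the two (T, ΔG) pairs are illustrative placeholders, not simulation results.

```python
import math

R = 0.0019872      # gas constant, kcal/(mol*K)

def vant_hoff_dH(T1, dG1, T2, dG2):
    """Binding enthalpy (kcal/mol) from ln K at two temperatures."""
    lnK1 = -dG1 / (R * T1)
    lnK2 = -dG2 / (R * T2)
    # van't Hoff: ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1), assuming dH roughly constant
    return -R * (lnK2 - lnK1) / (1.0 / T2 - 1.0 / T1)

print(f"dH = {vant_hoff_dH(290.0, -10.2, 310.0, -9.6):.1f} kcal/mol")
```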

  4. Laser-induced breakdown spectroscopy (LIBS) analysis of calcium ions dissolved in water using filter paper substrates: an ideal internal standard for precision improvement.

    PubMed

    Choi, Daewoong; Gong, Yongdeuk; Nam, Sang-Ho; Han, Song-Hee; Yoo, Jonghyun; Lee, Yonghoon

    2014-01-01

    We report an approach for selecting an internal standard to improve the precision of laser-induced breakdown spectroscopy (LIBS) analysis for determining calcium (Ca) concentration in water. The dissolved Ca(2+) ions were pre-concentrated on filter paper by evaporating the water. The filter paper was dried and analyzed using LIBS. By adding strontium chloride to the sample solutions and using a Sr II line at 407.771 nm for the intensity normalization of the Ca II lines at 393.366 or 396.847 nm, the analysis precision could be significantly improved. The Ca II and Sr II line intensities were mapped across the filter paper, and they showed a strong positive shot-to-shot correlation with the same spatial distribution on the filter paper surface. We applied this analysis approach to the measurement of Ca(2+) in tap, bottled, and ground water samples. The Ca(2+) concentrations determined using LIBS are in good agreement with those obtained from flame atomic absorption spectrometry. Finally, we suggest a homologous relation of the strongest emission lines of period 4 and 5 elements in groups IA and IIA based on their similar electronic structures. Our results indicate that LIBS can be effectively applied to liquid analysis at the sub-parts per million level with high precision, using simple drying of liquid solutions on filter paper and internal standard elements with a valence electronic structure similar to that of the analytes of interest.
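
    A minimal sketch of the internal-standard normalization described above: each Ca II line intensity is divided by the Sr II intensity from the same shot before the calibration curve is built. The intensities and concentrations are illustrative numbers, not the paper's data.

```python
import numpy as np

ca_intensity = np.array([1.10e4, 2.30e4, 3.20e4, 4.60e4])   # Ca II 393.366 nm
sr_intensity = np.array([0.95e4, 1.05e4, 0.90e4, 1.00e4])   # Sr II 407.771 nm (spiked)
ca_conc_ppm  = np.array([1.0, 2.0, 3.0, 4.0])               # calibration standards

ratio = ca_intensity / sr_intensity                          # shot-to-shot normalization
slope, intercept = np.polyfit(ca_conc_ppm, ratio, 1)

unknown_ratio = 2.9
print(f"estimated Ca: {(unknown_ratio - intercept) / slope:.2f} ppm")
```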

  5. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    NASA Astrophysics Data System (ADS)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis, together with low robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
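
    As a rough illustration of the geostatistical structure analysis described above, the sketch below computes an empirical semivariogram with NumPy; the coordinates and densities are random placeholders, not survey data.

```python
import numpy as np

rng = np.random.default_rng(2)
coords = rng.random((200, 2)) * 100.0      # sample positions (km)
values = rng.random(200)                   # e.g. log-transformed acoustic density

def empirical_variogram(coords, values, bin_edges):
    """Average of 0.5*(z_i - z_j)^2 over point pairs falling in each distance bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # each pair counted once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean()
                     for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

print(empirical_variogram(coords, values, np.linspace(0.0, 50.0, 6)))
```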

  6. Eco-Environment Status Evaluation and Change Analysis of Qinghai Based on National Geographic Conditions Census Data

    NASA Astrophysics Data System (ADS)

    Zheng, M.; Zhu, M.; Wang, Y.; Xu, C.; Yang, H.

    2018-04-01

    As the headstream of the Yellow River, the Yangtze River and the Lantsang River, located in the hinterland of the Qinghai-Tibet Plateau, Qinghai province is hugely significant for the ecosystem as well as for ecological security and sustainable development in China. With the completion of the first national geographic conditions census, frequent monitoring has begun. The classification indicators of the census and monitoring data are highly correlated with the Technical Criterion for Ecosystem Status Evaluation released by the Ministry of Environmental Protection in 2015. Based on three years of geographic conditions data (2014-2016), Landsat-8 images and thematic data (water resources, pollution emissions, meteorological data, soil erosion, etc.), a multi-year, high-precision eco-environment status evaluation and spatiotemporal change analysis of Qinghai province has been carried out on the basis of the Technical Criterion for Ecosystem Status Evaluation in this paper. Unlike the evaluation implemented by the environmental protection department, the evaluation unit in this paper is the town rather than the county. The evaluation result shows that the eco-environment status in Qinghai is generally in a fine condition and has significant regional differences. An eco-environment status evaluation based on national geographic conditions census and monitoring data can improve both the temporal and spatial precision. The eco-environment status with high spatial precision and multiple indices is a key basis for environmental protection decision-making.

  7. Quantitation of Phenol Levels in Oil of Wintergreen Using Gas Chromatography-Mass Spectrometry with Selected Ion Monitoring

    ERIC Educational Resources Information Center

    Sobel, Robert M.; Ballantine, David S.; Ryzhov, Victor

    2005-01-01

    Gas chromatography-mass spectrometry (GC-MS) is a powerful industrial analysis technique that can be used to elucidate the components of a complex mixture while offering the benefits of high-precision quantitative analysis. Natural wintergreen oil is examined for its phenol concentration to determine the level of refining…

  8. Preliminary Figures of Merit for Isotope Ratio Measurements: The Liquid Sampling-Atmospheric Pressure Glow Discharge Microplasma Ionization Source Coupled to an Orbitrap Mass Analyzer

    NASA Astrophysics Data System (ADS)

    Hoegg, Edward D.; Barinaga, Charles J.; Hager, George J.; Hart, Garret L.; Koppenaal, David W.; Marcus, R. Kenneth

    2016-08-01

    In order to meet a growing need for fieldable mass spectrometer systems for precise elemental and isotopic analyses, the liquid sampling-atmospheric pressure glow discharge (LS-APGD) has a number of very promising characteristics. One key set of attributes that awaits validation concerns the performance characteristics relative to isotope ratio precision and accuracy. Owing to its availability and this research team's prior experience with it, the initial evaluation of isotope ratio (IR) performance was performed on a Thermo Scientific Exactive Orbitrap instrument. While the mass accuracy and resolution performance of Orbitrap analyzers are well documented, no detailed evaluations of their IR performance have been published. The efforts described here involve two variables: the inherent IR precision and accuracy delivered by the LS-APGD microplasma and the inherent IR measurement qualities of Orbitrap analyzers. Important to the IR performance, the various operating parameters of the Orbitrap sampling interface, high-energy collisional dissociation (HCD) stage, and ion injection/data acquisition have been evaluated. The IR performance for a range of other elements, including natural, depleted, and enriched uranium isotopes, was determined. In all cases, the precision and accuracy are degraded when measuring low-abundance (<0.1%) isotope fractions. In the best case, IR precision on the order of 0.1% RSD can be achieved, with values of 1%-3% RSD observed for low-abundance species. The results suggest that the LS-APGD is a promising candidate for field-deployable MS analysis and that the high resolving powers of the Orbitrap may be complemented by a heretofore unknown capacity to deliver high-precision IRs.

  9. Thermal-mechanical behavior of high precision composite mirrors

    NASA Technical Reports Server (NTRS)

    Kuo, C. P.; Lou, M. C.; Rapp, D.

    1993-01-01

    Composite mirror panels were designed, constructed, analyzed, and tested in the framework of a NASA precision segmented reflector task. The deformations of the reflector surface during exposure to space environments were predicted using a finite element model. The composite mirror panels have graphite-epoxy or graphite-cyanate facesheets, separated by an aluminum or a composite honeycomb core. It is pointed out that, in order to carry out detailed modeling of composite mirrors with high accuracy, it is necessary to know the temperature-dependent properties of the materials involved and the type and magnitude of manufacturing errors and material nonuniformities. The structural modeling and analysis efforts addressed the impact of key design and materials parameters on the performance of the mirrors.

  10. [Determination of doping in human urine by gas chromatography-high resolution mass spectrometry].

    PubMed

    Xing, Yan-Yi; Liu, Xin; Zhang, Yu-Mei; Wang, Xiao-Bing; Xu, You-Xuan

    2012-12-01

    A method was evaluated for the determination of twenty-one doping agents (including nandrolone, boldenone and methandienone) in human urine by gas chromatography-high resolution mass spectrometry. Samples were prepared by liquid-liquid extraction, concentration and TMS derivatization, with limits of detection at the ng x mL(-1) level by MID/GC/HRMS. According to the code of the World Anti-Doping Agency (WADA), the precision and recoveries of the procedure were evaluated by replicate analysis (n = 6); the recoveries were in the range of 66%-103%, with RSDs below 10.0%. The within-day precision of the method at three different concentrations was also determined; the RSDs were less than 9.5%, 10.0% and 9.7%, respectively.

  11. High-efficiency (6 + 1) × 1 pump-signal combiner based on low-deformation and high-precision alignment fabrication

    NASA Astrophysics Data System (ADS)

    Zou, Shuzhen; Chen, Han; Yu, Haijuan; Sun, Jing; Zhao, Pengfei; Lin, Xuechun

    2017-12-01

    We demonstrate a new method for fabricating a (6 + 1) × 1 pump-signal combiner based on reducing the signal fiber diameter by corrosion. This method avoids the mismatch loss at the splice between the signal fiber and the output fiber caused by signal fiber taper processing. The optimum radius of the corroded signal fiber was calculated according to an analysis of the influence of the cladding thickness on the laser propagating in the fiber core. In addition, we developed a two-step splicing method to achieve high-precision alignment between the signal fiber core and the output fiber core. A high-efficiency (6 + 1) × 1 pump-signal combiner was produced with an average pump power transmission efficiency of 98.0% and a signal power transmission efficiency of 97.7%, making it well suited for application in high-power fiber laser systems.

  12. Quantitative high-performance liquid chromatography of nucleosides in biological materials.

    PubMed

    Gehrke, C W; Kuo, K C; Davis, G E; Suits, R D; Waalkes, T P; Borek, E

    1978-03-21

    A rigorous, comprehensive, and reliable reversed-phase high-performance liquid chromatographic (HPLC) method has been developed for the analysis of ribonucleosides in urine (psi, m1A, m1I, m2G, A, m2(2)G). An initial isolation of ribonucleosides with an affinity gel containing an immobilized phenylboronic acid was used to improve selectivity and sensitivity. Response for all nucleosides was linear from 0.1 to 50 nmoles injected and good quantitation was obtained for 25 microliter or less of sample placed on the HPLC column. Excellent precision of analysis for urinary nucleosides was achieved on matrix dependent and independent samples, and the high resolution of the reversed-phase column allowed the complete separation of 9 nucleosides from other unidentified UV absorbing components at the 1-ng level. Supporting experimental data are presented on precision, recovery, chromatographic methods, minimum detection limit, retention time, relative molar response, sample clean-up, stability of nucleosides, boronate gel capacity, and application to analysis of urine from patients with leukemia and breast cancer. This method is now being used routinely for the determination of the concentration and ratios of nucleosides in urine from patients with different types of cancer and in chemotherapy response studies.

  13. The ultrahigh precision form measurement of small, steep-sided aspheric moulds, incorporating novel hardware and software developments; Technical Digest

    NASA Astrophysics Data System (ADS)

    Mills, M. W.; Hutchinson, Matthew J.

    2005-05-01

    A variety of consumer applications, eg cellphone camera lenses, optical storage devices, digital cameras, etc, are driving the demand for small, high aspheric departure rotationally-symmetric moulded optics, manufactured both in polymer and glass materials. The mould tools for such components are manufactured by ultra-high precision techniques such as single point diamond turning and ultra-precision grinding, and must be accurate to <1/10μm levels for form, and exhibit nanometric surface finish quality. The aspheric forms of such components' optical surfaces exhibit high departure from best-fit sphere towards their outer edge, which renders this outer region especially critical for optical performance. The high slope of these components at the clear aperture has caused some restrictions on the use of profilometry in the measurement of form across their full diameter. Taylor Hobson designs and manufactures a range of ultra-precision profilometers for use in such industries as aspheric optics fabrication. In order to address the issues described, a new measurement system, Taylor Hobson Form Talysurf PGI 1250, has been developed, which contains new Aspheric Data Fusion Software, as well as Asphero-Diffractive Analysis Software, allowing the entire diametric profile to be analysed to the desired level of accuracy. This development removes the previous limitation of maximum slope for this type of measurement, thus enabling better quality control of high slope, high aspheric departure optics. Measurement data from the Form Talysurf PGI 1250 can be fed back directly to the machine tool, in order to optimize the form of the optical mould.

  14. The ultrahigh precision form measurement of small, steep-sided aspheric moulds, incorporating novel hardware and software developments; Technical Digest

    NASA Astrophysics Data System (ADS)

    Mills, M. W.; Hutchinson, Matthew J.

    2005-05-01

    A variety of consumer applications, eg cellphone camera lenses, optical storage devices, digital cameras, etc, are driving the demand for small, high aspheric departure rotationally-symmetric moulded optics, manufactured both in polymer and glass materials. The mould tools for such components are manufactured by ultra-high precision techniques such as single point diamond turning and ultra-precision grinding, and must be accurate to <1/10μm levels for form, and exhibit nanometric surface finish quality. The aspheric forms of such components' optical surfaces exhibit high departure from best-fit sphere towards their outer edge, which renders this outer region especially critical for optical performance. The high slope of these components at the clear aperture has caused some restrictions on the use of profilometry in the measurement of form across their full diameter. Taylor Hobson designs and manufactures a range of ultra-precision profilometers for use in such industries as aspheric optics fabrication. In order to address the issues described, a new measurement system, Taylor Hobson Form Talysurf PGI 1250, has been developed, which contains new Aspheric Data Fusion Software, as well as Asphero-Diffractive Analysis Software, allowing the entire diametric profile to be analysed to the desired level of accuracy. This development removes the previous limitation of maximum slope for this type of measurement, thus enabling better quality control of high slope, high aspheric departure optics. Measurement data from the Form Talysurf PGI 1250 can be fed back directly to the machine tool, in order to optimize the form of the optical mould.

  15. HIGH-PRECISION ASTROMETRIC MILLIMETER VERY LONG BASELINE INTERFEROMETRY USING A NEW METHOD FOR ATMOSPHERIC CALIBRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rioja, M.; Dodson, R., E-mail: maria.rioja@icrar.org

    2011-04-15

    We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the pairs of sources 3C274 and 3C273 and 1308+326 and 1308+328 under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.

  16. HPTLC Determination of Artemisinin and Its Derivatives in Bulk and Pharmaceutical Dosage

    NASA Astrophysics Data System (ADS)

    Agarwal, Suraj P.; Ahuja, Shipra

    A simple, selective, accurate, and precise high-performance thin-layer chromatographic (HPTLC) method has been established and validated for the analysis of artemisinin and its derivatives (artesunate, artemether, and arteether) in bulk drugs and formulations. Artemisinin, artesunate, artemether, and arteether were separated on aluminum-backed silica gel 60 F254 plates with toluene:ethyl acetate (10:1), toluene:ethyl acetate:acetic acid (2:8:0.2), toluene:butanol (10:1), and toluene:dichloromethane (0.5:10) mobile phases, respectively. The detector response for concentrations between 100 and 600 ng/spot showed a good linear relationship, with r values of 0.9967, 0.9989, 0.9981 and 0.9989 for artemisinin, artesunate, artemether, and arteether, respectively. Statistical analysis proves that the method is precise, accurate, and reproducible, and hence it can be employed for routine analysis.

  17. Depth-resolved multilayer pigment identification in paintings: combined use of laser-induced breakdown spectroscopy (LIBS) and optical coherence tomography (OCT).

    PubMed

    Kaszewska, Ewa A; Sylwestrzak, Marcin; Marczak, Jan; Skrzeczanowski, Wojciech; Iwanicka, Magdalena; Szmit-Naud, Elżbieta; Anglos, Demetrios; Targowski, Piotr

    2013-08-01

    A detailed feasibility study on the combined use of laser-induced breakdown spectroscopy with optical coherence tomography (LIBS/OCT), aiming at a realistic depth-resolved elemental analysis of multilayer stratigraphies in paintings, is presented. Merging a high spectral resolution LIBS system with a high spatial resolution spectral OCT instrument significantly enhances the quality and accuracy of stratigraphic analysis. First, OCT mapping is employed prior to LIBS analysis in order to assist the selection of specific areas of interest on the painting surface to be examined in detail. Then, intertwined with LIBS, the OCT instrument is used as a precise profilometer for the online determination of the depth of the ablation crater formed by individual laser pulses during LIBS depth-profile analysis. This approach is novel and enables (i) the precise in-depth scaling of elemental concentration profiles, and (ii) the recognition of layer boundaries by estimating the corresponding differences in material ablation rate. Additionally, the latter is supported, within the transparency of the object, by analysis of the OCT cross-sectional views. The potential of this method is illustrated by presenting results on the detailed analysis of the structure of an historic painting on canvas performed to aid planned restoration of the artwork.

  18. Calibration of gyro G-sensitivity coefficients with FOG monitoring on precision centrifuge

    NASA Astrophysics Data System (ADS)

    Lu, Jiazhen; Yang, Yanqiang; Li, Baoguo; Liu, Ming

    2017-07-01

    The advantages of mechanical gyros, such as high precision, endurance and reliability, make them widely used as the core parts of inertial navigation systems (INS) utilized in the fields of aeronautics, astronautics and underground exploration. In a high-g environment, the accuracy of gyros is degraded. Therefore, the calibration and compensation of the gyro G-sensitivity coefficients is essential when the INS operates in a high-g environment. A precision centrifuge with a counter-rotating platform is the typical equipment for calibrating the gyro, as it can generate large centripetal acceleration and keep the angular rate close to zero; however, its performance is seriously restricted by the angular perturbation in the high-speed rotating process. To reduce the dependence on the precision of the centrifuge and counter-rotating platform, an effective calibration method for the gyro g-sensitivity coefficients under fiber-optic gyroscope (FOG) monitoring is proposed herein. The FOG can efficiently compensate spindle error and improve the anti-interference ability. Harmonic analysis is performed for data processing. Simulations show that the gyro G-sensitivity coefficients can be efficiently estimated to up to 99% of the true value and compensated using a lookup table or fitting method. Repeated tests indicate that the G-sensitivity coefficients can be correctly calibrated when the angular rate accuracy of the precision centrifuge is as low as 0.01%. Verification tests are performed to demonstrate that the attitude errors can be decreased from 0.36° to 0.08° in 200 s. The proposed measuring technology is generally applicable in engineering, as it can reduce the accuracy requirements for the centrifuge and the environment.

  19. Prospects of photonic nanojets for precise exposure on microobjects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geints, Yu. E., E-mail: ygeints@iao.ru; Zuev Institute of Atmospheric Optics, SB Russian Academy of Sciences, Acad. Zuev Square 1, Tomsk, 634021; Panina, E. K., E-mail: pek@iao.ru

    We report on a new optical tool for the precise manipulation of various microobjects. This tool is referred to as a "photonic nanojet" (PJ) and corresponds to a specific spatially localized, high-intensity area formed near micron-sized transparent spherical dielectric particles illuminated by visible laser radiation. A descriptive analysis of the morphological shapes of photonic nanojets is presented. The PJ shape characterization is based on numerical calculations of the near-field distribution according to the Mie theory and accounts for jet dimensions and shape complexity.

  20. Applications of inertial-sensor high-inheritance instruments to DSN precision antenna pointing

    NASA Technical Reports Server (NTRS)

    Goddard, R. E.

    1992-01-01

    Laboratory test results of the initialization and tracking performance of an existing inertial-sensor-based instrument are given. The instrument, although not primarily designed for precision antenna pointing applications, demonstrated an on-average 10-hour tracking error of several millidegrees. The system-level instrument performance is shown by analysis to be sensor limited. Simulated instrument improvements show a tracking error of less than 1 mdeg, which would provide acceptable performance, i.e., low pointing loss, for the DSN 70-m antenna sub network, operating at Ka-band (1-cm wavelength).

  1. Applications of inertial-sensor high-inheritance instruments to DSN precision antenna pointing

    NASA Technical Reports Server (NTRS)

    Goddard, R. E.

    1992-01-01

    Laboratory test results of the initialization and tracking performance of an existing inertial-sensor-based instrument are given. The instrument, although not primarily designed for precision antenna pointing applications, demonstrated an on-average 10-hour tracking error of several millidegrees. The system-level instrument performance is shown by analysis to be sensor limited. Simulated instrument improvements show a tracking error of less than 1 mdeg, which would provide acceptable performance, i.e., low pointing loss, for the Deep Space Network 70-m antenna subnetwork, operating at Ka-band (1-cm wavelength).

  2. Absolute quantification by droplet digital PCR versus analog real-time PCR

    PubMed Central

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
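
    For context, the Poisson statistics that underlie absolute quantification in droplet digital PCR can be sketched as follows; the droplet counts and droplet volume are illustrative assumptions, not values from the study:

      import math

      total_droplets = 15000
      positive_droplets = 4200
      droplet_volume_nl = 0.85                 # nanolitres per droplet (assumed)

      # Average copies per droplet from the fraction of negative droplets.
      lam = -math.log(1.0 - positive_droplets / total_droplets)

      # Copies per microlitre of reaction (1 nL = 1e-3 uL).
      copies_per_ul = lam / (droplet_volume_nl * 1e-3)
      print(f"lambda = {lam:.3f} copies/droplet, {copies_per_ul:.0f} copies/uL")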

  3. Scaling up the precision in a ytterbium Bose-Einstein condensate interferometer

    NASA Astrophysics Data System (ADS)

    McAlpine, Katherine; Plotkin-Swing, Benjamin; Gochnauer, Daniel; Saxberg, Brendan; Gupta, Subhadeep

    2016-05-01

    We report on progress toward a high-precision ytterbium (Yb) Bose-Einstein condensate (BEC) interferometer, with the goal of measuring h/m and thus the fine structure constant α. Here h is Planck's constant and m is the mass of a Yb atom. The use of the non-magnetic Yb atom makes our experiment insensitive to magnetic field noise. Our chosen symmetric 3-path interferometer geometry suppresses errors from vibration, rotation, and acceleration. The precision scales with the phase accrued due to the kinetic energy difference between the interferometer arms, resulting in a quadratic sensitivity to the momentum difference. We are installing and testing the laser pulses for large momentum transfer via Bloch oscillations. We will report on Yb BEC production in a new apparatus and progress toward realizing the atom optical elements for high precision measurements. We will also discuss approaches to mitigate two important systematics: (i) atom interaction effects can be suppressed by creating the BEC in a dynamically shaped optical trap to reduce the density; (ii) diffraction phase effects from the various atom-optical elements can be accounted for through an analysis of the light-atom interaction for each pulse.

  4. Application of high precision two-way S-band ranging to the navigation of the Galileo Earth encounters

    NASA Technical Reports Server (NTRS)

    Pollmeier, Vincent M.; Kallemeyn, Pieter H.; Thurman, Sam W.

    1993-01-01

    The application of high-accuracy S/S-band (2.1 GHz uplink/2.3 GHz downlink) ranging to orbit determination with relatively short data arcs is investigated for the approach phase of each of the Galileo spacecraft's two Earth encounters (8 December 1990 and 8 December 1992). Analysis of S-band ranging data from Galileo indicated that under favorable signal levels, meter-level precision was attainable. It is shown that ranging data of sufficient accuracy, when acquired from multiple stations, can sense the geocentric angular position of a distant spacecraft. Explicit modeling of ranging bias parameters for each station pass is used to largely remove systematic ground system calibration errors and transmission media effects from the Galileo range measurements, which would otherwise corrupt the angle-finding capability of the data. When compared with post-flyby reconstructions, the accuracy achieved using the precision range filtering strategy proved markedly better than that of solutions utilizing a traditional Doppler/range filter strategy. In addition, the navigation accuracy achieved with precision ranging was comparable to that obtained using delta-Differenced One-Way Range, an interferometric measurement of spacecraft angular position relative to a natural radio source, which was also used operationally.

  5. PRECISE ANGLE MONITOR BASED ON THE CONCEPT OF PENCIL-BEAM INTERFEROMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    QIAN,S.; TAKACS,P.

    2000-07-30

    Precise angle monitoring is a very important metrology task for research, development and industrial applications. The autocollimator, based on the principles of geometric optics, is one of the most powerful and widely applied instruments for small-angle monitoring. In this paper the authors introduce a new precise angle monitoring system, the Pencil-beam Angle Monitor (PAM), based on pencil-beam interferometry. Its principle of operation is a combination of physical and geometrical optics. The angle calculation method is similar to that of the autocollimator; however, while the autocollimator creates a cross image, the pencil-beam angle monitoring system produces an interference fringe on the focal plane. The advantages of the PAM are: high angular sensitivity; long-term stability, making angle monitoring over long time periods possible; high measurement accuracy, of the order of sub-microradians; simultaneous measurement in two perpendicular directions or on two different objects; the possibility of dynamic measurement; insensitivity to vibration and air turbulence; automatic display, storage and analysis by computer; a small beam diameter that makes alignment extremely easy; and a longer test distance. Some test examples are presented.

  6. High-precision measurement of phenylalanine δ15N values for environmental samples: a new approach coupling high-pressure liquid chromatography purification and elemental analyzer isotope ratio mass spectrometry.

    PubMed

    Broek, Taylor A B; Walker, Brett D; Andreasen, Dyke H; McCarthy, Matthew D

    2013-11-15

    Compound-specific isotope analysis of individual amino acids (CSI-AA) is a powerful new tool for tracing nitrogen (N) source and transformation in biogeochemical cycles. Specifically, the δ(15)N value of phenylalanine (δ(15)N(Phe)) represents an increasingly used proxy for source δ(15)N signatures, with particular promise for paleoceanographic applications. However, current derivatization/gas chromatography methods require expensive and relatively uncommon instrumentation, and have relatively low precision, making many potential applications impractical. A new offline approach has been developed for high-precision δ(15)N measurements of amino acids (δ(15)N(AA)), optimized for δ(15)N(Phe) values. Amino acids (AAs) are first purified via high-pressure liquid chromatography (HPLC), using a mixed-phase column and automated fraction collection. The δ(15)N values are determined via offline elemental analyzer-isotope ratio mass spectrometry (EA-IRMS). The combined HPLC/EA-IRMS method separated most protein AAs with sufficient resolution to obtain accurate δ(15)N values, despite significant intra-peak isotopic fractionation. For δ(15)N(Phe) values, the precision was ±0.16‰ for standards, 4× better than gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS; ±0.64‰). We also compared a δ(15)N(Phe) paleo-record from a deep-sea bamboo coral from Monterey Bay, CA, USA, using our method versus GC/C/IRMS. The two methods produced equivalent δ(15)N(Phe) values within error; however, the δ(15)N(Phe) values from HPLC/EA-IRMS had approximately twice the precision of GC/C/IRMS (average stdev of 0.27‰ ± 0.14‰ vs 0.60‰ ± 0.20‰, respectively). These results demonstrate that offline HPLC represents a viable alternative to traditional GC/C/IRMS for δ(15)N(AA) measurement. HPLC/EA-IRMS is more precise and widely available, and therefore useful in applications requiring increased precision for data interpretation (e.g. δ(15)N paleoproxies). Copyright © 2013 John Wiley & Sons, Ltd.

  7. CCD centroiding analysis for Nano-JASMINE observation data

    NASA Astrophysics Data System (ADS)

    Niwa, Yoshito; Yano, Taihei; Araki, Hiroshi; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki; Tazawa, Seiichi; Hanada, Hideo

    2010-07-01

    Nano-JASMINE is a very small satellite mission for global space astrometry with milli-arcsecond accuracy, which will be launched in 2011. In this mission, the centroids of stars in CCD image frames are estimated with sub-pixel accuracy. In order to realize such high-precision centroiding, an algorithm utilizing a least-squares method is employed. One of its advantages is that centroids can be calculated without an explicit assumption about the point spread functions of the stars. A CCD centroiding experiment has been performed to investigate whether this data analysis approach is viable, and the centroids of artificial star images on a CCD were determined with a precision of better than 0.001 pixel. This result indicates that the parallaxes of stars within 300 pc of the Sun can be observed with Nano-JASMINE.
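
    A simplified, moment-based illustration of sub-pixel centroiding on a synthetic CCD stamp; the actual Nano-JASMINE estimator is a least-squares method and is not reproduced here:

      import numpy as np

      # Synthetic 16x16 stamp: a Gaussian star plus read noise.
      rng = np.random.default_rng(1)
      y, x = np.mgrid[0:16, 0:16]
      true_x, true_y = 7.32, 8.61
      star = 1000.0 * np.exp(-((x - true_x) ** 2 + (y - true_y) ** 2) / (2 * 1.5 ** 2))
      frame = star + rng.normal(0.0, 2.0, star.shape)

      # Intensity-weighted (first-moment) centroid estimate.
      flux = frame.sum()
      cx = (frame * x).sum() / flux
      cy = (frame * y).sum() / flux
      print(f"estimated centroid: ({cx:.3f}, {cy:.3f}) vs true ({true_x}, {true_y})")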

  8. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    PubMed Central

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  9. Oxygen isotope analysis of shark teeth phosphates from Bartonian (Eocene) deposits in Mangyshlak peninsula, Kazakhstan

    NASA Astrophysics Data System (ADS)

    Pelc, Andrzej; Hałas, Stanisław; Niedźwiedzki, Robert

    2011-01-01

    We report the results of high-precision (±0.05‰) oxygen isotope analysis of phosphates in 6 teeth of fossil sharks from the Mangyshlak peninsula. This precision was achieved by the offline preparation of CO2, which was then analyzed on a dual-inlet and triple-collector IRMS. The teeth samples were separated from Middle and Late Bartonian sediments cropping out at two locations, Usak and Kuilus. Seawater temperatures calculated from the δ18O data vary from 23 to 41°C. However, these temperatures are probably overestimated due to freshwater inflow. The data point to higher temperatures in the Late Bartonian than in the Middle Bartonian and suggest differences in the depth habitats of the shark species studied.
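
    As a worked illustration of how such temperatures are derived, the sketch below applies one commonly cited phosphate-water oxygen isotope calibration to hypothetical δ18O values; the paper's exact equation and any freshwater correction are not reproduced:

      def phosphate_temperature_c(delta18o_phosphate, delta18o_water=0.0):
          """Seawater temperature from phosphate d18O using one commonly cited
          phosphate-water calibration, T = 111.4 - 4.3*(d18Op - d18Ow).
          Both the calibration choice and the assumed water value are illustrative."""
          return 111.4 - 4.3 * (delta18o_phosphate - delta18o_water)

      # Hypothetical tooth enamel phosphate values (per mil, VSMOW).
      for d18op in (19.0, 20.5, 22.0):
          print(d18op, "->", round(phosphate_temperature_c(d18op), 1), "degC")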

  10. Assessment of Gamma-Ray Spectra Analysis Method Utilizing the Fireworks Algorithm for various Error Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alamaniotis, Miltiadis; Tsoukalas, Lefteri H.

    2018-01-01

    The analysis of acquired data, and the inference from it of the presence or absence of special nuclear materials, plays a significant role in enhancing nuclear nonproliferation. Among various types of measurements, gamma-ray spectra are the most widely used type of data for analysis in nonproliferation. In this chapter, a method that employs the fireworks algorithm (FWA) for analyzing gamma-ray spectra with the aim of detecting gamma signatures is presented. In particular, FWA is utilized to fit a set of known signatures to a measured spectrum by optimizing an objective function, with non-zero coefficients expressing the detected signatures. FWA is tested on a set of experimentally obtained measurements and various objective functions (MSE, RMSE, Theil-2, MAE, MAPE, MAP), with results exhibiting its potential in providing high accuracy and high precision of detected signatures. Furthermore, FWA is benchmarked against genetic algorithms and multiple linear regression, with results exhibiting its superiority over the other tested algorithms with respect to precision for the MAE, MAPE and MAP measures.
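
    A sketch of the spectrum-fitting objective that an optimizer such as FWA would minimize, using synthetic signatures and a synthetic spectrum; the FWA search itself is not implemented here:

      import numpy as np

      # A measured gamma-ray spectrum is modelled as a non-negative linear
      # combination of known signatures; the misfit is scored with one of
      # several error measures.  All data here are synthetic.
      rng = np.random.default_rng(2)
      channels = 512
      signatures = np.abs(rng.normal(size=(3, channels)))        # known library
      true_coeffs = np.array([0.7, 0.0, 1.3])                    # zero => absent source
      measured = true_coeffs @ signatures + 0.01 * rng.normal(size=channels)

      def objective(coeffs, measure="RMSE"):
          model = np.clip(coeffs, 0.0, None) @ signatures        # enforce non-negativity
          resid = measured - model
          if measure == "MSE":
              return float(np.mean(resid ** 2))
          if measure == "RMSE":
              return float(np.sqrt(np.mean(resid ** 2)))
          if measure == "MAE":
              return float(np.mean(np.abs(resid)))
          raise ValueError(measure)

      print(objective(np.array([0.7, 0.0, 1.3])), objective(np.array([1.0, 1.0, 1.0])))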

  11. The use of secondary ion mass spectrometry in forensic analyses of ultra-small samples

    NASA Astrophysics Data System (ADS)

    Cliff, John

    2010-05-01

    It is becoming increasingly important in forensic science to perform chemical and isotopic analyses on very small sample sizes. Moreover, in some instances the signature of interest may be incorporated in a vast background, making analyses impossible by bulk methods. Recent advances in instrumentation make secondary ion mass spectrometry (SIMS) a powerful tool to apply to these problems. As an introduction, we present three types of forensic analyses in which SIMS may be useful. The causal organism of anthrax (Bacillus anthracis) chelates Ca and other metals during spore formation. Thus, the spores contain a trace element signature related to the growth medium that produced the organisms. Although other techniques have been shown to be useful in analyzing these signatures, the sample size requirements are generally relatively large. We have shown that time of flight SIMS (TOF-SIMS), combined with multivariate analysis, can clearly separate Bacillus sp. cultures prepared in different growth media using analytical spot sizes containing approximately one nanogram of spores. An important emerging field in forensic analysis is that of provenance of fecal pollution. The strategy of choice for these analyses, developing host-specific nucleic acid probes, has met with considerable difficulty due to lack of specificity of the probes. One potentially fruitful strategy is to combine in situ nucleic acid probing with high precision isotopic analyses. Bulk analyses of human and bovine fecal bacteria, for example, indicate a relative difference in δ13C content of about 4 per mil. We have shown that sample sizes of several nanograms can be analyzed with the IMS 1280 with precisions capable of separating two per mil differences in δ13C. The NanoSIMS 50 is capable of much better spatial resolution than the IMS 1280, albeit at a cost of analytical precision. Nevertheless, we have documented precision capable of separating five per mil differences in δ13C using analytical spots containing less than 300 picograms of bacteria. Perhaps the most successful application of SIMS for forensic purposes to date is in the field of nuclear forensics. An example that has been used by laboratories associated with the International Atomic Energy Agency is the examination of environmental samples for enriched uranium particles indicative of clandestine weapons production activities. The analytical challenge in these types of measurements is to search complex environmental matrices for U-bearing particles, which must then be analyzed for 234U, 235U, and 236U content with high precision and accuracy. Older-generation SIMS instruments were hampered by small geometries that made resolution of significant interferences problematic. In addition, automated particle search software was proprietary and difficult to obtain. With the development of new search software, the IMS 1280 is capable of searching a sample in a matter of hours, flagging U-bearing particles for later analyses, and providing a rough 235U content. Particles of interest can be revisited for high precision analyses, and all U isotopes can be measured simultaneously in multicollector mode, dramatically improving analysis time and internal precision. Further, the large geometry of the instrument allows complete resolution of isobaric interferences that have traditionally limited SIMS analyses of difficult samples. Examples of analyses of micron-sized standard particles indicate that estimates of 235U enrichment can be obtained with an external relative precision of 0.1%, and 234U and 236U contents can be obtained with a relative precision of less than 1%. Analyses of 'real' samples show a dramatic improvement in the data quality obtained compared with small-geometry SIMS instruments, making SIMS the method of choice for these high-profile samples when accurate, precise, and rapid results are required.
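
    As a small worked example of the delta notation behind the quoted per-mil differences, with hypothetical isotope ratios:

      # Delta notation: per-mil deviation of the sample 13C/12C ratio from a
      # reference standard (VPDB).  The measured ratios below are invented.
      R_VPDB = 0.0112372          # widely quoted 13C/12C of the VPDB standard

      def delta13C(r_sample, r_standard=R_VPDB):
          return (r_sample / r_standard - 1.0) * 1000.0

      r_human_feces  = 0.011190   # hypothetical measured ratio
      r_bovine_feces = 0.011145   # hypothetical measured ratio
      print(round(delta13C(r_human_feces), 2), "per mil")
      print(round(delta13C(r_bovine_feces), 2), "per mil")
      print("difference:",
            round(delta13C(r_human_feces) - delta13C(r_bovine_feces), 2), "per mil")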

  12. Development and validity of an instrumented handbike: initial results of propulsion kinetics.

    PubMed

    van Drongelen, Stefan; van den Berg, Jos; Arnet, Ursina; Veeger, Dirkjan H E J; van der Woude, Lucas H V

    2011-11-01

    To develop an instrumented handbike system to measure the forces applied to the handgrip during handbiking. A 6-degrees-of-freedom force sensor was built into the handgrip of an attach-unit handbike, together with two optical encoders to measure the orientation of the handgrip and crank in space. Linearity, precision, and percent error were determined for static and dynamic tests. High linearity was demonstrated for both the static and the dynamic condition (r=1.01). Precision was high under the static condition (standard deviation of 0.2N); however, the precision decreased with higher loads during the dynamic condition. Percent error values were between 0.3 and 5.1%. This is the first instrumented handbike system that can register 3-dimensional forces. It can be concluded that the instrumented handbike system allows for an accurate force analysis based on the forces registered at the handlebars. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.

  13. Tracking Perfluorocarbon Nanoemulsion Delivery by 19F MRI for Precise High Intensity Focused Ultrasound Tumor Ablation

    PubMed Central

    Shin, Soo Hyun; Park, Eun-Joo; Min, Changki; Choi, Sun Il; Jeon, Soyeon; Kim, Yun-Hee; Kim, Daehong

    2017-01-01

    Perfluorocarbon nanoemulsions (PFCNEs) have recently been undergoing rigorous study to investigate their ability to improve the therapeutic efficacy of tumor ablation by high intensity focused ultrasound (HIFU). For precise control of PFCNE delivery and thermal ablation, their accumulation and distribution in a tumor should be quantitatively analyzed. Here, we used fluorine-19 (19F) magnetic resonance imaging (MRI) to quantitatively track PFCNE accumulation in a tumor, and analyzed how intra-tumoral PFCNE quantities affect the therapeutic efficacy of HIFU treatment. Ablation outcomes were assessed by intra-voxel incoherent motion analysis and bioluminescent imaging up to 14 days after the procedure. Assessment of PFCNE delivery and treatment outcomes showed that 2-3 mg/mL of PFCNE in a tumor produces the largest ablation volume under the same HIFU insonation conditions. Histology showed varying degrees of necrosis depending on the amount of PFCNE delivered. 19F MRI promises to be a valuable platform for precisely guiding PFCNE-enhanced HIFU ablation of tumors. PMID:28255351

  14. Observing exoplanet populations with high-precision astrometry

    NASA Astrophysics Data System (ADS)

    Sahlmann, Johannes

    2012-06-01

    This thesis deals with the application of the astrometry technique, which consists in measuring the position of a star in the plane of the sky, to the discovery and characterisation of extra-solar planets. It is feasible only with very high measurement precision, which motivates the use of space observatories, the development of new ground-based astronomical instrumentation and of innovative data analysis methods: The study of Sun-like stars with substellar companions using CORALIE radial velocities and HIPPARCOS astrometry leads to the determination of the frequency of close brown dwarf companions and to the discovery of a dividing line between massive planets and brown dwarf companions; An observation campaign employing optical imaging with a very large telescope demonstrates sufficient astrometric precision to detect planets around ultra-cool dwarf stars, and the first results of the survey are presented; Finally, the design and initial astrometric performance of PRIMA, a new dual-feed near-infrared interferometric observing facility for relative astrometry, is presented.

  15. Workshop on Pion-Kaon Interactions (PKI2018) Mini-Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amaryan, M; Pal, Bilas

    This volume is a short summary of talks given at the PKI2018 Workshop, organized to discuss the current status and future prospects of pi-K interactions. Precise data on the pi-K interaction will have a strong impact on strange meson spectroscopy and on form factors that are important ingredients in the Dalitz plot analysis of decays of heavy mesons, as well as on the precision measurement of the Vus matrix element and therefore on a test of unitarity in the first row of the CKM matrix. The workshop combined the efforts of the experimental, Lattice QCD, and phenomenology communities. Experimental data relevant to the topic of the workshop were presented from a broad range of collaborations, including CLAS, GlueX, COMPASS, BaBar, BELLE, BESIII, VEPP-2000, and LHCb. One of the main goals of this workshop was to outline the need for a new high-intensity and high-precision secondary KL beam facility at JLab, produced with the 12 GeV electron beam of the CEBAF accelerator.

  16. Quantitative analysis of pork and chicken products by droplet digital PCR.

    PubMed

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by its amplification efficiency and its reliance on standard curves based on Ct values when detecting and quantifying low-copy-number target DNA, as in some complex mixed meat products. Using the dPCR method, we found that the relationships between the raw meat weight and the DNA weight, and between the DNA weight and the DNA copy number, were both close to linear. This enabled us to establish formulae to calculate the raw meat weight from the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises.
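
    A minimal sketch of the two linear relationships described above (copy number to DNA weight to raw meat weight), with hypothetical calibration constants standing in for the fitted values:

      # Slopes below are placeholders; in practice they would be fitted from
      # calibration samples of known composition.
      def dna_weight_ng(copy_number, copies_per_ng=320.0):
          return copy_number / copies_per_ng

      def meat_weight_mg(dna_ng, ng_dna_per_mg_meat=25.0):
          return dna_ng / ng_dna_per_mg_meat

      pork_copies, chicken_copies = 48000, 12500      # dPCR counts, illustrative
      pork_mg = meat_weight_mg(dna_weight_ng(pork_copies))
      chicken_mg = meat_weight_mg(dna_weight_ng(chicken_copies))
      total = pork_mg + chicken_mg
      print(f"pork {pork_mg:.1f} mg ({100*pork_mg/total:.0f}%), chicken {chicken_mg:.1f} mg")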

  17. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis.

    PubMed

    Larsen, K K; Wielandt, D; Schiller, M; Bizzarro, M

    2016-04-22

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of these species can result in incomplete Cr recovery during chromatographic purification. Because of large mass-dependent inter-species isotope fractionation, incomplete recovery can affect the accuracy of high-precision Cr isotope analysis. Here, we demonstrate widely differing cation distribution coefficients of Cr(III)-species (Cr(3+), CrCl(2+) and CrCl2(+)) with equilibrium mass-dependent isotope fractionation spanning a range of ∼1‰/amu and consistent with theory. The heaviest isotopes partition into Cr(3+), intermediates in CrCl(2+) and the lightest in CrCl2(+)/CrCl3°. Thus, for a typical reported loss of ∼25% Cr (in the form of Cr(3+)) through chromatographic purification, this translates into 185 ppm/amu offset in the stable Cr isotope ratio of the residual sample. Depending on the validity of the mass-bias correction during isotope analysis, this further results in artificial mass-independent effects in the mass-bias corrected (53)Cr/(52)Cr (μ(53)Cr* of 5.2 ppm) and (54)Cr/(52)Cr (μ(54)Cr* of 13.5 ppm) components used to infer chronometric and nucleosynthetic information in meteorites. To mitigate these fractionation effects, we developed strategic chemical sample pre-treatment procedures that ensure high and reproducible Cr recovery. This is achieved either through 1) effective promotion of Cr(3+) by >5 days exposure to HNO3-H2O2 solutions at room temperature, resulting in >∼98% Cr recovery for most types of sample matrices tested using a cationic chromatographic retention strategy, or 2) formation of Cr(III)-Cl complexes through exposure to concentrated HCl at high temperature (>120 °C) for several hours, resulting in >97.5% Cr recovery using a chromatographic elution strategy that takes advantage of the slow reaction kinetics of de-chlorination of Cr in dilute HCl at room temperature. These procedures significantly improve cation chromatographic purification of Cr over previous methods and allow for high-purity Cr isotope analysis with a total recovery of >95%. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis

    PubMed Central

    Larsen, K.K.; Wielandt, D.; Schiller, M.; Bizzarro, M.

    2016-01-01

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of these species can result in incomplete Cr recovery during chromatographic purification. Because of large mass-dependent inter-species isotope fractionation, incomplete recovery can affect the accuracy of high-precision Cr isotope analysis. Here, we demonstrate widely differing cation distribution coefficients of Cr(III)-species (Cr(3+), CrCl(2+) and CrCl2(+)) with equilibrium mass-dependent isotope fractionation spanning a range of ~1‰/amu and consistent with theory. The heaviest isotopes partition into Cr(3+), intermediates in CrCl(2+) and the lightest in CrCl2(+)/CrCl3°. Thus, for a typical reported loss of ~25% Cr (in the form of Cr(3+)) through chromatographic purification, this translates into 185 ppm/amu offset in the stable Cr isotope ratio of the residual sample. Depending on the validity of the mass-bias correction during isotope analysis, this further results in artificial mass-independent effects in the mass-bias corrected 53Cr/52Cr (μ53Cr* of 5.2 ppm) and 54Cr/52Cr (μ54Cr* of 13.5 ppm) components used to infer chronometric and nucleosynthetic information in meteorites. To mitigate these fractionation effects, we developed strategic chemical sample pre-treatment procedures that ensure high and reproducible Cr recovery. This is achieved either through 1) effective promotion of Cr(3+) by >5 days exposure to HNO3-H2O2 solutions at room temperature, resulting in >~98% Cr recovery for most types of sample matrices tested using a cationic chromatographic retention strategy, or 2) formation of Cr(III)-Cl complexes through exposure to concentrated HCl at high temperature (>120 °C) for several hours, resulting in >97.5% Cr recovery using a chromatographic elution strategy that takes advantage of the slow reaction kinetics of de-chlorination of Cr in dilute HCl at room temperature. These procedures significantly improve cation chromatographic purification of Cr over previous methods and allow for high-purity Cr isotope analysis with a total recovery of >95%. PMID:27036208

  19. Validation of the sperm class analyser CASA system for sperm counting in a busy diagnostic semen analysis laboratory.

    PubMed

    Dearing, Chey G; Kilburn, Sally; Lindsay, Kevin S

    2014-03-01

    Sperm counts have been linked to several fertility outcomes, making them an essential parameter of semen analysis. It has become increasingly recognised that Computer-Assisted Semen Analysis (CASA) provides improved precision over manual methods, but systems are seldom validated robustly for use. The objective of this study was to gather the evidence to validate or reject the Sperm Class Analyser (SCA) as a tool for routine sperm counting in a busy laboratory setting. The criteria examined were comparison with the Improved Neubauer and Leja 20-μm chambers, within- and between-field precision, sperm concentration linearity from a stock diluted in semen and media, accuracy against internal and external quality material, assessment of uneven flow effects, and a receiver operating characteristic (ROC) analysis to predict fertility in comparison with the Neubauer method. This work demonstrates that SCA CASA technology is not a standalone 'black box', but rather a tool for well-trained staff that allows rapid, high-number sperm counting, provided errors are identified and corrected. The system produces accurate, linear, precise results, with less analytical variance than manual methods, that correlate well with the Improved Neubauer chamber. The system provides superior predictive potential for diagnosing fertility problems.

  20. A systematic approach to determining the properties of an iodine absorption cell for high-precision radial velocity measurements

    NASA Astrophysics Data System (ADS)

    Perdelwitz, V.; Huke, P.

    2018-06-01

    Absorption cells filled with diatomic iodine are frequently employed as a wavelength reference for high-precision stellar radial velocity determination due to their long-term stability and low cost. Despite their widespread usage in the community, there is little documentation on how to determine the ideal operating temperature of an individual cell. We have developed a new approach to measuring the effective molecular temperature inside a gas absorption cell and to searching for effects detrimental to a high-precision wavelength reference, utilizing the Boltzmann distribution of relative line depths within absorption bands of single vibrational transitions. With a high-resolution Fourier transform spectrometer, we took a series of 632 spectra at temperatures between 23 °C and 66 °C. These spectra provide a sufficient basis to test the algorithm and demonstrate the stability and repeatability of the temperature determination via molecular lines on a single iodine absorption cell. The achievable radial velocity precision σRV is found to be independent of the cell temperature, and a detailed analysis shows a wavelength dependency, which originates in the resolving power of the spectrometer in use and the signal-to-noise ratio. Two effects were found to cause apparent absolute shifts in radial velocity: a temperature-induced shift of the order of ~1 m s-1 K-1, and a more significant effect resulting in abrupt jumps of ≥50 m s-1, which is determined to be caused by the temperature crossing the dew point of the molecular iodine.
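
    A hedged sketch of the Boltzmann line-depth method described above, using an approximate iodine rotational constant and synthetic line depths rather than the measured spectra:

      import numpy as np

      # Within one vibrational band the relative line depth of a rotational line
      # scales roughly as (2J+1)*exp(-B*J*(J+1)*h*c/(k*T)), so a straight-line fit
      # of ln(depth/(2J+1)) against J*(J+1) yields the molecular temperature.
      h, c, k = 6.626e-34, 2.998e10, 1.381e-23   # SI units, with c in cm/s so B stays in cm^-1
      B = 0.0374                                 # approximate I2 ground-state value, cm^-1

      J = np.arange(10, 60, 5)
      T_true = 312.0                             # K, used only to synthesize "data"
      depth = (2 * J + 1) * np.exp(-B * J * (J + 1) * h * c / (k * T_true))
      depth *= 1.0 + 0.01 * np.random.default_rng(3).standard_normal(J.size)

      slope, _ = np.polyfit(J * (J + 1), np.log(depth / (2 * J + 1)), 1)
      T_est = -B * h * c / (k * slope)
      print(f"estimated temperature: {T_est:.1f} K (true {T_true} K)")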

  1. Structural Dynamics Analysis and Research for FEA Modeling Method of a Light High Resolution CCD Camera

    NASA Astrophysics Data System (ADS)

    Sun, Jiwen; Wei, Ling; Fu, Danying

    2002-01-01

    The camera is characterized by high resolution and wide swath. In order to ensure that its high optical precision survives the rigorous dynamic loads of launch, it must have high structural rigidity; therefore, a careful study of the dynamic behaviour of the camera structure is required. A precise CAD model of the camera is built in Pro/E, and an interference examination is performed on it to refine the structural design, for the first time in China, and the structural dynamic analysis of the camera is accomplished by applying the structural analysis codes PATRAN and NASTRAN. The main research items include: 1) a comparative modal analysis of the critical structure of the camera using 4-node and 10-node tetrahedral elements, respectively, in order to confirm the most reasonable general model; 2) modal analyses of the camera for several cases, from which the natural frequencies and mode shapes are obtained and the rationality of the structural design of the camera is confirmed; 3) static analysis of the camera under self-gravity and overloads, yielding the corresponding deformation and stress distributions; and 4) the sine-vibration response calculation of the camera, giving the corresponding response curves and the maximum acceleration responses with their corresponding frequencies. The modeling and software technique proves accurate and efficient. Based on the analysis and sensitivity results, the dynamic design and engineering optimization of the critical structure of the camera are discussed, providing a fundamental technology for the design of forthcoming space optical instruments.

  2. Reverse phase HPLC method for detection and quantification of lupin seed γ-conglutin.

    PubMed

    Mane, Sharmilee; Bringans, Scott; Johnson, Stuart; Pareek, Vishnu; Utikar, Ranjeet

    2017-09-15

    A simple, selective and accurate reverse phase HPLC method was developed for the detection and quantitation of γ-conglutin from lupin seed extract. A linear gradient of water and acetonitrile containing trifluoroacetic acid (TFA) on a reverse phase column (Agilent Zorbax 300SB C-18), with a flow rate of 0.8 ml/min, was able to produce a sharp and symmetric peak of γ-conglutin with a retention time of 29.16 min. The identity of γ-conglutin in the peak was confirmed by mass spectrometry (MS/MS identification) and sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE) analysis. The data obtained from the MS/MS analysis were matched against the specified database to obtain the exact match for the protein of interest. The proposed method was validated in terms of specificity, linearity, sensitivity, precision, recovery and accuracy. The analytical parameters revealed that the validated method was capable of selectively performing a good chromatographic separation of γ-conglutin from the lupin seed extract with no interference from the matrix. The detection and quantitation limits of γ-conglutin were found to be 2.68 μg/ml and 8.12 μg/ml, respectively. The accuracy (precision and recovery) analysis of the method was conducted under repeatable conditions on different days. Intra-day and inter-day precision values of less than 0.5% and recovery greater than 97% indicated the high precision and accuracy of the method for the analysis of γ-conglutin. The method validation findings were reproducible, and the method can be successfully applied for routine analysis of γ-conglutin from lupin seed extract. Copyright © 2017 Elsevier B.V. All rights reserved.
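
    For context, the ICH-style calculation commonly used for such detection and quantitation limits can be sketched as follows, with hypothetical calibration data rather than the paper's measurements:

      import numpy as np

      # LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope of the
      # calibration line and sigma the standard deviation of the residuals.
      conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # ug/mL standards
      peak_area = np.array([12.1, 24.5, 48.3, 97.9, 195.2]) # detector response

      slope, intercept = np.polyfit(conc, peak_area, 1)
      residuals = peak_area - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)                          # 2 fitted parameters

      lod = 3.3 * sigma / slope
      loq = 10.0 * sigma / slope
      print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL")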

  3. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis.

    PubMed

    Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P

    2015-07-01

    Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method suitable for the analysis of both chloride and sodium in small volumes of sweat. Precision, linearity and limit of detection of the in-house developed IC/HPLC method were established. A method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing-Bablok method comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfill the criteria established by UK guidelines and are comparable with inductively coupled plasma-mass spectrometry methods. Passing-Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values, for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, our method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test compared with a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurement of sodium concentrations will result in better detection of false positives, less test repeating and thus faster, more accurate and more effective diagnosis. The described IC/HPLC method, therefore, provides a precise, relatively cheap and easy-to-handle application for the analysis of both chloride and sodium in sweat. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  4. The double-edged sword of high-precision U-Pb geochronology or be careful what you wish for. (Invited)

    NASA Astrophysics Data System (ADS)

    Bowring, S. A.

    2010-12-01

    Over the past two decades, U-Pb geochronology by ID-TIMS has been refined to achieve internal (analytical) uncertainties on a single grain analysis of ± ~0.1-0.2%, and 0.05% or better on weighted mean dates. This level of precision enables unprecedented evaluation of the rates and durations of geological processes, from magma chamber evolution to mass extinctions and recoveries. The increased precision, however, exposes complexity in magmatic/volcanic systems, highlights the importance of corrections related to disequilibrium partitioning of intermediate daughter products, and raises questions as to how best to interpret the complex spectrum of dates characteristic of many volcanic rocks. In addition, the increased precision requires renewed emphasis on the accuracy of U decay constants, the isotopic composition of U, the calibration of isotopic tracers, and the accurate propagation of uncertainties. It is now commonplace in the high-precision dating of volcanic ash-beds to analyze 5-20 single grains of zircon in an attempt to resolve the eruption/depositional age. Data sets with dispersion far in excess of analytical uncertainties are interpreted to reflect Pb-loss, inheritance, and protracted crystallization, often supported with zircon chemistry. In most cases, a weighted mean of the youngest reproducible dates is interpreted as the time of eruption/deposition. Crystallization histories of silicic magmatic systems recovered from plutonic rocks may also be protracted, though may not be directly applicable to silicic eruptions; each sample must be evaluated independently. A key to robust interpretations is the integration of high-spatial-resolution zircon trace element geochemistry with high-precision ID-TIMS analyses. The EARTHTIME initiative has focused on many of these issues, and on the larger subject of constructing a timeline for Earth history using both U-Pb and Ar-Ar chronometers. Despite continuing improvements in both, comparing dates for the same rock with both chronometers is not straightforward. Compelling issues include pre-eruptive magma chamber residence, recognizing open-system behavior, accurately correcting for disequilibrium amounts of 230Th and 231Pa, precise and accurate dates of fluence monitors for 40Ar/39Ar, and inter-laboratory biases. At present, despite the level of internal precision achievable by each technique, obstacles remain to combining both chronometers.
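
    A short worked example of the weighted-mean date and MSWD calculation referred to above, with invented single-grain dates and uncertainties:

      import numpy as np

      dates_ma = np.array([252.10, 252.07, 252.12, 252.09, 252.08])   # Ma, invented
      two_sigma = np.array([0.12, 0.10, 0.14, 0.11, 0.10])            # Ma, invented
      sigma = two_sigma / 2.0

      w = 1.0 / sigma ** 2
      mean = np.sum(w * dates_ma) / np.sum(w)
      mean_sigma = np.sqrt(1.0 / np.sum(w))
      mswd = np.sum(((dates_ma - mean) / sigma) ** 2) / (dates_ma.size - 1)
      print(f"weighted mean = {mean:.3f} +/- {2*mean_sigma:.3f} Ma (2s), MSWD = {mswd:.2f}")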

  5. Quantitative morphometrical characterization of human pronuclear zygotes.

    PubMed

    Beuchat, A; Thévenaz, P; Unser, M; Ebner, T; Senn, A; Urner, F; Germond, M; Sorzano, C O S

    2008-09-01

    Identification of embryos with high implantation potential remains a challenge in in vitro fertilization (IVF). Subjective pronuclear (PN) zygote scoring systems have been developed for that purpose. The aim of this work was to provide a software tool that enables objective measuring of morphological characteristics of the human PN zygote. A computer program was created to analyse zygote images semi-automatically, providing precise morphological measurements. The accuracy of this approach was first validated by comparing zygotes from two different IVF centres with computer-assisted measurements or subjective scoring. Computer-assisted measurement and subjective scoring were then compared for their ability to classify zygotes with high and low implantation probability by using a linear discriminant analysis. Zygote images coming from the two IVF centres were analysed with the software, resulting in a series of precise measurements of 24 variables. Using subjective scoring, the cytoplasmic halo was the only feature which was significantly different between the two IVF centres. Computer-assisted measurements revealed significant differences between centres in PN centring, PN proximity, cytoplasmic halo and features related to nucleolar precursor bodies distribution. The zygote classification error achieved with the computer-assisted measurements (0.363) was slightly inferior to that of the subjective ones (0.393). A precise and objective characterization of the morphology of human PN zygotes can be achieved by the use of an advanced image analysis tool. This computer-assisted analysis allows for a better morphological characterization of human zygotes and can be used for classification.
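
    A generic sketch of the linear discriminant analysis step, with random placeholder features standing in for the measured morphometric variables:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # Morphometric variables per zygote are used to separate high- and
      # low-implantation groups; the classification error is estimated by
      # cross-validation.  The feature matrix here is random noise.
      rng = np.random.default_rng(5)
      X = rng.normal(size=(120, 24))            # 120 zygotes x 24 features
      y = rng.integers(0, 2, size=120)          # implanted (1) vs not implanted (0)

      lda = LinearDiscriminantAnalysis()
      accuracy = cross_val_score(lda, X, y, cv=5).mean()
      print(f"cross-validated classification error ~ {1 - accuracy:.3f}")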

  6. Evaluation of laser diode thermal desorption-tandem mass spectrometry (LDTD-MS-MS) in forensic toxicology.

    PubMed

    Bynum, Nichole D; Moore, Katherine N; Grabenauer, Megan

    2014-10-01

    Many forensic laboratories experience backlogs due to increased drug-related cases. Laser diode thermal desorption (LDTD) has demonstrated its applicability in other scientific areas by providing data comparable with instrumentation such as liquid chromatography-tandem mass spectrometry, in less time. LDTD-MS-MS was used to validate 48 compounds in drug-free human urine and blood for screening or quantitative analysis. Carryover, interference, limit of detection, limit of quantitation, matrix effect, linearity, precision and accuracy, and stability were evaluated. Quantitative analysis indicated that LDTD-MS-MS produced precise and accurate results, with the average overall within-run precision in urine and blood represented by a %CV <14.0 and <7.0, respectively. The accuracy for all drugs ranged from 88.9 to 104.5% in urine and from 91.9 to 107.1% in blood. Overall, LDTD has the potential for use in forensic toxicology, but before it can be successfully implemented, some challenges must be addressed. Although the advantages of the LDTD system include minimal maintenance and rapid analysis (∼10 s per sample), which make it ideal for high-throughput forensic laboratories, a major disadvantage is its inability or difficulty in analyzing isomers and isobars, due to the lack of chromatography, without the use of high-resolution MS; therefore, it would be best implemented as a screening technique. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Discovery of 100K SNP array and its utilization in sugarcane

    USDA-ARS?s Scientific Manuscript database

    Next generation sequencing (NGS) enables us to identify thousands of single nucleotide polymorphism (SNP) markers for genotyping and fingerprinting. However, the process requires very precise bioinformatics analysis and filtering. High throughput SNP array with predefined genomic location co...

  8. Pikalert(R) System Vehicle Data Translator (VDT) Utilizing Integrated Mobile Observations Pikalert VDT Enhancements, Operations, & Maintenance

    DOT National Transportation Integrated Search

    2017-03-24

    The Pikalert System provides high precision road weather guidance. It assesses current weather and road conditions based on observations from connected vehicles, road weather information stations, radar, and weather model analysis fields. It also for...

  9. Qualitative computer aided evaluation of dental impressions in vivo.

    PubMed

    Luthardt, Ralph G; Koch, Rainer; Rudolph, Heike; Walter, Michael H

    2006-01-01

    Clinical investigations dealing with the precision of different impression techniques are rare. The objective of the present study was to develop and evaluate a procedure for the qualitative analysis of three-dimensional impression precision based on an established in-vitro procedure. The null hypothesis to be tested was that the precision of impressions does not differ depending on the impression technique used (single-step, monophase and two-step techniques) and on clinical variables. Digital surface data of patients' teeth prepared for crowns were gathered from standardized manufactured master casts after impressions with three different techniques were taken in a randomized order. Data sets were analyzed for each patient in comparison with the one-step impression chosen as the reference. The qualitative analysis was limited to data points within the 99.5% range. Based on the color-coded representation, areas with maximum deviations were determined (preparation margin and the mantle and occlusal surfaces). To qualitatively analyze the precision of the impression techniques, the hypothesis was tested in linear models for repeated-measures factors (p < 0.05). For the positive 99.5% deviations, no variables with significant influence were determined in the statistical analysis. In contrast, the impression technique and the position of the preparation margin significantly influenced the negative 99.5% deviations. The influence of clinical parameters on the deviations between impression techniques can be determined reliably using the 99.5th percentile of the deviations. An analysis of the areas with maximum deviations showed high clinical relevance. The preparation margin was identified as the weak spot of impression taking.

  10. Mechanism and experimental research on ultra-precision grinding of ferrite

    NASA Astrophysics Data System (ADS)

    Ban, Xinxing; Zhao, Huiying; Dong, Longchao; Zhu, Xueliang; Zhang, Chupeng; Gu, Yawen

    2017-02-01

    Ultra-precision grinding of ferrite was conducted to investigate the material removal mechanism. The effect of the accuracy of key machine tool components on the ground surface quality is analyzed, and a surface generation model for ultra-precision grinding of ferrite is established. In order to reveal the surface formation mechanism of ferrite in the ultra-precision grinding process, the validity and accuracy of the proposed calculation model for the ground surface roughness are verified. An orthogonal experiment was designed for ferrite, a typical hard and brittle material, using a high-precision aerostatic turntable and an aerostatic spindle. Based on the experimental results, the factors influencing the ultra-precision ground surface of ferrite and their governing laws are discussed through analysis of the surface roughness. The results show that the ground surface quality of ferrite is optimal at a wheel speed of 20000 r/min, a feed rate of 10 mm/min, a grinding depth of 0.005 mm, and a turntable rotary speed of 5 r/min, for which the surface roughness Ra reaches 75 nm.
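
    As a small illustration of the quoted roughness figure, the arithmetic-mean roughness Ra can be computed from a surface profile as follows; the profile here is synthetic, scaled so that Ra comes out near 75 nm:

      import numpy as np

      # Ra is the mean absolute deviation of the profile from its mean line.
      rng = np.random.default_rng(4)
      profile_nm = 95.0 * rng.standard_normal(2000)   # measured heights along a trace

      ra = np.mean(np.abs(profile_nm - profile_nm.mean()))
      print(f"Ra = {ra:.0f} nm")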

  11. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  12. A high precision method for length-based separation of carbon nanotubes using bio-conjugation, SDS-PAGE and silver staining.

    PubMed

    Borzooeian, Zahra; Taslim, Mohammad E; Ghasemi, Omid; Rezvani, Saina; Borzooeian, Giti; Nourbakhsh, Amirhasan

    2018-01-01

    Parametric separation of carbon nanotubes, especially based on their length, is a challenge for a number of nanotech researchers. We demonstrate a method that combines bio-conjugation, SDS-PAGE, and silver staining in order to separate carbon nanotubes on the basis of length. Egg-white lysozyme was conjugated covalently onto the single-walled carbon nanotube surfaces using the carbodiimide method. The proposed conjugation of a biomolecule onto the carbon nanotube surfaces is a novel idea and a significant step forward in creating an indicator for length-based carbon nanotube separation. The conjugation step was followed by SDS-PAGE, and the nanotube fragments were precisely visualized using silver staining. This high-precision, inexpensive, rapid and simple separation method obviates the need for centrifugation, additional chemical analyses, and expensive spectroscopic techniques such as Raman spectroscopy to visualize carbon nanotube bands. In this method, we measured the length of the nanotubes using different image analysis techniques based on a simplified hydrodynamic model. The method has high precision and resolution and is effective in separating the nanotubes by length, which would make it a valuable quality control tool for the manufacture of carbon nanotubes of specific lengths in bulk quantities. To this end, we were also able to measure carbon nanotubes of different lengths produced from different sonication time intervals.

  13. Rapid measurement of human milk macronutrients in the neonatal intensive care unit: accuracy and precision of fourier transform mid-infrared spectroscopy.

    PubMed

    Smilowitz, Jennifer T; Gho, Deborah S; Mirmiran, Majid; German, J Bruce; Underwood, Mark A

    2014-05-01

    Although it is well established that human milk varies widely in macronutrient content, it remains common for human milk fortification for premature infants to be based on historic mean values. As a result, those caring for premature infants often underestimate protein intake. Rapid precise measurement of human milk protein, fat, and lactose to allow individualized fortification has been proposed for decades but remains elusive due to technical challenges. This study aimed to evaluate the accuracy and precision of a Fourier transform (FT) mid-infrared (IR) spectroscope in the neonatal intensive care unit to measure human milk fat, total protein, lactose, and calculated energy compared with standard chemical analyses. One hundred sixteen breast milk samples across lactation stages from women who delivered at term (n = 69) and preterm (n = 5) were analyzed with the FT mid-IR spectroscope and with standard chemical methods. Ten of the samples were tested in replicate using the FT mid-IR spectroscope to determine repeatability. The agreement between the FT mid-IR spectroscope analysis and reference methods was high for protein and fat and moderate for lactose and energy. The intra-assay coefficients of variation for all outcomes were less than 3%. The FT mid-IR spectroscope demonstrated high accuracy in measurement of total protein and fat of preterm and term milk with high precision.

  14. Research on precise pneumatic-electric displacement sensor with large measurement range

    NASA Astrophysics Data System (ADS)

    Yin, Zhehao; Yuan, Yibao; Liu, Baoshuai

    2017-10-01

    This research focuses on a precise pneumatic-electric displacement sensor with a large measurement range. With high precision maintained, the measurement range can be expanded so that the need for high precision as well as large range in machining inspection can be satisfied. The work starts with an analysis of the pneumatic measuring principle. A gas-circuit measuring system based on differential pressure was then designed with two aims: first, to convert the displacement signal into a gas signal; and second, to reduce the measurement error caused by pressure and environmental disturbances. Furthermore, in view of the high requirements for linearity, sensitivity and stability, a pneumatic-electric transducer built around an SCX-series pressure sensor was studied, whose main purpose is to convert the gas signal into a suitable electrical signal. Lastly, a broken-line (piecewise-linear) subsection linearization circuit was designed to correct the nonlinearity of the output characteristic curve and thereby enlarge the linear measurement range. The final result can be summarized as follows: with a measuring error of less than 1 μm, the measurement range can be extended to approximately 200 μm, which is much larger than that of a traditional pneumatic measuring instrument, while achieving better interchangeability and stability and making the sensor more suitable for engineering applications.
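
    A minimal sketch of a piecewise-linear (broken-line) correction of the kind described above, with hypothetical calibration breakpoints standing in for the circuit's fitted segments:

      import numpy as np

      # The sensor's nonlinear output is mapped back to displacement through a
      # lookup table of breakpoints, applied here with linear interpolation.
      output_v     = np.array([0.00, 0.80, 1.70, 2.70, 3.85, 5.00])   # measured voltage
      displacement = np.array([0.0, 40.0, 80.0, 120.0, 160.0, 200.0]) # micrometres

      def corrected_displacement(v):
          """Linear interpolation between calibration breakpoints."""
          return np.interp(v, output_v, displacement)

      for v in (0.5, 2.0, 4.5):
          print(f"{v:.2f} V -> {corrected_displacement(v):.1f} um")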

  15. Validation of high throughput screening of human sera for detection of anti-PA IgG by Enzyme-Linked Immunosorbent Assay (ELISA) as an emergency response to an anthrax incident

    PubMed Central

    Semenova, Vera A.; Steward-Clark, Evelene; Maniatis, Panagiotis; Epperson, Monica; Sabnis, Amit; Schiffer, Jarad

    2017-01-01

    To improve surge testing capability for a response to a release of Bacillus anthracis, the CDC anti-Protective Antigen (PA) IgG Enzyme-Linked Immunosorbent Assay (ELISA) was re-designed into a high throughput screening format. The following assay performance parameters were evaluated: goodness of fit (measured as the mean reference standard r2), accuracy (measured as percent error), precision (measured as coefficient of variance (CV)), lower limit of detection (LLOD), lower limit of quantification (LLOQ), dilutional linearity, diagnostic sensitivity (DSN) and diagnostic specificity (DSP). The paired sets of data for each sample were evaluated by Concordance Correlation Coefficient (CCC) analysis. The goodness of fit was 0.999; percent error between the expected and observed concentration for each sample ranged from −4.6% to 14.4%. The coefficient of variance ranged from 9.0% to 21.2%. The assay LLOQ was 2.6 μg/mL. The regression analysis results for dilutional linearity data were r2 = 0.952, slope = 1.02 and intercept = −0.03. CCC between assays was 0.974 for the median concentration of serum samples. The accuracy and precision components of CCC were 0.997 and 0.977, respectively. This high throughput screening assay is precise, accurate, sensitive and specific. Anti-PA IgG concentrations determined using two different assays proved high levels of agreement. The method will improve surge testing capability 18-fold from 4 to 72 sera per assay plate. PMID:27814939
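
    For context, Lin's concordance correlation coefficient and its precision and accuracy components can be computed as in the sketch below, with invented paired measurements rather than the study's data:

      import numpy as np

      x = np.array([3.1, 5.4, 8.2, 12.5, 20.3, 33.0])   # assay A, ug/mL (invented)
      y = np.array([3.3, 5.1, 8.6, 12.1, 21.0, 31.8])   # assay B, ug/mL (invented)

      mx, my = x.mean(), y.mean()
      sx2, sy2 = x.var(), y.var()                        # population variances
      sxy = np.mean((x - mx) * (y - my))

      ccc = 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)       # concordance correlation
      r = sxy / np.sqrt(sx2 * sy2)                       # precision component (Pearson r)
      cb = ccc / r                                       # accuracy (bias-correction) component
      print(f"CCC = {ccc:.3f}, precision = {r:.3f}, accuracy = {cb:.3f}")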

  16. Impulsivity modulates performance under response uncertainty in a reaching task.

    PubMed

    Tzagarakis, C; Pellizzer, G; Rogers, R D

    2013-03-01

    We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.

  17. Validation of high throughput screening of human sera for detection of anti-PA IgG by Enzyme-Linked Immunosorbent Assay (ELISA) as an emergency response to an anthrax incident.

    PubMed

    Semenova, Vera A; Steward-Clark, Evelene; Maniatis, Panagiotis; Epperson, Monica; Sabnis, Amit; Schiffer, Jarad

    2017-01-01

    To improve surge testing capability for a response to a release of Bacillus anthracis, the CDC anti-Protective Antigen (PA) IgG Enzyme-Linked Immunosorbent Assay (ELISA) was re-designed into a high throughput screening format. The following assay performance parameters were evaluated: goodness of fit (measured as the mean reference standard r2), accuracy (measured as percent error), precision (measured as coefficient of variance (CV)), lower limit of detection (LLOD), lower limit of quantification (LLOQ), dilutional linearity, diagnostic sensitivity (DSN) and diagnostic specificity (DSP). The paired sets of data for each sample were evaluated by Concordance Correlation Coefficient (CCC) analysis. The goodness of fit was 0.999; percent error between the expected and observed concentration for each sample ranged from -4.6% to 14.4%. The coefficient of variance ranged from 9.0% to 21.2%. The assay LLOQ was 2.6 μg/mL. The regression analysis results for dilutional linearity data were r2 = 0.952, slope = 1.02 and intercept = -0.03. CCC between assays was 0.974 for the median concentration of serum samples. The accuracy and precision components of CCC were 0.997 and 0.977, respectively. This high throughput screening assay is precise, accurate, sensitive and specific. Anti-PA IgG concentrations determined using two different assays proved high levels of agreement. The method will improve surge testing capability 18-fold from 4 to 72 sera per assay plate. Published by Elsevier Ltd.

  18. Trace element analysis by EPMA in geosciences: detection limit, precision and accuracy

    NASA Astrophysics Data System (ADS)

    Batanova, V. G.; Sobolev, A. V.; Magnin, V.

    2018-01-01

    Use of the electron probe microanalyser (EPMA) for trace element analysis has increased over the last decade, mainly because of improved stability of spectrometers and the electron column when operated at high probe current; development of new large-area crystal monochromators and ultra-high count rate spectrometers; full integration of energy-dispersive / wavelength-dispersive X-ray spectrometry (EDS/WDS) signals; and the development of powerful software packages. For phases that are stable under a dense electron beam, the detection limit and precision can be decreased to the ppm level by using high acceleration voltage and beam current combined with long counting time. Data on 10 elements (Na, Al, P, Ca, Ti, Cr, Mn, Co, Ni, Zn) in olivine obtained on a JEOL JXA-8230 microprobe with tungsten filament show that the detection limit decreases proportionally to the square root of counting time and probe current. For all elements equal or heavier than phosphorus (Z = 15), the detection limit decreases with increasing accelerating voltage. The analytical precision for minor and trace elements analysed in olivine at 25 kV accelerating voltage and 900 nA beam current is 4 - 18 ppm (2 standard deviations of repeated measurements of the olivine reference sample) and is similar to the detection limit of corresponding elements. To analyse trace elements accurately requires careful estimation of background, and consideration of sample damage under the beam and secondary fluorescence from phase boundaries. The development and use of matrix reference samples with well-characterised trace elements of interest is important for monitoring and improving of the accuracy. An evaluation of the accuracy of trace element analyses in olivine has been made by comparing EPMA data for new reference samples with data obtained by different in-situ and bulk analytical methods in six different laboratories worldwide. For all elements, the measured concentrations in the olivine reference sample were found to be identical (within internal precision) to reference values, suggesting that achieved precision and accuracy are similar. The spatial resolution of EPMA in a silicate matrix, even at very extreme conditions (accelerating voltage 25 kV), does not exceed 7 - 8 μm and thus is still better than laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) or secondary ion mass spectrometry (SIMS) of similar precision. These make the electron microprobe an indispensable method with applications in experimental petrology, geochemistry and cosmochemistry.
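    The reported scaling of the detection limit with the square root of counting time and probe current can be illustrated with a small calculation; the baseline numbers below are placeholders rather than the paper's calibration:

    ```python
    import math

    def scaled_detection_limit(dl_ref_ppm, t_ref_s, i_ref_nA, t_s, i_nA):
        """Scale a reference detection limit assuming DL ~ 1/sqrt(t * I),
        i.e., inversely proportional to the square root of counting time
        and probe current (the counting-statistics limit)."""
        return dl_ref_ppm * math.sqrt((t_ref_s * i_ref_nA) / (t_s * i_nA))

    # Placeholder reference point: 40 ppm at 10 s counting time and 100 nA.
    print(scaled_detection_limit(40.0, 10, 100, 120, 900))  # long counting, high current
    ```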

  19. Advanced Photonic Sensors Enabled by Semiconductor Bonding

    DTIC Science & Technology

    2010-05-31

    a dry scroll backing pump to maintain the high differential pressure between the UV gun and the sample/analysis chamber. We also replaced the...semiconductor materials in an ultra-high vacuum (UHV) environment where the properties of the interface can be controlled with atomic-level precision. Such...year research program, we designed and constructed a unique system capable of fusion bonding two wafers in an ultra-high vacuum environment. This system

  20. Comparison of two different Radiostereometric analysis (RSA) systems with markerless elementary geometrical shape modeling for the measurement of stem migration.

    PubMed

    Li, Ye; Röhrl, Stephan M; Bøe, B; Nordsletten, Lars

    2014-09-01

    Radiostereometric analysis (RSA) is the gold standard of measurement for in vivo 3D implants migration. The aim of this study was to evaluate the in vivo precision of 2 RSA marker-based systems compared with that of marker-free, elementary geometrical shape modeling RSA. Stem migration was measured in 50 patients recruited from an on-going Randomized Controlled Trial. We performed marker-based analysis with the Um RSA and RSAcore systems and compared these results with those of the elementary geometrical shape RSA. The precision for subsidence was 0.118 mm for Um RSA, 0.141 mm for RSAcore, and 0.136 mm for elementary geometrical shape RSA. The precision for retroversion was 1.3° for elementary geometrical shape RSA, approximately 2-fold greater than that for the other methods. The intraclass correlation coefficient between the marker-based systems and elementary geometrical shape RSA was approximately 0.5 for retroversion. All 3 methods yielded ICCs for subsidence and varus-valgus rotation above 0.9. We found an excellent correlation between marker-based RSA and elementary geometrical shape RSA for subsidence and varus-valgus rotation, independent of the system used. The precisions for out-of-plane migration were inferior for elementary geometrical shape RSA. Therefore, as a mechanism of failure, retroversion may be more difficult to detect early. This is to our knowledge the first study to compare different RSA systems with or without markers on the implant. Marker-based RSA has high precision in all planes, independent of the system used. Elementary geometrical shape RSA is inferior in out-of-plane migration. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Methods for applying accurate digital PCR analysis on low copy DNA samples.

    PubMed

    Whale, Alexandra S; Cowen, Simon; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Digital PCR (dPCR) is a highly accurate molecular approach, capable of precise measurements, offering a number of unique opportunities. However, in its current format dPCR can be limited by the amount of sample that can be analysed and consequently additional considerations such as performing multiplex reactions or pre-amplification can be considered. This study investigated the impact of duplexing and pre-amplification on dPCR analysis by using three different assays targeting a model template (a portion of the Arabidopsis thaliana alcohol dehydrogenase gene). We also investigated the impact of different template types (linearised plasmid clone and more complex genomic DNA) on measurement precision using dPCR. We were able to demonstrate that duplex dPCR can provide a more precise measurement than uniplex dPCR, while applying pre-amplification or varying template type can significantly decrease the precision of dPCR. Furthermore, we also demonstrate that the pre-amplification step can introduce measurement bias that is not consistent between experiments for a sample or assay and so could not be compensated for during the analysis of this data set. We also describe a model for estimating the prevalence of molecular dropout and identify this as a source of dPCR imprecision. Our data have demonstrated that the precision afforded by dPCR at low sample concentration can exceed that of the same template post pre-amplification thereby negating the need for this additional step. Our findings also highlight the technical differences between different templates types containing the same sequence that must be considered if plasmid DNA is to be used to assess or control for more complex templates like genomic DNA.
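    As background, dPCR quantification rests on Poisson statistics: if k of n partitions are positive, the estimated mean copies per partition is lambda = -ln(1 - k/n). A minimal sketch (not the authors' code), with an approximate confidence interval from binomial error propagation:

    ```python
    import math

    def dpcr_copies_per_partition(positives, partitions):
        """Poisson-corrected mean target copies per partition from the
        fraction of positive partitions in a digital PCR experiment."""
        p = positives / partitions
        lam = -math.log(1.0 - p)
        # Approximate 95% CI by propagating the binomial error on p.
        se_p = math.sqrt(p * (1.0 - p) / partitions)
        se_lam = se_p / (1.0 - p)          # d(lambda)/dp = 1/(1-p)
        return lam, (lam - 1.96 * se_lam, lam + 1.96 * se_lam)

    lam, ci = dpcr_copies_per_partition(positives=312, partitions=765)
    print(round(lam, 3), tuple(round(v, 3) for v in ci))
    ```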

  2. Methods for Applying Accurate Digital PCR Analysis on Low Copy DNA Samples

    PubMed Central

    Whale, Alexandra S.; Cowen, Simon; Foy, Carole A.; Huggett, Jim F.

    2013-01-01

    Digital PCR (dPCR) is a highly accurate molecular approach, capable of precise measurements, offering a number of unique opportunities. However, in its current format dPCR can be limited by the amount of sample that can be analysed and consequently additional considerations such as performing multiplex reactions or pre-amplification can be considered. This study investigated the impact of duplexing and pre-amplification on dPCR analysis by using three different assays targeting a model template (a portion of the Arabidopsis thaliana alcohol dehydrogenase gene). We also investigated the impact of different template types (linearised plasmid clone and more complex genomic DNA) on measurement precision using dPCR. We were able to demonstrate that duplex dPCR can provide a more precise measurement than uniplex dPCR, while applying pre-amplification or varying template type can significantly decrease the precision of dPCR. Furthermore, we also demonstrate that the pre-amplification step can introduce measurement bias that is not consistent between experiments for a sample or assay and so could not be compensated for during the analysis of this data set. We also describe a model for estimating the prevalence of molecular dropout and identify this as a source of dPCR imprecision. Our data have demonstrated that the precision afforded by dPCR at low sample concentration can exceed that of the same template post pre-amplification thereby negating the need for this additional step. Our findings also highlight the technical differences between different templates types containing the same sequence that must be considered if plasmid DNA is to be used to assess or control for more complex templates like genomic DNA. PMID:23472156

  3. On Cross-talk Correction of Images from Multiple-port CCDs

    NASA Astrophysics Data System (ADS)

    Freyhammer, L. M.; Andersen, M. I.; Arentoft, T.; Sterken, C.; Nørregaard, P.

    Multi-channel CCD read-out, which is an option offered at most optical observatories, can significantly reduce the time spent on reading the detector. The penalty of using this option is the so-called amplifier cross-talk, which causes contamination across the output amplifiers, typically at the level of 1:10 000. This can be a serious problem for applications where high precision and/or high contrast is of importance. We present an analysis of amplifier cross-talk for two instruments - FORS1 at the ESO VLT telescope Antu (Paranal) and DFOSC at the Danish 1.54 m telescope (La Silla) - and a post-processing method for removing the imprint of cross-talk. It is found that cross-talk may significantly contaminate high-precision photometry in crowded fields, but it can be effectively eliminated during data reduction.
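    A hedged sketch of the post-processing idea for a two-port read-out, not the authors' pipeline: the ghost in one output is modelled as a small fixed fraction of the signal read through the other amplifier (possibly mirrored by the read-out direction) and subtracted:

    ```python
    import numpy as np

    def remove_crosstalk(quad_a, quad_b, coeff_ab=1e-4, coeff_ba=1e-4, mirror=True):
        """Subtract the cross-talk imprint between two amplifier outputs.
        coeff_ab is the fraction of port B's signal leaking into port A (and
        vice versa), typically of order 1:10 000; 'mirror' accounts for the
        reversed read-out direction of the second port."""
        b_src = quad_b[:, ::-1] if mirror else quad_b
        a_src = quad_a[:, ::-1] if mirror else quad_a
        return quad_a - coeff_ab * b_src, quad_b - coeff_ba * a_src

    # Toy frames: a bright star read through port B leaves a faint ghost in port A.
    rng = np.random.default_rng(0)
    a = rng.normal(100.0, 5.0, (64, 64))
    b = rng.normal(100.0, 5.0, (64, 64))
    b[30, 40] = 6.0e4
    a += 1e-4 * b[:, ::-1]                 # simulated contamination
    clean_a, _ = remove_crosstalk(a, b)
    print(round(float(a[30, 23] - clean_a[30, 23]), 2))  # ghost amplitude removed
    ```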

  4. A Lightweight, Precision-Deployable, Optical Bench for High Energy Astrophysics Missions

    NASA Astrophysics Data System (ADS)

    Danner, Rolf; Dailey, D.; Lillie, C.

    2011-09-01

    The small angle of total reflection for X-rays, forcing grazing incidence optics with large collecting areas to long focal lengths, has been a fundamental barrier to the advancement of high-energy astrophysics. Design teams around the world have long recognized that a significant increase in effective area beyond Chandra and XMM-Newton requires either a deployable optical bench or separate X-ray optics and instrument module on formation flying spacecraft. Here, we show that we have in hand the components for a lightweight, precision-deployable optical bench that, through its inherent design features, is the affordable path to the next generation of imaging high-energy astrophysics missions. We present our plans for a full-scale engineering model of a deployable optical bench for Explorer-class missions. We intend to use this test article to raise the technology readiness level (TRL) of the tensegrity truss for a lightweight, precision-deployable optical bench for high-energy astrophysics missions from TRL 3 to TRL 5 through a set of four well-defined technology milestones. The milestones cover the architecture's ability to deploy and control the focal point, characterize the deployed dynamics, determine long-term stability, and verify the stowed load capability. Our plan is based on detailed design and analysis work and the construction of a first prototype by our team. Building on our prior analysis and the high TRL of the architecture components we are ready to move on to the next step. The key elements to do this affordably are two existing, fully characterized, flight-quality, deployable booms. After integrating them into the test article, we will demonstrate that our architecture meets the deployment accuracy, adjustability, and stability requirements. The same test article can be used to further raise the TRL in the future.

  5. OPTIMA: sensitive and accurate whole-genome alignment of error-prone genomic maps by combinatorial indexing and technology-agnostic statistical analysis.

    PubMed

    Verzotto, Davide; M Teo, Audrey S; Hillmer, Axel M; Nagarajan, Niranjan

    2016-01-01

    Resolution of complex repeat structures and rearrangements in the assembly and analysis of large eukaryotic genomes is often aided by a combination of high-throughput sequencing and genome-mapping technologies (for example, optical restriction mapping). In particular, mapping technologies can generate sparse maps of large DNA fragments (150 kilo base pairs (kbp) to 2 Mbp) and thus provide a unique source of information for disambiguating complex rearrangements in cancer genomes. Despite their utility, combining high-throughput sequencing and mapping technologies has been challenging because of the lack of efficient and sensitive map-alignment algorithms for robustly aligning error-prone maps to sequences. We introduce a novel seed-and-extend glocal (short for global-local) alignment method, OPTIMA (and a sliding-window extension for overlap alignment, OPTIMA-Overlap), which is the first to create indexes for continuous-valued mapping data while accounting for mapping errors. We also present a novel statistical model, agnostic with respect to technology-dependent error rates, for conservatively evaluating the significance of alignments without relying on expensive permutation-based tests. We show that OPTIMA and OPTIMA-Overlap outperform other state-of-the-art approaches (1.6-2 times more sensitive) and are more efficient (170-200 %) and precise in their alignments (nearly 99 % precision). These advantages are independent of the quality of the data, suggesting that our indexing approach and statistical evaluation are robust, provide improved sensitivity and guarantee high precision.

  6. A New Time Measurement Method Using a High-End Global Navigation Satellite System to Analyze Alpine Skiing

    ERIC Educational Resources Information Center

    Supej, Matej; Holmberg, Hans-Christer

    2011-01-01

    Accurate time measurement is essential to temporal analysis in sport. This study aimed to (a) develop a new method for time computation from surveyed trajectories using a high-end global navigation satellite system (GNSS), (b) validate its precision by comparing GNSS with photocells, and (c) examine whether gate-to-gate times can provide more…

  7. Precision medicine in myasthenia gravis: begin from the data precision

    PubMed Central

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. MG data are currently far from individually precise, partly due to the rarity and heterogeneity of this disease. In this review, we provide basic insights into MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment, based on the literature and our previous studies. Subgroups and quantitative traits of MG are discussed from the perspective of data precision. The role of disease registries and the scientific basis of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoegg, Edward D.; Barinaga, Charles J.; Hager, George J.

    In order to meet a growing need for fieldable mass spectrometer systems for precise elemental and isotopic analyses, the liquid sampling-atmospheric pressure glow discharge (LS-APGD) has a number of very promising characteristics. One key set of attributes that await validation deals with the performance characteristics relative to isotope ratio precision and accuracy. Due to its availability and prior experience with this research team, the initial evaluation of isotope ratio (IR) performance was performed on a Thermo Scientific Exactive Orbitrap instrument. While the mass accuracy and resolution performance for orbitrap analyzers are very well documented, no detailed evaluations of the IR performance have been published. Efforts described here involve two variables: the inherent IR precision and accuracy delivered by the LS-APGD microplasma and the inherent IR measurement qualities of orbitrap analyzers. Important to the IR performance, the various operating parameters of the orbitrap sampling interface, HCD dissociation stage, and ion injection/data acquisition have been evaluated. The IR performance for a range of other elements, including natural, depleted, and enriched uranium isotopes was determined. In all cases the precision and accuracy are degraded when measuring low abundance (<0.1% isotope fractions). In the best case, IR precision on the order of 0.1 %RSD can be achieved, with values of 1-3 %RSD observed for low-abundance species. The results suggest that the LS-APGD is a very good candidate for field deployable MS analysis and that the high resolving powers of the orbitrap may be complemented with a here-to-fore unknown capacity to deliver high-precision isotope ratios.

  9. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while reducing processing cost, and is therefore highly regarded in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of tool holders are used in high-speed precision machining, including the heat-shrinkage tool-holder, the high-precision spring chuck, the hydraulic tool-holder and the three-rib deformation chuck. Among them, the heat-shrinkage tool-holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is therefore widely used. It is thus of great significance to study the new requirements on the machining tool system. In order to meet the requirements of high-speed precision machining, this paper reviews the common tool holder technologies for high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics and existing problems of these tool clamping systems are analyzed.

  10. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332

  11. Targeted quantitation of proteins by mass spectrometry.

    PubMed

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  12. Comparison of four extraction/methylation analytical methods to measure fatty acid composition by gas chromatography in meat.

    PubMed

    Juárez, M; Polvillo, O; Contò, M; Ficco, A; Ballico, S; Failla, S

    2008-05-09

    Four different extraction-derivatization methods commonly used for fatty acid analysis in meat (in situ or one-step method, saponification method, classic method and a combination of classic extraction and saponification derivatization) were tested. The in situ method had low recovery and variation. The saponification method showed the best balance between recovery, precision, repeatability and reproducibility. The classic method had high recovery and acceptable variation values, except for the polyunsaturated fatty acids, showing higher variation than the former methods. The combination of extraction and methylation steps had great recovery values, but the precision, repeatability and reproducibility were not acceptable. Therefore the saponification method would be more convenient for polyunsaturated fatty acid analysis, whereas the in situ method would be an alternative for fast analysis. However the classic method would be the method of choice for the determination of the different lipid classes.

  13. Study on SOC wavelet analysis for LiFePO4 battery

    NASA Astrophysics Data System (ADS)

    Liu, Xuepeng; Zhao, Dongmei

    2017-08-01

    Improving the accuracy of SOC prediction can reduce the conservatism and complexity of control strategies such as scheduling, optimization and planning of a LiFePO4 battery system. Based on an analysis of the relationship between SOC historical data and external stress factors, an SOC estimation-correction prediction model based on wavelet analysis is established. A wavelet neural network prediction model of high precision implements the forecast step, while measured external stress data are used to update the parameter estimates in the model, implementing the correction step; this allows the forecast model to adapt to the LiFePO4 battery as its operating point varies over the rated charge and discharge conditions. The test results show that the method can obtain a higher-precision prediction model even when the input and output of the LiFePO4 battery change frequently.

  14. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    PubMed

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  15. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    PubMed

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  16. High-precision optical measurements of 13C/12C isotope ratios in organic compounds at natural abundance

    PubMed Central

    Zare, Richard N.; Kuramoto, Douglas S.; Haase, Christa; Tan, Sze M.; Crosson, Eric R.; Saad, Nabil M. R.

    2009-01-01

    A continuous-flow cavity ring-down spectroscopy (CRDS) system integrating a chromatographic separation technique, a catalytic combustor, and an isotopic 13C/12C optical analyzer is described for the isotopic analysis of a mixture of organic compounds. A demonstration of its potential is made for the geochemically important class of short-chain hydrocarbons. The system proved to be linear over a 3-fold injection volume dynamic range with an average precision of 0.95‰ and 0.67‰ for ethane and propane, respectively. The calibrated accuracy for methane, ethane, and propane is within 3‰ of the values determined using isotope ratio mass spectrometry (IRMS), which is the current method of choice for compound-specific isotope analysis. With anticipated improvements, the low-cost, portable, and easy-to-use CRDS-based instrumental setup is poised to evolve into a credible challenge to the high-cost and complex IRMS-based technique. PMID:19564619
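    The δ13C values quoted here follow the usual delta notation relative to a reference ratio (VPDB for carbon); a one-line helper, illustrative only and not part of the instrument software:

    ```python
    R_VPDB = 0.0111802  # commonly used 13C/12C ratio of the VPDB reference

    def delta13c_permil(r_sample, r_reference=R_VPDB):
        """delta 13C in per mil: (R_sample / R_reference - 1) * 1000."""
        return (r_sample / r_reference - 1.0) * 1000.0

    print(round(delta13c_permil(0.0107300), 2))  # a strongly depleted, biogenic-like value
    ```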

  17. [The clinical economic analysis of the methods of ischemic heart disease diagnostics].

    PubMed

    Kalashnikov, V Iu; Mitriagina, S N; Syrkin, A L; Poltavskaia, M G; Sorokina, E G

    2007-01-01

    Clinical economic analysis was applied to assess different techniques for diagnosing ischemic heart disease (electrocardiographic monitoring, treadmill testing, stress echocardiography with dobutamine, single-photon computerized axial tomography with load, and multi-spiral computerized axial tomography with coronary artery staining) in patients with different initial probabilities of the disease. In all groups, the treadmill test had the best cost-effectiveness ratio. For patients at low risk, 17.4 rubles were needed to refine the probability of ischemic heart disease by 1%; in the medium- and high-risk groups this indicator was 9.4 and 24.7 rubles, respectively. It is concluded that, to refine the probability of ischemic heart disease after the treadmill test, single-photon computerized axial tomography with load is appropriate for patients with high probability, and multi-spiral computerized axial tomography with coronary artery staining for patients with low probability.

  18. A Moire Fringing Spectrometer for Extra-Solar Planet Searches

    NASA Astrophysics Data System (ADS)

    van Eyken, J. C.; Ge, J.; Mahadevan, S.; De Witt, C.; Ramsey, L. W.; Berger, D.; Shaklan, S.; Pan, X.

    2001-12-01

    We have developed a prototype moire fringing spectrometer for high precision radial velocity measurements for the detection of extra-solar planets. This combination of Michelson interferometer and spectrograph overlays an interferometer comb on a medium resolution stellar spectrum, producing Moire patterns. Small changes in the Doppler shift of the spectrum lead to corresponding large shifts in the Moire pattern (Moire magnification). The sinusoidal shape of the Moire fringes enables much simpler measurement of these shifts than in standard echelle spectrograph techniques, facilitating high precision measurements with a low cost instrument. Current data analysis software we have developed has produced short-term repeatability (over a few hours) to 5-10 m/s, and future planned improvements based on previous experiments should reduce this significantly. We plan eventually to carry out large scale surveys for low mass companions around other stars. This poster will present new results obtained in the lab and at the HET and Palomar 5m telescopes, the theory of the instrument, and data analysis techniques.

  19. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples

    PubMed Central

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-01-01

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. PMID:26833260
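    For reference, the F-measure used to rank SV callers is the harmonic mean of precision and sensitivity (recall); a tiny helper with made-up counts, not the paper's numbers:

    ```python
    def f_measure(tp, fp, fn):
        """Harmonic mean of precision (tp/(tp+fp)) and recall (tp/(tp+fn))."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    # Illustrative counts for a simulated tumour genome (placeholder values).
    print(round(f_measure(tp=90, fp=10, fn=20), 3))
    ```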

  20. Non-rigid Earth rotation series

    NASA Astrophysics Data System (ADS)

    Pashkevich, V. V.

    2008-04-01

    In recent years many attempts have been made to derive a high-precision theory of non-rigid Earth rotation. For these purposes different transfer functions are used, usually applied to the series representing the nutation in longitude and in obliquity of the rigid Earth rotation with respect to the ecliptic of date. The aim of this investigation is the construction of a new high-precision non-rigid Earth rotation series (SN9000), dynamically adequate to the DE404/LE404 ephemeris over 2000 years, expressed as a function of the Euler angles ψ, θ and φ with respect to the fixed ecliptic plane and equinox J2000.0. The earlier stages of the investigation were: 1. A high-precision numerical solution of the rigid Earth rotation was constructed (V.V.Pashkevich, G.I.Eroshkin and A.Brzezinski, 2004; V.V.Pashkevich and G.I.Eroshkin, Proceedings of Journees 2004). The initial conditions were calculated from SMART97 (P.Bretagnon, G.Francou, P.Rocher, J.L.Simon, 1998). The discrepancies between the numerical solution and the semi-analytical solution SMART97 were obtained in Euler angles over 2000 years with one-day spacing. 2. The discrepancies were investigated by least squares and spectral analysis algorithms (V.V.Pashkevich and G.I.Eroshkin, Proceedings of Journees 2005), and the high-precision rigid Earth rotation series S9000 were determined (V.V.Pashkevich and G.I.Eroshkin, 2005). The next stage of this investigation: 3. The new high-precision non-rigid Earth rotation series (SN9000), expressed as a function of Euler angles, are constructed by using the method of (P.Bretagnon, P.M.Mathews, J.-L.Simon, 1999) and the transfer function MHB2002 (Mathews, P. M., Herring, T. A., and Buffett, B. A., 2002).

  1. HIGH-PRECISION BIOLOGICAL EVENT EXTRACTION: EFFECTS OF SYSTEM AND OF DATA

    PubMed Central

    Cohen, K. Bretonnel; Verspoor, Karin; Johnson, Helen L.; Roeder, Chris; Ogren, Philip V.; Baumgartner, William A.; White, Elizabeth; Tipney, Hannah; Hunter, Lawrence

    2013-01-01

    We approached the problems of event detection, argument identification, and negation and speculation detection in the BioNLP’09 information extraction challenge through concept recognition and analysis. Our methodology involved using the OpenDMAP semantic parser with manually written rules. The original OpenDMAP system was updated for this challenge with a broad ontology defined for the events of interest, new linguistic patterns for those events, and specialized coordination handling. We achieved state-of-the-art precision for two of the three tasks, scoring the highest of 24 teams at precision of 71.81 on Task 1 and the highest of 6 teams at precision of 70.97 on Task 2. We provide a detailed analysis of the training data and show that a number of trigger words were ambiguous as to event type, even when their arguments are constrained by semantic class. The data is also shown to have a number of missing annotations. Analysis of a sampling of the comparatively small number of false positives returned by our system shows that major causes of this type of error were failing to recognize second themes in two-theme events, failing to recognize events when they were the arguments to other events, failure to recognize nontheme arguments, and sentence segmentation errors. We show that specifically handling coordination had a small but important impact on the overall performance of the system. The OpenDMAP system and the rule set are available at http://bionlp.sourceforge.net. PMID:25937701

  2. WFIRST: Microlensing Analysis Data Challenge

    NASA Astrophysics Data System (ADS)

    Street, Rachel; WFIRST Microlensing Science Investigation Team

    2018-01-01

    WFIRST will produce thousands of high cadence, high photometric precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research into outstanding problems of microlensing analysis. These range from the classification and modeling of triple lens events to methods to efficiently yet thoroughly search a high-dimensional parameter space for the best fitting models.

  3. AN/TPN-14 Precision Approach Radar (PAR) Analysis

    DTIC Science & Technology

    1964-05-01

    to install high-pass filters on the transmitter power lines, or (2) change the PRF to, for example, 1400 cps to give maximum separation of the PRF...consist of a high voltage power supply, pulse forming and transmitting circuitry, and necessary control circuitry. It shall have a pulse type...1200 cycles per second and a pulse width of 0.2 or 0.8 microseconds selectable. 3.5.5.1 High voltage power supply. - The high voltage power

  4. High Productivity Computing Systems Analysis and Performance

    DTIC Science & Technology

    2005-07-01

    cubic grid Discrete Math Global Updates per second (GUP/S) RandomAccess Paper & Pencil Contact Bob Lucas (ISI) Multiple Precision none...can be found at the web site. One of the HPCchallenge codes, RandomAccess, is derived from the HPCS discrete math benchmarks that we released, and...Kernels Discrete Math … Graph Analysis … Linear Solvers … Signal Processing Execution Bounds Execution Indicators 6 Scalable Compact

  5. Measuring Efficiency of Tunisian Schools in the Presence of Quasi-Fixed Inputs: A Bootstrap Data Envelopment Analysis Approach

    ERIC Educational Resources Information Center

    Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane

    2010-01-01

    The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…

  6. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    PubMed

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space that determines the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement for minimal tooth preparation. This article presents TRS quantity-related measurement, analysis and transfer, and the internal relevance of the three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  7. Precision medicine in breast cancer: reality or utopia?

    PubMed

    Bettaieb, Ali; Paul, Catherine; Plenchette, Stéphanie; Shan, Jingxuan; Chouchane, Lotfi; Ghiringhelli, François

    2017-06-17

    Many cancers, including breast cancer, have seen improvements in prognosis and care thanks to the discovery of targeted therapies. The advent of these new approaches marked the rise of precision medicine, which has improved the diagnosis, prognosis and treatment of cancer. Precision medicine takes into account the molecular and biological specificities of the patient and their tumors that will influence the treatment determined by physicians. This new era of medicine has become accessible through molecular genetics platforms, the development of high-speed sequencers and the means to analyse these data. Despite the spectacular results in the treatment of cancers, including breast cancer, described in this review, not all patients can benefit from this new strategy. This seems to be related to the many genetic mutations, which may differ from one patient to another or within the same patient. This gives new impetus to research, from both a technological and a biological point of view, to make the hope of precision medicine accessible to all.

  8. Image analysis of speckle patterns as a probe of melting transitions in laser-heated diamond anvil cell experiments.

    PubMed

    Salem, Ran; Matityahu, Shlomi; Melchior, Aviva; Nikolaevsky, Mark; Noked, Ori; Sterer, Eran

    2015-09-01

    The precision of melting curve measurements using laser-heated diamond anvil cell (LHDAC) is largely limited by the correct and reliable determination of the onset of melting. We present a novel image analysis of speckle interference patterns in the LHDAC as a way to define quantitative measures which enable an objective determination of the melting transition. Combined with our low-temperature customized IR pyrometer, designed for measurements down to 500 K, our setup allows studying the melting curve of materials with low melting temperatures, with relatively high precision. As an application, the melting curve of Te was measured up to 35 GPa. The results are found to be in good agreement with previous data obtained at pressures up to 10 GPa.
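    One common quantitative measure derived from speckle images is the speckle contrast (standard deviation over mean of the pixel intensities), which drops sharply once the surface starts to move at melting. A hedged sketch of such a metric; the paper's actual image-analysis measures may differ:

    ```python
    import numpy as np

    def speckle_contrast(frame):
        """Speckle contrast C = sigma / mean of the pixel intensities."""
        frame = np.asarray(frame, dtype=float)
        return frame.std() / frame.mean()

    def melting_onset_frames(frames, drop_fraction=0.3):
        """Indices of frames whose contrast falls more than 'drop_fraction'
        below the median contrast of the series, a possible melting flag."""
        c = np.array([speckle_contrast(f) for f in frames])
        return np.where(c < (1.0 - drop_fraction) * np.median(c))[0]

    # Toy series: static speckle frames followed by blurred (molten) frames.
    rng = np.random.default_rng(1)
    static = [rng.exponential(1.0, (64, 64)) for _ in range(5)]
    molten = [rng.exponential(1.0, (64, 64)) * 0.2 + 0.8 for _ in range(3)]
    print(melting_onset_frames(static + molten))   # expected: indices 5, 6, 7
    ```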

  9. Analysis of photopole data reduction models

    NASA Technical Reports Server (NTRS)

    Cheek, James B.

    1987-01-01

    The total impulse delivered by a buried explosive charge can be calculated from displacement-versus-time points taken from successive film frames of high-speed motion pictures of the explosive event. The indicator of that motion is a pole and baseplate (photopole), which is placed on or within the soil overburden. Here, the researchers are concerned with the precision of the impulse calculation and ways to improve that precision. Also examined is the effect of each initial condition on the curve-fitting process. It is shown that the zero-initial-velocity criterion should not be applied, because the cubic power series has a linear acceleration-versus-time character. The applicability of the new method to photopole data records whose early-time motions are obscured is illustrated.
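    A minimal sketch of the curve-fitting step described above, assuming digitized displacement-time points fitted with a cubic power series; the initial velocity then comes from the fitted derivative, and an impulse estimate follows from an assumed areal mass (all numbers below are placeholders):

    ```python
    import numpy as np

    # Hypothetical digitized photopole positions (m) at film-frame times (s).
    t = np.array([0.000, 0.002, 0.004, 0.006, 0.008, 0.010])
    x = np.array([0.000, 0.011, 0.026, 0.043, 0.061, 0.079])

    # Cubic power series fit x(t) = a0 + a1*t + a2*t^2 + a3*t^3 (least squares).
    coeffs = np.polyfit(t, x, 3)
    velocity = np.polyder(coeffs)        # dx/dt, a quadratic in t
    accel = np.polyder(velocity)         # d2x/dt2, linear in t, so v(0) need not be zero

    v0 = np.polyval(velocity, 0.0)       # fitted initial velocity, not forced to zero
    areal_mass = 150.0                   # kg/m^2, placeholder overburden loading
    impulse_per_area = areal_mass * v0   # Pa*s imparted to the photopole baseplate
    print(round(v0, 3), round(impulse_per_area, 1))
    ```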

  10. 92 Years of the Ising Model: A High Resolution Monte Carlo Study

    NASA Astrophysics Data System (ADS)

    Xu, Jiahao; Ferrenberg, Alan M.; Landau, David P.

    2018-04-01

    Using extensive Monte Carlo simulations that employ the Wolff cluster flipping and data analysis with histogram reweighting and quadruple precision arithmetic, we have investigated the critical behavior of the simple cubic Ising model with lattice sizes ranging from 16^3 to 1024^3. By analyzing data with cross correlations between various thermodynamic quantities obtained from the same data pool, we obtained the critical inverse temperature Kc = 0.221 654 626(5) and the critical exponent of the correlation length ν = 0.629 912(86) with precision that improves upon previous Monte Carlo estimates.
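    A compact sketch of the single-histogram reweighting step used in this kind of analysis (illustrative only; the study combines Wolff cluster updates, cross-correlation analysis and quadruple-precision arithmetic far beyond this toy example):

    ```python
    import numpy as np

    def reweight_mean(bond_sum, observable, k_sim, k_new):
        """Single-histogram reweighting: estimate <O> at coupling K_new from a
        time series sampled at K_sim. 'bond_sum' is the Ising bond energy
        sum E = sum_<ij> s_i s_j, so the Boltzmann weight is exp(K * E)."""
        bond_sum = np.asarray(bond_sum, float)
        observable = np.asarray(observable, float)
        w = np.exp((k_new - k_sim) * (bond_sum - bond_sum.max()))  # stabilised weights
        return float(np.sum(w * observable) / np.sum(w))

    # Placeholder time series standing in for real Monte Carlo output.
    rng = np.random.default_rng(2)
    E = rng.normal(0.0, 50.0, 20000)
    m_abs = np.abs(rng.normal(0.0, 0.1, 20000))
    for k in (0.2210, 0.2217, 0.2224):
        print(k, round(reweight_mean(E, m_abs, k_sim=0.2217, k_new=k), 4))
    ```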

  11. Polyatomic interferences on high precision uranium isotope ratio measurements by MC-ICP-MS: Applications to environmental sampling for nuclear safeguards

    DOE PAGES

    Pollington, Anthony D.; Kinman, William S.; Hanson, Susan K.; ...

    2015-09-04

    Modern mass spectrometry and separation techniques have made measurement of major uranium isotope ratios a routine task; however, accurate and precise measurement of the minor uranium isotopes remains a challenge as sample size decreases. One particular challenge is the presence of isobaric interferences and their impact on the accuracy of minor isotope 234U and 236U measurements. Here, we present techniques used for routine U isotopic analysis of environmental nuclear safeguards samples and evaluate polyatomic interferences that negatively impact accuracy, as well as methods to mitigate their impacts.

  12. A search for strongly Mg-enhanced stars from the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Zhao, Gang; Chen, Yu-Qin; Li, Hai-Ning

    2014-11-01

    Strongly Mg-enhanced stars with [Mg/Fe] > 1.0 show peculiar abundance patterns and hence are of great interest for our understanding of stellar formation and chemical evolution of the Galaxy. A systematic search for strongly Mg-enhanced stars based on low-resolution (R ≃ 2000) spectra from the Sloan Digital Sky Survey (SDSS) is carried out by finding the synthetic spectrum that best matches the observed one in the region of Mg I b lines around λ5170 Å via a profile matching method. The advantage of our method is that fitting parameters are refined by reproducing the [Mg/Fe] ratios of 47 stars from the very precise high-resolution spectroscopic (HRS) analysis by Nissen & Schuster; and these parameters are crucial to the precision and validity of the derived Mg abundances. As a further check of our method, Mg abundances are estimated with our method for member stars in four Galactic globular clusters (M92, M13, M3, M71) which cover the same metallicity range as our sample, and the results are in good agreement with those of HRS analysis in the literature. The validation of our method is also demonstrated by the agreement of [Mg/Fe] between our values and those of HRS analysis by Aoki et al. Finally, 33 candidates of strongly Mg-enhanced stars with [Mg/Fe] > 1.0 are selected from 14 850 F and G stars. Follow-up observations will be carried out on these candidates with high-resolution spectroscopy by large telescopes in the near future, so as to check our selection procedure and to perform a precise and detailed abundance analysis and to explore the origins of these stars.
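    A hedged sketch of the profile-matching idea, choosing the synthetic spectrum that minimizes chi-square against the observed Mg I b region; the survey pipeline is considerably more elaborate, and the toy grid below simply scales line depth with [Mg/Fe]:

    ```python
    import numpy as np

    def best_matching_mgfe(obs_flux, obs_err, synth_grid):
        """Return the [Mg/Fe] whose synthetic spectrum minimises chi-square
        against the observed fluxes in the Mg I b window.
        synth_grid maps an [Mg/Fe] value to a synthetic flux array."""
        chi2 = {mgfe: float(np.sum(((obs_flux - synth) / obs_err) ** 2))
                for mgfe, synth in synth_grid.items()}
        return min(chi2, key=chi2.get), chi2

    # Toy example: line depth grows linearly with [Mg/Fe] (purely illustrative).
    rng = np.random.default_rng(3)
    wave = np.linspace(5160.0, 5190.0, 300)
    profile = np.exp(-((wave - 5172.7) / 0.8) ** 2)
    obs = 1.0 - 0.40 * profile + rng.normal(0.0, 0.01, wave.size)
    grid = {m: 1.0 - (0.30 + 0.10 * m) * profile for m in (0.0, 0.5, 1.0, 1.5)}
    print(best_matching_mgfe(obs, 0.01 * np.ones(wave.size), grid)[0])  # -> 1.0
    ```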

  13. Development and Beam-Shape Analysis of an Integrated Fiber-Optic Confocal Probe for High-Precision Central Thickness Measurement of Small-Radius Lenses

    PubMed Central

    Sutapun, Boonsong; Somboonkaew, Armote; Amarit, Ratthasart; Chanhorm, Sataporn

    2015-01-01

    This work describes a new design of a fiber-optic confocal probe suitable for measuring the central thicknesses of small-radius optical lenses or similar objects. The proposed confocal probe utilizes an integrated camera that functions as a shape-encoded position-sensing device. The confocal signal for thickness measurement and beam-shape data for off-axis measurement can be simultaneously acquired using the proposed probe. Placing the probe’s focal point off-center relative to a sample’s vertex produces a non-circular image at the camera’s image plane that closely resembles an ellipse for small displacements. We were able to precisely position the confocal probe’s focal point relative to the vertex point of a ball lens with a radius of 2.5 mm, with a lateral resolution of 1.2 µm. The reflected beam shape based on partial blocking by an aperture was analyzed and verified experimentally. The proposed confocal probe offers a low-cost, high-precision technique, an alternative to a high-cost three-dimensional surface profiler, for tight quality control of small optical lenses during the manufacturing process. PMID:25871720

  14. VLBI Phase-Referenced Observations on Southern Hemisphere HIPPARCOS Radio Stars

    NASA Technical Reports Server (NTRS)

    Guirado, J. C.; Preston, R. A.; Jones, D. L.; Lestrade, J. F.; Reynolds, J. E.; Jauncey, D. L.; Tzioumis, A. K.; Ferris, R. H.; King, E. A.; Lovell, J. E. J.; hide

    1995-01-01

    Presented are multiepoch Very Long Baseline Interferometry (VLBI) observations on Southern Hemisphere radio stars phase-referenced to background radio sources. The differential astrometry analysis results in high-precision determinations of proper motions and parallaxes. The astrophysical implications and astrometric consequences of these results are discussed.

  15. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model leads to lower calculation efficiency, which isn't appropriate for high-frequency (e.g., 1 Hz) real-time GNSS clock estimation. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model for realizing multi-GNSS real-time clock high-frequency updating, and a rigorous comparison and analysis under the same conditions are performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock-bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The statistical analysis shows that the real-time augmentation message SISRE is about 4-7 cm for GPS, about 10 cm for BeiDou IGSO/MEO and Galileo, and about 30 cm for BeiDou GEO satellites. The real-time positioning results prove that GPS + BeiDou + Galileo RT-PPP, compared to GPS-only, can effectively accelerate convergence time by about 60%, improve the positioning accuracy by about 30% and obtain an averaged RMS of 4 cm in horizontal and 6 cm in vertical; additionally, RT-SPP in the prototype system can realize a positioning accuracy of about 1 m (averaged RMS) in horizontal and 1.5-2 m in vertical, which is improved by 60% and 70%, respectively, relative to SPP based on the broadcast ephemeris.
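    SISRE combines the orbit component errors with the clock error; a commonly used approximation is sketched below, with projection weights that depend on the constellation (the GPS MEO values are assumed here purely for illustration):

    ```python
    import math

    def sisre_m(d_radial, d_along, d_cross, d_clock, w_r=0.98, w_ac2=1.0 / 49.0):
        """Approximate signal-in-space ranging error (metres).
        d_* are orbit component errors and d_clock is the clock error (all in
        metres); w_r and w_ac2 are constellation-dependent projection weights
        (GPS MEO values assumed here)."""
        return math.sqrt((w_r * d_radial - d_clock) ** 2
                         + w_ac2 * (d_along ** 2 + d_cross ** 2))

    # Illustrative component errors: 3 cm radial, 10 cm along, 8 cm cross, 2 cm clock.
    print(round(sisre_m(0.03, 0.10, 0.08, 0.02), 3))
    ```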

  16. Automated simultaneous measurement of the δ(13)C and δ(2)H values of methane and the δ(13)C and δ(18)O values of carbon dioxide in flask air samples using a new multi cryo-trap/gas chromatography/isotope ratio mass spectrometry system.

    PubMed

    Brand, Willi A; Rothe, Michael; Sperlich, Peter; Strube, Martin; Wendeberg, Magnus

    2016-07-15

    The isotopic composition of greenhouse gases helps to constrain global budgets and to study sink and source processes. We present a new system for high-precision stable isotope measurements of carbon, hydrogen and oxygen in atmospheric methane and carbon dioxide. The design is intended for analyzing flask air samples from existing sampling programs without the need for extra sample air for methane analysis. CO2 and CH4 isotopes are measured simultaneously using two isotope ratio mass spectrometers, one for the analysis of δ(13)C and δ(18)O values and the second one for δ(2)H values. The inlet carousel delivers air from 16 sample positions (glass flasks 1-5 L and high-pressure cylinders). Three 10-port valves take aliquots from the sample stream. CH4 from 100-mL air aliquots is preconcentrated in 0.8-mL sample loops using a new cryo-trap system. A precisely calibrated working reference air is used in parallel with the sample according to the Principle of Identical Treatment. It takes about 36 hours for a fully calibrated analysis of a complete carousel including extractions of four working reference and one quality control reference air. Long-term precision values, as obtained from the quality control reference gas since 2012, account for 0.04 ‰ (δ(13)C values of CO2), 0.07 ‰ (δ(18)O values of CO2), 0.11 ‰ (δ(13)C values of CH4) and 1.0 ‰ (δ(2)H values of CH4). Within a single day, the system exhibits a typical methane δ(13)C standard deviation (1σ) of 0.06 ‰ for 10 repeated measurements. The system has been in routine operation at the MPI-BGC since 2012. Consistency of the data and compatibility with results from other laboratories at a high precision level are of utmost importance. A high sample throughput and reliability of operation are important achievements of the presented system to cope with the large number of air samples to be analyzed. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Conjugating precision and acquisition time in a Doppler broadening regime by interleaved frequency-agile rapid-scanning cavity ring-down spectroscopy.

    PubMed

    Gotti, Riccardo; Gatti, Davide; Masłowski, Piotr; Lamperti, Marco; Belmonte, Michele; Laporta, Paolo; Marangoni, Marco

    2017-10-07

    We propose a novel approach to cavity ring-down spectroscopy (CRDS) in which spectra acquired with a frequency-agile rapid-scanning (FARS) scheme, i.e., with a laser sideband stepped across the modes of a high-finesse cavity, are interleaved with one another by a sub-millisecond readjustment of the cavity length. This brings acquisition times below 20 s for few-GHz-wide spectra composed of a very large number of spectral points, typically 3200. Thanks to a signal-to-noise ratio easily in excess of 10 000, each FARS-CRDS spectrum is shown to be sufficient to determine the line-centre frequency of a Doppler-broadened line with a precision of 2 parts in 10^11, thus very close to that of sub-Doppler regimes, on a few-second time scale. The referencing of the probe laser to a frequency comb provides absolute accuracy and long-term reproducibility to the spectrometer and makes it a powerful tool for precision spectroscopy and line-shape analysis. The experimental approach is discussed in detail together with experimental precision and accuracy tests on the (30012) ← (00001) P12e line of CO2 at ∼1.57 μm.
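
    For context on the CRDS observable itself (a textbook relation, not something specific to the FARS scheme described above), each measured ring-down time maps to an absorption coefficient through the empty-cavity decay time; a minimal sketch:

        def crds_absorption(tau, tau_empty, c=299792458.0):
            """Absorption coefficient (1/m) from ring-down times measured in seconds:
            alpha = (1/tau - 1/tau_empty) / c, with tau_empty the empty-cavity decay time."""
            return (1.0 / tau - 1.0 / tau_empty) / c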

  18. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, Song; Shi, Tujin; Fillmore, Thomas L.

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundance but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at the ~10 pg/mL level in nondepleted serum and at <10 copies per cell in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibodies are not available.

  19. Note: High precision measurements using high frequency gigahertz signals

    NASA Astrophysics Data System (ADS)

    Jin, Aohan; Fu, Siyuan; Sakurai, Atsunori; Liu, Liang; Edman, Fredrik; Pullerits, Tõnu; Öwall, Viktor; Karki, Khadga Jung

    2014-12-01

    Generalized lock-in amplifiers use digital cavities with Q-factors as high as 5 × 10^8 to measure signals with very high precision. In this Note, we show that generalized lock-in amplifiers can be used to analyze microwave (gigahertz) signals with a precision of a few tens of hertz. We propose that physical changes in the medium of propagation can be measured precisely through such ultra-high-precision measurement of the signal. We provide evidence for this proposition by verifying Newton's law of cooling, measuring the effect of temperature changes on the phase and amplitude of signals propagating through two calibrated cables. The technique could be used to precisely measure different physical properties of the propagation medium, for example changes in length, resistance, etc. Real-time implementation of the technique can open up new methodologies of in situ virtual metrology in material design.
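
    As background on the lock-in principle invoked above (a generic digital lock-in sketch, not the authors' generalized lock-in implementation), the amplitude and phase of a signal at a known reference frequency are obtained by demodulating against quadrature references and averaging:

        import numpy as np

        def lock_in(signal, t, f_ref):
            """Estimate amplitude and phase of `signal` at the reference frequency f_ref.

            signal : sampled waveform (1-D array)
            t      : sample times in seconds
            f_ref  : reference frequency in Hz
            """
            ref_c = np.cos(2 * np.pi * f_ref * t)
            ref_s = np.sin(2 * np.pi * f_ref * t)
            x = 2.0 * np.mean(signal * ref_c)   # in-phase component
            y = 2.0 * np.mean(signal * ref_s)   # quadrature component
            return np.hypot(x, y), np.arctan2(y, x)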

  20. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  1. Precise Quantitation of MicroRNA in a Single Cell with Droplet Digital PCR Based on Ligation Reaction.

    PubMed

    Tian, Hui; Sun, Yuanyuan; Liu, Chenghui; Duan, Xinrui; Tang, Wei; Li, Zhengping

    2016-12-06

    MicroRNA (miRNA) analysis in a single cell is extremely important because it allows a deep understanding of the exact correlation between miRNAs and cell functions. Herein, we report a highly sensitive and precisely quantitative assay for miRNA detection based on ligation-based droplet digital polymerase chain reaction (ddPCR), which permits the quantitation of miRNA in a single cell. In this ligation-based ddPCR assay, two target-specific oligonucleotide probes are simply designed to be complementary to the two halves of the target miRNA sequence, which avoids the sophisticated design of reverse transcription and provides high specificity to discriminate a single-base difference among miRNAs with simple operations. After the miRNA-templated ligation, the ddPCR partitions individual ligated products into water-in-oil droplets and digitally counts the fluorescence-positive and -negative droplets after PCR amplification for quantification of the target molecules, which provides precise quantitation and robustness to variation in PCR efficiency. By integrating the advantages of the precise quantification of ddPCR and the simplicity of the ligation-based PCR, the proposed method can sensitively measure let-7a miRNA with a detection limit of 20 aM (12 copies per microliter), and even a single-base difference can be discriminated among let-7 family members. More importantly, due to its high selectivity and sensitivity, the proposed method can achieve precise quantitation of miRNAs in single-cell lysate. Therefore, the ligation-based ddPCR assay may serve as a useful tool to reveal exactly how miRNAs act in a single cell, which is of great importance for the study of miRNA biological function as well as for related biomedical studies.
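
    The digital counting step described above rests on Poisson statistics: the fraction of fluorescence-positive droplets gives the mean number of target copies per droplet and hence the concentration. The sketch below is generic ddPCR arithmetic with an assumed nominal droplet volume, not a value or code from the paper.

        import math

        def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_ul=0.00085):
            """Target concentration (copies per microliter) from droplet counts.

            The Poisson correction lambda = -ln(1 - p) accounts for droplets that
            received more than one copy; droplet_volume_ul (~0.85 nL) is an assumption.
            """
            p = n_positive / n_total
            lam = -math.log(1.0 - p)          # mean copies per droplet
            return lam / droplet_volume_ul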

  2. Object detection in cinematographic video sequences for automatic indexing

    NASA Astrophysics Data System (ADS)

    Stauder, Jurgen; Chupeau, Bertrand; Oisel, Lionel

    2003-06-01

    This paper presents an object detection framework applied to cinematographic post-processing of video sequences. Post-processing is done after production and before editing. At the beginning of each shot of a video, a slate (also called a clapperboard) is shown. The slate notably contains an electronic audio timecode that is necessary for audio-visual synchronization. This paper presents an object detection framework to detect slates in video sequences for automatic indexing and post-processing. It is based on five steps. The first two steps aim to drastically reduce the amount of video data to be analyzed; they ensure a high recall rate but have low precision. The first step detects images at the beginning of a shot that may show a slate, while the second step searches these images for candidate regions with a color distribution similar to slates. The objective is to miss no slate while eliminating long parts of video without slate appearances. The third and fourth steps use statistical classification and pattern matching to detect and precisely locate slates in the candidate regions. These steps ensure a high recall rate and high precision. The objective is to detect slates with very few false alarms, to minimize interactive corrections. In a last step, electronic timecodes are read from the slates to automate audio-visual synchronization. The presented slate detector has a recall rate of 89% and a precision of 97.5%. By temporal integration, much more than 89% of shots in dailies are detected. By timecode coherence analysis, the precision can be raised as well. Issues for future work are to accelerate the system to be faster than real time and to extend the framework to several slate types.
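
    The recall and precision figures quoted above follow the standard detection definitions; the helper below simply spells out that convention and is not part of the paper.

        def precision_recall(true_pos, false_pos, false_neg):
            """precision = TP / (TP + FP); recall = TP / (TP + FN)."""
            precision = true_pos / (true_pos + false_pos)
            recall = true_pos / (true_pos + false_neg)
            return precision, recall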

  3. A novel approach for high precision rapid potentiometric titrations: application to hydrazine assay.

    PubMed

    Sahoo, P; Malathi, N; Ananthanarayanan, R; Praveen, K; Murali, N

    2011-11-01

    We propose a high-precision, rapid, personal computer (PC) based potentiometric titration technique using a specially designed mini-cell to carry out redox titrations for the assay of chemicals in quality control laboratories attached to industrial, R&D, and nuclear establishments. Using this technique, a small sample volume (50-100 μl) in a total volume of ~2 ml of solution can be titrated, and the waste generated after titration is extremely low compared with that of the conventional titration technique. The entire titration, including online data acquisition followed by immediate offline analysis of the data to determine the concentration of the unknown sample, is completed within a couple of minutes (about 2 min). This facility has been created using a new class of sensors, viz., pulsating sensors developed in-house. The basic concept in designing such an instrument and the salient features of the titration device are presented in this paper. The performance of the titration facility was examined by conducting some high-resolution redox titrations using dilute solutions: hydrazine against KIO(3) in HCl medium, Fe(II) against Ce(IV), and uranium using the Davies-Gray method. The precision of titrations using this approach lies between 0.048% and 1.0% relative standard deviation in the different redox titrations. With the evolution of this rapid PC-based titrator it was possible to develop a simple but high-precision potentiometric titration technique for the quick determination of hydrazine in nuclear fuel dissolver solution in the context of reprocessing of spent nuclear fuel in fast breeder reactors. © 2011 American Institute of Physics

  4. A novel approach for high precision rapid potentiometric titrations: Application to hydrazine assay

    NASA Astrophysics Data System (ADS)

    Sahoo, P.; Malathi, N.; Ananthanarayanan, R.; Praveen, K.; Murali, N.

    2011-11-01

    We propose a high-precision, rapid, personal computer (PC) based potentiometric titration technique using a specially designed mini-cell to carry out redox titrations for the assay of chemicals in quality control laboratories attached to industrial, R&D, and nuclear establishments. Using this technique, a small sample volume (50-100 μl) in a total volume of ˜2 ml of solution can be titrated, and the waste generated after titration is extremely low compared with that of the conventional titration technique. The entire titration, including online data acquisition followed by immediate offline analysis of the data to determine the concentration of the unknown sample, is completed within a couple of minutes (about 2 min). This facility has been created using a new class of sensors, viz., pulsating sensors developed in-house. The basic concept in designing such an instrument and the salient features of the titration device are presented in this paper. The performance of the titration facility was examined by conducting some high-resolution redox titrations using dilute solutions: hydrazine against KIO3 in HCl medium, Fe(II) against Ce(IV), and uranium using the Davies-Gray method. The precision of titrations using this approach lies between 0.048% and 1.0% relative standard deviation in the different redox titrations. With the evolution of this rapid PC-based titrator it was possible to develop a simple but high-precision potentiometric titration technique for the quick determination of hydrazine in nuclear fuel dissolver solution in the context of reprocessing of spent nuclear fuel in fast breeder reactors.

  5. Study on the cutting mechanism and the brittle-ductile transition model of isotropic pyrolytic graphite

    NASA Astrophysics Data System (ADS)

    Wang, Minghai; Wang, Hujun; Liu, Zhonghai

    2011-05-01

    Isotropic pyrolytic graphite (IPG) is a new kind of brittle material that can be used for sealing aero-engine turbine shafts and high-temperature ethylene equipment. It not only has the general advantages of ordinary carbonaceous materials, such as high-temperature resistance, lubrication and abrasion resistance, but also offers the impermeability and machinability that carbon/carbon composites do not have. Therefore, it has broad prospects for development. The mechanism of the brittle-ductile transition of IPG is the foundation of its precision cutting, while the plastic deformation of IPG is the essential and most important mechanical behavior in precision cutting. Using strain gradient theory, the material removal mechanism during precision cutting is analyzed, and the critical cutting thickness of IPG is calculated for the first time. Furthermore, the cutting process parameters, such as cutting depth and feed rate, that correspond to the scale of the brittle-ductile transition deformation of IPG are calculated. Finally, based on micromechanics theory, the deformation behaviors of IPG, such as brittle fracture, plastic deformation and their mutual transformation, are simulated under the G. C. Sih fracture criterion, with the material subjected to pressure-shear loading conditions. The results show that the best angle for IPG precision cutting is -30°. The theoretical analysis and the simulation results are validated by precision cutting experiments.

  6. High-precision robotic microcontact printing (R-μCP) utilizing a vision guided selectively compliant articulated robotic arm.

    PubMed

    McNulty, Jason D; Klann, Tyler; Sha, Jin; Salick, Max; Knight, Gavin T; Turng, Lih-Sheng; Ashton, Randolph S

    2014-06-07

    Increased realization of the spatial heterogeneity found within in vivo tissue microenvironments has prompted the desire to engineer similar complexity into in vitro culture substrates. Microcontact printing (μCP) is a versatile technique for engineering such complexity onto cell culture substrates because it permits microscale control of the relative positioning of molecules and cells over large surface areas. However, challenges associated with precisely aligning and superimposing multiple μCP steps severely limit the extent of substrate modification that can be achieved using this method. Thus, we investigated the feasibility of using a vision-guided selectively compliant articulated robotic arm (SCARA) for μCP applications. SCARAs are routinely used to perform high-precision, repetitive tasks in manufacturing, and even low-end models are capable of achieving microscale precision. Here, we present the customization of a SCARA to execute robotic μCP (R-μCP) onto gold-coated microscope coverslips. The system not only possesses the ability to align multiple polydimethylsiloxane (PDMS) stamps but can also do so even after the substrates have been removed, reacted to graft polymer brushes, and replaced back into the system. Moreover, unbiased computerized analysis shows that the system performs such sequential patterning with <10 μm precision and accuracy, which is equivalent to the repeatability specifications of the employed SCARA model. R-μCP should facilitate the engineering of complex, in vivo-like features onto culture substrates and their integration with microfluidic devices.

  7. Comparative analysis on reproducibility among 5 intraoral scanners: sectional analysis according to restoration type and preparation outline form

    PubMed Central

    2016-01-01

    PURPOSE The trueness and precision of images acquired by intraoral digital scanners can be influenced by the restoration type, the preparation outline form, the scanning technology and the application of powder. The aim of this study was to perform a comparative evaluation of the 3-dimensional reproducibility of intraoral scanners (IOSs). MATERIALS AND METHODS A phantom containing five prepared teeth was scanned by a reference scanner (Dental Wings) and 5 test IOSs (E4D dentist, Fastscan, iTero, Trios and Zfx Intrascan). The acquired images of the scanner groups were compared with the image from the reference scanner (trueness) and within each scanner group (precision). Statistical analysis was performed using the independent two-sample t-test and analysis of variance (α=.05). RESULTS The average deviations of trueness and precision of Fastscan, iTero and Trios were significantly lower than those of the other scanners. According to the restoration type, significantly higher trueness was observed in crowns and inlays than in bridges. However, no significant difference was observed among the four sites of the preparation outline form. When compared by IOS characteristics, higher trueness was observed in the group adopting active triangulation and using powder. However, there was no significant difference between the still-image acquisition and video acquisition groups. CONCLUSION Except for two intraoral scanners, Fastscan, iTero and Trios displayed comparable levels of trueness and precision in the tested phantom model. Differences in trueness were observed depending on the restoration type, the preparation outline form and the IOS characteristics, which should be taken into consideration when intraoral scanning data are utilized. PMID:27826385

  8. Understanding deformation with high angular resolution electron backscatter diffraction (HR-EBSD)

    NASA Astrophysics Data System (ADS)

    Britton, T. B.; Hickey, J. L. R.

    2018-01-01

    High angular resolution electron backscatter diffraction (HR-EBSD) affords an increase in angular resolution, as compared to ‘conventional’ Hough-transform-based EBSD, of two orders of magnitude, enabling measurements of relative misorientations of 1 × 10^-4 rad (~0.006°) and changes in (deviatoric) lattice strain with a precision of 1 × 10^-4. This is achieved through direct comparison of two or more diffraction patterns using sophisticated cross-correlation-based image analysis routines. Image shifts between zone axes in the two correlated diffraction patterns are measured with sub-pixel precision, and this realises the ability to measure changes in interplanar angles and lattice orientation with a high degree of sensitivity. These shifts are linked to strains and lattice rotations through simple geometry. In this manuscript, we outline the basis of the technique and two case studies that highlight its potential to tackle real materials science challenges, such as deformation patterning in polycrystalline alloys.
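
    As an illustration of the cross-correlation step described above (a generic sketch under our own simplifying assumptions, not the authors' analysis routines), the shift between two diffraction-pattern regions of interest can be estimated from the peak of their FFT-based cross-correlation and refined to sub-pixel precision with a parabolic fit:

        import numpy as np

        def roi_shift(roi_ref, roi_test):
            """Estimate the (row, col) shift between two equally sized image patches."""
            spectrum = np.fft.fft2(roi_ref) * np.conj(np.fft.fft2(roi_test))
            xc = np.fft.fftshift(np.real(np.fft.ifft2(spectrum)))
            peak = np.unravel_index(np.argmax(xc), xc.shape)

            def subpixel_offset(axis):
                # Parabolic interpolation around the correlation peak along one axis.
                i = peak[axis]
                if i == 0 or i == xc.shape[axis] - 1:
                    return 0.0
                index = list(peak)
                index[axis] = slice(i - 1, i + 2)
                a, b, c = xc[tuple(index)]
                return 0.5 * (a - c) / (a - 2.0 * b + c)

            centre = np.array(xc.shape) // 2
            return (peak[0] - centre[0] + subpixel_offset(0),
                    peak[1] - centre[1] + subpixel_offset(1))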

  9. [High-precision in situ analysis of the lead isotopic composition in copper using femtosecond laser ablation MC-ICP-MS and the application in ancient coins].

    PubMed

    Chen, Kai-Yun; Fan, Chao; Yuan, Hong-Lin; Bao, Zhi-An; Zong, Chun-Lei; Dai, Meng-Ning; Ling, Xue; Yang, Ying

    2013-05-01

    In the present study we set up a femtosecond laser ablation MC-ICP-MS method for lead isotopic analysis. The Pb isotopic compositions of fifteen copper (brass, bronze) standard samples from the National Institute of Standards Material were analyzed using the solution method (MC-ICP-MS) and the laser method (fLA-MC-ICP-MS), respectively. The results showed that the Pb isotopic composition of CuPb12 (GBW02137) is very homogeneous, and it can be used as an external reference material for in situ Pb isotopic analysis. Based on 112 fLA-MC-ICP-MS Pb isotope analyses of CuPb12, the weighted average Pb isotopic ratios are in good agreement with the results of the bulk solution method within 2σ error; the internal precisions (RSE) of the 208Pb/204Pb and 207Pb/206Pb ratios are less than 90 and 40 ppm, respectively, and their external precisions (RSD) are less than 60 and 30 ppm, respectively. The Pb isotopes of thirteen ancient bronze coins were analyzed via fLA-MC-ICP-MS. The results showed that the Pb isotopic compositions of ancient coins from different dynasties differ significantly, and even coins from the same dynasty do not all agree with each other in Pb isotopic composition.

  10. Inadequacy, Impurity and Infidelity; Modifying the Modified Brendel Alpha-Cellulose Extraction Method for Resinous Woods in Stable Isotope Dendroclimatology

    NASA Astrophysics Data System (ADS)

    Brookman, T. H.; Whittaker, T. E.; King, P. L.; Horton, T. W.

    2011-12-01

    Stable isotope dendroclimatology is a burgeoning field in palaeoclimate science due to its unique potential to contribute (sub)annually resolved climate records, over millennial timescales, to the terrestrial palaeoclimate record. Until recently, time-intensive methods precluded long-term climate reconstructions. Advances in continuous-flow mass spectrometry and in isolation methods for α-cellulose (ideal for palaeoclimate studies as, unlike other wood components, it retains its initial isotopic composition) have made long-term, calendar-dated palaeoclimate reconstructions a viable proposition. The Modified Brendel (mBrendel) α-cellulose extraction method is a fast, cost-effective way of preparing whole-wood samples for stable oxygen and carbon isotope analysis. However, resinous woods often yield incompletely processed α-cellulose using the standard mBrendel approach. As climate signals may be recorded by small (<1‰) isotopic shifts, it is important to investigate whether incomplete processing affects the accuracy and precision of tree-ring isotopic records. In an effort to address this methodological issue, we investigated three highly resinous woods: kauri (Agathis australis), ponderosa pine (Pinus ponderosa) and huon pine (Lagarostrobos franklinii). Samples of each species were treated with 16 iterations of the mBrendel, varying reaction temperature, time and reagent volumes. Products were investigated using microscopic and bulk transmission Fourier transform infrared spectroscopy (FTIR) to reveal variations in the level of processing; poorly digested fibres display a peak at 1520 cm-1 suggesting residual lignin, and a peak at ~1600 cm-1 in some samples suggests retained resin. Despite the different levels of purity, replicate analyses of samples processed by high-temperature digestion yielded consistent δ18O values within and between experiments. All α-cellulose samples were 5-7‰ enriched compared with the whole wood, suggesting that even incomplete processing at high temperature can provide acceptable δ18O analytical external precision. For kauri, short, lower-temperature extractions produced α-cellulose with δ18O values consistently ~1‰ lower than those from longer, higher-temperature kauri experiments. These findings suggest that temperature and time are significant variables that influence the analytical precision of α-cellulose stable isotope analysis and that resinous hardwoods (e.g., kauri) may require longer and/or hotter digestions than softwoods. The effects of mBrendel variants on the carbon isotope ratio precision of α-cellulose extracts will also be presented. Our findings indicate that the standard mBrendel α-cellulose extraction method may not fully remove lignins and resins, depending on the type of wood being analysed. Residual impurities can decrease analytical precision and accuracy. Fortunately, FTIR analysis prior to isotopic analysis is a relatively fast and cost-effective way to determine α-cellulose extract purity, ultimately improving the data quality, accuracy and utility of tree-ring based stable isotopic climate records.

  11. Water vapor δ2H, δ18O and δ17O measurements using an off-axis integrated cavity output spectrometer - sensitivity to water vapor concentration, delta value and averaging-time.

    PubMed

    Tian, Chao; Wang, Lixin; Novick, Kimberly A

    2016-10-15

    High-precision analysis of atmospheric water vapor isotope compositions, especially δ17O values, can be used to improve our understanding of multiple hydrological and meteorological processes (e.g., to differentiate equilibrium from kinetic fractionation). This study focused on assessing, for the first time, how the accuracy and precision of vapor δ17O laser spectroscopy measurements depend on vapor concentration, delta range, and averaging time. A Triple Water Vapor Isotope Analyzer (T-WVIA) was used to evaluate the accuracy and precision of δ2H, δ18O and δ17O measurements. The sensitivity of accuracy and precision to water vapor concentration was evaluated using two international standards (GISP and SLAP2). The sensitivity of precision to delta value was evaluated using four working standards spanning a large delta range. The sensitivity of precision to averaging time was assessed by measuring one standard continuously for 24 hours. Overall, the accuracy and precision of the δ2H, δ18O and δ17O measurements were high. Across all vapor concentrations, the accuracy of the δ2H, δ18O and δ17O observations ranged from 0.10‰ to 1.84‰, 0.08‰ to 0.86‰ and 0.06‰ to 0.62‰, respectively, and the precision ranged from 0.099‰ to 0.430‰, 0.009‰ to 0.080‰ and 0.022‰ to 0.054‰, respectively. The accuracy and precision of all isotope measurements were sensitive to concentration, with higher accuracy and precision generally observed at moderate vapor concentrations (i.e., 10000-15000 ppm) for all isotopes. The precision was also sensitive to the range of delta values, although this effect was smaller than that of concentration. The precision was much less sensitive to averaging time than to the concentration and delta-range effects. The accuracy and precision of the T-WVIA therefore depend on concentration but depend less on the delta value and averaging time. The instrument can simultaneously and continuously measure δ2H, δ18O and δ17O values in water vapor, opening a new window to better understand ecological, hydrological and meteorological processes. Copyright © 2016 John Wiley & Sons, Ltd.

  12. A novel high sensitivity HPLC assay for topiramate, using 4-chloro-7-nitrobenzofurazan as pre-column fluorescence derivatizing agent.

    PubMed

    Bahrami, Gholamreza; Mohammadi, Bahareh

    2007-05-01

    A new, sensitive and simple high-performance liquid chromatographic method for the analysis of topiramate, an antiepileptic agent, using 4-chloro-7-nitrobenzofurazan as a pre-column derivatization agent is described. Following liquid-liquid extraction of topiramate and an internal standard (amlodipine) from human serum, derivatization of the drugs was performed with the labeling agent in the presence of dichloromethane, methanol, acetonitrile and borate buffer (0.05 M; pH 10.6). A mixture of sodium phosphate buffer (0.05 M; pH 2.4) and methanol (35:65 v/v) was used as the mobile phase, and chromatographic separation was achieved using a Shimpack CLC-C18 (150 x 4.6 mm) column. A limit of quantification of 0.01 microg/mL was obtained, and the procedure was validated over the concentration range of 0.01 to 12.8 microg/mL. No interferences were found from commonly co-administered antiepileptic drugs, including phenytoin, phenobarbital, carbamazepine, lamotrigine, zonisamide, primidone, gabapentin, vigabatrin, and ethosuximide. The analytical performance was evaluated in terms of specificity, sensitivity, linearity, precision, accuracy and stability, and the method was shown to be accurate, with intra-day and inter-day accuracy from -3.4 to 10%, and precise, with intra-day and inter-day precision from 1.1 to 18%.

  13. Validation of an analytical method for simultaneous high-precision measurements of greenhouse gas emissions from wastewater treatment plants using a gas chromatography-barrier discharge detector system.

    PubMed

    Pascale, Raffaella; Caivano, Marianna; Buchicchio, Alessandro; Mancini, Ignazio M; Bianco, Giuliana; Caniani, Donatella

    2017-01-13

    Wastewater treatment plants (WWTPs) emit CO2 and N2O, which may contribute to climate change and global warming. Over the last few years, awareness of greenhouse gas (GHG) emissions from WWTPs has increased. Moreover, the development of valid, reliable, and high-throughput analytical methods for simultaneous gas analysis is an essential requirement for environmental applications. In the present study, an analytical method based on a gas chromatograph (GC) equipped with a barrier ionization discharge (BID) detector was developed for the first time. This new method analyses CO2 and N2O simultaneously and has a precision, measured in terms of relative standard deviation (RSD%), equal to or less than 6.6% and 5.1%, respectively. The method's detection limits are 5.3 ppmv for CO2 and 62.0 ppbv for N2O. The method's selectivity, linearity, accuracy, repeatability, intermediate precision, limit of detection and limit of quantification were good at trace concentration levels. After validation, the method was applied to a real case of N2O and CO2 emissions from a WWTP, confirming its suitability as a standard procedure for simultaneous GHG analysis in environmental samples containing CO2 at levels below 12,000 mg/L. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. High resolution and high precision on line isotopic analysis of Holocene and glacial ice performed in the field

    NASA Astrophysics Data System (ADS)

    Gkinis, V.; Popp, T. J.; Johnsen, S. J.; Blunier, T.; Bigler, M.; Stowasser, C.; Schüpbach, S.; Leuenberger, D.

    2010-12-01

    Ice core records obtained from polar ice caps provide a wealth of paleoclimatic information. One of the main features of ice cores is their potential for high temporal resolution. The isotopic signature of the ice, expressed through the relative abundances of the two heavy isotopologues H2(18)O and HD(16)O, is a widely used proxy for the reconstruction of past temperature and accumulation. One step further, the combined information obtained from these two isotopologues, commonly referred to as the deuterium excess, can be utilized to infer additional information about the source of the precipitated moisture. Until very recently, isotopic analysis of polar ice was performed with Isotope Ratio Mass Spectrometry (IRMS) in a discrete fashion, resulting in a high workload related to the preparation of samples. More importantly, the available temporal resolution of the ice core was in many cases not fully exploited. In order to overcome these limitations we have developed a system that interfaces a commercially available IR laser cavity ring-down spectrometer tailored for water isotope analysis to a stream of liquid water extracted from a continuously melted ice rod. The system offers the possibility of simultaneous δ18O and δD analysis with a sample requirement of approximately 0.1 ml/min. The system was deployed in the field during the NEEM ice core drilling project in 2009 and 2010. In this study we present actual on-line measurements of Holocene and glacial ice. We also discuss how parameters such as the melt rate, acquisition rate and integration time affect the obtained precision and resolution, and we describe data analysis techniques that can improve these last two parameters. By applying spectral methods we are able to quantify the smoothing effects imposed by diffusion of the sample in the sample transfer lines and the optical cavity of the instrument. We demonstrate that with an acquisition rate of 0.2 Hz we are able to obtain a precision of 0.5‰ and 0.15‰ for δD and δ18O, respectively. This is comparable to the performance of traditional IRMS systems for δD but slightly less precise for δ18O. With this acquisition rate the system's 3 dB bandwidth is 0.006 Hz. With a melt rate of 3 cm/min, the latter translates to signals with wavelengths of 8.3 cm. We will comment on the quality of the acquired ice core data and their potential use for dating, paleotemperature reconstruction, isotopic firn diffusion and deuterium excess studies.
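
    The conversion quoted above from the 3 dB cut-off frequency to a spatial wavelength is simply the melt rate divided by the cut-off frequency; a quick check of the numbers:

        melt_rate_cm_per_s = 3.0 / 60.0        # melt rate of 3 cm/min expressed in cm/s
        f_3db_hz = 0.006                       # quoted 3 dB bandwidth of the system
        cutoff_wavelength_cm = melt_rate_cm_per_s / f_3db_hz
        print(round(cutoff_wavelength_cm, 1))  # -> 8.3, matching the 8.3 cm quoted above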

  15. Reference geometry-based detection of (4D-)CT motion artifacts: a feasibility study

    NASA Astrophysics Data System (ADS)

    Werner, René; Gauer, Tobias

    2015-03-01

    Respiration-correlated computed tomography (4D or 3D+t CT) can be considered the standard of care in radiation therapy treatment planning for lung and liver lesions. The decision about whether to apply motion management devices and the estimation of patient-specific motion effects on the dose distribution rely on precise motion assessment in the planning 4D CT data, which is impeded in the case of CT motion artifacts. The development of image-based/post-processing approaches to reduce motion artifacts would benefit from precise detection and localization of the artifacts. Simple slice-by-slice comparison of intensity values and threshold-based analysis of related metrics suffer, depending on the threshold, from high false-positive or false-negative rates. In this work, we propose exploiting prior knowledge about 'ideal' (i.e., artifact-free) reference geometries to stabilize metric-based artifact detection by transferring (multi-)atlas-based concepts to this specific task. Two variants are introduced and evaluated: (S1) analysis and comparison of warped atlas data obtained by repeated non-linear atlas-to-patient registration with different levels of regularization; (S2) direct analysis of vector field properties (divergence, curl magnitude) of the atlas-to-patient transformation. The feasibility of approaches (S1) and (S2) is evaluated with motion-phantom data and intra-subject experiments (four patients) as well as, adopting a multi-atlas strategy, inter-subject investigations (twelve patients). It is demonstrated that especially sorting/double-structure artifacts can be precisely detected and localized by (S1). In contrast, (S2) suffers from high false-positive rates.

  16. Development and validation of high-performance liquid chromatography and high-performance thin-layer chromatography methods for the quantification of khellin in Ammi visnaga seed

    PubMed Central

    Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar

    2015-01-01

    Objective: The present study was designed to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of the standard solutions and the peak response was linear in both the HPLC and HPTLC methods, over the concentration ranges of 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The % relative standard deviation values for method precision were found to be 0.63–1.97% in HPLC and 0.62–2.05% in HPTLC for khellin. The accuracy of the methods was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC for khellin. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and several formulations containing it as an ingredient. PMID:26681890

  17. High Precision Oxygen Three Isotope Analysis of Wild-2 Particles and Anhydrous Chondritic Interplanetary Dust Particles

    NASA Technical Reports Server (NTRS)

    Nakashima, D.; Ushikubo, T.; Zolensky, Michael E.; Weisberg, M. K.; Joswiak, D. J.; Brownlee, D. E.; Matrajt, G.; Kita, N. T.

    2011-01-01

    One of the most important discoveries from the comet Wild-2 samples was the observation of crystalline silicate particles that resemble chondrules and CAIs in carbonaceous chondrites. Previous oxygen isotope analyses of crystalline silicate terminal particles showed heterogeneous oxygen isotope ratios, with δ18O ≈ δ17O values down to -50‰ in the CAI-like particle Inti, a relict olivine grain in Gozen-sama, and an olivine particle. However, many Wild-2 particles as well as ferromagnesian silicates in anhydrous interplanetary dust particles (IDPs) showed Δ17O values that cluster around -2‰. In carbonaceous chondrites, chondrules seem to show two major isotope reservoirs with Δ17O values at -5‰ and -2‰. It has been suggested that Δ17O = -2‰ is the common oxygen isotope reservoir for carbonaceous chondrite chondrules and cometary dust, from the outer asteroid belt to the Kuiper belt region. However, a larger dataset of high-precision isotope analyses (±1-2‰) is still needed to resolve the similarities or distinctions among Wild-2 particles, IDPs and chondrules in meteorites. We have made significant efforts to establish routine analyses of small particles (≤10 μm) at 1-2‰ precision using the IMS-1280 at the WiscSIMS laboratory. Here we report new results of high-precision oxygen isotope analyses of Wild-2 particles and anhydrous chondritic IDPs, and discuss the relationship between cometary dust and carbonaceous chondrite chondrules.

  18. Evaluation of the accuracy of 7 digital scanners: An in vitro analysis based on 3-dimensional comparisons.

    PubMed

    Renne, Walter; Ludlow, Mark; Fryml, John; Schurch, Zach; Mennito, Anthony; Kessler, Ray; Lauer, Abigail

    2017-07-01

    As digital impressions become more common and more digital impression systems are released onto the market, it is essential to evaluate their accuracy systematically and objectively. The purpose of this in vitro study was to evaluate and compare the trueness and precision of 6 intraoral scanners and 1 laboratory scanner in both sextant and complete-arch scenarios. Furthermore, scanning time was evaluated and correlated with trueness and precision. A custom complete-arch model was fabricated with a refractive index similar to that of tooth structure. Seven digital impression systems were used to scan the custom model in both posterior sextant and complete-arch scenarios. Analysis was performed using 3-dimensional metrology software to measure discrepancies between the master model and the experimental casts. Of the intraoral scanners, the Planscan was found to have the best trueness and precision, while the 3Shape Trios was found to have the poorest, for sextant scanning (P<.001). The order of trueness for complete-arch scanning was as follows: 3Shape D800 >iTero >3Shape TRIOS 3 >Carestream 3500 >Planscan >CEREC Omnicam >CEREC Bluecam. The order of precision for complete-arch scanning was as follows: CS3500 >iTero >3Shape D800 >3Shape TRIOS 3 >CEREC Omnicam >Planscan >CEREC Bluecam. For the secondary outcome evaluating the effect of time on trueness and precision, the complete-arch scan time was highly correlated with both trueness (r=0.771) and precision (r=0.771). For sextant scanning, the Planscan was found to be the most precise and true scanner. For complete-arch scanning, the 3Shape Trios was found to have the best balance of speed and accuracy. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  19. Use of an automated chromium reduction system for hydrogen isotope ratio analysis of physiological fluids applied to doubly labeled water analysis.

    PubMed

    Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C

    2000-09-01

    The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput, chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference, plasma by 10000 Da exclusion filtration, saliva by sedimentation and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water. Copyright 2000 John Wiley & Sons, Ltd.

  20. An anatomically comprehensive atlas of the adult human brain transcriptome

    PubMed Central

    Guillozet-Bongaarts, Angela L.; Shen, Elaine H.; Ng, Lydia; Miller, Jeremy A.; van de Lagemaat, Louie N.; Smith, Kimberly A.; Ebbert, Amanda; Riley, Zackery L.; Abajian, Chris; Beckmann, Christian F.; Bernard, Amy; Bertagnolli, Darren; Boe, Andrew F.; Cartagena, Preston M.; Chakravarty, M. Mallar; Chapin, Mike; Chong, Jimmy; Dalley, Rachel A.; David Daly, Barry; Dang, Chinh; Datta, Suvro; Dee, Nick; Dolbeare, Tim A.; Faber, Vance; Feng, David; Fowler, David R.; Goldy, Jeff; Gregor, Benjamin W.; Haradon, Zeb; Haynor, David R.; Hohmann, John G.; Horvath, Steve; Howard, Robert E.; Jeromin, Andreas; Jochim, Jayson M.; Kinnunen, Marty; Lau, Christopher; Lazarz, Evan T.; Lee, Changkyu; Lemon, Tracy A.; Li, Ling; Li, Yang; Morris, John A.; Overly, Caroline C.; Parker, Patrick D.; Parry, Sheana E.; Reding, Melissa; Royall, Joshua J.; Schulkin, Jay; Sequeira, Pedro Adolfo; Slaughterbeck, Clifford R.; Smith, Simon C.; Sodt, Andy J.; Sunkin, Susan M.; Swanson, Beryl E.; Vawter, Marquis P.; Williams, Derric; Wohnoutka, Paul; Zielke, H. Ronald; Geschwind, Daniel H.; Hof, Patrick R.; Smith, Stephen M.; Koch, Christof; Grant, Seth G. N.; Jones, Allan R.

    2014-01-01

    Neuroanatomically precise, genome-wide maps of transcript distributions are critical resources to complement genomic sequence data and to correlate functional and genetic brain architecture. Here we describe the generation and analysis of a transcriptional atlas of the adult human brain, comprising extensive histological analysis and comprehensive microarray profiling of ~900 neuroanatomically precise subdivisions in two individuals. Transcriptional regulation varies enormously by anatomical location, with different regions and their constituent cell types displaying robust molecular signatures that are highly conserved between individuals. Analysis of differential gene expression and gene co-expression relationships demonstrates that brain-wide variation strongly reflects the distributions of major cell classes such as neurons, oligodendrocytes, astrocytes and microglia. Local neighbourhood relationships between fine anatomical subdivisions are associated with discrete neuronal subtypes and genes involved with synaptic transmission. The neocortex displays a relatively homogeneous transcriptional pattern, but with distinct features associated selectively with primary sensorimotor cortices and with enriched frontal lobe expression. Notably, the spatial topography of the neocortex is strongly reflected in its molecular topography: the closer two cortical regions, the more similar their transcriptomes. This freely accessible online data resource forms a high-resolution transcriptional baseline for neurogenetic studies of normal and abnormal human brain function. PMID:22996553

  1. High-Precision Registration of Point Clouds Based on Sphere Feature Constraints.

    PubMed

    Huang, Junhui; Wang, Zhao; Gao, Jianmin; Huang, Youping; Towers, David Peter

    2016-12-30

    Point cloud registration is a key process in multi-view 3D measurements. Its precision directly affects the measurement precision. However, in the case of point clouds with non-overlapping areas or curvature-invariant surfaces, it is difficult to achieve high precision. A high-precision registration method based on sphere feature constraints is presented in this paper to overcome this difficulty. Known sphere features with constraints are used to construct virtual overlapping areas. The virtual overlapping areas provide more accurate corresponding point pairs and reduce the influence of noise. The transformation parameters between the registered point clouds are then solved by an optimization method with a weight function. In this way, the impact of large noise in the point clouds is reduced and high-precision registration is achieved. Simulations and experiments validate the proposed method.
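
    As an illustration of how matched sphere features constrain the rigid transformation (a generic closed-form sketch, not the authors' weighted optimization), the rotation and translation aligning two sets of corresponding sphere centres can be obtained with an SVD:

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares rotation R and translation t with R @ src_i + t ≈ dst_i.

            src, dst : (N, 3) arrays of matched feature points (e.g. sphere centres).
            """
            src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
            h = (src - src_c).T @ (dst - dst_c)        # cross-covariance matrix
            u, _, vt = np.linalg.svd(h)
            d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
            r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
            t = dst_c - r @ src_c
            return r, t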

  2. High-Precision Registration of Point Clouds Based on Sphere Feature Constraints

    PubMed Central

    Huang, Junhui; Wang, Zhao; Gao, Jianmin; Huang, Youping; Towers, David Peter

    2016-01-01

    Point cloud registration is a key process in multi-view 3D measurements. Its precision directly affects the measurement precision. However, in the case of point clouds with non-overlapping areas or curvature-invariant surfaces, it is difficult to achieve high precision. A high-precision registration method based on sphere feature constraints is presented in this paper to overcome this difficulty. Known sphere features with constraints are used to construct virtual overlapping areas. The virtual overlapping areas provide more accurate corresponding point pairs and reduce the influence of noise. The transformation parameters between the registered point clouds are then solved by an optimization method with a weight function. In this way, the impact of large noise in the point clouds is reduced and high-precision registration is achieved. Simulations and experiments validate the proposed method. PMID:28042846

  3. HPMSS (High Precision Magnetic Survey System) and InterRidge

    NASA Astrophysics Data System (ADS)

    Isezaki, N.; Sayanagi, K.

    2012-12-01

    From the beginning of the 1990s to the beginning of the 2000s, the Japanese group of InterRidge conducted many cruises for three-component magnetic surveys using the Shipboard Three Component Magnetometer (STCM) and the Deep Towed Three Component Magnetometer (DTCM) in oceans worldwide. We have been developing HPMSS during this time with the support of Dr. Tamaki (the late representative of InterRidge Japan), who understood the advantages of three-component geomagnetic anomalies (TCGA). TCGA measured by STCM determine the direction of geomagnetic anomaly lineations precisely at every point where they are observed, which plays an important role in magnetic anomaly lineation analysis. Even in the beginning of the 2000s, almost all marine magnetic scientists believed that the total intensity anomaly (TIA) provided better data for analysis than TCGA, because scalar magnetometers (e.g., proton precession magnetometers) have better accuracy than other magnetometers (e.g., fluxgate magnetometers, FGM). We employed highly accurate gyroscopes (e.g., ring laser gyroscopes (RLG) and optical fiber gyroscopes (OFG)) to improve the accuracy of the STCM/DTCM equipped with FGM. Moreover, we employed an accurate and precise FGM selected from those available on the market. Finally, we developed a new high-precision magnetic survey system usable as an airborne, shipboard or deep-towed magnetometer, which we call HPMSS (High Precision Magnetic Survey System). As optional equipment, we use a LAN to communicate between the data acquisition and data logging units, and GPS for position fixes. For deep-towed surveys, we use an acoustic position fix (super-short baseline method) and acoustic communication to monitor the DTCM status. We first used HPMSS to obtain the magnetization structure of the volcanic island Aogashima, located 300 km south of Tokyo, using a helicopter in 2006 and 2009. Next, we used HPMSS installed in the DTCM in 2010, 2011 and 2012 on the R/V Bosei-maru of Tokai University. We also used HPMSS installed in an AUV (autonomous undersea vehicle) belonging to JAMSTEC in 2009, 2010 and 2011. We have been emphasizing the importance of TCGA compared with TIA because TIA does not obey the Laplace equation, meaning TIA is not harmonic, and therefore Fourier analysis cannot be applied to it. We will show the three-component magnetization structure of the mineral deposit in the volcanic thermal area of the Izu-Ogasawara island arc called the Hakurei deposit. TCGA from the DTCM and AUV survey data were used, and depth and vertical sections of the three components of magnetization of the Hakurei deposit area will be presented. We emphasize that a reliable 3D structure of the three components of magnetization was obtained from TCGA using HPMSS as a result of the strong support of InterRidge Japan, and especially of Dr. Tamaki.

  4. A novel algorithm for a precise analysis of subchondral bone alterations

    PubMed Central

    Gao, Liang; Orth, Patrick; Goebel, Lars K. H.; Cucchiarini, Magali; Madry, Henning

    2016-01-01

    Subchondral bone alterations are emerging as considerable clinical problems associated with articular cartilage repair. Their analysis exposes a pattern of variable changes, including intra-lesional osteophytes, residual microfracture holes, peri-hole bone resorption, and subchondral bone cysts. A precise distinction between them is becoming increasingly important. Here, we present a tailored algorithm based on continuous data to analyse subchondral bone changes using micro-CT images, allowing for a clear definition of each entity. We evaluated this algorithm using data sets originating from two large animal models of osteochondral repair. Intra-lesional osteophytes were detected in 3 of 10 defects in the minipig and in 4 of 5 defects in the sheep model. Peri-hole bone resorption was found in 22 of 30 microfracture holes in the minipig and in 17 of 30 microfracture holes in the sheep model. Subchondral bone cysts appeared in 1 microfracture hole in the minipig and in 5 microfracture holes in the sheep model (n = 30 holes each). Calculation of inter-rater agreement (90% agreement) and Cohen’s kappa (kappa = 0.874) revealed that the novel algorithm is highly reliable, reproducible, and valid. Comparison analysis with the best existing semi-quantitative evaluation method was also performed, supporting the enhanced precision of this algorithm. PMID:27596562

  5. Significantly improved precision of cell migration analysis in time-lapse video microscopy through use of a fully automated tracking system

    PubMed Central

    2010-01-01

    Background Cell motility is a critical parameter in many physiological as well as pathophysiological processes. In time-lapse video microscopy, manual cell tracking remains the most common method of analyzing the migratory behavior of cell populations. In addition to being labor-intensive, this method is susceptible to user-dependent errors regarding the selection of "representative" subsets of cells and the manual determination of precise cell positions. Results We have quantitatively analyzed these error sources, demonstrating that manual tracking of pancreatic cancer cells led to miscalculations of migration rates of up to 410%. In order to provide objective measurements of cell migration rates, we have employed multi-target tracking technologies commonly used in radar applications to develop a fully automated cell identification and tracking system suitable for high-throughput screening of video sequences of unstained living cells. Conclusion We demonstrate that our automatic multi-target tracking system identifies cell objects, follows individual cells and computes migration rates with high precision, clearly outperforming manual procedures. PMID:20377897

  6. Precision production: enabling deterministic throughput for precision aspheres with MRF

    NASA Astrophysics Data System (ADS)

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer being used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  7. On the use of particle filters for electromagnetic tracking in high dose rate brachytherapy.

    PubMed

    Götz, Th I; Lahmer, G; Brandt, T; Kallis, K; Strnad, V; Bert, Ch; Hensel, B; Tomé, A M; Lang, E W

    2017-09-12

    Modern radiotherapy of female breast cancer often employs high dose rate brachytherapy, where a radioactive source is moved inside catheters, implanted in the breast, according to a prescribed treatment plan. Source localization relative to the patient's anatomy is determined with solenoid sensors whose spatial positions are measured with an electromagnetic tracking system. Precise determination of the sensor dwell positions is of utmost importance to ensure irradiation of the cancerous tissue according to the treatment plan. We present a hybrid data analysis system which combines multi-dimensional scaling with particle filters to precisely determine sensor dwell positions in the catheters during subsequent radiation treatment sessions. Both techniques are complemented with empirical mode decomposition for the removal of superimposed breathing artifacts. We show that the hybrid model robustly and reliably determines the spatial positions of all catheters used during the treatment and precisely determines any deviations of actual sensor dwell positions from the treatment plan. The hybrid system relies only on sensor positions measured with an EMT system and relates them to the spatial positions of the implanted catheters as initially determined with computed x-ray tomography.
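
    For background on the particle-filter component mentioned above (a generic bootstrap filter sketch under assumed random-walk motion and Gaussian measurement models, not the authors' hybrid MDS/particle-filter system), each tracking step propagates weighted position hypotheses, reweights them against the latest EMT measurement, and resamples when the weights degenerate:

        import numpy as np

        def bootstrap_filter_step(particles, weights, measurement,
                                  motion_std=1.0, meas_std=2.0, rng=np.random.default_rng()):
            """One predict/update/resample step of a bootstrap particle filter.

            particles   : (N, 3) array of hypothesised sensor positions (mm)
            weights     : (N,) normalised importance weights
            measurement : (3,) EMT position measurement (mm)
            """
            # Predict: random-walk motion model (an assumption for this sketch).
            particles = particles + rng.normal(0.0, motion_std, particles.shape)
            # Update: Gaussian measurement likelihood (also an assumption).
            d2 = np.sum((particles - measurement) ** 2, axis=1)
            weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
            weights /= weights.sum()
            # Resample when the effective sample size degenerates.
            if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
                idx = rng.choice(len(weights), size=len(weights), p=weights)
                particles = particles[idx]
                weights = np.full(len(weights), 1.0 / len(weights))
            return particles, weights

    The position estimate at each step would then be the weight-averaged particle position, e.g. np.average(particles, weights=weights, axis=0).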

  8. Analysis of key technologies in geomagnetic navigation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoming; Zhao, Yan

    2008-10-01

    Because of the high cost and error accumulation of high-precision Inertial Navigation Systems (INS) and the vulnerability of Global Navigation Satellite Systems (GNSS), geomagnetic navigation, a passive autonomous navigation method, is receiving renewed attention. The geomagnetic field is a natural spatial physical field and is a function of position and time in near-Earth space. Navigation based on the geomagnetic field is being researched for a wide range of commercial and military applications. This paper presents the main features and the state of the art of Geomagnetic Navigation Systems (GMNS). Geomagnetic field models and reference maps are described. Obtaining, modeling and updating accurate magnetic anomaly field information is an important step towards high-precision geomagnetic navigation. In addition, the errors of geomagnetic measurements using strapdown magnetometers are analyzed. Precise geomagnetic data are obtained by means of magnetometer calibration and compensation of the vehicle's magnetic field. From the measurement data and a reference map or model of the geomagnetic field, the vehicle's position and attitude can be obtained using a matching algorithm or a state-estimation method. Trends in geomagnetic navigation in the near future are discussed at the end of this paper.

  9. Envirotyping for deciphering environmental impacts on crop plants.

    PubMed

    Xu, Yunbi

    2016-04-01

    Global climate change imposes increasing impacts on our environments and crop production. To decipher environmental impacts on crop plants, the concept of "envirotyping" is proposed as a third "typing" technology, complementing genotyping and phenotyping. Environmental factors can be collected through multiple environmental trials, geographic and soil information systems, measurement of soil and canopy properties, and evaluation of companion organisms. Envirotyping contributes to crop modeling and phenotype prediction through its functional components, including genotype-by-environment interaction (GEI), genes responsive to environmental signals, biotic and abiotic stresses, and integrative phenotyping. Envirotyping, driven by information and support systems, has a wide range of applications, including environmental characterization, GEI analysis, phenotype prediction, near-iso-environment construction, agronomic genomics, precision agriculture and breeding, and development of a four-dimensional profile of crop science involving genotype (G), phenotype (P), envirotype (E) and time (T) (developmental stage). In the future, envirotyping needs to zoom into specific experimental plots and individual plants, along with the development of high-throughput and precision envirotyping platforms, to integrate genotypic, phenotypic and envirotypic information for establishing a highly efficient precision breeding and sustainable crop production system based on deciphered environmental impacts.

  10. Personalized In Vitro and In Vivo Cancer Models to Guide Precision Medicine

    PubMed Central

    Pauli, Chantal; Hopkins, Benjamin D.; Prandi, Davide; Shaw, Reid; Fedrizzi, Tarcisio; Sboner, Andrea; Sailer, Verena; Augello, Michael; Puca, Loredana; Rosati, Rachele; McNary, Terra J.; Churakova, Yelena; Cheung, Cynthia; Triscott, Joanna; Pisapia, David; Rao, Rema; Mosquera, Juan Miguel; Robinson, Brian; Faltas, Bishoy M.; Emerling, Brooke E.; Gadi, Vijayakrishna K.; Bernard, Brady; Elemento, Olivier; Beltran, Himisha; Dimichelis, Francesca; Kemp, Christopher J.; Grandori, Carla; Cantley, Lewis C.; Rubin, Mark A.

    2017-01-01

    Precision Medicine is an approach that takes into account the influence of individuals' genes, environment and lifestyle exposures to tailor interventions. Here, we describe the development of a robust precision cancer care platform, which integrates whole exome sequencing (WES) with a living biobank that enables high throughput drug screens on patient-derived tumor organoids. To date, 56 tumor-derived organoid cultures and 19 patient-derived xenograft (PDX) models have been established from the 769 patients enrolled in an IRB-approved clinical trial. Because genomics alone was insufficient to identify therapeutic options for the majority of patients with advanced disease, we used high throughput drug screening to identify effective strategies. Analysis of tumor-derived cells from four cases, two uterine malignancies and two colon cancers, identified effective drugs and drug combinations that were subsequently validated using 3D cultures and PDX models. This platform thereby promotes the discovery of novel therapeutic approaches that can be assessed in clinical trials and provides personalized therapeutic options for individual patients where standard clinical options have been exhausted. PMID:28331002

  11. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    NASA Technical Reports Server (NTRS)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  12. Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes

    NASA Astrophysics Data System (ADS)

    Liu, Weiguo; Shen, Ping; Jin, Na

    In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and analyze the thermomechanical properties of the chalcogenide glasses, and the creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described with the simulation results.

  13. Mobile mapping of methane emissions and isoscapes

    NASA Astrophysics Data System (ADS)

    Takriti, Mounir; Ward, Sue; Wynn, Peter; Elias, Dafydd; McNamara, Niall

    2017-04-01

    Methane (CH4) is a potent greenhouse gas emitted from a variety of natural and anthropogenic sources. It is crucial to accurately and efficiently detect CH4 emissions and identify their sources to improve our understanding of changing emission patterns as well as to identify ways to curtail their release into the atmosphere. However, using established methods this can be challenging as well as time and resource intensive due to the temporal and spatial heterogeneity of many sources. To address this problem, we have developed a vehicle mounted mobile system that combines high precision CH4 measurements with isotopic mapping and dual isotope source characterisation. We here present details of the development and testing of a unique system for the detection and isotopic analysis of CH4 plumes built around a Picarro isotopic (13C/12C) gas analyser and a high precision Los Gatos greenhouse gas analyser. Combined with micrometeorological measurements and a mechanism for collecting discrete samples for high precision dual isotope (13C/12C, 2H/1H) analysis the system enables mapping of concentrations as well as directional and isotope based source verification. We then present findings from our mobile methane surveys around the North West of England. This area includes a variety of natural and anthropogenic methane sources within a relatively small geographical area, including livestock farming, urban and industrial gas infrastructure, landfills and waste water treatment facilities, and wetlands. We show that the system was successfully able to locate leaks from natural gas infrastructure and emissions from agricultural activities and to distinguish isotope signatures from these sources.

  14. An evaluation of the accuracy and precision of methane prediction equations for beef cattle fed high-forage and high-grain diets.

    PubMed

    Escobar-Bahamondes, P; Oba, M; Beauchemin, K A

    2017-01-01

    The study determined the performance of equations to predict enteric methane (CH4) from beef cattle fed forage- and grain-based diets. Many equations are available to predict CH4 from beef cattle and the predictions vary substantially among equations. The aims were to (1) construct a database of CH4 emissions for beef cattle from published literature, and (2) identify the most precise and accurate extant CH4 prediction models for beef cattle fed diets varying in forage content. The database comprised treatment means of CH4 production from in vivo beef studies published from 2000 to 2015. Criteria to include data in the database were as follows: animal description, intakes, diet composition and CH4 production. In all, 54 published equations that predict CH4 production from diet composition were evaluated. Precision and accuracy of the equations were evaluated using the concordance correlation coefficient (rc), root mean square prediction error (RMSPE), model efficiency and analysis of errors. Equations were ranked using a combined index of the various statistical assessments based on principal component analysis. The final database contained 53 studies and 207 treatment means that were divided into two data sets: diets containing ⩾400 g/kg dry matter (DM) forage (n=116) and diets containing ⩽200 g/kg DM forage (n=42). Diets containing between 200 and 400 g/kg DM forage were not included in the analysis because of their limited numbers (n=6). Outliers, treatment means where feed was fed restrictively and diets with CH4 mitigation additives were omitted (n=43). Using the high-forage dataset, the best-fit equations were the Intergovernmental Panel on Climate Change (IPCC) Tier 2 method, 3 equations for steers that considered gross energy intake (GEI) and body weight, and an equation that considered dry matter intake and starch:neutral detergent fiber, with rc ranging from 0.60 to 0.73 and RMSPE from 35.6 to 45.9 g/day. For the high-grain diets, the 5 best-fit equations considered intakes of metabolisable energy, cellulose, hemicellulose and fat, or for steers GEI and body weight, with rc ranging from 0.35 to 0.52 and RMSPE from 47.4 to 62.9 g/day. Ranking of extant CH4 prediction equations for their accuracy and precision differed with forage content of the diet. When used for cattle fed high-grain diets, extant CH4 prediction models were generally imprecise and lacked accuracy.
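
    The ranking statistics named in the abstract are standard. As a hedged illustration (not the authors' code), Lin's concordance correlation coefficient (rc) and the root mean square prediction error can be computed from observed and predicted treatment means as below; the numbers in the example are made up.

        import numpy as np

        def concordance_cc(observed, predicted):
            """Lin's concordance correlation coefficient between observed and predicted values."""
            x, y = np.asarray(observed, float), np.asarray(predicted, float)
            sxy = np.cov(x, y, bias=True)[0, 1]
            return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        def rmspe(observed, predicted):
            """Root mean square prediction error, in the units of the observations (g CH4/day here)."""
            x, y = np.asarray(observed, float), np.asarray(predicted, float)
            return np.sqrt(np.mean((y - x) ** 2))

        # Hypothetical treatment means (g CH4/day): observed vs. one prediction equation
        obs = np.array([150.0, 180.0, 210.0, 160.0, 200.0, 175.0])
        pred = np.array([140.0, 190.0, 220.0, 150.0, 195.0, 185.0])
        print(round(concordance_cc(obs, pred), 3), round(rmspe(obs, pred), 1))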

  15. Quality Analysis of Chlorogenic Acid and Hyperoside in Crataegi fructus

    PubMed Central

    Weon, Jin Bae; Jung, Youn Sik; Ma, Choong Je

    2016-01-01

    Background: Crataegi fructus is a herbal medicine used to strengthen the stomach, for sterilization, and for alcohol detoxification. Chlorogenic acid and hyperoside are the major compounds in Crataegi fructus. Objective: In this study, we established a novel high-performance liquid chromatography (HPLC)-diode array detection analysis method for chlorogenic acid and hyperoside for quality control of Crataegi fructus. Materials and Methods: HPLC analysis was achieved on a reverse-phase C18 column (5 μm, 4.6 mm × 250 mm) using water and acetonitrile as the mobile phase with a gradient system. The method was validated for linearity, precision, and accuracy. In total, 31 batches of Crataegi fructus samples collected from Korea and China were analyzed by HPLC fingerprinting with the developed HPLC method. Then, the contents of chlorogenic acid and hyperoside were compared for quality evaluation of Crataegi fructus. Results: The results show that the average contents (w/w %) of chlorogenic acid and hyperoside in Crataegi fructus collected from Korea were 0.0438% and 0.0416%, respectively, and those collected from China were 0.0399% and 0.0325%, respectively. Conclusion: In conclusion, the established HPLC analysis method was stable and could provide efficient quality evaluation for monitoring of commercial Crataegi fructus. SUMMARY: A quantitative analysis method for chlorogenic acid and hyperoside in Crataegi fructus was developed by high-performance liquid chromatography (HPLC)-diode array detection. The established HPLC analysis method was validated for linearity, precision, and accuracy. The developed method was successfully applied to the quantitative analysis of Crataegi fructus samples collected from Korea and China. Abbreviations used: HPLC: High-performance liquid chromatography, GC: Gas chromatography, MS: Mass spectrometer, LOD: Limits of detection, LOQ: Limits of quantification, RSD: Relative standard deviation, RRT: Relative retention time, RPA: Relation peak area. PMID:27076744

  16. Fugitive Methane Emission Identification and Source Attribution: Ethane-to-Methane Analysis Using a Portable Cavity Ring-Down Spectroscopy Analyzer

    NASA Astrophysics Data System (ADS)

    Kim-Hak, D.; Fleck, D.

    2017-12-01

    Natural gas analysis, and methane analysis in particular, has become increasingly important because methane has 28-36 times the greenhouse warming potential of CO2 and accounts for about 10% of total greenhouse gas emissions in the US alone. Additionally, large uncontrolled leaks, such as the recent one from Aliso Canyon in Southern California, originating from uncapped wells, storage facilities and coal mines have increased the total global contribution of methane emissions even further. Determining the specific fingerprint of methane sources by quantifying the ethane-to-methane (C2:C1) ratio provides a means to understand the processes yielding methane and allows sources of methane to be mapped and classified through these processes, i.e. biogenic or thermogenic, oil vs. gas vs. coal gas-related. Here we present data obtained using a portable cavity ring-down spectrometry analyzer weighing less than 25 lbs and consuming less than 35 W that simultaneously measures methane and ethane in real time with raw 1-σ precisions of <30 ppb and <10 ppb, respectively, at <1 Hz. These precisions allow for a C2:C1 ratio 1-σ measurement of <0.1% above 10 ppm in a single measurement. Furthermore, a high-precision methane-only mode is available for surveying and locating leakage with a 1-σ precision of <3 ppb. Source discrimination data of local leaks and methane sources using this analysis method are presented. Additionally, two-dimensional plume snapshots are constructed using an integrated onboard GPS in order to visualize horizontal-plane gas propagation.

  17. N-of-1-pathways MixEnrich: advancing precision medicine via single-subject analysis in discovering dynamic changes of transcriptomes.

    PubMed

    Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A

    2017-05-24

    Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with up- and down-regulated genes (bidirectional dysregulation) that are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectional and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway or in the presence of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and in the HNSCC data analysis (ROC curves; higher true positive rates; lower false positive rates). Bidirectional and concordant dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods to meet the promise of providing accurate personal transcriptome analysis to support precision medicine at the point of care.
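
    The paper's exact model is not reproduced here, but the general idea it describes, flagging dysregulated genes with a two-component mixture on absolute expression changes and then testing a gene set for enrichment, can be illustrated with a short sketch. The synthetic data, gene counts and the specific use of a Gaussian mixture plus Fisher's exact test are assumptions for illustration only.

        import numpy as np
        from scipy.stats import fisher_exact
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)

        # Hypothetical |log2 fold-changes| for 2000 genes from one tumor/normal pair:
        # 1800 background genes plus 200 dysregulated genes (either direction)
        abs_lfc = np.concatenate([np.abs(rng.normal(0.0, 0.3, 1800)),
                                  np.abs(rng.normal(2.0, 0.5, 200))])
        # Hypothetical pathway: 40 of its 60 members drawn from the dysregulated genes
        pathway = np.concatenate([rng.choice(np.arange(1800, 2000), 40, replace=False),
                                  rng.choice(1800, 20, replace=False)])

        # Two-component mixture: the component with the larger mean flags dysregulated genes
        gm = GaussianMixture(n_components=2, random_state=0).fit(abs_lfc.reshape(-1, 1))
        dys_component = int(np.argmax(gm.means_.ravel()))
        dysregulated = gm.predict(abs_lfc.reshape(-1, 1)) == dys_component

        # 2x2 enrichment test: pathway membership versus dysregulation status
        in_path = np.zeros(abs_lfc.size, dtype=bool)
        in_path[pathway] = True
        table = [[int(np.sum(in_path & dysregulated)), int(np.sum(in_path & ~dysregulated))],
                 [int(np.sum(~in_path & dysregulated)), int(np.sum(~in_path & ~dysregulated))]]
        odds, p = fisher_exact(table, alternative="greater")
        print(f"enrichment odds ratio {odds:.1f}, p = {p:.2g}")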

  18. In situ sulfur isotope analysis of sulfide minerals by SIMS: Precision and accuracy, with application to thermometry of ~3.5Ga Pilbara cherts

    USGS Publications Warehouse

    Kozdon, R.; Kita, N.T.; Huberty, J.M.; Fournelle, J.H.; Johnson, C.A.; Valley, J.W.

    2010-01-01

    Secondary ion mass spectrometry (SIMS) measurement of sulfur isotope ratios is a potentially powerful technique for in situ studies in many areas of Earth and planetary science. Tests were performed to evaluate the accuracy and precision of sulfur isotope analysis by SIMS in a set of seven well-characterized, isotopically homogeneous natural sulfide standards. The spot-to-spot and grain-to-grain precision for δ34S is ± 0.3‰ for chalcopyrite and pyrrhotite, and ± 0.2‰ for pyrite (2SD) using a 1.6 nA primary beam that was focused to 10 µm diameter with a Gaussian-beam density distribution. Likewise, multiple δ34S measurements within single grains of sphalerite are within ± 0.3‰. However, between individual sphalerite grains, δ34S varies by up to 3.4‰ and the grain-to-grain precision is poor (± 1.7‰, n = 20). Measured values of δ34S correspond with analysis pit microstructures, ranging from smooth surfaces for grains with high δ34S values, to pronounced ripples and terraces in analysis pits from grains featuring low δ34S values. Electron backscatter diffraction (EBSD) shows that individual sphalerite grains are single crystals, whereas crystal orientation varies from grain to grain. The 3.4‰ variation in measured δ34S between individual grains of sphalerite is attributed to changes in instrumental bias caused by different crystal orientations with respect to the incident primary Cs+ beam. High δ34S values in sphalerite correlate with orientations in which the Cs+ beam is parallel to the set of directions from [111] to [110], which are preferred directions for channeling and focusing in diamond-centered cubic crystals. Crystal orientation effects on instrumental bias were further detected in galena. However, as a result of the perfect cleavage along {100}, crushed chips of galena are typically cube-shaped and likely to be preferentially oriented, and thus crystal orientation effects on instrumental bias may be obscured. Tests were made to improve the analytical precision of δ34S in sphalerite, and the best results were achieved by either reducing the depth of the analysis pits using a Köhler illuminated primary beam, or by lowering the total impact energy from 20 keV to 13 keV. The resulting grain-to-grain precision in δ34S improves from ± 1.7‰ to better than 0.6‰ (2SD) in both procedures. With careful use of appropriate analytical conditions, the accuracy of SIMS analysis for δ34S approaches ± 0.3‰ (2SD) for chalcopyrite, pyrite and pyrrhotite and ± 0.6‰ for sphalerite. Measurements of δ34S in sub-20 µm grains of pyrite and sphalerite in ∼ 3.5 Ga cherts from the Pilbara craton, Western Australia show that this analytical technique is suitable for in situ sulfur isotope thermometry with ± 50 °C accuracy in appropriate samples; however, sulfides are not isotopically equilibrated in the analyzed samples.

  19. A novel imaging method for photonic crystal fiber fusion splicer

    NASA Astrophysics Data System (ADS)

    Bi, Weihong; Fu, Guangwei; Guo, Xuan

    2007-01-01

    Because the structure of Photonic Crystal Fiber (PCF) is very complex, it is very difficult for a traditional fiber fusion splicer to obtain optical axial information of the PCF. Therefore, a new optical imaging method is needed to obtain cross-section information of the PCF. Based on the complex character of PCF, a novel high-precision optical imaging system is presented in this article. The system uses a thinned electron-bombarded CCD (EBCCD) image sensor as the imaging element; the thinned EBCCD offers low-light-level performance superior to conventional image-intensifier-coupled CCD approaches, and this high-performance device provides high contrast and high resolution in low-light-level surveillance imaging. In order to realize precise focusing of the image, an ultra-high-precision stepping motor is used to adjust the position of the imaging lens. In this way, a clear cross-section image of the PCF can be obtained, and further analysis of the PCF cross-section can be carried out with digital image processing techniques. This cross-section information can be used to distinguish different types of PCF, to compute parameters such as the size of the PCF air holes and the cladding structure of the PCF, and to provide the analysis data needed by PCF fixation, adjustment, regulation, fusion and cutting systems.

  20. Improved spectrophotometric analysis of fullerenes C60 and C70 in high-solubility organic solvents.

    PubMed

    Törpe, Alexander; Belton, Daniel J

    2015-01-01

    Fullerenes are among a number of recently discovered carbon allotropes that exhibit unique and versatile properties. The analysis of these materials is of great importance and interest. We present previously unreported spectroscopic data for C60 and C70 fullerenes in high-solubility solvents, including error bounds, so as to allow reliable colorimetric analysis of these materials. The Beer-Lambert-Bouguer law is found to be valid at all wavelengths. The measured data were highly reproducible, and yielded high-precision molar absorbance coefficients for C60 and C70 in o-xylene and o-dichlorobenzene, which both exhibit a high solubility for these fullerenes, and offer the prospect of improved extraction efficiency. A photometric method for a C60/C70 mixture analysis was validated with standard mixtures, and subsequently improved for real samples by correcting for light scattering, using a power-law fit. The method was successfully applied to the analysis of C60/C70 mixtures extracted from fullerene soot.
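
    The dual-wavelength mixture calculation behind such an analysis is standard Beer-Lambert algebra; a minimal sketch with a power-law scattering correction is given below. The wavelengths and molar absorption coefficients are placeholders, not the values measured in the paper.

        import numpy as np

        # Hypothetical molar absorption coefficients (L mol^-1 cm^-1) of C60 and C70
        # at two analysis wavelengths in o-xylene; real values must be measured.
        wavelengths = (335.0, 472.0)                 # nm
        eps = np.array([[52000.0, 4000.0],           # epsilon at 335 nm: [C60, C70]
                        [1500.0, 22000.0]])          # epsilon at 472 nm: [C60, C70]
        path_cm = 1.0

        def mixture_concentrations(a_335, a_472):
            """Solve the two-wavelength Beer-Lambert system for [C60] and [C70] (mol/L)."""
            return np.linalg.solve(eps * path_cm, np.array([a_335, a_472]))

        def scatter_corrected(absorbance, wl, baseline_wl, baseline_abs):
            """Subtract a power-law scattering baseline (A = k * wavelength**n, n negative)
            fitted to readings outside the fullerene bands (illustrative correction)."""
            n, logk = np.polyfit(np.log(baseline_wl), np.log(baseline_abs), 1)
            return np.asarray(absorbance, float) - np.exp(logk) * np.asarray(wl) ** n

        # Example: raw absorbances at the two analysis wavelengths, plus two
        # scatter-only readings (700 and 800 nm) for a soot extract
        corr = scatter_corrected([0.62, 0.35], wavelengths, [700.0, 800.0], [0.020, 0.012])
        c60, c70 = mixture_concentrations(*corr)
        print(f"C60 = {c60 * 1e6:.1f} umol/L, C70 = {c70 * 1e6:.1f} umol/L")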

  1. High precision locating control system based on VCM for Talbot lithography

    NASA Astrophysics Data System (ADS)

    Yao, Jingwei; Zhao, Lixin; Deng, Qian; Hu, Song

    2016-10-01

    Aiming at the high-precision and high-efficiency requirements of Z-direction locating in Talbot lithography, a control system based on a Voice Coil Motor (VCM) was designed. In this paper, we built a mathematical model of the VCM and analyzed its motion characteristics. A double closed-loop control strategy comprising a position loop and a current loop was implemented. The current loop was implemented in the driver in order to achieve rapid following of the system current. The position loop was implemented on a digital signal processor (DSP), with position feedback provided by high-precision linear scales. Feedforward control and position-feedback proportional-integral-derivative (PID) control were applied in order to compensate for dynamic lag and improve the response speed of the system. The high precision and efficiency of the system were verified by simulation and experiments. The results demonstrated that the performance of the Z-direction gantry was obviously improved, with high precision, quick response, strong real-time behavior, and easy extension to higher precision.
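
    The control law described (a position-loop PID with feedforward, the current loop handled by the driver) can be illustrated in discrete time as below. The gains, sampling period and toy plant model are hypothetical, not the paper's values.

        class PIDFeedforward:
            """Discrete position-loop controller: PID on the position error plus a
            velocity/acceleration feedforward term (illustrative gains only)."""

            def __init__(self, kp, ki, kd, kv, ka, dt):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.kv, self.ka = kv, ka          # feedforward gains
                self.dt = dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, target_pos, target_vel, target_acc, measured_pos):
                error = target_pos - measured_pos
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                feedback = self.kp * error + self.ki * self.integral + self.kd * derivative
                feedforward = self.kv * target_vel + self.ka * target_acc
                return feedback + feedforward      # command handed to the current loop

        # Toy simulation: a 1-D stage follows a 1-second ramp to 10 um, then holds
        ctrl = PIDFeedforward(kp=80.0, ki=20.0, kd=8.0, kv=1.0, ka=0.05, dt=1e-3)
        pos, vel = 0.0, 0.0
        for k in range(2000):
            t = k * 1e-3
            target = min(10.0, 10.0 * t)
            target_vel = 10.0 if t < 1.0 else 0.0
            u = ctrl.update(target, target_vel, 0.0, pos)
            acc = u - 2.0 * vel                    # toy plant: unit mass with viscous damping
            vel += acc * 1e-3
            pos += vel * 1e-3
        print(f"final position: {pos:.3f} um")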

  2. Study on high-precision measurement of long radius of curvature

    NASA Astrophysics Data System (ADS)

    Wu, Dongcheng; Peng, Shijun; Gao, Songtao

    2016-09-01

    It is hard to achieve a high-precision measurement of the radius of curvature (ROC) because many factors affect the measurement accuracy, and for the measurement of a long radius of curvature some factors are more important than others. This paper therefore first investigates which factors are related to the long measurement distance and analyses the uncertainty of the measurement accuracy. It then studies the influence of the support condition and of the adjustment errors at the cat's eye and confocal positions. Finally, a convex surface with a 1055 micrometer radius of curvature was measured in a high-precision laboratory. Experimental results show that a proper, stable support (three-point support) can guarantee a high-precision measurement of the radius of curvature, and that calibrating the gain at the cat's eye and confocal positions helps to locate these positions precisely and thereby increase the measurement accuracy. With the above process completed, high-precision measurement of a long ROC is realized.

  3. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.
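
    A much smaller toy version of such a simulation (three estimators of the treatment effect under a forced baseline imbalance) can be sketched as follows; the sample size, effect size, pretest-posttest correlation and imbalance are arbitrary illustrative choices, not the study's 126 scenarios.

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_once(n=50, effect=0.5, rho=0.6, imbalance=0.3):
            """One hypothetical trial: pretest and posttest correlated at rho, a true
            effect added to the treated arm, and a baseline imbalance forced between arms."""
            group = np.repeat([0, 1], n)
            pre = rng.normal(0, 1, 2 * n) + imbalance * group          # imbalanced baseline
            post = rho * pre + np.sqrt(1 - rho ** 2) * rng.normal(0, 1, 2 * n) + effect * group
            # ANOVA-style estimate: difference in posttest means
            anova = post[group == 1].mean() - post[group == 0].mean()
            # Change-score analysis: difference in mean change (post - pre)
            change = post - pre
            csa = change[group == 1].mean() - change[group == 0].mean()
            # ANCOVA: regress posttest on intercept, group indicator and pretest
            X = np.column_stack([np.ones(2 * n), group, pre])
            ancova = np.linalg.lstsq(X, post, rcond=None)[0][1]
            return anova, csa, ancova

        estimates = np.array([simulate_once() for _ in range(2000)])
        print("bias (true effect 0.5) ANOVA/CSA/ANCOVA:", np.round(estimates.mean(axis=0) - 0.5, 3))
        print("empirical SD           ANOVA/CSA/ANCOVA:", np.round(estimates.std(axis=0), 3))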

  4. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304

  5. Water vapor δ17O measurements using an off-axis integrated cavity output spectrometer and seasonal variation in 17O-excess of precipitation in the east-central United States

    NASA Astrophysics Data System (ADS)

    Tian, C.; Wang, L.; Novick, K. A.

    2016-12-01

    High-precision triple oxygen isotope analysis can be used to improve our understanding of multiple hydrological and meteorological processes. Recent studies have focused on understanding 17O-excess variation in tropical storms, high-latitude snow and ice cores, as well as the spatial distribution of meteoric water (tap water). Observations of 17O-excess variation in mid-latitude precipitation across temporal scales are needed to better understand which processes control the 17O-excess variations. This study focused on assessing how the accuracy and precision of vapor δ17O laser spectroscopy measurements depend on vapor concentration, delta range, and averaging time. In addition, we present 17O-excess data from two years of event-based precipitation sampling in the east-central United States. A Triple Water Vapor Isotope Analyzer (T-WVIA) was used to evaluate the accuracy and precision of δ2H, δ18O and δ17O measurements. GISP and SLAP2 from the IAEA and four working standards were used to evaluate the sensitivity to the three factors. Overall, the accuracy and precision of all isotope measurements were sensitive to concentration, with higher accuracy and precision generally observed under moderate vapor concentrations (i.e., 10000-15000 ppm) for all isotopes. Precision was also sensitive to the range of delta values, though the effect was not as large as the sensitivity to concentration. The precision was much less sensitive to averaging time than to concentration and delta range. The preliminary results showed that 17O-excess was lower in summer (23±17 per meg) than in winter (34±16 per meg), whereas spring values (30±21 per meg) were similar to fall values (29±13 per meg). This indicates that kinetic fractionation influences the isotopic composition and 17O-excess differently across seasons.
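
    For reference, 17O-excess is conventionally computed from the two delta values with the reference slope 0.528 and reported in per meg; a minimal sketch with made-up precipitation values (not data from this study) is:

        import numpy as np

        def o17_excess_per_meg(delta17, delta18, lam=0.528):
            """17O-excess = 1e6 * (ln(d17O/1000 + 1) - lam * ln(d18O/1000 + 1)),
            with delta values in per mil and the reference slope lam = 0.528."""
            d17_prime = np.log(np.asarray(delta17, float) / 1000.0 + 1.0)
            d18_prime = np.log(np.asarray(delta18, float) / 1000.0 + 1.0)
            return 1e6 * (d17_prime - lam * d18_prime)

        # Hypothetical event-based precipitation values (per mil, VSMOW)
        print(np.round(o17_excess_per_meg([-5.263, -10.586], [-10.00, -20.00]), 1))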

  6. Optimetrics for Precise Navigation

    NASA Technical Reports Server (NTRS)

    Yang, Guangning; Heckler, Gregory; Gramling, Cheryl

    2017-01-01

    Optimetrics for Precise Navigation will be implemented on existing optical communication links. The ranging and Doppler measurements are conducted over the communication data frames and clock, and the measurement accuracy is two orders of magnitude better than TDRSS. It also has other advantages. The high optical carrier frequency provides (1) immunity from the ionosphere and interplanetary plasma noise floor, which is a performance limitation for RF tracking, and (2) high antenna gain, which reduces terminal size and volume and enables high-precision tracking on a CubeSat and on a deep-space smallsat. High optical pointing precision provides (a) spacecraft orientation and (b) minimal additional hardware to implement Precise Optimetrics over the optical comm link. Continuous optical carrier phase measurement will enable the system presented here to accept future optical frequency standards with much higher clock accuracy.

  7. Facile fabrication of a poly(ethylene terephthalate) membrane filter with precise arrangement of through-holes

    NASA Astrophysics Data System (ADS)

    Kihara, Naoto; Odaka, Hidefumi; Kuboyama, Daiki; Onoshima, Daisuke; Ishikawa, Kenji; Baba, Yoshinobu; Hori, Masaru

    2018-03-01

    Although membrane filters are indispensable in biochemical analysis fields, most methods for through-hole fabrication are complex and inefficient. We developed a simple method of fabricating poly(ethylene terephthalate) (PET) membrane filters with a precise arrangement of through-holes for the isolation of circulating tumor cells (CTCs) based on their size. By photolithography and dry etching, highly packed 380,000 through-holes with a diameter of 7 µm were able to cover a whole area with a diameter of 13 mm. Device fabrication for the size-based capture of rare cells in blood such as CTCs is realized in this study.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez T, Arturo

    The use of the sophisticated and large underground detectors at CERN for cosmic ray studies has been considered by several groups, e.g. UA1, LEP and LHC detectors. They offer the opportunity to provide a large sensitive area with magnetic analysis, which allows a precise determination of the direction of cosmic ray muons as well as their momentum up to the order of some TeV. The aim of this article is to review the observation of high energy cosmic ray muons using precise spectrometers at CERN, mainly the LEP detectors, as well as the possibility of improving those measurements with the LHC apparatus, giving special emphasis to the ACORDE-ALICE cosmic ray physics program.

  9. Chiral dynamics with (non)strange quarks

    NASA Astrophysics Data System (ADS)

    Kubis, Bastian; Meißner, Ulf-G.

    2017-01-01

    We review the results and achievements of the project B.3. Topics addressed include pion photoproduction off the proton and off deuterium, three-flavor chiral perturbation theory studies, chiral symmetry tests in Goldstone boson decays, the development of unitarized chiral perturbation theory to next-to-leading order, the two-pole structure of the Λ(1405), the dynamical generation of the lowest S11 resonances, the theory of hadronic atoms and its application to various systems, precision studies in light-meson decays based on dispersion theory, the Roy-Steiner analysis of pion-nucleon scattering, a high-precision extraction of the elusive pion-nucleon σ-term, and aspects of chiral dynamics in few-nucleon systems.

  10. Constraining cosmic scatter in the Galactic halo through a differential analysis of metal-poor stars

    NASA Astrophysics Data System (ADS)

    Reggiani, Henrique; Meléndez, Jorge; Kobayashi, Chiaki; Karakas, Amanda; Placco, Vinicius

    2017-12-01

    Context. The chemical abundances of metal-poor halo stars are important to understanding key aspects of Galactic formation and evolution. Aims: We aim to constrain Galactic chemical evolution with precise chemical abundances of metal-poor stars (-2.8 ≤ [Fe/H] ≤ -1.5). Methods: Using high resolution and high S/N UVES spectra of 23 stars and employing the differential analysis technique we estimated stellar parameters and obtained precise LTE chemical abundances. Results: We present the abundances of Li, Na, Mg, Al, Si, Ca, Sc, Ti, V, Cr, Mn, Co, Ni, Zn, Sr, Y, Zr, and Ba. The differential technique allowed us to obtain an unprecedented low level of scatter in our analysis, with standard deviations as low as 0.05 dex, and mean errors as low as 0.05 dex for [X/Fe]. Conclusions: By expanding our metallicity range with precise abundances from other works, we were able to precisely constrain Galactic chemical evolution models in a wide metallicity range (-3.6 ≤ [Fe/H] ≤ -0.4). The agreements and discrepancies found are key for further improvement of both models and observations. We also show that the LTE analysis of Cr II is a much more reliable source of abundance for chromium, as Cr I has important NLTE effects. These effects can be clearly seen when we compare the observed abundances of Cr I and Cr II with GCE models. While Cr I has a clear disagreement between model and observations, Cr II is very well modeled. We confirm tight increasing trends of Co and Zn toward lower metallicities, and a tight flat evolution of Ni relative to Fe. Our results strongly suggest inhomogeneous enrichment from hypernovae. Our precise stellar parameters result in a low star-to-star scatter (0.04 dex) in the Li abundances of our sample, with a mean value about 0.4 dex lower than the prediction from standard Big Bang nucleosynthesis; we also study the relation between lithium depletion and stellar mass, but it is difficult to assess a correlation due to the limited mass range. We find two blue straggler stars, based on their very depleted Li abundances. One of them shows intriguing abundance anomalies, including a possible zinc enhancement, suggesting that zinc may also have been produced by a former AGB companion. Tables A.1-A.6 are also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/608/A46

  11. Prospects for dating monazite via single-collector HR-ICP-MS

    NASA Astrophysics Data System (ADS)

    Kohn, M. J.; Vervoort, J. D.

    2006-12-01

    ICP-MS analysis permits rapid and precise dating of minerals with high U and Th contents. Here we describe a new method for in situ determination of 206Pb/238U, 207Pb/235U, 208Pb/232Th, and 207Pb/206Pb ages in monazite via laser ablation (New Wave Research UP-213 laser system), single-collector, magnetic sector ICP-MS (ThermoFinnigan Element2), using spot sizes of 8-30 μm, a repetition rate of 5 Hz, and a fluence of 10 J/cm2. Based on analysis of 9 monazite samples of known ages ranging from 280 to 1800 Ma, analytical precision (single sample) is ±2-3% (2σ), and reproducibility (single sample) is ±2-4% (2σ), yielding age precisions of ±3-5% (2σ) for single points, or ±1-2% (2 s.e.) for pooled multiple analyses (n > 4). Issues of accuracy are paramount. 207Pb/206Pb ages are consistently the most accurate and agree to ±2% with accepted TIMS ages. In contrast, 206Pb/238U, 207Pb/235U, and 208Pb/232Th ages can differ by as much as ±5% (2σ), a problem that has also been observed for SIMS Th-Pb dating. The sources of the interelement standardization disparities among monazites remain enigmatic, but do not result from molecular interferences on Pb, U, or Th peaks. Unresolvable mass interference between 204Pb and trace contaminant 204Hg in commercial Ar gas precludes precise common Pb corrections. Instead, common Pb corrections are made assuming concordancy between 207Pb/235U and either 206Pb/238U or 208Pb/232Th ages. The new method offers rapid analysis (~1 minute), minimal sample preparation (polished thin section), and high sensitivity. Comparatively large errors on the 206Pb/238U, 207Pb/235U, and 208Pb/232Th ages will likely restrict analysis of younger monazite grains (<250 Ma) to applications where 5% accuracy is sufficient. Older grains (c. 500 Ma and older) can be dated more precisely and accurately using 207Pb/206Pb. One application to young materials involves dating a large vein monazite from the Llallagua tin district of Bolivia, which resolves a ~2 Myr history of mineralization at 20-22 Ma. These data support mineralization age estimates of 21 Ma (K-Ar on wallrock minerals) rather than 44 Ma (Sm-Nd on apatite).

  12. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples due to bleeding, postmortem changes, and redistribution could bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this main drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane: ethyl acetate and subsequent detection by high-performance liquid chromatography coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory. PMID:27635251

  13. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  14. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  15. In vivo short-term precision of hip structure analysis variables in comparison with bone mineral density using paired dual-energy X-ray absorptiometry scans from multi-center clinical trials.

    PubMed

    Khoo, Benjamin C C; Beck, Thomas J; Qiao, Qi-Hong; Parakh, Pallav; Semanick, Lisa; Prince, Richard L; Singer, Kevin P; Price, Roger I

    2005-07-01

    Hip structural analysis (HSA) is a technique for extracting strength-related structural dimensions of bone cross-sections from two-dimensional hip scan images acquired by dual energy X-ray absorptiometry (DXA) scanners. Heretofore the precision of the method has not been thoroughly tested in the clinical setting. Using paired scans from two large clinical trials involving a range of different DXA machines, this study reports the first precision analysis of HSA variables, in comparison with that of conventional bone mineral density (BMD) on the same scans. A key HSA variable, section modulus (Z), biomechanically indicative of bone strength during bending, had a short-term precision percentage coefficient of variation (CV%) in the femoral neck of 3.4-10.1%, depending on the manufacturer or model of the DXA equipment. Cross-sectional area (CSA), a determinant of bone strength during axial loading and closely aligned with conventional DXA bone mineral content, had a range of CV% from 2.8% to 7.9%. Poorer precision was associated with inadequate inclusion of the femoral shaft or femoral head in the DXA-scanned hip region. Precision of HSA-derived BMD varied between 2.4% and 6.4%. Precision of DXA manufacturer-derived BMD varied between 1.9% and 3.4%, arising from the larger analysis region of interest (ROI). The precision of HSA variables was not generally dependent on magnitude, subject height, weight, or conventional femoral neck densitometric variables. The generally poorer precision of key HSA variables in comparison with conventional DXA-derived BMD highlights the critical roles played by correct limb repositioning and choice of an adequate and appropriately positioned ROI.
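
    Short-term precision from duplicate scans is conventionally summarized as a root-mean-square coefficient of variation; a minimal sketch of that calculation (with hypothetical section modulus values, not data from the study) is:

        import numpy as np

        def short_term_cv_percent(scan1, scan2):
            """Short-term precision (CV%) from duplicate scans: the root-mean-square of the
            per-subject standard deviations divided by the overall mean, times 100."""
            x1, x2 = np.asarray(scan1, float), np.asarray(scan2, float)
            per_subject_var = (x1 - x2) ** 2 / 2.0      # variance of a pair of measurements
            rms_sd = np.sqrt(per_subject_var.mean())
            return 100.0 * rms_sd / np.concatenate([x1, x2]).mean()

        # Hypothetical paired femoral-neck section modulus values (cm^3) from repeat DXA scans
        z1 = np.array([1.32, 1.10, 1.45, 0.98, 1.21])
        z2 = np.array([1.28, 1.16, 1.39, 1.02, 1.27])
        print(f"CV% = {short_term_cv_percent(z1, z2):.1f}")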

  16. Precision mechatronics based on high-precision measuring and positioning systems and machines

    NASA Astrophysics Data System (ADS)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in this paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics, and nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with uncertainties as small as possible is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  17. Classification of LIDAR Data for Generating a High-Precision Roadway Map

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Lee, I.

    2016-06-01

    Generation of highly precise maps is growing in importance with the development of autonomous driving vehicles. A highly precise map has centimetre-level precision, unlike existing commercial maps with metre-level precision. Understanding the road environment is important for autonomous driving decisions, since robust localization is one of the critical challenges for an autonomous driving car. One key data source is a Lidar, because it provides highly dense point cloud data with three-dimensional positions, intensities and ranges from the sensor to the target. In this paper, we focus on how to segment point cloud data from a vehicle-mounted Lidar and classify objects on the road for the highly precise map. In particular, we propose the combination of a feature descriptor and a machine-learning classification algorithm. Objects can be distinguished by geometrical features based on the surface normal of each point. To achieve correct classification using limited point cloud data sets, a Support Vector Machine algorithm is used. The final step is to evaluate the accuracy of the obtained results by comparing them to reference data. The results show sufficient accuracy, and the method will be utilized to generate a highly precise road map.
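
    As an illustration of the kind of pipeline described (surface-normal-based features fed to a Support Vector Machine), the sketch below classifies a synthetic road-versus-pole scene; the feature set, neighbourhood size and SVM settings are assumptions, not the authors' configuration.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)

        def normal_features(points, k=10):
            """Per-point geometric features from the k nearest neighbours: verticality of the
            surface normal (smallest eigenvector of the local covariance), planarity, height."""
            feats = []
            for p in points:
                d = np.linalg.norm(points - p, axis=1)
                nb = points[np.argsort(d)[1:k + 1]]
                cov = np.cov((nb - nb.mean(axis=0)).T)
                w, v = np.linalg.eigh(cov)                       # eigenvalues ascending
                normal = v[:, 0]
                planarity = (w[1] - w[0]) / w[2]
                feats.append([abs(normal[2]), planarity, p[2]])
            return np.array(feats)

        # Synthetic scene: a flat road patch (label 0) and a vertical pole-like object (label 1)
        road = np.column_stack([rng.uniform(0, 10, 300), rng.uniform(0, 10, 300),
                                rng.normal(0, 0.02, 300)])
        pole = np.column_stack([rng.normal(5, 0.05, 100), rng.normal(5, 0.05, 100),
                                rng.uniform(0, 3, 100)])
        points = np.vstack([road, pole])
        labels = np.array([0] * 300 + [1] * 100)

        X = normal_features(points)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
        clf.fit(X, labels)
        print("training accuracy:", clf.score(X, labels))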

  18. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    USDA-ARS?s Scientific Manuscript database

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  19. Understanding the Effect Size of Lisdexamfetamine Dimesylate for Treating ADHD in Children and Adults

    ERIC Educational Resources Information Center

    Faraone, Stephen V.

    2012-01-01

    Objective: An earlier meta-analysis of pediatric clinical trials indicated that lisdexamfetamine dimesylate (LDX) had a greater effect size than other stimulant medications. This work tested the hypothesis that the apparent increased efficacy was artifactual. Method: The authors assessed two potential artifacts: an unusually high precision of…

  20. XPS Study of Oxide/GaAs and SiO2/Si Interfaces

    NASA Technical Reports Server (NTRS)

    Grunthaner, F. J.; Grunthaner, P. J.; Vasquez, R. P.; Lewis, B. F.; Maserjian, J.; Madhukar, A.

    1982-01-01

    Concepts developed in study of SiO2/Si interface applied to analysis of native oxide/GaAs interface. High-resolution X-ray photoelectron spectroscopy (XPS) has been combined with precise chemical-profiling technique and resolution-enhancement methods to study stoichiometry of transitional layer. Results are presented in report now available.

  1. Precise Estimation of Allele Frequencies of Single-Nucleotide Polymorphisms by a Quantitative SSCP Analysis of Pooled DNA

    PubMed Central

    Sasaki, Tomonari; Tahira, Tomoko; Suzuki, Akari; Higasa, Koichiro; Kukita, Yoji; Baba, Shingo; Hayashi, Kenshi

    2001-01-01

    We show that single-nucleotide polymorphisms (SNPs) of moderate to high heterozygosity (minor allele frequencies >10%) can be efficiently detected, and their allele frequencies accurately estimated, by pooling the DNA samples and applying a capillary-based SSCP analysis. In this method, alleles are separated into peaks, and their frequencies can be reliably and accurately quantified from their peak heights (SD <1.8%). We found that as many as 40% of publicly available SNPs that were analyzed by this method have widely differing allele frequency distributions among groups of different ethnicity (parents of Centre d'Etude Polymorphisme Humaine families vs. Japanese individuals). These results demonstrate the effectiveness of the present pooling method in the reevaluation of candidate SNPs that have been collected by examination of limited numbers of individuals. The method should also serve as a robust quantitative technique for studies in which a precise estimate of SNP allele frequencies is essential—for example, in linkage disequilibrium analysis. PMID:11083945
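
    The basic quantification step described, estimating pooled allele frequencies from SSCP peak heights, reduces to a simple ratio; a minimal sketch with hypothetical peak heights (ignoring any heterozygote-based correction the authors may apply) is:

        def allele_frequencies(peak_heights):
            """Allele frequencies estimated from SSCP peak heights measured on a pooled sample."""
            total = sum(peak_heights.values())
            return {allele: height / total for allele, height in peak_heights.items()}

        # Hypothetical two-allele SNP measured in a DNA pool
        print(allele_frequencies({"A": 1320.0, "G": 480.0}))   # -> A: 0.733, G: 0.267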

  2. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images.

    PubMed

    van der Laak, Jeroen A W M; Dijkman, Henry B P M; Pahlplatz, Martin M M

    2006-03-01

    The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000 x to 200,000 x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy.
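
    A simplified version of such a procedure, estimating the replica line spacing in pixels from the dominant peak of the averaged power spectrum and converting it to a magnification, is sketched below. The grating pitch, camera pixel size and synthetic image are assumptions; in practice the spectral peak would be interpolated to reach the sub-percent precision reported.

        import numpy as np

        def line_spacing_pixels(image):
            """Dominant line spacing (in pixels) from the 1-D power spectrum of a
            digitized replica image, averaged over rows (illustrative analysis)."""
            rows = image - image.mean(axis=1, keepdims=True)
            power = np.abs(np.fft.rfft(rows, axis=1)) ** 2
            spectrum = power.mean(axis=0)
            spectrum[0] = 0.0                              # drop the DC term
            freqs = np.fft.rfftfreq(image.shape[1])        # cycles per pixel
            return 1.0 / freqs[np.argmax(spectrum)]

        def magnification(spacing_px, camera_pixel_um, replica_spacing_um):
            """Magnification = imaged line period / true line period of the replica."""
            return spacing_px * camera_pixel_um / replica_spacing_um

        # Synthetic image of a 2160 lines/mm grating replica (period 0.463 um) imaged at
        # roughly 1000x onto a camera with 15 um pixels (about 32 px per line period)
        x = np.arange(512)
        img = np.tile(1.0 + np.sin(2 * np.pi * x / 32.0), (64, 1))
        img += np.random.default_rng(0).normal(0, 0.1, img.shape)
        px = line_spacing_pixels(img)
        print(f"spacing = {px:.2f} px, magnification = {magnification(px, 15.0, 0.463):.0f}x")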

  3. Manufacturing error sensitivity analysis and optimal design method of cable-network antenna structures

    NASA Astrophysics Data System (ADS)

    Zong, Yali; Hu, Naigang; Duan, Baoyan; Yang, Guigeng; Cao, Hongjun; Xu, Wanye

    2016-03-01

    Inevitable manufacturing errors and inconsistency between assumed and actual boundary conditions can affect the shape precision and cable tensions of a cable-network antenna, and can even result in failure of the structure in service. In this paper, an analytical sensitivity analysis method for the shape precision and cable tensions with respect to the parameters carrying uncertainty was studied. Based on the sensitivity analysis, an optimal design procedure was proposed to alleviate the effects of the parameters that carry uncertainty. The validity of the calculated sensitivities is examined by comparison with those computed by a finite difference method. Comparison with a traditional design method shows that the presented design procedure can remarkably reduce the influence of the uncertainties on the antenna performance. Moreover, the results suggest that, in particular, slender front net cables, thick tension ties, relatively slender boundary cables and a high tension level can improve the ability of cable-network antenna structures to resist the effects of the uncertainties on the antenna performance.

  4. High-throughput investigation of single and binary protein adsorption isotherms in anion exchange chromatography employing multivariate analysis.

    PubMed

    Field, Nicholas; Konstantinidis, Spyridon; Velayudhan, Ajoy

    2017-08-11

    The combination of multi-well plates and automated liquid handling is well suited to the rapid measurement of the adsorption isotherms of proteins. Here, single and binary adsorption isotherms are reported for BSA, ovalbumin and conalbumin on a strong anion exchanger over a range of pH and salt levels. The impact of the main experimental factors at play on the accuracy and precision of the adsorbed protein concentrations is quantified theoretically and experimentally. In addition to the standard measurement of liquid concentrations before and after adsorption, the amounts eluted from the wells are measured directly. This additional measurement corroborates the calculation based on liquid concentration data, and improves precision especially under conditions of weak or moderate interaction strength. The traditional measurement of multicomponent isotherms is limited by the speed of HPLC analysis; this analytical bottleneck is alleviated by careful multivariate analysis of UV spectra. Copyright © 2017. Published by Elsevier B.V.

  5. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  6. Prenatal diagnosis in a cystic fibrosis family: a combined molecular strategy for a precise diagnosis.

    PubMed

    Chávez-Saldaña, Margarita; García-Cavazos, Ricardo; Vigueras, Rosa María; Orozco, Lorena

    2011-01-01

    The high genetic heterogeneity of populations with a wide spectrum of mutations in the CF transmembrane conductance regulator gene (CFTR) makes the detection of mutations a very difficult task, thereby limiting accurate diagnosis of the disease, mainly in patients with uncharacterized mutations. Molecular strategies such as targeted identification of the most frequent CFTR mutations in the Mexican population, combined with linkage analysis using markers, are very useful for carrier detection and for prenatal diagnosis in families affected by CF. In this paper we show that the combination of methodologies was a crucial alternative for reaching a precise prenatal CF diagnosis. We documented CF diagnosis in a 14th-week fetus by combining screening of the most common mutations in the Mexican population with linkage analysis of two extragenic polymorphisms (XV2C/TaqI and KM19/PstI). Through the chromosomal phase analysis we determined that the fetus inherited the p.G542X mutation from its mother and an unknown mutation from its father.

  7. Precise determination of N-acetylcysteine in pharmaceuticals by microchip electrophoresis.

    PubMed

    Rudašová, Marína; Masár, Marián

    2016-01-01

    A novel microchip electrophoresis method for the rapid and high-precision determination of N-acetylcysteine, a pharmaceutically active ingredient, in mucolytics has been developed. Isotachophoresis separations were carried out at pH 6.0 on a microchip with conductivity detection. External calibration and internal standard methods were used to evaluate the results. The internal standard method effectively eliminated variations in various working parameters, mainly run-to-run fluctuations of the injected volume. The repeatability and accuracy of N-acetylcysteine determination in all mucolytic preparations tested (Solmucol 90 and 200, and ACC Long 600) were more than satisfactory, with relative standard deviation and relative error values <0.7 and <1.9%, respectively. A recovery of 99-101% of N-acetylcysteine in the analyzed pharmaceuticals also qualifies the proposed method for accurate analysis. This work, in general, indicates the analytical possibilities of microchip isotachophoresis for the quantitative analysis of simplified samples such as pharmaceuticals that contain the analyte(s) at relatively high concentrations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Individual Biomarkers Using Molecular Personalized Medicine Approaches.

    PubMed

    Zenner, Hans P

    2017-01-01

    Molecular personalized medicine tries to generate individual predictive biomarkers to assist doctors in their decision making. These are thought to improve the efficacy and lower the toxicity of a treatment. The molecular basis of the desired high-precision prediction is modern "omex" technologies providing high-throughput bioanalytical methods. These include genomics and epigenomics, transcriptomics, proteomics, metabolomics, microbiomics, imaging, and functional analyses. In most cases, producing big data also requires a complex biomathematical analysis. With molecular personalized medicine, the conventional physician's check of biomarker results may no longer be sufficient. Instead, the physician may need to cooperate with a biomathematician to achieve the desired prediction on the basis of the analysis of individual big data, typically produced by omex technologies. Identification of individual biomarkers using molecular personalized medicine approaches is thought to allow decision-making for the precise use of a targeted therapy, selecting the successful therapeutic tool from a panel of preexisting drugs or medical products. This should avoid treating nonresponders, as well as treatments that produce intolerable unwanted effects in responders. © 2017 S. Karger AG, Basel.

  9. A portable x-ray fluorescence instrument for analyzing dust wipe samples for lead: evaluation with field samples.

    PubMed

    Sterling, D A; Lewis, R D; Luke, D A; Shadel, B N

    2000-06-01

    Dust wipe samples collected in the field were tested by nondestructive X-ray fluorescence (XRF) followed by laboratory analysis with flame atomic absorption spectrophotometry (FAAS). Data were analyzed for precision and accuracy of measurement. Replicate samples with the XRF show high precision with an intraclass correlation coefficient (ICC) of 0.97 (P<0.0001) and an overall coefficient of variation of 11.6%. Paired comparison indicates no statistical difference (P=0.272) between XRF and FAAS analysis. Paired samples are highly correlated with an R(2) ranging between 0.89 for samples that contain paint chips and 0.93 for samples that do not contain paint chips. The ICC for absolute agreement between XRF and laboratory results was 0.95 (P<0.0001). The relative error over the concentration range of 25 to 14,200 microgram Pb is -12% (95% CI, -18 to -5). The XRF appears to be an excellent method for rapid on-site evaluation of dust wipes for clearance and risk assessment purposes, although there are indications of some confounding when paint chips are present. Copyright 2000 Academic Press.
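
    The precision statistics quoted above (coefficient of variation across replicates and an intraclass correlation coefficient) can be reproduced from replicate readings with a few lines of code. The sketch below uses a one-way random-effects ICC and made-up replicate values purely for illustration; it is not the study's analysis script.

        # Sketch of replicate-precision statistics: per-sample coefficient of
        # variation and a one-way random-effects ICC(1,1). Data are illustrative.
        import numpy as np

        # rows = wipe samples, columns = replicate XRF readings (micrograms Pb)
        x = np.array([[120.0, 131.0, 118.0],
                      [540.0, 565.0, 510.0],
                      [2300.0, 2150.0, 2400.0]])

        n, k = x.shape
        cv_per_sample = x.std(axis=1, ddof=1) / x.mean(axis=1) * 100.0
        print("mean CV (%):", cv_per_sample.mean().round(1))

        # One-way random-effects ICC(1,1) from the ANOVA mean squares.
        grand = x.mean()
        ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
        print("ICC(1,1):", round(icc, 3))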

  10. COSMOS: accurate detection of somatic structural variations through asymmetric comparison between tumor and normal samples.

    PubMed

    Yamagata, Koichi; Yamanishi, Ayako; Kokubu, Chikara; Takeda, Junji; Sese, Jun

    2016-05-05

    An important challenge in cancer genomics is precise detection of structural variations (SVs) by high-throughput short-read sequencing, which is hampered by the high false discovery rates of existing analysis tools. Here, we propose an accurate SV detection method named COSMOS, which compares the statistics of the mapped read pairs in tumor samples with isogenic normal control samples in a distinct asymmetric manner. COSMOS also prioritizes the candidate SVs using strand-specific read-depth information. Performance tests on modeled tumor genomes revealed that COSMOS outperformed existing methods in terms of F-measure. We also applied COSMOS to an experimental mouse cell-based model, in which SVs were induced by genome engineering and gamma-ray irradiation, followed by polymerase chain reaction-based confirmation. The precision of COSMOS was 84.5%, while the next best existing method was 70.4%. Moreover, the sensitivity of COSMOS was the highest, indicating that COSMOS has great potential for cancer genome analysis. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
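
    The performance figures above (precision and F-measure) follow from standard set comparisons between called and true structural variants. The following sketch shows the arithmetic on hypothetical SV calls; the coordinates and the keying scheme are illustrative assumptions, not COSMOS output.

        # Minimal sketch of precision, recall, and F-measure for SV calls
        # compared against a truth set; the call sets below are hypothetical.
        def precision_recall_f1(called, truth):
            called, truth = set(called), set(truth)
            tp = len(called & truth)
            precision = tp / len(called) if called else 0.0
            recall = tp / len(truth) if truth else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            return precision, recall, f1

        # Hypothetical SV calls keyed by (chromosome, breakpoint1, breakpoint2).
        truth = [("chr1", 10500, 20800), ("chr2", 99000, 120000), ("chr5", 5000, 7000)]
        calls = [("chr1", 10500, 20800), ("chr2", 99000, 120000), ("chr7", 1000, 3000)]
        print(precision_recall_f1(calls, truth))  # roughly (0.67, 0.67, 0.67)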

  11. Technology Insertion (TI)/Industrial Process Improvement (IPI) Task Order Number 1. Quick Fix Plan for WR-ALC, 7 RCC’s

    DTIC Science & Technology

    1989-09-25

    Orders and test specifications. Some mandatory replacement of high-failure items is directed by Technical Orders to extend MTBF. Precision bearing and...Experience is very high but natural attrition is reducing the numbers faster than training is furnishing younger mechanics. Surge conditions would be...model validation run output revealed that utilization of equipment is very low and manpower is high. Based on this analysis and the brainstorming

  12. Flash melting of tantalum in a diamond cell to 85 GPa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karandikar, Amol; Boehler, Reinhard

    2016-02-09

    Here, we demonstrate a new level of precision in measuring melting temperatures at high pressure using laser flash-heating followed by Scanning Electron Microscopy and Focused Ion Beam Milling. Furthermore, the new measurements on tantalum put unprecedented constraints on its highly debated melting slope, calling for a reevaluation of theoretical, shock compression and diamond cell approaches to determine melting at high pressure. X-ray analysis of the recovered samples confirmed the absence of chemical reactions, which likely played a significant role in previous experiments.

  13. Design of measurement system of 3D surface profile based on chromatic confocal technology

    NASA Astrophysics Data System (ADS)

    Wang, An-su; Xie, Bin; Liu, Zi-wei

    2018-01-01

    The chromatic confocal 3D profilometer has been widely used in scientific research and industrial fields in recent years for its high precision, large measuring range, and ability to characterize surfaces numerically. It can provide an exact and omnidirectional solution for manufacturing and research through 3D non-contact surface analysis. This article analyzes the principle of surface measurement with chromatic confocal technology and provides the design indicators and requirements of the confocal system. As the key component, the dispersive objective used to achieve a longitudinal focal shift with wavelength was designed. The objective disperses the foci of wavelengths between 400 and 700 nm over a 15 mm longitudinal range. With the selected spectrometer, the resolution of the chromatic confocal 3D profilometer is no more than 5 μm, which meets the needs of high-precision non-contact surface profile measurement.

  14. Molecular Classification and Pharmacogenetics of Primary Plasma Cell Leukemia: An Initial Approach toward Precision Medicine

    PubMed Central

    Simeon, Vittorio; Todoerti, Katia; La Rocca, Francesco; Caivano, Antonella; Trino, Stefania; Lionetti, Marta; Agnelli, Luca; De Luca, Luciana; Laurenzana, Ilaria; Neri, Antonino; Musto, Pellegrino

    2015-01-01

    Primary plasma cell leukemia (pPCL) is a rare and aggressive variant of multiple myeloma (MM) which may represent a valid model for high-risk MM. This disease is associated with a very poor prognosis that, unfortunately, has not improved significantly during the last three decades. New high-throughput technologies have allowed a better understanding of the molecular basis of this disease and a move toward risk stratification, providing insights for targeted therapy studies. This knowledge, added to the pharmacogenetic profile of new and old agents in the analysis of efficacy and safety, could help clinical decisions move toward precision medicine and a better clinical outcome for these patients. In this review, we describe the available literature concerning the genomic characterization and pharmacogenetics of plasma cell leukemia (PCL). PMID:26263974

  15. Performance characteristics of two bioassays and high-performance liquid chromatography for determination of flucytosine in serum.

    PubMed Central

    St-Germain, G; Lapierre, S; Tessier, D

    1989-01-01

    We compared the accuracy and precision of two microbiological methods and one high-pressure liquid chromatography (HPLC) procedure used to measure the concentrations of flucytosine in serum. On the basis of an analysis of six standards, all methods were judged reliable within acceptable limits for clinical use. With the biological methods, a slight loss of linearity was observed in the 75- to 100-micrograms/ml range. Compared with the bioassays, the HPLC method did not present linearity problems and was more precise and accurate in the critical zone of 100 micrograms/ml. On average, results obtained with patient sera containing 50 to 100 micrograms of flucytosine per ml were 10.6% higher with the HPLC method than with the bioassays. Standards for the biological assays may be prepared in serum or water. PMID:2802566

  16. Molecular Classification and Pharmacogenetics of Primary Plasma Cell Leukemia: An Initial Approach toward Precision Medicine.

    PubMed

    Simeon, Vittorio; Todoerti, Katia; La Rocca, Francesco; Caivano, Antonella; Trino, Stefania; Lionetti, Marta; Agnelli, Luca; De Luca, Luciana; Laurenzana, Ilaria; Neri, Antonino; Musto, Pellegrino

    2015-07-30

    Primary plasma cell leukemia (pPCL) is a rare and aggressive variant of multiple myeloma (MM) which may represent a valid model for high-risk MM. This disease is associated with a very poor prognosis that, unfortunately, has not improved significantly during the last three decades. New high-throughput technologies have allowed a better understanding of the molecular basis of this disease and a move toward risk stratification, providing insights for targeted therapy studies. This knowledge, added to the pharmacogenetic profile of new and old agents in the analysis of efficacy and safety, could help clinical decisions move toward precision medicine and a better clinical outcome for these patients. In this review, we describe the available literature concerning the genomic characterization and pharmacogenetics of plasma cell leukemia (PCL).

  17. Towards accurate radial velocities from early type spectra in the framework of an ESO key programme

    NASA Astrophysics Data System (ADS)

    Verschueren, Werner; David, M.; Hensberge, Herman

    In order to elucidate the internal kinematics of very young stellar groups, dedicated machinery was set up that made it possible to proceed from the actual observations, through reductions and correlation analysis, to the ultimate derivation of early-type stellar radial velocities (RVs) with the requisite precision. The following ingredients are found to be essential to obtain RVs of early-type stars at the 1-km/s level of precision: high-resolution, high-S/N spectra covering a large wavelength range; maximal reduction of observational errors and the use of optimal reduction procedures; the intelligent use of a versatile cross-correlation package; and comparison of velocities derived from different regions of the spectrum in order to detect systematic mismatches between object and template spectra in some of the lines.

  18. Automation of Precise Time Reference Stations (PTRS)

    NASA Astrophysics Data System (ADS)

    Wheeler, P. J.

    1985-04-01

    The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile mini-computer controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.

  19. A validated fast difference spectrophotometric method for 5-hydroxymethyl-2-furfural (HMF) determination in corn syrups.

    PubMed

    de Andrade, Jucimara Kulek; de Andrade, Camila Kulek; Komatsu, Emy; Perreault, Hélène; Torres, Yohandra Reyes; da Rosa, Marcos Roberto; Felsner, Maria Lurdes

    2017-08-01

    Corn syrups, important ingredients used in the food and beverage industries, often contain high levels of 5-hydroxymethyl-2-furfural (HMF), a toxic contaminant. In this work, an in-house validation of a difference spectrophotometric method for HMF analysis in corn syrups was developed, using sophisticated statistical tools for the first time. The methodology showed excellent analytical performance with good selectivity, linearity (R² = 99.9%, r > 0.99), accuracy and low limits (LOD = 0.10 mg L⁻¹ and LOQ = 0.34 mg L⁻¹). Excellent precision was confirmed by repeatability (RSD = 0.30%) and intermediate precision (RSD = 0.36%) estimates and by the HorRat value (0.07). A detailed study of method precision using a nested design demonstrated that variation sources such as instruments, operators and time did not contribute to the variability of results within the laboratory and, consequently, to its intermediate precision. The developed method is environmentally friendly, fast, cheap and easy to implement, resulting in an attractive alternative for corn syrup quality control in industries and official laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Single photon ranging system using two wavelengths laser and analysis of precision

    NASA Astrophysics Data System (ADS)

    Chen, Yunfei; He, Weiji; Miao, Zhuang; Gu, Guohua; Chen, Qian

    2013-09-01

    Laser ranging systems based on time-correlated single photon counting and single photon detectors feature high precision and low emitted energy. In this paper, we established a single photon laser ranging system that uses a supercontinuum laser as the light source and two wavelengths (532 nm and 830 nm) of the echo signal as the stop signal. We propose a new method capable of improving the performance of the single photon ranging system. The method is implemented by using two single-photon detectors to receive the two different wavelength signals simultaneously. We extracted the firings of the two detectors triggered by the same laser pulse and took the mean time of the two firings as the combined detection time-of-flight. Detection over two channels with two wavelengths effectively improves the detection precision and decreases the false alarm probability. Finally, an experimental single photon ranging system was established. Through extensive experiments, we obtained the system precision using both single- and two-wavelength detection and verified the effectiveness of the method.
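
    The combination step described above reduces to averaging the two per-channel times of flight and converting the result to a one-way range. The sketch below shows that arithmetic; the timestamps are made-up values, not measurements from the experiment.

        # Sketch of the two-wavelength combination step: the two detector
        # timestamps triggered by the same laser pulse are averaged and the mean
        # time of flight is converted to range. Values are illustrative.
        C = 299_792_458.0  # speed of light in vacuum, m/s

        def combined_range(t_532_ns, t_830_ns):
            """Average the two channels' times of flight (ns) and return range (m)."""
            tof = 0.5 * (t_532_ns + t_830_ns) * 1e-9   # combined time of flight, s
            return 0.5 * C * tof                        # two-way path -> one-way range

        print(round(combined_range(6671.3, 6671.9), 3))  # roughly a 1000 m target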

  1. Nanoscale tailor-made membranes for precise and rapid molecular sieve separation.

    PubMed

    Wang, Jing; Zhu, Junyong; Zhang, Yatao; Liu, Jindun; Van der Bruggen, Bart

    2017-03-02

    The precise and rapid separation of different molecules from aqueous and organic solutions and from gas mixtures is critical to many technologies in the context of resource saving and sustainable development. The strength of membrane-based technologies is well recognized, and they are extensively applied as cost-effective, highly efficient separation techniques. Currently, empirically based approaches, lacking accurate nanoscale control, are used to prepare the most advanced membranes. In contrast, nanoscale control gives a membrane the molecular specificity (sub-2 nm) necessary for efficient and rapid molecular separation. Therefore, as a growing trend in membrane technology, the field of nanoscale tailor-made membranes is highlighted in this review. An in-depth analysis of the latest advances in tailor-made membranes for precise and rapid molecular sieving is given, along with an outlook on future perspectives of such membranes. Special attention is paid to established processing strategies, as well as to the application of molecular dynamics (MD) simulation in nanoporous membrane design. This review will provide useful guidelines for future research on the development of nanoscale tailor-made membranes with precise and rapid molecular sieve separation properties.

  2. Distortion control in 20MnCr5 bevel gears after liquid nitriding process to maintain precision dimensions

    NASA Astrophysics Data System (ADS)

    Mahendiran, M.; Kavitha, M.

    2018-02-01

    Robotic and automotive gears are generally very high precision components with tight tolerance limits. Bevel gears are widely used, dimensionally close-tolerance components that need stability, without any backlash or distortion, for smooth and trouble-free function. Nitriding is carried out to enhance the wear resistance of the surface. The aim of this paper is to reduce distortion in the liquid nitriding process, even though plasma nitriding is usually preferred for high precision components. Various trials were conducted to optimize the process parameters, considering a pre-dimensional setting for nominal nitriding layer growth. Surface cleaning, suitable fixtures and stress-relieving operations were also applied to optimize the process. Microstructural analysis and Vickers hardness testing were carried out to analyze the phase changes and the variation in surface hardness and case depth. A CNC gear testing machine was used to determine the distortion level. A white layer of about 10-15 μm was found within the case depth of 250 ± 3.5 μm, with an average surface hardness of 670 HV. Hence the economical liquid nitriding process was successfully used to produce a high-hardness, wear-resistant coating on 20MnCr5 material with less distortion and a reduced need for secondary grinding for dimensional control.

  3. Automation of ⁹⁹Tc extraction by LOV prior to ICP-MS detection: application to environmental samples.

    PubMed

    Rodríguez, Rogelio; Leal, Luz; Miranda, Silvia; Ferrer, Laura; Avivar, Jessica; García, Ariel; Cerdà, Víctor

    2015-02-01

    A new, fast, automated and inexpensive sample pre-treatment method for ⁹⁹Tc determination by inductively coupled plasma-mass spectrometry (ICP-MS) detection is presented. The miniaturized approach is based on a lab-on-valve (LOV) system, allowing automatic separation and preconcentration of ⁹⁹Tc. Selectivity is provided by the solid phase extraction system used (TEVA resin), which selectively retains the pertechnetate ion in diluted nitric acid solution. The proposed system has advantages such as minimization of sample handling, reduction of reagent volumes, and improvement of intermediate precision and sample throughput, offering a significant decrease in both time and cost per analysis in comparison to other flow techniques and batch methods. The proposed LOV system has been successfully applied to different samples of environmental interest (water and soil) with satisfactory recoveries, between 94% and 98%. The detection limit (LOD) of the developed method is 0.005 ng. The high durability of the resin and its low amount (32 mg), the good intermediate precision (RSD 3.8%) and repeatability (RSD 2%), and the high extraction frequency (up to 5 h⁻¹) make this method an inexpensive, high-precision and fast tool for monitoring ⁹⁹Tc in environmental samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations

    PubMed Central

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-01-01

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era. PMID:27649151

  5. Applied 3D printing for microscopy in health science research

    NASA Astrophysics Data System (ADS)

    Brideau, Craig; Zareinia, Kourosh; Stys, Peter

    2015-03-01

    The rapid prototyping capability offered by 3D printing is considered advantageous for commercial applications. However, the ability to quickly produce precision custom devices is highly beneficial in the research laboratory setting as well. Biological laboratories require the manipulation and analysis of delicate living samples, so the ability to create custom holders, support equipment, and adapters allows the extension of existing laboratory machines. Applications include camera adapters and stage sample holders for microscopes, surgical guides for tissue preparation, and small precision tools customized to unique specifications. Where high precision is needed, especially for the reproduction of fine features, a printer with high resolution is required. However, cheaper, lower-resolution commercial printers have been shown to be more than adequate for less demanding projects. For direct manipulation of delicate samples, biocompatible raw materials are often required, complicating the printing process. This paper will examine some examples of 3D-printed objects for laboratory use, and provide an overview of the requirements for 3D printing for this application. Materials, printing resolution, production, and ease of use will all be reviewed with an eye to producing better printers and techniques for laboratory applications. Specific case studies will highlight applications for 3D-printed devices in live animal imaging for both microscopy and Magnetic Resonance Imaging.

  6. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations.

    PubMed

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-09-14

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era.

  7. RDF SKETCH MAPS - KNOWLEDGE COMPLEXITY REDUCTION FOR PRECISION MEDICINE ANALYTICS.

    PubMed

    Thanintorn, Nattapon; Wang, Juexin; Ersoy, Ilker; Al-Taie, Zainab; Jiang, Yuexu; Wang, Duolin; Verma, Megha; Joshi, Trupti; Hammer, Richard; Xu, Dong; Shin, Dmitriy

    2016-01-01

    Realization of precision medicine ideas requires significant research effort to be able to spot subtle differences in complex diseases at the molecular level and to develop personalized therapies. It is especially important in many cases of highly heterogeneous cancers. Precision diagnostics and therapeutics of such diseases demand interrogation of vast amounts of biological knowledge coupled with novel analytic methodologies. For instance, pathway-based approaches can shed light on the way tumorigenesis takes place in individual patient cases and pinpoint novel drug targets. However, comprehensive analysis of hundreds of pathways and thousands of genes creates a combinatorial explosion that is challenging for medical practitioners to handle at the point of care. Here we extend our previous work on mapping clinical omics data to curated Resource Description Framework (RDF) knowledge bases to derive influence diagrams of interrelationships of biomarker proteins, diseases and signal transduction pathways for personalized theranostics. We present RDF Sketch Maps - a computational method to reduce knowledge complexity for precision medicine analytics. The method of RDF Sketch Maps is inspired by the way a sketch artist conveys only important visual information and discards other unnecessary details. In our case, we compute and retain only so-called RDF Edges - places with highly important diagnostic and therapeutic information. To do this we utilize 35 maps of human signal transduction pathways obtained by transforming 300 KEGG maps into a highly processable RDF knowledge base. We have demonstrated the potential clinical utility of RDF Sketch Maps in hematopoietic cancers, including analysis of pathways associated with Hairy Cell Leukemia (HCL) and Chronic Myeloid Leukemia (CML), where we achieved up to a 20-fold reduction in the number of biological entities to be analyzed while retaining the most likely important entities. In experiments with pathways associated with HCL, a generated RDF Sketch Map of the top 30% of paths retained important information about signaling cascades leading to activation of the proto-oncogene BRAF, which is usually associated with a different cancer, melanoma. Recent reports of successful treatment of HCL patients with the BRAF-targeted drug vemurafenib support the validity of the RDF Sketch Maps findings. We therefore believe that RDF Sketch Maps will be invaluable for hypothesis generation for precision diagnostics and therapeutics as well as for drug repurposing studies.
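
    The core reduction step, scoring paths through the pathway graph and keeping only the highest-scoring fraction, can be illustrated with a few lines of code. The path names, scores, and the 50% cut used in the example call are hypothetical and are not taken from the KEGG-derived knowledge base described above.

        # Toy sketch of "sketching" a pathway map: keep only the top fraction of
        # scored paths and discard the rest. Paths and scores are hypothetical.
        def sketch_map(path_scores, keep_fraction=0.30):
            """Return the highest-scoring paths, keeping roughly `keep_fraction` of them."""
            ranked = sorted(path_scores.items(), key=lambda kv: kv[1], reverse=True)
            n_keep = max(1, int(round(keep_fraction * len(ranked))))
            return dict(ranked[:n_keep])

        paths = {"EGFR->RAS->BRAF->MEK->ERK": 0.92,
                 "CYTOKINE->JAK->STAT": 0.81,
                 "WNT->APC->BETA-CATENIN": 0.40,
                 "NOTCH->HES1": 0.22}
        # 50% here only because the toy example has just four paths.
        print(sketch_map(paths, keep_fraction=0.5))  # keeps the two strongest cascades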

  8. The accuracy and precision of radiostereometric analysis in upper limb arthroplasty.

    PubMed

    Ten Brinke, Bart; Beumer, Annechien; Koenraadt, Koen L M; Eygendaal, Denise; Kraan, Gerald A; Mathijssen, Nina M C

    2017-06-01

    Background and purpose - Radiostereometric analysis (RSA) is an accurate method for measurement of early migration of implants. Since a relation has been shown between early migration and future loosening of total knee and hip prostheses, RSA plays an important role in the development and evaluation of prostheses. However, there have been few RSA studies of the upper limb, and the value of RSA of the upper limb is not yet clear. We therefore performed a systematic review to investigate the accuracy and precision of RSA of the upper limb. Patients and methods - PRISMA guidelines were followed and the protocol for this review was published online at PROSPERO under registration number CRD42016042014. A systematic search of the literature was performed in the databases Embase, Medline, Cochrane, Web of Science, Scopus, Cinahl, and Google Scholar on April 25, 2015 based on the keywords radiostereometric analysis, shoulder prosthesis, elbow prosthesis, wrist prosthesis, trapeziometacarpal joint prosthesis, humerus, ulna, radius, carpus. Articles concerning RSA for the analysis of early migration of prostheses of the upper limb were included. Quality assessment was performed using the MINORS score, Downs and Black checklist, and the ISO RSA standard. Results - 23 studies were included. Precision values were in the 0.06-0.88 mm and 0.05-10.7° range for the shoulder, the 0.05-0.34 mm and 0.16-0.76° range for the elbow, and the 0.16-1.83 mm and 11-124° range for the TMC joint. Accuracy data from marker- and model-based RSA were not reported in the studies included. Interpretation - RSA is a highly precise method for measurement of early migration of orthopedic implants in the upper limb. However, the precision of rotation measurement is poor for some components. Challenges with RSA in the upper limb include the symmetrical shape of prostheses and the limited size of surrounding bone, leading to over-projection of the markers by the prosthesis. We recommend higher adherence to RSA guidelines and encourage investigators to publish long-term follow-up RSA studies.

  9. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-02-26

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.

  10. Structural health monitoring ultrasonic thickness measurement accuracy and reliability of various time-of-flight calculation methods

    NASA Astrophysics Data System (ADS)

    Eason, Thomas J.; Bond, Leonard J.; Lozev, Mark G.

    2016-02-01

    The accuracy, precision, and reliability of ultrasonic thickness structural health monitoring systems are discussed, including the influence of systematic and environmental factors. To quantify some of these factors, a compression wave ultrasonic thickness structural health monitoring experiment is conducted on a flat calibration block at ambient temperature with forty-four thin-film sol-gel transducers and various time-of-flight thickness calculation methods. As an initial calibration, the voltage response signals from each sensor are used to determine the common material velocity as well as the signal offset unique to each calculation method. Next, the measurement precision of the thickness error of each method is determined with a proposed weighted censored relative maximum likelihood analysis technique incorporating the propagation of asymmetric measurement uncertainty. The results are presented as upper and lower confidence limits analogous to the a90/95 terminology used in industry-recognized Probability-of-Detection assessments. Future work is proposed to apply the statistical analysis technique to quantify the measurement precision of various thickness calculation methods under different environmental conditions such as high temperature, rough back-wall surfaces, and system degradation, with an intended application to monitor naphthenic acid corrosion in oil refineries.
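
    The underlying calculation in any such system is the pulse-echo relation between time of flight, material velocity, and thickness, together with a per-sensor signal offset determined on a block of known thickness. The sketch below illustrates that relation only; the velocity, offset, and times are illustrative numbers, not the calibration described above.

        # Sketch of pulse-echo thickness from time of flight, with a simple
        # per-sensor offset calibration on a block of known thickness.
        # The velocity and timing values are illustrative placeholders.
        def calibrate_offset(known_thickness_mm, measured_tof_us, velocity_mm_per_us):
            """Signal offset (us) that makes the TOF reproduce the known thickness."""
            return measured_tof_us - 2.0 * known_thickness_mm / velocity_mm_per_us

        def thickness(measured_tof_us, velocity_mm_per_us, offset_us):
            """Pulse-echo thickness: half the corrected round-trip path."""
            return 0.5 * velocity_mm_per_us * (measured_tof_us - offset_us)

        v = 5.9  # compression-wave velocity in steel, roughly 5.9 mm/us
        offset = calibrate_offset(known_thickness_mm=10.0, measured_tof_us=3.45,
                                  velocity_mm_per_us=v)
        print(round(thickness(3.41, v, offset), 3))  # thinner than the 10 mm block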

  11. Development of Optical System for ARGO-M

    NASA Astrophysics Data System (ADS)

    Nah, Jakyoung; Jang, Jung-Guen; Jang, Bi-Ho; Han, In-Woo; Han, Jeong-Yeol; Park, Kwijong; Lim, Hyung-Chul; Yu, Sung-Yeol; Park, Eunseo; Seo, Yoon-Kyung; Moon, Il-Kwon; Choi, Byung-Kyu; Na, Eunjoo; Nam, Uk-Won

    2013-03-01

    ARGO-M is a satellite laser ranging (SLR) system developed by the Korea Astronomy and Space Science Institute with consideration of mobility and of daytime and nighttime satellite observation. The ARGO-M optical system consists of a 40 cm receiving telescope, a 10 cm transmitting telescope, and detecting optics. For the development of the ARGO-M optical system, structural analysis was performed on the optics and optomechanics design and the optical components. To ensure the optical performance, quality was tested at the part level using a laser interferometer and ultra-high-precision measuring instruments. The assembly and alignment of the ARGO-M optical system were conducted at an auto-collimation facility. As transmission and reception are separated in the ARGO-M optical system, the pointing alignment between the transmitting telescope and the receiving telescope is critical for precise target pointing. Thus, alignment using a ground target and observation of the radiant point of the transmitted laser beam was carried out, and the lines of sight of the two telescopes were aligned within the required pointing precision. This paper describes the design, structural analysis, manufacture and assembly of parts, and the entire alignment process for the ARGO-M optical system.

  12. Precise orbit computation and sea surface modeling

    NASA Technical Reports Server (NTRS)

    Wakker, Karel F.; Ambrosius, B. A. C.; Rummel, R.; Vermaat, E.; Deruijter, W. P. M.; Vandermade, J. W.; Zimmerman, J. T. F.

    1991-01-01

    The research project described below is part of a long-term program at Delft University of Technology aiming at the application of European Remote Sensing satellite (ERS-1) and TOPEX/POSEIDON altimeter measurements for geophysical purposes. This program started in 1980 with the processing of Seasat laser range and altimeter height measurements and concentrates today on the analysis of Geosat altimeter data. The objectives of the TOPEX/POSEIDON research project are the tracking of the satellite by the Dutch mobile laser tracking system MTLRS-2, the computation of precise TOPEX/POSEIDON orbits, the analysis of the spatial and temporal distribution of the orbit errors, the improvement of ERS-1 orbits through the information obtained from the altimeter crossover difference residuals for crossing ERS-1 and TOPEX/POSEIDON tracks, the combination of ERS-1 and TOPEX/POSEIDON altimeter data into a single high-precision data set, and the application of this data set to model the sea surface. The latter application will focus on the determination of detailed regional mean sea surfaces, sea surface variability, ocean topography, and ocean currents in the North Atlantic, the North Sea, the seas around Indonesia, the West Pacific, and the oceans around South Africa.

  13. Solving complex photocycle kinetics. Theory and direct method.

    PubMed Central

    Nagle, J F

    1991-01-01

    A direct nonlinear least squares method is described that obtains the true kinetic rate constants and the temperature-independent spectra of n intermediates from spectroscopic data taken in the visible at three or more temperatures. A theoretical analysis, which is independent of implementation of the direct method, proves that well determined local solutions are not possible for fewer than three temperatures. This analysis also proves that measurements at more than n wavelengths are redundant, although the direct method indicates that convergence is faster if n + m wavelengths are measured, where m is of order one. This suggests that measurements should concentrate on high precision for a few measuring wavelengths, rather than lower precision for many wavelengths. Globally, false solutions occur, and the ability to reject these depends upon the precision of the data, as shown by explicit example. An optimized way to analyze vibrational spectroscopic data is also presented. Such data yield unique results, which are comparably accurate to those obtained from data taken in the visible with comparable noise. It is discussed how use of both kinds of data is advantageous if the data taken in the visible are significantly less noisy. PMID:2009362
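
    The paper's direct method fits rate constants and intermediate spectra jointly across several temperatures; as a much-reduced illustration of the nonlinear least-squares idea, the sketch below fits a two-exponential decay at a single wavelength to synthetic data. The model, noise level, and rate constants are assumptions for the example and do not come from the paper.

        # Hedged sketch of a direct nonlinear least-squares fit of kinetic rate
        # constants for a simple two-exponential decay; data are synthetic.
        import numpy as np
        from scipy.optimize import curve_fit

        def model(t, a1, k1, a2, k2):
            return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 5.0, 200)                       # time, ms
        true = (0.7, 3.0, 0.3, 0.4)                          # amplitudes and rates
        y = model(t, *true) + 0.005 * rng.standard_normal(t.size)

        popt, pcov = curve_fit(model, t, y, p0=(0.5, 2.0, 0.5, 0.5))
        print(np.round(popt, 2))                    # recovered a1, k1, a2, k2
        print(np.round(np.sqrt(np.diag(pcov)), 3))  # 1-sigma parameter uncertainties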

  14. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was setup and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  15. The Influence of Sampling Density on Bayesian Age-Depth Models and Paleoclimatic Reconstructions - Lessons Learned from Lake Titicaca - Bolivia/Peru

    NASA Astrophysics Data System (ADS)

    Salenbien, W.; Baker, P. A.; Fritz, S. C.; Guedron, S.

    2014-12-01

    Lake Titicaca is one of the most important archives of paleoclimate in tropical South America, and prior studies have elucidated patterns of climate variation at varied temporal scales over the past 0.5 Ma. Yet, slow sediment accumulation rates in the main, deeper basin of the lake have precluded analysis of the lake's most recent history at high resolution. To obtain a paleoclimate record of the last few millennia at multi-decadal resolution, we obtained five short cores, ranging from 139 to 181 cm in length, from the shallower Wiñaymarka sub-basin of Lake Titicaca, where sedimentation rates are higher than in the lake's main basin. Selected cores have been analyzed for their geochemical signature by scanning XRF, for diatom stratigraphy and sedimentology, and for 14C age dating. A total of 72 samples were 14C-dated using a Gas Ion Source automated high-throughput method for carbonate samples (mainly Littoridina sp. and Taphius montanus gastropod shells) at NOSAMS (Woods Hole Oceanographic Institution) with an analytical precision higher than 2%. The method has lower analytical precision than traditional AMS radiocarbon dating, but the lower cost enables analysis of a larger number of samples, and the error associated with the lower precision is relatively small for younger samples (< ~8,000 years). A 172-cm-long core was divided into centimeter-long sections, and 47 14C dates were obtained from 1-cm intervals, averaging one date every 3-4 cm. The other cores were radiocarbon dated with a sparser sampling density that focused on visual unconformities and shell beds. The high-resolution radiocarbon analysis reveals complex sedimentation patterns in visually continuous sections, with abundant indicators of bioturbated or reworked sediments and periods of very rapid sediment accumulation. These features are not evident with the sparser sampling strategy but have significant implications for reconstructing past lake level and paleoclimatic history.

  16. Automatic stent strut detection in intravascular OCT images using image processing and classification technique

    NASA Astrophysics Data System (ADS)

    Lu, Hong; Gargesha, Madhusudhana; Wang, Zhao; Chamie, Daniel; Attizani, Guilherme F.; Kanaya, Tomoaki; Ray, Soumya; Costa, Marco A.; Rollins, Andrew M.; Bezerra, Hiram G.; Wilson, David L.

    2013-02-01

    Intravascular OCT (iOCT) is an imaging modality with ideal resolution and contrast to provide accurate in vivo assessments of tissue healing following stent implantation. Our Cardiovascular Imaging Core Laboratory has served >20 international stent clinical trials with >2000 stents analyzed. Each stent requires 6-16 hours of manual analysis time, and we are developing highly automated software to reduce this extreme effort. Using a classification technique, physically meaningful image features, forward feature selection to limit overtraining, and leave-one-stent-out cross validation, we detected stent struts. To determine tissue coverage areas, we estimated stent "contours" by fitting detected struts and interpolation points from linearly interpolated tissue depths to a periodic cubic spline. Tissue coverage area was obtained by subtracting the lumen area from the stent area. Detection was compared against manual analysis of 40 pullbacks. We obtained recall = 90+/-3% and precision = 89+/-6%. When taking struts deemed not bright enough for manual analysis into consideration, precision improved to 94+/-6%. This approached inter-observer variability (recall = 93%, precision = 96%). Differences in stent and tissue coverage areas are 0.12 +/- 0.41 mm2 and 0.09 +/- 0.42 mm2, respectively. We are developing software that will enable visualization, review, and editing of automated results, so as to provide a comprehensive stent analysis package. This should enable better and cheaper stent clinical trials, so that manufacturers can optimize the myriad of parameters (drug, coverage, bioresorbable versus metal, etc.) for stent design.
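
    The contour-fitting step above closes a smooth curve through the detected struts; a periodic cubic spline in polar coordinates is one way to do this, and the enclosed (stent) area then follows from integrating r²/2 over the angle. The sketch below assumes made-up strut coordinates and uses scipy's periodic spline option; it is an illustration of the idea, not the Core Laboratory software.

        # Sketch of fitting a closed ("periodic") cubic spline through detected
        # strut positions in polar coordinates and computing the enclosed area.
        # The strut coordinates below are invented for illustration.
        import numpy as np
        from scipy.interpolate import CubicSpline
        from scipy.integrate import trapezoid

        theta = np.array([0.0, 0.8, 1.7, 2.6, 3.5, 4.4, 5.2, 2 * np.pi])
        radius = np.array([1.50, 1.55, 1.48, 1.52, 1.49, 1.54, 1.51, 1.50])  # mm
        # The periodic boundary condition requires the first and last values to match.
        contour = CubicSpline(theta, radius, bc_type="periodic")

        # Area enclosed by r(theta): A = 0.5 * integral of r^2 dtheta.
        t = np.linspace(0.0, 2 * np.pi, 2000)
        area = 0.5 * trapezoid(contour(t) ** 2, t)
        print(round(float(area), 3))  # stent area in mm^2, ~7.2 for r ~ 1.51 mm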

  17. ANALYSIS OF KEPLER'S SHORT-CADENCE PHOTOMETRY FOR TrES-2b

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kipping, David; Bakos, Gaspar, E-mail: dkipping@cfa.harvard.edu

    2011-05-20

    We present an analysis of 18 short-cadence (SC) transit light curves of TrES-2b using quarter 0 (Q0) and quarter 1 (Q1) from the Kepler Mission. The photometry is of unprecedented precision, 237 ppm minute⁻¹, allowing for the most accurate determination of the transit parameters yet obtained for this system. Global fits of the transit photometry, radial velocities, and known transit times are used to obtain a self-consistent set of refined parameters for this system, including updated stellar and planetary parameters. Special attention is paid to fitting for limb darkening and eccentricity. We place an upper limit on the occultation depth to be <72.9 ppm at 3σ confidence, indicating TrES-2b has the lowest determined geometric albedo for an exoplanet, of A_g < 0.146. We also produce a transit timing analysis using Kepler's SC data and demonstrate exceptional timing precision at the level of a few seconds for each transit event. With 18 fully sampled transits at such high precision, we are able to produce stringent constraints on the presence of perturbing planets, Trojans, and extrasolar moons. We introduce the novel use of control data to identify phasing effects. We also exclude the previously proposed hypotheses of short-period transit time variation and additional transits but find that the hypothesis of long-term inclination change is neither supported nor refuted by our analysis.

  18. In-vivo measurement of dynamic joint motion using high speed biplane radiography and CT: application to canine ACL deficiency.

    PubMed

    Tashman, Scott; Anderst, William

    2003-04-01

    Dynamic assessment of three-dimensional (3D) skeletal kinematics is essential for understanding normal joint function as well as the effects of injury or disease. This paper presents a novel technique for measuring in-vivo skeletal kinematics that combines data collected from high-speed biplane radiography and static computed tomography (CT). The goals of the present study were to demonstrate that highly precise measurements can be obtained during dynamic movement studies employing high frame-rate biplane video-radiography, to develop a method for expressing joint kinematics in an anatomically relevant coordinate system and to demonstrate the application of this technique by calculating canine tibio-femoral kinematics during dynamic motion. The method consists of four components: the generation and acquisition of high frame rate biplane radiographs, identification and 3D tracking of implanted bone markers, CT-based coordinate system determination, and kinematic analysis routines for determining joint motion in anatomically based coordinates. Results from dynamic tracking of markers inserted in a phantom object showed the system bias was insignificant (-0.02 mm). The average precision in tracking implanted markers in-vivo was 0.064 mm for the distance between markers and 0.31 degree for the angles between markers. Across-trial standard deviations for tibio-femoral translations were similar for all three motion directions, averaging 0.14 mm (range 0.08 to 0.20 mm). Variability in tibio-femoral rotations was more dependent on rotation axis, with across-trial standard deviations averaging 1.71 degrees for flexion/extension, 0.90 degree for internal/external rotation, and 0.40 degree for varus/valgus rotation. Advantages of this technique over traditional motion analysis methods include the elimination of skin motion artifacts, improved tracking precision and the ability to present results in a consistent anatomical reference frame.

  19. Quantitative analysis of drugs in hair by UHPLC high resolution mass spectrometry.

    PubMed

    Kronstrand, Robert; Forsman, Malin; Roman, Markus

    2018-02-01

    Liquid chromatographic methods coupled to high resolution mass spectrometry are increasingly used to identify compounds in various matrices including hair, but there are few recommendations regarding the parameters and their criteria to identify a compound. In this study we present a method for the identification and quantification of a range of drugs and discuss the parameters used to identify a compound with high resolution mass spectrometry. Drugs were extracted from hair by incubation in a buffer:solvent mixture at 37°C for 18 h. Analysis was performed on a chromatographic system comprised of an Agilent 6550 QTOF coupled to a 1290 Infinity UHPLC system. High resolution accurate mass data were acquired in the All Ions mode and exported into Mass Hunter Quantitative software for quantitation and identification using qualifier fragment ions. Validation included selectivity, matrix effects, calibration range, within-day and between-day precision, and accuracy. The analytes were 7-amino-flunitrazepam, 7-amino-clonazepam, 7-amino-nitrazepam, acetylmorphine, alimemazine, alprazolam, amphetamine, benzoylecgonine, buprenorphine, diazepam, ethylmorphine, fentanyl, hydroxyzine, ketobemidone, codeine, cocaine, MDMA, methadone, methamphetamine, morphine, oxycodone, promethazine, propiomazine, propoxyphene, tramadol, zaleplone, zolpidem, and zopiclone. As proof of concept, hair from 29 authentic post-mortem cases was analysed. The calibration range was established from 0.05 ng/mg to 5.0 ng/mg for all analytes except fentanyl (0.02-2.0), buprenorphine (0.04-2.0), and ketobemidone (0.05-4.0), as well as for alimemazine, amphetamine, cocaine, methadone, and promethazine (0.10-5.0). For all analytes, the accuracy of the fortified pooled hair matrix was 84-108% at the low level and 89-106% at the high level. The within-series precisions were between 1.4 and 6.7% and the between-series precisions were between 1.4 and 10.1%. From the 29 autopsy cases, 121 positive findings were encountered for 23 of the analytes, in concentrations similar to those previously published. We conclude that the developed method proved precise and accurate and that it had sufficient performance for the purpose of detecting regular use of drugs or treatment with prescription drugs. To identify a compound we recommend the use of ion ratios as a complement to instrument software "matching scores". Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Fast and precise dense grid size measurement method based on coaxial dual optical imaging system

    NASA Astrophysics Data System (ADS)

    Guo, Jiping; Peng, Xiang; Yu, Jiping; Hao, Jian; Diao, Yan; Song, Tao; Li, Ameng; Lu, Xiaowei

    2015-10-01

    Test sieves with a dense grid structure are widely used in many fields, and accurate grid size calibration is critical for successful grading analysis and test sieving. However, traditional calibration methods suffer from low measurement efficiency and a limited number of sampled grids, which can lead to quality-judgment risk. Here, a fast and precise test sieve inspection method is presented. First, a coaxial imaging system with low and high optical magnification probes is designed to capture grid images of the test sieve. Then, a scaling ratio between the low and high magnification probes is obtained from corresponding grids in the captured images. With this, all grid dimensions in the low-magnification image can be obtained with high accuracy by measuring a few corresponding grids in the high-magnification image. Finally, by scanning the stage of the tri-axis platform of the measuring apparatus, the whole surface of the test sieve can be quickly inspected. Experimental results show that the proposed method measures test sieves more efficiently than traditional methods, covering 0.15 million grids (grid size 0.1 mm) within only 60 seconds, and it can precisely measure grid sizes from 20 μm to 5 mm. In short, the presented method calibrates the grid size of a test sieve automatically with high efficiency and accuracy, so that surface evaluation based on statistical methods can be effectively implemented and quality judgments become more reasonable.
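
    The calibration transfer described above amounts to measuring the same grid in both images, forming the pixel-scale ratio, and applying it to every grid found at low magnification. A minimal numerical sketch follows; all pixel counts and the reference opening are invented for illustration.

        # Sketch of the dual-magnification scaling idea: a grid measured in both
        # images gives a pixel-size ratio, which calibrates every low-magnification
        # measurement. All numbers are illustrative.
        ref_grid_um = 100.0        # grid opening measured precisely at high mag (um)
        ref_grid_px_high = 500.0   # same opening in high-magnification pixels
        ref_grid_px_low = 50.0     # same opening in low-magnification pixels

        um_per_px_high = ref_grid_um / ref_grid_px_high   # 0.2 um per pixel
        scale_ratio = ref_grid_px_high / ref_grid_px_low  # high/low magnification, 10x
        um_per_px_low = um_per_px_high * scale_ratio      # 2.0 um per pixel

        # Every grid opening found in the low-magnification image is now calibrated.
        low_mag_openings_px = [49.5, 50.2, 51.0, 48.8]
        print([round(p * um_per_px_low, 1) for p in low_mag_openings_px])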

  1. A numerical similarity approach for using retired Current Procedural Terminology (CPT) codes for electronic phenotyping in the Scalable Collaborative Infrastructure for a Learning Health System (SCILHS).

    PubMed

    Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N

    2015-12-11

    Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often: not widely interoperable; or, have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely-available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically year-to-year - codes are retired/replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three-million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97 % precision when considering only miscategorizations ("correctness precision") and 52 % precision using a gold-standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchal position that a reviewer can quickly validate. Lower optimality precision meant that codes were not often placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93 % of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. We developed a simple, easily-validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach's utility is confirmed by the high correctness precision and successful grouping of retired with non-retired codes.
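
    The placement idea above relies on the numeric structure of CPT codes: a retired code is assigned to the grouper whose code range it falls in or, failing that, to the numerically closest range. The sketch below shows one way such a rule could look; the grouper names, ranges, and the retired code are hypothetical and are not taken from BioPortal or the SCILHS ontology.

        # Hedged sketch of placing a retired procedure code under the numerically
        # closest "grouper" range in a hierarchy. Ranges and codes are hypothetical.
        def place_retired_code(code, groupers):
            """Return the grouper whose numeric range is closest to `code`."""
            value = int(code)

            def distance(rng):
                lo, hi = rng
                if lo <= value <= hi:
                    return 0                      # inside the range: exact grouper
                return min(abs(value - lo), abs(value - hi))

            return min(groupers, key=lambda name: distance(groupers[name]))

        # Hypothetical grouper categories with their CPT numeric ranges.
        groupers = {
            "Surgery / cardiovascular":  (33010, 37799),
            "Radiology / diagnostic":    (70010, 76499),
            "Medicine / cardiovascular": (92920, 93799),
        }
        print(place_retired_code("93700", groupers))  # inside the Medicine range
        print(place_retired_code("93980", groupers))  # nearest to the Medicine range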

  2. A rapid method for the simultaneous determination of 25 anti-hypertensive compounds in dietary supplements using ultra-high-pressure liquid chromatography.

    PubMed

    Heo, Seok; Yoo, Geum Joo; Choi, Ji Yeon; Park, Hyoung Joon; Park, Sung-Kwan; Baek, Sun Young

    2016-11-01

    A novel, stable, simple and specific ultra-performance liquid chromatography method with ultraviolet detection (205 nm) for the simultaneous analysis of 25 anti-hypertensive substances was developed. The method was validated according to the International Conference on Harmonisation guidelines with respect to linearity, accuracy, precision, limit of detection (LOD), limit of quantitation (LOQ) and stability. From the ultra-performance liquid chromatography results, we identified the LOD and LOQ of solid samples to be 0.20-1.00 and 0.60-3.00 μg ml⁻¹, respectively, while those of liquid samples were 0.30-1.20 and 0.90-3.60 μg ml⁻¹, respectively. The linearity exceeded 0.9999, and the intra- and inter-day precisions were 0.15-6.48% and 0.28-8.67%, respectively. The intra- and inter-day accuracies were 82.25-111.42% and 80.70-115.64%, respectively, and the stability was lower than 12.9% (relative standard deviation). This method was applied to the monitoring of 97 commercially available dietary supplements obtained in Korea, such as pills, soft capsules, hard capsules, liquids, powders and tablets. The proposed method is accurate, precise and of high quality, and can be used for the routine, reproducible analysis and control of 25 anti-hypertensive substances in various dietary supplements. The work presented herein may help to prevent incidents related to food adulteration and restrict the illegal food market.

  3. Quantitative high-resolution genomic analysis of single cancer cells.

    PubMed

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  4. Precision Attitude Determination System (PADS) design and analysis. Two-axis gimbal star tracker

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the Precision Attitude Determination System (PADS) focused chiefly on the two-axis gimballed star tracker and electronics design, improved from that of the Precision Pointing Control System (PPCS), and on application of the improved tracker for PADS at geosynchronous altitude. System design, system analysis, software design, and hardware design activities are reported. The system design encompasses the PADS configuration, system performance characteristics, component design summaries, and interface considerations. The PADS design and performance analysis includes error analysis, performance analysis via attitude determination simulation, and star tracker servo design analysis. The design of the star tracker and its electronics is discussed. Sensor electronics schematics are included. A detailed characterization of the application software algorithms and computer requirements is provided.

  5. High-speed peak matching algorithm for retention time alignment of gas chromatographic data for chemometric analysis.

    PubMed

    Johnson, Kevin J; Wright, Bob W; Jarman, Kristin H; Synovec, Robert E

    2003-05-09

    A rapid retention time alignment algorithm was developed as a preprocessing utility to be used prior to chemometric analysis of large datasets of diesel fuel profiles obtained using gas chromatography (GC). Retention time variation from chromatogram to chromatogram has been a significant impediment to the use of chemometric techniques in the analysis of chromatographic data, due to the inability of current chemometric techniques to correctly model information that shifts from variable to variable within a dataset. The alignment algorithm developed is shown to increase the efficacy of pattern recognition methods applied to diesel fuel chromatograms by retaining chemical selectivity while reducing chromatogram-to-chromatogram retention time variations, and to do so on a time scale that makes analysis of large sets of chromatographic data practical. Two sets of diesel fuel gas chromatograms were studied using the novel alignment algorithm followed by principal component analysis (PCA). In the first study, retention times for corresponding chromatographic peaks in 60 chromatograms varied by as much as 300 ms between chromatograms before alignment. In the second study of 42 chromatograms, the retention time shifting exhibited was on the order of 10 s between corresponding chromatographic peaks, and required a coarse retention time correction prior to alignment with the algorithm. In both cases, an increase in retention time precision afforded by the algorithm was clearly visible in plots of overlaid chromatograms before and after applying the retention time alignment algorithm. Using the alignment algorithm, the standard deviation for corresponding peak retention times following alignment was 17 ms throughout a given chromatogram, corresponding to a relative standard deviation of 0.003% at an average retention time of 8 min. This level of retention time precision is a 5-fold improvement over the retention time precision initially provided by a state-of-the-art GC instrument equipped with electronic pressure control and was critical to the performance of the chemometric analysis. This increase in retention time precision does not come at the expense of chemical selectivity, since the PCA results suggest that essentially all of the chemical selectivity is preserved. Cluster resolution between dissimilar groups of diesel fuel chromatograms in a two-dimensional scores space generated with PCA is shown to substantially increase after alignment. The alignment method is robust against missing or extra peaks relative to a target chromatogram used in the alignment, and operates at high speed, requiring roughly 1 s of computation time per GC chromatogram.
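
    A minimal sketch of target-based retention-time alignment is given below; the peak-matching window and the piecewise-linear warp are illustrative choices under stated assumptions, not the exact algorithm of the paper.

    ```python
    import numpy as np

    # Illustrative retention-time alignment against a target chromatogram:
    # match peaks to the target, warp the time axis so matched peaks coincide,
    # and resample the signal onto the original axis.

    def match_peaks(sample_peaks, target_peaks, window=0.5):
        """Pair each target peak with the nearest sample peak within +/- window (time units)."""
        pairs = []
        for t in target_peaks:
            j = np.argmin(np.abs(sample_peaks - t))
            if abs(sample_peaks[j] - t) <= window:
                pairs.append((sample_peaks[j], t))
        return np.array(pairs)

    def align(time_axis, signal, sample_peaks, target_peaks):
        """Return the sample chromatogram warped so its peaks land on the target peaks."""
        pairs = match_peaks(np.asarray(sample_peaks), np.asarray(target_peaks))
        # For each target-time point, find the corresponding sample time (piecewise linear),
        # then read the sample signal at that time.
        sample_time = np.interp(time_axis, pairs[:, 1], pairs[:, 0])
        return np.interp(sample_time, time_axis, signal)

    # Toy usage: a chromatogram whose two peaks are shifted by +0.3 time units.
    t = np.linspace(0, 10, 2001)
    target = np.exp(-((t - 3.0) ** 2) / 0.01) + np.exp(-((t - 7.0) ** 2) / 0.01)
    sample = np.exp(-((t - 3.3) ** 2) / 0.01) + np.exp(-((t - 7.3) ** 2) / 0.01)
    aligned = align(t, sample, sample_peaks=[3.3, 7.3], target_peaks=[3.0, 7.0])
    ```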

  6. Hologic QDR 2000 whole-body scans: a comparison of three combinations of scan modes and analysis software

    NASA Technical Reports Server (NTRS)

    Spector, E.; LeBlanc, A.; Shackelford, L.

    1995-01-01

    This study reports on the short-term in vivo precision and absolute measurements of three combinations of whole-body scan modes and analysis software using a Hologic QDR 2000 dual-energy X-ray densitometer. A group of 21 normal, healthy volunteers (11 male and 10 female) were scanned six times, receiving one pencil-beam and one array whole-body scan on three occasions approximately 1 week apart. The following combinations of scan modes and analysis software were used: pencil-beam scans analyzed with Hologic's standard whole-body software (PB scans); the same pencil-beam scans analyzed with Hologic's newer "enhanced" software (EPB scans); and array scans analyzed with the enhanced software (EA scans). Precision values (% coefficient of variation, %CV) were calculated for whole-body and regional bone mineral content (BMC), bone mineral density (BMD), fat mass, lean mass, %fat and total mass. In general, there was no significant difference among the three scan types with respect to short-term precision of BMD and only slight differences in the precision of BMC. Precision of BMC and BMD for all three scan types was excellent: < 1% CV for whole-body values, with most regional values in the 1%-2% range. Pencil-beam scans demonstrated significantly better soft tissue precision than did array scans. Precision errors for whole-body lean mass were: 0.9% (PB), 1.1% (EPB) and 1.9% (EA). Precision errors for whole-body fat mass were: 1.7% (PB), 2.4% (EPB) and 5.6% (EA). EPB precision errors were slightly higher than PB precision errors for lean, fat and %fat measurements of all regions except the head, although these differences were significant only for the fat and %fat of the arms and legs. In addition, EPB precision values exhibited greater individual variability than PB precision values. Finally, absolute values of bone and soft tissue were compared among the three combinations of scan and analysis modes. BMC, BMD, fat mass, %fat and lean mass were significantly different between PB scans and either of the EPB or EA scans. Differences were as large as 20%-25% for certain regional fat and BMD measurements. Additional work may be needed to examine the relative accuracy of the scan mode/software combinations and to identify reasons for the differences in soft tissue precision with the array whole-body scan mode.
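
    The abstract does not restate how the %CV precision values were pooled across subjects; a common densitometry convention is the root-mean-square CV, sketched here under that assumption.

    ```python
    import numpy as np

    # Sketch of short-term precision as %CV, pooled across subjects as an RMS average
    # (a common densitometry convention; the study's exact pooling formula is not given
    # in the abstract).

    def percent_cv(repeats):
        """%CV for one subject's repeated measurements."""
        repeats = np.asarray(repeats, dtype=float)
        return 100.0 * repeats.std(ddof=1) / repeats.mean()

    def rms_cv(all_subjects):
        """Root-mean-square %CV over all subjects."""
        cvs = np.array([percent_cv(r) for r in all_subjects])
        return np.sqrt(np.mean(cvs ** 2))

    # Toy data: whole-body BMD (g/cm^2) for three subjects, three repeat scans each.
    scans = [[1.120, 1.118, 1.125], [0.998, 1.002, 0.995], [1.210, 1.205, 1.212]]
    print(f"short-term precision = {rms_cv(scans):.2f} %CV")
    ```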

  7. Advances in laser ablation MC-ICPMS isotopic analysis of rock materials

    NASA Astrophysics Data System (ADS)

    Young, E. D.

    2007-12-01

    Laser ablation multiple-collector inductively coupled plasma-source mass spectrometry (LA-MC-ICPMS) is a rapid method for obtaining high-precision isotope ratio measurements in geological samples. The method has been used with success for measuring isotope ratios of numerous elements, including Pb, Hf, Mg, Si, and Fe in terrestrial and extraterrestrial samples. It fills the gap between the highest precision obtainable with acid digestion together with MC-ICPMS and thermal ionization mass spectrometry (TIMS) and the maximum spatial resolution afforded by secondary ion mass spectrometry (SIMS). Matrix effects have been shown to be negligible for Pb isotopic analysis by LA-MC-ICPMS (Simon et al., 2007). Glass standards NBS 610, 612, and 614 have Pb/matrix ratios spanning two orders of magnitude. Our sample-standard bracketing laser ablation technique gives accurate and precise 208Pb/206Pb and 207Pb/206Pb for these glasses. The accuracy is superior to that obtained when using Tl to correct for mass fractionation. Accuracy and precision (± 0.2 ‰) for Pb in feldspars is comparable to that for double-spike TIMS. Data like these have been used to distinguish distinct sources of magmas in the Long Valley silicic magma system. LA-MC-ICPMS analyses of Mg isotope ratios in calcium-aluminum-rich inclusions (CAIs) from carbonaceous chondrite meteorites have revealed a wealth of new information about the history of these objects. A byproduct of this work has been recognition of the importance of different mass fractionation laws among three isotopes of a given element. Kinetic and equilibrium processes define distinct fractionation laws. Reservoir effects can further modify these laws. The result is that the linear coefficient β that relates the logarithms of the ratios n2/n1 and n3/n1 (ni refers to the number of atoms of isotope i) of isotopes with masses m3 > m2 > m1 is not unique. Rather, it is process dependent. In the case of Mg, this coefficient ranges from 0.521 for single-step equilibrium processes to 0.510 or even lower for kinetic processes. Rayleigh fractionation involving a kinetic process with a single-step β of 0.510 produces an effective β of 0.512. Such differences in fractionation laws can be crucial for determining excesses or deficits in isotopes relative to mass fractionation. Contrary to some assertions, Si isotope ratios can be measured with high accuracy and precision using 193 nm excimer lasers with nanosecond pulse widths (Shahar and Young, 2007). Silicon isotope ratios in CAIs measured by 193 nm LA-MC-ICPMS have been combined with Mg isotope ratios to constrain the astrophysical environments in which these oldest solar system materials formed. Accuracy of the measurements was determined using gravimetric standards of various matrix compositions. The results establish that matrix effects for Si are below detection at the ± 0.2 ‰ precision of the laser ablation technique. High mass resolving power (m/Δ m ~ 9000) is necessary to obtain accurate Si isotope ratios by laser ablation. High-precision LA-MC-ICPMS measurements of 176Hf/177Hf in zircons can be obtained by normalizing to 179Hf/177Hf assuming an exponential fractionation law and no mass-dependent Hf, Lu, or Yb stable isotope fractionation. With corrections for interfering 176Lu and 176Yb precision for this method can be on the order of 0.3 epsilon (0.03 ‰). The approach has been used to infer the existence of continental crust on Earth 4.4 billion years before present (Harrison et al., 2005).
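
    The process dependence of the three-isotope exponent can be made explicit. In the parameterization commonly used for mass fractionation laws (stated here as background, since the abstract does not write the law out), measured and true ratios are related by a power law whose exponent differs between the equilibrium and kinetic limits:

    ```latex
    \frac{(n_2/n_1)_{\mathrm{meas}}}{(n_2/n_1)_{\mathrm{true}}}
      = \left[\frac{(n_3/n_1)_{\mathrm{meas}}}{(n_3/n_1)_{\mathrm{true}}}\right]^{\beta},
    \qquad
    \beta_{\mathrm{eq}} = \frac{1/m_1 - 1/m_2}{1/m_1 - 1/m_3},
    \qquad
    \beta_{\mathrm{kin}} = \frac{\ln(m_1/m_2)}{\ln(m_1/m_3)} .
    ```

    For Mg (isotopic masses ≈ 23.985, 24.986, 25.983 u) these expressions give β_eq ≈ 0.521 and β_kin ≈ 0.511, consistent with the range of values quoted above.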

  8. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tesselation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
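
    For reference, the Rényi divergence of order α between discrete distributions P and Q is defined as below; how the localisation data are mapped onto P and Q (e.g., binned localisation densities as a function of length scale) is not specified in the abstract.

    ```latex
    D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
      \log \sum_{i} p_i^{\alpha}\, q_i^{\,1-\alpha},
    \qquad \alpha > 0,\ \alpha \neq 1,
    ```

    which reduces to the Kullback-Leibler divergence in the limit α → 1.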

  9. A precise goniometer/tensiometer using a low cost single-board computer

    NASA Astrophysics Data System (ADS)

    Favier, Benoit; Chamakos, Nikolaos T.; Papathanasiou, Athanasios G.

    2017-12-01

    Measuring the surface tension and the Young contact angle of a droplet is extremely important for many industrial applications. Here, considering the booming interest for small and cheap but precise experimental instruments, we have constructed a low-cost contact angle goniometer/tensiometer, based on a single-board computer (Raspberry Pi). The device runs an axisymmetric drop shape analysis (ADSA) algorithm written in Python. The code, here named DropToolKit, was developed in-house. We initially present the mathematical framework of our algorithm and then we validate our software tool against other well-established ADSA packages, including the commercial ramé-hart DROPimage Advanced as well as the DropAnalysis plugin in ImageJ. After successfully testing for various combinations of liquids and solid surfaces, we concluded that our prototype device would be highly beneficial for industrial applications as well as for scientific research in wetting phenomena compared to the commercial solutions.
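
    ADSA methods of this kind fit the imaged drop profile to the axisymmetric Young-Laplace equation; a standard form (given here as background, not necessarily the exact formulation used in DropToolKit) is

    ```latex
    \gamma\left(\frac{1}{R_1} + \frac{1}{R_2}\right) \;=\; \frac{2\gamma}{R_0} \;+\; \Delta\rho\, g\, z,
    ```

    where γ is the surface tension, R₁ and R₂ are the principal radii of curvature at a profile point a height z from the apex, R₀ is the apex radius of curvature, Δρ is the density difference between the drop and the surrounding medium, and g is the gravitational acceleration; γ and the contact angle follow from the best-fit theoretical profile.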

  10. WASP-47 and the Origin of Hot Jupiters

    NASA Astrophysics Data System (ADS)

    Vanderburg, Andrew; Becker, Juliette; Latham, David W.; Adams, Fred; Bryan, Marta; Buchhave, Lars; Haywood, Raphaelle; Khain, Tali; Lopez, Eric; Malavolta, Luca; Mortier, Annelies; HARPS-N Consortium

    2018-01-01

    WASP-47 b is a transiting hot Jupiter in a system with two additional short-period transiting planets and a long-period outer Jovian companion. WASP-47 b is the only known hot Jupiter with such close-in companions and therefore may hold clues to the origins of hot Jupiter systems. We report on precise radial velocity observations of WASP-47 to measure planet masses and determine their orbits to high precision. Using these improved masses and orbital elements, we perform a dynamical analysis to constrain the inclination of the outer planet, which we find likely orbits near the same plane as the inner transiting system. A similar dynamical analysis for five other hot Jupiter systems with long-period companions around cool host stars (Teff < 6200 K) shows that these outer companions likely also orbit close to the plane of the hot Jupiters. These constraints disfavor hot Jupiter models involving strong dynamical interactions like Kozai-Lidov migration.

  11. Attaining the Photometric Precision Required by Future Dark Energy Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stubbs, Christopher

    2013-01-21

    This report outlines our progress towards achieving the high-precision astronomical measurements needed to derive improved constraints on the nature of the Dark Energy. Our approach to obtaining higher precision flux measurements has three basic components: 1) determination of the optical transmission of the atmosphere; 2) mapping out the instrumental photon sensitivity function vs. wavelength, calibrated by referencing the measurements to the known sensitivity curve of a high precision silicon photodiode; and 3) using the self-consistency of the spectrum of stars to achieve precise color calibrations.

  12. Sensitivity, accuracy, and precision issues in opto-electronic holography based on fiber optics and high-spatial- and high-digital-resolution cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Yokum, Jeffrey S.; Pryputniewicz, Ryszard J.

    2002-06-01

    Sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography based on fiber optics and high-spatial- and high-digital-resolution cameras, are discussed in this paper. It is shown that sensitivity, accuracy, and precision depend on both the effective determination of optical phase and the effective characterization of the illumination-observation conditions. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gages, demonstrating the applicability of quantitative optical metrology techniques to satisfy constantly increasing needs for the study and development of emerging technologies.

  13. Microfluidic chip-based technologies: emerging platforms for cancer diagnosis

    PubMed Central

    2013-01-01

    The development of early and personalized diagnostic protocols is considered the most promising avenue to decrease mortality from cancer and improve outcomes. Emerging microfluidic-based analysis platforms hold high promise of delivering high-throughput, high-precision screening with reduced equipment cost and short analysis time compared to traditional bulky counterparts in bench-top laboratories. This article reviews the potential applications of microfluidic technologies for detection and monitoring of cancer through nucleic acid and protein biomarker analysis. The implications of the technologies in cancer cytology that can provide functional personalized diagnosis are highlighted. Finally, the future niches for using microfluidic-based systems in tumor screening are briefly discussed. PMID:24070124

  14. High-precision processing and detection of the high-caliber off-axis aspheric mirror

    NASA Astrophysics Data System (ADS)

    Dai, Chen; Li, Ang; Xu, Lingdi; Zhang, Yingjie

    2017-10-01

    To achieve efficient, controllable, digital processing and high-precision testing of a high-caliber off-axis aspheric mirror, meeting the development needs of modern high-resolution, wide-field space optical remote sensing cameras, we carried out research on high-precision machining and testing technology for off-axis aspheric mirrors. We first formed an off-axis aspheric sample with dimensions of 574 mm × 302 mm by milling, and then used intelligent robot equipment for high-precision polishing of the off-axis asphere. After fine polishing with ion-beam equipment, the surface of the sample was measured with off-axis aspheric contact contour detection technology and off-axis aspheric surface interference detection technology. The final surface accuracy is RMS 12 nm.

  15. Application of high-precision two-way ranging to Galileo Earth-1 encounter navigation

    NASA Technical Reports Server (NTRS)

    Pollmeier, V. M.; Thurman, S. W.

    1992-01-01

    The application of precision two-way ranging to orbit determination with relatively short data arcs is investigated for the Galileo spacecraft's approach to its first Earth encounter (December 8, 1990). Analysis of previous S-band (2.3-GHz) ranging data acquired from Galileo indicated that under good signal conditions submeter precision and 10-m ranging accuracy were achieved. It is shown that ranging data of sufficient accuracy, when acquired from multiple stations, can sense the geocentric angular position of a distant spacecraft. A range data filtering technique, in which explicit modeling of range measurement bias parameters for each station pass is utilized, is shown to largely remove the systematic ground system calibration errors and transmission media effects from the Galileo range measurements, which would otherwise corrupt the angle-finding capabilities of the data. The accuracy of the Galileo orbit solutions obtained with S-band Doppler and precision ranging were found to be consistent with simple theoretical calculations, which predicted that angular accuracies of 0.26-0.34 microrad were achievable. In addition, the navigation accuracy achieved with precision ranging was marginally better than that obtained using delta-differenced one-way range (delta DOR), the principal data type that was previously used to obtain spacecraft angular position measurements operationally.

  16. Analysis of High Precision GPS Time Series and Strain Rates for the Geothermal Play Fairway Analysis of Washington State Prospects Project

    DOE Data Explorer

    Michael Swyer

    2015-02-22

    Global Positioning System (GPS) time series were obtained from the National Science Foundation (NSF) EarthScope Plate Boundary Observatory (PBO) and Central Washington University's Pacific Northwest Geodetic Array (PANGA). GPS station velocities were used to infer strain rates using the 'splines in tension' method. Strain rates were derived separately for subduction-zone locking at depth and for block rotation near the surface within crustal block boundaries.

  17. On the role of differenced phase-delays in high-precision wide-field multi-source astrometry

    NASA Astrophysics Data System (ADS)

    Martí-Vidal, I.; Marcaide, J. M.; Guirado, J. C.

    2007-07-01

    Phase-delay is, by far, the most precise observable used in interferometry. In typical very-long-baseline-interferometry (VLBI) observations, the uncertainties of the phase-delays can be about 100 times smaller than those of the group delays. However, the phase-delays have an important handicap: they are ambiguous, since they are computed from the relative phases of the signals of the different antennas, and an indeterminate number of complete 2π cycles can be added to those phases leaving them unchanged. There are different approaches to solve the ambiguity problem of the phase delays (Shapiro et al., 1979; Beasley & Conway, 1995), but none of them has ever been used in observations involving more than 2-3 sources. In this contribution, we report on the first wide-field multi-source astrometric analysis performed on a complete set of radio sources using the phase-delay observable. The target of our analysis is the S5 polar cap sample, consisting of 13 bright ICRF sources near the North Celestial Pole. We have developed new algorithms and updated existing software to correct, in an automatic way, the ambiguities of the phase-delays and, therefore, perform a phase-delay astrometric analysis of all the sources in the sample. We also discuss the impact of the use of phase-delays on the astrometric precision.
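
    The ambiguity described above can be written compactly: at observing frequency ν, the phase delay recovered from a fringe phase φ (known only modulo 2π) is

    ```latex
    \tau_{\phi} \;=\; \frac{\phi + 2\pi N}{2\pi\,\nu}, \qquad N \in \mathbb{Z},
    ```

    so each unknown integer N shifts the delay by a full cycle 1/ν, and the automatic ambiguity-correction algorithms mentioned above amount to estimating N consistently across sources, antennas, and time.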

  18. The Laser Ranging Experiment of the Lunar Reconnaissance Orbiter: Five Years of Operations and Data Analysis

    NASA Technical Reports Server (NTRS)

    Mao, Dandan; McGarry, Jan F.; Mazarico, Erwan; Neumann, Gregory A.; Sun, Xiaoli; Torrence, Mark H.; Zagwodzki, Thomas W.; Rowlands, David D.; Hoffman, Evan D.; Horvath, Julie E.

    2016-01-01

    We describe the results of the Laser Ranging (LR) experiment carried out from June 2009 to September 2014 in order to make one-way time-of-flight measurements of laser pulses between Earth-based laser ranging stations and the Lunar Reconnaissance Orbiter (LRO) orbiting the Moon. Over 4,000 hours of successful LR data were obtained from 10 international ground stations. The 20-30 centimeter precision of the full-rate LR data is further improved to 5-10 centimeters after conversion into normal points. The main purpose of LR is to utilize the high accuracy normal point data to improve the quality of the LRO orbits, which are nominally determined by the radiometric S-band tracking data. When independently used in the LRO precision orbit determination process with the high-resolution GRAIL (Gravity Recovery and Interior Laboratory) gravity model, LR data provide good orbit solutions, with an average difference of approximately 50 meters in total position, and approximately 20 centimeters in the radial direction, compared to the definitive LRO trajectory. When used in combination with the S-band tracking data, LR data help to improve the orbit accuracy in the radial direction to approximately 15 centimeters. In order to obtain highly accurate LR range measurements for precise orbit determination results, it is critical to closely model the behavior of the clocks both at the ground stations and on the spacecraft. LR provides a unique data set to calibrate the spacecraft clock. The LRO spacecraft clock is characterized by the LR data to a timing knowledge of 0.015 milliseconds over the entire 5 years of LR operation. We here present both the engineering setup of the LR experiments and the detailed analysis results of the LR data.

  19. Use of single-representative reverse-engineered surface-models for RSA does not affect measurement accuracy and precision.

    PubMed

    Seehaus, Frank; Schwarze, Michael; Flörkemeier, Thilo; von Lewinski, Gabriela; Kaptein, Bart L; Jakubowitz, Eike; Hurschler, Christof

    2016-05-01

    Implant migration can be accurately quantified by model-based Roentgen stereophotogrammetric analysis (RSA), using an implant surface model to locate the implant relative to the bone. In a clinical situation, a single reverse engineering (RE) model for each implant type and size is used. It is unclear to what extent the accuracy and precision of migration measurement is affected by implant manufacturing variability unaccounted for by a single representative model. Individual RE models were generated for five short-stem hip implants of the same type and size. Two phantom analyses and one clinical analysis were performed: "Accuracy-matched models": one stem was assessed, and the results from the original RE model were compared with randomly selected models. "Accuracy-random model": each of the five stems was assessed and analyzed using one randomly selected RE model. "Precision-clinical setting": implant migration was calculated for eight patients, and all five available RE models were applied to each case. For the two phantom experiments, the 95%CI of the bias ranged from -0.28 mm to 0.30 mm for translation and -2.3° to 2.5° for rotation. In the clinical setting, precision is less than 0.5 mm and 1.2° for translation and rotation, respectively, except for rotations about the proximodistal axis (<4.1°). High accuracy and precision of model-based RSA can be achieved and are not biased by using a single representative RE model. At least for implants similar in shape to the investigated short-stem, individual models are not necessary. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:903-910, 2016.

  20. Determination of the Kinematics of the Qweak Experiment and Investigation of an Atomic Hydrogen Moller Polarimeter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Valerie M.

    The Q weak experiment has tested the Standard Model by making a precise measurement of the weak charge of the proton (Q_W^p). This was done by measuring the parity-violating asymmetry for polarized electrons scattering off unpolarized protons. The measured parity-violating asymmetry is directly proportional to the four-momentum transfer (Q²) from the electron to the proton. The extraction of Q_W^p from the measured asymmetry therefore requires a precise Q² determination. The Q weak experiment had Q² = 24.8 ± 0.1 m(GeV)², which achieved the goal of an uncertainty of ≤ 0.5%. From the measured asymmetry and Q², Q_W^p was determined to be 0.0719 ± 0.0045, which is in good agreement with the Standard Model prediction. This puts a 7.5 TeV lower limit on possible "new physics". This dissertation describes the analysis of Q² for the Q weak experiment. Future parity-violating electron scattering experiments similar to the Q weak experiment will measure asymmetries to high precision in order to test the Standard Model. These measurements will require the beam polarization to be measured to sub-0.5% precision. Presently the electron beam polarization is measured through Moller scattering off a ferromagnetic foil or through Compton scattering, both of which can have issues reaching this precision. A novel Atomic Hydrogen Moller Polarimeter has been proposed as a non-invasive way to measure the polarization of an electron beam via Moller scattering off polarized monatomic hydrogen gas. This dissertation also describes the development and initial analysis of a Monte Carlo simulation of an Atomic Hydrogen Moller Polarimeter.
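
    The proportionality between the measured asymmetry and Q² can be made explicit. In the form commonly quoted in the Q weak literature (reproduced here as background, with B(Q², θ) collecting hadronic form-factor contributions), the tree-level parity-violating asymmetry is

    ```latex
    A_{\mathrm{PV}} \;=\; \frac{-\,G_F\, Q^{2}}{4\sqrt{2}\,\pi\,\alpha}\,
      \bigl[\, Q_W^{p} \;+\; Q^{2}\, B(Q^{2}, \theta) \,\bigr],
    ```

    where G_F is the Fermi constant and α the fine-structure constant, which is why the sub-percent Q² determination described in this dissertation feeds directly into the uncertainty on Q_W^p.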

  1. High-throughput analysis of amphetamines in blood and urine with online solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    PubMed

    Fernández, María del Mar Ramírez; Wille, Sarah M R; Samyn, Nele; Wood, Michelle; López-Rivadulla, Manuel; De Boeck, Gert

    2009-01-01

    An automated online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of amphetamines in blood and urine was developed and validated. Chromatographic separation was achieved on a Nucleodur Sphinx RP column with an LC gradient (a mixture of 10 mM ammonium formate buffer and acetonitrile), ensuring the elution of amphetamine, methamphetamine, MDMA, MDA, MDEA, PMA, and ephedrine within 11 min. The method was fully validated, according to international guidelines, using only 100 and 50 microL of blood and urine, respectively. The method showed excellent intra- and inter-assay precision (relative standard deviation < 11.2% and bias < 13%) for two external quality control samples (QC) for both matrices and three and two 'in house' QCs for blood and urine, respectively. Responses were linear over the investigated range (r² > 0.99, 2.5-400 microg/L for blood and 25-1000 microg/L for urine). Limits of quantification were determined to be 2.5 and 25 microg/L for blood and urine, respectively. Limits of detection ranged from 0.05 to 0.5 microg/L for blood and 0.25 to 2.5 microg/L for urine, depending on the compound. Furthermore, the analytes and the processed samples were demonstrated to be stable (in the autosampler for at least 72 h and after three freeze/thaw cycles), and no disturbing matrix effects were observed for all compounds. Moreover, no carryover was observed after the analysis of high concentration samples (15,000 microg/L). The method was subsequently applied to authentic blood and urine samples obtained from forensic cases, which covered a broad range of concentrations. The validation results and actual sample analyses demonstrated that this method is rugged, precise, accurate, and well-suited for routine analysis, as more than 72 samples are analyzed non-stop in 24 h with minimum sample handling. The combination of the high-throughput online SPE and the well-known sensitivity and selectivity assured by MS-MS resulted in the elimination of the bottleneck associated with the sample preparation requirements and provided increased sensitivity, accuracy, and precision.

  2. Long-term impact of precision agriculture on a farmer’s field

    USDA-ARS?s Scientific Manuscript database

    Targeting management practices and inputs with precision agriculture has high potential to meet some of the grand challenges of sustainability in the coming century. Although potential is high, few studies have documented long-term effects of precision agriculture on crop production and environmenta...

  3. Real-Time PPP Based on the Coupling Estimation of Clock Bias and Orbit Error with Broadcast Ephemeris.

    PubMed

    Pan, Shuguo; Chen, Weirong; Jin, Xiaodong; Shi, Xiaofei; He, Fan

    2015-07-22

    Satellite orbit error and clock bias are the keys to precise point positioning (PPP). The traditional PPP algorithm requires precise satellite products based on worldwide permanent reference stations. Such an algorithm requires considerable work and hardly achieves real-time performance. However, real-time positioning service will be the dominant mode in the future. IGS is providing such an operational service (RTS) and there are also commercial systems like Trimble RTX in operation. On the basis of the regional Continuous Operational Reference System (CORS), a real-time PPP algorithm is proposed to apply the coupling estimation of clock bias and orbit error. The projection of orbit error onto the satellite-receiver range has the same effects on positioning accuracy with clock bias. Therefore, in satellite clock estimation, part of the orbit error can be absorbed by the clock bias and the effects of residual orbit error on positioning accuracy can be weakened by the evenly distributed satellite geometry. In consideration of the simple structure of pseudorange equations and the high precision of carrier-phase equations, the clock bias estimation method coupled with orbit error is also improved. Rovers obtain PPP results by receiving broadcast ephemeris and real-time satellite clock bias coupled with orbit error. By applying the proposed algorithm, the precise orbit products provided by GNSS analysis centers are rendered no longer necessary. On the basis of previous theoretical analysis, a real-time PPP system was developed. Some experiments were then designed to verify this algorithm. Experimental results show that the newly proposed approach performs better than the traditional PPP based on International GNSS Service (IGS) real-time products. The positioning accuracies of the rovers inside and outside the network are improved by 38.8% and 36.1%, respectively. The PPP convergence speeds are improved by up to 61.4% and 65.9%. The new approach can change the traditional PPP mode because of its advantages of independence, high positioning precision, and real-time performance. It could be an alternative solution for regional positioning service before global PPP service comes into operation.
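
    A simplified single-satellite sketch of the coupling idea (the paper's actual parameterization may differ) is shown below: with pseudorange P and carrier phase Φ written in the usual way, the line-of-sight projection of the broadcast orbit error can be folded into an effective satellite clock term without changing the form of the observation equations.

    ```latex
    \begin{aligned}
    P    &= \rho + c\,(dt_r - dt^{s}) + T + I + \delta\rho_{\mathrm{orb}} + \varepsilon_P, \\
    \Phi &= \rho + c\,(dt_r - dt^{s}) + T - I + \delta\rho_{\mathrm{orb}} + \lambda N + \varepsilon_\Phi, \\
    c\,\widetilde{dt^{s}} &\;\equiv\; c\,dt^{s} - \delta\rho_{\mathrm{orb}},
    \end{aligned}
    ```

    where ρ is the geometric range, dt_r and dt^s the receiver and satellite clock biases, T and I the tropospheric and ionospheric delays, λN the carrier ambiguity, and δρ_orb the projection of the orbit error onto the satellite-receiver direction. Broadcasting the coupled clock term lets rovers absorb the radial part of the orbit error, while the residual, geometry-dependent part is attenuated by an evenly distributed satellite geometry, as described above.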

  4. Real-Time PPP Based on the Coupling Estimation of Clock Bias and Orbit Error with Broadcast Ephemeris

    PubMed Central

    Pan, Shuguo; Chen, Weirong; Jin, Xiaodong; Shi, Xiaofei; He, Fan

    2015-01-01

    Satellite orbit error and clock bias are the keys to precise point positioning (PPP). The traditional PPP algorithm requires precise satellite products based on worldwide permanent reference stations. Such an algorithm requires considerable work and hardly achieves real-time performance. However, real-time positioning service will be the dominant mode in the future. IGS is providing such an operational service (RTS) and there are also commercial systems like Trimble RTX in operation. On the basis of the regional Continuous Operational Reference System (CORS), a real-time PPP algorithm is proposed to apply the coupling estimation of clock bias and orbit error. The projection of orbit error onto the satellite-receiver range has the same effects on positioning accuracy with clock bias. Therefore, in satellite clock estimation, part of the orbit error can be absorbed by the clock bias and the effects of residual orbit error on positioning accuracy can be weakened by the evenly distributed satellite geometry. In consideration of the simple structure of pseudorange equations and the high precision of carrier-phase equations, the clock bias estimation method coupled with orbit error is also improved. Rovers obtain PPP results by receiving broadcast ephemeris and real-time satellite clock bias coupled with orbit error. By applying the proposed algorithm, the precise orbit products provided by GNSS analysis centers are rendered no longer necessary. On the basis of previous theoretical analysis, a real-time PPP system was developed. Some experiments were then designed to verify this algorithm. Experimental results show that the newly proposed approach performs better than the traditional PPP based on International GNSS Service (IGS) real-time products. The positioning accuracies of the rovers inside and outside the network are improved by 38.8% and 36.1%, respectively. The PPP convergence speeds are improved by up to 61.4% and 65.9%. The new approach can change the traditional PPP mode because of its advantages of independence, high positioning precision, and real-time performance. It could be an alternative solution for regional positioning service before global PPP service comes into operation. PMID:26205276

  5. Use of handheld X-ray fluorescence as a non-invasive method to distinguish between Asian and African elephant tusks

    PubMed Central

    Buddhachat, Kittisak; Thitaram, Chatchote; Brown, Janine L.; Klinhom, Sarisa; Bansiddhi, Pakkanut; Penchart, Kitichaya; Ouitavon, Kanita; Sriaksorn, Khanittha; Pa-in, Chalermpol; Kanchanasaka, Budsabong; Somgird, Chaleamchat; Nganvongpanit, Korakot

    2016-01-01

    We describe the use of handheld X-ray fluorescence for elephant tusk species identification. Asian (n = 72) and African (n = 85) elephant tusks were scanned, and we utilized the species differences in elemental composition to develop a functional model differentiating between species with high precision. Spatially, the majority of measured elements (n = 26) exhibited a homogeneous distribution in cross-section, but a more heterogeneous pattern in the longitudinal direction. Twenty-one of twenty-four elements differed between Asian and African samples. Data were subjected to hierarchical cluster analysis followed by a stepwise discriminant analysis, which identified elements for the functional equation. The best equation consisted of ratios of Si, S, Cl, Ti, Mn, Ag, Sb and W, with Zr as the denominator. Next, Bayesian binary regression model analysis was conducted to predict the probability that a tusk would be of African origin. A cut-off value was established to improve discrimination. This Bayesian hybrid classification model was then validated by scanning an additional 30 Asian and 41 African tusks, which showed high accuracy (94%) and precision (95%) rates. We conclude that handheld XRF is an accurate, non-invasive method to discriminate the origin of elephant tusks and provides rapid results applicable to use in the field. PMID:27097717

  6. Use of handheld X-ray fluorescence as a non-invasive method to distinguish between Asian and African elephant tusks

    NASA Astrophysics Data System (ADS)

    Buddhachat, Kittisak; Thitaram, Chatchote; Brown, Janine L.; Klinhom, Sarisa; Bansiddhi, Pakkanut; Penchart, Kitichaya; Ouitavon, Kanita; Sriaksorn, Khanittha; Pa-in, Chalermpol; Kanchanasaka, Budsabong; Somgird, Chaleamchat; Nganvongpanit, Korakot

    2016-04-01

    We describe the use of handheld X-ray fluorescence for elephant tusk species identification. Asian (n = 72) and African (n = 85) elephant tusks were scanned, and we utilized the species differences in elemental composition to develop a functional model differentiating between species with high precision. Spatially, the majority of measured elements (n = 26) exhibited a homogeneous distribution in cross-section, but a more heterogeneous pattern in the longitudinal direction. Twenty-one of twenty-four elements differed between Asian and African samples. Data were subjected to hierarchical cluster analysis followed by a stepwise discriminant analysis, which identified elements for the functional equation. The best equation consisted of ratios of Si, S, Cl, Ti, Mn, Ag, Sb and W, with Zr as the denominator. Next, Bayesian binary regression model analysis was conducted to predict the probability that a tusk would be of African origin. A cut-off value was established to improve discrimination. This Bayesian hybrid classification model was then validated by scanning an additional 30 Asian and 41 African tusks, which showed high accuracy (94%) and precision (95%) rates. We conclude that handheld XRF is an accurate, non-invasive method to discriminate the origin of elephant tusks and provides rapid results applicable to use in the field.

  7. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method.

    PubMed

    Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A

    2018-02-01

    To evaluate a novel methodology using industrial scanners as a reference, and to assess the in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions; further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference scans with the industrial scanner ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken with 3M Impregum Penta Soft, and poured models were digitized with the laboratory scanner 3shape D1000 (D1000). Best-fit alignment of reference-bodies and 3D Compare Analysis was performed. Precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. Accuracy of IOS and IMPR was analyzed using ATOS as reference. Precision of IOS was evaluated through intra-system comparison. Precision of the ATOS reference scanner (mean 0.6 μm) and of D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference in accuracy between two scanner groups: 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference from IOS; however, the deviations of IOS and IMPR were of a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing accuracy of IOS and IMPR in vivo in up to five units bilaterally from the midline. 3M and TRIOS had a higher accuracy than OMNI. IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.

  9. Effect of Correlated Precision Errors on Uncertainty of a Subsonic Venturi Calibration

    NASA Technical Reports Server (NTRS)

    Hudson, S. T.; Bordelon, W. J., Jr.; Coleman, H. W.

    1996-01-01

    An uncertainty analysis performed in conjunction with the calibration of a subsonic venturi for use in a turbine test facility produced some unanticipated results that may have a significant impact in a variety of test situations. Precision uncertainty estimates using the preferred propagation techniques in the applicable American National Standards Institute/American Society of Mechanical Engineers standards were an order of magnitude larger than precision uncertainty estimates calculated directly from a sample of results (discharge coefficient) obtained at the same experimental set point. The differences were attributable to the effect of correlated precision errors, which previously have been considered negligible. An analysis explaining this phenomenon is presented. The article is not meant to document the venturi calibration, but rather to give a real example of results where correlated precision terms are important. The significance of the correlated precision terms could apply to many test situations.
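
    The propagation in question follows the usual Taylor-series expression; writing the result r as a function of J measured variables X_i, the precision (random) uncertainty including covariance terms is

    ```latex
    s_r^{2} \;=\; \sum_{i=1}^{J} \theta_i^{2}\, s_i^{2}
      \;+\; 2\sum_{i=1}^{J-1}\sum_{k=i+1}^{J} \theta_i\,\theta_k\, s_{ik},
    \qquad \theta_i = \frac{\partial r}{\partial X_i},
    ```

    where s_i are the precision indices of the individual measurements and s_ik their covariances; the standard practice of dropping the s_ik terms is exactly the "negligible correlated precision error" assumption whose failure is examined in this article.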

  10. Laser confocal measurement system for curvature radius of lenses based on grating ruler

    NASA Astrophysics Data System (ADS)

    Tian, Jiwei; Wang, Yun; Zhou, Nan; Zhao, Weirui; Zhao, Weiqian

    2015-02-01

    In the modern optical measurement field, the radius of curvature (ROC) is one of the fundamental parameters of an optical lens. Its measurement accuracy directly affects other optical parameters, such as focal length and aberrations, which in turn significantly affect the overall performance of the optical system. To meet the market demand for high-accuracy ROC measurement instruments, we developed a laser confocal radius measurement system based on a grating ruler. The system uses the peak points of the confocal intensity curve to precisely identify the cat's-eye and confocal positions and then measures the distance between these two positions with the grating ruler, thereby achieving high-precision measurement of the ROC. The system has the advantages of high focusing sensitivity and strong immunity to environmental disturbances. Preliminary theoretical analysis and experiments show that the measurement repeatability can reach 0.8 μm, which provides an effective way to measure the ROC accurately.
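
    The measurement principle reduces to a single relation: the radius of curvature equals the axial translation of the test lens between the cat's-eye position (focus on the surface vertex) and the confocal position (focus at the centre of curvature), i.e.

    ```latex
    R \;=\; \left| z_{\mathrm{confocal}} - z_{\mathrm{cat\text{-}eye}} \right|,
    ```

    with both positions identified from the peaks of the confocal axial intensity response and the displacement read from the grating ruler.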

  11. Development of the automated circulating tumor cell recovery system with microcavity array.

    PubMed

    Negishi, Ryo; Hosokawa, Masahito; Nakamura, Seita; Kanbara, Hisashige; Kanetomo, Masafumi; Kikuhara, Yoshihito; Tanaka, Tsuyoshi; Matsunaga, Tadashi; Yoshino, Tomoko

    2015-05-15

    Circulating tumor cells (CTCs) are well recognized as a useful biomarker for cancer diagnosis and a potential target of drug discovery for metastatic cancer. Efficient and precise recovery of extremely low concentrations of CTCs from blood is required to increase detection sensitivity. Here, an automated system equipped with a microcavity array (MCA) was demonstrated for highly efficient and reproducible CTC recovery. The use of the MCA allows selective recovery of cancer cells from whole blood on the basis of differences in size between tumor and blood cells. Intra- and inter-assays revealed that the automated system achieved efficiency and reproducibility equal to the assay performed manually by a well-trained operator. Under the optimized assay workflow, the automated system allows efficient and precise cell recovery for non-small cell lung cancer cells spiked into whole blood. The automated CTC recovery system will contribute to high-throughput analysis in further clinical studies on large cohorts of cancer patients. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Development of High Precision Metal Micro-Electro-Mechanical-Systems Column for Portable Surface Acoustic Wave Gas Chromatograph

    NASA Astrophysics Data System (ADS)

    Iwaya, Takamitsu; Akao, Shingo; Sakamoto, Toshihiro; Tsuji, Toshihiro; Nakaso, Noritaka; Yamanaka, Kazushi

    2012-07-01

    In the field of environmental measurement and security, a portable gas chromatograph (GC) is required for the on-site analysis of multiple hazardous gases. Although the gas separation column has been downsized using micro-electro-mechanical-systems (MEMS) technology, an MEMS column made of silicon and glass still does not have sufficient robustness or a sufficiently low fabrication cost for a portable GC. In this study, we fabricated a robust and inexpensive high-precision metal MEMS column by combining diffusion-bonded etched stainless-steel plates with alignment evaluation using acoustic microscopy. The separation performance was evaluated using a desktop GC with a flame ionization detector, and we achieved separation performance comparable to the best silicon MEMS columns fabricated using a dynamic coating method. As an application, we fabricated a palm-size surface acoustic wave (SAW) GC combining this column with a ball SAW sensor and succeeded in separating and detecting a mixture of volatile organic compounds.

  13. Extraction of impacted mandibular third molars - the effect of osteotomy at two speeds on peripheral bone: a histopathological analysis.

    PubMed

    Siroraj, A Pearlcid; Giri G V V; Ramkumar, Subramaniam; Narasimhan, Malathi

    2016-05-01

    The aim of this study was to find the ideal speed for making a precise osteotomy with minimal damage to the surrounding bone. Thirty-six patients were divided into two groups (n = 18 in each) depending on the speed of the handpiece used for osteotomy (slow = 20,000 rpm; fast = 40,000 rpm). Samples were taken from the peripheral bone and examined histologically to measure the margins of the osteotomy, the amount of debris produced, and the degree of thermal osteonecrosis. The osteotomy made with the high-speed handpiece was better than that made with the low-speed one on all counts. The margins in the high-speed group were almost exactly as required, with less debris and no thermal necrosis, which illustrates the efficacy of a high-speed osteotomy. These findings can apply to other procedures that involve osteotomies in maxillofacial surgery. Copyright © 2016 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  14. High-throughput quantitative analysis by desorption electrospray ionization mass spectrometry.

    PubMed

    Manicke, Nicholas E; Kistler, Thomas; Ifa, Demian R; Cooks, R Graham; Ouyang, Zheng

    2009-02-01

    A newly developed high-throughput desorption electrospray ionization (DESI) source was characterized in terms of its performance in quantitative analysis. A 96-sample array, containing pharmaceuticals in various matrices, was analyzed in a single run with a total analysis time of 3 min. These solution-phase samples were examined from a hydrophobic PTFE ink printed on glass. The quantitative accuracy, precision, and limit of detection (LOD) were characterized. Chemical background-free samples of propranolol (PRN) with PRN-d7 as internal standard (IS) and carbamazepine (CBZ) with CBZ-d10 as IS were examined. So were two other sample sets consisting of PRN/PRN-d7 at varying concentration in a biological milieu of 10% urine or porcine brain total lipid extract, total lipid concentration 250 ng/microL. The background-free samples, examined in a total analysis time of 1.5 s/sample, showed good quantitative accuracy and precision, with a relative error (RE) and relative standard deviation (RSD) generally less than 3% and 5%, respectively. The samples in urine and the lipid extract required a longer analysis time (2.5 s/sample) and showed RSD values of around 10% for the samples in urine and 4% for the lipid extract samples and RE values of less than 3% for both sets. The LOD for PRN and CBZ when analyzed without chemical background was 10 and 30 fmol, respectively. The LOD of PRN increased to 400 fmol analyzed in 10% urine, and 200 fmol when analyzed in the brain lipid extract.

  15. Analysis of precision in chemical oscillators: implications for circadian clocks

    NASA Astrophysics Data System (ADS)

    d'Eysmond, Thomas; De Simone, Alessandro; Naef, Felix

    2013-10-01

    Biochemical reaction networks often exhibit spontaneous self-sustained oscillations. An example is the circadian oscillator that lies at the heart of daily rhythms in behavior and physiology in most organisms including humans. While the period of these oscillators evolved so that it resonates with the 24 h daily environmental cycles, the precision of the oscillator (quantified via the Q factor) is another relevant property of these cell-autonomous oscillators. Since this quantity can be measured in individual cells, it is of interest to better understand how this property behaves across mathematical models of these oscillators. Current theoretical schemes for computing the Q factors show limitations for both high-dimensional models and in the vicinity of Hopf bifurcations. Here, we derive low-noise approximations that lead to numerically stable schemes also in high-dimensional models. In addition, we generalize normal form reductions that are appropriate near Hopf bifurcations. Applying our approximations to two models of circadian clocks, we show that while the low-noise regime is faithfully recapitulated, increasing the level of noise leads to species-dependent precision. We emphasize that subcomponents of the oscillator gradually decouple from the core oscillator as noise increases, which allows us to identify the subnetworks responsible for robust rhythms.
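
    The abstract does not restate how the Q factor is defined; one common convention for noisy self-sustained oscillators (an assumption here, since the paper's definition may differ) quantifies precision from the width of the spectral peak, or equivalently from the decay of the autocorrelation function:

    ```latex
    Q \;=\; \frac{\omega_p}{\Delta\omega} \;=\; \frac{\omega_p\,\tau_c}{2},
    \qquad C(\tau) \;\propto\; e^{-\tau/\tau_c}\,\cos(\omega_p \tau),
    ```

    where ω_p is the peak angular frequency, Δω the full width at half maximum of the spectral peak, and τ_c the correlation time of the oscillation.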

  16. On the use of particle filters for electromagnetic tracking in high dose rate brachytherapy

    NASA Astrophysics Data System (ADS)

    Götz, Th I.; Lahmer, G.; Brandt, T.; Kallis, K.; Strnad, V.; Bert, Ch; Hensel, B.; Tomé, A. M.; Lang, E. W.

    2017-10-01

    Modern radiotherapy of female breast cancers often employs high dose rate brachytherapy, where a radioactive source is moved inside catheters, implanted in the female breast, according to a prescribed treatment plan. Source localization relative to the patient’s anatomy is determined with solenoid sensors whose spatial positions are measured with an electromagnetic tracking system. Precise sensor dwell position determination is of utmost importance to assure irradiation of the cancerous tissue according to the treatment plan. We present a hybrid data analysis system which combines multi-dimensional scaling with particle filters to precisely determine sensor dwell positions in the catheters during subsequent radiation treatment sessions. Both techniques are complemented with empirical mode decomposition for the removal of superimposed breathing artifacts. We show that the hybrid model robustly and reliably determines the spatial positions of all catheters used during the treatment and precisely determines any deviations of actual sensor dwell positions from the treatment plan. The hybrid system only relies on sensor positions measured with an EMT system and relates them to the spatial positions of the implanted catheters as initially determined with a computed x-ray tomography.
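
    A minimal bootstrap particle filter is sketched below to illustrate the generic idea; the one-dimensional random-walk motion model, Gaussian likelihood, and toy data are assumptions for illustration, not the authors' hybrid MDS/particle-filter pipeline.

    ```python
    import numpy as np

    # Minimal bootstrap particle filter for a 1-D tracking toy problem.
    rng = np.random.default_rng(0)

    def particle_filter(measurements, n_particles=500, process_std=0.5, meas_std=1.0):
        particles = rng.normal(0.0, 5.0, n_particles)      # initial spread of position hypotheses
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates = []
        for z in measurements:
            particles += rng.normal(0.0, process_std, n_particles)        # predict (random-walk motion model)
            weights *= np.exp(-0.5 * ((z - particles) / meas_std) ** 2)   # update (Gaussian likelihood)
            weights /= weights.sum()
            estimates.append(np.sum(weights * particles))                 # posterior-mean position estimate
            # Multinomial resampling when the effective sample size collapses.
            if 1.0 / np.sum(weights ** 2) < n_particles / 2:
                idx = rng.choice(n_particles, n_particles, p=weights)
                particles = particles[idx]
                weights = np.full(n_particles, 1.0 / n_particles)
        return np.array(estimates)

    # Toy usage: a sensor moving at constant speed, observed with noise.
    true_pos = np.linspace(0.0, 10.0, 50)
    noisy = true_pos + rng.normal(0.0, 1.0, true_pos.size)
    print(particle_filter(noisy)[-5:])
    ```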

  17. Optogenetic Functional MRI

    PubMed Central

    Lin, Peter; Fang, Zhongnan; Liu, Jia; Lee, Jin Hyung

    2016-01-01

    The investigation of the functional connectivity of precise neural circuits across the entire intact brain can be achieved through optogenetic functional magnetic resonance imaging (ofMRI), which is a novel technique that combines the relatively high spatial resolution of high-field fMRI with the precision of optogenetic stimulation. Fiber optics that enable delivery of specific wavelengths of light deep into the brain in vivo are implanted into regions of interest in order to specifically stimulate targeted cell types that have been genetically induced to express light-sensitive trans-membrane conductance channels, called opsins. fMRI is used to provide a non-invasive method of determining the brain's global dynamic response to optogenetic stimulation of specific neural circuits through measurement of the blood-oxygen-level-dependent (BOLD) signal, which provides an indirect measurement of neuronal activity. This protocol describes the construction of fiber optic implants, the implantation surgeries, the imaging with photostimulation and the data analysis required to successfully perform ofMRI. In summary, the precise stimulation and whole-brain monitoring ability of ofMRI are crucial factors in making ofMRI a powerful tool for the study of the connectomics of the brain in both healthy and diseased states. PMID:27167840

  18. Climate change as an ecosystem architect: implications to rare plant ecology, conservation, and restoration

    Treesearch

    Constance I. Millar

    2003-01-01

    Recent advances in earth system sciences have revealed significant new information relevant to rare plant ecology and conservation. Analysis of climate change at high resolution with new and precise proxies of paleotemperatures reveals a picture over the past two million years of oscillatory climate change operating simultaneously at multiple timescales. Low-frequency...

  19. A High Precision Prediction Model Using Hybrid Grey Dynamic Model

    ERIC Educational Resources Information Center

    Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro

    2008-01-01

    In this paper, we propose a new prediction analysis model which combines the first order one variable Grey differential equation Model (abbreviated as GM(1,1) model) from grey system theory and time series Autoregressive Integrated Moving Average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…
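
    For readers unfamiliar with the grey-model half of the hybrid, the sketch below shows a minimal GM(1,1) fit and forecast as the model is commonly formulated; the ARIMA residual-correction stage of the proposed ARGM(1,1) model is omitted, and the data series is illustrative.

      import numpy as np

      def gm11_forecast(x0, n_ahead=4):
          """Fit a GM(1,1) grey model to a positive series x0 and forecast n_ahead steps."""
          x0 = np.asarray(x0, dtype=float)
          x1 = np.cumsum(x0)                                   # accumulated generating operation (AGO)
          z1 = 0.5 * (x1[1:] + x1[:-1])                        # mean sequence of x1
          B = np.column_stack([-z1, np.ones_like(z1)])
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # developing coefficient a, grey input b
          k = np.arange(1, len(x0) + n_ahead)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time response of the AGO series
          x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat]))  # inverse AGO: back to original scale
          return np.concatenate([[x0[0]], x0_hat])

      series = [132.0, 145.0, 151.0, 163.0, 174.0, 188.0]
      fit = gm11_forecast(series, n_ahead=3)
      print("fitted:  ", np.round(fit[:len(series)], 1))
      print("forecast:", np.round(fit[len(series):], 1))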

  20. Wave processes in the human cardiovascular system: The measuring complex, computing models, and diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.

    2017-03-01

    We describe an integrated approach to the investigation of nonlinear wave processes in the human cardiovascular system, combining high-precision pulse-wave measurement, mathematical methods for processing the empirical data, and direct numerical modeling of hemodynamic processes in an arterial tree.

  1. Finland and Singapore in PISA 2009: Similarities and Differences in Achievements and School Management

    ERIC Educational Resources Information Center

    Soh, Kaycheng

    2014-01-01

    In PISA 2009, Finland and Singapore were both ranked high among the participating nations and have attracted much attention internationally. However, a secondary analysis of the means for Reading achievement shows that the differences are rather small and are attributable to spurious precision. Hence, the two nations should be considered as being on…

  2. A Fixed-Precision Sequential Sampling Plan for the Potato Tuberworm Moth, Phthorimaea operculella Zeller (Lepidoptera: Gelechidae), on Potato Cultivars.

    PubMed

    Shahbi, M; Rajabpour, A

    2017-08-01

    Phthorimaea operculella Zeller is an important pest of potato in Iran. Spatial distribution and fixed-precision sequential sampling for population estimation of the pest on two potato cultivars, Arinda® and Sante®, were studied in two separate potato fields during two growing seasons (2013-2014 and 2014-2015). Spatial distribution was investigated by Taylor's power law and Iwao's patchiness. Results showed that the spatial distribution of eggs and larvae was random. In contrast to Iwao's patchiness, Taylor's power law provided a highly significant relationship between variance and mean density. Therefore, a fixed-precision sequential sampling plan was developed using Green's model at two precision levels, 0.25 and 0.1. The optimum sample size on the Arinda® and Sante® cultivars at the 0.25 precision level ranged from 151 to 813 and 149 to 802 leaves, respectively. At the 0.1 precision level, the sample sizes ranged from 1054 to 5083 and from 1050 to 5100 leaves for the Arinda® and Sante® cultivars, respectively. Therefore, the optimum sample sizes for the cultivars, with different resistance levels, were not significantly different. According to the calculated stop lines, sampling must be continued until the cumulative number of eggs + larvae reaches 15-16 or 96-101 individuals at precision levels of 0.25 or 0.1, respectively. The performance of the sampling plan was validated by resampling analysis using Resampling for Validation of Sampling Plans software. The sampling plan provided in this study can be used to obtain a rapid estimate of the pest density with minimal effort.
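
    The sketch below illustrates the two computational steps described in the abstract: fitting Taylor's power law to mean-variance pairs and deriving Green's fixed-precision stop line from the fitted coefficients. The count data and the resulting coefficients are illustrative, not those of the study; the stop-line expression is Green's (1970) formula as it is commonly written.

      import numpy as np

      # Counts of eggs + larvae per leaf on several sampling occasions (illustrative data).
      counts = [
          [0, 1, 0, 2, 1, 0, 3, 1, 0, 0],
          [2, 3, 1, 4, 2, 5, 3, 2, 1, 4],
          [5, 8, 6, 9, 4, 7, 10, 6, 5, 8],
          [1, 0, 2, 1, 3, 1, 0, 2, 1, 2],
      ]
      means = np.array([np.mean(c) for c in counts])
      variances = np.array([np.var(c, ddof=1) for c in counts])

      # Taylor's power law: s^2 = a * m^b, fitted as log(s^2) = log(a) + b*log(m).
      b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
      a = np.exp(log_a)
      print(f"Taylor's a = {a:.2f}, b = {b:.2f} (b < 1 regular, b = 1 random, b > 1 aggregated)")

      # Green's fixed-precision stop line: cumulative count T_n required after n sample
      # units so that the standard error to mean ratio equals the target precision D.
      def green_stop_line(n, D, a, b):
          return (D ** 2 / a) ** (1.0 / (b - 2.0)) * n ** ((b - 1.0) / (b - 2.0))

      n_units = np.array([50, 150, 400, 800])
      for D in (0.25, 0.10):
          print(f"D = {D}: stop counts at n = {n_units} ->",
                np.round(green_stop_line(n_units, D, a, b), 1))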

  3. Pharmacogenomics and Global Precision Medicine in the Context of Adverse Drug Reactions: Top 10 Opportunities and Challenges for the Next Decade.

    PubMed

    Alessandrini, Marco; Chaudhry, Mamoonah; Dodgen, Tyren M; Pepper, Michael S

    2016-10-01

    In a move indicative of the enthusiastic support of precision medicine, the U.S. President Barack Obama announced the Precision Medicine Initiative in January 2015. The global precision medicine ecosystem is, thus, receiving generous support from the United States ($215 million), and numerous other governments have followed suit. In the context of precision medicine, drug treatment and prediction of its outcomes have been important for nearly six decades in the field of pharmacogenomics. The field offers an elegant solution for minimizing the effects and occurrence of adverse drug reactions (ADRs). The Clinical Pharmacogenetics Implementation Consortium (CPIC) plays an important role in this context, and it aims at specifically guiding the translation of clinically relevant and evidence-based pharmacogenomics research. In this forward-looking analysis, we make particular reference to several of the CPIC guidelines and their role in guiding the treatment of highly relevant diseases, namely cardiovascular disease, major depressive disorder, cancer, and human immunodeficiency virus, with a view to predicting and managing ADRs. In addition, we provide a list of the top 10 crosscutting opportunities and challenges facing the fields of precision medicine and pharmacogenomics, which have broad applicability independent of the drug class involved. Many of these opportunities and challenges pertain to infrastructure, study design, policy, and science culture in the early 21st century. Ultimately, rational pharmacogenomics study design and the acquisition of comprehensive phenotypic data that proportionately match the genomics data should be an imperative as we move forward toward global precision medicine.

  4. Pharmacogenomics and Global Precision Medicine in the Context of Adverse Drug Reactions: Top 10 Opportunities and Challenges for the Next Decade

    PubMed Central

    Alessandrini, Marco; Chaudhry, Mamoonah; Dodgen, Tyren M.

    2016-01-01

    Abstract In a move indicative of the enthusiastic support of precision medicine, the U.S. President Barack Obama announced the Precision Medicine Initiative in January 2015. The global precision medicine ecosystem is, thus, receiving generous support from the United States ($215 million), and numerous other governments have followed suit. In the context of precision medicine, drug treatment and prediction of its outcomes have been important for nearly six decades in the field of pharmacogenomics. The field offers an elegant solution for minimizing the effects and occurrence of adverse drug reactions (ADRs). The Clinical Pharmacogenetics Implementation Consortium (CPIC) plays an important role in this context, and it aims at specifically guiding the translation of clinically relevant and evidence-based pharmacogenomics research. In this forward-looking analysis, we make particular reference to several of the CPIC guidelines and their role in guiding the treatment of highly relevant diseases, namely cardiovascular disease, major depressive disorder, cancer, and human immunodeficiency virus, with a view to predicting and managing ADRs. In addition, we provide a list of the top 10 crosscutting opportunities and challenges facing the fields of precision medicine and pharmacogenomics, which have broad applicability independent of the drug class involved. Many of these opportunities and challenges pertain to infrastructure, study design, policy, and science culture in the early 21st century. Ultimately, rational pharmacogenomics study design and the acquisition of comprehensive phenotypic data that proportionately match the genomics data should be an imperative as we move forward toward global precision medicine. PMID:27643672

  5. Design and algorithm research of high precision airborne infrared touch screen

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Bing; Wang, Shuang-Jie; Fu, Yan; Chen, Zhao-Quan

    2016-10-01

    Infrared touch screens suffer from low precision, touch jitter, and a sharp drop in touch precision when emitting or receiving tubes fail. A high-precision positioning algorithm based on an extended axis is proposed to solve these problems. First, the unimpeded state of the beam between an emitting and a receiving tube is recorded as 0, and the impeded state as 1. Then, an oblique scanning method is used, in which the light of one emitting tube is received by five receiving tubes, and the impeded-state information of all emitting and receiving tubes is collected as a matrix. Finally, the position of the touch object is calculated by arithmetic averaging. The extended-axis positioning algorithm retains high precision when individual infrared tubes fail, with only a slight loss of accuracy. Experimental results show that over 90% of the display area the touch error is less than 0.25D, where D is the distance between adjacent emitting tubes. We conclude that the extended-axis algorithm offers high precision, little degradation when individual infrared tubes fail, and ease of use.
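
    A much-simplified sketch of the arithmetic-averaging step is given below: blocked beams from an oblique scan are collected, and the touch position is estimated as the mean of the blocked-beam midpoints. The beam geometry, the five-receiver fan and all dimensions are idealized assumptions for illustration (a real screen adds an orthogonal bank of tubes for the second axis); this is not the authors' exact extended-axis implementation.

      import numpy as np

      # Idealized one-axis model: emitters along the bottom edge (y = 0), receivers along
      # the top edge (y = height), adjacent-tube spacing D in mm. Oblique scanning pairs
      # emitter i with receivers i-2 .. i+2.
      D, n_tubes, height = 5.0, 32, 60.0
      touch = np.array([52.5, 30.0])                    # simulated touch position (mm)
      blocked = []                                      # (emitter, receiver) pairs read as '1'

      for i in range(n_tubes):
          for j in range(max(0, i - 2), min(n_tubes, i + 3)):
              p0 = np.array([i * D, 0.0])
              p1 = np.array([j * D, height])
              t = np.clip(np.dot(touch - p0, p1 - p0) / np.dot(p1 - p0, p1 - p0), 0.0, 1.0)
              if np.linalg.norm(touch - (p0 + t * (p1 - p0))) < 3.0:   # finger radius ~3 mm
                  blocked.append((i, j))

      # Arithmetic average of the blocked-beam midpoints gives the position estimate.
      mids = np.array([[0.5 * (i + j) * D, 0.5 * height] for i, j in blocked])
      print("estimated touch position:", mids.mean(axis=0), "true:", touch)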

  6. A Lane-Level LBS System for Vehicle Network with High-Precision BDS/GPS Positioning

    PubMed Central

    Guo, Chi; Guo, Wenfei; Cao, Guangyi; Dong, Hongbo

    2015-01-01

    In recent years, research on vehicle network location service has begun to focus on its intelligence and precision. The accuracy of space-time information has become a core factor for vehicle network systems in a mobile environment. However, difficulties persist in vehicle satellite positioning since deficiencies in the provision of high-quality space-time references greatly limit the development and application of vehicle networks. In this paper, we propose a high-precision-based vehicle network location service to solve this problem. The major components of this study include the following: (1) application of wide-area precise positioning technology to the vehicle network system. An adaptive correction message broadcast protocol is designed to satisfy the requirements for large-scale target precise positioning in the mobile Internet environment; (2) development of a concurrence service system with a flexible virtual expansion architecture to guarantee reliable data interaction between vehicles and the background; (3) verification of the positioning precision and service quality in the urban environment. Based on this high-precision positioning service platform, a lane-level location service is designed to solve a typical traffic safety problem. PMID:25755665

  7. Precision Crystal Calorimeters in High Energy Physics

    ScienceCinema

    Ren-Yuan Zhu

    2017-12-09

    Precision crystal calorimeters traditionally play an important role in high energy physics experiments. In the last two decades, they have faced the challenge of maintaining their precision in a hostile radiation environment. This paper reviews the performance of crystal calorimeters constructed for high energy physics experiments and the progress achieved in understanding crystal radiation damage as well as in developing high quality scintillating crystals for particle physics. Potential applications of new-generation scintillating crystals of high density and high light yield, such as LSO and LYSO, in particle physics experiments are also discussed.

  8. Value of Sample Return and High Precision Analyses: Need for A Resource of Compelling Stories, Metaphors and Examples for Public Speakers

    NASA Technical Reports Server (NTRS)

    Allton, J. H.

    2017-01-01

    There is widespread agreement among planetary scientists that much of what we know about the workings of the solar system comes from accurate, high precision measurements on returned samples. Precision is a function of the number of atoms the instrumentation is able to count. Accuracy depends on the calibration or standardization technique. For Genesis, the solar wind sample return mission, acquiring enough atoms to ensure precise SW measurements and then accurately quantifying those measurements were steps known to be non-trivial pre-flight. The difficulty of precise and accurate measurements on returned samples, and why they cannot be made remotely, is not communicated well to the public. In part, this is because "high precision" is abstract and error bars are not very exciting topics. This paper explores ideas for collecting and compiling compelling metaphors and colorful examples as a resource for planetary science public speakers.

  9. Quantitative measurement for the microstructural parameters of nano-precipitates in Al-Mg-Si-Cu alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Kai

    Size, number density and volume fraction of nano-precipitates are important microstructural parameters controlling the strengthening of materials. In this work a widely accessible, convenient, moderately time efficient method with acceptable accuracy and precision has been provided for measurement of volume fraction of nano-precipitates in crystalline materials. The method is based on the traditional but highly accurate technique of measuring foil thickness via convergent beam electron diffraction. A new equation is proposed and verified with the aid of 3-dimensional atom probe (3DAP) analysis, to compensate for the additional error resulting from the hardly distinguishable contrast of short incomplete precipitates cut by the foil surface. The method can be performed on a regular foil specimen with a modern LaB6 or field-emission-gun transmission electron microscope. Precisions around ± 16% have been obtained for precipitate volume fractions of needle-like β″/C and Q precipitates in an aged Al-Mg-Si-Cu alloy. The measured number density is close to that directly obtained using 3DAP analysis, with a misfit of 4.5%, and the estimated precision for number density measurement is about ± 11%. The limitations of the method are also discussed. - Highlights: •A facile method for measuring volume fraction of nano-precipitates based on CBED •An equation to compensate for small invisible precipitates, with 3DAP verification •Precisions around ± 16% for volume fraction and ± 11% for number density.

  10. A rapid and reliable method for Pb isotopic analysis of peat and lichens by laser ablation-quadrupole-inductively coupled plasma-mass spectrometry for biomonitoring and sample screening.

    PubMed

    Kylander, M E; Weiss, D J; Jeffries, T E; Kober, B; Dolgopolova, A; Garcia-Sanchez, R; Coles, B J

    2007-01-16

    An analytical protocol for rapid and reliable laser ablation-quadrupole (LA-Q)- and multi-collector (MC-) inductively coupled plasma-mass spectrometry (ICP-MS) analysis of Pb isotope ratios ((207)Pb/(206)Pb and (208)Pb/(206)Pb) in peats and lichens is developed. This technique is applicable to source tracing atmospheric Pb deposition in biomonitoring studies and sample screening. Reference materials and environmental samples were dry ashed and pressed into pellets for introduction by laser ablation. No binder was used to reduce contamination. LA-MC-ICP-MS internal and external precisions were <1.1% and <0.3%, respectively, on both (207)Pb/(206)Pb and (208)Pb/(206)Pb ratios. LA-Q-ICP-MS internal precisions on (207)Pb/(206)Pb and (208)Pb/(206)Pb ratios were lower with values for the different sample sets <14.3% while external precisions were <2.9%. The level of external precision acquired in this study is high enough to distinguish between most modern Pb sources. LA-MC-ICP-MS measurements differed from thermal ionisation mass spectrometry (TIMS) values by 1% or less while the accuracy obtained using LA-Q-ICP-MS compared to solution MC-ICP-MS was 3.1% or better using a run bracketing (RB) mass bias correction method. Sample heterogeneity and detector switching when measuring (208)Pb by Q-ICP-MS are identified as sources of reduced analytical performance.

  11. Frontiers of QC Laser spectroscopy for high precision isotope ratio analysis of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Emmenegger, Lukas; Mohn, Joachim; Harris, Eliza; Eyer, Simon; Ibraim, Erkan; Tuzson, Béla

    2016-04-01

    An important milestone for laser spectroscopy was achieved when isotope ratios of greenhouse gases were reported at precision levels that allow addressing research questions in environmental sciences. Real-time data with high temporal resolution at moderate cost and instrument size make the optical approach highly attractive, complementary to the well-established isotope-ratio mass-spectrometry (IRMS) method. Especially appealing, in comparison to IRMS, is the inherent specificity to structural isomers having the same molecular mass. Direct absorption in the MIR in single or dual QCL configuration has proven highly reliable for the stable isotopes of CO2, N2O and CH4. The longest time series of real-time measurements is currently available for δ13C and δ18O in CO2 at the high-alpine station Jungfraujoch. At this well-equipped site, QCL-based direct absorption spectroscopy (QCLAS) measurements have been ongoing since 2008 [1,2]. Applications of QCLAS for N2O and CH4 stable isotopes are considerably more challenging because of the lower atmospheric mixing ratios, especially for the less abundant species, such as N218O and CH3D. For high precision (< 0.1 ‰) measurements in ambient air, QCLAS may be combined with a fully automated preconcentration unit yielding an up to 500 times concentration increase and the capability to separate the target gas from spectral interferants by sequential desorption [3]. Here, we review our recent developments on high precision isotope ratio analysis of greenhouse gases, with special focus on the isotopic species of N2O and CH4. Furthermore, we show environmental applications illustrating the highly valuable information that isotope ratios of atmospheric trace gases can carry. For example, the intramolecular distribution of 15N in N2O gives important information on the geochemical cycle of N2O [4-6], while the analysis of δ13C and δD in CH4 may be applied to disentangle microbial, fossil and landfill sources [7]. References: [1] Sturm, P., Tuzson, B., Henne, S. & Emmenegger, L. Tracking isotopic signatures of CO2 at the high altitude site Jungfraujoch with laser spectroscopy: Analytical improvements and representative results. Atmospheric Measurement Techniques 6, 1659-1671 (2013). [2] Tuzson, B. et al. Continuous isotopic composition measurements of tropospheric CO2 at Jungfraujoch (3580 m a.s.l.), Switzerland: real-time observation of regional pollution events. Atmospheric Chemistry and Physics 11, 1685-1696 (2011). [3] Mohn, J. et al. A liquid nitrogen-free preconcentration unit for measurements of ambient N2O isotopomers by QCLAS. Atmospheric Measurement Techniques 3, 609-618 (2010). [4] Wolf, B. et al. First on-line isotopic characterization of N2O above intensively managed grassland. Biogeosciences 12, 2517-1960 (2015). [5] Harris, E. et al. Nitrous oxide and methane emissions and nitrous oxide isotopic composition from waste incineration in Switzerland. Waste Management 35, 135-140 (2015). [6] Harris, E. et al. Isotopic evidence for nitrous oxide production pathways in a partial nitritation-anammox reactor. Water Research 83, 258-270 (2015). [7] Eyer, S. et al. Real-time analysis of δ13C- and δD-CH4 in ambient air with laser spectroscopy: method development and first intercomparison results. Atmos. Meas. Tech. Discuss. 8, 8925-8970 (2015).

  12. Real-time analysis of δ13C- and δD-CH4 by high precision laser spectroscopy

    NASA Astrophysics Data System (ADS)

    Eyer, Simon; Emmenegger, Lukas; Tuzson, Béla; Fischer, Hubertus; Mohn, Joachim

    2014-05-01

    Methane (CH4) is the most important non-CO2 greenhouse gas (GHG), contributing 18% to total radiative forcing. Anthropogenic sources (e.g. ruminants, landfills) contribute 60% to total emissions and led to an increase in its atmospheric mixing ratio from 700 ppb in pre-industrial times to 1819 ± 1 ppb in 2012 [1]. Analysis of the most abundant methane isotopologues 12CH4, 13CH4 and 12CH3D can be used to disentangle the various source/sink processes [2] and to develop target-oriented reduction strategies. High precision isotopic analysis of CH4 can be accomplished by isotope-ratio mass-spectrometry (IRMS) [2] and more recently by mid-infrared laser-based spectroscopic techniques. For high precision measurements in ambient air, however, both techniques rely on preconcentration of the target gas [3]. In an on-going project, we developed a fully-automated, field-deployable CH4 preconcentration unit coupled to a dual quantum cascade laser absorption spectrometer (QCLAS) for real-time analysis of CH4 isotopologues. The core part of the rack-mounted (19 inch) device is a highly-efficient adsorbent trap attached to a motorized linear drive system and enclosed in a vacuum chamber. Thereby, the adsorbent trap can be decoupled from the Stirling cooler during desorption for fast desorption and optimal heat management. A wide variety of adsorbents, including HayeSep D and molecular sieves as well as novel metal-organic frameworks and carbon nanotubes, were characterized regarding their surface area, isosteric enthalpy of adsorption and selectivity for methane over nitrogen. The most promising candidates were tested on the preconcentration device and a preconcentration by a factor > 500 was obtained. Furthermore, analytical interferants (e.g. N2O, CO2) are separated by step-wise desorption of trace gases. A QCL absorption spectrometer previously described by Tuzson et al. (2010) for CH4 flux measurements was modified to obtain a platform for high precision and simultaneous analysis of CH4 isotopologues. The infrared radiation emitted by the two cw-QC laser sources is combined and coupled into a 0.5 L astigmatic multipass absorption cell with an optical path length of 76 m. An Allan variance minimum of the isotope ratio time-series of 0.1‰ for δ13C-CH4 and 0.3‰ for δD-CH4 has been achieved using 300 s integration time. First experiments with the developed analytical technique demonstrate its potential with respect to field-applicability and temporal resolving power. References: [1] WMO, Greenhouse Gas Bulletin No. 9, 2013, WMO GAW, pp. 4. [2] H. Fischer, M. Behrens, M. Bock, U. Richter, J. Schmitt, L. Loulergue, J. Chappellaz, R. Spahni, T. Blunier, M. Leuenberger and T. F. Stocker, Nature, 2008, 452, 864-867. [3] J. Mohn, B. Tuzson, A. Manninen, N. Yoshida, S. Toyoda, W. A. Brand, and L. Emmenegger, Atmos. Meas. Tech., 2012, 5, 1601-1609. [4] Tuzson, B., Hiller, R. V., Zeyer, K., Eugster, W., Neftel, A., Ammann, C., and L. Emmenegger, Atmos. Meas. Tech., 2010, 3, 1519-1531.
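
    The Allan-variance analysis mentioned above is a standard way to find the optimum integration time of such a spectrometer. The sketch below computes a non-overlapping Allan deviation for a synthetic delta time series (white noise plus slow drift); the noise levels are illustrative, and the minimum of the curve plays the role of the 300 s optimum reported here.

      import numpy as np

      def allan_deviation(y, dt, taus):
          """Non-overlapping Allan deviation of a series y sampled every dt seconds."""
          y = np.asarray(y, float)
          out = []
          for tau in taus:
              m = int(round(tau / dt))                       # samples per averaging block
              n_blocks = y.size // m
              if n_blocks < 2:
                  out.append(np.nan)
                  continue
              block_means = y[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
              out.append(np.sqrt(0.5 * np.mean(np.diff(block_means) ** 2)))
          return np.array(out)

      # Synthetic delta time series: white noise (0.8 permil at 1 s) plus a slow drift.
      rng = np.random.default_rng(2)
      dt, n = 1.0, 20000
      y = rng.normal(0.0, 0.8, n) + 1e-4 * np.arange(n)

      taus = [1, 10, 30, 100, 300, 1000]
      for tau, s in zip(taus, allan_deviation(y, dt, taus)):
          print(f"tau = {tau:5d} s   Allan deviation = {s:.3f} permil")
      # White noise averages down as 1/sqrt(tau) until the drift dominates; the minimum
      # of the curve marks the optimum integration time.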

  13. An object-based image analysis approach for aquaculture ponds precise mapping and monitoring: a case study of Tam Giang-Cau Hai Lagoon, Vietnam.

    PubMed

    Virdis, Salvatore Gonario Pasquale

    2014-01-01

    Monitoring and mapping shrimp farms, including their impact on land cover and land use, is critical to the sustainable management and planning of coastal zones. In this work, a methodology was proposed to set up a cost-effective and reproducible procedure that made use of satellite remote sensing, an object-based classification approach, and open-source software for mapping aquaculture areas with high planimetric and thematic accuracy between 2005 and 2008. The analysis focused on two characteristic areas of interest of the Tam Giang-Cau Hai Lagoon (in central Vietnam), which have similar farming systems to other coastal aquaculture worldwide: the first was primarily characterised by what are locally referred to as "low tide" shrimp ponds, which are partially submerged areas; the second by earthed shrimp ponds, locally referred to as "high tide" ponds, which are non-submerged areas on the lagoon coast. The approach was based on the region-growing segmentation of high- and very high-resolution panchromatic images, SPOT5 and Worldview-1, and the unsupervised clustering classifier ISOSEG embedded in the non-commercial SPRING software. The results, the accuracy of which was tested with a field-based aquaculture inventory, showed that in favourable situations (high tide shrimp ponds), the classification provided high rates of accuracy (>95 %) through a fully automatic object-based classification. In unfavourable situations (low tide shrimp ponds), the performance degraded due to the low contrast between the water and the pond embankments. In these situations, the automatic results were improved by manual delineation of the embankments. As expected, Worldview-1 showed better thematic accuracy, and precise maps were produced at scales of up to 1:2,000. However, SPOT5 provided comparable results in terms of the number of correctly classified ponds, but less accurate results in terms of the precision of the mapped features. The procedure also demonstrated a high degree of reproducibility because it was applied to images with different spatial resolutions in an area that, during the investigated period, did not experience significant land cover changes.

  14. Validation of high-throughput single cell analysis methodology.

    PubMed

    Devonshire, Alison S; Baradez, Marc-Olivier; Morley, Gary; Marshall, Damian; Foy, Carole A

    2014-05-01

    High-throughput quantitative polymerase chain reaction (qPCR) approaches enable profiling of multiple genes in single cells, bringing new insights to complex biological processes and offering opportunities for single cell-based monitoring of cancer cells and stem cell-based therapies. However, workflows with well-defined sources of variation are required for clinical diagnostics and testing of tissue-engineered products. In a study of neural stem cell lines, we investigated the performance of lysis, reverse transcription (RT), preamplification (PA), and nanofluidic qPCR steps at the single cell level in terms of efficiency, precision, and limit of detection. We compared protocols using a separate lysis buffer with cell capture directly in RT-PA reagent. The two methods were found to have similar lysis efficiencies, whereas the direct RT-PA approach showed improved precision. Digital PCR was used to relate preamplified template copy numbers to Cq values and reveal where low-quality signals may affect the analysis. We investigated the impact of calibration and data normalization strategies as a means of minimizing the impact of inter-experimental variation on gene expression values and found that both approaches can improve data comparability. This study provides validation and guidance for the application of high-throughput qPCR workflows for gene expression profiling of single cells.
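
    One quantitative step in such workflows is relating Cq values to absolute template copy numbers through a standard curve, with digital PCR providing the copy-number assignments. The sketch below shows that relation and the usual efficiency estimate from the curve slope; all Cq values and copy numbers are illustrative, not data from the study.

      import numpy as np

      # Illustrative calibration: Cq values measured for a dilution series whose absolute
      # copy numbers were assigned by digital PCR.
      copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
      cq     = np.array([33.1, 29.6, 26.2, 22.8, 19.4])

      # Linear fit of Cq against log10(copies): Cq = slope*log10(N) + intercept.
      slope, intercept = np.polyfit(np.log10(copies), cq, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency (1.0 = 100%)
      print(f"slope = {slope:.2f}, intercept = {intercept:.1f}, efficiency = {efficiency:.1%}")

      # Convert an unknown single-cell Cq back to template copies with the same curve.
      def cq_to_copies(cq_value):
          return 10 ** ((cq_value - intercept) / slope)

      print("Cq 27.5 ->", round(cq_to_copies(27.5)), "template copies")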

  15. Design and development aspects of flexure mechanism for high precision application

    NASA Astrophysics Data System (ADS)

    Sollapur, Shrishail B.; Patil, M. S.; Deshmukh, S. P.

    2018-04-01

    Planar XY flexural mechanisms have various applications in precision motion systems. A flexural mechanism generates relative motion between a fixed support and a motion stage using the flexibility of the material. Such mechanisms offer zero backlash, frictionless motion and highly repeatable positioning, and are relatively compact compared with rigid-link mechanisms. A further merit of flexures is that the complete mechanism can be machined from a single monolith. The flexural mechanism is modelled to provide accurate scanning over a comparatively large range at a higher speed. Static analysis of the mechanism is carried out with an FEA tool to determine the static deflection of the motion stage. The mechanism is then actuated with the help of a weight pan and weights, and the resulting displacement is measured with a dial gauge indicator. The experimental set-up consists of the flexural mechanism, dial gauge, weight pan and weights, pulley, string, a small metal strip, an optical breadboard, etc. Finally, the experimental and analytical results are compared and minimal deviation is found.

  16. New Insights into Shape Memory Alloy Bimorph Actuators Formed by Electron Beam Evaporation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Hao; Nykypanchuk, Dmytro

    In order to create shape memory alloy (SMA) bimorph microactuators with high-precision features, a novel fabrication process combining electron beam (E-beam) evaporation, lift-off resist, and isotropic XeF2 dry etching was developed. To examine the effect of the E-beam deposition and annealing process on nitinol (NiTi) characteristics, NiTi thin film samples with different deposition rates and different annealing conditions were investigated. Characterization using scanning electron microscopy and x-ray diffraction indicated that a low E-beam deposition rate and an annealing process employing argon could benefit the formation of the NiTi crystalline structure. In addition, SMA bimorph microactuators with high-precision features as small as 5 microns were successfully fabricated. Furthermore, the thermomechanical performance was experimentally verified and compared with finite element analysis simulation results.

  17. Coupling solid-phase microextraction and high-performance liquid chromatography for direct and sensitive determination of halogenated fungicides in wine.

    PubMed

    Millán, S; Sampedro, M C; Unceta, N; Goicolea, M A; Rodríguez, E; Barrio, R J

    2003-05-02

    A solid-phase microextraction (SPME) method coupled to high-performance liquid chromatography with diode array detection (HPLC-DAD) for the analysis of six organochlorine fungicides (nuarimol, triadimenol, triadimefon, folpet, vinclozolin and penconazole) in wine was developed. For this purpose, polydimethylsiloxane-divinylbenzene-coated fibers were utilized, and all factors affecting the throughput, precision, and accuracy of the SPME method were investigated and optimized. These factors included matrix influence, extraction and desorption time, percentage of ethanol, pH, salt effect and desorption mode. The optimized analytical procedure showed detectability ranging from 4 to 27 microg l(-1) and precision from 2.4 to 14.2% (as intra-day relative standard deviation, RSD) and 4.7-25.7% (as inter-day RSD), depending on the fungicide. The results demonstrate the suitability of the SPME-HPLC-DAD method to analyze these organochlorine fungicides in red wine.

  18. STEREO TRansiting Exoplanet and Stellar Survey (STRESS) - I. Introduction and data pipeline

    NASA Astrophysics Data System (ADS)

    Sangaralingam, Vinothini; Stevens, Ian R.

    2011-12-01

    The Solar TErrestrial RElations Observatory (STEREO) is a system of two identical spacecraft in heliocentric Earth orbit. We use the two heliospheric imagers (HI), which are wide-angle imagers with multibaffle systems, to perform high-precision stellar photometry in order to search for exoplanetary transits and understand stellar variables. The cadence (40 min for HI-1 and 2 h for HI-2), high precision, wide magnitude range (R mag: 4-12) and broad sky coverage (nearly 20 per cent for HI-1A alone and 60 per cent of the sky in the zodiacal region for all instruments combined) of this instrument place it in a region largely unexplored by other current projects. In this paper, we describe the semi-automated pipeline devised for reduction of the data, some of the interesting characteristics of the data obtained and the data-analysis methods used, along with some early results.

  19. Mars Atmospheric Escape Recorded by H, C and O Isotope Ratios in Carbon Dioxide and Water Measured by the Sam Tunable Laser Spectrometer on the Curiosity Rover

    NASA Technical Reports Server (NTRS)

    Webster, C. R.; Mahaffy, P. R.; Leshin, L. A.; Atreya, S. K.; Flesch, G. J.; Stern, J.; Christensen, L. E.; Vasavada, A. R.; Owen, T.; Niles, P. B.; hide

    2013-01-01

    Stable isotope ratios in C, H, N, O and S are powerful indicators of a wide variety of planetary geophysical processes that can identify origin, transport, temperature history, radiation exposure, atmospheric escape, environmental habitability and biological activity [2]. For Mars, measurements to date have indicated enrichment in all the heavier isotopes consistent with atmospheric escape processes, but with uncertainty too high to tie the results with the more precise isotopic ratios achieved from SNC meteoritic analyses. We will present results to date of H, C and O isotope ratios in CO2 and H2O made to high precision (few per mil) using the Tunable Laser Spectrometer (TLS) that is part of the Sample Analysis at Mars (SAM) instrument suite on MSL's Curiosity Rover.

  20. Precision Viticulture from Multitemporal, Multispectral Very High Resolution Satellite Data

    NASA Astrophysics Data System (ADS)

    Kandylakis, Z.; Karantzalos, K.

    2016-06-01

    In order to efficiently exploit very high resolution satellite multispectral data for precision agriculture applications, validated methodologies should be established which link the observed reflectance spectra with certain crop/plant/fruit biophysical and biochemical quality parameters. To this end, based on concurrent satellite and field campaigns during the veraison period, satellite and in-situ data were collected, along with several grape samples, at specific locations during the harvesting period. These data were collected over a period of three years in two viticultural areas in Northern Greece. After the required data pre-processing, canopy reflectance observations, combined into several vegetation indices, were correlated with the quantitative results of the grape/must analysis of the grape samples. Results appear quite promising, indicating that certain key quality parameters (like brix levels, total phenolic content, brix to total acidity, anthocyanin levels) which describe the oenological potential, phenolic composition and chromatic characteristics can be efficiently estimated from the satellite data.
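
    The core computation is straightforward: band reflectances are combined into vegetation indices, which are then correlated with the grape/must measurements at the sampled locations. The sketch below does this for NDVI and brix with illustrative numbers; the study used several indices and several quality parameters.

      import numpy as np
      from scipy.stats import pearsonr

      # Illustrative per-sampling-location values: canopy reflectance in the red and NIR
      # bands (from the satellite image) and brix measured on the co-located grape samples.
      red  = np.array([0.08, 0.10, 0.07, 0.12, 0.09, 0.11, 0.06, 0.13])
      nir  = np.array([0.42, 0.38, 0.45, 0.33, 0.40, 0.36, 0.47, 0.31])
      brix = np.array([23.5, 22.1, 24.2, 20.8, 23.0, 21.5, 24.8, 20.1])

      ndvi = (nir - red) / (nir + red)                 # normalized difference vegetation index
      r, p = pearsonr(ndvi, brix)
      print(f"NDVI range {ndvi.min():.2f}-{ndvi.max():.2f}, Pearson r vs brix = {r:.2f} (p = {p:.3f})")

      # A simple linear calibration would then map NDVI to an estimated brix value per pixel.
      coef = np.polyfit(ndvi, brix, 1)
      print("brix ~ %.1f * NDVI + %.1f" % tuple(coef))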

  1. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
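
    A common way to quantify such matrix effects is the post-extraction spike comparison often attributed to Matuszewski and co-workers, in which peak areas of the same analyte amount are compared in neat solvent, in blank matrix spiked after extraction, and in matrix spiked before extraction. A sketch with illustrative peak areas:

      import numpy as np

      # Peak areas for the same analyte concentration (illustrative values):
      # A - neat solvent standard, B - blank matrix extract spiked after extraction,
      # C - matrix sample spiked before extraction.
      A = np.array([10500, 10320, 10610])
      B = np.array([ 7900,  8100,  7850])
      C = np.array([ 7200,  7400,  7100])

      matrix_effect      = 100.0 * B.mean() / A.mean()   # <100% = ion suppression, >100% = enhancement
      recovery           = 100.0 * C.mean() / B.mean()   # extraction recovery
      process_efficiency = 100.0 * C.mean() / A.mean()   # overall process efficiency

      print(f"matrix effect      = {matrix_effect:.1f}%")
      print(f"recovery           = {recovery:.1f}%")
      print(f"process efficiency = {process_efficiency:.1f}%")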

  2. Oufti: An integrated software package for high-accuracy, high-throughput quantitative microscopy analysis

    PubMed Central

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-01-01

    Summary With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  3. High-Precision Half-Life Measurements for the Superallowed Fermi β+ Emitters 14O and 18Ne

    NASA Astrophysics Data System (ADS)

    Laffoley, A. T.; Andreoiu, C.; Austin, R. A. E.; Ball, G. C.; Bender, P. C.; Bidaman, H.; Bildstein, V.; Blank, B.; Bouzomita, H.; Cross, D. S.; Deng, G.; Diaz Varela, A.; Dunlop, M. R.; Dunlop, R.; Finlay, P.; Garnsworthy, A. B.; Garrett, P.; Giovinazzo, J.; Grinyer, G. F.; Grinyer, J.; Hadinia, B.; Jamieson, D. S.; Jigmeddorj, B.; Ketelhut, S.; Kisliuk, D.; Leach, K. G.; Leslie, J. R.; MacLean, A.; Miller, D.; Mills, B.; Moukaddam, M.; Radich, A. J.; Rajabali, M. M.; Rand, E. T.; Svensson, C. E.; Tardiff, E.; Thomas, J. C.; Turko, J.; Voss, P.; Unsworth, C.

    High-precision half-life measurements, at the level of ±0.04%, for the superallowed Fermi emitters 14O and 18Ne have been performed at TRIUMF's Isotope Separator and Accelerator facility. Using 3 independent detector systems, a gas-proportional counter, a fast plastic scintillator, and a high-purity germanium array, a series of direct β and γ counting measurements were performed for each of the isotopes. In the case of 14O, these measurements were made to help resolve an existing discrepancy between detection methods, whereas for 18Ne the half-life precision has been improved in anticipation of forthcoming high-precision branching ratio measurements.
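
    To illustrate the kind of analysis behind such numbers, the sketch below fits a single simulated decay curve (a 14O-like half-life of about 70.6 s, constant background, Poisson statistics) with weighted least squares. Reaching the quoted ±0.04% requires combining many such counting cycles and careful treatment of dead time and other systematics, none of which is modelled here.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(3)

      # Simulated beta-counting run: 1 s dwell-time bins, exponential decay plus background.
      t = np.arange(0.0, 600.0, 1.0)
      t_half_true, n0_true, bkg_true = 70.6, 5000.0, 20.0
      expected = n0_true * np.exp(-np.log(2) * t / t_half_true) + bkg_true
      counts = rng.poisson(expected)

      def decay(t, n0, t_half, bkg):
          return n0 * np.exp(-np.log(2) * t / t_half) + bkg

      # Weighted least squares with sigma = sqrt(counts) (Poisson approximation).
      popt, pcov = curve_fit(decay, t, counts, p0=(4000.0, 60.0, 10.0),
                             sigma=np.sqrt(counts + 1.0), absolute_sigma=True)
      t_half, dt_half = popt[1], np.sqrt(pcov[1, 1])
      print(f"fitted half-life = {t_half:.2f} +/- {dt_half:.2f} s "
            f"({100 * dt_half / t_half:.3f}%)")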

  4. The Joint Physics Analysis Center: Recent results

    NASA Astrophysics Data System (ADS)

    Fernández-Ramírez, César

    2016-10-01

    We review some of the recent achievements of the Joint Physics Analysis Center, a theoretical collaboration with ties to experimental collaborations, that aims to provide amplitudes suitable for the analysis of the current and forthcoming experimental data on hadron physics. Since its foundation in 2013, the group is focused on hadron spectroscopy in preparation for the forthcoming high statistics and high precision experimental data from BELLEII, BESIII, CLAS12, COMPASS, GlueX, LHCb and (hopefully) PANDA collaborations. So far, we have developed amplitudes for πN scattering, KN scattering, pion and J/ψ photoproduction, two kaon photoproduction and three-body decays of light mesons (η, ω, ϕ). The codes for the amplitudes are available to download from the group web page and can be straightforwardly incorporated to the analysis of the experimental data.

  5. Precision injection molding of freeform optics

    NASA Astrophysics Data System (ADS)

    Fang, Fengzhou; Zhang, Nan; Zhang, Xiaodong

    2016-08-01

    Precision injection molding is the most efficient mass production technology for manufacturing plastic optics. Applications of plastic optics in the fields of imaging, illumination, and concentration demonstrate a variety of complex surface forms, developing from conventional plano and spherical surfaces to aspheric and freeform surfaces. These require high optical quality, with high form accuracy and low residual stresses, which challenges both the machining of optical tool inserts and the precision injection molding process. The present paper reviews recent progress in mold tool machining and precision injection molding, with more emphasis on precision injection molding. The challenges and future development trends are also discussed.

  6. An in-depth evaluation of accuracy and precision in Hg isotopic analysis via pneumatic nebulization and cold vapor generation multi-collector ICP-mass spectrometry.

    PubMed

    Rua-Ibarz, Ana; Bolea-Fernandez, Eduardo; Vanhaecke, Frank

    2016-01-01

    Mercury (Hg) isotopic analysis via multi-collector inductively coupled plasma (ICP)-mass spectrometry (MC-ICP-MS) can provide relevant biogeochemical information by revealing sources, pathways, and sinks of this highly toxic metal. In this work, the capabilities and limitations of two different sample introduction systems, based on pneumatic nebulization (PN) and cold vapor generation (CVG), respectively, were evaluated in the context of Hg isotopic analysis via MC-ICP-MS. The effect of (i) instrument settings and acquisition parameters, (ii) concentration of analyte element (Hg), and internal standard (Tl)-used for mass discrimination correction purposes-and (iii) different mass bias correction approaches on the accuracy and precision of Hg isotope ratio results was evaluated. The extent and stability of mass bias were assessed in a long-term study (18 months, n = 250), demonstrating a precision ≤0.006% relative standard deviation (RSD). CVG-MC-ICP-MS showed an approximately 20-fold enhancement in Hg signal intensity compared with PN-MC-ICP-MS. For CVG-MC-ICP-MS, the mass bias induced by instrumental mass discrimination was accurately corrected for by using either external correction in a sample-standard bracketing approach (SSB) or double correction, consisting of the use of Tl as internal standard in a revised version of the Russell law (Baxter approach), followed by SSB. Concomitant matrix elements did not affect CVG-ICP-MS results. Neither with PN, nor with CVG, any evidence for mass-independent discrimination effects in the instrument was observed within the experimental precision obtained. CVG-MC-ICP-MS was finally used for Hg isotopic analysis of reference materials (RMs) of relevant environmental origin. The isotopic composition of Hg in RMs of marine biological origin testified of mass-independent fractionation that affected the odd-numbered Hg isotopes. While older RMs were used for validation purposes, novel Hg isotopic data are provided for the latest generations of some biological RMs.
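
    For readers unfamiliar with the external correction mentioned above, the sketch below shows sample-standard bracketing (SSB) in its simplest form and the associated delta value. The measured ratios and the accepted value assigned to the bracketing standard are illustrative placeholders, and the Tl-based Baxter correction discussed in the paper is not included.

      import numpy as np

      # Measured 202Hg/198Hg ratios, run as standard - sample - standard (illustrative values).
      R_std_before = 2.9388
      R_sample     = 2.9441
      R_std_after  = 2.9402
      R_std_true   = 2.96304        # accepted ratio assigned to the bracketing standard (illustrative)

      # Sample-standard bracketing: scale the sample ratio by the mean instrumental bias
      # measured on the two adjacent standards.
      bias = 0.5 * (R_std_before + R_std_after) / R_std_true
      R_corr = R_sample / bias
      print(f"mass-bias-corrected ratio = {R_corr:.5f}")

      # Delta notation relative to the bracketing standard, in parts per thousand.
      delta202 = (R_corr / R_std_true - 1.0) * 1000.0
      print(f"delta202Hg = {delta202:+.2f} permil")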

  7. An Improved Method of AGM for High Precision Geolocation of SAR Images

    NASA Astrophysics Data System (ADS)

    Zhou, G.; He, C.; Yue, T.; Huang, W.; Huang, Y.; Li, X.; Chen, Y.

    2018-05-01

    In order to take full advantage of SAR images, it is necessary to obtain high-precision geolocation for them. Precise image geolocation during the geometric correction process ensures the accuracy of the corrected image and allows effective mapping information to be extracted. This paper presents an improved analytical geolocation method (IAGM) that determines the high-precision geolocation of each pixel in a digital SAR image. The method is based on the analytical geolocation method (AGM) proposed by X. K. Yuan, which aims at solving the range-Doppler (RD) model. Tests were conducted using a RADARSAT-2 SAR image. Comparing the predicted feature geolocations with positions determined from a high-precision orthophoto, the results indicate that an accuracy of 50 m is attainable with this method. Error sources are analyzed and some recommendations for improving image location accuracy in future spaceborne SARs are given.

  8. High-precision R-branch transition frequencies in the ν2 fundamental band of H3+

    NASA Astrophysics Data System (ADS)

    Perry, Adam J.; Hodges, James N.; Markus, Charles R.; Kocheril, G. Stephen; McCall, Benjamin J.

    2015-11-01

    The H3+ molecular ion has served as a long-standing benchmark for state-of-the-art ab initio calculations of molecular potentials and variational calculations of rovibrational energy levels. However, the accuracy of such calculations would not have been confirmed if not for the wealth of spectroscopic data that has been made available for this molecule. Recently, a new high-precision ion spectroscopy technique was demonstrated by Hodges et al., which led to the first highly accurate and precise (∼MHz) H3+ transition frequencies. As an extension of this work, we present ten additional R-branch transitions measured to similar precision as a next step toward the ultimate goal of producing a comprehensive high-precision survey of this molecule, from which rovibrational energy levels can be calculated.

  9. New tool for getting data on the field for paleoclimate and paleoceanography data based on isotope 13C and 18O measurements

    NASA Astrophysics Data System (ADS)

    Mandic, M.; Stöbener, N.; Smajgl, D.

    2017-12-01

    For many decades, different instrumental methods involving successive generations of isotope ratio mass spectrometers with different peripheral units for sample preparation have provided the scientifically required high precision and high sample throughput for various applications, from geological and hydrological to food and forensic. With this work we introduce automated measurement of δ13C and δ18O from solid carbonate samples and DIC, and of δ18O of water. We have demonstrated the use of a Thermo Scientific™ Delta Ray™ IRIS with URI Connect on certified reference materials and confirmed the high achievable accuracy and a precision better than 0.1‰ for both δ13C and δ18O, in the laboratory or in the field, with the same precision and sample throughput. With the equilibration method for determination of δ18O in water samples, which we present in this work, the achieved repeatability and accuracy are 0.12‰ and 0.68‰, respectively, which fulfills the requirements of regulatory methods. The preparation of samples for carbonate and DIC analysis on the Delta Ray IRIS with URI Connect is similar to established Gas Bench II methods. Samples are put into vials and phosphoric acid is added. The resulting sample-acid reaction releases CO2 gas, which is then introduced into the Delta Ray IRIS via the Variable Volume. Three international carbonate reference materials (NBS-18, NBS-19 and IAEA-CO-1) were analyzed: NBS-18 and NBS-19 were used as calibration standards, and IAEA-CO-1 was treated as an unknown. For water sample analysis, an equilibration method with 1% CO2 in dry air was used. Test measurements and confirmation of the precision and accuracy of the method for determining δ18O in water samples were carried out with three laboratory standards, namely ANST, OCEAN 2 and HBW. All laboratory standards were previously calibrated against the international reference materials VSMOW2 and SLAP2 to assure the accuracy of the isotopic values. The Principle of Identical Treatment was applied in sample and standard preparation, in the measurement procedure, and in the evaluation of the results.
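
    The two-point calibration against NBS-18 and NBS-19, with IAEA-CO-1 treated as an unknown, can be written in a few lines. The sketch below uses commonly quoted assigned values and invented raw readings purely for illustration; it is not the instrument software's calibration routine.

      import numpy as np

      # Two-point calibration of measured delta13C values against carbonate reference materials.
      # Assigned values are the commonly quoted ones; treat them as illustrative placeholders.
      assigned = {"NBS-19": 1.95, "NBS-18": -5.01}          # delta13C vs VPDB, permil
      measured = {"NBS-19": 1.62, "NBS-18": -5.48}          # raw instrument deltas (illustrative)

      x = np.array([measured["NBS-18"], measured["NBS-19"]])
      y = np.array([assigned["NBS-18"], assigned["NBS-19"]])
      slope, intercept = np.polyfit(x, y, 1)                # delta_true = slope*delta_meas + intercept

      def calibrate(delta_meas):
          return slope * delta_meas + intercept

      # Treat IAEA-CO-1 as an unknown; its accepted value is commonly quoted as about +2.5 permil.
      raw_co1 = 2.15                                        # illustrative raw measurement
      print(f"calibrated IAEA-CO-1 delta13C = {calibrate(raw_co1):+.2f} permil")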

  10. Analytical Method Development and Validation for the Simultaneous Estimation of Abacavir and Lamivudine by Reversed-phase High-performance Liquid Chromatography in Bulk and Tablet Dosage Forms.

    PubMed

    Raees Ahmad, Sufiyan Ahmad; Patil, Lalit; Mohammed Usman, Mohammed Rageeb; Imran, Mohammad; Akhtar, Rashid

    2018-01-01

    A simple, rapid, accurate, precise, and reproducible validated reversed-phase high-performance liquid chromatography (HPLC) method was developed for the determination of Abacavir (ABAC) and Lamivudine (LAMI) in bulk and tablet dosage forms. The quantification was carried out using a Symmetry Premsil C18 (250 mm × 4.6 mm, 5 μm) column run isocratically with a mobile phase comprising methanol:water (0.05% orthophosphoric acid, pH 3) 83:17 v/v, a detection wavelength of 245 nm, an injection volume of 20 μl, and a flow rate of 1 ml/min. In the developed method, the retention times of ABAC and LAMI were found to be 3.5 min and 7.4 min, respectively. The method was validated in terms of linearity, precision, accuracy, limits of detection, limits of quantitation, and robustness in accordance with the International Conference on Harmonization guidelines. The assay of the proposed method was found to be 99%-101%. Recovery studies were also carried out, and the mean % recovery was found to be 99%-101%. The % relative standard deviation for reproducibility was found to be <2%. The proposed method was statistically evaluated and can be applied for routine quality control analysis of ABAC and LAMI in bulk and in tablet dosage form. The developed RP-HPLC method for the simultaneous estimation of Abacavir and Lamivudine was validated according to the ICH guidelines; the linearity, precision, range, and robustness were within the limits specified by the guidelines. Hence, the method was found to be simple, accurate, precise, economical, and reproducible, and it can be used for the routine quality control analysis of Abacavir and Lamivudine in bulk drug as well as in formulations. Abbreviations Used: HPLC: High-performance liquid chromatography, UV: Ultraviolet, ICH: International Conference on Harmonization, ABAC: Abacavir, LAMI: Lamivudine, HIV: Human immunodeficiency virus, AIDS: Acquired immunodeficiency syndrome, NRTI: Nucleoside reverse transcriptase inhibitors, ARV: Antiretroviral, RSD: Relative standard deviation, RT: Retention time, SD: Standard deviation.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreras, Ignacio; Trujillo, Ignacio, E-mail: i.ferreras@ucl.ac.uk

    At the core of the standard cosmological model lies the assumption that the redshift of distant galaxies is independent of photon wavelength. This invariance of cosmological redshift with wavelength is routinely found in all galaxy spectra with a precision of Δz ∼ 10^-4. The combined use of approximately half a million high-quality galaxy spectra from the Sloan Digital Sky Survey (SDSS) allows us to explore this invariance down to a nominal precision in redshift of 10^-6 (statistical). Our analysis is performed over the redshift interval 0.02 < z < 0.25. We use the centroids of spectral lines over the 3700–6800 Å rest-frame optical window. We do not find any difference in redshift between the blue and red sides down to a precision of 10^-6 at z ≲ 0.1 and 10^-5 at 0.1 ≲ z ≲ 0.25 (i.e., at least an order of magnitude better than with single galaxy spectra). This is the first time the wavelength-independence of the (1 + z) redshift law is confirmed over a wide spectral window at this precision level. This result holds independently of the stellar population of the galaxies and their kinematical properties. This result is also robust against wavelength calibration issues. The limited spectral resolution (R ∼ 2000) of the SDSS data, combined with the asymmetric wavelength sampling of the spectral features in the observed restframe due to the (1 + z) stretching of the lines, prevents our methodology from achieving a precision higher than 10^-5 at z > 0.1. Future attempts to constrain this law will require high quality galaxy spectra at higher resolution (R ≳ 10,000).

  12. Double the dates and go for Bayes - Impacts of model choice, dating density and quality on chronologies

    NASA Astrophysics Data System (ADS)

    Blaauw, Maarten; Christen, J. Andrés; Bennett, K. D.; Reimer, Paula J.

    2018-05-01

    Reliable chronologies are essential for most Quaternary studies, but little is known about how age-depth model choice, as well as dating density and quality, affect the precision and accuracy of chronologies. A meta-analysis suggests that most existing late-Quaternary studies contain fewer than one date per millennium, and provide millennial-scale precision at best. We use existing and simulated sediment cores to estimate what dating density and quality are required to obtain accurate chronologies at a desired precision. For many sites, a doubling in dating density would significantly improve chronologies and thus their value for reconstructing and interpreting past environmental changes. Commonly used classical age-depth models stop becoming more precise after a minimum dating density is reached, but the precision of Bayesian age-depth models which take advantage of chronological ordering continues to improve with more dates. Our simulations show that classical age-depth models severely underestimate uncertainty and are inaccurate at low dating densities, and also perform poorly at high dating densities. On the other hand, Bayesian age-depth models provide more realistic precision estimates, including at low to average dating densities, and are much more robust against dating scatter and outliers. Indeed, Bayesian age-depth models outperform classical ones at all tested dating densities, qualities and time-scales. We recommend that chronologies should be produced using Bayesian age-depth models taking into account chronological ordering and based on a minimum of 2 dates per millennium.
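
    As a point of comparison for the classical models discussed above, the sketch below builds a minimal linear-interpolation age-depth model with Monte Carlo sampling of the date uncertainties (rejecting age reversals). Radiocarbon calibration, outlier handling and the accumulation-rate priors of Bayesian models such as Bacon are deliberately left out; the core and its dates are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)

      # Dated depths (cm) with calibrated ages (cal yr BP) and 1-sigma errors (illustrative core).
      depth_dated = np.array([10.0,   85.0,  160.0,  240.0,  310.0])
      age_mean    = np.array([450.0, 2100.0, 4350.0, 6800.0, 9100.0])
      age_sigma   = np.array([ 60.0,   90.0,  110.0,  140.0,  160.0])

      depths_out = np.arange(10.0, 311.0, 10.0)
      n_sim = 5000
      ages_out = np.empty((n_sim, depths_out.size))

      for i in range(n_sim):
          # Sample one age per dated depth; discard reversals to keep ages increasing with depth.
          while True:
              sample = rng.normal(age_mean, age_sigma)
              if np.all(np.diff(sample) > 0):
                  break
          ages_out[i] = np.interp(depths_out, depth_dated, sample)

      median = np.percentile(ages_out, 50, axis=0)
      lo, hi = np.percentile(ages_out, [2.5, 97.5], axis=0)
      for d, m, a, b in zip(depths_out[::6], median[::6], lo[::6], hi[::6]):
          print(f"depth {d:5.0f} cm: {m:6.0f} cal yr BP  (95% range {a:.0f}-{b:.0f})")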

  13. Semi-empirical studies of atomic structure. Progress report, 1 July 1982-1 February 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, L.J.

    1983-01-01

    A program of studies of the properties of the heavy and highly ionized atomic systems which often occur as contaminants in controlled fusion devices is continuing. The project combines experimental measurements by fast-ion-beam excitation with semi-empirical data parametrizations to identify and exploit regularities in the properties of these very heavy and very highly ionized systems. The increasing use of spectroscopic line intensities as diagnostics for determining thermonuclear plasma temperatures and densities requires laboratory observation and analysis of such spectra, often to accuracies that exceed the capabilities of ab initio theoretical methods for these highly relativistic many electron systems. Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences are providing predictions for large classes of quantities, with a precision that is sharpened by subsequent measurements.

  14. Semiempirical studies of atomic structure. Progress report, 1 July 1983-1 June 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, L.J.

    1984-01-01

    A program of studies of the properties of the heavy and highly ionized atomic systems which often occur as contaminants in controlled fusion devices is continuing. The project combines experimental measurements by fast ion beam excitation with semiempirical data parametrizations to identify and exploit regularities in the properties of these very heavy and very highly ionized systems. The increasing use of spectroscopic line intensities as diagnostics for determining thermonuclear plasma temperatures and densities requires laboratory observation and analysis of such spectra, often to accuracies that exceed the capabilities of ab initio theoretical methods for these highly relativistic many electron systems. Through the acquisition and systematization of empirical data, remarkably precise methods for predicting excitation energies, transition wavelengths, transition probabilities, level lifetimes, ionization potentials, core polarizabilities, and core penetrabilities are being developed and applied. Although the data base for heavy, highly ionized atoms is still sparse, parametrized extrapolations and interpolations along isoelectronic, homologous, and Rydberg sequences are providing predictions for large classes of quantities, with a precision that is sharpened by subsequent measurements.

  15. THE APPLICATION OF MULTIVIEW METHODS FOR HIGH-PRECISION ASTROMETRIC SPACE VLBI AT LOW FREQUENCIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dodson, R.; Rioja, M.; Imai, H.

    2013-06-15

    High-precision astrometric space very long baseline interferometry (S-VLBI) at the low end of the conventional frequency range, i.e., 20 cm, is a requirement for a number of high-priority science goals. These are headlined by obtaining trigonometric parallax distances to pulsars in pulsar-black hole pairs and OH masers anywhere in the Milky Way and the Magellanic Clouds. We propose a solution for the most difficult technical problems in S-VLBI by the MultiView approach where multiple sources, separated by several degrees on the sky, are observed simultaneously. We simulated a number of challenging S-VLBI configurations, with orbit errors up to 8 m in size and with ionospheric atmospheres consistent with poor conditions. In these simulations we performed MultiView analysis to achieve the required science goals. This approach removes the need both for beam switching, which would require a Control Moment Gyro, and for the space and ground infrastructure required for high-quality orbit reconstruction of a space-based radio telescope. This will dramatically reduce the complexity of S-VLBI missions which implement the phase-referencing technique.

  16. Comparison of low cost measurement techniques for long-term monitoring of atmospheric ammonia.

    PubMed

    Sutton, M A; Miners, B; Tang, Y S; Milford, C; Wyers, G P; Duyzer, J H; Fowler, D

    2001-10-01

    An inter-comparison of techniques for long-term sampling of atmospheric ammonia (NH3) was conducted with a view to establishing a national network with > 50 sites. Key requirements were for: a low cost system, simplicity and durability to enable a postal exchange with local site operators, a precision of < +/- 20% for monthly sampling at expected NH3 concentrations of 1-2 micrograms m-3, a detection limit sufficient to resolve the small NH3 concentrations (< 0.2 microgram m-3) expected in remote parts of the UK, and a quantitative means to establish quality control. Five sampling methods were compared: A, a commercially available membrane diffusion tube (exposed in triplicate), with membranes removed immediately after sampling; B, the above method, with the membranes left in place until analysis; C, open-ended diffusion tubes (exposed with 4 replicates); D, a new active sampling diffusion denuder system; and E, an active sampling bubbler system. Method D consisted of two 0.1 m acid coated glass denuders in series with sampling at approximately 0.3 l min-1. These methods were deployed at 6 locations in the UK and the Netherlands and compared against reference estimates. Method D was the most precise and sensitive of the techniques compared, with a detection limit of < 0.1 microgram m-3. The bubbler provided a less precise estimate of NH3 concentration, and also suffered several practical drawbacks. The diffusion tubes were found to correlate with the reference at high concentrations (> 3 micrograms m-3), but were less precise and overestimated NH3 at smaller concentrations. Of the passive methods, A was the most precise and C the least precise. On the basis of the results, method D has been implemented in the national network, together with application of method A to explore spatial variability in regions with expected high NH3 concentrations.

  17. Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding

    PubMed Central

    Gardner, Brian; Grüning, André

    2016-01-01

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, there is still a lack of rules that have a theoretical basis and yet can be considered biologically relevant. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one relying on an instantaneous error signal to modify synaptic weights in a network (INST rule), and the other relying on a filtered error signal for smoother synaptic weight modifications (FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes as an indication of their storage capacity. Our results demonstrate the high performance of the FILT rule in most cases, underpinned by the rule’s error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find the FILT rule to be most efficient at performing input pattern memorisations, and most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of the FILT rule to be consistent with that of the highly efficient E-learning Chronotron rule, but with the distinct advantage that our FILT rule is also implementable as an online method for increased biological realism. PMID:27532262

  18. Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding.

    PubMed

    Gardner, Brian; Grüning, André

    2016-01-01

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, there is still a lack of rules that have a theoretical basis and yet can be considered biologically relevant. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one relying on an instantaneous error signal to modify synaptic weights in a network (INST rule), and the other relying on a filtered error signal for smoother synaptic weight modifications (FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes as an indication of their storage capacity. Our results demonstrate the high performance of the FILT rule in most cases, underpinned by the rule's error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find the FILT rule to be most efficient at performing input pattern memorisations, and most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of the FILT rule to be consistent with that of the highly efficient E-learning Chronotron rule, but with the distinct advantage that our FILT rule is also implementable as an online method for increased biological realism.
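
    The sketch below is a generic, simplified stand-in for the two rule classes compared above, not the authors' INST/FILT implementation: a single leaky readout neuron is trained with an error-driven weight update using either the instantaneous spike-train error or the same error passed through an exponential low-pass filter. All network sizes, time constants and learning rates are arbitrary choices for illustration.

      # Minimal sketch (not the authors' INST/FILT implementation): weight updates driven by
      # an instantaneous spike-timing error versus the same error passed through an exponential
      # low-pass filter, for one readout neuron with fixed input spike trains.
      import numpy as np

      rng = np.random.default_rng(1)
      dt, T = 1.0, 500.0                      # ms
      steps = int(T / dt)
      n_inputs = 50
      lr = 0.002                              # learning rate
      tau_filter = 10.0                       # ms, filter time constant for the FILT-like rule

      inputs = (rng.random((steps, n_inputs)) < 0.01).astype(float)   # Poisson-like input spikes
      target = np.zeros(steps); target[[100, 250, 400]] = 1.0          # desired output spike times

      def run_epoch(w, filtered):
          """One pass over the pattern; returns updated weights and total |error|."""
          v, err_trace, total_err = 0.0, 0.0, 0.0
          for t in range(steps):
              v = 0.95 * v + inputs[t] @ w                 # leaky membrane potential
              out = 1.0 if v > 1.0 else 0.0
              if out: v = 0.0                              # reset after an output spike
              err = target[t] - out                        # instantaneous spike-train error
              err_trace += dt / tau_filter * (err - err_trace)   # exponential filter
              drive = err_trace if filtered else err
              w += lr * drive * inputs[t]                  # error-driven, input-gated update
              total_err += abs(err)
          return w, total_err

      for filtered in (False, True):
          w = np.full(n_inputs, 0.02)
          for epoch in range(200):
              w, err = run_epoch(w, filtered)
          print("filtered=%s  final summed |error| = %.1f" % (filtered, err))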

  19. Stochastic precision analysis of 2D cardiac strain estimation in vivo

    NASA Astrophysics Data System (ADS)

    Bunting, E. A.; Provost, J.; Konofagou, E. E.

    2014-11-01

    Ultrasonic strain imaging has been applied to echocardiography and carries great potential to be used as a tool in the clinical setting. Two-dimensional (2D) strain estimation may be useful when studying the heart due to the complex, 3D deformation of the cardiac tissue. Increasing the framerate used for motion estimation, i.e. motion estimation rate (MER), has been shown to improve the precision of the strain estimation, although maintaining the spatial resolution necessary to view the entire heart structure in a single heartbeat remains challenging at high MERs. Two previously developed methods, the temporally unequispaced acquisition sequence (TUAS) and the diverging beam sequence (DBS), have been used in the past to successfully estimate in vivo axial strain at high MERs without compromising spatial resolution. In this study, a stochastic assessment of 2D strain estimation precision is performed in vivo for both sequences at varying MERs (65, 272, 544, 815 Hz for TUAS; 250, 500, 1000, 2000 Hz for DBS). 2D incremental strains were estimated during left ventricular contraction in five healthy volunteers using a normalized cross-correlation function and a least-squares strain estimator. Both sequences were shown capable of estimating 2D incremental strains in vivo. The conditional expected value of the elastographic signal-to-noise ratio (E(SNRe|ɛ)) was used to compare strain estimation precision of both sequences at multiple MERs over a wide range of clinical strain values. The results here indicate that axial strain estimation precision is much more dependent on MER than lateral strain estimation, while lateral estimation is more affected by strain magnitude. MER should be increased at least above 544 Hz to avoid suboptimal axial strain estimation. Radial and circumferential strain estimations were influenced by the axial and lateral strain in different ways. Furthermore, the TUAS and DBS were found to be of comparable precision at similar MERs.
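
    The abstract above relies on a least-squares strain estimator applied to displacement estimates; the sketch below shows the generic 1-D version of such an estimator (the slope of a line fitted to displacement over a sliding kernel) on a synthetic axial displacement profile. Kernel size and noise level are arbitrary; larger kernels trade spatial resolution for strain precision.

      # Generic sketch of a 1-D least-squares strain estimator (the gradient of displacement
      # obtained by fitting a line over a sliding kernel); synthetic displacement data, not
      # the authors' in vivo estimates.
      import numpy as np

      depth = np.linspace(0.0, 40.0, 401)                   # mm along the beam axis
      true_strain = 0.02                                    # 2% incremental strain
      displacement = true_strain * depth + np.random.default_rng(2).normal(0, 0.01, depth.size)

      def least_squares_strain(disp, z, kernel=21):
          """Slope of a first-order polynomial fitted to displacement over a sliding window."""
          half = kernel // 2
          strain = np.full(disp.size, np.nan)
          for i in range(half, disp.size - half):
              zz = z[i - half:i + half + 1]
              dd = disp[i - half:i + half + 1]
              slope, _ = np.polyfit(zz, dd, 1)
              strain[i] = slope
          return strain

      est = least_squares_strain(displacement, depth)
      valid = ~np.isnan(est)
      print("mean strain %.4f, std %.4f (true %.4f)"
            % (est[valid].mean(), est[valid].std(), true_strain))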

  20. MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.

    2016-01-01

    MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.

  1. In vivo glenohumeral analysis using 3D MRI models and a flexible software tool: feasibility and precision.

    PubMed

    Busse, Harald; Thomas, Michael; Seiwerts, Matthias; Moche, Michael; Busse, Martin W; von Salis-Soglio, Georg; Kahn, Thomas

    2008-01-01

    The aim was to implement a PC-based morphometric analysis platform and to evaluate the feasibility and precision of MRI measurements of glenohumeral translation. Using a vertically open 0.5T MRI scanner, the shoulders of 10 healthy subjects were scanned in apprehension (AP) and in neutral position (NP), respectively. Surface models of the humeral head (HH) and the glenoid cavity (GC) were created from segmented MR images by three readers. Glenohumeral translation was determined by the projection point of the manually fitted HH center on the GC plane defined by the two main principal axes of the GC model. Positional precision, given as mean (extreme value at 95% confidence level), was 0.9 (1.8) mm for the HH center and 0.7 (1.6) mm for the GC centroid; angular GC precision was 1.3 degrees (2.3 degrees) for the normal and about 4 degrees (7 degrees) for the anterior and superior coordinate axes. The two-dimensional (2D) precision of the HH projection point was 1.1 (2.2) mm. A significant HH translation between AP and NP was found. Despite a limited quality of the underlying model data, our PC-based analysis platform allows a precise morphometric analysis of the glenohumeral joint. The software is easily extendable and may potentially be used for an objective evaluation of therapeutic measures.
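
    The geometric step described above (defining the GC plane from the two main principal axes of the glenoid surface and projecting the fitted HH center onto it) can be sketched as follows with synthetic point clouds; this is not the authors' software, only an illustration of the projection.

      # Sketch of the geometric step described above, with synthetic point clouds: define the
      # glenoid-cavity (GC) plane from the two main principal axes of the GC surface points and
      # project the humeral-head (HH) centre onto that plane.  Not the authors' software.
      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic GC surface points roughly in a tilted plane, and a fitted HH centre
      gc_points = rng.normal(0, 1, (500, 3)) * np.array([12.0, 9.0, 0.4])
      gc_points = gc_points @ np.linalg.qr(rng.normal(size=(3, 3)))[0]      # random rotation
      hh_center = np.array([5.0, -3.0, 20.0])

      centroid = gc_points.mean(axis=0)
      # Principal axes via SVD of the centred points; the first two span the GC plane,
      # the third (smallest variance) is the plane normal.
      _, _, vt = np.linalg.svd(gc_points - centroid)
      axis1, axis2, normal = vt

      # Project HH centre onto the GC plane and express it in the 2-D plane coordinates
      offset = hh_center - centroid
      coords_2d = np.array([np.dot(offset, axis1), np.dot(offset, axis2)])

      print("HH projection (plane coords): %.1f, %.1f mm" % tuple(coords_2d))
      print("HH distance from GC plane:    %.1f mm" % abs(np.dot(offset, normal)))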

  2. Breast density quantification with cone-beam CT: A post-mortem study

    PubMed Central

    Johnson, Travis; Ding, Huanjun; Le, Huy Q.; Ducote, Justin L.; Molloi, Sabee

    2014-01-01

    Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The percent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson’s r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate (SEE) was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. PMID:24254317
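
    As an illustration of the fuzzy c-means (FCM) segmentation used above, the sketch below clusters synthetic voxel intensities into two classes and reports a percent fibroglandular volume; the intensity distributions are invented and the study's actual 3-D pipeline is not reproduced.

      # Minimal fuzzy c-means (FCM) sketch for two-class breast-density segmentation of voxel
      # intensities, followed by a percent fibroglandular volume (%FGV) estimate.  Synthetic
      # intensities; the study's actual pipeline is not reproduced here.
      import numpy as np

      rng = np.random.default_rng(4)
      # Synthetic voxel values: adipose-like and fibroglandular-like clusters (arbitrary units)
      x = np.concatenate([rng.normal(-100, 20, 7000), rng.normal(40, 25, 3000)])

      def fcm_1d(x, c=2, m=2.0, iters=100):
          """Plain fuzzy c-means in 1-D: returns cluster centers and membership matrix."""
          centers = np.quantile(x, np.linspace(0.25, 0.75, c))
          for _ in range(iters):
              d = np.abs(x[:, None] - centers[None, :]) + 1e-9       # distances to centers
              u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
              centers = (u ** m).T @ x / (u ** m).sum(axis=0)
          return centers, u

      centers, u = fcm_1d(x)
      fib = np.argmax(centers)                     # denser cluster = fibroglandular
      labels = u.argmax(axis=1)
      pct_fgv = 100.0 * np.mean(labels == fib)
      print("cluster centers:", np.round(centers, 1), " %%FGV = %.1f%%" % pct_fgv)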

  3. High-Precision Pinpointing of Luminescent Targets in Encoder-Assisted Scanning Microscopy Allowing High-Speed Quantitative Analysis.

    PubMed

    Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong

    2016-01-19

    Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders to a motorized stage as the virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm(2) in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in X and Y directions. Alongside implementation of an autofocus unit that compensates the tilt of a slide in the Z-axis in real time, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA functionalized upconversion nanoparticles.

  4. Epigenetic regulation of gene expression in cancer: techniques, resources and analysis

    PubMed Central

    Kagohara, Luciane T; Stein-O’Brien, Genevieve L; Kelley, Dylan; Flam, Emily; Wick, Heather C; Danilova, Ludmila V; Easwaran, Hariharan; Favorov, Alexander V; Qian, Jiang; Gaykalova, Daria A; Fertig, Elana J

    2018-01-01

    Cancer is a complex disease, driven by aberrant activity in numerous signaling pathways in even individual malignant cells. Epigenetic changes are critical mediators of these functional changes that drive and maintain the malignant phenotype. Changes in DNA methylation, histone acetylation and methylation, noncoding RNAs, and posttranslational modifications are all epigenetic drivers in cancer, independent of changes in the DNA sequence. These epigenetic alterations were once thought to be crucial only for maintenance of the malignant phenotype. Now, epigenetic alterations are also recognized as critical for disrupting essential pathways that protect the cells from uncontrolled growth, longer survival and establishment in distant sites from the original tissue. In this review, we focus on DNA methylation and chromatin structure in cancer. The precise functional role of these alterations is an area of active research using emerging high-throughput approaches and bioinformatics analysis tools. Therefore, this review also describes these high-throughput measurement technologies, public domain databases for high-throughput epigenetic data in tumors and model systems and bioinformatics algorithms for their analysis. Advances in bioinformatics data that combine these epigenetic data with genomics data are essential to infer the function of specific epigenetic alterations in cancer. These integrative algorithms are also a focus of this review. Future studies using these emerging technologies will elucidate how alterations in the cancer epigenome cooperate with genetic aberrations during tumor initiation and progression. This deeper understanding is essential to future studies with epigenetics biomarkers and precision medicine using emerging epigenetic therapies. PMID:28968850

  5. Use of Terrestrial Laser Scanning Technology for Long Term High Precision Deformation Monitoring

    PubMed Central

    Vezočnik, Rok; Ambrožič, Tomaž; Sterle, Oskar; Bilban, Gregor; Pfeifer, Norbert; Stopar, Bojan

    2009-01-01

    The paper presents a new methodology for high precision monitoring of deformations with a long term perspective using terrestrial laser scanning technology. In order to solve the problem of a stable reference system and to assure the high quality of possible position changes of point clouds, scanning is integrated with two complementary surveying techniques, i.e., high quality static GNSS positioning and precise tacheometry. The case study object where the proposed methodology was tested is a high pressure underground pipeline situated in an area which is geologically unstable. PMID:22303152

  6. The high-speed after-pulse measurement system for PMT

    NASA Astrophysics Data System (ADS)

    Cheng, Y.; Qian, S.; Ning, Z.; Xia, J.; Wang, Z.

    2018-05-01

    A system employing a desktop FADC has been developed to investigate the features of the 8-inch Hamamatsu PMT R5912. The system stands out for its high speed and informative results, a consequence of adopting fast waveform sampling technology. Recording the full waveforms allows us to perform pulse shape analysis. High-precision after-pulse time and charge distribution results are presented in this manuscript. Other characteristics of the photomultiplier tube, such as the charge gain, dark rate and transit time spread, can also be obtained with this system.

  7. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    NASA Astrophysics Data System (ADS)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    The 2011 Great East Japan Earthquake (GEJE) has shown that tsunami disasters are not limited to inundation damage in a specified region, but may destroy a wide area, causing a major disaster. Evaluating standing land structures and damage to them requires highly precise evaluation of three-dimensional fluid motion - an expensive process. Our research goals were thus to couple STOC-CADMAS (Arikawa and Tomita, 2016) with structure analysis (Arikawa et al., 2009) to efficiently calculate all stages from the tsunami source to runup, including the deformation of structures, and to verify the applicability of the approach. We also investigated the stability of the breakwaters at Kamaishi Bay. Fig. 1 shows the whole calculation system. The STOC-ML simulator approximates pressure by hydrostatic pressure and calculates the wave profiles based on an equation of continuity, thereby lowering calculation cost; it primarily covers the domain from the epicenter to the shallow region. STOC-IC solves pressure based on a Poisson equation to account for shallower, more complex topography, while still reducing computation cost slightly by setting the water surface based on an equation of continuity; it is used for the area near a port. CS3D solves a Navier-Stokes equation and sets the water surface by VOF to deal with the runup area, with its complex surfaces of overflows and bores. STR solves the structure analysis, including the ground analysis, based on Biot's formula. By coupling these, the system efficiently calculates the tsunami profile from propagation to inundation. The numerical results were compared with the physical experiments of Arikawa et al. (2012) and showed good agreement. Finally, the system was applied to the local situation at Kamaishi Bay; most of the breakwaters were washed away, a situation similar to the damage actually observed at Kamaishi Bay. REFERENCES: T. Arikawa and T. Tomita (2016): "Development of High Precision Tsunami Runup Calculation Method Based on a Hierarchical Simulation", Journal of Disaster Research, Vol. 11, No. 4. T. Arikawa, K. Hamaguchi, K. Kitagawa, T. Suzuki (2009): "Development of Numerical Wave Tank Coupled with Structure Analysis Based on FEM", Journal of J.S.C.E., Ser. B2 (Coastal Engineering), Vol. 65, No. 1. T. Arikawa et al. (2012): "Failure Mechanism of Kamaishi Breakwaters due to the Great East Japan Earthquake Tsunami", 33rd International Conference on Coastal Engineering, No. 1191.

  8. High-Precision Half-Life Measurement for the Superallowed β+ Emitter 26mAl

    NASA Astrophysics Data System (ADS)

    Finlay, P.; Ettenauer, S.; Ball, G. C.; Leslie, J. R.; Svensson, C. E.; Andreoiu, C.; Austin, R. A. E.; Bandyopadhyay, D.; Cross, D. S.; Demand, G.; Djongolov, M.; Garrett, P. E.; Green, K. L.; Grinyer, G. F.; Hackman, G.; Leach, K. G.; Pearson, C. J.; Phillips, A. A.; Sumithrarachchi, C. S.; Triambak, S.; Williams, S. J.

    2011-01-01

    A high-precision half-life measurement for the superallowed β+ emitter 26mAl was performed at the TRIUMF-ISAC radioactive ion beam facility, yielding T1/2 = 6346.54 ± 0.46(stat) ± 0.60(syst) ms, consistent with, but 2.5 times more precise than, the previous world average. The 26mAl half-life and ft value, 3037.53(61) s, are now the most precisely determined for any superallowed β decay. Combined with recent theoretical corrections for isospin-symmetry-breaking and radiative effects, the corrected Ft value for 26mAl, 3073.0(12) s, sets a new benchmark for the high-precision superallowed Fermi β-decay studies used to test the conserved vector current hypothesis and determine the Vud element of the Cabibbo-Kobayashi-Maskawa quark mixing matrix.

  9. Precise strong lensing mass profile of the CLASH galaxy cluster MACS 2129

    NASA Astrophysics Data System (ADS)

    Monna, A.; Seitz, S.; Balestra, I.; Rosati, P.; Grillo, C.; Halkola, A.; Suyu, S. H.; Coe, D.; Caminha, G. B.; Frye, B.; Koekemoer, A.; Mercurio, A.; Nonino, M.; Postman, M.; Zitrin, A.

    2017-04-01

    We present a detailed strong lensing (SL) mass reconstruction of the core of the galaxy cluster MACS J2129.4-0741 (z_cl = 0.589) obtained by combining high-resolution Hubble Space Telescope photometry from the CLASH (Cluster Lensing And Supernovae survey with Hubble) survey with new spectroscopic observations from the CLASH-VLT (Very Large Telescope) survey. A background bright red passive galaxy at z_sp = 1.36, sextuply lensed in the cluster core, has four radial lensed images located over the three central cluster members. A further 19 background lensed galaxies are spectroscopically confirmed by our VLT survey, including 3 additional multiple systems. A total of 31 multiple images are used in the lensing analysis. This allows us to trace with high precision the total mass profile of the cluster in its very inner region (R < 100 kpc). Our final lensing mass model reproduces the multiple image systems identified in the cluster core with a high accuracy of 0.4 arcsec. This translates to a high-precision mass reconstruction of MACS 2129, which is constrained at a level of 2 per cent. The cluster has Einstein parameter Θ_E = (29 ± 4) arcsec and a projected total mass of M_tot(<Θ_E) = (1.35 ± 0.03) × 10^14 M⊙ within this radius. Together with the cluster mass profile, we also provide the complete spectroscopic data set for the cluster members and lensed images measured with the VLT/Visible Multi-Object Spectrograph within the CLASH-VLT survey.

  10. Quantitative High-Resolution Genomic Analysis of Single Cancer Cells

    PubMed Central

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A.; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred by the therapeutical relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics. PMID:22140428

  11. Simultaneous Determination of Soyasaponins and Isoflavones in Soy (Glycine max L.) Products by HPTLC-densitometry-Multiple Detection.

    PubMed

    Shawky, Eman; Sallam, Shaimaa M

    2017-11-01

    A new high-throughput method was developed for the simultaneous analysis of isoflavones and soyasaponins in soy (Glycine max L.) products by high-performance thin-layer chromatography with densitometry and multiple detection. Silica gel was used as the stationary phase and ethyl acetate:methanol:water:acetic acid (100:20:16:1, v/v/v/v) as the mobile phase. After chromatographic development, multi-wavelength scanning was carried out by: (i) UV-absorbance measurement at 265 nm for genistin, daidzin and glycitin, and (ii) Vis-absorbance measurement at 650 nm for Soyasaponins I and III, after post-chromatographic derivatization with anisaldehyde/sulfuric acid reagent. Validation of the developed method was found to meet the acceptance criteria delineated by ICH guidelines with respect to linearity, accuracy, precision, specificity and robustness. Calibrations were linear with correlation coefficients of >0.994. Intra-day precisions (relative standard deviation, RSD%) of all substances in matrix were determined to be between 0.7 and 0.9%, while inter-day precisions (RSD%) ranged between 1.2 and 1.8%. The validated method was successfully applied for determination of the studied analytes in soy-based infant formula and soybean products. The new method compares favorably to other reported methods in being as accurate and precise and at the same time more feasible and cost-effective.

  12. Neutron activation analyses and half-life measurements at the USGS TRIGA reactor

    NASA Astrophysics Data System (ADS)

    Larson, Robert E.

    Neutron activation of materials followed by gamma spectroscopy using high-purity germanium detectors is an effective method for making measurements of nuclear beta decay half-lives and for detecting trace amounts of elements present in materials. This research explores applications of neutron activation analysis (NAA) in two parts. Part 1 (Chapters 1 through 8), High Precision Methods for Measuring Decay Half-Lives, develops research methods and data analysis techniques for making high precision measurements of nuclear beta decay half-lives. The change in the electron capture half-life of 51Cr in pure chromium versus chromium mixed in a gold lattice structure is explored, and the 97Ru electron capture decay half-life is compared for ruthenium in a pure crystal versus ruthenium in a rutile oxide state, RuO2. In addition, the beta-minus decay half-life of 71mZn is measured and compared with new high precision findings. Density Functional Theory is used to explain the measured magnitude of changes in electron capture half-life arising from changes in the surrounding lattice electron configuration. Part 2 (Chapters 9 through 11), Debris Collection Nuclear Diagnostic at the National Ignition Facility, explores the design and development of a solid debris collector for use as a diagnostic tool at the National Ignition Facility (NIF). NAA measurements are performed on NIF post-shot debris collected on witness plates in the NIF chamber. In this application NAA is used to detect and quantify trace amounts of gold from the hohlraum and germanium from the pellet present in the debris collected after a NIF shot. The design of a solid debris collector based on material x-ray ablation properties is given, and calculations are done to predict performance and results for the collection and measurement of trace amounts of gold and germanium from dissociated hohlraum debris.

  13. Otolith oxygen isotopes measured by high-precision secondary ion mass spectrometry reflect life history of a yellowfin sole (Limanda aspera).

    PubMed

    Matta, Mary Elizabeth; Orland, Ian J; Ushikubo, Takayuki; Helser, Thomas E; Black, Bryan A; Valley, John W

    2013-03-30

    The oxygen isotope ratio (δ18O value) of aragonite fish otoliths is dependent on the temperature and the δ18O value of the ambient water and can thus reflect the environmental history of a fish. Secondary ion mass spectrometry (SIMS) offers a spatial-resolution advantage over conventional acid-digestion techniques for stable isotope analysis of otoliths, especially given their compact nature. High-precision otolith δ18O analysis was conducted with an IMS-1280 ion microprobe to investigate the life history of a yellowfin sole (Limanda aspera), a Bering Sea species known to migrate ontogenetically. The otolith was cut transversely through its core and one half was roasted to eliminate organic contaminants. Values of δ18O were measured in 10-µm spots along three transects (two in the roasted half, one in the unroasted half) from the core toward the edge. Otolith annual growth zones were dated using the dendrochronology technique of crossdating. Measured values of δ18O ranged from 29.0 to 34.1‰ (relative to Vienna Standard Mean Ocean Water). Ontogenetic migration from shallow to deeper waters was reflected in generally increasing δ18O values from age-0 to approximately age-7 and subsequent stabilization after the expected onset of maturity at age-7. Cyclical variations of δ18O values within juvenile otolith growth zones, up to 3.9‰ in magnitude, were caused by a combination of seasonal changes in the temperature and the δ18O value of the ambient water. The ion microprobe produced a high-precision and high-resolution record of the relative environmental conditions experienced by a yellowfin sole that was consistent with population-level studies of ontogeny. Furthermore, this study represents the first time that crossdating has been used to ensure the dating accuracy of δ18O measurements in otoliths.

  14. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.

  15. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m^2. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft^2 of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m^2.
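
    The two records above convert a total organic carbon (TOC) reading on the collected impingement water into an NVR level through experimentally measured sensitivity factors (ppm carbon per mg/ft^2 of contaminant). A back-of-the-envelope sketch of that arithmetic, with invented numbers and an assumed blank correction, is given below.

      # Back-of-the-envelope sketch of the TOC-to-NVR conversion described above.  The
      # sensitivity factor has units of ppm carbon per (mg of contaminant per ft^2); all
      # numbers below are invented placeholders, not the measured sensitivity factors, and
      # the blank subtraction is an assumption for illustration.
      def nvr_from_toc(toc_ppm_c, blank_ppm_c, sensitivity_ppm_per_mg_ft2):
          """Estimate non-volatile residue (mg/ft^2) from a blank-corrected TOC reading."""
          return (toc_ppm_c - blank_ppm_c) / sensitivity_ppm_per_mg_ft2

      sample_toc = 1.8      # ppm carbon measured in the collected impingement water
      blank_toc = 0.3       # ppm carbon in a system blank
      sensitivity = 2.5     # ppm carbon per mg/ft^2 for a hypothetical contaminant

      print("estimated NVR: %.2f mg/ft^2" % nvr_from_toc(sample_toc, blank_toc, sensitivity))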

  16. Molecularly imprinted polymeric stir bar: Preparation and application for the determination of naftopidil in plasma and urine samples.

    PubMed

    Peng, Jun; Xiao, Deli; He, Hua; Zhao, Hongyan; Wang, Cuixia; Shi, Tian; Shi, Kexin

    2016-01-01

    In this study, molecular imprinting technology and stir bar absorption technology were combined to develop a microextraction approach based on a molecularly imprinted polymeric stir bar. The molecularly imprinted polymer stir bar has high performance and is specific, economical, and simple to prepare. The obtained naftopidil-imprinted polymer-coated bars could simultaneously agitate and adsorb naftopidil in the sample solution. The ratio of template/monomer/cross-linker and conditions of template removal were optimized to prepare a stir bar with highly efficient adsorption. Fourier transform infrared spectroscopy, scanning electron microscopy, selectivity, and extraction capacity experiments showed that the molecularly imprinted polymer stir bar was prepared successfully. To utilize the molecularly imprinted polymer stir bar for the determination of naftopidil in complex body fluid matrices, the extraction time, stirring speed, eluent, and elution time were optimized. The limits of detection of naftopidil in plasma and urine samples were 7.5 and 4.0 ng/mL, respectively, and the recoveries were in the range of 90-112%. The within-run precision and between-run precision were acceptable (relative standard deviation <7%). These data demonstrated that the molecularly imprinted polymeric stir bar based microextraction with high-performance liquid chromatography was a convenient, rapid, efficient, and specific method for the precise determination of trace naftopidil in clinical analysis.

  17. A high-precision sampling scheme to assess persistence and transport characteristics of micropollutants in rivers.

    PubMed

    Schwientek, Marc; Guillet, Gaëlle; Rügner, Hermann; Kuch, Bertram; Grathwohl, Peter

    2016-01-01

    Increasing numbers of organic micropollutants are emitted into rivers via municipal wastewaters. Due to their persistence many pollutants pass wastewater treatment plants without substantial removal. Transport and fate of pollutants in receiving waters and export to downstream ecosystems is not well understood. In particular, a better knowledge of processes governing their environmental behavior is needed. Although a lot of data are available concerning the ubiquitous presence of micropollutants in rivers, accurate data on transport and removal rates are lacking. In this paper, a mass balance approach is presented, which is based on the Lagrangian sampling scheme, but extended to account for precise transport velocities and mixing along river stretches. The calculated mass balances allow accurate quantification of pollutants' reactivity along river segments. This is demonstrated for representative members of important groups of micropollutants, e.g. pharmaceuticals, musk fragrances, flame retardants, and pesticides. A model-aided analysis of the measured data series gives insight into the temporal dynamics of removal processes. The occurrence of different removal mechanisms such as photooxidation, microbial degradation, and volatilization is discussed. The results demonstrate that removal processes are highly variable in time and space, which has to be considered in future studies. The high precision sampling scheme presented could be a powerful tool for quantifying removal processes under different boundary conditions and in river segments with contrasting properties.
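
    A schematic version of the segment mass balance described above is sketched below: the Lagrangian-matched load leaving a river segment is compared with the load entering it plus lateral inputs, and the shortfall is attributed to in-stream removal. All concentrations, discharges and loads are invented placeholders.

      # Schematic mass-balance sketch for a river segment (not the authors' data): compare the
      # Lagrangian-matched pollutant load leaving the segment with the load entering it plus
      # lateral inputs; the shortfall is attributed to in-stream removal.  Numbers are invented.
      def segment_removal(c_up, q_up, c_down, q_down, lateral_loads):
          """Loads from concentration (ug/L) * discharge (m^3/s) = mg/s, scaled to g/s."""
          load_in = c_up * q_up * 1e-3 + sum(lateral_loads)     # g/s
          load_out = c_down * q_down * 1e-3                     # g/s
          return (load_in - load_out) / load_in

      # Hypothetical segment: upstream station, one WWTP effluent, downstream station
      removal = segment_removal(c_up=0.8, q_up=3.0, c_down=0.9, q_down=3.6,
                                lateral_loads=[0.0018])         # WWTP input, g/s (placeholder)
      print("apparent removal over the segment: %.0f%%" % (100 * removal))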

  18. High-precision measurement of (186)Os/(188)Os and (187)Os/(188)Os: isobaric oxide corrections with in-run measured oxygen isotope ratios.

    PubMed

    Chu, Zhu-Yin; Li, Chao-Feng; Chen, Zhi; Xu, Jun-Jie; Di, Yan-Kun; Guo, Jing-Hui

    2015-09-01

    We present a novel method for high precision measurement of (186)Os/(188)Os and (187)Os/(188)Os ratios, applying isobaric oxide interference correction based on in-run measurements of oxygen isotopic ratios. For this purpose, we set up a static data collection routine to measure the main Os(16)O3(-) ion beams with Faraday cups connected to conventional 10(11) amplifiers, and (192)Os(16)O2(17)O(-) and (192)Os(16)O2(18)O(-) ion beams with Faraday cups connected to 10(12) amplifiers. Because of the limited number of Faraday cups, we did not measure (184)Os(16)O3(-) and (189)Os(16)O3(-) simultaneously in-run, but the analytical setup had no significant influence on final (186)Os/(188)Os and (187)Os/(188)Os data. By analyzing UMd, DROsS, an in-house Os solution standard, and several rock reference materials, including WPR-1, WMS-1a, and Gpt-5, the in-run measured oxygen isotopic ratios were shown to provide accurate Os isotopic data. However, (186)Os/(188)Os and (187)Os/(188)Os data obtained with in-run O isotopic compositions for the solution standards and rock reference materials show minimal improvement in internal and external precision, compared to the conventional oxygen correction method. We concluded that the small variations of oxygen isotopes during OsO3(-) analytical sessions are probably not the main source of error for high precision Os isotopic analysis. Nevertheless, use of run-specific O isotopic compositions is still a better choice for Os isotopic data reduction and eliminates the requirement of extra measurements of the oxygen isotopic ratios.

  19. Optimization of deformation monitoring networks using finite element strain analysis

    NASA Astrophysics Data System (ADS)

    Alizadeh-Khameneh, M. Amin; Eshagh, Mehdi; Jensen, Anna B. O.

    2018-04-01

    An optimal design of a geodetic network can fulfill the requested precision and reliability of the network, and decrease the expenses of its execution by removing unnecessary observations. The role of an optimal design is highlighted in deformation monitoring networks due to the repeatability of these networks. The core design problem is how to define precision and reliability criteria. This paper proposes a solution, where the precision criterion is defined based on the precision of deformation parameters, i.e. the precision of strain and differential rotations. A strain analysis can be performed to obtain some information about the possible deformation of a deformable object. In this study, we split an area into a number of three-dimensional finite elements with the help of the Delaunay triangulation and performed the strain analysis on each element. According to the obtained precision of deformation parameters in each element, the precision criterion of displacement detection at each network point is then determined. The developed criterion is implemented to optimize the observations from the Global Positioning System (GPS) in the Skåne monitoring network in Sweden. The network was established in 1989 and straddles the Tornquist zone, which is one of the most active faults in southern Sweden. The numerical results show that 17 out of all 21 possible GPS baseline observations are sufficient to detect a minimum displacement of 3 mm at each network point.
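
    The per-element strain computation at the heart of the criterion above can be sketched in two dimensions as follows: the stations are triangulated with a Delaunay triangulation and, on each triangle, the displacement gradient is obtained from the linear relation between vertex displacements and edge vectors, its symmetric part giving the strain tensor. The paper works with three-dimensional elements; this 2-D sketch with invented station data only illustrates the principle.

      # Sketch of per-element strain from a triangulated 2-D monitoring network (not the paper's
      # 3-D implementation): triangulate the stations with Delaunay, then solve the linear
      # relation du = G dx on each triangle for the displacement gradient G, whose symmetric
      # part is the strain tensor.  Station coordinates and displacements are invented.
      import numpy as np
      from scipy.spatial import Delaunay

      xy = np.array([[0.0, 0.0], [1000.0, 100.0], [400.0, 900.0],
                     [1500.0, 1200.0], [-300.0, 1100.0]])          # station coords, m
      du = np.array([[0.000, 0.000], [0.004, 0.001], [0.001, 0.006],
                     [0.006, 0.008], [-0.001, 0.005]])             # displacements between epochs, m

      tri = Delaunay(xy)
      for simplex in tri.simplices:
          p0, p1, p2 = xy[simplex]
          u0, u1, u2 = du[simplex]
          D = np.column_stack([p1 - p0, p2 - p0])                  # edge vectors
          U = np.column_stack([u1 - u0, u2 - u0])                  # displacement differences
          G = U @ np.linalg.inv(D)                                 # displacement gradient
          strain = 0.5 * (G + G.T)                                 # small-strain tensor
          rotation = 0.5 * (G[1, 0] - G[0, 1])                     # differential rotation
          print("triangle %s: exx=%.2e eyy=%.2e exy=%.2e rot=%.2e"
                % (simplex, strain[0, 0], strain[1, 1], strain[0, 1], rotation))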

  20. Parameter-tolerant design of high contrast gratings

    NASA Astrophysics Data System (ADS)

    Chevallier, Christyves; Fressengeas, Nicolas; Jacquet, Joel; Almuneau, Guilhem; Laaroussi, Youness; Gauthier-Lafaye, Olivier; Cerutti, Laurent; Genty, Frédéric

    2015-02-01

    This work is devoted to the design of high contrast grating mirrors taking into account the technological constraints and tolerances of fabrication. First, a global optimization algorithm has been combined with a numerical analysis of grating structures (RCWA) to automatically design HCG mirrors. Then, the tolerances of the grating dimensions have been precisely studied to develop a robust optimization algorithm with which high contrast gratings, exhibiting not only a high efficiency but also large tolerance values, could be designed. Finally, several structures integrating previously designed HCGs have been simulated to validate and illustrate the benefit of such gratings.

  1. Precise orbit determination of the Lunar Reconnaissance Orbiter and first gravity field results

    NASA Astrophysics Data System (ADS)

    Maier, Andrea; Baur, Oliver

    2014-05-01

    The Lunar Reconnaissance Orbiter (LRO) was launched in 2009 and is expected to orbit the Moon until the end of 2014. Among other instruments, LRO has a highly precise altimeter on board demanding an orbit accuracy of one meter in the radial component. Precise orbit determination (POD) is achieved with radiometric observations (Doppler range rates, ranges) on the one hand, and optical laser ranges on the other hand. LRO is the first satellite at a distance of approximately 360 000 to 400 000 km from the Earth that is routinely tracked with optical laser ranges. This measurement type was introduced to achieve orbits of higher precision than would be possible with radiometric observations only. In this contribution we investigate the strength of each measurement type (radiometric range rates, radiometric ranges, optical laser ranges) based on single-technique orbit estimation. In a next step, all measurement types are combined in a joint analysis. In addition to POD results, preliminary gravity field coefficients are presented as a subsequent product of the orbit determination process. POD and gravity field estimation were accomplished with the NASA/GSFC software packages GEODYN and SOLVE.

  2. SU(2) lattice gauge theory simulations on Fermi GPUs

    NASA Astrophysics Data System (ADS)

    Cardoso, Nuno; Bicudo, Pedro

    2011-05-01

    In this work we explore the performance of CUDA in quenched lattice SU(2) simulations. CUDA, NVIDIA Compute Unified Device Architecture, is a hardware and software architecture developed by NVIDIA for computing on the GPU. We present an analysis and performance comparison between the GPU and CPU in single and double precision. Analyses with multiple GPUs and two different architectures (G200 and Fermi architectures) are also presented. In order to obtain high performance, the code must be optimized for the GPU architecture, i.e., an implementation that exploits the memory hierarchy of the CUDA programming model. We produce codes for the Monte Carlo generation of SU(2) lattice gauge configurations, for the mean plaquette, for the Polyakov Loop at finite T and for the Wilson loop. We also present results for the potential using many configurations (50,000) without smearing and almost 2000 configurations with APE smearing. With two Fermi GPUs we have achieved an excellent performance of about 200× the speed of one CPU in single precision, around 110 Gflops/s. We also find that, using the Fermi architecture, double precision computations for the static quark-antiquark potential are not much slower (less than 2× slower) than single precision computations.
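
    A CPU/NumPy sketch of the basic observable mentioned above, the mean plaquette, is given below for a random ("hot") SU(2) configuration evaluated in both single and double precision; it is not the CUDA code of the paper, and a hot configuration gives a mean plaquette near zero, so the point is only the precision comparison on identical data.

      # Sketch (CPU/NumPy, not the CUDA code from the paper): build a random "hot" SU(2) gauge
      # configuration on a small 4-D lattice and measure the mean plaquette in single and
      # double precision, the kind of single-vs-double comparison discussed above.
      import numpy as np

      L = 6                                        # lattice extent in each of the 4 directions
      rng = np.random.default_rng(5)

      def random_su2(shape):
          """Random SU(2) matrices a0*1 + i*a.sigma with |a| = 1, stored as 2x2 complex."""
          a = rng.normal(size=shape + (4,))
          a /= np.linalg.norm(a, axis=-1, keepdims=True)
          u = np.empty(shape + (2, 2), dtype=np.complex128)
          u[..., 0, 0] = a[..., 0] + 1j * a[..., 3]
          u[..., 0, 1] = a[..., 2] + 1j * a[..., 1]
          u[..., 1, 0] = -a[..., 2] + 1j * a[..., 1]
          u[..., 1, 1] = a[..., 0] - 1j * a[..., 3]
          return u

      def mean_plaquette(links):
          """Average of 0.5*Re Tr[U_mu(x) U_nu(x+mu) U_mu(x+nu)^dag U_nu(x)^dag]."""
          total, count = 0.0, 0
          for mu in range(4):
              for nu in range(mu + 1, 4):
                  u1 = links[mu]
                  u2 = np.roll(links[nu], -1, axis=mu)                          # U_nu(x+mu)
                  u3 = np.roll(links[mu], -1, axis=nu).conj().swapaxes(-1, -2)  # U_mu(x+nu)^dag
                  u4 = links[nu].conj().swapaxes(-1, -2)                        # U_nu(x)^dag
                  plaq = u1 @ u2 @ u3 @ u4
                  total += 0.5 * plaq.trace(axis1=-2, axis2=-1).real.mean()
                  count += 1
          return total / count

      links = random_su2((4, L, L, L, L))          # one link per direction and lattice site
      for dtype in (np.complex64, np.complex128):
          print(dtype.__name__, "mean plaquette = %.7f" % mean_plaquette(links.astype(dtype)))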

  3. Quantity of dates trumps quality of dates for dense Bayesian radiocarbon sediment chronologies - Gas ion source 14C dating instructed by simultaneous Bayesian accumulation rate modeling

    NASA Astrophysics Data System (ADS)

    Rosenheim, B. E.; Firesinger, D.; Roberts, M. L.; Burton, J. R.; Khan, N.; Moyer, R. P.

    2016-12-01

    Radiocarbon (14C) sediment core chronologies benefit from a high density of dates, even when precision of individual dates is sacrificed. This is demonstrated by a combined approach of rapid 14C analysis of CO2 gas generated from carbonates and organic material coupled with Bayesian statistical modeling. Analysis of 14C is facilitated by the gas ion source on the Continuous Flow Accelerator Mass Spectrometry (CFAMS) system at the Woods Hole Oceanographic Institution's National Ocean Sciences Accelerator Mass Spectrometry facility. This instrument is capable of producing a 14C determination of +/- 100 14C y precision every 4-5 minutes, with limited sample handling (dissolution of carbonates and/or combustion of organic carbon in evacuated containers). Rapid analysis allows over-preparation of samples to include replicates at each depth and/or comparison of different sample types at particular depths in a sediment or peat core. Analysis priority is given to depths that have the least chronologic precision as determined by Bayesian modeling of the chronology of calibrated ages. Use of such a statistical approach to determine the order in which samples are run ensures that the chronology constantly improves so long as material is available for the analysis of chronologic weak points. Ultimately, accuracy of the chronology is determined by the material that is actually being dated, and our combined approach allows testing of different constituents of the organic carbon pool and the carbonate minerals within a core. We will present preliminary results from a deep-sea sediment core abundant in deep-sea foraminifera as well as coastal wetland peat cores to demonstrate statistical improvements in sediment- and peat-core chronologies obtained by increasing the quantity and decreasing the quality of individual dates.

  4. Global GNSS processing based on the raw observation approach

    NASA Astrophysics Data System (ADS)

    Strasser, Sebastian; Zehentner, Norbert; Mayer-Gürr, Torsten

    2017-04-01

    Many global navigation satellite system (GNSS) applications, e.g. Precise Point Positioning (PPP), require high-quality GNSS products, such as precise GNSS satellite orbits and clocks. These products are routinely determined by analysis centers of the International GNSS Service (IGS). The current processing methods of the analysis centers make use of the ionosphere-free linear combination to reduce the ionospheric influence. Some of the analysis centers also form observation differences, in general double-differences, to eliminate several additional error sources. The raw observation approach is a new GNSS processing approach that was developed at Graz University of Technology for kinematic orbit determination of low Earth orbit (LEO) satellites and subsequently adapted to global GNSS processing in general. This new approach offers some benefits compared to well-established approaches, such as a straightforward incorporation of new observables due to the avoidance of observation differences and linear combinations. This becomes especially important in view of the changing GNSS landscape with two new systems, the European system Galileo and the Chinese system BeiDou, currently in deployment. GNSS products generated at Graz University of Technology using the raw observation approach currently comprise precise GNSS satellite orbits and clocks, station positions and clocks, code and phase biases, and Earth rotation parameters. To evaluate the new approach, products generated using the Global Positioning System (GPS) constellation and observations from the global IGS station network are compared to those of the IGS analysis centers. The comparisons show that the products generated at Graz University of Technology are on a similar level of quality to the products determined by the IGS analysis centers. This confirms that the raw observation approach is applicable to global GNSS processing. Some areas requiring further work have been identified, enabling future improvements of the method.
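
    For contrast with the raw observation approach described above, the sketch below shows the conventional ionosphere-free linear combination for GPS L1/L2 that established processing strategies rely on; the frequencies are the standard GPS values, while the two pseudorange values are placeholders. The combination removes the first-order ionospheric delay but amplifies noise and ties the processing to specific frequency pairs, which is part of the motivation for processing raw observations instead.

      # Sketch of the conventional ionosphere-free (IF) linear combination for GPS L1/L2 that
      # the raw observation approach avoids; frequencies are the standard GPS values, the two
      # pseudorange values below are placeholders.
      F1, F2 = 1575.42e6, 1227.60e6                      # GPS L1 and L2 frequencies, Hz

      def ionosphere_free(obs_l1, obs_l2):
          """First-order ionospheric delay cancels because it scales with 1/f^2."""
          return (F1**2 * obs_l1 - F2**2 * obs_l2) / (F1**2 - F2**2)

      p1, p2 = 22_300_153.12, 22_300_157.48             # placeholder pseudoranges, m
      print("IF combination: %.3f m" % ionosphere_free(p1, p2))
      # The combination coefficients are about +2.55 and -1.55, so measurement noise is
      # amplified, one of the trade-offs that motivates estimating ionospheric delay from
      # the raw observations instead.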

  5. Analysis of administrative barriers in the industry of the high-rise construction in Russian Federation

    NASA Astrophysics Data System (ADS)

    Zaychenko, Irina; Borremans, Alexandra; Gutman, Svetlana

    2018-03-01

    The article describes the concept and types of administrative barriers encountered in various areas of enterprise activity. The particularities of the Russian high-rise construction industry are described and a comparative analysis of administrative barriers in this sector is performed. The main stages and administrative procedures that developers face when implementing investment and construction projects in the field of high-rise construction are determined. The regulatory and legal framework for the implementation of investment and project activities in the high-rise construction industry has been studied, and conclusions have been drawn about its lack of precision with respect to the formation of competitive and efficient high-rise construction markets. The average number of administrative procedures for the implementation of an investment and construction project in the field of high-rise construction is determined. The factors preventing the reduction of administrative barriers in the high-rise construction industry are revealed.

  6. Application of Raytracing Through the High Resolution Numerical Weather Model HIRLAM for the Analysis of European VLBI

    NASA Technical Reports Server (NTRS)

    Garcia-Espada, Susana; Haas, Rudiger; Colomer, Francisco

    2010-01-01

    An important limitation on the precision of the results obtained by space geodetic techniques like VLBI and GPS is the tropospheric delay caused by the neutral atmosphere, see e.g. [1]. In recent years numerical weather models (NWM) have been applied to improve mapping functions which are used for tropospheric delay modeling in VLBI and GPS data analyses. In this manuscript we use raytracing to calculate slant delays and apply these to the analysis of European VLBI data. The raytracing is performed through the limited area numerical weather prediction (NWP) model HIRLAM. The advantages of this model are its high spatial resolution (0.2 deg. x 0.2 deg.) and high temporal resolution (three hours in prediction mode).

  7. Quantification of urinary uric acid in the presence of thymol and thimerosal by high-performance liquid chromatography

    NASA Technical Reports Server (NTRS)

    Chen, Y.; Pietrzyk, R. A.; Whitson, P. A.

    1997-01-01

    A high-performance liquid chromatographic method was developed as an alternative to automated enzymatic analysis of uric acid in human urine preserved with thymol and/or thimerosal. Uric acid (tR = 10 min) and creatinine (tR = 5 min) were separated and quantified during isocratic elution (0.025 M acetate buffer, pH 4.5) from a mu Bondapak C18 column. The uric-acid peak was identified chemically by incubating urine samples with uricase. The thymol/thimerosal peak appeared at 31 min during the washing step and did not interfere with the analysis. We validated the high-performance liquid chromatographic method for linearity, precision and accuracy, and the results were found to be excellent.

  8. Laser-Induced Focused Ultrasound for Cavitation Treatment: Toward High-Precision Invisible Sonic Scalpel.

    PubMed

    Lee, Taehwa; Luo, Wei; Li, Qiaochu; Demirci, Hakan; Guo, L Jay

    2017-10-01

    Beyond the implementation of the photoacoustic effect to photoacoustic imaging and laser ultrasonics, this study demonstrates a novel application of the photoacoustic effect for high-precision cavitation treatment of tissue using laser-induced focused ultrasound. The focused ultrasound is generated by pulsed optical excitation of an efficient photoacoustic film coated on a concave surface, and its amplitude is high enough to produce controllable microcavitation within the focal region (lateral focus <100 µm). Such microcavitation is used to cut or ablate soft tissue in a highly precise manner. This work demonstrates precise cutting of tissue-mimicking gels as well as accurate ablation of gels and animal eye tissues.

  9. Basic Expeditionary Airfield Resource (BEAR) Requirements Analysis Tool (BRAT)

    DTIC Science & Technology

    2008-01-01

    ...high-demand/low-density precision-guided munitions. LCDR Scott McCain's paper, The Afloat Prepositioning Program: Do Service Mission Differences Preclude Total Jointness?, examines the feasibility of joint management of all Service afloat prepositioning programs, ultimately concluding that the

  10. High-precision relocation of long-period events beneath the summit region of Kı̄lauea Volcano, Hawai‘i, from 1986 to 2009

    USGS Publications Warehouse

    Matoza, Robin S.; Shearer, Peter M.; Okubo, Paul G.

    2016-01-01

    Long-period (0.5–5 Hz, LP) seismicity has been recorded for decades in the summit region of Kı̄lauea Volcano, Hawai‘i, and is postulated as linked with the magma transport and shallow hydrothermal systems. To better characterize its spatiotemporal occurrence, we perform a systematic analysis of 49,030 seismic events occurring in the Kı̄lauea summit region from January 1986 to March 2009 recorded by the ∼50-station Hawaiian Volcano Observatory permanent network. We estimate 215,437 P wave spectra, considering all events on all stations, and use a station-averaged spectral metric to consistently classify LP and non-LP seismicity. We compute high-precision relative relocations for 5327 LP events (43% of all classified LP events) using waveform cross correlation and cluster analysis with 6.4 million event pairs, combined with the source-specific station term method. The majority of intermediate-depth (5–15 km) LPs collapse to a compact volume, with remarkable source location stability over 23 years indicating a source process controlled by geological or conduit structure.
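
    The core ingredient of such relative relocations is the differential arrival time measured by waveform cross-correlation. The sketch below is a minimal illustration with synthetic wavelets and an assumed 100 Hz sampling rate (not HVO data): it estimates the relative delay of one event's P arrival with respect to another at a common station from the lag of the cross-correlation peak, and it needs SciPy 1.6 or later for correlation_lags.

        import numpy as np
        from scipy.signal import correlate, correlation_lags

        def differential_time(wave_a, wave_b, dt):
            """Delay (s) of wave_b relative to wave_a, from the cross-correlation peak."""
            a = wave_a - wave_a.mean()
            b = wave_b - wave_b.mean()
            cc = correlate(b, a, mode="full")
            lags = correlation_lags(len(b), len(a), mode="full")
            return lags[np.argmax(cc)] * dt

        # Synthetic check: two identical wavelets offset by 0.05 s
        dt = 0.01                                   # assumed 100 Hz sampling
        t = np.arange(0.0, 2.0, dt)
        wavelet = lambda t0: np.exp(-((t - t0) / 0.1) ** 2) * np.sin(2 * np.pi * 2.0 * (t - t0))
        print(differential_time(wavelet(1.00), wavelet(1.05), dt))   # -> 0.05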

  11. A flight test method for pilot/aircraft analysis

    NASA Technical Reports Server (NTRS)

    Koehler, R.; Buchacker, E.

    1986-01-01

    In high-precision flight maneuvers the pilot is part of a closed-loop pilot/aircraft system. The assessment of flying qualities is highly dependent on the closed-loop characteristics of precision maneuvers like approach, landing, air-to-air tracking, air-to-ground tracking, close formation flying and air-to-air refueling of the receiver. The object of a research program at DFVLR is the final flight phase of an air-to-ground mission. In this flight phase the pilot has to align the aircraft with the target, correct small deviations from the target direction and keep the target in his sights for a specific time period. To investigate the dynamic behavior of the pilot/aircraft system, a special ground-attack flight test technique with prolonged tracking maneuvers was developed. By changing targets during the attack, the pilot is forced to react continuously to aiming errors in his sights. Thus the closed-loop pilot/aircraft system is excited over a wide frequency range of interest, the pilot gets more information about mission-oriented aircraft dynamics, and suitable flight test data for pilot/aircraft analysis can be generated.

  12. Analysis on the multi-dimensional spectrum of the thrust force for the linear motor feed drive system in machine tools

    NASA Astrophysics Data System (ADS)

    Yang, Xiaojun; Lu, Dun; Ma, Chengfang; Zhang, Jun; Zhao, Wanhua

    2017-01-01

    The motor thrust force contains many harmonic components due to the nonlinearity of the drive circuit and of the motor itself in the linear motor feed drive system. Moreover, during motion these thrust force harmonics may vary with position, velocity, acceleration and load, which affects the displacement fluctuation of the feed drive system. Therefore, in this paper, on the basis of the thrust force spectrum obtained from the Maxwell equations and the electromagnetic energy method, the multi-dimensional variation of each thrust harmonic is analyzed under different motion parameters. A model of the servo system is then established with a focus on dynamic precision, and the influence of the variation of the thrust force spectrum on the displacement fluctuation is discussed. Finally, experiments are carried out to verify the theoretical analysis. It is found that the thrust harmonics show multi-dimensional spectrum characteristics under different motion parameters and loads, which should be considered when choosing motion parameters and optimizing servo control parameters in high-speed, high-precision machine tools equipped with linear motor feed drive systems.

  13. [Determination of LF-VD refining furnace slag by X ray fluorescence spectrometry].

    PubMed

    Kan, Bin; Cheng, Jian-ping; Song, Zu-feng

    2004-10-01

    Eight components, i.e. TFe, CaO, MgO, Al2O3, SiO2, TiO2, MnO and P2O5, in refining furnace slag were determined by X-ray fluorescence spectrometry. Because the content of CaO was high, the authors selected 12 national and departmental grade slag standard samples and prepared a series of synthetic standard samples by adding spectrally pure reagents to them. The resulting calibration curve is suitable for sample analysis of CaO, MgO and SiO2 over widely varying concentration ranges, with evenly distributed calibration points. The samples were prepared at high temperature by adding Li2B4O7 as flux. Experiments were carried out to select the sample preparation conditions, including strip reagents, melting temperature and dilution ratio. Matrix effects of absorption and enhancement were corrected by means of the PH model and theoretical alpha coefficients. Moreover, precision and accuracy experiments were performed. In comparison with the chemical analysis method, the quantitative analytical results for each component are satisfactory. The method has proven rapid, precise and simple.

  14. High precision applications of the global positioning system

    NASA Technical Reports Server (NTRS)

    Lichten, Stephen M.

    1991-01-01

    The Global Positioning System (GPS) is a constellation of U.S. defense navigation satellites which can be used for military and civilian positioning applications. A wide variety of GPS scientific applications were identified and precise positioning capabilities with GPS were already demonstrated with data available from the present partial satellite constellation. Expected applications include: measurements of Earth crustal motion, particularly in seismically active regions; measurements of the Earth's rotation rate and pole orientation; high-precision Earth orbiter tracking; surveying; measurements of media propagation delays for calibration of deep space radiometric data in support of NASA planetary missions; determination of precise ground station coordinates; and precise time transfer worldwide.

  15. Droplet-counting Microtitration System for Precise On-site Analysis.

    PubMed

    Kawakubo, Susumu; Omori, Taichi; Suzuki, Yasutada; Ueta, Ikuo

    2018-01-01

    A new microtitration system based on the counting of titrant droplets has been developed for precise on-site analysis. The dropping rate was controlled by inserting a capillary tube as a flow resistance in a laboratory-made micropipette. The error of titration was 3% in a simulated titration with 20 droplets. The pre-addition of a titrant was proposed for precise titration within an error of 0.5%. The analytical performances were evaluated for chelate titration, redox titration and acid-base titration.
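
    A back-of-envelope model helps explain why pre-adding titrant improves precision: if the endpoint can only be resolved to about one droplet, the relative error scales as one droplet over the total titrant expressed in droplet equivalents. The function below is a rough illustration under that assumption, not the error analysis of the paper (which reports 3% for 20 counted droplets).

        def droplet_titration_error(drops_counted, drops_preadded_equivalent=0.0, resolution_drops=1.0):
            """Rough relative titration error when the endpoint is resolved to about
            one droplet: resolution divided by the total titrant in droplet
            equivalents (counted + pre-added). A simplifying assumption, not the
            authors' error model."""
            total = drops_counted + drops_preadded_equivalent
            return resolution_drops / total

        print(droplet_titration_error(20))                                  # ~5% with counting alone
        print(droplet_titration_error(20, drops_preadded_equivalent=180))   # ~0.5% when ~90% is pre-added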

  16. Benford's law gives better scaling exponents in phase transitions of quantum XY models.

    PubMed

    Rane, Ameya Deepak; Mishra, Utkarsh; Biswas, Anindya; Sen De, Aditi; Sen, Ujjwal

    2014-08-01

    Benford's law is an empirical law predicting the distribution of the first significant digits of numbers obtained from natural phenomena and mathematical tables. It has been found to be applicable for numbers coming from a plethora of sources, varying from seismographic, biological, financial, to astronomical. We apply this law to analyze the data obtained from physical many-body systems described by the one-dimensional anisotropic quantum XY models in a transverse magnetic field. We detect the zero-temperature quantum phase transition and find that our method gives better finite-size scaling exponents for the critical point than many other known scaling exponents using measurable quantities like magnetization, entanglement, and quantum discord. We extend our analysis to the same system but at finite temperature and find that it also detects the finite-temperature phase transition in the model. Moreover, we compare the Benford distribution analysis with the same obtained from the uniform and Poisson distributions. The analysis is furthermore important in that the high-precision detection of the cooperative physical phenomena is possible even from low-precision experimental data.
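
    For readers unfamiliar with the procedure, the sketch below extracts first significant digits and compares their empirical frequencies with Benford's prediction P(d) = log10(1 + 1/d). The lognormal test sample is an arbitrary stand-in; the study applies the same comparison to observables computed from the quantum XY model.

        import numpy as np

        def first_digits(values):
            """First significant digit (1-9) of each nonzero value."""
            x = np.abs(np.asarray(values, dtype=float))
            x = x[x > 0]
            return (x / 10.0 ** np.floor(np.log10(x))).astype(int)

        def benford_comparison(values):
            """Empirical first-digit frequencies versus Benford's law log10(1 + 1/d)."""
            d = np.arange(1, 10)
            fd = first_digits(values)
            empirical = np.array([(fd == k).mean() for k in d])
            benford = np.log10(1.0 + 1.0 / d)
            return d, empirical, benford

        # Illustrative check on a broad-range sample (not the XY-model data)
        rng = np.random.default_rng(0)
        sample = rng.lognormal(mean=0.0, sigma=3.0, size=10_000)
        for digit, emp, th in zip(*benford_comparison(sample)):
            print(f"digit {digit}: empirical {emp:.3f}   Benford {th:.3f}")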

  17. Spark ablation-inductively coupled plasma spectrometry for analysis of geologic materials

    USGS Publications Warehouse

    Golightly, D.W.; Montaser, A.; Smith, B.L.; Dorrzapf, A.F.

    1989-01-01

    Spark ablation-inductively coupled plasma (SA-ICP) spectrometry is applied to the measurement of hafnium-zirconium ratios in zircons and to the determination of cerium, cobalt, iron, lead, nickel and phosphorus in ferromanganese nodules. Six operating parameters used for the high-voltage spark and argon-ICP combination are established by sequential simplex optimization of both signal-to-background ratio and signal-to-noise ratio. The time-dependences of the atomic emission signals of analytes and matrix elements ablated from a finely pulverized sample embedded in a pressed disk of copper demonstrate selective sampling by the spark. Concentration ratios of hafnium to zirconium in zircons are measured with a precision of 4% (relative standard deviation, RSD). For ferromanganese nodules, spectral measurements based on intensity ratios of analyte line to the Mn(II) 257.610 nm line provide precisions of analysis in the range from 7 to 14% RSD. The accuracy of analysis depends on use of standard additions of the reference material USGS Nod P-1, and an independent measurement of the Mn concentration. © 1989.

  18. Gravitational Lensing 2.0

    NASA Astrophysics Data System (ADS)

    Wittman, David M.; Benson, Bryant

    2018-06-01

    Weak lensing analyses use the image---the intensity field---of a distant galaxy to infer gravitational effects on that line of sight. What if we analyze the velocity field instead? We show that lensing imprints much more information onto a highly ordered velocity field, such as that of a rotating disk galaxy, than onto an intensity field. This is because shuffling intensity pixels yields a post-lensed image quite similar to an unlensed galaxy with a different orientation, a problem known as "shape noise." We show that velocity field analysis can eliminate shape noise and yield much more precise lensing constraints. Furthermore, convergence as well as shear can be constrained using the same target, and there is no need to assume the weak lensing limit of small convergence. We present Fisher matrix forecasts of the precision achievable with this method. Velocity field observations are expensive, so we derive guidelines for choosing suitable targets by exploring how precision varies with source parameters such as inclination angle and redshift. Finally, we present simulations that support our Fisher matrix forecasts.

  19. A global view on the Higgs self-coupling at lepton colliders

    DOE PAGES

    Di Vita, Stefano; Durieux, Gauthier; Grojean, Christophe; ...

    2018-02-28

    We perform a global effective-field-theory analysis to assess the precision on the determination of the Higgs trilinear self-coupling at future lepton colliders. Two main scenarios are considered, depending on whether the center-of-mass energy of the colliders is sufficient or not to access Higgs pair production processes. Low-energy machines allow for ~40% precision on the extraction of the Higgs trilinear coupling through the exploitation of next-to-leading-order effects in single Higgs measurements, provided that runs at both 240/250 GeV and 350 GeV are available with luminosities in the few attobarns range. A global fit, including possible deviations in other SM couplings, is essential in this case to obtain a robust determination of the Higgs self-coupling. High-energy machines can easily achieve a ~20% precision through Higgs pair production processes. In this case, the impact of additional coupling modifications is milder, although not completely negligible.

  20. A global view on the Higgs self-coupling at lepton colliders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Vita, Stefano; Durieux, Gauthier; Grojean, Christophe

    We perform a global effective-field-theory analysis to assess the precision on the determination of the Higgs trilinear self-coupling at future lepton colliders. Two main scenarios are considered, depending on whether the center-of-mass energy of the colliders is sufficient or not to access Higgs pair production processes. Low-energy machines allow for ~40% precision on the extraction of the Higgs trilinear coupling through the exploitation of next-to-leading-order effects in single Higgs measurements, provided that runs at both 240/250 GeV and 350 GeV are available with luminosities in the few attobarns range. A global fit, including possible deviations in other SM couplings, is essential in this case to obtain a robust determination of the Higgs self-coupling. High-energy machines can easily achieve a ~20% precision through Higgs pair production processes. In this case, the impact of additional coupling modifications is milder, although not completely negligible.

  1. Search for CP violation effects in the h→ τ τ decay with e^+e^- colliders

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wu, Yongcheng

    2017-10-01

    A new method is proposed to reconstruct the neutrinos in the e^+e^-→ Zh process followed by the h→ τ τ decay. With the help of a refined Higgs momentum reconstruction from the recoiling system and the impact parameters, high precision in the determination of the momentum of neutrinos can be achieved. The prospect of measuring the Higgs CP mixing angle with the h→ τ τ decay at e^+e^- colliders is studied with the new method. The analysis is based on a detailed detector simulation of the signal and backgrounds. The fully reconstructed neutrinos and also other visible products from the tau decay are used to build matrix element (ME)-based CP observables. With 5 ab^{-1} of data at E_{ {CM}}=250 GeV, a precision of 2.9° can be achieved for the CP mixing angle with three main one-prong decay modes of the taus. The precision is found to be about 35% better than the other methods.

  2. Analysis of RDSS positioning accuracy based on RNSS wide area differential technique

    NASA Astrophysics Data System (ADS)

    Xing, Nan; Su, RanRan; Zhou, JianHua; Hu, XiaoGong; Gong, XiuQiang; Liu, Li; He, Feng; Guo, Rui; Ren, Hui; Hu, GuangMing; Zhang, Lei

    2013-10-01

    The BeiDou Navigation Satellite System (BDS) provides a Radio Navigation Service System (RNSS) as well as a Radio Determination Service System (RDSS). RDSS users obtain positioning by responding to Master Control Center (MCC) inquiries via signals relayed through a GEO satellite transponder, and the positioning result is calculated by the MCC with an elevation constraint. The primary error sources affecting RDSS positioning accuracy are the RDSS signal transceiver delay, the atmospheric transmission delay and the GEO satellite position error. During GEO orbit maneuvers, poor orbit forecast accuracy significantly degrades RDSS services. A real-time 3-D orbital correction method based on the wide-area differential technique is proposed to correct the orbital error. Observational results show that the method successfully improves positioning precision during orbital maneuvers, independently of the RDSS reference station, with improvements of up to 50%. Accurate calibration of the RDSS signal transceiver delay and an accurate digital elevation map play a critical role in high-precision RDSS positioning services.

  3. Multiple Frequency Audio Signal Communication as a Mechanism for Neurophysiology and Video Data Synchronization

    PubMed Central

    Topper, Nicholas C.; Burke, S.N.; Maurer, A.P.

    2014-01-01

    BACKGROUND Current methods for aligning neurophysiology and video data are either prepackaged, requiring the additional purchase of a software suite, or use a blinking LED with a stationary pulse-width and frequency. These methods lack significant user interface for adaptation, are expensive, or risk a misalignment of the two data streams. NEW METHOD A cost-effective means to obtain high-precision alignment of behavioral and neurophysiological data is obtained by generating an audio-pulse embedded with two domains of information, a low-frequency binary-counting signal and a high, randomly changing frequency. This enabled the derivation of temporal information while maintaining enough entropy in the system for algorithmic alignment. RESULTS The sample-to-frame index constructed using the audio input correlation method described in this paper enables video and data acquisition to be aligned at a sub-frame level of precision. COMPARISONS WITH EXISTING METHOD Traditionally, a synchrony pulse is recorded on-screen via a flashing diode. The higher sampling rate of the audio input of the camcorder enables the timing of an event to be detected with greater precision. CONCLUSIONS While on-line analysis and synchronization using specialized equipment may be the ideal situation in some cases, the method presented in the current paper is a viable, low-cost alternative and gives the flexibility to interface with custom off-line analysis tools. Moreover, the ease of constructing and implementing this set-up makes it applicable to a wide variety of applications that require video recording. PMID:25256648
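
    A minimal sketch of the alignment idea, assuming both the acquisition system and the camcorder record the same sync audio: the offset between the two streams is taken from the cross-correlation peak, after which any sample index can be mapped to a (fractional) video frame. The sampling and frame rates are assumptions for illustration, and SciPy 1.6+ is needed for correlation_lags.

        import numpy as np
        from scipy.signal import correlate, correlation_lags

        def audio_offset_samples(reference, recorded):
            """Offset (in samples) of `recorded` relative to `reference`, estimated
            from the cross-correlation peak of the shared sync signal."""
            a = reference - reference.mean()
            b = recorded - recorded.mean()
            cc = correlate(b, a, mode="full")
            lags = correlation_lags(len(b), len(a), mode="full")
            return lags[np.argmax(cc)]

        def sample_to_frame(sample_index, offset, audio_rate_hz=44_100.0, frame_rate_hz=30.0):
            """Map an audio/ephys sample index to a fractional video frame index,
            given the estimated offset (rates are assumed, not from the paper)."""
            return (sample_index - offset) * frame_rate_hz / audio_rate_hz

    In practice one would detect the sync signal in both streams, estimate the offset once (or per segment, to absorb drift), and then translate event timestamps into frame numbers.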

  4. Multiple frequency audio signal communication as a mechanism for neurophysiology and video data synchronization.

    PubMed

    Topper, Nicholas C; Burke, Sara N; Maurer, Andrew Porter

    2014-12-30

    Current methods for aligning neurophysiology and video data are either prepackaged, requiring the additional purchase of a software suite, or use a blinking LED with a stationary pulse-width and frequency. These methods lack significant user interface for adaptation, are expensive, or risk a misalignment of the two data streams. A cost-effective means to obtain high-precision alignment of behavioral and neurophysiological data is obtained by generating an audio-pulse embedded with two domains of information, a low-frequency binary-counting signal and a high, randomly changing frequency. This enabled the derivation of temporal information while maintaining enough entropy in the system for algorithmic alignment. The sample-to-frame index constructed using the audio input correlation method described in this paper enables video and data acquisition to be aligned at a sub-frame level of precision. Traditionally, a synchrony pulse is recorded on-screen via a flashing diode. The higher sampling rate of the audio input of the camcorder enables the timing of an event to be detected with greater precision. While on-line analysis and synchronization using specialized equipment may be the ideal situation in some cases, the method presented in the current paper is a viable, low-cost alternative and gives the flexibility to interface with custom off-line analysis tools. Moreover, the ease of constructing and implementing this set-up makes it applicable to a wide variety of applications that require video recording. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Spectrophotometric method development and validation for determination of chlorpheniramine maleate in bulk and controlled release tablets.

    PubMed

    Ashfaq, Maria; Sial, Ali Akber; Bushra, Rabia; Rehman, Atta-Ur; Baig, Mirza Tasawur; Huma, Ambreen; Ahmed, Maryam

    2018-01-01

    The spectrophotometric technique is considered the simplest and most operator-friendly of the available analytical methods for pharmaceutical analysis. The objective of this study was to develop a precise, accurate and rapid UV-spectrophotometric method for the estimation of chlorpheniramine maleate (CPM) in pure form and in solid pharmaceutical formulations. Drug absorption was measured in various solvent systems including 0.1 N HCl (pH 1.2), acetate buffer (pH 4.5), phosphate buffer (pH 6.8) and distilled water (pH 7.0). Method validation was performed as per the official ICH (2005) guidelines. High drug absorption was observed in the 0.1 N HCl medium with a λmax of 261 nm. The drug showed good linearity over the 20-60 μg/mL concentration range, with the linear regression equation Y = 0.1853X + 0.1098 and a correlation coefficient (R2) of 0.9998. Method accuracy was evaluated by percent drug recovery, which exceeded 99% at the three levels assessed. The % RSD values computed for inter- and intraday analysis were below 1, indicating the high accuracy and precision of the developed technique. The method is robust, showing no significant variation with minor deliberate changes in method parameters. The LOD and LOQ were assessed to be 2.2 μg/mL and 6.6 μg/mL, respectively. The investigated method proved sensitive, precise and accurate, and hence can be used to estimate the CPM content in bulk material and pharmaceutical matrix tablets.
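
    The validation quantities reported above follow directly from the calibration line. The sketch below fits a least-squares line to hypothetical absorbance readings (made-up values, not the study's data) and applies the usual ICH-style estimates LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the regression residuals and S is the slope.

        import numpy as np

        # Hypothetical calibration data: concentration (ug/mL) vs absorbance.
        conc = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
        absorbance = np.array([0.382, 0.566, 0.747, 0.935, 1.117])   # illustrative only

        slope, intercept = np.polyfit(conc, absorbance, 1)
        residuals = absorbance - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)                 # residual SD (n - 2 degrees of freedom)
        r = np.corrcoef(conc, absorbance)[0, 1]

        lod = 3.3 * sigma / slope                     # ICH-style detection limit
        loq = 10.0 * sigma / slope                    # ICH-style quantitation limit
        print(f"Y = {slope:.4f}X + {intercept:.4f},  R^2 = {r**2:.4f}")
        print(f"LOD ~ {lod:.2f} ug/mL,  LOQ ~ {loq:.2f} ug/mL")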

  6. High resolution extremity CT for biomechanics modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashby, A.E.; Brand, H.; Hollerbach, K.

    1995-09-23

    With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high-resolution data set for use in biomechanics modeling.

  7. Development and Validation of RP-HPLC Method for the Estimation of Ivabradine Hydrochloride in Tablets

    PubMed Central

    Seerapu, Sunitha; Srinivasan, B. P.

    2010-01-01

    A simple, sensitive, precise and robust reverse–phase high-performance liquid chromatographic method for analysis of ivabradine hydrochloride in pharmaceutical formulations was developed and validated as per ICH guidelines. The separation was performed on SS Wakosil C18AR, 250×4.6 mm, 5 μm column with methanol:25 mM phosphate buffer (60:40 v/v), adjusted to pH 6.5 with orthophosphoric acid, added drop wise, as mobile phase. A well defined chromatographic peak of Ivabradine hydrochloride was exhibited with a retention time of 6.55±0.05 min and tailing factor of 1.14 at the flow rate of 0.8 ml/min and at ambient temperature, when monitored at 285 nm. The linear regression analysis data for calibration plots showed good linear relationship with R=0.9998 in the concentration range of 30-210 μg/ml. The method was validated for precision, recovery and robustness. Intra and Inter-day precision (% relative standard deviation) were always less than 2%. The method showed the mean % recovery of 99.00 and 98.55 % for Ivabrad and Inapure tablets, respectively. The proposed method has been successfully applied to the commercial tablets without any interference of excipients. PMID:21695008

  8. Probing the BSM physics with CMB precision cosmology: an application to supersymmetry

    NASA Astrophysics Data System (ADS)

    Dalianis, Ioannis; Watanabe, Yuki

    2018-02-01

    The cosmic history before the BBN is largely determined by the physics that operates beyond the Standard Model (BSM) of particle physics, and it is poorly constrained observationally. Ongoing and future precision measurements of the CMB observables can provide us with significant information about the pre-BBN era and hence possibly test the cosmological predictions of different BSM scenarios. Supersymmetry is a particularly motivated BSM theory, and it is often the case that different supersymmetry breaking schemes require different cosmic histories with specific reheating temperatures or low entropy production in order to be cosmologically viable. In this paper we quantify the effects of the possible alternative cosmic histories on the n_s and r CMB observables, assuming a generic non-thermal stage after cosmic inflation. We analyze TeV and especially multi-TeV supersymmetry breaking schemes assuming the neutralino and gravitino dark matter scenarios. We complement our analysis by considering the Starobinsky R^2 inflation model to exemplify the improved CMB predictions that a unified description of the early universe cosmic evolution yields. Our analysis underlines the importance of the CMB precision measurements that can be viewed, to some extent, as complementary to the laboratory experimental searches for supersymmetry or other BSM theories.

  9. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  10. Sensor-based precision fertilization for field crops

    USDA-ARS?s Scientific Manuscript database

    From the development of the first viable variable-rate fertilizer systems in the upper Midwest USA, precision agriculture is now approaching three decades old. Early precision fertilization practice relied on laboratory analysis of soil samples collected on a spatial pattern to define the nutrient-s...

  11. Precision optical slit for high heat load or ultra high vacuum

    DOEpatents

    Andresen, N.C.; DiGennaro, R.S.; Swain, T.L.

    1995-01-24

    This invention relates generally to slits used in optics that must be precisely aligned and adjusted. The optical slits of the present invention are useful in x-ray optics, x-ray beam lines, optical systems in which the entrance slit is critical for high wavelength resolution. The invention is particularly useful in ultra high vacuum systems where lubricants are difficult to use and designs which avoid the movement of metal parts against one another are important, such as monochromators for high wavelength resolution with ultra high vacuum systems. The invention further relates to optical systems in which temperature characteristics of the slit materials is important. The present invention yet additionally relates to precision slits wherein the opposing edges of the slit must be precisely moved relative to a center line between the edges with each edge retaining its parallel orientation with respect to the other edge and/or the center line. 21 figures.

  12. Precision optical slit for high heat load or ultra high vacuum

    DOEpatents

    Andresen, Nord C.; DiGennaro, Richard S.; Swain, Thomas L.

    1995-01-01

    This invention relates generally to slits used in optics that must be precisely aligned and adjusted. The optical slits of the present invention are useful in x-ray optics, x-ray beam lines, and optical systems in which the entrance slit is critical for high wavelength resolution. The invention is particularly useful in ultra high vacuum systems where lubricants are difficult to use and designs which avoid the movement of metal parts against one another are important, such as monochromators for high wavelength resolution with ultra high vacuum systems. The invention further relates to optical systems in which temperature characteristics of the slit materials are important. The present invention yet additionally relates to precision slits wherein the opposing edges of the slit must be precisely moved relative to a center line between the edges with each edge retaining its parallel orientation with respect to the other edge and/or the center line.

  13. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    PubMed

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    A quantitative characterization of root system architecture is currently being attempted for various reasons. Non-destructive, rapid analyses of root system architecture are difficult to perform due to the hidden nature of the root. Hence, improved methods to measure root architecture are necessary to support knowledge-based plant breeding and to analyse root growth responses to environmental changes. Here, we report on the development of a novel method to reveal the growth and architecture of maize root systems. The method is based on the cultivation of different root types within several layers of two-dimensional, large (50 × 60 cm) plates (rhizoslides). A central Plexiglas screen stabilizes the system and is covered on both sides with germination paper providing water and nutrients for the developing root, followed by a transparent cover foil to prevent the roots from drying out and to stabilize the system. The embryonic roots grow hidden between the Plexiglas surface and the paper, whereas crown roots grow visibly between the paper and the transparent cover. Cultivation with good image quality for up to 20 days (four fully developed leaves) was achieved by suppressing fungi with a fungicide. Based on hyperspectral microscopy imaging, the quality of different germination papers was tested, and three provided sufficient contrast to distinguish between roots and background (segmentation). Illumination, image acquisition and segmentation were optimised to facilitate efficient root image analysis. Several software packages were evaluated with regard to their precision and the time investment needed to measure root system architecture. The software 'Smart Root' allowed precise evaluation of root development but needed substantial user intervention. 'GiaRoots' provided the best segmentation method for batch processing in combination with a good analysis of global root characteristics but overestimated root length due to thinning artefacts. 'WhinRhizo' offered the most rapid and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.

  14. Patient safety and systematic reviews: finding papers indexed in MEDLINE, EMBASE and CINAHL.

    PubMed

    Tanon, A A; Champagne, F; Contandriopoulos, A-P; Pomey, M-P; Vadeboncoeur, A; Nguyen, H

    2010-10-01

    To develop search strategies for identifying papers on patient safety in MEDLINE, EMBASE and CINAHL. Six journals were electronically searched for papers on patient safety published between 2000 and 2006. Identified papers were divided into two gold standards: one to build and the other to validate the search strategies. Candidate terms for strategy construction were identified using a word frequency analysis of titles, abstracts and keywords used to index the papers in the databases. Searches were run for each one of the selected terms independently in every database. Sensitivity, precision and specificity were calculated for each candidate term. Terms with sensitivity greater than 10% were combined to form the final strategies. The search strategies developed were run against the validation gold standard to assess their performance. A final step in the validation process was to compare the performance of each strategy to those of other strategies found in the literature. We developed strategies for all three databases that were highly sensitive (range 95%-100%), precise (range 40%-60%) and balanced (the product of sensitivity and precision being in the range of 30%-40%). The strategies were very specific and outperformed those found in the literature. The strategies we developed can meet the needs of users aiming to maximise either sensitivity or precision, or seeking a reasonable compromise between sensitivity and precision, when searching for papers on patient safety in MEDLINE, EMBASE or CINAHL.
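
    The performance measures used to rank candidate terms and final strategies reduce to simple ratios over the gold standard. The sketch below computes them from hypothetical retrieval counts (the counts are not from the study).

        def retrieval_metrics(tp, fp, fn, tn):
            """Sensitivity (recall), precision and specificity of a search strategy,
            computed against a gold standard of relevant papers."""
            return {
                "sensitivity": tp / (tp + fn),
                "precision":   tp / (tp + fp),
                "specificity": tn / (tn + fp),
            }

        # Hypothetical counts for one candidate strategy
        print(retrieval_metrics(tp=95, fp=110, fn=5, tn=9790))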

  15. Function and disability in late life: comparison of the Late-Life Function and Disability Instrument to the Short-Form-36 and the London Handicap Scale.

    PubMed

    Dubuc, Nicole; Haley, Stephen; Ni, Pengsheng; Kooyoomjian, Jill; Jette, Alan

    2004-03-18

    We evaluated the Late-Life Function and Disability Instrument's (LLFDI) concurrent validity, comprehensiveness and precision by comparing it with the Short-Form-36 physical functioning (PF-10) and the London Handicap Scale (LHS). We administered the LLFDI, PF-10 and LHS to 75 community-dwelling adults (> 60 years of age). We used Pearson correlation coefficients to examine concurrent validity and Rasch analysis to compare the item hierarchies, content ranges and precision of the PF-10 and LLFDI function domains, and the LHS and the LLFDI disability domains. LLFDI Function (lower extremity scales) and PF-10 scores were highly correlated (r = 0.74 - 0.86, p < 0.001); moderate correlations were found between the LHS and the LLFDI Disability limitation (r = 0.66, p < 0.0001) and Disability frequency (r = 0.47, p < 0.001) scores. The LLFDI had a wider range of content coverage, fewer ceiling effects and better relative precision across the spectrum of function and disability than the PF-10 and the LHS. The LHS had slightly more content range and precision in the lower end of the disability scale than the LLFDI. The LLFDI is a more comprehensive and precise instrument compared to the PF-10 and LHS for assessing function and disability in community-dwelling older adults.

  16. [Medical imaging in tumor precision medicine: opportunities and challenges].

    PubMed

    Xu, Jingjing; Tan, Yanbin; Zhang, Minming

    2017-05-25

    Tumor precision medicine is an emerging approach for tumor diagnosis, treatment and prevention, which takes account of individual variability of environment, lifestyle and genetic information. Tumor precision medicine is built up on the medical imaging innovations developed during the past decades, including the new hardware, new imaging agents, standardized protocols, image analysis and multimodal imaging fusion technology. Also the development of automated and reproducible analysis algorithm has extracted large amount of information from image-based features. With the continuous development and mining of tumor clinical and imaging databases, the radiogenomics, radiomics and artificial intelligence have been flourishing. Therefore, these new technological advances bring new opportunities and challenges to the application of imaging in tumor precision medicine.

  17. Automatic Topography Using High Precision Digital Moire Methods

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Saito, S.

    1983-07-01

    Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.

  18. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation.

    PubMed

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 microm can be achieved.

  19. Development of novel hybrid flexure-based microgrippers for precision micro-object manipulation

    NASA Astrophysics Data System (ADS)

    Mohd Zubir, Mohd Nashrul; Shirinzadeh, Bijan; Tian, Yanling

    2009-06-01

    This paper describes the process of developing a microgripper that is capable of high precision and fidelity manipulation of micro-objects. The design adopts the concept of flexure-based hinges on its joints to provide the rotational motion, thus eliminating the inherent nonlinearities associated with the application of conventional rigid hinges. A combination of two modeling techniques, namely, pseudorigid body model and finite element analysis was utilized to expedite the prototyping procedure, which leads to the establishment of a high performance mechanism. A new hybrid compliant structure integrating cantilever beam and flexural hinge configurations within microgripper mechanism mainframe has been developed. This concept provides a novel approach to harness the advantages within each individual configuration while mutually compensating the limitations inherent between them. A wire electrodischarge machining technique was utilized to fabricate the gripper out of high grade aluminum alloy (Al 7075T6). Experimental studies were conducted on the model to obtain various correlations governing the gripper performance as well as for model verification. The experimental results demonstrate high level of compliance in comparison to the computational results. A high amplification characteristic and maximum achievable stroke of 100 μm can be achieved.

  20. The emerging potential for network analysis to inform precision cancer medicine.

    PubMed

    Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah

    2018-06-14

    Precision cancer medicine promises to tailor clinical decisions to patients using genomic information. Indeed, successes of drugs targeting genetic alterations in tumors, such as imatinib that targets BCR-ABL in chronic myelogenous leukemia, have demonstrated the power of this approach. However biological systems are complex, and patients may differ not only by the specific genetic alterations in their tumor, but by more subtle interactions among such alterations. Systems biology and more specifically, network analysis, provides a framework for advancing precision medicine beyond clinical actionability of individual mutations. Here we discuss applications of network analysis to study tumor biology, early methods for N-of-1 tumor genome analysis and the path for such tools to the clinic. Copyright © 2018. Published by Elsevier Ltd.

  1. Design and control of the precise tracking bed based on complex electromechanical design theory

    NASA Astrophysics Data System (ADS)

    Ren, Changzhi; Liu, Zhao; Wu, Liao; Chen, Ken

    2010-05-01

    Precise tracking technology is widely used in astronomical instruments, satellite tracking and aeronautic test beds. The precise ultra-low-speed tracking drive system is a highly integrated electromechanical system, for which a complex electromechanical design method is adopted to improve the efficiency, reliability and quality of the system over the design and manufacturing cycle. The precise tracking bed is an ultra-exact, ultra-low-speed, high-precision instrument with very large inertia, whose mechanisms and operating environment at ultra-low speed differ from those of conventional drive technology. This paper explores the design process based on complex electromechanical optimization design theory; a non-PID control method with CMAC feedforward compensation is used in the servo system of the precise tracking bed, and some simulation results are discussed.

  2. Precision Machining Technologies. Occupational Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Occupational Competency Analysis Profile (OCAP), which is one of a series of OCAPs developed to identify the skills that Ohio employers deem necessary to entering a given occupation/occupational area, lists the occupational, academic, and employability skills required of individuals entering the occupation of precision machinist. The…

  3. Video-rate or high-precision: a flexible range imaging camera

    NASA Astrophysics Data System (ADS)

    Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.; Payne, Andrew D.; Conroy, Richard M.; Godbaz, John P.; Jongenelen, Adrian P. P.

    2008-02-01

    A range imaging camera produces an output similar to a digital photograph, but every pixel in the image contains distance information as well as intensity. This is useful for measuring the shape, size and location of objects in a scene, hence is well suited to certain machine vision applications. Previously we demonstrated a heterodyne range imaging system operating in a relatively high resolution (512-by-512) pixels and high precision (0.4 mm best case) configuration, but with a slow measurement rate (one every 10 s). Although this high precision range imaging is useful for some applications, the low acquisition speed is limiting in many situations. The system's frame rate and length of acquisition is fully configurable in software, which means the measurement rate can be increased by compromising precision and image resolution. In this paper we demonstrate the flexibility of our range imaging system by showing examples of high precision ranging at slow acquisition speeds and video-rate ranging with reduced ranging precision and image resolution. We also show that the heterodyne approach and the use of more than four samples per beat cycle provides better linearity than the traditional homodyne quadrature detection approach. Finally, we comment on practical issues of frame rate and beat signal frequency selection.
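
    A sketch of the underlying phase-to-range step, under assumptions not taken from the paper (an 8-sample beat cycle, a 30 MHz modulation frequency and an ideal sinusoidal beat signal): the phase of the fundamental DFT bin of the sampled beat waveform gives the range, and using more than four samples per cycle suppresses harmonic-induced linearity errors relative to the classic four-bucket homodyne formula.

        import numpy as np

        C = 299_792_458.0   # speed of light, m/s

        def range_from_samples(samples, mod_freq_hz):
            """Range (m) from N equally spaced samples of one beat cycle: take the
            phase of the fundamental DFT bin, then d = c * phi / (4 * pi * f_mod).
            With N = 4 this essentially reduces to the homodyne quadrature formula."""
            n = len(samples)
            k = np.arange(n)
            i = np.sum(samples * np.cos(2 * np.pi * k / n))
            q = np.sum(samples * np.sin(2 * np.pi * k / n))
            phase = np.mod(np.arctan2(q, i), 2 * np.pi)
            return C * phase / (4 * np.pi * mod_freq_hz)

        # Illustrative check: 30 MHz modulation, target at 2.5 m, 8 samples per cycle
        f_mod = 30e6
        true_phase = 4 * np.pi * f_mod * 2.5 / C
        k = np.arange(8)
        beat = 1.0 + 0.5 * np.cos(2 * np.pi * k / 8 - true_phase)
        print(range_from_samples(beat, f_mod))        # ~2.5 (within the ambiguity range c / (2 f_mod))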

  4. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form

    PubMed Central

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-01-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride (Rf value of 0.55±0.02) and pantoprazole sodium (Rf value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance–absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9988±0.0012 in the concentration range of 100–400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R2=0.9990±0.0008 in the concentration range of 200–1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method. PMID:29403710

  5. Stability indicating high performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in combined dosage form.

    PubMed

    Bageshwar, Deepak; Khanvilkar, Vineeta; Kadam, Vilasrao

    2011-11-01

    A specific, precise and stability indicating high-performance thin-layer chromatographic method for simultaneous estimation of pantoprazole sodium and itopride hydrochloride in pharmaceutical formulations was developed and validated. The method employed TLC aluminium plates precoated with silica gel 60F 254 as the stationary phase. The solvent system consisted of methanol:water:ammonium acetate; 4.0:1.0:0.5 (v/v/v). This system was found to give compact and dense spots for both itopride hydrochloride ( R f value of 0.55±0.02) and pantoprazole sodium ( R f value of 0.85±0.04). Densitometric analysis of both drugs was carried out in the reflectance-absorbance mode at 289 nm. The linear regression analysis data for the calibration plots showed a good linear relationship with R 2 =0.9988±0.0012 in the concentration range of 100-400 ng for pantoprazole sodium. Also, the linear regression analysis data for the calibration plots showed a good linear relationship with R 2 =0.9990±0.0008 in the concentration range of 200-1200 ng for itopride hydrochloride. The method was validated for specificity, precision, robustness and recovery. Statistical analysis proves that the method is repeatable and selective for the estimation of both the said drugs. As the method could effectively separate the drug from its degradation products, it can be employed as a stability indicating method.

  6. Characterization and Uncertainty Analysis of a Reference Pressure Measurement System for Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley

    2004-01-01

    This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environmental effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated from the initial reference pressure standard to the calibrated instrument serving as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.
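
    A minimal sketch of one common way to combine the two uncertainty classes mentioned above, assuming the classic bias/precision convention used in wind-tunnel work: a bias limit B and a precision limit t95*S/sqrt(n) for the mean of n repeats are root-sum-squared into an approximately 95% uncertainty. The numbers are illustrative, and the exact recipe (coverage factors, degrees of freedom, correlated bias terms) differs between standards and facilities.

        import numpy as np
        from scipy import stats

        def total_uncertainty_95(bias_limit, sample_std, n_repeats):
            """~95% combined uncertainty for the mean of n repeated readings:
            U = sqrt(B^2 + (t95 * S / sqrt(n))^2). A simplified textbook recipe,
            not the LaRC procedure itself."""
            t95 = stats.t.ppf(0.975, df=n_repeats - 1)
            precision_limit = t95 * sample_std / np.sqrt(n_repeats)
            return np.hypot(bias_limit, precision_limit)

        # Illustrative numbers (pressure units arbitrary), not LaRC calibration data
        print(total_uncertainty_95(bias_limit=0.002, sample_std=0.0015, n_repeats=10))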

  7. Identification of ground motion features for high-tech facility under far field seismic waves using wavelet packet transform

    NASA Astrophysics Data System (ADS)

    Huang, Shieh-Kung; Loh, Chin-Hsiung; Chen, Chin-Tsun

    2016-04-01

    Seismic records from large-magnitude, distant earthquakes may contain long-period seismic waves that have small amplitude but dominant periods of up to 10 s. In general, such long-period waves do not endanger the safety of structural systems or cause discomfort to people. For these far-distant earthquakes, however, this type of wave may cause glitches in, or even breakdown of, important equipment and facilities (such as the high-precision facilities in a high-tech fab) and eventually harm the company's interests if the amplitude becomes significant. A previous study showed that ground motion features such as time-variant dominant frequencies extracted using moving window singular spectrum analysis (MWSSA), together with amplitude characteristics of long-period waves identified from the slope change of the ground motion Arias intensity, can efficiently indicate the damage severity to high-precision facilities. However, embedding a large Hankel matrix to extract long-period seismic waves makes MWSSA a time-consuming process. In this study, seismic ground motion data collected from the broadband seismometer network located in Taiwan were used (with epicenter distances over 1000 km). To monitor significant long-period waves, the low-frequency components of these records are extracted using the wavelet packet transform (WPT), and the wavelet entropy of the coefficients is used to identify the amplitude characteristics of the long-period waves. The proposed method is time-saving compared to MWSSA and can easily be implemented for real-time detection. The method is compared and discussed across different seismic events in relation to the damage severity to high-precision facilities in high-tech fabs.
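
    A compact sketch of the coefficient-entropy idea using PyWavelets: decompose the record into wavelet-packet nodes, form the relative energy distribution across nodes, and take its Shannon entropy. When a strong long-period wave dominates, energy concentrates in a few low-frequency nodes and the entropy drops. The wavelet, level, sampling rate and synthetic record are assumptions for illustration, not the study's detector settings.

        import numpy as np
        import pywt

        def wavelet_packet_entropy(signal, wavelet="db4", level=6):
            """Shannon entropy of the relative energies of the level-`level`
            wavelet-packet nodes (a sketch of the idea, not the exact detector)."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, mode="symmetric", maxlevel=level)
            energies = np.array([np.sum(np.square(node.data))
                                 for node in wp.get_level(level, order="freq")])
            p = energies / energies.sum()
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        # Synthetic record: broadband noise plus a 0.05 Hz (20 s period) wave
        fs = 20.0                                    # assumed sampling rate, Hz
        t = np.arange(0.0, 600.0, 1.0 / fs)
        rng = np.random.default_rng(1)
        noise_only = 0.1 * rng.standard_normal(t.size)
        with_long_period = noise_only + np.sin(2 * np.pi * 0.05 * t)
        print(wavelet_packet_entropy(noise_only), wavelet_packet_entropy(with_long_period))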

  8. Multi-fractal detrended texture feature for brain tumor classification

    NASA Astrophysics Data System (ADS)

    Reza, Syed M. S.; Mays, Randall; Iftekharuddin, Khan M.

    2015-03-01

    We propose a novel non-invasive brain tumor type classification method using Multi-fractal Detrended Fluctuation Analysis (MFDFA) [1] in structural magnetic resonance (MR) images. This preliminary work investigates the efficacy of the MFDFA features, along with our novel texture feature known as multifractional Brownian motion (mBm) [2], in classifying (grading) brain tumors as High Grade (HG) and Low Grade (LG). Based on prior performance, Random Forest (RF) [3] is employed for tumor grading using two different datasets, BRATS-2013 [4] and BRATS-2014 [5]. Quantitative scores such as precision, recall and accuracy are obtained from the confusion matrix. On average, 90% precision and 85% recall from the inter-dataset cross-validation confirm the efficacy of the proposed method.
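
    As a sketch of the grading step (not the authors' pipeline), the snippet below trains a scikit-learn random forest on a stand-in feature matrix, where in practice each row would hold MFDFA/mBm texture features for one tumor, and reports cross-validated precision, recall and accuracy, the same scores quoted above. All data here are synthetic placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import accuracy_score, precision_score, recall_score

        # Stand-in data: rows = tumors, columns = texture features (e.g., MFDFA, mBm);
        # y encodes the grade, 1 = high grade (HG), 0 = low grade (LG).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(120, 20))
        y = (X[:, 0] + 0.5 * rng.normal(size=120) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        y_pred = cross_val_predict(clf, X, y, cv=5)          # 5-fold cross-validated predictions

        print("precision:", precision_score(y, y_pred))
        print("recall:   ", recall_score(y, y_pred))
        print("accuracy: ", accuracy_score(y, y_pred))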

  9. A low-frequency vibration insensitive pendulum bench based on translation-tilt compensation in measuring the performances of inertial sensors

    NASA Astrophysics Data System (ADS)

    Liu, L.; Ye, X.; Wu, S. C.; Bai, Y. Z.; Zhou, Z. B.

    2015-10-01

    The performance test of precision space inertial sensors on the ground is inevitably affected by seismic noise. A traditional vibration isolation platform, generally with a resonance frequency of several Hz, cannot satisfy the requirements for testing an inertial sensor at low frequencies. In this paper, we present a pendulum bench for inertial sensor testing based on translation-tilt compensation. A theoretical analysis indicates that the seismic noise effect on inertial sensors located on this bench can be attenuated by more than 40 dB below 0.1 Hz, which is very significant for investigating the performance of high-precision inertial sensors. We demonstrate this attenuation with a dedicated experiment.

  10. Robust one-Tube Ω-PCR Strategy Accelerates Precise Sequence Modification of Plasmids for Functional Genomics

    PubMed Central

    Chen, Letian; Wang, Fengpin; Wang, Xiaoyu; Liu, Yao-Guang

    2013-01-01

    Functional genomics requires vector construction for protein expression and functional characterization of target genes; therefore, a simple, flexible and low-cost molecular manipulation strategy will be highly advantageous for genomics approaches. Here, we describe a Ω-PCR strategy that enables multiple types of sequence modification, including precise insertion, deletion and substitution, in any position of a circular plasmid. Ω-PCR is based on an overlap extension site-directed mutagenesis technique, and is named for its characteristic Ω-shaped secondary structure during PCR. Ω-PCR can be performed either in two steps, or in one tube in combination with exonuclease I treatment. These strategies have wide applications for protein engineering, gene function analysis and in vitro gene splicing. PMID:23335613

  11. MOLA: The Future of Mars Global Cartography

    NASA Technical Reports Server (NTRS)

    Duxbury, T. C.; Smith, D. E.; Zuber, M. T.; Frey, H. V.; Garvin, J. B.; Head, J. W.; Muhleman, D. O.; Pettengill, G. H.; Phillips, R. J.; Solomon, S. C.

    1999-01-01

    The MGS Orbiter is carrying the high-precision Mars Orbiter Laser Altimeter (MOLA) which, when combined with precision reconstructed orbital data and telemetered attitude data, provides a tie between inertial space and Mars-fixed coordinates to an accuracy of 100 m in latitude / longitude and 10 m in radius (1 sigma), orders of magnitude more accurate than previous global geodetic/ cartographic control data. Over the 2 year MGS mission lifetime, it is expected that over 30,000 MOLA Global Cartographic Control Points will be produced to form the basis for new and re-derived map and geodetic products, key to the analysis of existing and evolving MGS data as well as future Mars exploration. Additional information is contained in the original extended abstract.

  12. Improvement of the polarized neutron interferometer setup demonstrating violation of a Bell-like inequality.

    PubMed

    Geppert, H; Denkmayr, T; Sponar, S; Lemmel, H; Hasegawa, Y

    2014-11-01

    For precise measurements with polarised neutrons, highly efficient spin manipulation is required. We developed several neutron optical elements suitable for a new, sophisticated setup, i.e., DC spin-turners and Larmor accelerators, which diminish thermal disturbances and depolarisation considerably. The gain in performance is exploited to demonstrate the violation of a Bell-like inequality for a spin-path entangled single-neutron state. The obtained value of [Formula: see text], which is much higher than in previous measurements by neutron interferometry, is [Formula: see text] above the limit of S = 2 predicted by contextual hidden variable theories. The new setup is more flexible with respect to state preparation and analysis; therefore, new, more precise measurements can be carried out.
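
    For orientation, a Bell-CHSH-type sum of spin-path correlation coefficients of the kind evaluated in such single-neutron experiments is sketched below; the specific analyzer angles α and path phases χ used in this measurement are not reproduced here, so this is a generic form rather than the authors' exact expression.

    ```latex
    % CHSH-type combination of correlation coefficients E(alpha_i, chi_j) between
    % the spin observable (analysis direction alpha) and the path observable (relative phase chi)
    \[
      S = E(\alpha_1, \chi_1) + E(\alpha_1, \chi_2) - E(\alpha_2, \chi_1) + E(\alpha_2, \chi_2),
      \qquad |S| \le 2 \ \text{(hidden-variable bound)},
      \qquad |S| \le 2\sqrt{2} \ \text{(quantum bound)}.
    \]
    ```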

  13. Reliable positioning in a sparse GPS network, eastern Ontario

    NASA Astrophysics Data System (ADS)

    Samadi Alinia, H.; Tiampo, K.; Atkinson, G. M.

    2013-12-01

    Canada hosts two regions that are prone to large earthquakes: western British Columbia and the St. Lawrence River region in eastern Canada. Although eastern Ontario is not as seismically active as other areas of eastern Canada, such as the Charlevoix/Ottawa Valley seismic zone, it experiences ongoing moderate seismicity. In historic times, potentially damaging events have occurred in New York State (Attica, 1929, M=5.7; Plattsburg, 2002, M=5.0), north-central Ontario (Temiskaming, 1935, M=6.2; North Bay, 2000, M=5.0), eastern Ontario (Cornwall, 1944, M=5.8), Georgian Bay (2005, MN=4.3), and western Quebec (Val-Des-Bois, 2010, M=5.0, MN=5.8). In eastern Canada, the analysis of detailed, high-precision measurements of surface deformation is a key component in our efforts to better characterize the associated seismic hazard. Data from precise, continuous GPS stations are necessary to adequately characterize surface velocities, from which patterns and rates of stress accumulation on faults can be estimated (Mazzotti and Adams, 2005; Mazzotti et al., 2005). Monitoring these displacements requires high-accuracy GPS positioning techniques. Detailed strain measurements can determine whether the regional strain everywhere is commensurate with a large event occurring every few hundred years anywhere within this general area, or whether large earthquakes are limited to specific areas (Adams and Halchuck, 2003; Mazzotti and Adams, 2005). In many parts of southeastern Ontario and western Québec, GPS stations are distributed quite sparsely, with spacings of approximately 100 km or more. The challenge is to provide accurate solutions for these sparse networks with an approach that is capable of achieving high-accuracy positioning. Here, various reduction techniques are applied to a sparse network installed with the Southern Ontario Seismic Network in eastern Ontario. Recent developments include the implementation of precise point positioning processing on the acquired raw GPS data, based on precise GPS orbit and clock data products with centimeter accuracy computed beforehand. The analysis of 1 Hz GPS data is conducted in order to identify the most reliable regional network from eight stations (STCO, TYNO, ACTO, INUQ, IVKQ, KLBO, MATQ and ALGO) that cover the study area in eastern Ontario. The estimated parameters are the total numbers of ambiguities and resolved ambiguities, the a posteriori rms of each baseline, and the coordinates of each station together with their differences from the known coordinates. The positioning accuracy, the corrections and the accuracy of interpolated corrections, and the initialization time required for precise positioning are presented for the various applications.
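
    As a minimal sketch of the final comparison step mentioned above (estimated station coordinates versus known coordinates), the Python snippet below computes per-station coordinate differences and an overall rms; the coordinate values are placeholders, not results from this network.

    ```python
    import numpy as np

    # Hypothetical estimated vs. known ECEF coordinates (metres) for three of the stations
    estimated = {"STCO": np.array([918129.312, -4346071.255, 4561977.860]),
                 "TYNO": np.array([812345.101, -4401234.567, 4521111.222]),
                 "ALGO": np.array([918129.402, -4346071.180, 4561977.915])}
    known     = {"STCO": np.array([918129.305, -4346071.250, 4561977.852]),
                 "TYNO": np.array([812345.095, -4401234.571, 4521111.230]),
                 "ALGO": np.array([918129.395, -4346071.188, 4561977.909])}

    diffs = {name: estimated[name] - known[name] for name in estimated}
    for name, d in diffs.items():
        print(f"{name}: dX={d[0]*100:+.1f} cm  dY={d[1]*100:+.1f} cm  dZ={d[2]*100:+.1f} cm")

    # Overall 3D rms over all stations and components
    rms = np.sqrt(np.mean(np.concatenate(list(diffs.values())) ** 2))
    print(f"3D rms: {rms*100:.1f} cm")
    ```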

  14. High-precision relative position and attitude measurement for on-orbit maintenance of spacecraft

    NASA Astrophysics Data System (ADS)

    Zhu, Bing; Chen, Feng; Li, Dongdong; Wang, Ying

    2018-02-01

    In order to realize long-term on-orbit operation of spacecraft such as satellites and space stations, in addition to designing devices for long lifetimes, the life of a spacecraft can also be extended by on-orbit servicing and maintenance. It is therefore necessary to carry out precise and detailed maintenance of key components. In this paper, a high-precision relative position and attitude measurement method for use in the maintenance of key components is presented. This method mainly considers the design of the passive cooperative marker, the light-emitting device and the high-resolution camera in the presence of spatial stray light and noise. By using a series of algorithms, such as background elimination, feature extraction and position and attitude calculation, the high-precision relative pose parameters between the key operational parts and the maintenance equipment, which serve as input to the control system, are obtained. The simulation results show that the algorithm is accurate and effective, satisfying the requirements of the precision operation technique.
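
    A minimal sketch of the final pose-calculation step, assuming a calibrated camera and four known marker points, is given below using OpenCV's solvePnP; the marker geometry, camera intrinsics, and detected image coordinates are illustrative values, not the system described in the paper.

    ```python
    import numpy as np
    import cv2

    # Known 3D positions of the light-emitting marker points in the marker frame (metres)
    object_points = np.array([[-0.05, -0.05, 0.0],
                              [ 0.05, -0.05, 0.0],
                              [ 0.05,  0.05, 0.0],
                              [-0.05,  0.05, 0.0]], dtype=np.float64)

    # Pixel coordinates of the same points after background elimination and feature extraction
    image_points = np.array([[612.4, 388.1],
                             [702.9, 390.6],
                             [700.2, 481.3],
                             [610.0, 478.7]], dtype=np.float64)

    # Illustrative pinhole intrinsics with no lens distortion
    camera_matrix = np.array([[2500.0,    0.0, 640.0],
                              [   0.0, 2500.0, 512.0],
                              [   0.0,    0.0,   1.0]])
    dist_coeffs = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    print("relative translation (m):", tvec.ravel())
    print("relative rotation matrix:\n", R)
    ```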

  15. Quantifying time in sedimentary successions by radio-isotopic dating of ash beds

    NASA Astrophysics Data System (ADS)

    Schaltegger, Urs

    2014-05-01

    Sedimentary rock sequences are an accurate record of geological, chemical and biological processes throughout the history of our planet. If we want to know more about the duration or the rates of some of these processes, we can apply methods of absolute age determination, i.e. radio-isotopic dating. Data of the highest precision and accuracy, and therefore of the highest degree of confidence, are obtained by chemical abrasion, isotope-dilution, thermal ionization mass spectrometry (CA-ID-TIMS) 238U-206Pb dating techniques, applied to magmatic zircon from ash beds that are interbedded with the sediments. This technique allows high-precision age estimates at the 0.1% uncertainty level for single analyses, and down to 0.03% uncertainty for groups of statistically equivalent 206Pb/238U dates. Such high precision is needed, since we would like the precision to be approximately equivalent to, or better than, the (interpolated) duration of ammonoid zones in the Mesozoic (e.g., Ovtcharova et al. 2006), or to match the short feedback rates of biological, climatic, or geochemical cycles after giant volcanic eruptions in large igneous provinces (LIPs), e.g., at the Permian/Triassic or the Triassic/Jurassic boundaries. We also wish to establish as precisely as possible the temporal coincidence between the sedimentary record and short-lived volcanic events within the LIPs.

    Precision and accuracy of the U-Pb data have to be traceable and quantifiable in absolute terms, achieved by direct reference to the international kilogram via an absolute calibration of the standard and isotopic tracer solutions. Only with perfect control of the precision and accuracy of radio-isotopic data can we confidently determine whether two ages of geological events are really different, and avoid mistaking interlaboratory or interchronometer biases for age differences. The development of unprecedented precision of CA-ID-TIMS 238U-206Pb dates led to the recognition of protracted growth of zircon in a magmatic liquid (see, e.g., Schoene et al. 2012), which then becomes transferred into volcanic ashes as excess dispersion of 238U-206Pb dates (see, e.g., Guex et al. 2012). Zircon crystallizes in the magmatic liquid shortly before the volcanic eruption; we therefore aim at finding the youngest zircon date, or the youngest statistically equivalent cluster of 238U-206Pb dates, as an approximation of the time of ash deposition (Wotzlaw et al. 2013). Time gaps between the last zircon crystallization and eruption ("Δt") may be as large as 100-200 ka, at the limits of analytical precision. Understanding the magmatic crystallization history of zircon is the fundamental background for interpreting ash bed dates in a sedimentary succession.

    Ash beds of different stratigraphic position and age may be generated within different magmatic systems, showing different crystallization histories. A sufficient number of samples (N) is therefore of paramount importance, so as not to lose the stratigraphic age control in a given section and to be able to discard samples with large Δt; but how large does N have to be? In order to use the youngest zircon or zircons as an approximation of the age of eruption and ash deposition, we need to be sure that we have quantitatively solved the problem of post-crystallization lead loss; but how can we be sure? Ash bed zircons are prone to partial loss of radiogenic lead, because the ashes have been flushed by volcanic gases, as well as by brines during sediment compaction.
    We therefore need to analyze a sufficient number of zircons (n) to be sure not to miss the youngest; but how large does n have to be? Analysis of trace elements or of oxygen and hafnium isotopic compositions in dated zircon may sometimes help to distinguish zircon that is in equilibrium with the last magmatic liquid from zircon that is recycled from earlier crystallization episodes, or to recognize zircon with partial lead loss (Schoene et al. 2010). Respecting these constraints, we may arrive at an accurate correlation of periods of global environmental and biotic disturbance (from ash bed analysis in biostratigraphically or cyclostratigraphically well constrained marine sections) with volcanic activity; examples are the Triassic-Jurassic boundary and the Central Atlantic Magmatic Province (Schoene et al. 2010), or the lower Toarcian oceanic anoxic event and the Karoo Province volcanism (Sell et al., in prep.). High-precision temporal correlations may also be obtained by combining high-precision U-Pb dating with biochronology in the Middle Triassic (Ovtcharova et al., in prep.), or by comparing U-Pb dates with astronomical timescales in the Upper Miocene (Wotzlaw et al., in prep.).

    References:
    Guex, J., Schoene, B., Bartolini, A., Spangenberg, J., Schaltegger, U., O'Dogherty, L., et al. (2012). Geochronological constraints on post-extinction recovery of the ammonoids and carbon cycle perturbations during the Early Jurassic. Palaeogeography, Palaeoclimatology, Palaeoecology, 346-347(C), 1-11.
    Ovtcharova, M., Bucher, H., Schaltegger, U., Galfetti, T., Brayard, A., & Guex, J. (2006). New Early to Middle Triassic U-Pb ages from South China: Calibration with ammonoid biochronozones and implications for the timing of the Triassic biotic recovery. Earth and Planetary Science Letters, 243(3-4), 463-475.
    Ovtcharova, M., Goudemand, N., Galfetti, Th., Guodun, K., Hammer, O., Schaltegger, U., & Bucher, H. Improving accuracy and precision of radio-isotopic and biochronological approaches in dating geological boundaries: The Early-Middle Triassic boundary case. In preparation.
    Schoene, B., Schaltegger, U., Brack, P., Latkoczy, C., Stracke, A., & Günther, D. (2012). Rates of magma differentiation and emplacement in a ballooning pluton recorded by U-Pb TIMS-TEA, Adamello batholith, Italy. Earth and Planetary Science Letters, 355-356, 162-173.
    Schoene, B., Latkoczy, C., Schaltegger, U., & Günther, D. (2010). A new method integrating high-precision U-Pb geochronology with zircon trace element analysis (U-Pb TIMS-TEA). Geochimica et Cosmochimica Acta, 74(24), 7144-7159.
    Schoene, B., Guex, J., Bartolini, A., Schaltegger, U., & Blackburn, T. J. (2010). Correlating the end-Triassic mass extinction and flood basalt volcanism at the 100 ka level. Geology, 38(5), 387-390.
    Sell, B., Ovtcharova, M., Guex, J., Jourdan, F., & Schaltegger, U. Evaluating the link between the Karoo LIP and climatic-biologic events of the Toarcian Stage with high-precision U-Pb geochronology. In preparation.
    Wotzlaw, J. F., Schaltegger, U., Frick, D. A., Dungan, M. A., Gerdes, A., & Günther, D. (2013). Tracking the evolution of large-volume silicic magma reservoirs from assembly to supereruption. Geology, 41(8), 867-870.
    Wotzlaw, J. F., Hüsing, S. K., Hilgen, F. J., & Schaltegger, U. Testing the gold standard of geochronology against astronomical time: High-precision U-Pb geochronology of orbitally tuned ash beds from the Mediterranean Miocene. In preparation.
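
    Returning to the question of a "youngest statistically equivalent cluster" of dates raised in the record above, the sketch below computes an inverse-variance weighted mean 206Pb/238U date and its MSWD for a hypothetical set of single-zircon dates; a cluster is commonly accepted when the MSWD is close to 1. The dates and uncertainties are invented for illustration and do not come from any of the studies cited above.

    ```python
    import numpy as np

    # Hypothetical single-zircon 206Pb/238U dates (Ma) with 2-sigma analytical uncertainties
    dates = np.array([201.36, 201.33, 201.31, 201.30, 201.29])
    two_sigma = np.array([0.06, 0.05, 0.05, 0.04, 0.05])
    sigma = two_sigma / 2.0

    # Inverse-variance weighted mean and its 2-sigma uncertainty
    weights = 1.0 / sigma**2
    wmean = np.sum(weights * dates) / np.sum(weights)
    wmean_2sigma = 2.0 / np.sqrt(np.sum(weights))

    # Mean square of weighted deviates: ~1 indicates a single population at the stated uncertainties
    mswd = np.sum(weights * (dates - wmean)**2) / (len(dates) - 1)

    print(f"weighted mean = {wmean:.3f} +/- {wmean_2sigma:.3f} Ma (2-sigma), MSWD = {mswd:.2f}")
    ```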

  16. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    PubMed

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
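
    As a sketch of the transform comparison described above (not the authors' implementation), the snippet below composes two illustrative 4x4 stem-to-bone rigid transforms from consecutive CT datasets and reports the relative translation and rotation parameters.

    ```python
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def rigid(rx_deg, ry_deg, rz_deg, t_mm):
        """Build a 4x4 homogeneous rigid transform from Euler angles (deg) and a translation (mm)."""
        T = np.eye(4)
        T[:3, :3] = R.from_euler("xyz", [rx_deg, ry_deg, rz_deg], degrees=True).as_matrix()
        T[:3, 3] = t_mm
        return T

    # Illustrative stem-to-bone transforms estimated from CT dataset 1 and CT dataset 2
    T1 = rigid(0.00,  0.00, 0.00, [0.00,  0.00, 0.00])
    T2 = rigid(0.05, -0.02, 0.10, [0.12, -0.03, 0.25])

    # Relative stem-bone displacement between the two time points
    T_rel = T2 @ np.linalg.inv(T1)
    translation_mm = T_rel[:3, 3]
    rotation_deg = R.from_matrix(T_rel[:3, :3]).as_euler("xyz", degrees=True)

    print("translation (mm):", np.round(translation_mm, 3))
    print("rotation (deg):  ", np.round(rotation_deg, 3))
    ```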

  17. Precision blackbody sources for radiometric standards.

    PubMed

    Sapritsky, V I; Khlevnoy, B B; Khromchenko, V B; Lisiansky, B E; Mekhontsev, S N; Melenevsky, U A; Morozova, S P; Prokhorov, A V; Samoilov, L N; Shapoval, V I; Sudarev, K A; Zelener, M F

    1997-08-01

    The precision blackbody sources developed at the All-Russian Institute for Optical and Physical Measurements (Moscow, Russia) and their characteristics are analyzed. The precision high-temperature graphite blackbody BB22p, large-area high-temperature pyrolytic graphite blackbody BB3200pg, middle-temperature graphite blackbody BB2000, low-temperature blackbody BB300, and gallium fixed-point blackbody BB29gl and their characteristics are described.

  18. A comparison of Boolean-based retrieval to the WAIS system for retrieval of aeronautical information

    NASA Technical Reports Server (NTRS)

    Marchionini, Gary; Barlow, Diane

    1994-01-01

    An evaluation was conducted comparing an information retrieval system that uses a Boolean-based retrieval engine with an inverted-file architecture against WAIS, which uses a vector-based engine. Four research questions in aeronautical engineering were used to retrieve sets of citations from the NASA Aerospace Database, which was mounted on a WAIS server and was also available through Dialog File 108, which served as the Boolean-based system (BBS). High-recall and high-precision searches were done in the BBS, and terse and verbose queries were used in the WAIS condition. Precision values for the WAIS searches were consistently above the precision values for high-recall BBS searches and consistently below the precision values for high-precision BBS searches. Terse WAIS queries gave somewhat better precision performance than verbose WAIS queries. In every case, a small number of relevant documents retrieved by one system were not retrieved by the other, indicating the incomplete nature of the results from either retrieval system. Relevant documents in the WAIS searches were found to be randomly distributed in the retrieved sets rather than distributed by rank. Advantages and limitations of both types of systems are discussed.

  19. High precision measurement of the proton charge radius: The PRad experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meziane, Mehdi

    2013-11-01

    The recent high-precision measurements of the proton charge radius performed at PSI using the muonic hydrogen Lamb shift have puzzled the hadronic physics community. A value of 0.8418 ± 0.0007 fm was extracted, which is 7σ smaller than the previous determinations obtained from electron-proton scattering experiments and from precision spectroscopy of electronic hydrogen. An additional extraction of the proton charge radius from electron scattering at Mainz is also in good agreement with these "electronic" determinations. An independent measurement of the proton charge radius from unpolarized elastic ep scattering using a magnetic-spectrometer-free method was proposed and fully approved at Jefferson Laboratory in June 2012. This novel technique uses the high-precision calorimeter HyCal and a windowless hydrogen gas target, which makes possible the extraction of the charge radius at very forward angles and thus very low momentum transfer Q², down to 10⁻⁴ (GeV/c)², with unprecedented sub-percent precision for this type of experiment. In this paper, the recent progress on proton charge radius extraction is reviewed and the new high-precision PRad experiment is presented.
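
    For context on why very low Q² matters, the proton rms charge radius is conventionally defined through the slope of the electric form factor at Q² = 0, so data approaching Q² → 0 constrain the radius most directly; this is the standard relation, not a formula specific to the PRad analysis.

    ```latex
    % rms charge radius from the slope of the electric form factor at Q^2 = 0
    \[
      \langle r_p^2 \rangle = -6 \left. \frac{\mathrm{d}G_E(Q^2)}{\mathrm{d}Q^2} \right|_{Q^2 = 0},
      \qquad
      G_E(Q^2) \simeq 1 - \frac{\langle r_p^2 \rangle}{6}\, Q^2 + \mathcal{O}(Q^4).
    \]
    ```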

  20. Analytical precision of the Urolizer for the determination of the BONN-Risk-Index (BRI) for calcium oxalate urolithiasis and evaluation of the influence of 24-h urine storage at moderate temperatures on BRI.

    PubMed

    Berg, Wolfgang; Bechler, Robin; Laube, Norbert

    2009-01-01

    Since its first publication in 2000, the BONN-Risk-Index (BRI) has been successfully used to determine the calcium oxalate (CaOx) crystallization risk from urine samples. To date, a BRI-measuring device, the "Urolizer", has been developed, which operates automatically and requires only a minimum of preparation. Two major objectives were pursued: determination of the Urolizer's precision, and determination of the influence of 24-h urine storage at moderate temperatures on BRI. 24-h urine samples from 52 CaOx stone-formers were collected. A total of 37 urine samples were used to investigate Urolizer precision by performing six independent BRI determinations in series. In total, 30 samples were taken for an additional investigation of urine storability. Each sample was measured three times: directly after collection, after 24-h storage at T = 21 °C, and after 24-h cooling at T = 4 °C. Outcomes were statistically tested for identity with regard to the immediately obtained results. Repeat measurements for evaluation of Urolizer precision revealed statistical identity of the data (p > 0.05). 24-h storage of urine at both tested temperatures did not significantly affect BRI (p > 0.05). The pilot-run Urolizer shows high analytical reliability. The innovative analysis device may be especially suited for urologists specializing in urolithiasis treatment. The possibility of urine storage at moderate temperatures without loss of analysis quality further demonstrates the applicability of the BRI method.
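
    As a generic illustration of quantifying analytical precision from replicate runs (the identity test used in the study is not reproduced here), the sketch below computes the mean, standard deviation, and coefficient of variation of six hypothetical repeat BRI determinations on one sample.

    ```python
    import numpy as np

    # Six hypothetical repeat BRI determinations (arbitrary units) on a single urine sample
    replicates = np.array([1.02, 0.98, 1.01, 1.00, 0.99, 1.03])

    mean = replicates.mean()
    sd = replicates.std(ddof=1)      # sample standard deviation
    cv_percent = 100.0 * sd / mean   # coefficient of variation

    print(f"mean = {mean:.3f}, SD = {sd:.3f}, CV = {cv_percent:.1f}%")
    ```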
