Sample records for "offers improved accuracy"

  1. Effects of the Presence of Audio and Type of Game Controller on Learning of Rhythmic Accuracy

    ERIC Educational Resources Information Center

    Thomas, James William

    2017-01-01

    "Guitar Hero III" and similar games potentially offer a vehicle for improvement of musical rhythmic accuracy with training delivered in both visual and auditory formats and by use of its novel guitar-shaped interface; however, some theories regarding multimedia learning suggest sound is a possible source of extraneous cognitive load…

  2. The use of computerized image guidance in lumbar disk arthroplasty.

    PubMed

    Smith, Harvey E; Vaccaro, Alexander R; Yuan, Philip S; Papadopoulos, Stephen; Sasso, Rick

    2006-02-01

    Surgical navigation systems have been increasingly studied and applied in spinal instrumentation. Successful disk arthroplasty requires accurate midline and rotational positioning for optimal function and longevity. A surgical simulation study in human cadaver specimens was done to evaluate and compare the accuracy of standard fluoroscopy, computer-assisted fluoroscopic image guidance, and Iso-C3D image guidance in the placement of lumbar intervertebral disk replacements. Lumbar intervertebral disk prostheses were placed using three different image guidance techniques in three human cadaver spine specimens at multiple levels. Postinstrumentation accuracy was assessed with thin-cut computed tomography scans. Intervertebral disk replacements placed using the StealthStation with Iso-C3D were more accurately centered than those placed using the StealthStation with FluoroNav and standard fluoroscopy. Intervertebral disk replacements placed with Iso-C3D and FluoroNav had improved rotational divergence compared with standard fluoroscopy. Iso-C3D and FluoroNav had a smaller interprocedure variance than standard fluoroscopy. These results did not approach statistical significance. Relative to both virtual and standard fluoroscopy, use of the StealthStation with Iso-C3D resulted in improved accuracy in centering the lumbar disk prosthesis in the coronal midline. The StealthStation with FluoroNav appears to be at least equivalent to standard fluoroscopy and may offer improved accuracy in rotational alignment while minimizing radiation exposure to the surgeon. Surgical guidance systems may offer improved accuracy and less interprocedure variation in the placement of intervertebral disk replacements than standard fluoroscopy. Further study of surgical navigation systems for intervertebral disk replacement is warranted.

  3. Algorithms For Integrating Nonlinear Differential Equations

    NASA Technical Reports Server (NTRS)

    Freed, A. D.; Walker, K. P.

    1994-01-01

    Improved algorithms were developed for use in the numerical integration of systems of nonhomogeneous, nonlinear, first-order ordinary differential equations. In comparison with previous integration algorithms, these algorithms offer greater stability and accuracy. Several are asymptotically correct, thereby enabling retention of stability and accuracy when large increments of the independent variable are used. The attainable accuracies are demonstrated by applying the algorithms to systems of nonlinear, first-order differential equations that arise in the study of viscoplastic behavior, the spread of the acquired immune-deficiency syndrome (AIDS) virus, and predator/prey populations.

  4. Effects of accuracy motivation and anchoring on metacomprehension judgment and accuracy.

    PubMed

    Zhao, Qin

    2012-01-01

    The current research investigates how accuracy motivation impacts anchoring and adjustment in metacomprehension judgment, and how accuracy motivation and anchoring affect metacomprehension accuracy. Participants were randomly assigned to one of six conditions produced by a between-subjects factorial design involving accuracy motivation (incentive or none) and peer performance anchor (95%, 55%, or none). Two studies showed that accuracy motivation did not impact anchoring bias, but the adjustment-from-anchor process did occur. The accuracy incentive increased the anchor-judgment gap for the 95% anchor but not for the 55% anchor, which induced less certainty about the direction of adjustment. The findings offer support for the integrative theory of anchoring. Additionally, the two studies revealed a "power struggle" between accuracy motivation and anchoring in influencing metacomprehension accuracy. Accuracy motivation can improve metacomprehension accuracy in spite of the anchoring effect, but if the anchoring effect is too strong, it can overpower the motivation effect. The implications of these findings are discussed.

  5. MUSIC APPRECIATION AND TRAINING FOR COCHLEAR IMPLANT RECIPIENTS: A REVIEW

    PubMed Central

    Looi, Valerie; Gfeller, Kate; Driscoll, Virginia

    2012-01-01

    In recent years, there has been increasing interest in the music perception of cochlear implant (CI) recipients, and a growing body of research conducted in this area. The majority of these studies have examined perceptual accuracy for pitch, rhythm, and timbre. Another important, but less commonly studied, aspect of music listening is appreciation, or appraisal. Despite ongoing research into potential technological improvements that may improve music perception for recipients, both perceptual accuracy and appreciation generally remain poor for most recipients. Whilst perceptual accuracy for music is important, appreciation and enjoyment also warrant research, as they too contribute to clinical outcomes and perceived benefits. Music training has been shown to offer excellent potential for improving music perception and appreciation for recipients. Therefore, the primary topics of this review are music appreciation and training. However, a brief overview of the psychoacoustic, technical, and physiological factors associated with a recipient's perception of music is provided, as these are important factors in understanding the listening experience for CI recipients. The purpose of this review is to summarize key papers that have investigated these issues, in order to demonstrate that i) music enjoyment and appraisal is an important and valid consideration in evaluating music outcomes for recipients, and ii) music training can improve music listening for many recipients, and is something that can be offered to persons using current technology. PMID:23459244

  6. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    PubMed

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review targeting ICD-9 coding accuracy of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, in which primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem of ICD-9 coding accuracy by physicians and offers an approach to effectively address this shortcoming. Copyright © 2015. Published by Elsevier Inc.

  7. A new Euler scheme based on harmonic-polygon approach for solving first order ordinary differential equation

    NASA Astrophysics Data System (ADS)

    Yusop, Nurhafizah Moziyana Mohd; Hasan, Mohammad Khatim; Wook, Muslihah; Amran, Mohd Fahmi Mohamad; Ahmad, Siti Rohaidah

    2017-10-01

    There are many benefits to improving the Euler scheme for solving ordinary differential equation problems, among them simple implementation and low computational cost. However, the limited accuracy of the Euler scheme often pushes researchers toward more complex methods. The main purpose of this research is therefore to construct a new modified Euler scheme that improves the accuracy of the Polygon scheme across various step sizes. The new scheme combines the Polygon scheme with the harmonic mean concept and is called the Harmonic-Polygon scheme. This Harmonic-Polygon scheme can provide advantages beyond what the Euler scheme offers for solving ordinary differential equation problems. Four sets of problems are solved via the Harmonic-Polygon scheme. Findings show that the new scheme produces much more accurate results.
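
    A minimal sketch of the idea, assuming the scheme replaces the arithmetic mean of the two slope evaluations in the classical polygon (Heun) scheme with their harmonic mean; the function names and the test problem are illustrative, not taken from the paper:

      import numpy as np

      def harmonic_polygon_step(f, x, y, h):
          # Two slope evaluations, as in the classical polygon (Heun) scheme.
          k1 = f(x, y)
          k2 = f(x + h, y + h * k1)
          # Harmonic mean of the slopes; fall back to plain Euler if k1 + k2 == 0.
          denom = k1 + k2
          slope = 2.0 * k1 * k2 / denom if denom != 0 else k1
          return y + h * slope

      # Test problem: y' = -2y, y(0) = 1, exact solution exp(-2x).
      f = lambda x, y: -2.0 * y
      x, y, h = 0.0, 1.0, 0.1
      for _ in range(10):
          y = harmonic_polygon_step(f, x, y, h)
          x += h
      print(y, np.exp(-2.0 * x))  # numerical vs. exact value at x = 1.0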

  8. Voxel-Wise Time-Series Analysis of Quantitative MRI in Relapsing-Remitting MS: Dynamic Imaging Metrics of Disease Activity Including Pre-Lesional Changes

    DTIC Science & Technology

    2015-12-01

    other parameters match the previous simulation. A third simulation was performed to evaluate the effect of gradient and RF spoiling on the accuracy of... this increase also offers an opportunity to increase the length of the spoiler gradient and improve the accuracy of FA quantification (27). To... Relaxation. Pouria Mossahebi, Vasily L. Yarnykh, and Alexey Samsonov. Purpose: Cross-relaxation imaging (CRI) is a family of quantitative

  9. Accuracy Improvement in Magnetic Field Modeling for an Axisymmetric Electromagnet

    NASA Technical Reports Server (NTRS)

    Ilin, Andrew V.; Chang-Diaz, Franklin R.; Gurieva, Yana L.; Il'in, Valery P.

    2000-01-01

    This paper examines the accuracy and calculation speed of the magnetic field computation in an axisymmetric electromagnet. Different numerical techniques, based on an adaptive nonuniform grid, high-order finite difference approximations, and semi-analytical calculation of boundary conditions, are considered. These techniques are being applied to the modeling of the Variable Specific Impulse Magnetoplasma Rocket. For high-accuracy calculations, a fourth-order scheme offers dramatic advantages over a second-order scheme. For complex physical configurations of interest in plasma propulsion, a second-order scheme with nonuniform mesh gives the best results. Also, the relative advantages of various methods are described when the speed of computation is an important consideration.
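
    The accuracy gap between scheme orders is easy to demonstrate. A small stand-alone illustration (not the paper's field solver) comparing second- and fourth-order central differences on a smooth function:

      import numpy as np

      f, x, h = np.sin, 1.0, 1e-2
      d2 = (f(x + h) - f(x - h)) / (2 * h)  # second-order central difference
      d4 = (-f(x + 2 * h) + 8 * f(x + h)
            - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)  # fourth-order central difference
      exact = np.cos(x)
      # The fourth-order error is several orders of magnitude smaller.
      print(abs(d2 - exact), abs(d4 - exact))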

  10. Accounting for speed-accuracy tradeoff in perceptual learning

    PubMed Central

    Liu, Charles C.; Watanabe, Takeo

    2011-01-01

    In the perceptual learning (PL) literature, researchers typically focus on improvements in accuracy, such as d’. In contrast, researchers who investigate the practice of cognitive skills focus on improvements in response times (RT). Here, we argue for the importance of accounting for both accuracy and RT in PL experiments, due to the phenomenon of speed-accuracy tradeoff (SAT): at a given level of discriminability, faster responses tend to produce more errors. A formal model of the decision process, such as the diffusion model, can explain the SAT. In this model, a parameter known as the drift rate represents the perceptual strength of the stimulus, where higher drift rates lead to more accurate and faster responses. We applied the diffusion model to analyze responses from a yes-no coherent motion detection task. The results indicate that observers do not use a fixed threshold for evidence accumulation, so changes in the observed accuracy may not provide the most appropriate estimate of learning. Instead, our results suggest that SAT can be accounted for by a modeling approach, and that drift rates offer a promising index of PL. PMID:21958757
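
    The tradeoff is easy to reproduce in simulation: holding the drift rate (perceptual strength) fixed while lowering the decision threshold yields faster but less accurate responses. A sketch with illustrative parameter values, not fitted to the study's data:

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_diffusion(drift, threshold, n_trials=1000, dt=0.001, noise=1.0):
          """Simulate a symmetric drift-diffusion process; return accuracy and mean RT."""
          n_correct, rts = 0, []
          for _ in range(n_trials):
              x, t = 0.0, 0.0
              while abs(x) < threshold:
                  x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                  t += dt
              n_correct += x >= threshold  # upper boundary = correct response
              rts.append(t)
          return n_correct / n_trials, np.mean(rts)

      # Same drift rate, two decision thresholds: speed is traded for accuracy.
      for a in (0.5, 1.5):
          acc, rt = simulate_diffusion(drift=1.0, threshold=a)
          print(f"threshold={a}: accuracy={acc:.3f}, mean RT={rt:.3f} s")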

  11. On the Potential of a New Generation of Magnetometers for MEG: A Beamformer Simulation Study

    PubMed Central

    Boto, Elena; Bowtell, Richard; Krüger, Peter; Fromhold, T. Mark; Morris, Peter G.; Meyer, Sofie S.; Barnes, Gareth R.; Brookes, Matthew J.

    2016-01-01

    Magnetoencephalography (MEG) is a sophisticated tool which yields rich information on the spatial, spectral and temporal signatures of human brain function. Despite unique potential, MEG is limited by a low signal-to-noise ratio (SNR) which is caused by both the inherently small magnetic fields generated by the brain, and the scalp-to-sensor distance. The latter is limited in current systems due to a requirement for pickup coils to be cryogenically cooled. Recent work suggests that optically-pumped magnetometers (OPMs) might be a viable alternative to superconducting detectors for MEG measurement. They have the advantage that sensors can be brought to within ~4 mm of the scalp, thus offering increased sensitivity. Here, using simulations, we quantify the advantages of hypothetical OPM systems in terms of sensitivity, reconstruction accuracy and spatial resolution. Our results show that a multi-channel whole-head OPM system offers (on average) a fivefold improvement in sensitivity for an adult brain, as well as clear improvements in reconstruction accuracy and spatial resolution. However, we also show that such improvements depend critically on accurate forward models; indeed, the reconstruction accuracy of our simulated OPM system only outperformed that of a simulated superconducting system in cases where forward field error was less than 5%. Overall, our results imply that the realisation of a viable whole-head multi-channel OPM system could generate a step change in the utility of MEG as a means to assess brain electrophysiological activity in health and disease. However in practice, this will require both improved hardware and modelling algorithms. PMID:27564416

  12. Improved Forecasting Methods for Naval Manpower Studies

    DTIC Science & Technology

    2015-03-25

    Using monthly data is likely to improve the overall fit of the models and the accuracy of the BP test. A measure of unemployment to control for... measure of the relative goodness of fit of a statistical model. It is grounded in the concept of information entropy, in effect offering a relative... the Kullback–Leibler divergence, D_KL(f, g1); similarly, the information lost from using g2 to

  13. A Global Optimization Methodology for Rocket Propulsion Applications

    NASA Technical Reports Server (NTRS)

    2001-01-01

    While the response surface method is an effective method in engineering optimization, its accuracy is often affected by the use of a limited number of data points for model construction. In this chapter, the issues related to the accuracy of the RS approximations and possible ways of improving the RS model using appropriate treatments, including the iteratively re-weighted least squares (IRLS) technique and radial-basis neural networks, are investigated. A main interest is to identify ways of adding capabilities to the RS method so that it can selectively improve accuracy in regions of importance. An example is to target the high-efficiency region of a fluid machinery design space so that the predictive power of the RS can be maximized where it matters most. Analytical models based on polynomials, with controlled levels of noise, are used to assess the performance of these techniques.

  14. Optical Coherence Tomography in Glaucoma

    NASA Astrophysics Data System (ADS)

    Berisha, Fatmire; Hoffmann, Esther M.; Pfeiffer, Norbert

    Retinal nerve fiber layer (RNFL) thinning and optic nerve head cupping are key diagnostic features of glaucomatous optic neuropathy. The higher resolution of the recently introduced SD-OCT offers enhanced visualization and improved segmentation of the retinal layers, providing a higher accuracy in identification of subtle changes of the optic disc and RNFL thinning associated with glaucoma.

  15. Using Keystroke Analytics to Improve Pass-Fail Classifiers

    ERIC Educational Resources Information Center

    Casey, Kevin

    2017-01-01

    Learning analytics offers insights into student behaviour and the potential to detect poor performers before they fail exams. If the activity is primarily online (for example computer programming), a wealth of low-level data can be made available that allows unprecedented accuracy in predicting which students will pass or fail. In this paper, we…

  16. Integrative Approaches for Predicting in vivo Effects of Chemicals from their Structural Descriptors and the Results of Short-term Biological Assays

    PubMed Central

    Low, Yen S.; Sedykh, Alexander; Rusyn, Ivan; Tropsha, Alexander

    2017-01-01

    Cheminformatics approaches such as Quantitative Structure Activity Relationship (QSAR) modeling have been used traditionally for predicting chemical toxicity. In recent years, high throughput biological assays have been increasingly employed to elucidate mechanisms of chemical toxicity and predict toxic effects of chemicals in vivo. The data generated in such assays can be considered as biological descriptors of chemicals that can be combined with molecular descriptors and employed in QSAR modeling to improve the accuracy of toxicity prediction. In this review, we discuss several approaches for integrating chemical and biological data for predicting biological effects of chemicals in vivo and compare their performance across several data sets. We conclude that while no method consistently shows superior performance, the integrative approaches rank consistently among the best yet offer enriched interpretation of models over those built with either chemical or biological data alone. We discuss the outlook for such interdisciplinary methods and offer recommendations to further improve the accuracy and interpretability of computational models that predict chemical toxicity. PMID:24805064

  17. Improved fibrosis staging by elastometry and blood test in chronic hepatitis C.

    PubMed

    Calès, Paul; Boursier, Jérôme; Ducancelle, Alexandra; Oberti, Frédéric; Hubert, Isabelle; Hunault, Gilles; de Lédinghen, Victor; Zarski, Jean-Pierre; Salmon, Dominique; Lunel, Françoise

    2014-07-01

    Our main objective was to improve non-invasive fibrosis staging accuracy by resolving the limits of previous methods via new test combinations. Our secondary objectives were to improve staging precision, by developing a detailed fibrosis classification, and reliability (personalized accuracy) determination. All patients (729) included in the derivation population had chronic hepatitis C, liver biopsy, 6 blood tests and Fibroscan. Validation populations included 1584 patients. The most accurate combination was provided by using most markers of FibroMeter and Fibroscan results targeted for significant fibrosis, i.e. 'E-FibroMeter'. Its classification accuracy (91.7%) and precision (assessed by F difference with Metavir: 0.62 ± 0.57) were better than those of FibroMeter (84.1%, P < 0.001; 0.72 ± 0.57, P < 0.001), Fibroscan (88.2%, P = 0.011; 0.68 ± 0.57, P = 0.020), and a previous CSF-SF classification of FibroMeter + Fibroscan (86.7%, P < 0.001; 0.65 ± 0.57, P = 0.044). The accuracy for fibrosis absence (F0) was increased, e.g. from 16.0% with Fibroscan to 75.0% with E-FibroMeter (P < 0.001). Cirrhosis sensitivity was improved, e.g. E-FibroMeter: 92.7% vs. Fibroscan: 83.3%, P = 0.004. The combination improved reliability by deleting unreliable results (accuracy <50%) observed with a single test (1.2% of patients) and increasing optimal reliability (accuracy ≥85%) from 80.4% of patients with Fibroscan (accuracy: 90.9%) to 94.2% of patients with E-FibroMeter (accuracy: 92.9%), P < 0.001. The patient rate with 100% predictive values for cirrhosis by the best combination was twice (36.2%) that of the best single test (FibroMeter: 16.2%, P < 0.001). The new test combination increased: accuracy, globally and especially in patients without fibrosis, staging precision, cirrhosis prediction, and even reliability, thus offering improved fibrosis staging. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Accounting for speed-accuracy tradeoff in perceptual learning.

    PubMed

    Liu, Charles C; Watanabe, Takeo

    2012-05-15

    In the perceptual learning (PL) literature, researchers typically focus on improvements in accuracy, such as d'. In contrast, researchers who investigate the practice of cognitive skills focus on improvements in response times (RT). Here, we argue for the importance of accounting for both accuracy and RT in PL experiments, due to the phenomenon of speed-accuracy tradeoff (SAT): at a given level of discriminability, faster responses tend to produce more errors. A formal model of the decision process, such as the diffusion model, can explain the SAT. In this model, a parameter known as the drift rate represents the perceptual strength of the stimulus, where higher drift rates lead to more accurate and faster responses. We applied the diffusion model to analyze responses from a yes-no coherent motion detection task. The results indicate that observers do not use a fixed threshold for evidence accumulation, so changes in the observed accuracy may not provide the most appropriate estimate of learning. Instead, our results suggest that SAT can be accounted for by a modeling approach, and that drift rates offer a promising index of PL. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Real-time, resource-constrained object classification on a micro-air vehicle

    NASA Astrophysics Data System (ADS)

    Buck, Louis; Ray, Laura

    2013-12-01

    A real-time embedded object classification algorithm is developed through the novel combination of binary feature descriptors, a bag-of-visual-words object model and the cortico-striatal loop (CSL) learning algorithm. The BRIEF, ORB and FREAK binary descriptors are tested and compared to SIFT descriptors with regard to their respective classification accuracies, execution times, and memory requirements when used with CSL on a 12.6 g ARM Cortex embedded processor running at 800 MHz. Additionally, the effect of χ2 (chi-squared) feature mapping and opponent-color representations used with these descriptors is examined. These tests are performed on four data sets of varying sizes and difficulty, and the BRIEF descriptor is found to yield the best combination of speed and classification accuracy. Its use with CSL achieves accuracies between 67% and 95% of those achieved with SIFT descriptors and allows for the embedded classification of a 128x192 pixel image in 0.15 seconds, 60 times faster than classification with SIFT. χ2 mapping is found to provide substantial improvements in classification accuracy for all of the descriptors at little cost, while opponent-color descriptors offer accuracy improvements only on colorful datasets.
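
    For reference, a minimal OpenCV sketch of extracting one of the binary descriptors compared above (ORB); the image path is hypothetical. Binary descriptors suit embedded hardware because they are compact and matched with cheap Hamming distances:

      import cv2

      img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input image
      orb = cv2.ORB_create(nfeatures=500)  # up to 500 binary keypoint descriptors
      keypoints, descriptors = orb.detectAndCompute(img, None)
      # Binary descriptors are matched with Hamming distance -- an XOR plus
      # popcount, which is cheap on embedded CPUs.
      matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)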

  20. Extending the Utility of the Parabolic Approximation in Medical Ultrasound Using Wide-Angle Diffraction Modeling.

    PubMed

    Soneson, Joshua E

    2017-04-01

    Wide-angle parabolic models are commonly used in geophysics and underwater acoustics but have seen little application in medical ultrasound. Here, a wide-angle model for continuous-wave high-intensity ultrasound beams is derived, which approximates the diffraction process more accurately than the commonly used Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation without increasing implementation complexity or computing time. A method for preventing the high spatial frequencies often present in source boundary conditions from corrupting the solution is presented. Simulations of shallowly focused axisymmetric beams using both the wide-angle and standard parabolic models are compared to assess the accuracy with which they model diffraction effects. The wide-angle model proposed here offers improved focusing accuracy and less error throughout the computational domain than the standard parabolic model, offering a facile method for extending the utility of existing KZK codes.

  1. Magnetic resonance imaging of the preterm infant brain.

    PubMed

    Doria, Valentina; Arichi, Tomoki; Edwards, David A

    2014-01-01

    Despite improvements in neonatal care, survivors of preterm birth are still at a significantly increased risk of developing life-long neurological difficulties including cerebral palsy and cognitive difficulties. Cranial ultrasound is routinely used in neonatal practice, but has a low sensitivity for identifying later neurodevelopmental difficulties. Magnetic Resonance Imaging (MRI) can be used to identify intracranial abnormalities with greater diagnostic accuracy in preterm infants, and theoretically might improve the planning and targeting of long-term neurodevelopmental care; reduce parental stress and unplanned healthcare utilisation; and ultimately may improve healthcare cost effectiveness. Furthermore, MR imaging offers the advantage of allowing the quantitative assessment of the integrity, growth and function of intracranial structures, thereby providing the means to develop sensitive biomarkers which may be predictive of later neurological impairment. However, further work is needed to define the accuracy and value of diagnosis by MR and the technique's precise role in care pathways for preterm infants.

  2. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    PubMed

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have an insufficient degree of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and converge the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramér–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.

  3. An Improved WiFi Indoor Positioning Algorithm by Weighted Fusion

    PubMed Central

    Ma, Rui; Guo, Qiang; Hu, Changzhen; Xue, Jingfeng

    2015-01-01

    The rapid development of mobile Internet has offered the opportunity for WiFi indoor positioning to come under the spotlight due to its low cost. However, nowadays the accuracy of WiFi indoor positioning cannot meet the demands of practical applications. To solve this problem, this paper proposes an improved WiFi indoor positioning algorithm by weighted fusion. The proposed algorithm is based on traditional location fingerprinting algorithms and consists of two stages: the offline acquisition and the online positioning. The offline acquisition process selects optimal parameters to complete the signal acquisition, and it forms a database of fingerprints by error classification and handling. To further improve the accuracy of positioning, the online positioning process first uses a pre-match method to select the candidate fingerprints to shorten the positioning time. After that, it uses the improved Euclidean distance and the improved joint probability to calculate two intermediate results, and further calculates the final result from these two intermediate results by weighted fusion. The improved Euclidean distance introduces the standard deviation of WiFi signal strength to smooth the WiFi signal fluctuation and the improved joint probability introduces the logarithmic calculation to reduce the difference between probability values. Comparing the proposed algorithm, the Euclidean distance based WKNN algorithm and the joint probability algorithm, the experimental results indicate that the proposed algorithm has higher positioning accuracy. PMID:26334278
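
    A compact sketch of the two intermediate estimates and their fusion, with made-up fingerprint data; the variable names, the Gaussian signal model and the 0.5 fusion weight are assumptions, not values from the paper:

      import numpy as np

      # Offline fingerprint database: per-location mean and standard deviation of
      # RSS from each access point (synthetic values, in dBm), plus coordinates.
      fp_mean = np.array([[-40.0, -62.0, -75.0],
                          [-55.0, -48.0, -70.0],
                          [-68.0, -66.0, -50.0]])
      fp_std = np.array([[3.0, 5.0, 6.0],
                         [4.0, 3.0, 5.0],
                         [6.0, 5.0, 4.0]])
      fp_pos = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0]])  # locations in meters

      rss = np.array([-43.0, -60.0, -73.0])  # online measurement

      # Improved Euclidean distance: dividing by the per-AP standard deviation
      # smooths the WiFi signal fluctuation.
      d = np.sqrt((((rss - fp_mean) / fp_std) ** 2).sum(axis=1))
      pos_dist = fp_pos[np.argmin(d)]

      # Improved joint probability: logarithms reduce the difference between
      # probability values (Gaussian model per access point).
      logp = (-0.5 * ((rss - fp_mean) / fp_std) ** 2 - np.log(fp_std)).sum(axis=1)
      pos_prob = fp_pos[np.argmax(logp)]

      position = 0.5 * pos_dist + 0.5 * pos_prob  # weighted fusion of both results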

  4. An Improved WiFi Indoor Positioning Algorithm by Weighted Fusion.

    PubMed

    Ma, Rui; Guo, Qiang; Hu, Changzhen; Xue, Jingfeng

    2015-08-31

    The rapid development of mobile Internet has offered the opportunity for WiFi indoor positioning to come under the spotlight due to its low cost. However, nowadays the accuracy of WiFi indoor positioning cannot meet the demands of practical applications. To solve this problem, this paper proposes an improved WiFi indoor positioning algorithm by weighted fusion. The proposed algorithm is based on traditional location fingerprinting algorithms and consists of two stages: the offline acquisition and the online positioning. The offline acquisition process selects optimal parameters to complete the signal acquisition, and it forms a database of fingerprints by error classification and handling. To further improve the accuracy of positioning, the online positioning process first uses a pre-match method to select the candidate fingerprints to shorten the positioning time. After that, it uses the improved Euclidean distance and the improved joint probability to calculate two intermediate results, and further calculates the final result from these two intermediate results by weighted fusion. The improved Euclidean distance introduces the standard deviation of WiFi signal strength to smooth the WiFi signal fluctuation and the improved joint probability introduces the logarithmic calculation to reduce the difference between probability values. Comparing the proposed algorithm, the Euclidean distance based WKNN algorithm and the joint probability algorithm, the experimental results indicate that the proposed algorithm has higher positioning accuracy.

  5. Investigating Atmospheric Rivers using GPS TPW during CalWater 2015

    NASA Astrophysics Data System (ADS)

    Almanza, V.; Foster, J. H.; Businger, S.

    2015-12-01

    Ship-based Global Positioning System (GPS) receivers have been successful in obtaining millimeter-accuracy total precipitable water (TPW). We apply this technique in a field experiment using a GPS meteorology system installed on board the R/V Ronald Brown during the CalWater 2015 project. The goal of CalWater is to monitor atmospheric river (AR) events over the Eastern Pacific Ocean and improve forecasting of the extreme precipitation events they can produce. During the 30-day cruise, TPW derived from radiosonde balloons released from the Ron Brown was used to verify the accuracy of shipboard GPS TPW. The results suggest that ship-based GPS TPW offers a cost-effective approach for acquiring accurate real-time meteorological observations of TPW in ARs over remote oceans, as well as near coastlines where satellite algorithms have limited accuracy. The results have implications for augmenting operational observing networks to improve weather prediction and nowcasting of ARs, thereby supporting hazard response and mitigation efforts associated with coastal flooding events.

  6. Seismic wavefield modeling based on time-domain symplectic and Fourier finite-difference method

    NASA Astrophysics Data System (ADS)

    Fang, Gang; Ba, Jing; Liu, Xin-xin; Zhu, Kun; Liu, Guo-Chang

    2017-06-01

    Seismic wavefield modeling is important for improving seismic data processing and interpretation. Calculations of wavefield propagation are sometimes unstable when forward modeling of seismic waves uses large time steps over long simulation times. Based on the Hamiltonian expression of the acoustic wave equation, we propose a structure-preserving method for seismic wavefield modeling by applying the symplectic finite-difference method on time grids and the Fourier finite-difference method on space grids to solve the acoustic wave equation. The proposed method, called the symplectic Fourier finite-difference (symplectic FFD) method, offers high computational accuracy and improved computational stability. Using the acoustic approximation, we extend the method to anisotropic media. We discuss the calculations in the symplectic FFD method for seismic wavefield modeling of isotropic and anisotropic media, and use the BP salt model and BP TTI model to test the proposed method. The numerical examples suggest that the proposed method can be used for seismic modeling with strongly variable velocities, offering high computational accuracy and low numerical dispersion. The symplectic FFD method overcomes the residual qSV wave of seismic modeling in anisotropic media and maintains the stability of the wavefield propagation for large time steps.
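
    To illustrate the structure-preserving idea only (not the paper's full scheme), a symplectic-Euler "kick-drift" time step for the 1-D acoustic wave equation, with a spectral derivative standing in for the Fourier finite-difference space operator:

      import numpy as np

      n, L, c, dt = 256, 1.0, 1.0, 1e-4
      x = np.linspace(0.0, L, n, endpoint=False)
      k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
      u = np.exp(-200.0 * (x - 0.5) ** 2)  # initial pressure pulse
      v = np.zeros(n)                      # conjugate momentum (velocity) field

      def u_xx(u):
          # Spectral second derivative (stand-in for the FFD space operator).
          return np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(u)))

      for _ in range(1000):
          v += dt * c ** 2 * u_xx(u)  # kick: update momentum from current field
          u += dt * v                 # drift: update field from new momentum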

  7. Squash preparation: A reliable diagnostic tool in the intraoperative diagnosis of central nervous system tumors

    PubMed Central

    Mitra, Sumit; Kumar, Mohan; Sharma, Vivek; Mukhopadhyay, Debasis

    2010-01-01

    Background: Intraoperative cytology is an important diagnostic modality improving on the accuracy of the frozen sections. It has shown to play an important role especially in the intraoperative diagnosis of central nervous system tumors. Aim: To study the diagnostic accuracy of squash preparation and frozen section (FS) in the intraoperative diagnosis of central nervous system (CNS) tumors. Materials and Methods: This prospective study of 114 patients with CNS tumors was conducted over a period of 18 months (September 2004 to February 2006). The cytological preparations were stained by the quick Papanicolaou method. The squash interpretation and FS diagnosis were later compared with the paraffin section diagnosis. Results: Of the 114 patients, cytological diagnosis was offered in 96 cases. Eighteen nonneoplastic or noncontributory cases were excluded. Using hematoxylin and eosin-stained histopathology sections as the gold standard, the diagnostic accuracy of cytology was 88.5% (85/96) and the accuracy on FS diagnosis was 90.6% (87/96). Among these cases, gliomas formed the largest category of tumors (55.2%). The cytological accuracy in this group was 84.9% (45/53) and the comparative FS figure was 86.8% (46/53). In cases where the smear and the FS diagnosis did not match, the latter opinion was offered. Conclusions: Squash preparation is a reliable, rapid and easy method and can be used as a complement to FS in the intraoperative diagnosis of CNS tumors. PMID:21187881

  8. The use of video in standardized patient training to improve portrayal accuracy: A randomized post-test control group study.

    PubMed

    Schlegel, Claudia; Bonvin, Raphael; Rethans, Jan Joost; van der Vleuten, Cees

    2014-10-14

    Introduction: High-stakes objective structured clinical examinations (OSCEs) with standardized patients (SPs) should offer the same conditions to all candidates throughout the exam. SP performance should therefore be as close to the original role script as possible during all encounters. In this study, we examined the impact of video in SP training on SPs' role accuracy, investigating how the use of different types of video during SP training improves the accuracy of SP portrayal. Methods: In a randomized post-test, control group design, three groups of 12 SPs each with different types of video training and one control group of 12 SPs without video use in SP training were compared. The three intervention groups used role-modeling video, performance-feedback video, or a combination of both. Each SP in each group had four student encounters. Two blinded faculty members rated the 192 video-recorded encounters, using a case-specific rating instrument to assess SPs' role accuracy. Results: SPs trained by video showed significantly (p < 0.001) better role accuracy than SPs trained without video over the four sequential portrayals. There was no difference between the three types of video training. Discussion: Use of video during SP training enhances the accuracy of SP portrayal compared with no video, regardless of the type of video intervention used.

  9. Improved Fuzzy K-Nearest Neighbor Using Modified Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Jamaluddin; Siringoringo, Rimbun

    2017-12-01

    Fuzzy k-Nearest Neighbor (FkNN) is one of the most powerful classification methods. The presence of fuzzy concepts in this method successfully improves its performance on almost all classification issues. The main drawback of FkNN is that its parameters are difficult to determine: the number of neighbors (k) and the fuzzy strength (m). Both parameters are very sensitive, and no theory or guideline exists for choosing proper values of 'm' and 'k', which makes FkNN difficult to control. This study uses Modified Particle Swarm Optimization (MPSO) to determine the best values of 'k' and 'm'. MPSO is based on the constriction factor method, an improvement of PSO designed to avoid being trapped in local optima. The model proposed in this study was tested on the German Credit Dataset, a standard benchmark from the UCI Machine Learning Repository that is widely applied to classification problems. Applying MPSO to the determination of the FkNN parameters is expected to increase classification performance. The experiments indicate that the proposed model yields better classification performance than the plain FkNN model: it achieves an accuracy of 81%, compared with 70% for FkNN alone. Finally, the proposed model is compared with two other classification models, Naive Bayes and Decision Tree; it again performs better, with Naive Bayes achieving 75% accuracy and the decision tree model 70%.
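
    The constriction-factor velocity update at the core of MPSO, sketched for a single particle; the symbols follow Clerc's standard formulation, and treating (k, m) as the two searched dimensions is an assumption based on the abstract:

      import numpy as np

      c1 = c2 = 2.05
      phi = c1 + c2  # phi > 4 is required for the constriction factor
      chi = 2.0 / abs(2.0 - phi - np.sqrt(phi ** 2 - 4.0 * phi))  # ~0.7298

      rng = np.random.default_rng(0)
      pos = np.array([5.0, 2.0])    # candidate (k, m) for FkNN
      vel = np.zeros(2)
      pbest = pos.copy()            # particle's own best position so far
      gbest = np.array([7.0, 1.8])  # swarm's best position (illustrative)

      r1, r2 = rng.random(2), rng.random(2)
      vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
      pos = pos + vel  # new candidate; k would be rounded to an integer before use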

  10. Deep Learning for Image-Based Cassava Disease Detection.

    PubMed

    Ramcharan, Amanda; Baranowski, Kelsee; McCloskey, Peter; Ahmed, Babuali; Legg, James; Hughes, David P

    2017-01-01

    Cassava is the third largest source of carbohydrates for human food in the world but is vulnerable to virus diseases, which threaten to destabilize food security in sub-Saharan Africa. Novel methods of cassava disease detection are needed to support improved control which will prevent this crisis. Image recognition offers both a cost effective and scalable technology for disease detection. New deep learning models offer an avenue for this technology to be easily deployed on mobile devices. Using a dataset of cassava disease images taken in the field in Tanzania, we applied transfer learning to train a deep convolutional neural network to identify three diseases and two types of pest damage (or lack thereof). The best trained model accuracies were 98% for brown leaf spot (BLS), 96% for red mite damage (RMD), 95% for green mite damage (GMD), 98% for cassava brown streak disease (CBSD), and 96% for cassava mosaic disease (CMD). The best model achieved an overall accuracy of 93% for data not used in the training process. Our results show that the transfer learning approach for image recognition of field images offers a fast, affordable, and easily deployable strategy for digital plant disease detection.
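
    A minimal transfer-learning sketch in PyTorch; the abstract does not state the architecture used here, so ResNet-18 is a stand-in, and the six output classes (three diseases, two pest damages, healthy) are inferred from the abstract:

      import torch.nn as nn
      import torch.optim as optim
      from torchvision import models

      # Start from ImageNet-pretrained weights and freeze the feature extractor.
      model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
      for p in model.parameters():
          p.requires_grad = False
      # Replace the classifier head for the cassava classes; only it is trained.
      model.fc = nn.Linear(model.fc.in_features, 6)
      optimizer = optim.Adam(model.fc.parameters(), lr=1e-3)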

  11. Accurate Energies and Orbital Description in Semi-Local Kohn-Sham DFT

    NASA Astrophysics Data System (ADS)

    Lindmaa, Alexander; Kuemmel, Stephan; Armiento, Rickard

    2015-03-01

    We present our progress on a scheme in semi-local Kohn-Sham density-functional theory (KS-DFT) for improving the orbital description while still retaining the level of accuracy of the usual semi-local exchange-correlation (xc) functionals. DFT is a widely used tool for first-principles calculations of properties of materials. A given task normally requires a balance of accuracy and computational cost, which is well achieved with semi-local DFT. However, commonly used semi-local xc functionals have important shortcomings which often can be attributed to features of the corresponding xc potential. One shortcoming is an overly delocalized representation of localized orbitals. Recently a semi-local GGA-type xc functional was constructed to address these issues; however, it has the trade-off of lower accuracy of the total energy. We discuss the source of this error in terms of a surplus energy contribution in the functional that needs to be accounted for, and offer a remedy for this issue which formally stays within KS-DFT and which does not greatly increase the computational effort. The end result is a scheme that combines accurate total energies (e.g., relaxed geometries) with an improved orbital description (e.g., improved band structure).

  12. Utilisation of three-dimensional printed heart models for operative planning of complex congenital heart defects.

    PubMed

    Olejník, Peter; Nosal, Matej; Havran, Tomas; Furdova, Adriana; Cizmar, Maros; Slabej, Michal; Thurzo, Andrej; Vitovic, Pavol; Klvac, Martin; Acel, Tibor; Masura, Jozef

    2017-01-01

    To evaluate the accuracy of the three-dimensional (3D) printing of cardiovascular structures. To explore whether utilisation of 3D printed heart replicas can improve surgical and catheter interventional planning in patients with complex congenital heart defects. Between December 2014 and November 2015 we fabricated eight cardiovascular models based on computed tomography data in patients with complex spatial anatomical relationships of cardiovascular structures. A Bland-Altman analysis was used to assess the accuracy of 3D printing by comparing dimension measurements at analogous anatomical locations between the printed models and digital imagery data, as well as between printed models and in vivo surgical findings. The contribution of 3D printed heart models for perioperative planning improvement was evaluated in the four most representative patients. Bland-Altman analysis confirmed the high accuracy of 3D cardiovascular printing. Each printed model offered an improved spatial anatomical orientation of cardiovascular structures. Current 3D printers can produce authentic copies of patients' cardiovascular systems from computed tomography data. The use of 3D printed models can facilitate surgical or catheter interventional procedures in patients with complex congenital heart defects due to better preoperative planning and intraoperative orientation.
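
    A Bland-Altman comparison is straightforward to compute; a sketch with synthetic paired measurements in millimetres, not the study's data:

      import numpy as np

      ct = np.array([12.1, 25.3, 8.7, 30.2, 15.5])       # CT-derived dimensions
      printed = np.array([12.3, 25.0, 8.9, 30.5, 15.4])  # same sites on the print

      diff = printed - ct
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement
      print(f"bias = {bias:.2f} mm, LoA = [{bias - loa:.2f}, {bias + loa:.2f}] mm")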

  13. Application of snowcovered area to runoff forecasting in selected basins of the Sierra Nevada, California. [Kings, Kern and Kaweah River Basins

    NASA Technical Reports Server (NTRS)

    Brown, A. J.; Hannaford, J. F. (Principal Investigator)

    1980-01-01

    The author has identified the following significant results. Direct overlay onto 1:1,000,000 prints takes about one third the time of 1:500,000 zoom transfer scope analysis using transparencies, but the consistency of the transparencies reduces the time for data analysis. LANDSAT data received on transparencies are better and more easily interpreted than the near real-time data from Quick Look, or imagery from other sources such as NOAA. The greatest potential for water supply forecasting is probably in improving forecast accuracy and in expanding forecast services during the period of snowmelt. Problems of a transient snow line and uncertainties in future weather are the main reasons that snow-covered area appears to offer little improvement in water supply forecast accuracy during the period of snowpack accumulation.

  14. Mean composite fire severity metrics computed with Google Earth engine offer improved accuracy and expanded mapping potential

    Treesearch

    Sean A. Parks; Lisa M. Holsinger; Morgan A. Voss; Rachel A. Loehman; Nathaniel P. Robinson

    2018-01-01

    Landsat-based fire severity datasets are an invaluable resource for monitoring and research purposes. These gridded fire severity datasets are generally produced with pre- and post-fire imagery to estimate the degree of fire-induced ecological change. Here, we introduce methods to produce three Landsat-based fire severity metrics using the Google Earth Engine (GEE)...

  15. Transforming the HIM department into a strategic resource.

    PubMed

    Odorisio, L F; Piescik, J B

    1998-05-01

    The transformation of a traditional health information management (HIM) department into a "virtual" HIM department can offer an IDS substantial economic advantages, improve patient satisfaction levels, enhance the quality of care, and provide greater accuracy in measuring and demonstrating healthcare outcomes. The HIM department's mission should be aligned with enterprise strategies, and the IDS should invest in technology that enables HIM to respond to enterprisewide information requirements.

  16. The role of computerized diagnostic proposals in the interpretation of the 12-lead electrocardiogram by cardiology and non-cardiology fellows.

    PubMed

    Novotny, Tomas; Bond, Raymond; Andrsova, Irena; Koc, Lumir; Sisakova, Martina; Finlay, Dewar; Guldenring, Daniel; Spinar, Jindrich; Malik, Marek

    2017-05-01

    Most contemporary 12-lead electrocardiogram (ECG) devices offer computerized diagnostic proposals. The reliability of these automated diagnoses is limited. It has been suggested that incorrect computer advice can influence physician decision-making. This study analyzed the role of diagnostic proposals in the decision process by a group of fellows of cardiology and other internal medicine subspecialties. A set of 100 clinical 12-lead ECG tracings was selected covering both normal cases and common abnormalities. A team of 15 junior Cardiology Fellows and 15 Non-Cardiology Fellows interpreted the ECGs in 3 phases: without any diagnostic proposal, with a single diagnostic proposal (half of them intentionally incorrect), and with four diagnostic proposals (only one of them being correct) for each ECG. Self-rated confidence in each interpretation was collected. Availability of diagnostic proposals significantly increased the diagnostic accuracy (p<0.001). Nevertheless, in the case of a single proposal (either correct or incorrect), the increase in accuracy was present in interpretations with correct diagnostic proposals, while the accuracy was substantially reduced with incorrect proposals. Confidence levels correlated poorly with interpretation scores (rho ≈ 0.2, p<0.001). Logistic regression showed that an interpreter is most likely to be correct when the ECG device offers a correct diagnostic proposal (OR=10.87) or multiple proposals (OR=4.43). Diagnostic proposals affect the diagnostic accuracy of ECG interpretations. The accuracy is significantly influenced especially when a single diagnostic proposal (either correct or incorrect) is provided. The study suggests that the presentation of multiple computerized diagnoses is likely to improve the diagnostic accuracy of interpreters. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. STTR Phase I: Low-Cost, High-Accuracy, Whole-Building Carbon Dioxide Monitoring for Demand Control Ventilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallstrom, Jason O.; Ni, Zheng Richard

    This STTR Phase I project assessed the feasibility of a new CO2 sensing system optimized for low-cost, high-accuracy, whole-building monitoring for use in demand control ventilation. The focus was on the development of a wireless networking platform and associated firmware to provide signal conditioning and conversion, fault- and disruption-tolerant networking, and multi-hop routing at building scales to avoid wiring costs. A bridge (or "gateway") to direct digital control services was also explored. Results of the project contributed to an improved understanding of a new electrochemical sensor for monitoring indoor CO2 concentrations, as well as the electronics and networking infrastructure required to deploy those sensors at building scales. New knowledge was acquired concerning the sensor's accuracy, environmental response, and failure modes, and the acquisition electronics required to achieve accuracy over a wide range of CO2 concentrations. The project demonstrated that the new sensor offers repeatable correspondence with commercial optical sensors, with supporting electronics that offer gain accuracy within 0.5%, and acquisition accuracy within 1.5% across three orders of magnitude of variation in generated current. Considering production, installation, and maintenance costs, the technology presents a foundation for achieving whole-building CO2 sensing at a price point below $0.066/sq-ft, meeting economic feasibility criteria established by the Department of Energy. The technology developed under this award addresses obstacles on the critical path to enabling whole-building CO2 sensing and demand control ventilation in commercial retrofits, small commercial buildings, residential complexes, and other high-potential structures that have been slow to adopt these technologies. It presents an opportunity to significantly reduce energy use throughout the United States.

  18. Discrimination of natural and cultivated vegetation using Thematic Mapper spectral data

    NASA Technical Reports Server (NTRS)

    Degloria, Stephen D.; Bernstein, Ralph; Dizenzo, Silvano

    1986-01-01

    The availability of high-quality spectral data from the current suite of earth observation satellite systems offers significant improvements in the ability to survey and monitor food and fiber production on both a local and global basis. Current research results indicate that Landsat TM data, when used in either digital or analog format, achieve higher land-cover classification accuracies than MSS data using either comparable or improved spectral bands and spatial resolution. A review of these quantitative results is presented for both natural and cultivated vegetation.

  19. A high accuracy sequential solver for simulation and active control of a longitudinal combustion instability

    NASA Technical Reports Server (NTRS)

    Shyy, W.; Thakur, S.; Udaykumar, H. S.

    1993-01-01

    A high accuracy convection scheme using a sequential solution technique has been developed and applied to simulate the longitudinal combustion instability and its active control. The scheme has been devised in the spirit of the Total Variation Diminishing (TVD) concept with special source term treatment. Due to the substantial heat release effect, a clear delineation of the key elements employed by the scheme, i.e., the adjustable damping factor and the source term treatment has been made. By comparing with the first-order upwind scheme previously utilized, the present results exhibit less damping and are free from spurious oscillations, offering improved quantitative accuracy while confirming the spectral analysis reported earlier. A simple feedback type of active control has been found to be capable of enhancing or attenuating the magnitude of the combustion instability.

  20. Performance Evaluation of Multimodal Multifeature Authentication System Using KNN Classification.

    PubMed

    Rajagopal, Gayathri; Palaniswamy, Ramamoorthy

    2015-01-01

    This research proposes a multimodal multifeature biometric system for human recognition using two traits, that is, palmprint and iris. The purpose of this research is to analyse the integration of a multimodal and multifeature biometric system using feature-level fusion to achieve better performance. The main aim of the proposed system is to increase the recognition accuracy using feature-level fusion. The features at the feature level are raw biometric data which contain rich information when compared to decision and matching-score level fusion. Hence information fused at the feature level is expected to yield improved recognition accuracy. However, information fused at the feature level suffers from the curse of dimensionality; here PCA (principal component analysis) is used to diminish the dimensionality of the feature sets, as they are high dimensional. The proposed multimodal results were compared with other multimodal and monomodal approaches. Out of these comparisons, the multimodal multifeature palmprint-iris fusion offers significant improvements in the accuracy of the suggested multimodal biometric system. The proposed algorithm is tested using a virtual multimodal database created from the UPOL iris database and the PolyU palmprint database.
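
    A feature-level fusion sketch with scikit-learn; the arrays are random placeholders and the dimensions are assumptions, not the paper's palmprint and iris features:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(0)
      palm = rng.random((200, 512))           # placeholder palmprint features
      iris = rng.random((200, 512))           # placeholder iris features
      labels = rng.integers(0, 20, size=200)  # 20 hypothetical subjects

      fused = np.hstack([palm, iris])  # feature-level fusion of raw feature vectors
      reduced = PCA(n_components=50).fit_transform(fused)  # curb the dimensionality

      clf = KNeighborsClassifier(n_neighbors=3).fit(reduced, labels)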

  1. Performance Evaluation of Multimodal Multifeature Authentication System Using KNN Classification

    PubMed Central

    Rajagopal, Gayathri; Palaniswamy, Ramamoorthy

    2015-01-01

    This research proposes a multimodal multifeature biometric system for human recognition using two traits, that is, palmprint and iris. The purpose of this research is to analyse the integration of a multimodal and multifeature biometric system using feature-level fusion to achieve better performance. The main aim of the proposed system is to increase the recognition accuracy using feature-level fusion. The features at the feature level are raw biometric data which contain rich information when compared to decision and matching-score level fusion. Hence information fused at the feature level is expected to yield improved recognition accuracy. However, information fused at the feature level suffers from the curse of dimensionality; here PCA (principal component analysis) is used to diminish the dimensionality of the feature sets, as they are high dimensional. The proposed multimodal results were compared with other multimodal and monomodal approaches. Out of these comparisons, the multimodal multifeature palmprint-iris fusion offers significant improvements in the accuracy of the suggested multimodal biometric system. The proposed algorithm is tested using a virtual multimodal database created from the UPOL iris database and the PolyU palmprint database. PMID:26640813

  2. Present status of metrology of electro-optical surveillance systems

    NASA Astrophysics Data System (ADS)

    Chrzanowski, K.

    2017-10-01

    There has been significant progress in equipment for testing electro-optical surveillance systems over the last decade. Modern test systems are increasingly computerized, employ advanced image processing and offer software support in the measurement process. However, one great challenge, in the form of relatively low accuracy, remains unsolved. It is quite common that different test stations, when testing the same device, produce different results. It can even happen that two testing teams, while working on the same test station with the same tested device, produce different results. Rapid growth of electro-optical technology, poor standardization, limited metrology infrastructure, the subjective nature of some measurements, fundamental limitations from the laws of physics, tendering rules and advances in artificial intelligence are major factors responsible for this situation. Regardless, the next decade should bring significant improvements, since improvement in measurement accuracy is needed to sustain the fast growth of electro-optical surveillance technology.

  3. Improving GOCE cross-track gravity gradients

    NASA Astrophysics Data System (ADS)

    Siemes, Christian

    2018-01-01

    The GOCE gravity gradiometer measured highly accurate gravity gradients along the orbit during GOCE's mission lifetime from March 17, 2009, to November 11, 2013. These measurements contain unique information on the gravity field at a spatial resolution of 80 km half wavelength, which is not provided to the same accuracy level by any other satellite mission now and in the foreseeable future. Unfortunately, the gravity gradient in cross-track direction is heavily perturbed in the regions around the geomagnetic poles. We show in this paper that the perturbing effect can be modeled accurately as a quadratic function of the non-gravitational acceleration of the satellite in cross-track direction. Most importantly, we can remove the perturbation from the cross-track gravity gradient to a great extent, which significantly improves the accuracy of the latter and offers opportunities for better scientific exploitation of the GOCE gravity gradient data set.
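
    The correction idea in this abstract, modelling the perturbation as a quadratic function of the cross-track non-gravitational acceleration and subtracting the fit, is simple enough to show directly. The sketch below uses synthetic arrays; the coefficients and noise levels are illustrative assumptions, not GOCE values.

```python
# Minimal sketch: fit V_yy = c2*a^2 + c1*a + c0 against the cross-track
# acceleration a, then remove the modeled perturbation. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
acc_y = rng.normal(size=5000)                    # cross-track non-gravitational acceleration
true_vyy = rng.normal(scale=0.01, size=5000)     # unperturbed gradient (placeholder)
perturbation = 0.3 * acc_y**2 - 0.1 * acc_y      # synthetic quadratic perturbation
vyy_measured = true_vyy + perturbation

coeffs = np.polyfit(acc_y, vyy_measured, deg=2)  # [c2, c1, c0]
vyy_corrected = vyy_measured - np.polyval(coeffs, acc_y)

print("rms before:", vyy_measured.std(), "rms after:", vyy_corrected.std())
```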

  4. Traceable Radiometry Underpinning Terrestrial- and Helio- Studies (TRUTHS)

    USGS Publications Warehouse

    Fox, N.; Aiken, J.; Barnett, J.J.; Briottet, X.; Carvell, R.; Frohlich, C.; Groom, S.B.; Hagolle, O.; Haigh, J.D.; Kieffer, H.H.; Lean, J.; Pollock, D.B.; Quinn, T.; Sandford, M.C.W.; Schaepman, M.; Shine, K.P.; Schmutz, W.K.; Teillet, P.M.; Thome, K.J.; Verstraete, M.M.; Zalewski, E.; ,

    2002-01-01

    The Traceable Radiometry Underpinning Terrestrial- and Helio- Studies (TRUTHS) mission offers a novel approach to the provision of key scientific data with unprecedented radiometric accuracy for Earth Observation (EO) and solar studies, which will also establish well-calibrated reference targets/standards to support other EO missions. This paper will present the TRUTHS mission and its objectives. TRUTHS will be the first satellite mission to calibrate its instrumentation directly to SI in orbit, overcoming the usual uncertainties associated with drifts of sensor gain and spectral shape by using an electrical rather than an optical standard as the basis of its calibration. The range of instruments flown as part of the payload will also provide accurate input data to improve atmospheric radiative transfer codes by anchoring boundary conditions, through simultaneous measurements of aerosols, particulates and radiances at various heights. Therefore, TRUTHS will significantly improve the performance and accuracy of Earth observation missions with broad global or operational aims, as well as more dedicated missions. The provision of reference standards will also improve synergy between missions by reducing errors due to different calibration biases and offer cost reductions for future missions by reducing the demands for on-board calibration systems. Such improvements are important for the future success of strategies such as Global Monitoring for Environment and Security (GMES) and the implementation and monitoring of international treaties such as the Kyoto Protocol. TRUTHS will achieve these aims by measuring the geophysical variables of solar and lunar irradiance, together with both polarised and un-polarised spectral radiance of the Moon, and the Earth and its atmosphere.

  5. Traceable Radiometry Underpinning Terrestrial - and Helio- Studies (TRUTHS)

    USGS Publications Warehouse

    Fox, N.; Aiken, J.; Barnett, J.J.; Briottet, X.; Carvell, R.; Frohlich, C.; Groom, S.B.; Hagolle, O.; Haigh, J.D.; Kieffer, H.H.; Lean, J.; Pollock, D.B.; Quinn, T.; Sandford, M.C.W.; Schaepman, M.; Shine, K.P.; Schmutz, W.K.; Teillet, P.M.; Thome, K.J.; Verstraete, M.M.; Zalewski, E.

    2003-01-01

    The Traceable Radiometry Underpinning Terrestrial- and Helio- Studies (TRUTHS) mission offers a novel approach to the provision of key scientific data with unprecedented radiometric accuracy for Earth Observation (EO) and solar studies, which will also establish well-calibrated reference targets/standards to support other EO missions. This paper presents the TRUTHS mission and its objectives. TRUTHS will be the first satellite mission to calibrate its EO instrumentation directly to SI in orbit, overcoming the usual uncertainties associated with drifts of sensor gain and spectral shape by using an electrical rather than an optical standard as the basis of its calibration. The range of instruments flown as part of the payload will also provide accurate input data to improve atmospheric radiative transfer codes by anchoring boundary conditions, through simultaneous measurements of aerosols, particulates and radiances at various heights. Therefore, TRUTHS will significantly improve the performance and accuracy of EO missions with broad global or operational aims, as well as more dedicated missions. The provision of reference standards will also improve synergy between missions by reducing errors due to different calibration biases and offer cost reductions for future missions by reducing the demands for on-board calibration systems. Such improvements are important for the future success of strategies such as Global Monitoring for Environment and Security (GMES) and the implementation and monitoring of international treaties such as the Kyoto Protocol. TRUTHS will achieve these aims by measuring the geophysical variables of solar and lunar irradiance, together with both polarised and unpolarised spectral radiance of the Moon, Earth and its atmosphere. Published by Elsevier Ltd on behalf of COSPAR.

  6. Combining functional and structural tests improves the diagnostic accuracy of relevance vector machine classifiers

    PubMed Central

    Racette, Lyne; Chiou, Christine Y.; Hao, Jiucang; Bowd, Christopher; Goldbaum, Michael H.; Zangwill, Linda M.; Lee, Te-Won; Weinreb, Robert N.; Sample, Pamela A.

    2009-01-01

    Purpose To investigate whether combining optic disc topography and short-wavelength automated perimetry (SWAP) data improves the diagnostic accuracy of relevance vector machine (RVM) classifiers for detecting glaucomatous eyes compared to using each test alone. Methods One eye of 144 glaucoma patients and 68 healthy controls from the Diagnostic Innovations in Glaucoma Study were included. RVMs were trained and tested with cross-validation on optimized (backward elimination) SWAP features (thresholds plus age; pattern deviation (PD); total deviation (TD)) and on Heidelberg Retina Tomograph II (HRT) optic disc topography features, independently and in combination. RVM performance was also compared to two HRT linear discriminant functions (LDF) and to SWAP mean deviation (MD) and pattern standard deviation (PSD). Classifier performance was measured by the area under the receiver operating characteristic curves (AUROCs) generated for each feature set and by the sensitivities at set specificities of 75%, 90% and 96%. Results RVM trained on combined HRT and SWAP thresholds plus age had significantly higher AUROC (0.93) than RVM trained on HRT (0.88) and SWAP (0.76) alone. AUROCs for the SWAP global indices (MD: 0.68; PSD: 0.72) offered no advantage over SWAP thresholds plus age, while the LDF AUROCs were significantly lower than those of RVMs trained on the combined SWAP and HRT feature set and on the HRT-alone feature set. Conclusions Training RVM on combined optimized HRT and SWAP data improved diagnostic accuracy compared to training on SWAP and HRT parameters alone. Future research may identify other combinations of tests and classifiers that can also improve diagnostic accuracy. PMID:19528827

  7. Staggered Mesh Ewald: An extension of the Smooth Particle-Mesh Ewald method adding great versatility

    PubMed Central

    Cerutti, David S.; Duke, Robert E.; Darden, Thomas A.; Lybrand, Terry P.

    2009-01-01

    We draw on an old technique for improving the accuracy of mesh-based field calculations to extend the popular Smooth Particle Mesh Ewald (SPME) algorithm as the Staggered Mesh Ewald (StME) algorithm. StME improves the accuracy of computed forces by up to 1.2 orders of magnitude and also reduces the drift in system momentum inherent in the SPME method by averaging the results of two separate reciprocal space calculations. StME can use charge mesh spacings roughly 1.5x larger than SPME to obtain comparable levels of accuracy; the one mesh in an SPME calculation can therefore be replaced with two separate meshes, each less than one third of the original size. Coarsening the charge mesh can be balanced with reductions in the direct space cutoff to optimize performance: the efficiency of StME rivals or exceeds that of SPME calculations with similarly optimized parameters. StME may also offer advantages for parallel molecular dynamics simulations because it permits the use of coarser meshes without requiring higher orders of charge interpolation and also because the two reciprocal space calculations can be run independently if that is most suitable for the machine architecture. We are planning other improvements to the standard SPME algorithm, and anticipate that StME will work synergistically with all of them to dramatically improve the efficiency and parallel scaling of molecular simulations. PMID:20174456

  8. The Fukushima-137Cs deposition case study: properties of the multi-model ensemble.

    PubMed

    Solazzo, E; Galmarini, S

    2015-01-01

    In this paper we analyse the properties of an eighteen-member ensemble generated by the combination of five atmospheric dispersion modelling systems and six meteorological data sets. The models have been applied to the total deposition of (137)Cs following the nuclear accident at the Fukushima power plant in March 2011. The analysis is carried out to determine whether the ensemble is reliable and sufficiently diverse, and whether its accuracy and precision can be improved. Although ensemble practice is becoming more and more popular in many geophysical applications, good-practice guidelines are missing as to how models should be combined for the ensembles to offer an improvement over single-model realisations. We show that the models in the ensemble share large portions of bias and variance, and we use several techniques to show that subsets of models can explain the same amount of variance as the full ensemble mean, with the advantage of being poorly correlated; this saves computational resources and reduces noise (and thus improves accuracy). We further propose and discuss two methods for selecting subsets of skilful and diverse members, and show that, in the present analysis, their mean outscores the full ensemble mean in terms of both accuracy (error) and precision (variance). Copyright © 2014. Published by Elsevier Ltd.
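
    The subset-selection idea lends itself to a short sketch. Below is a greedy illustration on synthetic data: eighteen fabricated members with shared biases stand in for the model realisations, and the greedy criterion (minimising the RMSE of the subset mean) is an assumption for illustration, not the paper's exact selection method.

```python
# Greedily grow a small subset of ensemble members whose mean has low error.
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(size=500)                              # "observed" deposition (synthetic)
members = obs + rng.normal(scale=0.5, size=(18, 500))   # 18 synthetic model realisations
members += rng.normal(scale=0.3, size=(18, 1))          # shared per-member biases

def rmse(pred):
    return np.sqrt(np.mean((pred - obs) ** 2))

selected = [int(np.argmin([rmse(m) for m in members]))]  # start from the best member
while len(selected) < 5:
    best, best_err = None, np.inf
    for k in range(len(members)):
        if k in selected:
            continue
        err = rmse(members[selected + [k]].mean(axis=0))  # skill of candidate subset mean
        if err < best_err:
            best, best_err = k, err
    selected.append(best)

print("subset:", selected,
      "subset-mean RMSE:", rmse(members[selected].mean(axis=0)),
      "full-ensemble RMSE:", rmse(members.mean(axis=0)))
```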

  9. Mapping water table depth using geophysical and environmental variables.

    PubMed

    Buchanan, S; Triantafilis, J

    2009-01-01

    Despite its importance, accurate representation of the spatial distribution of water table depth remains one of the greatest deficiencies in many hydrological investigations. Historically, both inverse distance weighting (IDW) and ordinary kriging (OK) have been used to interpolate depths. These methods, however, have major limitations: namely, they require large numbers of measurements to represent the spatial variability of water table depth and they do not represent the variation between measurement points. We address this issue by assessing the benefits of using stepwise multiple linear regression (MLR) with three different ancillary data sets to predict the water table depth at 100-m intervals. The ancillary data sets used are electromagnetic (EM34 and EM38), gamma radiometric (potassium (K), uranium (eU), thorium (eTh), and total count (TC)), and morphometric data. Results show that MLR offers significant precision and accuracy benefits over OK and IDW. Inclusion of the morphometric data set yielded the greatest (16%) improvement in prediction accuracy compared with IDW, followed by the electromagnetic data set (5%). Use of the gamma radiometric data set showed no improvement. The greatest improvement, however, resulted when all data sets were combined (37% increase in prediction accuracy over IDW). Significantly, the use of MLR also allows prediction of variations in water table depth between measurement points, which is crucial for land management.
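
    The IDW-versus-regression comparison can be sketched compactly. In the sketch below the bore locations, covariates, and depths are synthetic stand-ins for the EM, radiometric, and morphometric layers, and the stepwise variable selection used in the paper is omitted in favour of a plain least-squares fit.

```python
# Contrast inverse-distance weighting with regression on ancillary covariates.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
xy = rng.uniform(0, 1000, size=(200, 2))            # bore locations (m)
ancillary = rng.normal(size=(200, 4))               # e.g. EM38, EM34, K, elevation
depth = ancillary @ np.array([1.5, -0.8, 0.3, 2.0]) + rng.normal(scale=0.5, size=200)

def idw(xy_known, z_known, xy_new, power=2):
    d = np.linalg.norm(xy_known[None, :, :] - xy_new[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

train, test = np.arange(150), np.arange(150, 200)
pred_idw = idw(xy[train], depth[train], xy[test])
pred_mlr = LinearRegression().fit(ancillary[train], depth[train]).predict(ancillary[test])

for name, pred in [("IDW", pred_idw), ("MLR", pred_mlr)]:
    print(name, "RMSE:", np.sqrt(np.mean((pred - depth[test]) ** 2)))
```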

  10. Accuracy concerns in digital speckle photography combined with Fresnel digital holographic interferometry

    NASA Astrophysics Data System (ADS)

    Zhao, Yuchen; Zemmamouche, Redouane; Vandenrijt, Jean-François; Georges, Marc P.

    2018-05-01

    A combination of digital holographic interferometry (DHI) and digital speckle photography (DSP) allows in-plane and out-of-plane displacement measurement between two states of an object. The former can be determined by correlating the two speckle patterns, whereas the latter is given by the phase difference obtained from DHI. We show that the amplitude of the numerically reconstructed object wavefront obtained from Fresnel in-line digital holography (DH), in combination with phase shifting techniques, can be used as speckle patterns in DSP. The accuracy of the in-plane measurement is improved after correcting the phase errors induced by the reference wave during the reconstruction process. Furthermore, unlike a conventional imaging system, Fresnel DH offers the possibility of resizing the pixel size of speckle patterns on the reconstruction plane under the same optical configuration simply by zero-padding the hologram. The flexibility of speckle size adjustment in Fresnel DH ensures the accuracy of the estimation result using DSP.
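
    The in-plane (DSP) step, recovering a rigid shift by correlating two speckle patterns, can be illustrated with phase correlation, one standard correlation-based estimator (the paper does not specify this exact variant). The speckle fields below are synthetic, standing in for amplitudes reconstructed from Fresnel holograms.

```python
# Phase correlation: the normalised cross-power spectrum peaks at the shift.
import numpy as np

rng = np.random.default_rng(11)
ref = rng.normal(size=(128, 128))                    # reference speckle pattern
shifted = np.roll(ref, shift=(5, -3), axis=(0, 1))   # deformed state: +5 rows, -3 cols

F1, F2 = np.fft.fft2(ref), np.fft.fft2(shifted)
cross = F2 * np.conj(F1)
corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Map peak indices back to signed shifts.
dy = dy - ref.shape[0] if dy > ref.shape[0] // 2 else dy
dx = dx - ref.shape[1] if dx > ref.shape[1] // 2 else dx
print("estimated shift:", (dy, dx))   # expected: (5, -3)
```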

  11. Understanding error generation in fused deposition modeling

    NASA Astrophysics Data System (ADS)

    Bochmann, Lennart; Bayley, Cindy; Helu, Moneer; Transchel, Robert; Wegener, Konrad; Dornfeld, David

    2015-03-01

    Additive manufacturing offers completely new possibilities for the manufacturing of parts. The advantages of flexibility and convenience of additive manufacturing have had a significant impact on many industries, and optimizing part quality is crucial for expanding its utilization. This research aims to determine the sources of imprecision in fused deposition modeling (FDM). Process errors in terms of surface quality, accuracy and precision are identified and quantified, and an error-budget approach is used to characterize errors of the machine tool. It was determined that accuracy and precision in the y direction (0.08-0.30 mm) are generally greater than in the x direction (0.12-0.62 mm) and the z direction (0.21-0.57 mm). Furthermore, accuracy and precision tend to decrease at increasing axis positions. The results of this work can be used to identify possible process improvements in the design and control of FDM technology.

  12. Improved automation of dissolved organic carbon sampling for organic-rich surface waters.

    PubMed

    Grayson, Richard P; Holden, Joseph

    2016-02-01

    In-situ UV-Vis spectrophotometers offer the potential for improved estimates of dissolved organic carbon (DOC) fluxes for organic-rich systems such as peatlands because they are able to sample and log DOC proxies automatically through time at low cost. In turn, this could enable improved total carbon budget estimates for peatlands. The ability of such instruments to accurately measure DOC depends on a number of factors, not least of which is how absorbance measurements relate to DOC and the environmental conditions. Here we test the ability of an S::can Spectro::lyser™ to measure DOC in peatland streams with routinely high DOC concentrations. Through analysis of the spectral response data collected by the instrument we have been able to accurately measure DOC up to 66 mg L(-1), which is more than double the original upper calibration limit for this particular instrument. A linear regression modelling approach resulted in an accuracy >95%. The greatest accuracy was achieved when absorbance values for several different wavelengths were used at the same time in the model. However, an accuracy >90% was achieved using absorbance values for a single wavelength to predict DOC concentration. Our calculations indicated that, for organic-rich systems, in-situ measurement with a scanning spectrophotometer can improve fluvial DOC flux estimates by 6 to 8% compared with traditional sampling methods. Thus, our techniques pave the way for improved long-term carbon budget calculations from organic-rich systems such as peatlands. Copyright © 2015 Elsevier B.V. All rights reserved.
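
    The multi-wavelength calibration amounts to an ordinary least-squares fit of DOC against several absorbance channels. The sketch below is illustrative: the wavelengths, coefficients, and absorbance values are synthetic assumptions, not S::can instrument data.

```python
# Calibrate DOC (mg/L) against absorbance at several wavelengths via OLS.
import numpy as np

rng = np.random.default_rng(4)
doc = rng.uniform(5, 66, size=120)                       # DOC, mg/L
coeffs = np.array([0.030, 0.018, 0.010, 0.004])          # per-wavelength response (illustrative)
absorbance = doc[:, None] * coeffs + rng.normal(scale=0.02, size=(120, 4))

# Multi-wavelength model: DOC = A @ beta (last column is the intercept).
A = np.hstack([absorbance, np.ones((120, 1))])
beta, *_ = np.linalg.lstsq(A, doc, rcond=None)
pred = A @ beta
print("R^2:", 1 - np.sum((doc - pred)**2) / np.sum((doc - doc.mean())**2))
```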

  13. Detectors

    DOEpatents

    Orr, Christopher Henry; Luff, Craig Janson; Dockray, Thomas; Macarthur, Duncan Whittemore; Bounds, John Alan; Allander, Krag

    2002-01-01

    The apparatus and method provide techniques through which both alpha and beta emission determinations can be made simultaneously using a simple detector structure. The technique uses a beta detector covered in an electrically conducting material, the electrically conducting material discharging ions generated by alpha emissions, and as a consequence providing a measure of those alpha emissions. The technique also offers improved mountings for alpha detectors and other forms of detectors against vibration and the consequential effects vibration has on measurement accuracy.

  14. High accuracy switched-current circuits using an improved dynamic mirror

    NASA Technical Reports Server (NTRS)

    Zweigle, G.; Fiez, T.

    1991-01-01

    The switched-current technique, a recently developed circuit approach to analog signal processing, has emerged as an alternative/complement to the well-established switched-capacitor circuit technique. High-speed switched-current circuits offer potential cost and power savings over slower switched-capacitor circuits. Accuracy improvements are a primary concern at this stage in the development of the switched-current technique. Use of the dynamic current mirror has produced circuits that are insensitive to transistor matching errors. The dynamic current mirror has been limited by other sources of error, including clock-feedthrough and voltage transient errors. In this paper we present an improved switched-current building block using the dynamic current mirror. Utilizing current feedback, the errors due to current imbalance in the dynamic current mirror are reduced. Simulations indicate that this feedback can reduce total harmonic distortion by as much as 9 dB. Additionally, we have developed a clock-feedthrough reduction scheme for which simulations reveal a potential 10 dB total harmonic distortion improvement. The clock-feedthrough reduction scheme also significantly reduces offset errors and allows for cancellation with a constant current source. Experimental results confirm the simulated improvements.

  15. Actuator-Assisted Calibration of Freehand 3D Ultrasound System.

    PubMed

    Koo, Terry K; Silvia, Nathaniel

    2018-01-01

    Freehand three-dimensional (3D) ultrasound has been used independently of other technologies to analyze complex geometries or registered with other imaging modalities to aid surgical and radiotherapy planning. A fundamental requirement for all freehand 3D ultrasound systems is probe calibration. The purpose of this study was to develop an actuator-assisted approach to facilitate freehand 3D ultrasound calibration using point-based phantoms. We modified the mathematical formulation of the calibration problem to eliminate the need of imaging the point targets at different viewing angles and developed an actuator-assisted approach/setup to facilitate quick and consistent collection of point targets spanning the entire image field of view. The actuator-assisted approach was applied to a commonly used cross wire phantom as well as two custom-made point-based phantoms (original and modified), each containing 7 collinear point targets, and compared the results with the traditional freehand cross wire phantom calibration in terms of calibration reproducibility, point reconstruction precision, point reconstruction accuracy, distance reconstruction accuracy, and data acquisition time. Results demonstrated that the actuator-assisted single cross wire phantom calibration significantly improved the calibration reproducibility and offered similar point reconstruction precision, point reconstruction accuracy, distance reconstruction accuracy, and data acquisition time with respect to the freehand cross wire phantom calibration. On the other hand, the actuator-assisted modified "collinear point target" phantom calibration offered similar precision and accuracy when compared to the freehand cross wire phantom calibration, but it reduced the data acquisition time by 57%. It appears that both actuator-assisted cross wire phantom and modified collinear point target phantom calibration approaches are viable options for freehand 3D ultrasound calibration.

  16. Actuator-Assisted Calibration of Freehand 3D Ultrasound System

    PubMed Central

    2018-01-01

    Freehand three-dimensional (3D) ultrasound has been used independently of other technologies to analyze complex geometries or registered with other imaging modalities to aid surgical and radiotherapy planning. A fundamental requirement for all freehand 3D ultrasound systems is probe calibration. The purpose of this study was to develop an actuator-assisted approach to facilitate freehand 3D ultrasound calibration using point-based phantoms. We modified the mathematical formulation of the calibration problem to eliminate the need of imaging the point targets at different viewing angles and developed an actuator-assisted approach/setup to facilitate quick and consistent collection of point targets spanning the entire image field of view. The actuator-assisted approach was applied to a commonly used cross wire phantom as well as two custom-made point-based phantoms (original and modified), each containing 7 collinear point targets, and compared the results with the traditional freehand cross wire phantom calibration in terms of calibration reproducibility, point reconstruction precision, point reconstruction accuracy, distance reconstruction accuracy, and data acquisition time. Results demonstrated that the actuator-assisted single cross wire phantom calibration significantly improved the calibration reproducibility and offered similar point reconstruction precision, point reconstruction accuracy, distance reconstruction accuracy, and data acquisition time with respect to the freehand cross wire phantom calibration. On the other hand, the actuator-assisted modified “collinear point target” phantom calibration offered similar precision and accuracy when compared to the freehand cross wire phantom calibration, but it reduced the data acquisition time by 57%. It appears that both actuator-assisted cross wire phantom and modified collinear point target phantom calibration approaches are viable options for freehand 3D ultrasound calibration. PMID:29854371

  17. Developing an in silico minimum inhibitory concentration panel test for Klebsiella pneumoniae

    DOE PAGES

    Nguyen, Marcus; Brettin, Thomas; Long, S. Wesley; ...

    2018-01-11

    Here, antimicrobial resistant infections are a serious public health threat worldwide. Whole genome sequencing approaches to rapidly identify pathogens and predict antibiotic resistance phenotypes are becoming more feasible and may offer a way to reduce clinical test turnaround times compared to conventional culture-based methods, and in turn, improve patient outcomes. In this study, we use whole genome sequence data from 1668 clinical isolates of Klebsiella pneumoniae to develop an XGBoost-based machine learning model that accurately predicts minimum inhibitory concentrations (MICs) for 20 antibiotics. The overall accuracy of the model, within ± 1 two-fold dilution factor, is 92%. Individual accuracies are >= 90% for 15/20 antibiotics. We show that the MICs predicted by the model correlate with known antimicrobial resistance genes. Importantly, the genome-wide approach described in this study offers a way to predict MICs for isolates without knowledge of the underlying gene content. This study shows that machine learning can be used to build a complete in silico MIC prediction panel for K. pneumoniae and provides a framework for building MIC prediction models for other pathogenic bacteria.
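
    A toy version of this setup is easy to state: regress log2(MIC) on genome-derived features and score "within ± 1 two-fold dilution" as a log2 error of at most 1. The sketch below assumes the xgboost package is installed; the k-mer counts and labels are random placeholders, not data from the 1668 clinical isolates.

```python
# Predict log2(MIC) and score accuracy within +/-1 two-fold dilution.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(5)
X = rng.poisson(2.0, size=(1000, 256)).astype(float)   # placeholder k-mer counts
w = rng.normal(size=256)
log2_mic = np.clip(X @ w / 40.0, -3, 7)                # synthetic log2(MIC) labels

tr, te = np.arange(800), np.arange(800, 1000)
model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
model.fit(X[tr], log2_mic[tr])
pred = model.predict(X[te])

# "Within +/-1 two-fold dilution" means the predicted log2(MIC) is within 1.
within_1 = np.mean(np.abs(pred - log2_mic[te]) <= 1.0)
print("accuracy within +/-1 dilution:", within_1)
```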

  18. Developing an in silico minimum inhibitory concentration panel test for Klebsiella pneumoniae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Marcus; Brettin, Thomas; Long, S. Wesley

    Here, antimicrobial resistant infections are a serious public health threat worldwide. Whole genome sequencing approaches to rapidly identify pathogens and predict antibiotic resistance phenotypes are becoming more feasible and may offer a way to reduce clinical test turnaround times compared to conventional culture-based methods, and in turn, improve patient outcomes. In this study, we use whole genome sequence data from 1668 clinical isolates of Klebsiella pneumoniae to develop an XGBoost-based machine learning model that accurately predicts minimum inhibitory concentrations (MICs) for 20 antibiotics. The overall accuracy of the model, within ± 1 two-fold dilution factor, is 92%. Individual accuracies are >= 90% for 15/20 antibiotics. We show that the MICs predicted by the model correlate with known antimicrobial resistance genes. Importantly, the genome-wide approach described in this study offers a way to predict MICs for isolates without knowledge of the underlying gene content. This study shows that machine learning can be used to build a complete in silico MIC prediction panel for K. pneumoniae and provides a framework for building MIC prediction models for other pathogenic bacteria.

  19. Predicting outcome on admission and post-admission for acetaminophen-induced acute liver failure using classification and regression tree models.

    PubMed

    Speiser, Jaime Lynn; Lee, William M; Karvellas, Constantine J

    2015-01-01

    Assessing prognosis for acetaminophen-induced acute liver failure (APAP-ALF) patients often presents significant challenges. King's College criteria (KCC) have been validated on hospital admission, but little has been published on later phases of illness. We aimed to improve determination of prognosis both at the time of and following admission for APAP-ALF using Classification and Regression Tree (CART) models. CART models were applied to US ALFSG registry data to predict 21-day death or liver transplant early (on admission) and post-admission (days 3-7) for 803 APAP-ALF patients enrolled 01/1998-09/2013. Accuracy in prediction of outcome (AC), sensitivity (SN), specificity (SP), and area under the receiver-operating curve (AUROC) were compared between 3 models: KCC (INR, creatinine, coma grade, pH), a CART analysis using only KCC variables (KCC-CART), and a CART model using new variables (NEW-CART). Traditional KCC yielded 69% AC, 90% SP, 27% SN, and 0.58 AUROC on admission, with similar performance post-admission. KCC-CART at admission offered 66% AC, 65% SP, 67% SN, and 0.74 AUROC. Post-admission, KCC-CART had 82% AC, 86% SP, 46% SN, and 0.81 AUROC. NEW-CART models using MELD (Model for End-Stage Liver Disease), lactate, and mechanical ventilation on admission yielded 72% AC, 71% SP, 77% SN, and 0.79 AUROC. For later stages, NEW-CART (MELD, lactate, coma grade) offered 86% AC, 91% SP, 46% SN, and 0.73 AUROC. CARTs offer simple prognostic models for APAP-ALF patients, with higher AUROC and SN than KCC, similar AC, and negligibly worse SP. Both admission and post-admission predictions were developed. • Prognostication in acetaminophen-induced acute liver failure (APAP-ALF) is challenging beyond admission • Little has been published regarding the use of King's College criteria (KCC) beyond admission, and KCC has shown limited sensitivity in subsequent studies • Classification and Regression Tree (CART) methodology develops predictive models using binary splits and offers an intuitive method for predicting outcome, using processes familiar to clinicians • Data from the ALFSG registry suggest that CART prognosis models for the APAP population offer improved sensitivity and model performance over the traditional regression-based KCC, while maintaining similar accuracy and negligibly worse specificity • KCC-CART models offered modest improvement over traditional KCC, with NEW-CART models performing better than KCC-CART, particularly at late time points.
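
    A CART model of this kind is a shallow decision tree over a handful of clinical variables. The sketch below uses the NEW-CART admission variables (MELD, lactate, mechanical ventilation) with entirely synthetic patients; the risk function and thresholds are assumptions for illustration, not ALFSG registry values.

```python
# Shallow decision tree with AUROC, sensitivity, and specificity reporting.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(6)
n = 800
meld = rng.uniform(6, 40, n)
lactate = rng.uniform(0.5, 12, n)
ventilated = rng.integers(0, 2, n)
risk = 0.08 * meld + 0.3 * lactate + 1.0 * ventilated
outcome = (risk + rng.normal(scale=1.5, size=n) > 5.5).astype(int)  # death/transplant

X = np.column_stack([meld, lactate, ventilated])
tr, te = np.arange(600), np.arange(600, 800)
tree = DecisionTreeClassifier(max_depth=3).fit(X[tr], outcome[tr])
pred = tree.predict(X[te])

tn, fp, fn, tp = confusion_matrix(outcome[te], pred).ravel()
print("AUROC:", roc_auc_score(outcome[te], tree.predict_proba(X[te])[:, 1]),
      "sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```

    The binary splits of the fitted tree can be printed with sklearn's `export_text`, which is what makes CART models readable to clinicians.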

  20. Infrared calibration for climate: a perspective on present and future high-spectral resolution instruments

    NASA Astrophysics Data System (ADS)

    Revercomb, Henry E.; Anderson, James G.; Best, Fred A.; Tobin, David C.; Knuteson, Robert O.; LaPorte, Daniel D.; Taylor, Joe K.

    2006-12-01

    The new era of high spectral resolution infrared instruments for atmospheric sounding offers great opportunities for climate change applications. A major issue with most of our existing IR observations from space is spectral sampling uncertainty and the lack of standardization in spectral sampling. The new ultra resolution observing capabilities from the AIRS grating spectrometer on the NASA Aqua platform and from new operational FTS instruments (IASI on Metop, CrIS for NPP/NPOESS, and the GIFTS for a GOES demonstration) will go a long way toward improving this situation. These new observations offer the following improvements: 1. Absolute accuracy, moving from issues of order 1 K to <0.2-0.4 K brightness temperature, 2. More complete spectral coverage, with Nyquist sampling for scale standardization, and 3. Capabilities for unifying IR calibration among different instruments and platforms. However, more needs to be done to meet the immediate needs for climate and to effectively leverage these new operational weather systems, including 1. Place special emphasis on making new instruments as accurate as they can be to realize the potential of technological investments already made, 2. Maintain a careful validation program for establishing the best possible direct radiance check of long-term accuracy--specifically, continuing to use aircraft-or balloon-borne instruments that are periodically checked directly with NIST, and 3. Commit to a simple, new IR mission that will provide an ongoing backbone for the climate observing system. The new mission would make use of Fourier Transform Spectrometer measurements to fill in spectral and diurnal sampling gaps of the operational systems and provide a benchmark with better than 0.1K 3-sigma accuracy based on standards that are verifiable in-flight.

  1. Toward accurate prediction of pKa values for internal protein residues: the importance of conformational relaxation and desolvation energy.

    PubMed

    Wallace, Jason A; Wang, Yuhang; Shi, Chuanyin; Pastoor, Kevin J; Nguyen, Bao-Linh; Xia, Kai; Shen, Jana K

    2011-12-01

    Proton uptake or release controls many important biological processes, such as energy transduction, virus replication, and catalysis. Accurate pK(a) prediction informs about proton pathways, thereby revealing detailed acid-base mechanisms. Physics-based methods in the framework of molecular dynamics simulations not only offer pK(a) predictions but also inform about the physical origins of pK(a) shifts and provide details of ionization-induced conformational relaxation and large-scale transitions. One such method is the recently developed continuous constant pH molecular dynamics (CPHMD) method, which has been shown to be an accurate and robust pK(a) prediction tool for naturally occurring titratable residues. To further examine the accuracy and limitations of CPHMD, we blindly predicted the pK(a) values for 87 titratable residues introduced in various hydrophobic regions of staphylococcal nuclease and variants. The predictions gave a root-mean-square deviation of 1.69 pK units from experiment, and there were only two pK(a)'s with errors greater than 3.5 pK units. Analysis of the conformational fluctuation of titrating side-chains in the context of the errors of calculated pK(a) values indicate that explicit treatment of conformational flexibility and the associated dielectric relaxation gives CPHMD a distinct advantage. Analysis of the sources of errors suggests that more accurate pK(a) predictions can be obtained for the most deeply buried residues by improving the accuracy in calculating desolvation energies. Furthermore, it is found that the generalized Born implicit-solvent model underlying the current CPHMD implementation slightly distorts the local conformational environment such that the inclusion of an explicit-solvent representation may offer improvement of accuracy. Copyright © 2011 Wiley-Liss, Inc.

  2. Comparison of traditional nondestructive analysis of RERTR fuel plates with digital radiographic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidsmeier, T.; Koehl, R.; Lanham, R.

    2008-07-15

    The current design and fabrication process for RERTR fuel plates utilizes film radiography during nondestructive testing and characterization. Digital radiographic methods offer potential increases in efficiency and accuracy. The traditional and digital radiographic methods are described and demonstrated on a fuel plate constructed, by the dispersion method, with an average of 51% fuel by volume. Fuel loading data from each method are analyzed and compared to a third baseline method to assess accuracy. The new digital method is shown to be more accurate, to save hours of work, and to provide additional information not easily available with the traditional method. Possible further improvements suggested by the new digital method are also raised. (author)

  3. Photometry with FORS

    NASA Astrophysics Data System (ADS)

    Freudling, W.; Møller, P.; Patat, F.; Moehler, S.; Romaniello, M.; Jehin, E.; O'Brien, K.; Izzo, C.; Pompei, E.

    Photometric calibration observations are routinely carried out with all ESO imaging cameras on every clear night. The nightly zeropoints derived from these observations are accurate to about 10%. Recently, we have started the FORS Absolute Photometry Project (FAP) to investigate if and how percent-level absolute photometric accuracy can be achieved with FORS1, and how such photometric calibration can be offered to observers. We found that there are significant differences between the sky-flats and the true photometric response of the instrument, which partially depend on the rotator angle. A second-order correction to the sky-flat significantly improves the relative photometry within the field. We demonstrate the feasibility of percent-level photometry and describe the calibrations necessary to achieve that level of accuracy.

  4. Calibration of transonic and supersonic wind tunnels

    NASA Technical Reports Server (NTRS)

    Reed, T. D.; Pope, T. C.; Cooksey, J. M.

    1977-01-01

    State-of-the-art instrumentation and procedures for calibrating transonic (0.6 < M < 1.4) and supersonic (M <= 3.5) wind tunnels were reviewed and evaluated. Major emphasis was given to transonic tunnels. Continuous, blowdown, and intermittent tunnels were considered. The required measurements of pressure, temperature, flow angularity, noise, and humidity were discussed, and the effects of measurement uncertainties were summarized. A comprehensive review of instrumentation currently used to calibrate empty-tunnel flow conditions was included. Recent results of relevant research are noted and recommendations for achieving improved data accuracy are made where appropriate. It is concluded, for general testing purposes, that satisfactory calibration measurements can be achieved in both transonic and supersonic tunnels. The goal of calibrating transonic tunnels to within 0.001 in centerline Mach number appears to be feasible with existing instrumentation, provided correct calibration procedures are carefully followed. Comparable accuracy can be achieved off-centerline with carefully designed, conventional probes, except near Mach 1. In the range 0.95 < M < 1.05, the laser Doppler velocimeter appears to offer the most promise for improved calibration accuracy off-centerline.
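
    Tunnel Mach number is conventionally derived from measured stagnation and static pressures via the standard isentropic relation, M = sqrt((2/(gamma-1)) * ((p0/p)^((gamma-1)/gamma) - 1)). The worked example below applies that textbook formula (valid where the probe sees the true stagnation pressure, i.e. below the pitot-shock regime); it is an illustration, not the calibration procedure of any specific facility.

```python
# Isentropic Mach number from stagnation pressure p0 and static pressure p.
import math

def mach_from_pressures(p0, p, gamma=1.4):
    """Mach number via the isentropic relation; gamma = 1.4 for air."""
    return math.sqrt((2.0 / (gamma - 1.0)) *
                     ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))

# Example: a pressure ratio p0/p of about 1.893 corresponds to M = 1.
print(mach_from_pressures(1.893, 1.0))
```

    Holding centerline Mach number to 0.001, as the abstract discusses, therefore translates directly into tight tolerances on the pressure-ratio measurement.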

  5. Using Multiple Adaptively-Weighted Strategies for the Resolution of Demonstratives

    DTIC Science & Technology

    1993-05-10

    better coverage and accuracy, and reducing the reliance on user intervention. In addition to increased coverage, the multi-strategy approach offers easy...

  6. LLNL/Lion Precision LVDT amplifier

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, D.J.

    1994-04-01

    A high-precision, low-noise LVDT amplifier has been developed which is a significant advancement on the current state of the art in contact displacement measurement. This amplifier offers the dynamic range of a typical LVDT probe but with a resolution that rivals that of non-contact displacement measuring systems such as capacitance gauges and laser interferometers. Resolution of 0.1 µin with 100 Hz bandwidth is possible. This level of resolution is over an order of magnitude greater than what is now commercially available. A front panel switch can reduce the bandwidth to 2.5 Hz and attain a resolution of 0.025 µin. This level of resolution meets or exceeds that of displacement-measuring laser interferometry or capacitance gauge systems. Contact displacement measurement offers high part spatial resolution and therefore can measure not only part contour but surface finish. Capacitance gauges and displacement laser interferometry offer poor part spatial resolution and cannot provide good surface finish measurements. Machine tool builders, metrologists, and quality inspection departments can immediately utilize the higher accuracy and capabilities that this amplifier offers. The precision manufacturing industry can improve as a result of the improved capability to measure parts, which helps reduce costs and minimize material waste.

  7. A new generation of effective core potentials for correlated calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Michael Chandler; Melton, Cody A.; Annaberdiyev, Abdulgani

    Here, we outline ideas on desired properties for a new generation of effective core potentials (ECPs) that will allow valence-only calculations to reach the full potential offered by recent advances in many-body wave function methods. The key improvements include consistent use of correlated methods throughout ECP constructions and improved transferability as required for an accurate description of molecular systems over a range of geometries. The guiding principle is the isospectrality of all-electron and ECP Hamiltonians for a subset of valence states. We illustrate these concepts on a few first- and second-row atoms (B, C, N, O, S), and we obtain higher accuracy in transferability than previous constructions while using semi-local ECPs with a small number of parameters. In addition, the constructed ECPs enable many-body calculations of valence properties with higher (or same) accuracy than their all-electron counterparts with uncorrelated cores. This implies that the ECPs include also some of the impacts of core-core and core-valence correlations on valence properties. The results open further prospects for ECP improvements and refinements.

  8. A new generation of effective core potentials for correlated calculations

    DOE PAGES

    Bennett, Michael Chandler; Melton, Cody A.; Annaberdiyev, Abdulgani; ...

    2017-12-12

    Here, we outline ideas on desired properties for a new generation of effective core potentials (ECPs) that will allow valence-only calculations to reach the full potential offered by recent advances in many-body wave function methods. The key improvements include consistent use of correlated methods throughout ECP constructions and improved transferability as required for an accurate description of molecular systems over a range of geometries. The guiding principle is the isospectrality of all-electron and ECP Hamiltonians for a subset of valence states. We illustrate these concepts on a few first- and second-row atoms (B, C, N, O, S), and we obtain higher accuracy in transferability than previous constructions while using semi-local ECPs with a small number of parameters. In addition, the constructed ECPs enable many-body calculations of valence properties with higher (or same) accuracy than their all-electron counterparts with uncorrelated cores. This implies that the ECPs include also some of the impacts of core-core and core-valence correlations on valence properties. The results open further prospects for ECP improvements and refinements.

  9. Machine Learning Algorithms Outperform Conventional Regression Models in Predicting Development of Hepatocellular Carcinoma

    PubMed Central

    Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K

    2015-01-01

    Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60–0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement (p<0.001) and integrated discrimination improvement (p=0.04). The HALT-C model had a c-statistic of 0.60 (95%CI 0.50-0.70) in the validation cohort and was outperformed by the machine learning algorithm (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high-risk for developing HCC. PMID:24169273
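
    The core comparison, a conventional regression model versus a machine learning model scored by c-statistic (AUROC), is easy to sketch. The cohort below is simulated with a mildly nonlinear risk surface; the predictors are hypothetical, and a random forest stands in for the study's machine learning algorithm.

```python
# Compare regression and ML models by c-statistic on a synthetic cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
X = rng.normal(size=(1000, 8))                         # placeholder clinical predictors
logit = X[:, 0] + 0.8 * X[:, 1] * X[:, 2] - 0.5 * X[:, 3] ** 2
y = (logit + rng.logistic(size=1000) > 0).astype(int)  # HCC development (synthetic)

tr, te = np.arange(700), np.arange(700, 1000)
models = [("regression", LogisticRegression(max_iter=1000)),
          ("machine learning", RandomForestClassifier(n_estimators=300, random_state=0))]
for name, model in models:
    model.fit(X[tr], y[tr])
    auc = roc_auc_score(y[te], model.predict_proba(X[te])[:, 1])
    print(name, "c-statistic:", round(auc, 3))
```

    On a surface with interactions and curvature like this one, the tree ensemble typically edges out the linear model, mirroring the direction of the study's result.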

  10. Restoration Of MEX SRC Images For Improved Topography: A New Image Product

    NASA Astrophysics Data System (ADS)

    Duxbury, T. C.

    2012-12-01

    Surface topography is an important constraint when investigating the evolution of solar system bodies. Topography is typically obtained from stereo photogrammetric or photometric (shape-from-shading) analyses of overlapping / stereo images and from laser / radar altimetry data. The ESA Mars Express mission [1] carries a Super Resolution Channel (SRC) as part of the High Resolution Stereo Camera (HRSC) [2]. The SRC can build up overlapping / stereo coverage of Mars, Phobos, and Deimos by viewing the surfaces from different orbits. The derivation of high-precision topography data from the raw SRC images is degraded because the camera is out of focus. The point spread function (PSF) is multi-peaked, covering tens of pixels. After registering and co-adding hundreds of star images, an accurate SRC PSF was reconstructed and is being used to restore the SRC images to near blur-free quality. The restored images offer a factor of about 3 improvement in geometric accuracy as well as identifying the smallest of features, significantly improving the stereo photogrammetric accuracy in producing digital elevation models. The difference between blurred and restored images provides a new derived image product that improves feature recognition, increasing the spatial resolution and topographic accuracy of derived elevation models. Acknowledgements: This research was funded by the NASA Mars Express Participating Scientist Program. [1] Chicarro, et al., ESA SP 1291 (2009) [2] Neukum, et al., ESA SP 1291 (2009). [Figure: a raw SRC image (h4235.003) of a crater within Gale crater (the MSL landing site) and its restored version (left); a raw Phobos image (h0715.004) and the difference between the raw and restored images, a new derived image data product (right). The restored products significantly improve feature recognition and hence derived topographic accuracy.]
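
    Restoring an image with a measured PSF is a standard deconvolution problem. The abstract does not name its algorithm, so the sketch below uses Richardson-Lucy deconvolution (via scikit-image) as one common choice; the scene and Gaussian PSF are synthetic stand-ins for SRC frames and the star-derived PSF.

```python
# PSF-based restoration of a blurred, noisy image with Richardson-Lucy.
import numpy as np
from scipy.signal import convolve2d
from skimage.restoration import richardson_lucy

rng = np.random.default_rng(8)
scene = np.zeros((64, 64)); scene[20:30, 35:50] = 1.0     # synthetic surface feature
x, y = np.meshgrid(np.arange(-7, 8), np.arange(-7, 8))
psf = np.exp(-(x**2 + y**2) / 8.0); psf /= psf.sum()      # placeholder multi-pixel PSF

blurred = convolve2d(scene, psf, mode="same", boundary="symm")
blurred += rng.normal(scale=0.005, size=blurred.shape)

restored = richardson_lucy(np.clip(blurred, 0, 1), psf, num_iter=30)
print("blurred residual:", np.abs(blurred - scene).mean(),
      "restored residual:", np.abs(restored - scene).mean())
```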

  11. Ultrasonic flowmeters offer oil line leak-detection potential

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hettrich, U.

    1995-04-01

    Ultrasonic flowmeters (USFM) installed on Transalpine Pipeline Co.'s (TAL) crude-oil system have proven to be a cost-effective flow measurement technique and beneficial in batch identification and leak detection. Through close examination, TAL has determined that clamp-on USFMs offer cost-saving advantages in installation, maintenance, and operation. USFMs do not disturb pig passage. The technique also provides sound velocity capabilities, which can be used for liquid identification and batch tracking. The instruments have a repeatability of better than 0.25% and achieve an accuracy of better than 1%, depending on the predictability of the flow profile. Using USFMs with multiple beams will probably improve accuracy further, and it should be possible to find leaks even smaller than 1% of flow.
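
    Leak detection with two flowmeters reduces to a line balance: flag a leak when the time-averaged inlet-minus-outlet imbalance exceeds a threshold. The sketch below uses synthetic flow series; the 1%-of-flow threshold mirrors the figure quoted in the abstract, while the window length and noise levels are assumptions.

```python
# Line-balance leak detection from inlet and outlet flow measurements.
import numpy as np

rng = np.random.default_rng(9)
t = np.arange(3600)                                   # one hour at 1 Hz
q_in = 1000 + rng.normal(scale=2.5, size=t.size)      # m3/h, inlet meter
q_out = q_in - np.where(t > 1800, 12.0, 0.0)          # 1.2% leak starts at t = 1800 s
q_out += rng.normal(scale=2.5, size=t.size)           # independent outlet meter noise

window = 120                                          # s, averaging window
imbalance = np.convolve(q_in - q_out, np.ones(window) / window, mode="valid")
alarm = imbalance > 0.01 * 1000                       # 1%-of-flow threshold

print("first alarm at t =", int(np.argmax(alarm)) + window - 1, "s")
```

    Averaging over a window trades detection latency for robustness: meter noise shrinks with the square root of the window length, while a sustained leak does not.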

  12. Improving Density Functional Tight Binding Predictions of Free Energy Surfaces for Slow Chemical Reactions in Solution

    NASA Astrophysics Data System (ADS)

    Kroonblawd, Matthew; Goldman, Nir

    2017-06-01

    First principles molecular dynamics using highly accurate density functional theory (DFT) is a common tool for predicting chemistry, but the accessible time and space scales are often orders of magnitude beyond the resolution of experiments. Semi-empirical methods such as density functional tight binding (DFTB) offer up to a thousand-fold reduction in required CPU hours and can approach experimental scales. However, standard DFTB parameter sets lack good transferability and calibration for a particular system is usually necessary. Force matching the pairwise repulsive energy term in DFTB to short DFT trajectories can improve the former's accuracy for reactions that are fast relative to DFT simulation times (<10 ps), but the effects on slow reactions and the free energy surface are not well-known. We present a force matching approach to improve the chemical accuracy of DFTB. Accelerated sampling techniques are combined with path collective variables to generate the reference DFT data set and validate fitted DFTB potentials. Accuracy of force-matched DFTB free energy surfaces is assessed for slow peptide-forming reactions by direct comparison to DFT for particular paths. Extensions to model prebiotic chemistry under shock conditions are discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
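
    Force matching itself is a least-squares fit: choose a basis for the correction term and fit its coefficients so that model forces reproduce reference forces. The toy below illustrates only that fitting step, with a synthetic "DFT" force curve and an assumed exponential basis; it is not a DFTB repulsive-potential parameterisation.

```python
# Toy force matching: fit a pairwise repulsive force to reference forces.
import numpy as np

rng = np.random.default_rng(12)
r = rng.uniform(0.8, 2.5, size=400)            # sampled pair distances (Angstrom)
f_ref = 8.0 * np.exp(-2.0 * r) - 0.1           # reference repulsive forces (synthetic)

# Basis for the repulsive force: decaying exponentials plus a constant offset.
basis = np.column_stack([np.exp(-r), np.exp(-2.0 * r), np.ones_like(r)])
coef, *_ = np.linalg.lstsq(basis, f_ref, rcond=None)

print("fitted coefficients:", coef)
print("rms force error:", np.sqrt(np.mean((basis @ coef - f_ref) ** 2)))
```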

  13. AGU Climate Scientists Offer Question-and-Answer Service for Media

    NASA Astrophysics Data System (ADS)

    Jackson, Stacy

    2010-03-01

    In fall 2009, AGU launched a member-driven pilot project to improve the accuracy of climate science coverage in the media and to improve public understanding of climate science. The project's goal was to increase the accessibility of climate science experts to journalists across the full spectrum of media outlets. As a supplement to the traditional one-to-one journalist-expert relationship model, the project tested the novel approach of providing a question-and-answer (Q&A) service with a pool of expert scientists and a Web-based interface with journalists. Questions were explicitly limited to climate science to maintain a nonadvocacy, nonpartisan perspective.

  14. SAR antenna calibration techniques

    NASA Technical Reports Server (NTRS)

    Carver, K. R.; Newell, A. C.

    1978-01-01

    Calibration of SAR antennas requires a measurement of gain, elevation and azimuth pattern shape, boresight error, cross-polarization levels, and phase vs. angle and frequency. For spaceborne SAR antennas of SEASAT size operating at C-band or higher, some of these measurements can become extremely difficult using conventional far-field antenna test ranges. Near-field scanning techniques offer an alternative approach and for C-band or X-band SARs, give much improved accuracy and precision as compared to that obtainable with a far-field approach.

  15. A pilot feasibility study of virtual patient simulation to enhance social work students' brief mental health assessment skills.

    PubMed

    Washburn, Micki; Bordnick, Patrick; Rizzo, Albert Skip

    2016-10-01

    This study presents preliminary feasibility and acceptability data on the use of virtual patient (VP) simulations to develop brief assessment skills within an interdisciplinary care setting. Results support the acceptability of technology-enhanced simulations and offer preliminary evidence for an association between engagement in VP practice simulations and improvements in diagnostic accuracy and clinical interviewing skills. Recommendations and next steps for research on technology-enhanced simulations within social work are discussed.

  16. Magnetic resonance imaging in prostate brachytherapy: Evidence, clinical end points to data, and direction forward.

    PubMed

    Pugh, Thomas J; Pokharel, Sajal S

    The integration of multiparametric MRI into prostate brachytherapy has become a subject of interest over the past 2 decades. MRI directed high-dose-rate and low-dose-rate prostate brachytherapy offers the potential to improve treatment accuracy and standardize postprocedure quality. This article reviews the evidence to date on MRI utilization in prostate brachytherapy and postulates future pathways for MRI integration. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  17. Flight test and evaluation of Omega navigation for general aviation

    NASA Technical Reports Server (NTRS)

    Hwoschinsky, P. V.

    1975-01-01

    A seventy-hour flight test program was performed to determine the suitability and accuracy of a low-cost Omega navigation receiver in a general aviation aircraft. An analysis was made of signal availability in two widely separated geographic areas. The results of these flights are compared with other navigation systems. Conclusions drawn from the test experience indicate that developmental system improvement is necessary before a competent fail-safe or fail-soft area navigation system is offered to general aviation.

  18. High-pressure liquid chromatography analysis of antibiotic susceptibility disks.

    PubMed Central

    Hagel, R B; Waysek, E H; Cort, W M

    1979-01-01

    The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations with recoveries of greater than 95%. Relative standard deviations of lower than 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793

  19. Combining accuracy assessment of land-cover maps with environmental monitoring programs

    Treesearch

    Stephen V. Stehman; Raymond L. Czaplewski; Sarah M. Nusser; Limin Yang; Zhiliang Zhu

    2000-01-01

    A scientifically valid accuracy assessment of a large-area, land-cover map is expensive. Environmental monitoring programs offer a potential source of data to partially defray the cost of accuracy assessment while still maintaining the statistical validity. In this article, three general strategies for combining accuracy assessment and environmental monitoring...

  20. Treatment of word-finding deficits in fluent aphasia through the manipulation of spatial attention: Preliminary findings

    PubMed Central

    Dotson, Vonetta M.; Singletary, Floris; Fuller, Renee; Koehler, Shirley; Moore, Anna Bacon; Gonzalez Rothi, Leslie J.; Crosson, Bruce

    2010-01-01

    Background Attention, the processing of one source of information to the exclusion of others, is important for most cognitive processes, including language. Evidence suggests not only that dysfunctional attention mechanisms contribute to language deficits after stroke, but also that orienting attention to a patient's ipsilesional hemispace recruits attention mechanisms in the intact hemisphere and improves language functions in some persons with aphasia. Aims The aim of the current research was to offer proof of concept for the strategy of improving picture-naming performance in fluent aphasia by moving stimuli into the left hemispace. It was hypothesised that repeated orientation of attention to the ipsilesional hemispace during picture naming would lead to improved naming accuracy for participants with fluent aphasia. Methods & Procedures Three participants with stable fluent aphasia received daily treatment sessions that consisted of naming simple line drawings presented 45 degrees to the left of body midline on a computer monitor. Naming probes were administered before initiation of the treatment protocol to establish a baseline, and before each treatment session to measure change during treatment. The C statistic was used to establish the stability of baseline performance and to determine whether the slope of the treatment phases differed significantly from the slope of the baseline. Outcomes & Results Two of the three participants showed significant improvement over baseline performance in the percent correct of naming probes. One participant showed no improvement over baseline accuracy. Conclusions Results suggest that engaging right-hemisphere attention mechanisms may improve naming accuracy in some people with fluent aphasia. Findings justify further investigation of this treatment in a larger controlled study. PMID:22131638

  1. Improvements in low-cost label-free QPI microscope for live cell imaging

    NASA Astrophysics Data System (ADS)

    Seniya, C.; Towers, C. E.; Towers, D. P.

    2017-07-01

    This paper reports an improvement in the development of a low-cost QPI microscope, offering new capabilities in terms of phase measurement accuracy for label-free live samples over the longer term (i.e., hours to days). The spatially separated scattered and non-scattered image light fields are reshaped in the Fourier plane and modulated to form an interference image at a CCD camera. The apertures that enable these two beams to be generated have been optimised by means of laser-cut apertures placed on the mirrors of a Michelson interferometer, which has improved the phase measuring and reconstruction capability of the QPI microscope. The microscope was tested with transparent onion cells as the object of interest.

  2. Improving the recommender algorithms with the detected communities in bipartite networks

    NASA Astrophysics Data System (ADS)

    Zhang, Peng; Wang, Duo; Xiao, Jinghua

    2017-04-01

    Recommender systems offer a powerful tool for addressing the information overload problem and have thus gained wide attention from scholars and engineers. A key challenge is how to make recommendations more accurate and personalized. We notice that community structures widely exist in many real networks, which can significantly affect the recommendation results. By incorporating information about detected communities into the recommendation algorithms, an improved recommendation approach for networks with communities is proposed. The approach is examined in both artificial and real networks; the results show improvements in accuracy and diversity of up to 20% and 7%, respectively. This reveals that it is beneficial to classify the nodes based on their inherent properties in recommender systems.
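
    To make the community-boosting idea concrete, here is a minimal sketch (not the authors' algorithm): a standard two-step mass-diffusion recommender on a toy user-item bipartite network, where candidate items in the same detected community as the target user receive a score bonus. The toy graph, the boost factor and the use of greedy modularity are all illustrative assumptions.

        import networkx as nx
        from networkx.algorithms import community

        # Toy bipartite network: users u*, items i*.
        edges = [("u1", "i1"), ("u1", "i2"), ("u2", "i2"), ("u2", "i3"),
                 ("u3", "i3"), ("u3", "i4"), ("u4", "i4"), ("u4", "i1")]
        G = nx.Graph(edges)

        # Detect communities (greedy modularity stands in for whatever
        # community-detection method the network calls for).
        comms = list(community.greedy_modularity_communities(G))
        comm_of = {n: ci for ci, c in enumerate(comms) for n in c}

        def recommend(user, boost=1.2):
            """Two-step mass diffusion with a same-community bonus."""
            scores = {}
            for item in G[user]:                # items the user collected
                for other in G[item]:           # users sharing that item
                    for cand in G[other]:       # their other items
                        if cand in G[user]:
                            continue            # recommend unseen items only
                        w = 1.0 / (G.degree(item) * G.degree(other))
                        if comm_of[cand] == comm_of[user]:
                            w *= boost          # community-aware re-weighting
                        scores[cand] = scores.get(cand, 0.0) + w
            return sorted(scores, key=scores.get, reverse=True)

        print(recommend("u1"))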

  3. Bayesian Network Structure Learning for Urban Land Use Classification from Landsat ETM+ and Ancillary Data

    NASA Astrophysics Data System (ADS)

    Park, M.; Stenstrom, M. K.

    2004-12-01

    Recognizing urban information from satellite imagery is problematic due to the diverse features and dynamic changes of urban land use. The use of Landsat imagery for urban land use classification involves inherent uncertainty due to its spatial resolution and the low separability among land uses. To address this uncertainty, we investigated the performance of Bayesian networks for urban land use classification, since Bayesian networks provide a quantitative way of handling uncertainty and have been used successfully in many areas. In this study, we developed optimized networks for urban land use classification from Landsat ETM+ images of the Marina del Rey area based on USGS land cover/use classification level III. The networks started from a tree structure based on mutual information between variables and added links to improve accuracy. This methodology offers several advantages: (1) The network structure shows the dependency relationships between variables. The class node value can be predicted even when particular band information is missing due to sensor system error; the missing information can be inferred from other dependent bands. (2) The network structure identifies the variables that are important for the classification, information that is not available from conventional classification methods such as neural networks and maximum likelihood classification. In our case, for example, bands 1, 5 and 6 are the most important inputs in determining the land use of each pixel. (3) The networks can be reduced to those input variables important for classification, which shrinks the problem without having to consider all possible variables. We also examined the effect of incorporating ancillary data: geospatial information such as the X and Y coordinates of each pixel and DEM data, and vegetation indices such as NDVI and the Tasseled Cap transformation. The results showed that the locational information improved overall accuracy (81%) and kappa coefficient (76%), and lowered the omission and commission errors compared with using only spectral data (accuracy 71%, kappa coefficient 62%). Incorporating DEM data did not significantly improve overall accuracy (74%) or kappa coefficient (66%) but lowered the omission and commission errors. Incorporating NDVI did not much improve the overall accuracy (72%) or kappa coefficient (65%), and including the Tasseled Cap transformation reduced the accuracy (accuracy 70%, kappa 61%). Therefore, the additional information from the DEM and vegetation indices was not as useful as the locational ancillary data.
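
    The first step described above (a tree skeleton built from pairwise mutual information) can be sketched as follows; the synthetic discretized band values, the number of bands and the use of a Chow-Liu-style maximum spanning tree are assumptions for illustration, not the authors' exact procedure.

        import numpy as np
        from sklearn.metrics import mutual_info_score
        from scipy.sparse.csgraph import minimum_spanning_tree

        rng = np.random.default_rng(0)
        n_pixels, n_bands = 1000, 6
        bands = rng.integers(0, 8, size=(n_pixels, n_bands))  # discretized DNs

        # Pairwise mutual information between bands (upper triangle).
        mi = np.zeros((n_bands, n_bands))
        for i in range(n_bands):
            for j in range(i + 1, n_bands):
                mi[i, j] = mutual_info_score(bands[:, i], bands[:, j])

        # Maximum-weight spanning tree = minimum spanning tree on negated MI.
        tree = minimum_spanning_tree(-mi)
        skeleton = list(zip(*tree.nonzero()))
        print("initial tree edges between bands:", skeleton)

    Links would then be added to this skeleton wherever they improve classification accuracy, as the abstract describes.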

  4. PRGdb 3.0: a comprehensive platform for prediction and analysis of plant disease resistance genes.

    PubMed

    Osuna-Cruz, Cristina M; Paytuvi-Gallart, Andreu; Di Donato, Antimo; Sundesha, Vicky; Andolfo, Giuseppe; Aiese Cigliano, Riccardo; Sanseverino, Walter; Ercolano, Maria R

    2018-01-04

    The Plant Resistance Genes database (PRGdb; http://prgdb.org) has been redesigned with a new user interface, new sections, new tools and new data for genetic improvement, allowing easy access not only to the plant science research community but also to breeders who want to improve plant disease resistance. The home page offers an overview of easy-to-read search boxes that streamline data queries and directly show plant species for which data from candidate or cloned genes have been collected. Bulk data files and curated resistance gene annotations are made available for each plant species hosted. The new Gene Model view offers detailed information on each cloned resistance gene structure to highlight shared attributes with other genes. PRGdb 3.0 offers 153 reference resistance genes and 177 072 annotated candidate Pathogen Receptor Genes (PRGs). Compared to the previous release, the number of putative genes has been increased from 106 K to 177 K, drawn from 76 sequenced Viridiplantae and algae genomes. The DRAGO 2 tool, which automatically annotates and predicts PRGs from DNA and amino acid sequences with high accuracy and sensitivity, has been added. A BLAST search has been implemented to offer users the opportunity to annotate and compare their own sequences. The improved section on plant diseases displays useful information linked to genes and genomes to connect complementary data and better address specific needs. Through a revised and enlarged collection of data, the development of new tools and a renewed portal, PRGdb 3.0 engages the plant science community in developing a consensus plan to improve knowledge and strategies to fight diseases that afflict main crops and other plants. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Accuracy improvement techniques in Precise Point Positioning method using multiple GNSS constellations

    NASA Astrophysics Data System (ADS)

    Psychas, Dimitrios Vasileios; Delikaraoglou, Demitris

    2016-04-01

    The future Global Navigation Satellite Systems (GNSS), including modernized GPS, GLONASS, Galileo and BeiDou, offer three or more signal carriers for civilian use and many more redundant observables. The additional frequencies can significantly improve the capabilities of the traditional geodetic techniques based on GPS signals at two frequencies, especially with regard to the availability, accuracy, interoperability and integrity of high-precision GNSS applications. Furthermore, highly redundant measurements allow for robust simultaneous estimation of static or mobile user states, including additional parameters such as real-time tropospheric biases, and for more reliable ambiguity resolution estimates. This paper presents an investigation and analysis of accuracy improvement techniques in the Precise Point Positioning (PPP) method using signals from the fully operational (GPS and GLONASS) as well as the emerging (Galileo and BeiDou) GNSS systems. The main aim was to determine the improvement in both the positioning accuracy achieved and the convergence time required to achieve geodetic-level (10 cm or less) accuracy. To this end, freely available observation data from the recent Multi-GNSS Experiment (MGEX) of the International GNSS Service, as well as the open source program RTKLIB, were used. Following a brief background of the PPP technique and the scope of MGEX, the paper outlines the various observational scenarios that were used to test various data processing aspects of PPP solutions with multi-frequency, multi-constellation GNSS systems. Results from the processing of multi-GNSS observation data from selected permanent MGEX stations are presented and useful conclusions and recommendations for further research are drawn. As shown, data fusion from the GPS, GLONASS, Galileo and BeiDou systems is becoming increasingly significant, resulting in increased position accuracy (mostly in the less favorable East direction) and a large reduction of convergence time in PPP static and kinematic solutions compared to GPS-only PPP solutions for various observational session durations. However, this is mostly observed when the visibility of Galileo and BeiDou satellites is substantially long within an observational session. In GPS-only cases dealing with data from high elevation cut-off angles, the number of GPS satellites decreases dramatically, leading to position accuracy and convergence time that deviate from satisfactory geodetic thresholds. By contrast, the respective multi-GNSS PPP solutions not only show improvement, but also reach geodetic-level accuracies even at a 30° elevation cut-off. Finally, GPS ambiguity resolution in PPP processing is investigated using the GPS satellite wide-lane fractional cycle biases included in the clock products by CNES. It is shown that their addition shortens the convergence time and increases the position accuracy of PPP solutions, especially in kinematic mode. Analogous improvement is obtained in the respective multi-GNSS solutions, even though the GLONASS, Galileo and BeiDou ambiguities remain float, since information about them is not provided in the clock products available to date.

  6. Smartphone Mobile Applications to Enhance Diagnosis of Skin Cancer: A Guide for the Rural Practitioner.

    PubMed

    Cook, Shane E; Palmer, Louis C; Shuler, Franklin D

    2015-01-01

    Primary care physicians occupy a vital position to impact many devastating conditions, especially those dependent upon early diagnosis, such as skin cancer. Skin cancer is the most common cancer in the United States and despite improvements in skin cancer therapy, patients with a delay in diagnosis and advanced disease continue to have a grave prognosis. Due to a variety of barriers, advanced stages of skin cancer are more prominent in rural populations. In order to improve early diagnosis four things are paramount: increased patient participation in prevention methods, establishment of screening guidelines, increased diagnostic accuracy of malignant lesions, and easier access to dermatologists. Recent expansion in smartphone mobile application technology offers simple ways for rural practitioners to address these problems. More than 100,000 health related applications are currently available, with over 200 covering dermatology. This review will evaluate the newest and most useful of those applications offered to enhance the prevention and early diagnosis of skin cancer, particularly in the rural population.

  7. Mapping fractional woody cover in semi-arid savannahs using multi-seasonal composites from Landsat data

    NASA Astrophysics Data System (ADS)

    Higginbottom, Thomas P.; Symeonakis, Elias; Meyer, Hanna; van der Linden, Sebastian

    2018-05-01

    Increasing attention is being directed at mapping the fractional woody cover of savannahs using Earth-observation data. In this study, we test the utility of Landsat TM/ETM-based spectral-temporal variability metrics for mapping regional-scale woody cover in the Limpopo Province of South Africa, for 2010. We employ a machine learning framework to compare the accuracies of Random Forest models derived using metrics calculated from different seasons. We compare these results to those from fused Landsat-PALSAR data to establish whether seasonal metrics can compensate for structural information from the PALSAR signal. Furthermore, we test the applicability of a statistical variable selection method, recursive feature elimination (RFE), in the automation of the model building process, in order to reduce model complexity and processing time. All of our tests were repeated at four scales (30, 60, 90, and 120 m pixels) to investigate the role of spatial resolution on modelled accuracies. Our results show that multi-seasonal composites combining imagery from both the dry and wet seasons produced the highest accuracies (R2 = 0.77, RMSE = 9.4, at the 120 m scale). When using a single season of observations, dry season imagery performed best (R2 = 0.74, RMSE = 9.9, at the 120 m resolution). Combining Landsat and radar imagery was only marginally beneficial, offering a mean relative improvement of 1% in accuracy at the 120 m scale. However, this improvement was concentrated in areas with lower densities of woody coverage (<30%), which are areas of concern for environmental monitoring. At finer spatial resolutions, the inclusion of SAR data actually reduced accuracies. Overall, the RFE was able to produce the most accurate model (R2 = 0.8, RMSE = 8.9, at the 120 m pixel scale). For mapping savannah woody cover at the 30 m pixel scale, we suggest that monitoring methodologies continue to exploit the Landsat archive, but aim to use multi-seasonally derived information. When the coarser 120 m pixel scale is adequate, integration of Landsat and SAR data should be considered, especially in areas with lower woody cover densities. The use of multiple seasonal compositing periods offers promise for large-area mapping of savannahs, even in regions with limited historical Landsat coverage.
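
    As a concrete illustration of the modelling framework (Random Forest regression plus recursive feature elimination, scored by R2 and RMSE), the sketch below runs the same pattern on synthetic stand-ins for the seasonal spectral-temporal metrics; the data, feature count and hyperparameters are assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.feature_selection import RFECV
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(42)
        X = rng.normal(size=(500, 20))            # e.g. per-season band medians
        y = np.clip(30 + 10 * X[:, 0] + 5 * X[:, 3]
                    + rng.normal(scale=5, size=500), 0, 100)  # woody cover (%)

        rf = RandomForestRegressor(n_estimators=50, random_state=0)
        selector = RFECV(rf, step=1, cv=5)        # drop weakest features in turn
        selector.fit(X, y)

        pred = cross_val_predict(selector.estimator_, X[:, selector.support_],
                                 y, cv=5)
        rmse = np.sqrt(mean_squared_error(y, pred))
        print(f"kept {selector.n_features_} features, "
              f"R2 = {r2_score(y, pred):.2f}, RMSE = {rmse:.1f}")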

  8. Improving the accuracy of CT dimensional metrology by a novel beam hardening correction method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang; Li, Lei; Zhang, Feng; Xi, Xiaoqi; Deng, Lin; Yan, Bin

    2015-01-01

    The powerful nondestructive characteristics of computed tomography (CT) are attracting more and more research into its use for dimensional metrology, where it offers a practical alternative to common measurement methods. However, inaccuracy and uncertainty severely limit the further utilization of CT for dimensional metrology; among the many contributing factors, the beam hardening (BH) effect plays a vital role. This paper focuses on eliminating the influence of the BH effect on the accuracy of CT dimensional metrology. To correct the BH effect, a novel exponential correction model is proposed, whose parameters are determined by minimizing the gray entropy of the reconstructed volume. In order to maintain the consistency and contrast of the corrected volume, a penalty term is added to the cost function, enabling more accurate measurement results to be obtained by the simple global threshold method. The proposed method is efficient, and especially suited to cases where there is a large difference in gray value between material and background. Different spheres with known diameters are used to verify the accuracy of dimensional measurement. Both simulation and real experimental results demonstrate the improvement in measurement precision. Moreover, a more complex workpiece is also tested to show that the proposed method is of general feasibility.

  9. Multispectrum analysis of the oxygen A-band.

    PubMed

    Drouin, Brian J; Benner, D Chris; Brown, Linda R; Cich, Matthew J; Crawford, Timothy J; Devi, V Malathy; Guillaume, Alexander; Hodges, Joseph T; Mlawer, Eli J; Robichaud, David J; Oyafuso, Fabiano; Payne, Vivienne H; Sung, Keeyoon; Wishnow, Edward H; Yu, Shanshan

    2017-01-01

    Retrievals of atmospheric composition from near-infrared measurements require measurements of airmass to better than the desired precision of the composition. The oxygen bands are obvious choices to quantify airmass since the mixing ratio of oxygen is fixed over the full range of atmospheric conditions. The OCO-2 mission is currently retrieving carbon dioxide concentration using the oxygen A-band for airmass normalization. The 0.25% accuracy desired for the carbon dioxide concentration has pushed the required state-of-the-art for oxygen spectroscopy. To measure O2 A-band cross-sections with such accuracy through the full range of atmospheric pressure requires a sophisticated line-shape model (Rautian or Speed-Dependent Voigt) with line mixing (LM) and collision induced absorption (CIA). Models of each of these phenomena exist; however, this work presents an integrated self-consistent model developed to ensure the best accuracy. It is also important to consider multiple sources of spectroscopic data for such a study in order to improve the dynamic range of the model and to minimize effects of instrumentation and associated systematic errors. The techniques of Fourier Transform Spectroscopy (FTS) and Cavity Ring-Down Spectroscopy (CRDS) provide complementary information for such an analysis. We utilize multispectrum fitting software to generate a comprehensive new database with improved accuracy based on these datasets. The extensive information will be made available as a multi-dimensional cross-section (ABSCO) table and the parameterization will be offered for inclusion in the HITRANonline database.

  10. Multispectrum analysis of the oxygen A-band

    PubMed Central

    Drouin, Brian J.; Benner, D. Chris; Brown, Linda R.; Cich, Matthew J.; Crawford, Timothy J.; Devi, V. Malathy; Guillaume, Alexander; Hodges, Joseph T.; Mlawer, Eli J.; Robichaud, David J.; Oyafuso, Fabiano; Payne, Vivienne H.; Sung, Keeyoon; Wishnow, Edward H.; Yu, Shanshan

    2016-01-01

    Retrievals of atmospheric composition from near-infrared measurements require measurements of airmass to better than the desired precision of the composition. The oxygen bands are obvious choices to quantify airmass since the mixing ratio of oxygen is fixed over the full range of atmospheric conditions. The OCO-2 mission is currently retrieving carbon dioxide concentration using the oxygen A-band for airmass normalization. The 0.25% accuracy desired for the carbon dioxide concentration has pushed the required state-of-the-art for oxygen spectroscopy. To measure O2 A-band cross-sections with such accuracy through the full range of atmospheric pressure requires a sophisticated line-shape model (Rautian or Speed-Dependent Voigt) with line mixing (LM) and collision induced absorption (CIA). Models of each of these phenomena exist; however, this work presents an integrated self-consistent model developed to ensure the best accuracy. It is also important to consider multiple sources of spectroscopic data for such a study in order to improve the dynamic range of the model and to minimize effects of instrumentation and associated systematic errors. The techniques of Fourier Transform Spectroscopy (FTS) and Cavity Ring-Down Spectroscopy (CRDS) provide complementary information for such an analysis. We utilize multispectrum fitting software to generate a comprehensive new database with improved accuracy based on these datasets. The extensive information will be made available as a multi-dimensional cross-section (ABSCO) table and the parameterization will be offered for inclusion in the HITRANonline database. PMID:27840454

  11. Multispectrum analysis of the oxygen A-band

    DOE PAGES

    Drouin, Brian J.; Benner, D. Chris; Brown, Linda R.; ...

    2016-04-11

    Retrievals of atmospheric composition from near-infrared measurements require measurements of airmass to better than the desired precision of the composition. The oxygen bands are obvious choices to quantify airmass since the mixing ratio of oxygen is fixed over the full range of atmospheric conditions. The OCO-2 mission is currently retrieving carbon dioxide concentration using the oxygen A-band for airmass normalization. The 0.25% accuracy desired for the carbon dioxide concentration has pushed the required state-of-the-art for oxygen spectroscopy. To measure O2 A-band cross-sections with such accuracy through the full range of atmospheric pressure requires a sophisticated line-shape model (Rautian or Speed-Dependent Voigt) with line mixing (LM) and collision induced absorption (CIA). Models of each of these phenomena exist; however, this work presents an integrated self-consistent model developed to ensure the best accuracy. It is also important to consider multiple sources of spectroscopic data for such a study in order to improve the dynamic range of the model and to minimize effects of instrumentation and associated systematic errors. The techniques of Fourier Transform Spectroscopy (FTS) and Cavity Ring-Down Spectroscopy (CRDS) provide complementary information for such an analysis. We utilize multispectrum fitting software to generate a comprehensive new database with improved accuracy based on these datasets. As a result, the extensive information will be made available as a multi-dimensional cross-section (ABSCO) table and the parameterization will be offered for inclusion in the HITRANonline database.

  12. Multispectrum analysis of the oxygen A-band

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drouin, Brian J.; Benner, D. Chris; Brown, Linda R.

    Retrievals of atmospheric composition from near-infrared measurements require measurements of airmass to better than the desired precision of the composition. The oxygen bands are obvious choices to quantify airmass since the mixing ratio of oxygen is fixed over the full range of atmospheric conditions. The OCO-2 mission is currently retrieving carbon dioxide concentration using the oxygen A-band for airmass normalization. The 0.25% accuracy desired for the carbon dioxide concentration has pushed the required state-of-the-art for oxygen spectroscopy. To measure O2 A-band cross-sections with such accuracy through the full range of atmospheric pressure requires a sophisticated line-shape model (Rautian or Speed-Dependent Voigt) with line mixing (LM) and collision induced absorption (CIA). Models of each of these phenomena exist; however, this work presents an integrated self-consistent model developed to ensure the best accuracy. It is also important to consider multiple sources of spectroscopic data for such a study in order to improve the dynamic range of the model and to minimize effects of instrumentation and associated systematic errors. The techniques of Fourier Transform Spectroscopy (FTS) and Cavity Ring-Down Spectroscopy (CRDS) provide complementary information for such an analysis. We utilize multispectrum fitting software to generate a comprehensive new database with improved accuracy based on these datasets. As a result, the extensive information will be made available as a multi-dimensional cross-section (ABSCO) table and the parameterization will be offered for inclusion in the HITRANonline database.

  13. 20 CFR 655.1308 - Offered wage rate.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    .... Recruitment for this purpose begins when the job order is accepted by the SWA for posting. (d) Wage offer. The... job offers for beginning level employees who have a basic understanding of the occupation. These... monitored and reviewed for accuracy. (2) Level II wage rates are assigned to job offers for employees who...

  14. An Application of Multi-band Forced Photometry to One Square Degree of SERVS: Accurate Photometric Redshifts and Implications for Future Science

    NASA Astrophysics Data System (ADS)

    Nyland, Kristina; Lacy, Mark; Sajina, Anna; Pforr, Janine; Farrah, Duncan; Wilson, Gillian; Surace, Jason; Häußler, Boris; Vaccari, Mattia; Jarvis, Matt

    2017-05-01

    We apply The Tractor image modeling code to improve upon existing multi-band photometry for the Spitzer Extragalactic Representative Volume Survey (SERVS). SERVS consists of post-cryogenic Spitzer observations at 3.6 and 4.5 μm over five well-studied deep fields spanning 18 deg2. In concert with data from ground-based near-infrared (NIR) and optical surveys, SERVS aims to provide a census of the properties of massive galaxies out to z ≈ 5. To accomplish this, we are using The Tractor to perform “forced photometry.” This technique employs prior measurements of source positions and surface brightness profiles from a high-resolution fiducial band from the VISTA Deep Extragalactic Observations survey to model and fit the fluxes at lower-resolution bands. We discuss our implementation of The Tractor over a square-degree test region within the XMM Large Scale Structure field with deep imaging in 12 NIR/optical bands. Our new multi-band source catalogs offer a number of advantages over traditional position-matched catalogs, including (1) consistent source cross-identification between bands, (2) de-blending of sources that are clearly resolved in the fiducial band but blended in the lower resolution SERVS data, (3) a higher source detection fraction in each band, (4) a larger number of candidate galaxies in the redshift range 5 < z < 6, and (5) a statistically significant improvement in the photometric redshift accuracy as evidenced by the significant decrease in the fraction of outliers compared to spectroscopic redshifts. Thus, forced photometry using The Tractor offers a means of improving the accuracy of multi-band extragalactic surveys designed for galaxy evolution studies. We will extend our application of this technique to the full SERVS footprint in the future.
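
    The core of forced photometry can be illustrated apart from The Tractor's own API: with positions (and profiles) fixed from the fiducial band, the low-resolution image becomes a linear model in the unknown fluxes, so blended sources can be separated by least squares. The Gaussian PSF, positions and fluxes below are invented for illustration.

        import numpy as np

        def psf_template(shape, x0, y0, sigma):
            """Unit-flux Gaussian PSF centred at (x0, y0)."""
            yy, xx = np.mgrid[:shape[0], :shape[1]]
            t = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
            return t / t.sum()

        shape, sigma = (64, 64), 3.0
        positions = [(20.0, 22.0), (24.0, 25.0), (45.0, 40.0)]  # fiducial band
        true_flux = np.array([500.0, 300.0, 800.0])

        # Simulated low-resolution image: a blended pair plus one isolated source.
        templates = [psf_template(shape, x, y, sigma) for x, y in positions]
        image = sum(f * t for f, t in zip(true_flux, templates))
        image += np.random.default_rng(1).normal(scale=0.05, size=shape)

        # Positions are held fixed; solve image ~ A @ flux for the fluxes only.
        A = np.stack([t.ravel() for t in templates], axis=1)
        flux, *_ = np.linalg.lstsq(A, image.ravel(), rcond=None)
        print("recovered fluxes:", np.round(flux, 1))  # de-blends the close pair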

  15. An improved schlieren method for measurement and automatic reconstruction of the far-field focal spot

    PubMed Central

    Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye

    2017-01-01

    The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility, such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency because the final focal spot is merged manually, reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct a high-dynamic-range image of far-field focal spots and improve reconstruction accuracy and efficiency. First, a detection method based on weak light-beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template-matching algorithm, a circle the same size as the schlieren ball was cut from the main-lobe image, and the cut main-lobe image was shifted relative to the side-lobe image within a 100×100 pixel search region; the position yielding the largest correlation coefficient between the side-lobe image and the circle-cut main-lobe image was identified as the best matching point. Finally, the least squares method was used to fit the center of the side-lobe schlieren ball, with an error of less than 1 pixel. The experimental results show that this method enables accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than traditional reconstruction based on manual splicing, the method improves the efficiency of focal-spot reconstruction and offers better experimental precision. PMID:28207758
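
    The matching step is essentially normalized cross-correlation of the circle-cut image against a search region. A minimal sketch with OpenCV (illustrative data and sizes; not the authors' code):

        import numpy as np
        import cv2

        rng = np.random.default_rng(0)
        main_lobe = rng.random((200, 200)).astype(np.float32)

        # Side-lobe cut-out: a noisy copy of a patch of the main-lobe image.
        true_row, true_col = 60, 85
        template = main_lobe[true_row:true_row + 40,
                             true_col:true_col + 40].copy()
        template += rng.normal(scale=0.01, size=template.shape).astype(np.float32)

        # Slide the template and take the maximum correlation coefficient.
        result = cv2.matchTemplate(main_lobe, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        print(f"best match at (col, row) = {max_loc}, coefficient = {max_val:.3f}")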

  16. Liver fibrosis diagnosis by blood test and elastography in chronic hepatitis C: agreement or combination?

    PubMed

    Calès, P; Boursier, J; Lebigot, J; de Ledinghen, V; Aubé, C; Hubert, I; Oberti, F

    2017-04-01

    In chronic hepatitis C, the European Association for the Study of the Liver and the Asociacion Latinoamericana para el Estudio del Higado recommend performing transient elastography plus a blood test to diagnose significant fibrosis; test concordance confirms the diagnosis. To validate this rule and improve it by combining a blood test, FibroMeter (virus second generation, Echosens, Paris, France) and transient elastography (constitutive tests) into a single combined test, as suggested by the American Association for the Study of Liver Diseases and the Infectious Diseases Society of America. A total of 1199 patients were included in an exploratory set (HCV, n = 679) or in two validation sets (HCV ± HIV, HBV, n = 520). Accuracy was mainly evaluated by correct diagnosis rate for severe fibrosis (pathological Metavir F ≥ 3, primary outcome) by classical test scores or a fibrosis classification, reflecting Metavir staging, as a function of test concordance. Score accuracy: there were no significant differences between the blood test (75.7%), elastography (79.1%) and the combined test (79.4%) (P = 0.066); the score accuracy of each test was significantly (P < 0.001) decreased in discordant vs. concordant tests. Classification accuracy: combined test accuracy (91.7%) was significantly (P < 0.001) increased vs. the blood test (84.1%) and elastography (88.2%); accuracy of each constitutive test was significantly (P < 0.001) decreased in discordant vs. concordant tests but not with combined test: 89.0 vs. 92.7% (P = 0.118). Multivariate analysis for accuracy showed an interaction between concordance and fibrosis level: in the 1% of patients with full classification discordance and severe fibrosis, non-invasive tests were unreliable. The advantage of combined test classification was confirmed in the validation sets. The concordance recommendation is validated. A combined test, expressed in classification instead of score, improves this rule and validates the recommendation of a combined test, avoiding 99% of biopsies, and offering precise staging. © 2017 John Wiley & Sons Ltd.

  17. Can machine-learning improve cardiovascular risk prediction using routine clinical data?

    PubMed Central

    Weng, Stephen F.; Reps, Jenna; Kai, Joe; Garibaldi, Jonathan M.; Qureshi, Nadeem

    2017-01-01

    Background Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Methods Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict first cardiovascular event over 10-years. Predictive accuracy was assessed by area under the ‘receiver operating curve’ (AUC); and sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) to predict 7.5% cardiovascular risk (threshold for initiating statins). Findings 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723–0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739–0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755–0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755–0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759–0.769). The highest achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Conclusions Machine-learning significantly improves accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others. PMID:28376093

  18. Can machine-learning improve cardiovascular risk prediction using routine clinical data?

    PubMed

    Weng, Stephen F; Reps, Jenna; Kai, Joe; Garibaldi, Jonathan M; Qureshi, Nadeem

    2017-01-01

    Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict first cardiovascular event over 10-years. Predictive accuracy was assessed by area under the 'receiver operating curve' (AUC); and sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) to predict 7.5% cardiovascular risk (threshold for initiating statins). 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723-0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739-0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755-0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755-0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759-0.769). The highest achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Machine-learning significantly improves accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others.
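
    The study design (four model families compared by AUC) maps directly onto standard tooling; the sketch below reproduces the comparison on synthetic data with roughly the cohort's event rate, and will not reproduce the study's numbers.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.ensemble import (RandomForestClassifier,
                                      GradientBoostingClassifier)
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        # ~7% positives, echoing the 6.6% event rate reported above.
        X, y = make_classification(n_samples=20000, n_features=30,
                                   weights=[0.93], random_state=0)
        Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

        models = {
            "random forest": RandomForestClassifier(n_estimators=200,
                                                    random_state=0),
            "logistic regression": LogisticRegression(max_iter=1000),
            "gradient boosting": GradientBoostingClassifier(random_state=0),
            "neural network": MLPClassifier(hidden_layer_sizes=(32,),
                                            max_iter=500, random_state=0),
        }
        for name, model in models.items():
            model.fit(Xtr, ytr)
            auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
            print(f"{name:>20s}: AUC = {auc:.3f}")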

  19. The STARD statement for reporting diagnostic accuracy studies: application to the history and physical examination.

    PubMed

    Simel, David L; Rennie, Drummond; Bossuyt, Patrick M M

    2008-06-01

    The Standards for Reporting of Diagnostic Accuracy (STARD) statement provided guidelines for investigators conducting diagnostic accuracy studies. We reviewed each item in the statement for its applicability to clinical examination diagnostic accuracy research, viewing each discrete aspect of the history and physical examination as a diagnostic test. Nonsystematic review of the STARD statement. Two former STARD Group participants and 1 editor of a journal series on clinical examination research reviewed each STARD item. Suggested interpretations and comments were shared to develop consensus. The STARD Statement applies generally well to clinical examination diagnostic accuracy studies. Three items are the most important for clinical examination diagnostic accuracy studies, and investigators should pay particular attention to their requirements: describe carefully the patient recruitment process, describe participant sampling and address if patients were from a consecutive series, and describe whether the clinicians were masked to the reference standard tests and whether the interpretation of the reference standard test was masked to the clinical examination components or overall clinical impression. The consideration of these and the other STARD items in clinical examination diagnostic research studies would improve the quality of investigations and strengthen conclusions reached by practicing clinicians. The STARD statement provides a very useful framework for diagnostic accuracy studies. The group correctly anticipated that there would be nuances applicable to studies of the clinical examination. We offer guidance that should enhance their usefulness to investigators embarking on original studies of a patient's history and physical examination.

  20. Satellite derived bathymetry: mapping the Irish coastline

    NASA Astrophysics Data System (ADS)

    Monteys, X.; Cahalane, C.; Harris, P.; Hanafin, J.

    2017-12-01

    Ireland has a varied coastline in excess of 3000 km in length, largely characterized by extended shallow environments. The coastal shallow-water zone can be a challenging and costly environment in which to acquire bathymetry and other oceanographic data using traditional survey methods or airborne LiDAR techniques, as demonstrated in the Irish INFOMAR program. Thus, large coastal areas in Ireland, and much of the coastal zone worldwide, remain unmapped using modern techniques and are poorly understood. Earth Observation (EO) missions are currently being used to derive timely, cost-effective, and quality-controlled information for mapping and monitoring coastal environments. Different wavelengths of solar light penetrate the water column to different depths and are routinely sensed by EO satellites. A large selection of multispectral (MS) imagery from many platforms was examined, as well as imagery from small aircraft and drones. A number of bays representing very different coastal environments were explored in turn. The project's workflow builds a catalogue of satellite and field bathymetric data to assess the suitability of imagery captured at a range of spatial, spectral and temporal resolutions. Turbidity indices are derived from the multispectral information. Finally, a number of spatial regression models using water-leaving radiance parameters and field calibration data are examined. Our assessment reveals that spatial regression algorithms have the potential to significantly improve the accuracy of the predictions up to 10 m water depth (WD) and offer a better handle on the error and uncertainty budget. The four spatial models investigated show better adjustments than the basic non-spatial model. Accuracy of the predictions is better than 10% of WD at 95% confidence. Future work will focus on improving the accuracy of the predictions by incorporating an analytical model in conjunction with improved empirical methods. The recently launched ESA Sentinel-2 will become the primary focus of study. Satellite bathymetry and coastal mapping products, and remarkably their repeatability over time, can offer solutions to important coastal zone management issues and address key challenges in the critical line between shoreline changes and human activity, particularly in the light of future climate change scenarios.
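
    A common non-spatial baseline for this kind of empirical satellite-derived bathymetry is a band-ratio regression (e.g., the Stumpf log-ratio model) calibrated against field depths; the spatial regression models discussed above extend such a baseline. The reflectance simulation below is purely illustrative.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        depth = rng.uniform(0.5, 10.0, size=300)   # field calibration depths (m)

        # Simulated reflectances: blue attenuates more slowly than green.
        blue = 0.08 * np.exp(-0.08 * depth) + 0.02
        green = 0.10 * np.exp(-0.20 * depth) + 0.02

        # Stumpf-style predictor: ratio of log-transformed reflectances.
        ratio = (np.log(1000 * blue) / np.log(1000 * green)).reshape(-1, 1)
        model = LinearRegression().fit(ratio, depth)

        rmse = np.sqrt(np.mean((model.predict(ratio) - depth) ** 2))
        print(f"log-ratio model: RMSE = {rmse:.2f} m "
              f"({100 * rmse / depth.mean():.0f}% of mean depth)")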

  1. Assessment of neuropsychiatric symptoms in dementia: toward improving accuracy

    PubMed Central

    Stella, Florindo

    2013-01-01

    This article discusses tools frequently used for assessing neuropsychiatric symptoms of patients with dementia, particularly Alzheimer's disease. The aims were to discuss the main tools for evaluating behavioral disturbances, and particularly the accuracy of the Neuropsychiatric Inventory – Clinician Rating Scale (NPI-C). The clinical approach to and diagnosis of neuropsychiatric syndromes in dementia require suitable accuracy. Advances in the recognition and early accurate diagnosis of psychopathological symptoms help guide appropriate pharmacological and non-pharmacological interventions. In addition, recommended standardized and validated measurements contribute to both scientific research and clinical practice. Emotional distress, caregiver burden, and cognitive impairment, often experienced by elderly caregivers, may affect the quality of caregiver reports. The clinician-rating approach helps attenuate these misinterpretations. In this scenario, the NPI-C is a promising and versatile tool for assessing neuropsychiatric syndromes in dementia, offering good accuracy and high reliability, based mainly on the diagnostic impression of the clinician. This tool supports both strategies: a comprehensive assessment of neuropsychiatric symptoms in dementia, or the investigation of specific psychopathological syndromes such as agitation, depression, anxiety, apathy, sleep disorders, and aberrant motor disorders, among others. PMID:29213846

  2. Astigmatism evaluation prior to cataract surgery.

    PubMed

    Gupta, Pankaj C; Caty, Jane T

    2018-01-01

    To evaluate and summarize literature from the past 18 months reporting advancements and issues in astigmatism assessment prior to cataract surgery. New and updated toric calculators and regression formulas offer the opportunity for more accurate lens selection for our patients. Concurrently, improvements in topographic evaluation of corneal keratometry have allowed for a decrease in unplanned residual corneal astigmatism. Measuring posterior corneal astigmatism is especially valuable in eyes with keratoconus when planning to implant a toric intraocular lens (IOL), and now gives this patient population access to toric correction. Improved accuracy of astigmatism evaluation is now achieved with point reflections on the corneal surface along with the latest-generation toric lens formulas, which integrate posterior corneal astigmatism, predicted lens position, and the intended spherical power of the IOL. These improvements can allow for the incorporation of toric lenses in keratoconus patients.

  3. An adaptive deep learning approach for PPG-based identification.

    PubMed

    Jindal, V; Birjandtalab, J; Pouyan, M Baran; Nourani, M

    2016-08-01

    Wearable biosensors have become increasingly popular in healthcare due to their capabilities for low cost and long term biosignal monitoring. This paper presents a novel two-stage technique that offers biometric identification using these biosensors through Deep Belief Networks and Restricted Boltzmann Machines. Our identification approach improves robustness in current monitoring procedures within clinical, e-health and fitness environments using Photoplethysmography (PPG) signals through deep learning classification models. The approach is tested on the TROIKA dataset using 10-fold cross validation and achieved an accuracy of 96.1%.
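
    The two-stage idea (unsupervised feature learning, then supervised identification) can be sketched with scikit-learn's BernoulliRBM standing in for the paper's Deep Belief Network stack; the synthetic "PPG" windows and all hyperparameters are assumptions.

        import numpy as np
        from sklearn.neural_network import BernoulliRBM
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import Pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_subjects, per_subject, win = 5, 40, 64
        t = np.linspace(0, 2 * np.pi, win)

        # Synthetic windows: subject-specific pulse shape plus noise.
        X = np.vstack([np.sin((1 + 0.1 * s) * t)
                       + rng.normal(scale=0.2, size=(per_subject, win))
                       for s in range(n_subjects)])
        X = (X - X.min()) / (X.max() - X.min())   # RBM expects [0, 1] inputs
        y = np.repeat(np.arange(n_subjects), per_subject)

        pipe = Pipeline([
            ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                                 n_iter=20, random_state=0)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        acc = cross_val_score(pipe, X, y, cv=10).mean()  # 10-fold CV, as above
        print(f"identification accuracy: {acc:.3f}")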

  4. Identification of Long Bone Fractures in Radiology Reports Using Natural Language Processing to support Healthcare Quality Improvement.

    PubMed

    Grundmeier, Robert W; Masino, Aaron J; Casper, T Charles; Dean, Jonathan M; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M; Alpern, Elizabeth R

    2016-11-09

    Important information to support healthcare quality improvement is often recorded in free text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. To implement and validate NLP tools to identify long bone fractures for pediatric emergency medicine quality improvement. Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models. A test set of 500 reports was used to validate the model performance. Blinded manual review of radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented and word stem and bigram features were constructed. Common English "stop words" and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. The three classification models that we tested all performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis code accuracy=0.932, recall=0.960, precision=0.896, and F1 score=0.927). NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents.
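
    The winning pipeline is straightforward to assemble from free software, as the abstract emphasizes. The sketch below wires word-stem and bigram features (English stop words removed, rare features dropped) into an RBF-kernel SVM with the reported parameters (cost = 4, gamma = 0.005); the toy report snippets are invented, and any stemmer could replace NLTK's.

        from nltk.stem import PorterStemmer
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.svm import SVC
        from sklearn.pipeline import Pipeline
        from sklearn.model_selection import cross_val_score

        stemmer = PorterStemmer()

        def stem_text(text):
            return " ".join(stemmer.stem(w) for w in text.lower().split())

        reports = [
            "acute transverse fracture of the femoral shaft",
            "no acute fracture or dislocation identified",
            "displaced fracture of the distal radius",
            "soft tissue swelling without fracture",
        ] * 25                                  # repeated so 10-fold CV runs
        labels = [1, 0, 1, 0] * 25              # 1 = long bone fracture

        pipe = Pipeline([
            ("vec", CountVectorizer(preprocessor=stem_text,
                                    ngram_range=(1, 2),   # stems and bigrams
                                    stop_words="english",
                                    min_df=2)),           # drop rare features
            ("svm", SVC(kernel="rbf", C=4, gamma=0.005)), # reported parameters
        ])
        acc = cross_val_score(pipe, reports, labels, cv=10).mean()
        print(f"10-fold CV accuracy: {acc:.3f}")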

  5. High resolution remote sensing missions of a tethered satellite

    NASA Technical Reports Server (NTRS)

    Vetrella, S.; Moccia, A.

    1986-01-01

    The application of the Tethered Satellite (TS) as an operational remote sensing platform is studied. It represents a new platform capable of covering the altitudes between airplanes and free-flying satellites, offering an adequate lifetime, high geometric and radiometric resolution, and improved cartographic accuracy. Two operational remote sensing missions are proposed: one using two linear-array systems for along-track stereoscopic observation and one using a synthetic aperture radar combined with an interferometric technique. These missions can significantly improve the accuracy of future real-time cartographic systems from space and, in the case of active microwave systems, also allow Earth observation both in adverse weather and at any time, day or night. Furthermore, a simulation program is described in which, in order to examine carefully the potential of the TS as a new remote sensing platform, the orbital and attitude dynamics description of the TSS is integrated with the sensor viewing geometry, the Earth's ellipsoid, atmospheric effects, Sun illumination and a digital elevation model. A preliminary experiment has been proposed which consists of a metric camera to be deployed downwards during the second Shuttle demonstration flight.

  6. A novel pen-based Bluetooth-enabled insulin delivery system with insulin dose tracking and advice.

    PubMed

    Bailey, Timothy S; Stone, Jenine Y

    2017-05-01

    Diabetes is growing in prevalence internationally. As more individuals require insulin as part of their treatment, technology evolves to optimize delivery, improve adherence, and reduce dosing errors. Insulin pens outperform vial and syringe in simplicity, dosing accuracy, and user preference. Bolus advisors improve dosing confidence and treatment adherence. The InPen System offers a novel approach to treatment via a wireless pen that syncs to a mobile application featuring a bolus advisor, enabling convenient insulin dose tracking and more accurate bolus advice among other features. Areas covered: Existing technology for insulin delivery and bolus advice are reviewed. The mechanics and functionality of the InPen device are delineated. Findings from formative testing and usability studies of the InPen system are reported. Future directions for the InPen system in the treatment of diabetes are discussed. Expert opinion: Diabetes management is complex and largely data-driven. The InPen System offers a promising new opportunity to avail insulin pen-users of features known to improve treatment efficacy, which have otherwise primarily been available to those using pumps. Given that the majority of insulin users do not use insulin pumps, the InPen System is poised to improve glucose control in a significant portion of the diabetes population.

  7. Steps Towards an Operational Service Using Near Real-Time Altimeter Data

    NASA Astrophysics Data System (ADS)

    Ash, E. R.

    2006-07-01

    Thanks largely to modern computing power, numerical forecasts of winds and waves over the oceans are ever improving, offering greater accuracy and finer resolution in time and space. However, it is recognized that met-ocean models still have difficulty in accurately forecasting severe weather conditions, the conditions that cause the most damage and difficulty in maritime operations. Therefore a key requirement is to provide improved information on severe conditions. No individual measurement or prediction system is perfect. Offshore buoys provide a continuous long-term record of wind and wave conditions, but only at a limited number of sites. Satellite data offer all-weather global coverage, but with relatively infrequent sampling. Forecasts rely on imperfect numerical schemes and the ability to manage a vast quantity of input data. Therefore the best system is one that integrates information from all available sources, taking advantage of the benefits that each can offer. We report on an initiative supported by the European Space Agency (ESA) which investigated how satellite data could be used to enhance systems that provide Near Real Time monitoring of met-ocean conditions.

  8. Small angle X-ray scattering and cross-linking for data assisted protein structure prediction in CASP 12 with prospects for improved accuracy.

    PubMed

    Ogorzalek, Tadeusz L; Hura, Greg L; Belsom, Adam; Burnett, Kathryn H; Kryshtafovych, Andriy; Tainer, John A; Rappsilber, Juri; Tsutakawa, Susan E; Fidelis, Krzysztof

    2018-03-01

    Experimental data offers empowering constraints for structure prediction. These constraints can be used to filter equivalently scored models or more powerfully within optimization functions toward prediction. In CASP12, Small Angle X-ray Scattering (SAXS) and Cross-Linking Mass Spectrometry (CLMS) data, measured on an exemplary set of novel fold targets, were provided to the CASP community of protein structure predictors. As solution-based techniques, SAXS and CLMS can efficiently measure states of the full-length sequence in its native solution conformation and assembly. However, this experimental data did not substantially improve prediction accuracy judged by fits to crystallographic models. One issue, beyond intrinsic limitations of the algorithms, was a disconnect between crystal structures and solution-based measurements. Our analyses show that many targets had substantial percentages of disordered regions (up to 40%) or were multimeric or both. Thus, solution measurements of flexibility and assembly support variations that may confound prediction algorithms trained on crystallographic data and expecting globular fully-folded monomeric proteins. Here, we consider the CLMS and SAXS data collected, the information in these solution measurements, and the challenges in incorporating them into computational prediction. As improvement opportunities were only partly realized in CASP12, we provide guidance on how data from the full-length biological unit and the solution state can better aid prediction of the folded monomer or subunit. We furthermore describe strategic integrations of solution measurements with computational prediction programs with the aim of substantially improving foundational knowledge and the accuracy of computational algorithms for biologically-relevant structure predictions for proteins in solution. © 2018 Wiley Periodicals, Inc.

  9. The profile algorithm for microwave delay estimation from water vapor radiometer data

    NASA Technical Reports Server (NTRS)

    Robinson, Steven E.

    1988-01-01

    A new algorithm has been developed for the estimation of tropospheric microwave path delays from water vapor radiometer (WVR) data, which does not require site- and weather-dependent empirical parameters to produce accuracy better than 0.3 cm of delay. Instead of taking the conventional linear approach, the new algorithm first uses the observables with an emission model to determine an approximate form of the vertical water vapor distribution, which is then explicitly integrated to estimate wet path delays in a second step. The intrinsic accuracy of this algorithm, excluding uncertainties caused by the radiometers and the emission model, has been examined for two-channel WVR data using path delays and corresponding simulated observables computed from archived radiosonde data. It is found that annual rms errors for a wide range of sites average 0.18 cm in the absence of clouds, 0.22 cm in cloudy weather, and 0.19 cm overall. In clear weather, the new algorithm's accuracy is comparable to the best that can be obtained from conventional linear algorithms, while in cloudy weather it offers a 35 percent improvement.
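
    The "explicit integration" step amounts to integrating wet refractivity over the retrieved profile. A minimal sketch, using standard approximate refractivity constants and an invented exponential vapor profile:

        import numpy as np

        K2P, K3 = 22.1, 3.739e5   # approx. constants, K/hPa and K^2/hPa

        def wet_delay_cm(h_m, e_hpa, t_k):
            """1e-6 * integral of (k2' e/T + k3 e/T^2) dh, in cm."""
            n_wet = K2P * e_hpa / t_k + K3 * e_hpa / t_k ** 2
            integral = np.sum(0.5 * (n_wet[1:] + n_wet[:-1]) * np.diff(h_m))
            return 1e-6 * integral * 100.0      # metres -> centimetres

        h = np.linspace(0.0, 10000.0, 200)      # height grid (m)
        T = 288.0 - 0.0065 * h                  # standard lapse rate
        e = 12.0 * np.exp(-h / 2000.0)          # water vapor pressure (hPa)
        print(f"zenith wet delay ~ {wet_delay_cm(h, e, T):.1f} cm")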

  10. ROC curve analyses of eyewitness identification decisions: An analysis of the recent debate.

    PubMed

    Rotello, Caren M; Chen, Tina

    2016-01-01

    How should the accuracy of eyewitness identification decisions be measured, so that best practices for identification can be determined? This fundamental question is under intense debate. One side advocates for continued use of a traditional measure of identification accuracy, known as the diagnosticity ratio, whereas the other side argues that receiver operating characteristic curves (ROCs) should be used instead because diagnosticity is confounded with response bias. Diagnosticity proponents have offered several criticisms of ROCs, which we show are either false or irrelevant to the assessment of eyewitness accuracy. We also show that, like diagnosticity, Bayesian measures of identification accuracy confound response bias with witnesses' ability to discriminate guilty from innocent suspects. ROCs are an essential tool for distinguishing memory-based processes from decisional aspects of a response; simulations of different possible identification tasks and response strategies show that they offer important constraints on theory development.
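
    The confound is easy to demonstrate with an equal-variance signal-detection simulation: holding the witness's discriminability fixed and moving only the response criterion changes the diagnosticity ratio, while the ROC (traced over criteria) would be unchanged. The d' value and criteria below are arbitrary.

        from scipy.stats import norm

        d_prime = 1.5                         # fixed ability to discriminate
        for criterion in (0.25, 0.75, 1.25):  # liberal -> conservative
            hit_rate = norm.sf(criterion - d_prime)  # P(ID | guilty)
            fa_rate = norm.sf(criterion)             # P(ID | innocent)
            print(f"criterion = {criterion:.2f}: HR = {hit_rate:.2f}, "
                  f"FAR = {fa_rate:.2f}, "
                  f"diagnosticity = {hit_rate / fa_rate:.1f}")
        # Diagnosticity varies with response bias alone; only an ROC
        # separates discriminability from bias.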

  11. The STARD Statement for Reporting Diagnostic Accuracy Studies: Application to the History and Physical Examination

    PubMed Central

    Simel, David L.; Rennie, Drummond; Bossuyt, Patrick M. M.

    2008-01-01

    Summary Objective The Standards for Reporting of Diagnostic Accuracy (STARD) statement provided guidelines for investigators conducting diagnostic accuracy studies. We reviewed each item in the statement for its applicability to clinical examination diagnostic accuracy research, viewing each discrete aspect of the history and physical examination as a diagnostic test. Setting Nonsystematic review of the STARD statement. Interventions Two former STARD Group participants and 1 editor of a journal series on clinical examination research reviewed each STARD item. Suggested interpretations and comments were shared to develop consensus. Measurements and Main Results The STARD Statement applies generally well to clinical examination diagnostic accuracy studies. Three items are the most important for clinical examination diagnostic accuracy studies, and investigators should pay particular attention to their requirements: describe carefully the patient recruitment process, describe participant sampling and address if patients were from a consecutive series, and describe whether the clinicians were masked to the reference standard tests and whether the interpretation of the reference standard test was masked to the clinical examination components or overall clinical impression. The consideration of these and the other STARD items in clinical examination diagnostic research studies would improve the quality of investigations and strengthen conclusions reached by practicing clinicians. Conclusions The STARD statement provides a very useful framework for diagnostic accuracy studies. The group correctly anticipated that there would be nuances applicable to studies of the clinical examination. We offer guidance that should enhance their usefulness to investigators embarking on original studies of a patient’s history and physical examination. PMID:18347878

  12. Integrated Computational Solution for Predicting Skin Sensitization Potential of Molecules

    PubMed Central

    Desai, Aarti; Singh, Vivek K.; Jere, Abhay

    2016-01-01

    Introduction Skin sensitization forms a major toxicological endpoint for dermatology and cosmetic products. The recent ban on animal testing for cosmetics demands alternative methods. We developed an integrated computational solution (SkinSense) that offers a robust solution and addresses the limitations of existing computational tools, i.e., a high false-positive rate and/or limited coverage. Results The key components of our solution include: QSAR models selected from a combinatorial set, similarity information and literature-derived sub-structure patterns of known skin protein reactive groups. Its prediction performance on a challenge set of molecules showed accuracy = 75.32%, CCR = 74.36%, sensitivity = 70.00% and specificity = 78.72%, which is better than several existing tools including VEGA (accuracy = 45.00% and CCR = 54.17% with ‘High’ reliability scoring), DEREK (accuracy = 72.73% and CCR = 71.44%) and TOPKAT (accuracy = 60.00% and CCR = 61.67%). Although TIMES-SS showed higher predictive power (accuracy = 90.00% and CCR = 92.86%), its coverage was very low (only 10 out of 77 molecules were predicted reliably). Conclusions Owing to improved prediction performance and coverage, our solution can serve as a useful expert system towards Integrated Approaches to Testing and Assessment for skin sensitization. It would be invaluable to the cosmetic/dermatology industry for pre-screening their molecules, and reducing time, cost and animal testing. PMID:27271321
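
    For readers unfamiliar with the metric, CCR (correct classification rate) as used here is balanced accuracy, the mean of sensitivity and specificity, which can be checked against the reported figures:

        sensitivity, specificity = 70.00, 78.72   # reported SkinSense values
        ccr = (sensitivity + specificity) / 2
        print(f"CCR = {ccr:.2f}%")                # 74.36%, matching the abstract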

  13. Adaptive and perceptual learning technologies in medical education and training.

    PubMed

    Kellman, Philip J

    2013-10-01

    Recent advances in the learning sciences offer remarkable potential to improve medical education and maximize the benefits of emerging medical technologies. This article describes 2 major innovation areas in the learning sciences that apply to simulation and other aspects of medical learning: Perceptual learning (PL) and adaptive learning technologies. PL technology offers, for the first time, systematic, computer-based methods for teaching pattern recognition, structural intuition, transfer, and fluency. Synergistic with PL are new adaptive learning technologies that optimize learning for each individual, embed objective assessment, and implement mastery criteria. The author describes the Adaptive Response-Time-based Sequencing (ARTS) system, which uses each learner's accuracy and speed in interactive learning to guide spacing, sequencing, and mastery. In recent efforts, these new technologies have been applied in medical learning contexts, including adaptive learning modules for initial medical diagnosis and perceptual/adaptive learning modules (PALMs) in dermatology, histology, and radiology. Results of all these efforts indicate the remarkable potential of perceptual and adaptive learning technologies, individually and in combination, to improve learning in a variety of medical domains. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  14. A Precise Calibration Technique for Measuring High Gas Temperatures

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Schultz, Donald F.

    2000-01-01

    A technique was developed for direct measurement of gas temperatures in the range of 2050 K to 2700 K with improved accuracy and reproducibility. The technique utilized the low emittance of certain fibrous materials, and the uncertainty of the technique was limited by the uncertainty in the melting points of the materials, i.e., +/-15 K. The materials were pure, thin, metal-oxide fibers whose diameters varied from 60 microns to 400 microns in the experiments. The sharp increase in the emittance of the fibers upon melting was utilized as an indication of reaching a known gas temperature. The accuracy of the technique was confirmed both by the calculated low emittance values of transparent fibers, of order 0.01, up to a few degrees below their melting point and by the fiber-diameter independence of the results. This melting-point temperature was approached in increments no larger than 4 K, which was accomplished by controlled increases of reactant flow rates in hydrogen-air and/or hydrogen-oxygen flames. As examples of applications of the technique, the gas-temperature measurements were used: (a) to assess the uncertainty in inferring gas temperatures from thermocouple measurements, and (b) to calibrate an IR camera to measure gas temperatures. The technique offers an excellent calibration reference for other gas-temperature measurement methods, to improve their accuracy and reliably extend their temperature range of applicability.

  15. A Precise Calibration Technique for Measuring High Gas Temperatures

    NASA Technical Reports Server (NTRS)

    Gokoglu, Suleyman A.; Schultz, Donald F.

    1999-01-01

    A technique was developed for direct measurement of gas temperatures in the range of 2050 K - 2700 K with improved accuracy and reproducibility. The technique utilized the low emittance of certain fibrous materials, and the uncertainty of the technique was limited by the uncertainty in the melting points of the materials, i.e., +/- 15 K. The materials were pure, thin, metal-oxide fibers whose diameters varied from 60 microns to 400 microns in the experiments. The sharp increase in the emittance of the fibers upon melting was utilized as an indication of reaching a known gas temperature. The accuracy of the technique was confirmed both by the calculated low emittance values of transparent fibers, of order 0.01, up to a few degrees below their melting point and by the fiber-diameter independence of the results. This melting-point temperature was approached in increments no larger than 4 K, which was accomplished by controlled increases of reactant flow rates in hydrogen-air and/or hydrogen-oxygen flames. As examples of applications of the technique, the gas-temperature measurements were used (a) to assess the uncertainty in inferring gas temperatures from thermocouple measurements, and (b) to calibrate an IR camera to measure gas temperatures. The technique offers an excellent calibration reference for other gas-temperature measurement methods, to improve their accuracy and reliably extend their temperature range of applicability.

  16. Exploring the interactions between forecast accuracy, risk perception and perceived forecast reliability in reservoir operator's decision to use forecast

    NASA Astrophysics Data System (ADS)

    Shafiee-Jood, M.; Cai, X.

    2017-12-01

    Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this potential, previous studies have found that water resources managers are often unwilling to incorporate streamflow forecast information in decision making, particularly in risky situations. While the low accuracy of forecast information is often cited as the main reason, some studies have found that implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again. However, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component which incorporates both risk perception and perceived forecast reliability. The framework is then applied to a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts of differing reliability. The framework allows us to explore the interactions between risk perception and perceived forecast reliability, and between the behavioral components and information accuracy. The findings will provide insights to improve the usability of flood forecast information through better communication and education.
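
    A minimal sketch of how such a behavioral component might enter a forecast-use decision is given below. It assumes an illustrative Bayesian expected-cost rule; the function names, probabilities, and costs are hypothetical and are not taken from the study.

        # Illustrative sketch (not the authors' model): perceived forecast
        # reliability shrinks the forecast's likelihood ratio toward an
        # uninformative forecast before an expected-cost decision is made.

        def posterior_flood_prob(prior, hit_rate, false_alarm_rate, perceived_trust):
            """Bayes update for P(flood | flood forecast issued)."""
            # Blend the forecast's likelihoods with the climatological prior;
            # at perceived_trust = 0 the forecast carries no information.
            hr = perceived_trust * hit_rate + (1 - perceived_trust) * prior
            fa = perceived_trust * false_alarm_rate + (1 - perceived_trust) * prior
            evidence = hr * prior + fa * (1 - prior)
            return hr * prior / evidence

        def should_prerelease(p_flood, flood_damage, action_cost, risk_aversion=1.0):
            """Act when risk-weighted expected avoided damage exceeds the action cost."""
            return risk_aversion * p_flood * flood_damage > action_cost

        prior = 0.05  # climatological flood probability
        p = posterior_flood_prob(prior, hit_rate=0.8, false_alarm_rate=0.1,
                                 perceived_trust=0.5)  # operator half-trusts the forecast
        print(f"perceived P(flood | forecast) = {p:.2f}")
        print("pre-release?",
              should_prerelease(p, flood_damage=1e6, action_cost=1e5, risk_aversion=2.0))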

  17. Improving density functional tight binding predictions of free energy surfaces for peptide condensation reactions in solution

    NASA Astrophysics Data System (ADS)

    Kroonblawd, Matthew; Goldman, Nir

    First principles molecular dynamics using highly accurate density functional theory (DFT) is a common tool for predicting chemistry, but its accessible time and space scales are often orders of magnitude smaller than those of experiments. Semi-empirical methods such as density functional tight binding (DFTB) offer up to a thousand-fold reduction in required CPU hours and can approach experimental scales. However, standard DFTB parameter sets lack good transferability, and calibration for a particular system is usually necessary. Force matching the pairwise repulsive energy term in DFTB to short DFT trajectories can improve DFTB's accuracy for chemistry that is fast relative to DFT simulation times (<10 ps), but the effects on slow chemistry and the free energy surface are not well known. We present a force matching approach to increase the accuracy of DFTB predictions for free energy surfaces. Accelerated sampling techniques are combined with path collective variables to generate the reference DFT data set and to validate fitted DFTB potentials without a priori knowledge of transition states. The accuracy of force-matched DFTB free energy surfaces is assessed for slow peptide-forming reactions by direct comparison to DFT results for particular paths. Extensions to model prebiotic chemistry under shock conditions are discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
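
    To make the force-matching step concrete, here is a minimal sketch under simplifying assumptions (a single pair interaction, a polynomial repulsion, and synthetic reference forces standing in for the DFT-minus-electronic-DFTB residual); it illustrates the technique and is not the authors' implementation.

        import numpy as np

        rc = 3.0            # repulsion cutoff radius (illustrative, in Angstrom)
        orders = [2, 3, 4]  # polynomial orders in E_rep(r) = sum_k c_k * (rc - r)**k

        def pair_force(r, coeffs):
            """F(r) = -dE_rep/dr = sum_k c_k * k * (rc - r)**(k - 1), zero beyond rc."""
            r = np.asarray(r)
            f = np.zeros_like(r)
            inside = r < rc
            for c, k in zip(coeffs, orders):
                f[inside] += c * k * (rc - r[inside]) ** (k - 1)
            return f

        # Synthetic "reference" residual forces sampled along short trajectories
        rng = np.random.default_rng(0)
        r_samples = rng.uniform(1.5, 2.9, 200)
        f_ref = pair_force(r_samples, [4.0, -1.5, 0.3]) + rng.normal(0, 0.05, 200)

        # The force is linear in the coefficients, so force matching reduces
        # to a linear least-squares problem
        A = np.column_stack([k * (rc - r_samples) ** (k - 1) for k in orders])
        coeffs, *_ = np.linalg.lstsq(A, f_ref, rcond=None)
        print("fitted repulsion coefficients:", np.round(coeffs, 3))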

  18. An Overview of Recent Developments in Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Bennett, Robert M.; Edwards, John W.

    2004-01-01

    The motivation for Computational Aeroelasticity (CA) and the elements of one type of the analysis or simulation process are briefly reviewed. The need for streamlining and improving the overall process to reduce elapsed time and improve overall accuracy is discussed. Further effort is needed to establish the credibility of the methodology, obtain experience, and to incorporate the experience base to simplify the method for future use. Experience with the application of a variety of Computational Aeroelasticity programs is summarized for the transonic flutter of two wings, the AGARD 445.6 wing and a typical business jet wing. There is a compelling need for a broad range of additional flutter test cases for further comparisons. Some existing data sets that may offer CA challenges are presented.

  19. New diagnostic modalities in the diagnosis of heart failure.

    PubMed Central

    Mitchell, Judith E.; Palta, Sanjeev

    2004-01-01

    Heart failure (HF) is the one cardiovascular disease that is increasing in prevalence in the United States. As the population continues to age, the incidence will certainly be amplified. However, some studies have shown that HF is correctly diagnosed initially in only 50% of affected patients. Despite the use of history, physical examination, echocardiogram, and chest x-ray, the percentage of correct initial diagnoses of HF is low. Recognizing the symptoms of HF decompensations is often problematic because other diagnoses can mimic them. Two new diagnostic modalities offer promise in improving HF diagnostic accuracy and identifying early HF decompensations: tests utilizing impedance cardiography and the B-type natriuretic peptide assay. They have the potential to increase the accuracy of HF diagnosis and to guide pharmacological treatment in the inpatient and outpatient settings. They may also assist in the recognition (or prediction) of acute HF decompensations. PMID:15586645

  20. Design of time-pulse coded optoelectronic neuronal elements for nonlinear transformation and integration

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2008-03-01

    The paper demonstrates the relevance of neurophysiologically motivated neuron arrays with flexibly programmable functions and operations, which allow the required accuracy and the type of nonlinear transformation and learning to be selected. We consider the neuron design and present simulation results for multichannel spatio-temporal algebraic accumulation - integration of optical signals. Advantages for nonlinear transformation and summation - integration are shown. The offered circuits are simple and can have intellectual properties such as learning and adaptation. The integrator-neuron is based on CMOS current mirrors and comparators. Performance figures: power consumption of 100...500 μW, signal period of 0.1...1 ms, input optical signal power of 0.2...20 μW, time delays of less than 1 μs, 2...10 optical input signals, integration time of 10...100 signal periods, and an accuracy or integration error of about 1%. Various modifications of the neuron-integrators with improved performance and for different applications are considered in the paper.

  1. Cooperative multi-user detection and ranging based on pseudo-random codes

    NASA Astrophysics Data System (ADS)

    Morhart, C.; Biebl, E. M.

    2009-05-01

    We present an improved approach for a Round Trip Time of Flight distance measurement system. The system is intended for use in a cooperative localisation system for automotive applications. It is therefore designed to address a large number of communication partners per measurement cycle. By using coded signals in a time division multiple access scheme, we can detect a large number of pedestrian sensors with just one car sensor. We achieve this by using very short transmit bursts in combination with a real-time correlation algorithm. Furthermore, the correlation approach provides real-time time-of-arrival data that can serve as a trigger impulse for other communication systems. The distance accuracy of the correlation result was further increased by adding a Fourier interpolation filter. The system performance was checked with a prototype at 2.4 GHz. We reached a distance measurement accuracy of 12 cm at a range of up to 450 m.
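
    The core of such a correlation-based ranger fits in a few lines. The following simulation uses assumed parameters (sample rate, code length, noise level) and replaces the paper's Fourier interpolation with a simpler parabolic peak interpolation for brevity.

        import numpy as np

        c = 299_792_458.0  # speed of light, m/s
        fs = 100e6         # receiver sample rate, Hz (assumed)

        rng = np.random.default_rng(1)
        code = rng.choice([-1.0, 1.0], size=255)  # pseudo-noise transmit burst

        # Emulate a transponder echo arriving after a 2.5 us round trip
        rx = np.zeros(1024)
        d = int(round(2.5e-6 * fs))
        rx[d:d + code.size] = code
        rx += rng.normal(0, 0.3, rx.size)  # receiver noise

        corr = np.correlate(rx, code, mode="valid")  # matched-filter correlation
        k = int(np.argmax(corr))
        # Parabolic interpolation around the peak for sub-sample delay accuracy
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
        rtt = (k + frac) / fs
        print(f"estimated distance: {0.5 * rtt * c:.2f} m")  # one way = half the round trip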

  2. Quantitative fluorescence tomography using a trimodality system: in vivo validation

    PubMed Central

    Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-01-01

    A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT/DOT/XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770

  3. Estimating Plasma Glucose from Interstitial Glucose: The Issue of Calibration Algorithms in Commercial Continuous Glucose Monitoring Devices

    PubMed Central

    Rossetti, Paolo; Bondia, Jorge; Vehí, Josep; Fanelli, Carmine G.

    2010-01-01

    Evaluation of metabolic control in people with diabetes has classically been performed by measuring glucose concentrations in blood samples. Due to the potential improvement it offers in diabetes care, continuous glucose monitoring (CGM) in the subcutaneous tissue is gaining popularity among both patients and physicians. However, CGM devices measure glucose concentration in compartments other than blood, usually the interstitial space. This means that CGM devices need calibration against blood glucose values, and the accuracy of the estimation of blood glucose will also depend on the calibration algorithm. The complexity of the relationship between glucose dynamics in blood and in the interstitial space contrasts with the simplistic calibration algorithms currently implemented in commercial CGM devices, translating into suboptimal accuracy. The present review analyzes the issue of calibration algorithms for CGM, focusing exclusively on commercially available glucose sensors. PMID:22163505
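
    As a concrete illustration of the simplistic approach criticized above, the sketch below fits a two-parameter linear map from raw sensor signal to blood glucose; all numbers are invented for illustration, and the model deliberately ignores the blood-to-interstitium lag and sensor drift that the review identifies as sources of error.

        import numpy as np

        def fit_linear_calibration(raw, reference):
            """Least-squares fit of blood_glucose ~ slope * raw + offset."""
            slope, offset = np.polyfit(raw, reference, deg=1)
            return slope, offset

        # Paired (raw sensor signal, fingerstick blood glucose in mg/dL) points
        raw = np.array([12.1, 25.4, 40.2, 55.0])
        ref = np.array([60.0, 120.0, 185.0, 250.0])

        slope, offset = fit_linear_calibration(raw, ref)
        print(f"estimated blood glucose: {slope * 33.0 + offset:.0f} mg/dL")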

  4. Modern mass spectrometry for synthetic biology and structure-based discovery of natural products.

    PubMed

    Henke, Matthew T; Kelleher, Neil L

    2016-08-27

    Covering: up to 2016. In this highlight, we describe the current landscape for dereplication and discovery of natural products based on the measurement of the intact mass by LC-MS. It is often assumed that because better mass accuracy (provided by higher-resolution mass spectrometers) is necessary for absolute chemical formula determination (≤1 part per million), it is also necessary for dereplication of natural products. However, the average ability to dereplicate tapers off at ∼10 ppm, with only modest improvement gained from better mass accuracy when querying focused databases of natural products. We also highlight some recent examples of how these platforms are applied to synthetic biology, and recent methods for dereplication and correlation of substructures using tandem MS data. We also offer this highlight to serve as a brief primer for those entering the field of mass spectrometry-based natural products discovery.
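
    The ppm figures above follow from a simple relation, ppm error = 1e6 * (m_measured - m_theoretical) / m_theoretical. The worked example below (using an illustrative [M+H]+ m/z for erythromycin) shows how such a tolerance acts as a dereplication query window.

        def ppm_error(measured_mz, theoretical_mz):
            return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

        def within_tolerance(measured_mz, candidate_mz, tol_ppm=10.0):
            """Keep database candidates whose mass falls inside the ppm window."""
            return abs(ppm_error(measured_mz, candidate_mz)) <= tol_ppm

        measured = 734.4742   # an observed ion
        candidate = 734.4685  # erythromycin [M+H]+, monoisotopic
        print(f"{ppm_error(measured, candidate):.1f} ppm")        # ~7.8 ppm
        print(within_tolerance(measured, candidate, tol_ppm=10))  # True: dereplicated
        print(within_tolerance(measured, candidate, tol_ppm=1))   # False at formula-level tolerance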

  5. An adaptive gridless methodology in one dimension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, N.T.; Hailey, C.E.

    1996-09-01

    Gridless numerical analysis offers great potential for accurately solving for flow about complex geometries or moving boundary problems. Because gridless methods do not require point connection, the mesh cannot twist or distort. The gridless method utilizes a Taylor series about each point to obtain the unknown derivative terms from the current field variable estimates. The governing equation is then numerically integrated to determine the field variables for the next iteration. Effects of point spacing and Taylor series order on accuracy are studied, and they follow trends similar to those of traditional numerical techniques. Introducing adaption by point movement using a spring analogy allows the solution method to track a moving boundary. The adaptive gridless method models linear, nonlinear, steady, and transient problems. Comparison with known analytic solutions is given for these examples. Although point movement adaption does not provide a significant increase in accuracy, it helps capture important features and provides an improved solution.
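
    The Taylor-series step can be illustrated in one dimension: at each point, a truncated Taylor expansion is fitted to nearby field values by least squares, recovering derivatives without any point connectivity. The sketch below is illustrative and is not the report's code.

        import math
        import numpy as np

        def gridless_derivatives(x, u, i, order=2, n_neighbors=5):
            """Estimate [u', u'', ...] at x[i] from nearby scattered points."""
            dx = x - x[i]
            idx = np.argsort(np.abs(dx))[1:n_neighbors + 1]  # nearest neighbors, excluding i
            # Taylor: u(x_j) - u(x_i) = sum_k (dx_j**k / k!) * u^(k)(x_i)
            A = np.column_stack([dx[idx] ** k / math.factorial(k)
                                 for k in range(1, order + 1)])
            derivs, *_ = np.linalg.lstsq(A, u[idx] - u[i], rcond=None)
            return derivs

        rng = np.random.default_rng(2)
        x = np.sort(rng.uniform(0, 2 * np.pi, 60))  # scattered, unconnected points
        u = np.sin(x)
        i = 30
        d = gridless_derivatives(x, u, i)
        print(f"u'({x[i]:.2f}) = {d[0]:.4f}  (exact: {np.cos(x[i]):.4f})")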

  6. The Case for Laboratory Developed Procedures

    PubMed Central

    Sabatini, Linda M.; Tsongalis, Gregory J.; Caliendo, Angela M.; Olsen, Randall J.; Ashwood, Edward R.; Bale, Sherri; Benirschke, Robert; Carlow, Dean; Funke, Birgit H.; Grody, Wayne W.; Hayden, Randall T.; Hegde, Madhuri; Lyon, Elaine; Pessin, Melissa; Press, Richard D.; Thomson, Richard B.

    2017-01-01

    An explosion of knowledge and technology is revolutionizing medicine and patient care. Novel testing must be brought to the clinic with safety and accuracy, but also in a timely and cost-effective manner, so that patients can benefit and laboratories can offer testing consistent with current guidelines. Under the oversight provided by the Clinical Laboratory Improvement Amendments, laboratories have been able to develop and optimize laboratory procedures for use in-house. Quality improvement programs, interlaboratory comparisons, and the ability of laboratories to adjust assays as needed to improve results, utilize new sample types, or incorporate new mutations, information, or technologies are positive aspects of Clinical Laboratory Improvement Amendments oversight of laboratory-developed procedures. Laboratories have a long history of successful service to patients operating under Clinical Laboratory Improvement Amendments. A series of detailed clinical examples illustrating the quality and positive impact of laboratory-developed procedures on patient care is provided. These examples also demonstrate how Clinical Laboratory Improvement Amendments oversight ensures accurate, reliable, and reproducible testing in clinical laboratories. PMID:28815200

  7. Dual-modality imaging

    NASA Astrophysics Data System (ADS)

    Hasegawa, Bruce; Tang, H. Roger; Da Silva, Angela J.; Wong, Kenneth H.; Iwata, Koji; Wu, Max C.

    2001-09-01

    In comparison to conventional medical imaging techniques, dual-modality imaging offers the advantage of correlating anatomical information from X-ray computed tomography (CT) with functional measurements from single-photon emission computed tomography (SPECT) or positron emission tomography (PET). The combined X-ray/radionuclide images from dual-modality imaging can help the clinician to differentiate disease from normal uptake of radiopharmaceuticals, and to improve diagnosis and staging of disease. In addition, phantom and animal studies have demonstrated that a priori structural information from CT can be used to improve quantification of tissue uptake and organ function by correcting the radionuclide data for errors due to photon attenuation, partial volume effects, scatter radiation, and other physical effects. Dual-modality imaging therefore is emerging as a method of improving the visual quality and the quantitative accuracy of radionuclide imaging for diagnosis of patients with cancer and heart disease.

  8. Short-Term Intra-Subject Variation in Exhaled Volatile Organic Compounds (VOCs) in COPD Patients and Healthy Controls and Its Effect on Disease Classification

    PubMed Central

    Phillips, Christopher; Mac Parthaláin, Neil; Syed, Yasir; Deganello, Davide; Claypole, Timothy; Lewis, Keir

    2014-01-01

    Exhaled volatile organic compounds (VOCs) are of interest for their potential to diagnose disease non-invasively. However, most breath VOC studies have analyzed single breath samples from an individual and assumed them to be wholly consistent and representative of the person. This provided the motivation for an investigation of the variability of breath profiles when three breath samples are taken over a short time period (two-minute intervals between samples) for 118 stable patients with Chronic Obstructive Pulmonary Disease (COPD) and 63 healthy controls and analyzed by gas chromatography and mass spectrometry (GC/MS). The extent of the variation in VOC levels differed between COPD and healthy subjects, and the patterns of variation differed for isoprene versus the bulk of other VOCs. In addition, machine learning approaches were applied to the breath data to establish whether these samples differed in their ability to discriminate COPD from healthy states and whether aggregation of multiple samples into single data sets could offer improved discrimination. The three breath samples gave similar classification accuracy to one another when evaluated separately (66.5% to 68.3% of subjects classified correctly, depending on the breath repetition used). Combining multiple breath samples into single data sets gave better discrimination (73.4% of subjects classified correctly). Although this accuracy is not sufficient for COPD diagnosis in a clinical setting, enhanced sampling and analysis may improve accuracy further. Variability in samples, and short-term effects of practice or exertion, need to be considered in any breath testing program to improve reliability and optimize discrimination. PMID:24957028
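
    The aggregation step can be mimicked on synthetic data: repeated samples per subject are concatenated into one feature vector before classification, which averages out within-subject noise. The sketch below is illustrative and does not use the study's data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n_subjects, n_vocs, n_reps = 180, 20, 3
        y = rng.integers(0, 2, n_subjects)  # 0 = healthy, 1 = COPD
        signal = rng.normal(0, 1, (n_subjects, n_vocs)) + 0.4 * y[:, None]
        reps = signal[:, None, :] + rng.normal(0, 1.0, (n_subjects, n_reps, n_vocs))

        single = reps[:, 0, :]                                # one breath per subject
        combined = reps.reshape(n_subjects, n_reps * n_vocs)  # all three, concatenated

        clf = LogisticRegression(max_iter=1000)
        print("single sample:", cross_val_score(clf, single, y, cv=5).mean())
        print("combined     :", cross_val_score(clf, combined, y, cv=5).mean())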

  9. Short-Term Intra-Subject Variation in Exhaled Volatile Organic Compounds (VOCs) in COPD Patients and Healthy Controls and Its Effect on Disease Classification.

    PubMed

    Phillips, Christopher; Mac Parthaláin, Neil; Syed, Yasir; Deganello, Davide; Claypole, Timothy; Lewis, Keir

    2014-05-09

    Exhaled volatile organic compounds (VOCs) are of interest for their potential to diagnose disease non-invasively. However, most breath VOC studies have analyzed single breath samples from an individual and assumed them to be wholly consistent and representative of the person. This provided the motivation for an investigation of the variability of breath profiles when three breath samples are taken over a short time period (two-minute intervals between samples) for 118 stable patients with Chronic Obstructive Pulmonary Disease (COPD) and 63 healthy controls and analyzed by gas chromatography and mass spectrometry (GC/MS). The extent of the variation in VOC levels differed between COPD and healthy subjects, and the patterns of variation differed for isoprene versus the bulk of other VOCs. In addition, machine learning approaches were applied to the breath data to establish whether these samples differed in their ability to discriminate COPD from healthy states and whether aggregation of multiple samples into single data sets could offer improved discrimination. The three breath samples gave similar classification accuracy to one another when evaluated separately (66.5% to 68.3% of subjects classified correctly, depending on the breath repetition used). Combining multiple breath samples into single data sets gave better discrimination (73.4% of subjects classified correctly). Although this accuracy is not sufficient for COPD diagnosis in a clinical setting, enhanced sampling and analysis may improve accuracy further. Variability in samples, and short-term effects of practice or exertion, need to be considered in any breath testing program to improve reliability and optimize discrimination.

  10. DPOD2005: An extension of ITRF2005 for Precise Orbit Determination

    NASA Astrophysics Data System (ADS)

    Willis, P.; Ries, J. C.; Zelensky, N. P.; Soudarin, L.; Fagard, H.; Pavlis, E. C.; Lemoine, F. G.

    2009-09-01

    For Precise Orbit Determination (POD) of altimetry missions, we have computed a data set of DORIS station coordinates defined for specific time intervals, called DPOD2005. This terrestrial reference set is an extension of ITRF2005. However, it includes all new DORIS stations and is more reliable, as we disregard stations with large velocity formal errors that could contaminate POD computations in the near future. About 1/4 of the station coordinates needed to be defined, as they do not appear in the original ITRF2005 realization. These results were verified against available DORIS and GPS results, as the integrity of DPOD2005 is almost as critical as its accuracy. Besides station coordinates and velocities, we also provide additional information, such as periods for which DORIS data should be disregarded for specific DORIS stations, and epochs of coordinate and velocity discontinuities (related to geophysical events, equipment problems, or human intervention). The DPOD model was tested for orbit determination for TOPEX/Poseidon (T/P), Jason-1 and Jason-2. Test results show that DPOD2005 offers improvement over the original ITRF2005, an improvement that increases rapidly and significantly after 2005. The improvement is also significant for the early T/P cycles, indicating improved station velocities in the DPOD2005 model and a more complete station set. After 2005, the radial accuracy and centering of the ITRF2005-original orbits degrade rapidly due to station loss.

  11. Non-invasive Fetal ECG Signal Quality Assessment for Multichannel Heart Rate Estimation.

    PubMed

    Andreotti, Fernando; Graser, Felix; Malberg, Hagen; Zaunseder, Sebastian

    2017-12-01

    The noninvasive fetal ECG (NI-FECG) from abdominal recordings offers novel prospects for prenatal monitoring. However, NI-FECG signals are corrupted by various nonstationary noise sources, making the processing of abdominal recordings a challenging task. In this paper, we present an online approach that dynamically assesses the quality of NI-FECG to improve fetal heart rate (FHR) estimation. Using a naive Bayes classifier, state-of-the-art and novel signal quality indices (SQIs), and an existing adaptive Kalman filter, FHR estimation was improved. For the purpose of training and validating the proposed methods, a large annotated private clinical dataset was used. The suggested classification scheme demonstrated good agreement, as measured by Krippendorff's alpha, in determining the overall quality of NI-FECG signals, and the proposed Kalman filter outperformed alternative methods for FHR estimation. The proposed algorithm was able to reliably reflect changes of signal quality and can be used to improve FHR estimation. NI-FECG signal quality estimation and multichannel information fusion are largely unexplored topics. Based on previous works, multichannel FHR estimation is a field that could strongly benefit from such methods. The developed SQI algorithms as well as the resulting classifier were made available under a GNU GPL open-source license and contributed to the FECGSYN toolbox.
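
    The quality-classification step can be sketched as follows, with hypothetical SQI features (the paper's exact indices and its private dataset are not reproduced here).

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(4)
        n = 500
        # Hypothetical SQIs per NI-FECG segment, e.g. spectral concentration,
        # QRS-detector agreement, and sample entropy
        good = rng.normal([0.8, 0.9, 0.4], 0.10, (n, 3))
        bad = rng.normal([0.4, 0.5, 0.9], 0.15, (n, 3))
        X = np.vstack([good, bad])
        y = np.array([1] * n + [0] * n)  # 1 = usable segment

        clf = GaussianNB().fit(X, y)
        p_usable = clf.predict_proba([[0.75, 0.85, 0.5]])[0, 1]
        print(f"P(segment usable) = {p_usable:.2f}")
        # Downstream, such probabilities can weight each channel's FHR estimate,
        # e.g. via the measurement-noise term of an adaptive Kalman filter.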

  12. A cooperative transponder system for improved traffic safety, localizing road users in the 5 GHz band

    NASA Astrophysics Data System (ADS)

    Schaffer, B.; Kalverkamp, G.; Chaabane, M.; Biebl, E. M.

    2012-09-01

    We present a multi-user cooperative mobile transponder system which enables cars to localize pedestrians, bicyclists and other road users in order to improve traffic safety. The system operates at a center frequency of 5.768 GHz, offering the ability to test precision localization technology at frequencies close to the newly designated automotive safety-related bands around 5.9 GHz. By carrying out a round-trip time of flight measurement, the sensor can determine the distance from the onboard localization unit of a car to a road user who is equipped with an active transponder, employing the idea of a secondary radar and pulse compression. The onboard unit sends out a pseudo-noise coded interrogation pulse, which is answered by one or more transponders after a short waiting time. Each transponder uses a different waiting time in order to allow for time division multiple access. We present the system setup as well as range measurement results, achieving centimeter-level accuracy for the distance measurement and a range on the order of one hundred meters. We also discuss the effect of clock drift and offset on distance accuracy for different waiting times and show how the system can be improved to further increase precision in a multi-user environment.

  13. FRD and scrambling properties of recent non-circular fibres

    NASA Astrophysics Data System (ADS)

    Avila, Gerardo

    2012-09-01

    Optical fibres with octagonal, square and rectangular core shapes have been proposed as alternatives to circular fibres for linking telescopes to spectrographs in order to increase the accuracy of radial velocity measurements. Theoretically, they offer better scrambling properties than their circular counterparts. The first commercial octagonal fibres provided good near-field scrambling gains; unfortunately, the far-field scrambling gains were not significant. This article presents test results on new fibres from CeramOptec. The measurements show substantial improvements in the far-field scrambling gains. In addition, evaluation of their focal ratio degradation (FRD) shows much better performance than previous fibres.

  14. Self-consistent assessment of Englert-Schwinger model on atomic properties

    NASA Astrophysics Data System (ADS)

    Lehtomäki, Jouko; Lopez-Acevedo, Olga

    2017-12-01

    Our manuscript investigates a self-consistent solution of the statistical atom model proposed by Berthold-Georg Englert and Julian Schwinger (the ES model) and benchmarks it against atomic Kohn-Sham and two orbital-free models of the Thomas-Fermi-Dirac (TFD)-λvW family. Results show that the ES model generally offers the same accuracy as the well-known TFD-1/5vW model; however, the ES model corrects the failure in the Pauli potential near-nucleus region. We also point to the inability to describe low-Z atoms as the foremost concern in improving the present model.

  15. Self-consistent assessment of Englert-Schwinger model on atomic properties.

    PubMed

    Lehtomäki, Jouko; Lopez-Acevedo, Olga

    2017-12-21

    Our manuscript investigates a self-consistent solution of the statistical atom model proposed by Berthold-Georg Englert and Julian Schwinger (the ES model) and benchmarks it against atomic Kohn-Sham and two orbital-free models of the Thomas-Fermi-Dirac (TFD)-λvW family. Results show that the ES model generally offers the same accuracy as the well-known TFD-1/5vW model; however, the ES model corrects the failure in the Pauli potential near-nucleus region. We also point to the inability to describe low-Z atoms as the foremost concern in improving the present model.

  16. Introduction to anatomy on Wikipedia.

    PubMed

    Ledger, Thomas Stephen

    2017-09-01

    Wikipedia (www.wikipedia.com) is the largest encyclopaedia in existence. Of over five million English-language articles, about 6000 relate to Anatomy, which are viewed roughly 30 million times monthly. No work parallels the amount of attention, scope or interdisciplinary layout of Wikipedia, and it offers a unique opportunity to improve the anatomical literacy of the masses. Anatomy on Wikipedia is introduced from an editor's perspective. Article contributors, content, layout and accuracy are discussed, with a view to demystifying editing for anatomy professionals. A final request for edits or on-site feedback from anatomy professionals is made. © 2017 Anatomical Society.

  17. Performance of a Lexical and POS Tagger for Sanskrit

    NASA Astrophysics Data System (ADS)

    Hellwig, Oliver

    Due to the phonetic, morphological, and lexical complexity of Sanskrit, the automatic analysis of this language is a real challenge in the area of natural language processing. The paper describes a series of tests that were performed to assess the accuracy of the tagging program SanskritTagger. To our knowledge, it offers the first reliable benchmark data for evaluating the quality of taggers for Sanskrit using an unrestricted dictionary and texts from different domains. Based on a detailed analysis of the test results, the paper points out possible directions for future improvements of statistical tagging procedures for Sanskrit.

  18. Introducing a feedback training system for guided home rehabilitation.

    PubMed

    Kohler, Fabian; Schmitz-Rode, Thomas; Disselhorst-Klug, Catherine

    2010-01-15

    As the number of people requiring orthopaedic intervention grows, individualized physiotherapeutic rehabilitation and adequate postoperative care become increasingly relevant. The chance of improvement in the patient's condition is directly related to the performance and consistency of the physiotherapeutic exercises. In this paper, a smart, cost-effective and easy-to-use Feedback Training System for home rehabilitation based on standard resistive elements is introduced. The system ensures high accuracy of the exercises performed and provides guidance and control to the patient through direct feedback about the performance of the movements. 46 patients were recruited and performed standard physiotherapeutic training to evaluate the system. The results show a significant increase in the patients' ability to reproduce even simple physiotherapeutic exercises when supported by the Feedback Training System. Thus, physiotherapeutic training can be extended into the home environment whilst ensuring a high quality of training.

  19. Detection of Sub-fM DNA with Target Recycling and Self-Assembly Amplification on Graphene Field-Effect Biosensors

    PubMed Central

    2018-01-01

    All-electronic DNA biosensors based on graphene field-effect transistors (GFETs) offer the prospect of simple and cost-effective diagnostics. For GFET sensors based on complementary probe DNA, the sensitivity is limited by the binding affinity of the target oligonucleotide, in the nM range for 20 mer targets. We report a ∼20 000× improvement in sensitivity through the use of engineered hairpin probe DNA that allows for target recycling and hybridization chain reaction. This enables detection of 21 mer target DNA at sub-fM concentration and provides superior specificity against single-base mismatched oligomers. The work is based on a scalable fabrication process for biosensor arrays that is suitable for multiplexed detection. This approach overcomes the binding-affinity-dependent sensitivity of nucleic acid biosensors and offers a pathway toward multiplexed and label-free nucleic acid testing with high accuracy and selectivity. PMID:29768011

  20. Detection of Sub-fM DNA with Target Recycling and Self-Assembly Amplification on Graphene Field-Effect Biosensors.

    PubMed

    Gao, Zhaoli; Xia, Han; Zauberman, Jonathan; Tomaiuolo, Maurizio; Ping, Jinglei; Zhang, Qicheng; Ducos, Pedro; Ye, Huacheng; Wang, Sheng; Yang, Xinping; Lubna, Fahmida; Luo, Zhengtang; Ren, Li; Johnson, Alan T Charlie

    2018-06-13

    All-electronic DNA biosensors based on graphene field-effect transistors (GFETs) offer the prospect of simple and cost-effective diagnostics. For GFET sensors based on complementary probe DNA, the sensitivity is limited by the binding affinity of the target oligonucleotide, in the nM range for 20 mer targets. We report a ∼20 000× improvement in sensitivity through the use of engineered hairpin probe DNA that allows for target recycling and hybridization chain reaction. This enables detection of 21 mer target DNA at sub-fM concentration and provides superior specificity against single-base mismatched oligomers. The work is based on a scalable fabrication process for biosensor arrays that is suitable for multiplexed detection. This approach overcomes the binding-affinity-dependent sensitivity of nucleic acid biosensors and offers a pathway toward multiplexed and label-free nucleic acid testing with high accuracy and selectivity.

  1. ContextProvider: Context awareness for medical monitoring applications.

    PubMed

    Mitchell, Michael; Meyers, Christopher; Wang, An-I Andy; Tyson, Gary

    2011-01-01

    Smartphones are sensor-rich and Internet-enabled. With their on-board sensors, web services, social media, and external biosensors, smartphones can provide contextual information about the device, user, and environment, thereby enabling the creation of rich, biologically driven applications. We introduce ContextProvider, a framework that offers a unified, queryable interface to contextual data on the device. Unlike other context-based frameworks, ContextProvider offers interactive user feedback, self-adaptive sensor polling, and minimal reliance on third-party infrastructure. ContextProvider also allows for rapid development of new context- and bio-aware applications. Evaluation of ContextProvider shows that an additional monitoring sensor can be incorporated into the framework with fewer than 100 lines of Java code. With adaptive sensor monitoring, power consumption per sensor can be reduced to a 1% overhead. Finally, through the use of context, the accuracy of data interpretation can be improved by up to 80%.

  2. Numerical integration techniques for curved-element discretizations of molecule-solvent interfaces.

    PubMed

    Bardhan, Jaydeep P; Altman, Michael D; Willis, David J; Lippow, Shaun M; Tidor, Bruce; White, Jacob K

    2007-07-07

    Surface formulations of biophysical modeling problems offer attractive theoretical and computational properties. Numerical simulations based on these formulations usually begin with discretization of the surface under consideration; often, the surface is curved, possessing complicated structure and possibly singularities. Numerical simulations commonly are based on approximate, rather than exact, discretizations of these surfaces. To assess the strength of the dependence of simulation accuracy on the fidelity of surface representation, here methods were developed to model several important surface formulations using exact surface discretizations. Following and refining Zauhar's work [J. Comput.-Aided Mol. Des. 9, 149 (1995)], two classes of curved elements were defined that can exactly discretize the van der Waals, solvent-accessible, and solvent-excluded (molecular) surfaces. Numerical integration techniques are presented that can accurately evaluate nonsingular and singular integrals over these curved surfaces. After validating the exactness of the surface discretizations and demonstrating the correctness of the presented integration methods, a set of calculations are presented that compare the accuracy of approximate, planar-triangle-based discretizations and exact, curved-element-based simulations of surface-generalized-Born (sGB), surface-continuum van der Waals (scvdW), and boundary-element method (BEM) electrostatics problems. Results demonstrate that continuum electrostatic calculations with BEM using curved elements, piecewise-constant basis functions, and centroid collocation are nearly ten times more accurate than planar-triangle BEM for basis sets of comparable size. The sGB and scvdW calculations give exceptional accuracy even for the coarsest obtainable discretized surfaces. The extra accuracy is attributed to the exact representation of the solute-solvent interface; in contrast, commonly used planar-triangle discretizations can only offer improved approximations with increasing discretization and associated increases in computational resources. The results clearly demonstrate that the methods for approximate integration on an exact geometry are far more accurate than exact integration on an approximate geometry. A MATLAB implementation of the presented integration methods and sample data files containing curved-element discretizations of several small molecules are available online as supplemental material.

  3. Re-designing scanning to reduce learning demands: the performance of typically developing 2-year-olds.

    PubMed

    McCarthy, John; Light, Janice; Drager, Kathryn; McNaughton, David; Grodzicki, Laura; Jones, Jonathan; Panek, Elizabeth; Parkin, Elizabeth

    2006-12-01

    Children with severe motor impairments who cannot use direct selection are typically introduced to scanning as a means of accessing assistive technology. Unfortunately, it is difficult for young children to learn to scan because the design of current scanning techniques does not always make explicit the offer of items from the selection array; furthermore, it does not provide explicit feedback after activation of the switch to select the target item. In the current study, scanning was redesigned to reduce learning demands by making both the offer of items and the feedback upon selection more explicit through the use of animation realized through HTML and speech output with appropriate intonation. Twenty typically developing 2-year-olds without disabilities were randomly assigned to use either traditional scanning or enhanced scanning to select target items from an array of three items. The 2-year-olds did not learn to use traditional scanning across three sessions. Their performance in Session 3 did not differ from that in Session 1; they did not exceed chance levels of accuracy in either session (mean accuracy of 20% for Sessions 1 and 3). In contrast, the children in the enhanced scanning condition demonstrated improvements in accuracy across the three 10-20-min sessions (mean accuracies of 22 and 48% for Sessions 1 and 3, respectively). There were no reliable differences between the children's performances with the two scanning techniques for Session 1; however, by Session 3, the children were more than twice as accurate using the enhanced scanning technique compared to the traditional design. Results suggest that by redesigning scanning, we may be able to reduce some of the learning demands and thereby reduce some of the instructional time required for children to attain mastery. Clinical implications, limitations, and directions for future research and development are discussed.

  4. Gap-filling a spatially explicit plant trait database: comparing imputation methods and different levels of environmental information

    NASA Astrophysics Data System (ADS)

    Poyatos, Rafael; Sus, Oliver; Badiella, Llorenç; Mencuccini, Maurizio; Martínez-Vilalta, Jordi

    2018-05-01

    The ubiquity of missing data in plant trait databases may hinder trait-based analyses of ecological patterns and processes. Spatially explicit datasets with information on intraspecific trait variability are rare but offer great promise in improving our understanding of functional biogeography. At the same time, they offer specific challenges in terms of data imputation. Here we compare statistical imputation approaches, using varying levels of environmental information, for five plant traits (leaf biomass to sapwood area ratio, leaf nitrogen content, maximum tree height, leaf mass per area and wood density) in a spatially explicit plant trait dataset of temperate and Mediterranean tree species (Ecological and Forest Inventory of Catalonia, IEFC, dataset for Catalonia, north-east Iberian Peninsula, 31,900 km²). We simulated gaps at different missingness levels (10-80%) in a complete trait matrix, and we used overall trait means, species means, k nearest neighbours (kNN), ordinary and regression kriging, and multivariate imputation using chained equations (MICE) to impute missing trait values. We assessed these methods in terms of their accuracy and of their ability to preserve trait distributions, multi-trait correlation structure and bivariate trait relationships. The relatively good performance of mean and species mean imputations in terms of accuracy masked a poor representation of trait distributions and multivariate trait structure. Species identity improved MICE imputations for all traits, whereas forest structure and topography improved imputations for some traits. No method performed best consistently for the five studied traits but, considering all traits and performance metrics, MICE informed by relevant ecological variables gave the best results. However, at higher missingness (>30%), species mean imputations and regression kriging tended to outperform MICE for some traits. MICE informed by relevant ecological variables allowed us to fill the gaps in the IEFC incomplete dataset (5495 plots) and quantify imputation uncertainty. The resulting spatial patterns of the studied traits in Catalan forests were broadly similar when using species means, regression kriging or the best-performing MICE application, but some important discrepancies were observed at the local level. Our results highlight the need to assess imputation quality beyond just imputation accuracy and show that including environmental information in statistical imputation approaches yields more plausible imputations in spatially explicit plant trait datasets.
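
    In the same spirit, a MICE-style imputation with an environmental covariate can be sketched with scikit-learn's IterativeImputer standing in for MICE; the column names and relationships below are illustrative, not the IEFC schema.

        import numpy as np
        import pandas as pd
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(5)
        n = 300
        elevation = rng.uniform(0, 2000, n)  # complete environmental covariate
        df = pd.DataFrame({
            "elevation": elevation,
            "wood_density": 0.6 - 1e-4 * elevation + rng.normal(0, 0.03, n),
            "max_height": 30 - 8e-3 * elevation + rng.normal(0, 2, n),
        })

        # Simulate 30% missingness in the two traits
        truth = df["wood_density"].to_numpy()
        mask = rng.random((n, 2)) < 0.3
        df_missing = df.copy()
        df_missing.loc[mask[:, 0], "wood_density"] = np.nan
        df_missing.loc[mask[:, 1], "max_height"] = np.nan

        imputer = IterativeImputer(sample_posterior=True, random_state=0)  # chained equations
        imputed = imputer.fit_transform(df_missing)
        rmse = np.sqrt(np.mean((imputed[mask[:, 0], 1] - truth[mask[:, 0]]) ** 2))
        print(f"wood density imputation RMSE: {rmse:.3f}")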

  5. Uncertainty of OpenStreetMap data for the road network in Cyprus

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris

    2016-08-01

    Volunteered geographic information (VGI) refers to the geographic data compiled and created by individuals and rendered on the Internet through specific web-based tools for diverse areas of interest. One of the most well-known VGI projects is OpenStreetMap (OSM), which provides worldwide free geospatial data representing a variety of features. A critical issue for all VGI initiatives is the quality of the information offered. Thus, this report looks into the uncertainty of the OSM dataset for the main road network in Cyprus. The evaluation is based on three basic quality standards, namely positional accuracy, completeness and attribute accuracy. The work was carried out by employing the Model Builder of ArcGIS, which facilitated the comparison between the OSM data and the authoritative data provided by the Public Works Department (PWD). Findings showed that positional accuracy increases with the hierarchical level of a road, that it varies by administrative district, and that around 70% of the roads have a positional accuracy within 6 m of the reference dataset. Completeness in terms of road length difference is around 25% for three out of four road categories examined, and road name completeness is 100% for higher-level roads and around 40% for lower-level roads. Attribute accuracy, focusing on road names, is very high for all levels of roads. These outputs indicate that OSM data are good enough if they are fit for the intended purpose of use. Furthermore, the study revealed some weaknesses of the methods used for calculating positional accuracy, suggesting the need for methodological improvements.
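
    One common way to obtain such a positional-accuracy figure is a buffer overlap test: the fraction of volunteered road length falling within a tolerance buffer around the reference axis. The sketch below uses shapely and invented coordinates; it is an assumed stand-in for the report's ArcGIS Model Builder workflow.

        from shapely.geometry import LineString

        reference = LineString([(0, 0), (100, 0)])  # authoritative (PWD) road axis
        osm = LineString([(0, 3), (100, 5)])        # volunteered (OSM) road

        tolerance = reference.buffer(6.0)  # 6 m buffer around the reference axis
        inside = osm.intersection(tolerance).length / osm.length
        print(f"{100 * inside:.0f}% of the OSM road lies within 6 m of the reference")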

  6. Estimation of Antenna Pose in the Earth Frame Using Camera and IMU Data from Mobile Phones

    PubMed Central

    Wang, Zhen; Jin, Bingwen; Geng, Weidong

    2017-01-01

    The poses of base station antennas play an important role in cellular network optimization. Existing methods of pose estimation are based on physical measurements performed either by tower climbers or using additional sensors attached to antennas. In this paper, we present a novel non-contact method of antenna pose measurement based on multi-view images of the antenna and inertial measurement unit (IMU) data captured by a mobile phone. Given a known 3D model of the antenna, we first estimate the antenna pose relative to the phone camera from the multi-view images and then employ the corresponding IMU data to transform the pose from the camera coordinate frame into the Earth coordinate frame. To enhance the resulting accuracy, we improve existing camera-IMU calibration models by introducing additional degrees of freedom between the IMU sensors and defining a new error metric based on both the downtilt and azimuth angles, instead of a unified rotational error metric, to refine the calibration. In comparison with existing camera-IMU calibration methods, our method achieves an improvement in azimuth accuracy of approximately 1.0 degree on average while maintaining the same level of downtilt accuracy. For the pose estimation in the camera coordinate frame, we propose an automatic method of initializing the optimization solver and generating bounding constraints on the resulting pose to achieve better accuracy. With this initialization, state-of-the-art visual pose estimation methods yield satisfactory results in more than 75% of cases when plugged into our pipeline, and our solution, which takes advantage of the constraints, achieves even lower estimation errors on the downtilt and azimuth angles, both on average (0.13 and 0.3 degrees lower, respectively) and in the worst case (0.15 and 7.3 degrees lower, respectively), according to an evaluation conducted on a dataset consisting of 65 groups of data. We show that both of our enhancements contribute to the performance improvement offered by the proposed estimation pipeline, which achieves downtilt and azimuth accuracies of respectively 0.47 and 5.6 degrees on average and 1.38 and 12.0 degrees in the worst case, thereby satisfying the accuracy requirements for network optimization in the telecommunication industry. PMID:28397765
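
    The frame-chaining step at the heart of the method can be sketched with rotation composition; the Euler angles and axis conventions below are illustrative assumptions, and the paper's refined camera-IMU calibration is not reproduced.

        import numpy as np
        from scipy.spatial.transform import Rotation as R

        R_earth_imu = R.from_euler("ZYX", [30, 5, 2], degrees=True)      # phone attitude from IMU
        R_imu_cam = R.from_euler("ZYX", [0.5, -0.3, 0.2], degrees=True)  # camera-IMU calibration
        R_cam_ant = R.from_euler("ZYX", [10, -4, 0], degrees=True)       # pose from multi-view images

        R_earth_ant = R_earth_imu * R_imu_cam * R_cam_ant  # chain the frames

        boresight = R_earth_ant.apply([1.0, 0.0, 0.0])  # antenna boresight, assumed along x
        azimuth = np.degrees(np.arctan2(boresight[1], boresight[0])) % 360
        downtilt = -np.degrees(np.arcsin(boresight[2]))  # positive = tilted below horizontal
        print(f"azimuth {azimuth:.1f} deg, downtilt {downtilt:.1f} deg")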

  7. SEER*Educate: Use of Abstracting Quality Index Scores to Monitor Improvement of All Employees.

    PubMed

    Potts, Mary S; Scott, Tim; Hafterson, Jennifer L

    2016-01-01

    Integral parts of the continuous improvement model at the Seattle-Puget Sound Cancer Surveillance System registry include the incorporation of SEER*Educate into its training program for all staff and the analysis of assessment results using the Abstracting Quality Index (AQI). The AQI offers a comprehensive measure of overall performance in SEER*Educate, a Web-based application used to personalize learning and diagnostically pinpoint each staff member's place on the AQI continuum. The assessment results are tallied from 6 abstracting standards within 2 domains: incidence reporting and coding accuracy. More than 100 data items are aligned to 1 or more of the 6 standards to build an aggregated score that is placed on a continuum for continuous improvement. The AQI score accurately identifies those individuals who have a good understanding of how to apply the 6 abstracting standards to reliably generate high-quality abstracts.

  8. High-resolution method for evolving complex interface networks

    NASA Astrophysics Data System (ADS)

    Pan, Shucheng; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2018-04-01

    In this paper we describe a high-resolution transport formulation of the regional level-set approach for an improved prediction of the evolution of complex interface networks. The novelty of this method is twofold: (i) construction of local level sets and reconstruction of a global level set, (ii) local transport of the interface network by employing high-order spatial discretization schemes for improved representation of complex topologies. Various numerical test cases of multi-region flow problems, including triple-point advection, single vortex flow, mean curvature flow, normal driven flow, dry foam dynamics and shock-bubble interaction show that the method is accurate and suitable for a wide range of complex interface-network evolutions. Its overall computational cost is comparable to the Semi-Lagrangian regional level-set method while the prediction accuracy is significantly improved. The approach thus offers a viable alternative to previous interface-network level-set method.

  9. Logical Differential Prediction Bayes Net, improving breast cancer diagnosis for older women.

    PubMed

    Nassif, Houssam; Wu, Yirong; Page, David; Burnside, Elizabeth

    2012-01-01

    Overdiagnosis is a phenomenon in which screening identifies cancer which may not go on to cause symptoms or death. Women over 65 who develop breast cancer bear the heaviest burden of overdiagnosis. This work introduces novel machine learning algorithms to improve the diagnostic accuracy of breast cancer in aging populations. At the same time, we aim to minimize unnecessary invasive procedures (thus decreasing false positives) while concomitantly addressing overdiagnosis. We develop a novel algorithm, Logical Differential Prediction Bayes Net (LDP-BN), that calculates the risk of breast disease based on mammography findings. LDP-BN uses Inductive Logic Programming (ILP) to learn relational rules, selects older-specific differentially predictive rules, and incorporates them into a Bayes Net, significantly improving its performance. In addition, LDP-BN offers valuable insight into the classification process, revealing novel older-specific rules that link mass presence to invasive disease, and calcification presence with lack of a detectable mass to DCIS.

  10. Icon arrays help younger children's proportional reasoning.

    PubMed

    Ruggeri, Azzurra; Vagharchakian, Laurianne; Xu, Fei

    2018-06-01

    We investigated the effects of two context variables, presentation format (icon arrays or numerical frequencies) and time limitation (limited or unlimited time), on the proportional reasoning abilities of children aged 7 and 10 years, as well as adults. Participants had to select, between two sets of tokens, the one that offered the highest likelihood of drawing a gold token, that is, the set of elements with the greater proportion of gold tokens. Results show that participants performed better in the unlimited time condition. Moreover, besides a general developmental improvement in accuracy, our results show that younger children performed better when proportions were presented as icon arrays, whereas older children and adults were similarly accurate in the two presentation format conditions. Statement of contribution What is already known on this subject? There is a developmental improvement in proportional reasoning accuracy. Icon arrays facilitate reasoning in adults with low numeracy. What does this study add? Participants were more accurate when they were given more time to make the proportional judgement. Younger children's proportional reasoning was more accurate when they were presented with icon arrays. Proportional reasoning abilities correlate with working memory, approximate number system, and subitizing skills. © 2018 The British Psychological Society.

  11. A biased opinion: Demonstration of cognitive bias on a fingerprint matching task through knowledge of DNA test results.

    PubMed

    Stevenage, Sarah V; Bennett, Alice

    2017-07-01

    One study is presented which explores the biasing effects of irrelevant contextual information on a fingerprint matching task. Bias was introduced by providing the outcomes of a DNA test relating to each fictitious case under consideration. This was engineered to suggest either a match, no match, or an inconclusive outcome, and was thus either consistent, misleading or unbiased depending on the ground truth of each fingerprint pair. The results suggested that, when the difficulty of the fingerprint matching task was measurably increased, participants became more vulnerable to the biasing information. Under such conditions, when performance was good, misleading evidence lowered accuracy, and when performance was weaker, consistent evidence improved accuracy. As such, the results confirmed existing demonstrations of cognitive bias from contextual information in the fingerprint task. Moreover, by taking a process-based approach, it became possible to articulate the concerns, and the potential solutions, at each stage of the workflow. The results offer value for the forensic science community in extending the evidence-base regarding cognitive bias, and in articulating routes to improve the credibility of fingerprint decisions. Copyright © 2017. Published by Elsevier B.V.

  12. Precision enhancement of pavement roughness localization with connected vehicles

    NASA Astrophysics Data System (ADS)

    Bridgelall, R.; Huang, Y.; Zhang, Z.; Deng, F.

    2016-02-01

    Transportation agencies rely on the accurate localization and reporting of roadway anomalies that could pose serious hazards to the traveling public. However, the cost and technical limitations of present methods prevent their scaling to all roadways. Connected vehicles with on-board accelerometers and conventional geospatial position receivers offer an attractive alternative because of their potential to monitor all roadways in real-time. The conventional global positioning system is ubiquitous and essentially free to use but it produces impractically large position errors. This study evaluated the improvement in precision achievable by augmenting the conventional geo-fence system with a standard speed bump or an existing anomaly at a pre-determined position to establish a reference inertial marker. The speed sensor subsequently generates position tags for the remaining inertial samples by computing their path distances relative to the reference position. The error model and a case study using smartphones to emulate connected vehicles revealed that the precision in localization improves from tens of metres to sub-centimetre levels, and the accuracy of measuring localized roughness more than doubles. The research results demonstrate that transportation agencies will benefit from using the connected vehicle method to achieve precision and accuracy levels that are comparable to existing laser-based inertial profilers.
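
    The core of the relative-positioning scheme is simple arithmetic: once the reference inertial marker (the speed bump) is detected, every subsequent inertial sample is tagged with a path distance obtained by integrating vehicle speed. A minimal sketch, with illustrative numbers rather than the study's data:

```python
# Tag inertial samples by integrating speed from a known reference marker.
import numpy as np

ref_position_m = 1250.0            # surveyed position of the speed bump (assumed)
t = np.linspace(0.0, 10.0, 101)    # seconds since the bump was detected
speed = np.full_like(t, 20.0)      # speed-sensor samples, m/s

# Cumulative path distance relative to the reference (trapezoidal rule)
path = np.concatenate(([0.0], np.cumsum(0.5 * (speed[1:] + speed[:-1]) * np.diff(t))))
position_tags = ref_position_m + path
print(position_tags[-1])           # ~1450 m after 10 s at 20 m/s
```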

  13. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  14. Prediction of enteric methane emissions from sheep offered fresh perennial ryegrass (Lolium perenne) using data measured in indirect open-circuit respiration chambers.

    PubMed

    Zhao, Y G; O'Connell, N E; Yan, T

    2016-06-01

    Development of effective methane (CH4) mitigation strategies for grazing sheep requires accurate prediction tools. The present study aimed to identify key parameters influencing enteric CH4 emissions and to develop prediction equations for enteric CH4 emissions from sheep offered fresh grass. The data used were collected from 82 sheep offered fresh perennial ryegrass (Lolium perenne) as sole diets in 6 metabolism experiments (data from non-grass-only diets were not used). Sheep were from the Highlander, Texel, Scottish Blackface, and Swaledale breeds, aged 5 to 18 mo and weighing from 24.5 to 62.7 kg. Grass was harvested daily from 6 swards on contrasting harvest dates (May to December). Before the commencement of each study, the experimental sward was harvested at a residual height of 4 cm and allowed to grow for 2 to 4 wk. The feeding trials commenced when the grass sward was suitable for zero grazing (average grass height = 15 cm), thus offering grass of a quality similar to what grazing animals would receive under routine grazing management. Sheep were housed in individual pens for 14 d and then moved to individual calorimeter chambers for 4 d. Feed intake, fecal and urine outputs, and CH4 emissions were measured during the final 4 d. Data were analyzed using the REML procedure to develop prediction equations for CH4 emissions. Linear and multiple prediction equations were developed using BW, DMI, GE intake (GEI), and grass chemical concentrations (DM, OM, water-soluble carbohydrates [WSC], NDF, ADF, nitrogen [N], GE, DE, and ME) as explanatory variables. The mean CH4 production was 21.1 g/kg DMI or 0.062 MJ/MJ GEI. Dry matter intake and GEI were much more accurate predictors of CH4 emissions than BW (P < 0.001; R² = 0.86 and R² = 0.87 vs. R² = 0.09, respectively). Adding grass DE and ME concentrations and grass nutrient concentrations (e.g., OM, N, GE, NDF, and WSC) to the relationships between DMI or GEI and CH4 emissions improved prediction accuracy, with R² values increased to 0.93. Models based on farm-level data, for example, BW and grass nutrient (i.e., DM, GE, OM, and N) concentrations, were also developed and performed satisfactorily (P < 0.001, R² = 0.63). These models can contribute to improved prediction accuracy for enteric CH4 emissions from sheep grazing on ryegrass pasture.
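
    As an illustration of the kind of linear prediction equation reported (the study itself used REML on the full dataset), the sketch below fits CH4 output against dry matter intake with ordinary least squares on synthetic stand-in data:

```python
# Toy linear prediction equation: CH4 (g/d) vs. dry matter intake (kg/d).
import numpy as np

dmi = np.array([0.6, 0.8, 1.0, 1.2, 1.4, 1.6])   # synthetic DMI values, kg/d
ch4 = 21.1 * dmi + np.random.default_rng(1).normal(0.0, 1.0, dmi.size)

slope, intercept = np.polyfit(dmi, ch4, 1)
pred = slope * dmi + intercept
r2 = 1.0 - np.sum((ch4 - pred) ** 2) / np.sum((ch4 - ch4.mean()) ** 2)
print(f"CH4 = {slope:.1f} * DMI + {intercept:.1f}  (R^2 = {r2:.2f})")
```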

  15. An Application of Multi-band Forced Photometry to One Square Degree of SERVS: Accurate Photometric Redshifts and Implications for Future Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nyland, Kristina; Lacy, Mark; Sajina, Anna

    We apply The Tractor image modeling code to improve upon existing multi-band photometry for the Spitzer Extragalactic Representative Volume Survey (SERVS). SERVS consists of post-cryogenic Spitzer observations at 3.6 and 4.5 μm over five well-studied deep fields spanning 18 deg². In concert with data from ground-based near-infrared (NIR) and optical surveys, SERVS aims to provide a census of the properties of massive galaxies out to z ≈ 5. To accomplish this, we are using The Tractor to perform “forced photometry.” This technique employs prior measurements of source positions and surface brightness profiles from a high-resolution fiducial band from the VISTA Deep Extragalactic Observations survey to model and fit the fluxes at lower-resolution bands. We discuss our implementation of The Tractor over a square-degree test region within the XMM Large Scale Structure field with deep imaging in 12 NIR/optical bands. Our new multi-band source catalogs offer a number of advantages over traditional position-matched catalogs, including (1) consistent source cross-identification between bands, (2) de-blending of sources that are clearly resolved in the fiducial band but blended in the lower resolution SERVS data, (3) a higher source detection fraction in each band, (4) a larger number of candidate galaxies in the redshift range 5 < z < 6, and (5) a statistically significant improvement in the photometric redshift accuracy as evidenced by the significant decrease in the fraction of outliers compared to spectroscopic redshifts. Thus, forced photometry using The Tractor offers a means of improving the accuracy of multi-band extragalactic surveys designed for galaxy evolution studies. We will extend our application of this technique to the full SERVS footprint in the future.
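
    Conceptually, forced photometry reduces to a linear problem: with source positions and profiles fixed from the fiducial band, each lower-resolution image is fit only for per-source fluxes. The toy sketch below (not The Tractor's actual API) de-blends two overlapping Gaussian sources by least squares:

```python
# Toy forced photometry: fixed positions/profiles, solve for fluxes only.
import numpy as np

ny = nx = 32
yy, xx = np.mgrid[0:ny, 0:nx]

def psf(x0, y0, sigma=2.0):
    """Unit-flux Gaussian profile at a position fixed by the fiducial band."""
    g = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
    return g / g.sum()

positions = [(10.0, 12.0), (14.0, 15.0)]    # a blended pair, known a priori
templates = np.stack([psf(x, y).ravel() for x, y in positions], axis=1)

true_flux = np.array([500.0, 300.0])
image = templates @ true_flux + np.random.default_rng(2).normal(0.0, 0.5, nx * ny)

fluxes, *_ = np.linalg.lstsq(templates, image, rcond=None)
print(fluxes)                               # recovers ~[500, 300]
```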

  16. Comparison of a row-column speller vs. a novel lateral single-character speller: assessment of BCI for severe motor disabled patients.

    PubMed

    Pires, Gabriel; Nunes, Urbano; Castelo-Branco, Miguel

    2012-06-01

    Non-invasive brain-computer interface (BCI) based on electroencephalography (EEG) offers a new communication channel for people suffering from severe motor disorders. This paper presents a novel P300-based speller called lateral single-character (LSC). The LSC performance is compared to that of the standard row-column (RC) speller. We developed LSC, a single-character paradigm comprising all letters of the alphabet following an event strategy that significantly reduces the time for symbol selection, and explores the intrinsic hemispheric asymmetries in visual perception to improve the performance of the BCI. RC and LSC paradigms were tested by 10 able-bodied participants, seven participants with amyotrophic lateral sclerosis (ALS), five participants with cerebral palsy (CP), one participant with Duchenne muscular dystrophy (DMD), and one participant with spinal cord injury (SCI). The averaged results, taking into account all participants who were able to control the BCI online, were significantly higher for LSC, 26.11 bit/min and 89.90% accuracy, than for RC, 21.91 bit/min and 88.36% accuracy. The two paradigms produced different waveforms and the signal-to-noise ratio was significantly higher for LSC. Finally, the novel LSC also showed new discriminative features. The results suggest that LSC is an effective alternative to RC, and that LSC still has a margin for potential improvement in bit rate and accuracy. The high bit rates and accuracy of LSC are a step forward for the effective use of BCI in clinical applications. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
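
    The bit/min figures quoted for BCI spellers are typically computed with the Wolpaw information-transfer-rate formula; assuming that definition (the paper may use a variant), the arithmetic is:

```python
# Wolpaw bits per selection: B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)).
import math

def wolpaw_bits(n_symbols: int, accuracy: float) -> float:
    p = accuracy
    b = math.log2(n_symbols)
    if 0.0 < p < 1.0:
        b += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_symbols - 1))
    return b

bits = wolpaw_bits(n_symbols=28, accuracy=0.899)  # illustrative alphabet size
print(f"{bits:.2f} bits/selection")               # x selections/min gives bit/min
```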

  17. Early Detection of Ureteropelvic Junction Obstruction Using Signal Analysis and Machine Learning: A Dynamic Solution to a Dynamic Problem.

    PubMed

    Blum, Emily S; Porras, Antonio R; Biggs, Elijah; Tabrizi, Pooneh R; Sussman, Rachael D; Sprague, Bruce M; Shalaby-Rana, Eglal; Majd, Massoud; Pohl, Hans G; Linguraru, Marius George

    2017-10-21

    We sought to define features that describe the dynamic information in diuresis renograms for the early detection of clinically significant hydronephrosis caused by ureteropelvic junction obstruction. We studied the diuresis renograms of 55 patients with a mean ± SD age of 75 ± 66 days who had congenital hydronephrosis at initial presentation. Five patients had bilaterally affected kidneys, for a total of 60 diuresis renograms. Surgery was performed on 35 kidneys. We extracted 45 features based on curve shape and wavelet analysis from the drainage curves recorded after furosemide administration. The optimal features were selected as the combination that maximized the ROC AUC obtained from a linear support vector machine classifier trained to classify patients as with or without obstruction. Using these optimal features, we performed leave-one-out cross-validation to estimate the accuracy, sensitivity, and specificity of our framework. Results were compared to those obtained using post-diuresis drainage half-time and the percent of clearance after 30 minutes. Our framework had 93% accuracy, including 91% sensitivity and 96% specificity, in predicting surgical cases. This was a significant improvement over the 82% accuracy, including 71% sensitivity and 96% specificity, obtained from half-time and 30-minute clearance using the optimal thresholds of 24.57 minutes and 55.77%, respectively. Our machine learning framework significantly improved the diagnostic accuracy of clinically significant hydronephrosis compared to half-time and 30-minute clearance. This aids the clinical decision-making process by offering a tool for earlier detection of severe cases, and it has the potential to reduce the number of diuresis renograms required for diagnosis. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
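
    The evaluation protocol described, a linear support vector machine scored by leave-one-out cross-validation, is straightforward to reproduce in outline. In this sketch the 45 drainage-curve features are replaced by random numbers, so it shows the scheme rather than the published result:

```python
# Linear SVM with leave-one-out cross-validation over renogram features.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 45))      # 60 renograms x 45 features (synthetic)
y = rng.integers(0, 2, size=60)    # 1 = surgical (obstructed), 0 = non-surgical

acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"LOOCV accuracy: {acc:.2f}")
```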

  18. An efficient impedance method for induced field evaluation based on a stabilized Bi-conjugate gradient algorithm.

    PubMed

    Wang, Hua; Liu, Feng; Xia, Ling; Crozier, Stuart

    2008-11-21

    This paper presents a stabilized bi-conjugate gradient algorithm (BiCGstab) that can significantly improve the performance of the impedance method, which has been widely applied to model low-frequency field induction phenomena in voxel phantoms. The improved impedance method offers remarkable computational advantages in terms of convergence performance and memory consumption over the conventional successive over-relaxation (SOR)-based algorithm. The scheme has been validated against other numerical/analytical solutions on a lossy, multilayered sphere phantom excited by an ideal coil loop. To demonstrate the computational performance and application capability of the developed algorithm, the induced fields inside a human phantom due to a low-frequency hyperthermia device are evaluated. The simulation results show the numerical accuracy and superior performance of the method.
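
    The numerical core is the iterative solution of the impedance method's large sparse linear system with a stabilized bi-conjugate gradient solver. SciPy's bicgstab illustrates the idea on a small, well-conditioned random system (the voxel-phantom systems in the paper are far larger):

```python
# BiCGstab on a sparse linear system, the solver class used by the paper.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab

n = 1000
A = sp.random(n, n, density=0.005, random_state=4, format="csr")
A = A + 10.0 * sp.identity(n)          # strengthen the diagonal for convergence
b = np.ones(n)

x, info = bicgstab(A, b)               # info == 0 means the solver converged
print(info, np.linalg.norm(A @ x - b))
```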

  19. Fontan Surgical Planning: Previous Accomplishments, Current Challenges, and Future Directions.

    PubMed

    Trusty, Phillip M; Slesnick, Timothy C; Wei, Zhenglun Alan; Rossignac, Jarek; Kanter, Kirk R; Fogel, Mark A; Yoganathan, Ajit P

    2018-04-01

    The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool for accurately modeling patient-specific hemodynamics.

  20. Echidna Mark II: one giant leap for 'tilting spine' fibre positioning technology

    NASA Astrophysics Data System (ADS)

    Gilbert, James; Dalton, Gavin

    2016-07-01

    The Australian Astronomical Observatory's 'tilting spine' fibre positioning technology has been redeveloped to provide superior performance in a smaller package. The new design offers demonstrated closed-loop positioning errors of <2.8 μm RMS in only five moves (approximately 10 s, excluding metrology overheads) and an improved capacity for open-loop tracking during observations. Tilt-induced throughput losses have been halved by lengthening spines while maintaining excellent accuracy. New low-voltage multilayer piezo actuator technology has reduced a spine's peak drive amplitude from 150 V to <10 V, simplifying the control electronics design, reducing the system's overall size, and improving modularity. Every spine is now a truly independent unit with a dedicated drive circuit and no restrictions on the timing or direction of fibre motion.

  1. Resident choice and the survey process: the need for standardized observation and transparency.

    PubMed

    Schnelle, John F; Bertrand, Rosanna; Hurd, Donna; White, Alan; Squires, David; Feuerberg, Marvin; Hickey, Kelly; Simmons, Sandra F

    2009-08-01

    To describe a standardized observation protocol to determine if nursing home (NH) staff offer choice to residents during 3 morning activities of daily living (ADL) and compare the observational data with deficiency statements cited by state survey staff. Morning ADL care was observed in 20 NHs in 5 states by research staff using a standardized observation protocol. The number of observations in which choice was not offered was documented for 3 morning ADL care activities and compared with deficiency statements made by surveyors. Staff failed to offer choice during morning ADL care delivery for at least 1 of 3 ADL care activities in all 20 NHs. Observational data showed residents were not offered choice about when to get out of bed (11%), what to wear (25%), and breakfast dining location (39%). In comparison, survey staff issued only 2 deficiencies in all 20 NHs relevant to choice in the targeted ADL care activities, and neither deficiency was based on observational data. Survey interpretative guidelines instruct surveyors to observe if residents are offered choice during daily care provision, but standardized observation protocols are not provided to surveyors to make this determination. The use of a standardized observation protocol in the survey process similar to that used by research staff in this study would improve the accuracy and transparency of the survey process.

  2. Identification of Long Bone Fractures in Radiology Reports Using Natural Language Processing to Support Healthcare Quality Improvement

    PubMed Central

    Masino, Aaron J.; Casper, T. Charles; Dean, Jonathan M.; Bell, Jamie; Enriquez, Rene; Deakyne, Sara; Chamberlain, James M.; Alpern, Elizabeth R.

    2016-01-01

    Background Important information to support healthcare quality improvement is often recorded in free text documents such as radiology reports. Natural language processing (NLP) methods may help extract this information, but these methods have rarely been applied outside the research laboratories where they were developed. Objective To implement and validate NLP tools to identify long bone fractures for pediatric emergency medicine quality improvement. Methods Using freely available statistical software packages, we implemented NLP methods to identify long bone fractures from radiology reports. A sample of 1,000 radiology reports was used to construct three candidate classification models. A test set of 500 reports was used to validate the model performance. Blinded manual review of radiology reports by two independent physicians provided the reference standard. Each radiology report was segmented and word stem and bigram features were constructed. Common English “stop words” and rare features were excluded. We used 10-fold cross-validation to select optimal configuration parameters for each model. Accuracy, recall, precision and the F1 score were calculated. The final model was compared to the use of diagnosis codes for the identification of patients with long bone fractures. Results There were 329 unique word stems and 344 bigrams in the training documents. A support vector machine classifier with Gaussian kernel performed best on the test set with accuracy=0.958, recall=0.969, precision=0.940, and F1 score=0.954. Optimal parameters for this model were cost=4 and gamma=0.005. The three classification models that we tested all performed better than diagnosis codes in terms of accuracy, precision, and F1 score (diagnosis code accuracy=0.932, recall=0.960, precision=0.896, and F1 score=0.927). Conclusions NLP methods using a corpus of 1,000 training documents accurately identified acute long bone fractures from radiology reports. Strategic use of straightforward NLP methods, implemented with freely available software, offers quality improvement teams new opportunities to extract information from narrative documents. PMID:27826610
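
    The reported configuration translates almost directly into a few lines of scikit-learn. The pipeline below mirrors it in spirit (word/bigram counts into an RBF-kernel SVM with cost=4 and gamma=0.005); stemming is omitted and the two training reports are invented placeholders:

```python
# Sketch of the reported classifier: word/bigram counts -> RBF-kernel SVM.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

reports = [
    "acute transverse fracture of the femoral shaft",
    "no acute fracture or dislocation identified",
]
labels = [1, 0]                        # 1 = long bone fracture present

clf = make_pipeline(
    CountVectorizer(ngram_range=(1, 2), stop_words="english"),
    SVC(kernel="rbf", C=4, gamma=0.005),
)
clf.fit(reports, labels)
print(clf.predict(["spiral fracture of the tibia"]))
```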

  3. SINA: accurate high-throughput multiple sequence alignment of ribosomal RNA genes.

    PubMed

    Pruesse, Elmar; Peplies, Jörg; Glöckner, Frank Oliver

    2012-07-15

    In the analysis of homologous sequences, computation of multiple sequence alignments (MSAs) has become a bottleneck. This is especially troublesome for marker genes like the ribosomal RNA (rRNA) where already millions of sequences are publicly available and individual studies can easily produce hundreds of thousands of new sequences. Methods have been developed to cope with such numbers, but further improvements are needed to meet accuracy requirements. In this study, we present the SILVA Incremental Aligner (SINA) used to align the rRNA gene databases provided by the SILVA ribosomal RNA project. SINA uses a combination of k-mer searching and partial order alignment (POA) to maintain very high alignment accuracy while satisfying high throughput performance demands. SINA was evaluated in comparison with the commonly used high throughput MSA programs PyNAST and mothur. The three BRAliBase III benchmark MSAs could be reproduced with 99.3%, 97.6% and 96.1% accuracy. A larger benchmark MSA comprising 38,772 sequences could be reproduced with 98.9% and 99.3% accuracy using reference MSAs comprising 1000 and 5000 sequences. SINA was able to achieve higher accuracy than PyNAST and mothur in all performed benchmarks. Alignment of up to 500 sequences using the latest SILVA SSU/LSU Ref datasets as reference MSA is offered at http://www.arb-silva.de/aligner. This page also links to Linux binaries, user manual and tutorial. SINA is made available under a personal use license.

  4. Study on Classification Accuracy Inspection of Land Cover Data Aided by Automatic Image Change Detection Technology

    NASA Astrophysics Data System (ADS)

    Xie, W.-J.; Zhang, L.; Chen, H.-P.; Zhou, J.; Mao, W.-J.

    2018-04-01

    The purpose of carrying out national geographic conditions monitoring is to obtain information on surface changes caused by human social and economic activities, so that the geographic information can be used to offer better services for government, enterprises, and the public. Land cover data contain detailed geographic conditions information and have therefore been listed as one of the important achievements of the national geographic conditions monitoring project. At present, the main issue in the production of land cover data is how to improve classification accuracy. For land cover data quality inspection and acceptance, classification accuracy is also an important check point. So far, classification accuracy inspection in the project has been based mainly on human-computer interaction or manual inspection, which are time consuming and laborious. By harnessing automatic high-resolution remote sensing image change detection technology based on the ERDAS IMAGINE platform, this paper carried out a classification accuracy inspection test of land cover data in the project and presents a corresponding technical route, which includes data pre-processing, change detection, result output, and information extraction. The result of the quality inspection test shows the effectiveness of the technical route, which can meet the inspection needs for the two typical errors, namely missed and incorrect updates, effectively reduces the intensity of human-computer interaction inspection work for quality inspectors, and also provides a technical reference for the data production and quality control of land cover data.

  5. A Deep Learning Approach to Digitally Stain Optical Coherence Tomography Images of the Optic Nerve Head.

    PubMed

    Devalla, Sripad Krishna; Chin, Khai Sing; Mari, Jean-Martial; Tun, Tin A; Strouthidis, Nicholas G; Aung, Tin; Thiéry, Alexandre H; Girard, Michaël J A

    2018-01-01

    To develop a deep learning approach to digitally stain optical coherence tomography (OCT) images of the optic nerve head (ONH). A horizontal B-scan was acquired through the center of the ONH using OCT (Spectralis) for one eye of each of 100 subjects (40 healthy and 60 glaucoma). All images were enhanced using adaptive compensation. A custom deep learning network was then designed and trained with the compensated images to digitally stain (i.e., highlight) six tissue layers of the ONH. The accuracy of our algorithm was assessed (against manual segmentations) using the dice coefficient, sensitivity, specificity, intersection over union (IU), and accuracy. We studied the effect of compensation, number of training images, and performance comparison between glaucoma and healthy subjects. For images it had not previously assessed, our algorithm was able to digitally stain the retinal nerve fiber layer + prelamina, the RPE, all other retinal layers, the choroid, and the peripapillary sclera and lamina cribrosa. For all tissues, the dice coefficient, sensitivity, specificity, IU, and accuracy (mean) were 0.84 ± 0.03, 0.92 ± 0.03, 0.99 ± 0.00, 0.89 ± 0.03, and 0.94 ± 0.02, respectively. Our algorithm performed significantly better when compensated images were used for training (P < 0.001). Besides offering good reliability, digital staining also performed well on OCT images of both glaucoma and healthy individuals. Our deep learning algorithm can simultaneously stain the neural and connective tissues of the ONH, offering a framework to automatically measure multiple key structural parameters of the ONH that may be critical to improve glaucoma management.
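
    The headline segmentation metric here is the Dice coefficient, Dice = 2|A∩B|/(|A|+|B|), between a predicted tissue mask and the manual segmentation. A minimal check on toy masks:

```python
# Dice coefficient between a predicted mask and a manual segmentation.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

pred = np.zeros((64, 64), dtype=bool)
truth = np.zeros((64, 64), dtype=bool)
pred[20:40, 20:40] = True
truth[22:42, 22:42] = True
print(f"Dice = {dice(pred, truth):.3f}")   # 0.810 for this overlap
```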

  6. Accuracy of Genomic Prediction in Switchgrass (Panicum virgatum L.) Improved by Accounting for Linkage Disequilibrium

    PubMed Central

    Ramstein, Guillaume P.; Evans, Joseph; Kaeppler, Shawn M.; Mitchell, Robert B.; Vogel, Kenneth P.; Buell, C. Robin; Casler, Michael D.

    2016-01-01

    Switchgrass is a relatively high-yielding and environmentally sustainable biomass crop, but further genetic gains in biomass yield must be achieved to make it an economically viable bioenergy feedstock. Genomic selection (GS) is an attractive technology to generate rapid genetic gains in switchgrass, and meet the goals of a substantial displacement of petroleum use with biofuels in the near future. In this study, we empirically assessed prediction procedures for genomic selection in two different populations, consisting of 137 and 110 half-sib families of switchgrass, tested in two locations in the United States for three agronomic traits: dry matter yield, plant height, and heading date. Marker data were produced for the families’ parents by exome capture sequencing, generating up to 141,030 polymorphic markers with available genomic-location and annotation information. We evaluated prediction procedures that varied not only by learning schemes and prediction models, but also by the way the data were preprocessed to account for redundancy in marker information. More complex genomic prediction procedures were generally not significantly more accurate than the simplest procedure, likely due to limited population sizes. Nevertheless, a highly significant gain in prediction accuracy was achieved by transforming the marker data through a marker correlation matrix. Our results suggest that marker-data transformations and, more generally, the account of linkage disequilibrium among markers, offer valuable opportunities for improving prediction procedures in GS. Some of the achieved prediction accuracies should motivate implementation of GS in switchgrass breeding programs. PMID:26869619
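
    The transformation the abstract highlights, passing the marker matrix through a marker correlation matrix so that linked markers share information, can be sketched as a single matrix product. The exact published procedure may differ in scaling details; the data here are synthetic:

```python
# Marker-data transformation through the marker correlation matrix (LD proxy).
import numpy as np

rng = np.random.default_rng(5)
Z = rng.normal(size=(137, 500))      # families x markers, standardized genotypes

R = np.corrcoef(Z, rowvar=False)     # 500 x 500 marker correlation matrix
Z_t = Z @ R                          # transformed predictors for the GS model
print(Z_t.shape)
```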

  7. Usefulness of composite methionine-positron emission tomography/3.0-tesla magnetic resonance imaging to detect the localization and extent of early-stage Cushing adenoma.

    PubMed

    Ikeda, Hidetoshi; Abe, Takehiko; Watanabe, Kazuo

    2010-04-01

    Fifty to eighty percent of Cushing disease is diagnosed by typical endocrine responses. Recently, the number of diagnoses of Cushing disease without typical Cushing syndrome has been increasing; therefore, improving ways to determine the localization of the adenoma and making an early diagnosis is important. This study was undertaken to determine the present diagnostic accuracy for Cushing microadenoma and to compare the differences in diagnostic accuracy between MR imaging and PET/MR imaging. During the past 3 years the authors analyzed the diagnostic accuracy in a series of 35 patients with Cushing adenoma that was verified by surgical pituitary exploration. All 35 cases of Cushing disease, including 20 cases of "overt" and 15 cases of "preclinical" Cushing disease, were studied. Superconductive MR images (1.5 or 3.0 T) and composite images from FDG-PET or methionine (MET)-PET and 3.0-T MR imaging were compared with the localization of adenomas verified by surgery. The diagnostic accuracy of superconductive MR imaging for detecting the localization of Cushing microadenoma was only 40%. The causes of unsatisfactory results for superconductive MR imaging were false-negative results (10 cases), false-positive results (6 cases), and instances of double pituitary adenomas (3 cases). In contrast, the accuracy of microadenoma localization using MET-PET/3.0-T MR imaging was 100% and that of FDG-PET/3.0-T MR imaging was 73%. Moreover, the adenoma location was better delineated on MET-PET/MR images than on FDG-PET/MR images. There was no significant difference in maximum standard uptake value of adenomas evaluated by MET-PET between preclinical Cushing disease and overt Cushing disease. Composite MET-PET/3.0-T MR imaging is useful for the improvement of the delineation of Cushing microadenoma and offers high-quality detectability for early-stage Cushing adenoma.

  8. Performance of two updated blood glucose monitoring systems: an evaluation following ISO 15197:2013.

    PubMed

    Pleus, Stefan; Baumstark, Annette; Rittmeyer, Delia; Jendrike, Nina; Haug, Cornelia; Freckmann, Guido

    2016-05-01

    Objective For patients with diabetes, regular self-monitoring of blood glucose (SMBG) is essential to ensure adequate glycemic control. Therefore, accurate and reliable blood glucose measurements with SMBG systems are necessary. The international standard ISO 15197 describes requirements for SMBG systems, such as limits within which 95% of glucose results have to fall to reach acceptable system accuracy. The 2013 version of this standard sets higher demands, especially regarding system accuracy, than the currently still valid edition. ISO 15197 can be applied by manufacturers to receive a CE mark for their system. Research design and methods This study was an accuracy evaluation following ISO 15197:2013 section 6.3 of two recently updated SMBG systems (Contour and Contour TS; Bayer Consumer Care AG, Basel, Switzerland) with an improved algorithm to investigate whether the systems fulfill the requirements of the new standard. For this purpose, capillary blood samples of approximately 100 participants were measured with three test strip lots of both systems and deviations from glucose values obtained with a hexokinase-based comparison method (Cobas Integra 400 plus; Roche Instrument Center, Rotkreuz, Switzerland) were determined. Percentages of values within the acceptance criteria of ISO 15197:2013 were calculated. This study was registered at clinicaltrials.gov (NCT02358408). Main outcome Both updated systems fulfilled the system accuracy requirements of ISO 15197:2013 as 98.5% to 100% of the results were within the stipulated limits. Furthermore, all results were within the clinically non-critical zones A and B of the consensus error grid for type 1 diabetes. Conclusions The technical improvement of the systems ensured compliance with ISO 15197 in the hands of healthcare professionals even in its more stringent 2013 version. Alternative presentation of system accuracy results in radar plots provides additional information with certain advantages. In addition, the surveillance error grid offers a modern tool to assess a system's clinical performance.

  9. Comparison of modal identification techniques using a hybrid-data approach

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.

    1986-01-01

    Modal identification of seemingly simple structures, such as the generic truss, is often surprisingly difficult in practice due to high modal density, nonlinearities, and other nonideal factors. Under these circumstances, different data analysis techniques can generate substantially different results. The initial application of a new hybrid-data method for studying the performance characteristics of various identification techniques with such data is summarized. This approach offers new pieces of information for the system identification researcher. First, it allows actual experimental data to be used in the studies, while maintaining the traditional advantage of using simulated data. That is, the identification technique under study is forced to cope with the complexities of real data, yet the performance can be measured unquestionably for the artificial modes because their true parameters are known. Second, the accuracy achieved for the true structural modes in the data can be estimated from the accuracy achieved for the artificial modes if the results show similar characteristics. This similarity occurred in the study, for example, for a weak structural mode near 56 Hz. It may even be possible, eventually, to use the error information from the artificial modes to improve the identification accuracy for the structural modes.

  10. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
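
    The polynomial approach can be shown in miniature: each element's isotope abundances form a small probability polynomial, and a molecule's coarse-grained distribution is the repeated convolution of those polynomials. This is only the textbook idea behind such tools, at nominal-mass resolution, not MIDAs's adjustable-accuracy implementation:

```python
# Coarse-grained isotopic distribution by repeated convolution (np.convolve).
import numpy as np

carbon = np.array([0.9893, 0.0107])          # 12C, 13C abundances
hydrogen = np.array([0.999885, 0.000115])    # 1H, 2H abundances

def molecule(composition):
    """Convolve element polynomials, e.g. [(carbon, 6), (hydrogen, 12)]."""
    result = np.array([1.0])
    for dist, count in composition:
        for _ in range(count):
            result = np.convolve(result, dist)
    return result

c6h12 = molecule([(carbon, 6), (hydrogen, 12)])
print(c6h12[:4])     # P(M), P(M+1), P(M+2), P(M+3)
```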

  12. In Vivo Investigation of Breast Cancer Progression by Use of an Internal Control

    PubMed Central

    Baeten, John; Haller, Jodi; Shih, Helen; Ntziachristos, Vasilis

    2009-01-01

    Optical imaging of breast cancer has been considered for detecting functional and molecular characteristics of diseases in clinical and preclinical settings. Applied to laboratory research, photonic investigations offer a highly versatile tool for preclinical imaging and drug discovery. A particular advantage of the optical method is the availability of multiple spectral bands for performing imaging. Herein, we capitalize on this feature to demonstrate how it is possible to use different wavelengths to offer internal controls and significantly improve the observation accuracy in molecular imaging applications. In particular, we show the independent in vivo detection of cysteine proteases along with tumor permeability and interstitial volume measurements using a dual-wavelength approach. To generate results with a view toward clinically geared studies, a transgenic Her2/neu mouse model that spontaneously developed mammary tumors was used. In vivo findings were validated against conventional ex vivo tests such as histology and Western blot analyses. By correcting for biodistribution parameters, the dual-wavelength method increases the accuracy of molecular observations by separating true molecular target from probe biodistribution. As such, the method is highly appropriate for molecular imaging studies where often probe delivery and target presence are not independently assessed. On the basis of these findings, we propose the dual-wavelength/normalization approach as an essential method for drug discovery and preclinical imaging studies. PMID:19242603

  13. Understanding dental CAD/CAM for restorations--dental milling machines from a mechanical engineering viewpoint. Part B: labside milling machines.

    PubMed

    Lebon, Nicolas; Tapie, Laurent; Duret, Francois; Attal, Jean-Pierre

    2016-01-01

    Nowadays, dental numerically controlled (NC) milling machines are available for dental laboratories (labside solution) and dental production centers. This article provides a mechanical engineering perspective on NC milling machines to help dental technicians understand the role of technology in digital dentistry practice. The technical and economic criteria are described for four labside and two production-center dental NC milling machines available on the market. The technical criteria focus on the capacity of the machines' embedded technologies to mill prosthetic materials and various restoration shapes. The economic criteria focus on investment cost and interoperability with third-party software. The clinical relevance of the technology is discussed in terms of the accuracy and integrity of the restoration. It can be asserted that dental production-center milling machines offer a wider range of materials and restoration shapes than labside solutions, while labside solutions offer a wider range than chairside solutions. The accuracy and integrity of restorations may be improved as a function of the embedded technologies provided. However, the more complex the technical solutions available, the more skilled the user must be. Investment cost and interoperability with third-party software increase with the quality of the embedded technologies implemented. Each private dental practice may decide which fabrication option to use depending on the scope of the practice.

  14. FloWave.US: validated, open-source, and flexible software for ultrasound blood flow analysis.

    PubMed

    Coolbaugh, Crystal L; Bush, Emily C; Caskey, Charles F; Damon, Bruce M; Towse, Theodore F

    2016-10-01

    Automated software improves the accuracy and reliability of blood velocity, vessel diameter, blood flow, and shear rate ultrasound measurements, but existing software offers limited flexibility to customize and validate analyses. We developed FloWave.US, open-source software to automate ultrasound blood flow analysis, and demonstrated the validity of its blood velocity (aggregate relative error, 4.32%) and vessel diameter (0.31%) measures with a skeletal muscle ultrasound flow phantom. Compared with a commercial, manual analysis software program, FloWave.US produced equivalent in vivo cardiac cycle time-averaged mean (TAMean) velocities at rest and following a 10-s muscle contraction (mean bias <1 pixel for both conditions). Automated analysis of ultrasound blood flow data was 9.8 times faster than the manual method. Finally, a case study of a lower extremity muscle contraction experiment highlighted the ability of FloWave.US to measure small fluctuations in TAMean velocity, vessel diameter, and mean blood flow at specific time points in the cardiac cycle. In summary, the collective features of our newly designed software (accuracy, reliability, reduced processing time, cost-effectiveness, and flexibility) offer advantages over existing proprietary options. Further, public distribution of FloWave.US allows researchers to easily access and customize code to adapt ultrasound blood flow analysis to a variety of vascular physiology applications. Copyright © 2016 the American Physiological Society.
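
    The flow quantity such tools report follows from two of the validated measures: mean flow is the time-averaged mean velocity multiplied by the vessel cross-sectional area, Q = TAMean · πr². A worked example with illustrative values (not FloWave.US output):

```python
# Mean blood flow from TAMean velocity and vessel diameter: Q = v * pi * r^2.
import math

tamean_cm_s = 15.0                 # time-averaged mean velocity, cm/s
diameter_cm = 0.4                  # vessel diameter, cm

area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
flow_ml_min = tamean_cm_s * area_cm2 * 60.0   # cm^3/s -> mL/min
print(f"{flow_ml_min:.1f} mL/min")            # ~113 mL/min here
```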

  15. A Practice Improvement Education Program Using a Mentored Approach to Improve Nursing Facility Depression Care-Preliminary Data.

    PubMed

    Chodosh, Joshua; Price, Rachel M; Cadogan, Mary P; Damron-Rodriguez, JoAnn; Osterweil, Dan; Czerwinski, Alfredo; Tan, Zaldy S; Merkin, Sharon S; Gans, Daphna; Frank, Janet C

    2015-11-01

    Depression is common in nursing facility residents. Depression data obtained using the Minimum Data Set (MDS) 3.0 offer opportunities for improving diagnostic accuracy and care quality. How best to integrate MDS 3.0 and other data into quality improvement (QI) activity is untested. The objective was to increase nursing home (NH) capability in using QI processes and to improve depression assessment and management through focused mentorship and team building. This was a 6-month intervention with five components: facilitated collection of MDS 3.0 nine-item Patient Health Questionnaire (PHQ-9) and medication data for diagnostic interpretation; education and modeling on QI approaches, team building, and nonpharmacological depression care; mentored team meetings; educational webinars; and technical assistance. PHQ-9 and medication data were collected at baseline and 6 and 9 months. Progress was measured using team participation measures, attitude and care process self-appraisal, mentor assessments, and resident depression outcomes. Five NHs established interprofessional teams that included nursing (44.1%), social work (20.6%), physicians (8.8%), and other disciplines (26.5%). Members participated in 61% of eight offered educational meetings (three onsite mentored team meetings and five webinars). Competency self-ratings improved on four depression care measures (P = .05 to <.001). Mentors observed improvement in team process and enthusiasm during team meetings. For 336 residents with PHQ-9 and medication data, depression scores did not change while medication use declined, from 37.2% of residents at baseline to 31.0% at 9 months (P < .001). This structured mentoring program improved care processes, achieved medication reductions, and was well received. Application to other NH-prevalent syndromes is possible. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  16. Ka-band study: 1988

    NASA Technical Reports Server (NTRS)

    Layland, J. W.; Horttor, R. L.; Clauss, R. C.; Wilcher, J. H.; Wallace, R. J.; Mudgway, D. J.

    1989-01-01

    The Ka-band study team was chartered in late 1987 to bring together all the planning elements for establishing 32 GHz (Ka-band) as the primary downlink frequency for deep-space operation, and to provide a stable baseline from which to pursue that development. This article summarizes the results of that study at its conclusion in mid-1988, and corresponds to material presented to NASA's Office of Space Operations on July 14, 1988. For a variety of reasons, Ka-band is the right next major step in deep-space communications. It offers improved radio metric accuracy through reduced plasma sensitivity and increased bandwidth. Because of these improvements, it offers the opportunity to reduce costs in the flight radio system or in the DSN by allocating part of the overall benefits of Ka-band to this cost reduction. A mission scenario is being planned that can drive at least two and possibly all three of the DSN subnets to provide a Ka-band downlink capability by the turn of the century. The implementation scenario devised by the study team is believed to be feasible within reasonable resource expectations, and capable of providing the needed upgrade as a natural follow-on to the technology development which is already underway.

  17. Matching forensic sketches to mug shot photos.

    PubMed

    Klare, Brendan F; Li, Zhifeng; Jain, Anil K

    2011-03-01

    The problem of matching a forensic sketch to a gallery of mug shot images is addressed in this paper. Previous research in sketch matching only offered solutions to matching highly accurate sketches that were drawn while looking at the subject (viewed sketches). Forensic sketches differ from viewed sketches in that they are drawn by a police sketch artist using the description of the subject provided by an eyewitness. To identify forensic sketches, we present a framework called local feature-based discriminant analysis (LFDA). In LFDA, we individually represent both sketches and photos using SIFT feature descriptors and multiscale local binary patterns (MLBP). Multiple discriminant projections are then used on partitioned vectors of the feature-based representation for minimum distance matching. We apply this method to match a data set of 159 forensic sketches against a mug shot gallery containing 10,159 images. Compared to a leading commercial face recognition system, LFDA offers substantial improvements in matching forensic sketches to the corresponding face images. We were able to further improve the matching performance using race and gender information to reduce the target gallery size. Additional experiments demonstrate that the proposed framework leads to state-of-the-art accuracies when matching viewed sketches.
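
    The matching stage, discriminant projection followed by minimum-distance search against the gallery, can be sketched with generic tools. Below, scikit-learn's LinearDiscriminantAnalysis stands in for LFDA's multiple projections, and random vectors stand in for SIFT/MLBP descriptors:

```python
# Toy discriminant-projection matcher: project, then nearest-gallery match.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(6)
n_id = 20
gallery = rng.normal(size=(n_id, 128))                    # one photo per identity
sketches = gallery + rng.normal(0.0, 0.3, gallery.shape)  # paired sketches

X = np.vstack([gallery, sketches])
y = np.tile(np.arange(n_id), 2)                           # identity labels
Xp = LinearDiscriminantAnalysis(n_components=10).fit_transform(X, y)
g, s = Xp[:n_id], Xp[n_id:]

# Minimum-distance matching: nearest gallery entry for each probe sketch
match = np.argmin(((s[:, None, :] - g[None, :, :]) ** 2).sum(-1), axis=1)
print((match == np.arange(n_id)).mean())                  # rank-1 accuracy
```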

  18. Assimilation of passive and active CCI soil moisture products into hydrological modelling: an intercomparison study in Europe

    NASA Astrophysics Data System (ADS)

    Maggioni, V.; Massari, C.; Camici, S.; Brocca, L.; Marchesini, I.

    2017-12-01

    Soil moisture (SM) is a key variable in rainfall-runoff partitioning since it acts on the main hydrological processes taking place within a catchment. Modeling SM is often a difficult task due to its large variability at different temporal and spatial scales. Ground soil moisture measurements are a valuable tool for improving runoff prediction but are often limited and suffer from spatial representativeness issues. Remotely sensed observations offer a new source of data able to cope with these issues, thus opening new possibilities for improving flood simulations worldwide. Today, several different SM products are available with increased accuracy with respect to the past. Among the most interesting are those derived from the Climate Change Initiative (CCI), which offers the most complete and most consistent global SM data record based on active and passive microwave sensors. Thanks to the combination of multiple sensors within active, passive, and combined active+passive products, the CCI SM is expected to provide a significant benefit for the improvement of rainfall-runoff simulations through data assimilation. However, previous studies have shown that the success of the assimilation is related not only to the accuracy of the observations but also to the specific climate, the catchment's physical and hydrological characteristics, and the many necessary choices related to the assimilation technique. How these choices, along with the type of SM observations (i.e., passive or active), determine the success or failure of the assimilation is still not clear. In this study, based on a large dataset of catchments covering a large part of Europe, we assimilated satellite SM observations from the passive and active CCI SM products into the Modello Idrologico Semi-Distribuito in Continuo (MISDc, Brocca et al. 2011). Rainfall and temperature data were collected from the European Climate Assessment & Dataset (E-OBS), while discharge data were obtained from the Global Runoff Data Centre (GRDC). Preliminary results show a general improvement of the hydrological simulations for catchments located in Mediterranean areas, especially for the active product, while lower performance is obtained at northern latitudes due to the presence of snow and ice.

  19. Good Practices in Free-energy Calculations

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher

    2013-01-01

    As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices, are followed. For the most part, the theory upon which these good practices rely has been known for many years, but is often overlooked, or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations will be reviewed, demonstrating that, at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, could markedly improve the efficiency and accuracy of free-energy calculations without incurring any additional computational expense.
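
    As a concrete instance of the bidirectional monitoring advocated above, the exponential-averaging (Zwanzig) estimator, ΔF = -kT ln⟨exp(-ΔU/kT)⟩, can be evaluated in both directions; agreement between the two indicates adequate phase-space overlap. The samples below are synthetic Gaussians chosen to be mutually consistent:

```python
# Bidirectional free-energy perturbation with the Zwanzig estimator.
import numpy as np

kT = 1.0
rng = np.random.default_rng(7)
dU_fwd = rng.normal(2.0, 1.0, 100_000)    # U1 - U0 sampled in state 0
dU_rev = rng.normal(-1.0, 1.0, 100_000)   # U0 - U1 sampled in state 1

dF_fwd = -kT * np.log(np.mean(np.exp(-dU_fwd / kT)))
dF_rev = +kT * np.log(np.mean(np.exp(-dU_rev / kT)))   # sign flipped back
print(dF_fwd, dF_rev)    # both ~1.5 when forward and reverse runs agree
```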

  20. Radial k-t SPIRiT: autocalibrated parallel imaging for generalized phase-contrast MRI.

    PubMed

    Santelli, Claudio; Schaeffter, Tobias; Kozerke, Sebastian

    2014-11-01

    To extend SPIRiT to additionally exploit temporal correlations for highly accelerated generalized phase-contrast MRI and to compare the performance of the proposed radial k-t SPIRiT method relative to frame-by-frame SPIRiT and radial k-t GRAPPA reconstruction for velocity and turbulence mapping in the aortic arch. Free-breathing navigator-gated two-dimensional radial cine imaging with three-directional multi-point velocity encoding was implemented and fully sampled data were obtained in the aortic arch of healthy volunteers. Velocities were encoded with three different first gradient moments per axis to permit quantification of mean velocity and turbulent kinetic energy. Velocity and turbulent kinetic energy maps from up to 14-fold undersampled data were compared for k-t SPIRiT, frame-by-frame SPIRiT, and k-t GRAPPA relative to the fully sampled reference. Using k-t SPIRiT, improvements in magnitude and velocity reconstruction accuracy were found. Temporally resolved magnitude profiles revealed a reduction in spatial blurring with k-t SPIRiT compared with frame-by-frame SPIRiT and k-t GRAPPA for all velocity encodings, leading to improved estimates of turbulent kinetic energy. k-t SPIRiT offers improved reconstruction accuracy at high radial undersampling factors and hence facilitates routine use of generalized phase-contrast MRI. Copyright © 2013 Wiley Periodicals, Inc.

  1. MRI - 3D Ultrasound - X-ray Image Fusion with Electromagnetic Tracking for Transendocardial Therapeutic Injections: In-vitro Validation and In-vivo Feasibility

    PubMed Central

    Hatt, Charles R.; Jain, Ameet K.; Parthasarathy, Vijay; Lang, Andrew; Raval, Amish N.

    2014-01-01

    Myocardial infarction (MI) is one of the leading causes of death in the world. Small animal studies have shown that stem-cell therapy offers dramatic functional improvement post-MI. An endomyocardial catheter injection approach to therapeutic agent delivery has been proposed to improve efficacy through increased cell retention. Accurate targeting is critical for reaching areas of greatest therapeutic potential while avoiding a life-threatening myocardial perforation. Multimodal image fusion has been proposed as a way to improve these procedures by augmenting traditional intra-operative imaging modalities with high resolution pre-procedural images. Previous approaches have suffered from a lack of real-time tissue imaging and dependence on X-ray imaging to track devices, leading to increased ionizing radiation dose. In this paper, we present a new image fusion system for catheter-based targeted delivery of therapeutic agents. The system registers real-time 3D echocardiography, magnetic resonance, X-ray, and electromagnetic sensor tracking within a single flexible framework. All system calibrations and registrations were validated and found to have target registration errors less than 5 mm in the worst case. Injection accuracy was validated in a motion enabled cardiac injection phantom, where targeting accuracy ranged from 0.57 to 3.81 mm. Clinical feasibility was demonstrated with in-vivo swine experiments, where injections were successfully made into targeted regions of the heart. PMID:23561056

  2. Progress on glass ceramic ZERODUR enabling nanometer precision

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Kunisch, Clemens; Nieder, Johannes; Weber, Peter; Westerhoff, Thomas

    2016-03-01

    The semiconductor industry is making continuous progress in shrinking feature size, developing technologies and processes to achieve feature sizes below 10 nm. The required overlay specification for successful production is in the range of one nanometer or even smaller. Consequently, materials designed into metrology systems of exposure or inspection tools need to fulfill ever tighter specifications on the coefficient of thermal expansion (CTE). The glass ceramic ZERODUR® is a well-established material for critical components of microlithography wafer steppers and is offered with an extremely low coefficient of thermal expansion, at the tightest tolerance available on the market. SCHOTT is continuously improving its manufacturing processes and its methods to measure and characterize the CTE behavior of ZERODUR®. This paper focuses on the "Advanced Dilatometer" for determination of the CTE, developed at SCHOTT in recent years and introduced into production in Q1 2015. The achievements in improving absolute CTE measurement accuracy and reproducibility are described in detail and compared to the CTE measurement accuracy reported by the Physikalisch-Technische Bundesanstalt (PTB), the National Metrology Institute of Germany. CTE homogeneity is of the highest importance for achieving nanometer precision at larger scales. Additionally, the paper presents data on short-scale CTE homogeneity and its improvement over the last two years. The data presented in this paper explain the capability of ZERODUR® to enable the extreme precision required for future generations of lithography equipment and processes.

  3. An excellent navigation system and experience in craniomaxillofacial navigation surgery: a double-center study

    PubMed Central

    Dai, Jiewen; Wu, Jinyang; Wang, Xudong; Yang, Xudong; Wu, Yunong; Xu, Bing; Shi, Jun; Yu, Hongbo; Cai, Min; Zhang, Wenbin; Zhang, Lei; Sun, Hao; Shen, Guofang; Zhang, Shilei

    2016-01-01

    Numerous problems regarding craniomaxillofacial navigation surgery are not well understood. In this study, we performed a double-center clinical study to quantitatively evaluate the characteristics of our navigation system and our experience in craniomaxillofacial navigation surgery. Fifty-six patients with craniomaxillofacial disease were included and randomly divided into experimental (using our AccuNavi-A system) and control (using the Stryker system) groups to compare the surgical effects. The results revealed that the average pre-operative planning time was 32.32 min vs 29.74 min between the experimental and control groups, respectively (p > 0.05). The average operative time was 295.61 min vs 233.56 min (p > 0.05). The point registration orientation accuracy was 0.83 mm vs 0.92 mm. The maximal average preoperative navigation orientation accuracy was 1.03 mm vs 1.17 mm. The maximal average persistent navigation orientation accuracy was 1.15 mm vs 0.09 mm. The maximal average navigation orientation accuracy after registration recovery was 1.15 mm vs 1.39 mm between the experimental and control groups. All patients healed, and their function and profile improved. These findings demonstrate that, although surgeons should consider the patients’ time and monetary costs, our navigation surgery system and experience can offer an accurate guide during a variety of craniomaxillofacial surgeries. PMID:27305855

  4. Highly sensitive distributed birefringence measurements based on a two-pulse interrogation of a dynamic Brillouin grating

    NASA Astrophysics Data System (ADS)

    Soto, Marcelo A.; Denisov, Andrey; Angulo-Vinuesa, Xabier; Martin-Lopez, Sonia; Thévenaz, Luc; Gonzalez-Herraez, Miguel

    2017-04-01

A method for distributed birefringence measurements is proposed based on the interference pattern generated by the interrogation of a dynamic Brillouin grating (DBG) using two short consecutive optical pulses. Compared to existing DBG interrogation techniques, the method here offers an improved sensitivity to birefringence changes thanks to the interferometric effect generated by the reflections of the two pulses. Experimental results demonstrate the possibility to obtain the longitudinal birefringence profile of a 20 m-long Panda fibre with an accuracy of 10⁻⁸ using 16 averages and 30 cm spatial resolution. The method enables sub-metric and highly-accurate distributed temperature and strain sensing.

  5. A collimated focused ultrasound beam of high acoustic transmission and minimum diffraction achieved by using a lens with subwavelength structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Zhou; Tu, Juan; Cheng, Jianchun

An acoustic focusing lens incorporating periodically aligned subwavelength grooves corrugated on its spherical surface has been developed. It is demonstrated theoretically and experimentally that acoustic focusing with this lens suppresses the relative side-lobe amplitudes, enhances the focal gain, and minimizes shifting of the focus. Coupled with a planar ultrasound transducer, the lens can generate an ultrasound beam with enhanced acoustic transmission and collimation, which offers the capability of improving the safety, efficiency, and accuracy of targeted surgery implemented by high-intensity focused ultrasound.

  6. Robust Bayesian Fluorescence Lifetime Estimation, Decay Model Selection and Instrument Response Determination for Low-Intensity FLIM Imaging

    PubMed Central

    Rowley, Mark I.; Coolen, Anthonius C. C.; Vojnovic, Borivoj; Barber, Paul R.

    2016-01-01

We present novel Bayesian methods for the analysis of exponential decay data that exploit the evidence carried by every detected decay event and enable robust extension to advanced processing. Our algorithms are presented in the context of fluorescence lifetime imaging microscopy (FLIM), and particular attention has been paid to modeling the time-domain system (based on time-correlated single photon counting) with unprecedented accuracy. We present estimates of decay parameters for mono- and bi-exponential systems, offering up to a factor of two improvement in accuracy compared to previous popular techniques. Results of the analysis of synthetic and experimental data are presented, and areas where the superior precision of our techniques can be exploited in Förster Resonance Energy Transfer (FRET) experiments are described. Furthermore, we demonstrate two advanced processing methods: decay model selection to choose between differing models such as mono- and bi-exponential, and the simultaneous estimation of instrument and decay parameters. PMID:27355322
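To give a flavor of the Bayesian approach: for a mono-exponential decay with negligible background and an ideal instrument response, photon arrival times are exponentially distributed, and a conjugate Gamma prior on the decay rate yields a closed-form posterior. A toy sketch of that estimate, not the authors' full TCSPC model (which also accounts for the instrument response and laser repetition period):

```python
import numpy as np

rng = np.random.default_rng(0)
true_tau = 2.5                       # ns, fluorescence lifetime
t = rng.exponential(true_tau, 500)   # photon arrival times after the pulse

# Exponential likelihood with rate k = 1/tau; a Gamma(a0, b0) prior on k
# gives a Gamma(a0 + n, b0 + sum(t)) posterior.
a0, b0 = 1.0, 1.0                    # weak prior (assumed, for illustration)
a_post, b_post = a0 + t.size, b0 + t.sum()

k_mean = a_post / b_post             # posterior mean of the decay rate
print(f"estimated lifetime ~ {1.0 / k_mean:.2f} ns")  # close to 2.5 ns
```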

  7. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically within the stock market index. Moreover, stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance on financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW achieves relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to the traditional Holt-Winters method.
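The hybrid works by decomposing the series into intrinsic mode functions (IMFs) plus a residue, forecasting each component separately with Holt-Winters, and summing the component forecasts. A minimal sketch under the assumption that the PyEMD (EMD-signal) and statsmodels packages are used; the paper's exact configuration is not specified here:

```python
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def emd_hw_forecast(y, horizon=5):
    """Forecast y by summing Holt-Winters forecasts of each EMD component."""
    components = EMD()(y)   # one component per row; rows sum back to y
    forecast = np.zeros(horizon)
    for c in components:
        fit = ExponentialSmoothing(c, trend="add").fit()
        forecast += fit.forecast(horizon)
    return forecast

# Hypothetical daily index series: drifting trend plus cycle plus noise.
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0.1, 1.0, 400)) + 10 * np.sin(np.arange(400) / 20)
print(emd_hw_forecast(y, horizon=5))
```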

  8. High accuracy broadband infrared spectropolarimetry

    NASA Astrophysics Data System (ADS)

    Krishnaswamy, Venkataramanan

Mueller matrix spectroscopy, or spectropolarimetry, combines conventional spectroscopy with polarimetry, providing more information than can be gleaned from spectroscopy alone. Experimental studies on the infrared polarization properties of materials covering a broad spectral range have been scarce due to the lack of available instrumentation. This dissertation aims to fill the gap through the design, development, calibration and testing of a broadband Fourier Transform Infra-Red (FT-IR) spectropolarimeter. The instrument operates over the 3-12 μm waveband and offers better overall accuracy compared to previous-generation instruments. Accurate calibration of a broadband spectropolarimeter is a non-trivial task due to the inherent complexity of the measurement process. An improved calibration technique is proposed for the spectropolarimeter, and numerical simulations are conducted to study the effectiveness of the proposed technique. Insights into the geometrical structure of the polarimetric measurement matrix are provided to aid further research towards global optimization of Mueller matrix polarimeters. A high performance infrared wire-grid polarizer is characterized using the spectropolarimeter. Mueller matrix spectrum measurements on penicillin and pine pollen are also presented.
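For context on the "polarimetric measurement matrix" mentioned above: each polarimeter reading is a scalar intensity that is linear in the sample's Mueller matrix, so a full acquisition stacks into one linear system. A sketch of the generic formulation in standard notation (not the instrument's specific calibration):

```latex
% q-th measurement: generator Stokes vector s_q, analyzer row vector a_q^T
I_q = \mathbf{a}_q^{\mathsf T}\,\mathbf{M}\,\mathbf{s}_q
    = \left(\mathbf{s}_q^{\mathsf T} \otimes \mathbf{a}_q^{\mathsf T}\right)\operatorname{vec}(\mathbf{M})
% Stacking all Q measurements gives I = W \operatorname{vec}(M), where W is
% the Q x 16 polarimetric measurement matrix; M follows by (pseudo)inversion,
% and calibration amounts to determining W accurately.
```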

  9. The sensitivity of radiography of the postoperative stomach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ott, D.J.; Munitz, H.A.; Gelfand, D.W.

    1982-09-01

The results of radiology and endoscopy were compared in 140 patients who had undergone gastric surgery for ulcer disease. Of 74 patients who were examined with single-contrast radiography, 37 had abnormalities that were demonstrated endoscopically. The radiographic sensitivities in these patients were: gastritis 2/22 (9%); ulcer 3/5 (60%); obstruction 8/8 (100%); and miscellaneous abnormalities 2/2 (100%). The predictive accuracy of a diagnosis of ulcer was 38%. Of the 66 patients who were examined with double-contrast radiography, 33 abnormalities were found with endoscopy. The radiographic sensitivities were: gastritis 3/13 (23%); ulcer 7/10 (70%); obstruction 4/4 (100%); and miscellaneous abnormalities 6/6 (100%). The predictive accuracy of a diagnosis of ulcer was 44%. Radiology appears to be unreliable in diagnosing gastritis and recurrent ulceration in the postoperative stomach. The double-contrast technique does not offer significant improvement over the single-contrast method in evaluating these postoperative problems.
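The reported figures follow the usual 2x2 definitions: sensitivity is the fraction of endoscopically confirmed abnormalities that radiography detected, and "predictive accuracy" here is the positive predictive value of a radiographic diagnosis. A small sketch with the single-contrast ulcer numbers; the false-positive count is a hypothetical split chosen to reproduce the 38% figure:

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    return tp / (tp + fp)

# Single-contrast ulcer detection: 3 of 5 endoscopic ulcers seen (60%).
print(f"sensitivity = {sensitivity(tp=3, fn=2):.0%}")
# A PPV of ~38% is consistent with about 5 false-positive ulcer calls
# alongside the 3 true positives (hypothetical breakdown).
print(f"PPV = {positive_predictive_value(tp=3, fp=5):.1%}")
```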

  10. Can digital pathology result in cost savings? A financial projection for digital pathology implementation at a large integrated health care organization.

    PubMed

    Ho, Jonhan; Ahlers, Stefan M; Stratman, Curtis; Aridor, Orly; Pantanowitz, Liron; Fine, Jeffrey L; Kuzmishin, John A; Montalto, Michael C; Parwani, Anil V

    2014-01-01

Digital pathology offers potential improvements in workflow and interpretive accuracy. Although digital pathology is currently in common use for research and education, its clinical use has been limited to niche applications such as frozen sections and remote second opinion consultations. This is mainly due to regulatory hurdles, but also to a dearth of data supporting a positive economic cost-benefit. Large-scale adoption of digital pathology and the integration of digital slides into the routine anatomic/surgical pathology "slide-less" clinical workflow will occur only if digital pathology offers a quantifiable benefit, which could come in the form of more efficient and/or higher quality care. As a large academic-based health care organization expecting to adopt digital pathology for primary diagnosis upon its regulatory approval, our institution estimated potential operational cost savings offered by the implementation of an enterprise-wide digital pathology system (DPS). Projected cost savings were calculated for the first 5 years following implementation of a DPS based on operational data collected from the pathology department. Projected savings were based on two factors: (1) productivity and lab consolidation savings; and (2) avoided treatment costs due to improvements in the accuracy of cancer diagnoses among nonsubspecialty pathologists. Detailed analyses of incremental treatment costs due to interpretive errors, resulting in either a false positive or false negative diagnosis, were performed for melanoma and breast cancer and extrapolated to 10 other common cancers. When phased in over 5 years, total cost savings based on anticipated improvements in pathology productivity and histology lab consolidation were estimated at $12.4 million for an institution with 219,000 annual accessions. The main contributing factors to these savings were gains in pathologist clinical full-time equivalent capacity impacted by improved pathologist productivity and workload distribution. Expanding the current localized specialty sign-out model to an enterprise-wide shared general/subspecialist sign-out model could potentially reduce costs of incorrect treatment by $5.4 million. These calculations were based on annual over- and under-treatment costs for breast cancer and melanoma estimated to be approximately $26,000 and $11,000/case, respectively, and extrapolated to $21,500/case for other cancer types. The projected 5-year total cost savings for our large academic-based health care organization upon fully implementing a DPS were approximately $18 million. If the costs of digital pathology acquisition and implementation do not exceed this value, the return on investment becomes attractive to hospital administrators. Furthermore, improved patient outcomes enabled by this technology strengthen the argument supporting adoption of an enterprise-wide DPS.

  11. Can Digital Pathology Result In Cost Savings? A Financial Projection For Digital Pathology Implementation At A Large Integrated Health Care Organization

    PubMed Central

    Ho, Jonhan; Ahlers, Stefan M.; Stratman, Curtis; Aridor, Orly; Pantanowitz, Liron; Fine, Jeffrey L.; Kuzmishin, John A.; Montalto, Michael C.; Parwani, Anil V.

    2014-01-01

Background: Digital pathology offers potential improvements in workflow and interpretive accuracy. Although digital pathology is currently in common use for research and education, its clinical use has been limited to niche applications such as frozen sections and remote second opinion consultations. This is mainly due to regulatory hurdles, but also to a dearth of data supporting a positive economic cost-benefit. Large-scale adoption of digital pathology and the integration of digital slides into the routine anatomic/surgical pathology “slide-less” clinical workflow will occur only if digital pathology offers a quantifiable benefit, which could come in the form of more efficient and/or higher quality care. Aim: As a large academic-based health care organization expecting to adopt digital pathology for primary diagnosis upon its regulatory approval, our institution estimated potential operational cost savings offered by the implementation of an enterprise-wide digital pathology system (DPS). Methods: Projected cost savings were calculated for the first 5 years following implementation of a DPS based on operational data collected from the pathology department. Projected savings were based on two factors: (1) productivity and lab consolidation savings; and (2) avoided treatment costs due to improvements in the accuracy of cancer diagnoses among nonsubspecialty pathologists. Detailed analyses of incremental treatment costs due to interpretive errors, resulting in either a false positive or false negative diagnosis, were performed for melanoma and breast cancer and extrapolated to 10 other common cancers. Results: When phased in over 5 years, total cost savings based on anticipated improvements in pathology productivity and histology lab consolidation were estimated at $12.4 million for an institution with 219,000 annual accessions. The main contributing factors to these savings were gains in pathologist clinical full-time equivalent capacity impacted by improved pathologist productivity and workload distribution. Expanding the current localized specialty sign-out model to an enterprise-wide shared general/subspecialist sign-out model could potentially reduce costs of incorrect treatment by $5.4 million. These calculations were based on annual over- and under-treatment costs for breast cancer and melanoma estimated to be approximately $26,000 and $11,000/case, respectively, and extrapolated to $21,500/case for other cancer types. Conclusions: The projected 5-year total cost savings for our large academic-based health care organization upon fully implementing a DPS were approximately $18 million. If the costs of digital pathology acquisition and implementation do not exceed this value, the return on investment becomes attractive to hospital administrators. Furthermore, improved patient outcomes enabled by this technology strengthen the argument supporting adoption of an enterprise-wide DPS. PMID:25250191

  12. Can verbal working memory training improve reading?

    PubMed

    Banales, Erin; Kohnen, Saskia; McArthur, Genevieve

    2015-01-01

    The aim of the current study was to determine whether poor verbal working memory is associated with poor word reading accuracy because the former causes the latter, or the latter causes the former. To this end, we tested whether (a) verbal working memory training improves poor verbal working memory or poor word reading accuracy, and whether (b) reading training improves poor reading accuracy or verbal working memory in a case series of four children with poor word reading accuracy and verbal working memory. Each child completed 8 weeks of verbal working memory training and 8 weeks of reading training. Verbal working memory training improved verbal working memory in two of the four children, but did not improve their reading accuracy. Similarly, reading training improved word reading accuracy in all children, but did not improve their verbal working memory. These results suggest that the causal links between verbal working memory and reading accuracy may not be as direct as has been assumed.

  13. Face recognition accuracy of forensic examiners, superrecognizers, and face recognition algorithms.

    PubMed

    Phillips, P Jonathon; Yates, Amy N; Hu, Ying; Hahn, Carina A; Noyes, Eilidh; Jackson, Kelsey; Cavazos, Jacqueline G; Jeckeln, Géraldine; Ranjan, Rajeev; Sankaranarayanan, Swami; Chen, Jun-Cheng; Castillo, Carlos D; Chellappa, Rama; White, David; O'Toole, Alice J

    2018-06-12

    Achieving the upper limits of face identification accuracy in forensic applications can minimize errors that have profound social and personal consequences. Although forensic examiners identify faces in these applications, systematic tests of their accuracy are rare. How can we achieve the most accurate face identification: using people and/or machines working alone or in collaboration? In a comprehensive comparison of face identification by humans and computers, we found that forensic facial examiners, facial reviewers, and superrecognizers were more accurate than fingerprint examiners and students on a challenging face identification test. Individual performance on the test varied widely. On the same test, four deep convolutional neural networks (DCNNs), developed between 2015 and 2017, identified faces within the range of human accuracy. Accuracy of the algorithms increased steadily over time, with the most recent DCNN scoring above the median of the forensic facial examiners. Using crowd-sourcing methods, we fused the judgments of multiple forensic facial examiners by averaging their rating-based identity judgments. Accuracy was substantially better for fused judgments than for individuals working alone. Fusion also served to stabilize performance, boosting the scores of lower-performing individuals and decreasing variability. Single forensic facial examiners fused with the best algorithm were more accurate than the combination of two examiners. Therefore, collaboration among humans and between humans and machines offers tangible benefits to face identification accuracy in important applications. These results offer an evidence-based roadmap for achieving the most accurate face identification possible. Copyright © 2018 the Author(s). Published by PNAS.
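The fusion step described above is simple: each examiner scores a face pair on a rating scale, and the fused judgment is the mean rating. A minimal sketch with hypothetical ratings on an assumed -3 (sure different) to +3 (sure same) scale:

```python
import numpy as np

# Rows: examiners; columns: face pairs. Ratings on an assumed
# -3 (sure different) .. +3 (sure same) identity scale.
ratings = np.array([
    [ 2, -1,  3, -3],
    [ 1, -2,  2, -1],
    [ 3,  0,  1, -2],
])

fused = ratings.mean(axis=0)                     # one fused score per pair
decisions = np.where(fused > 0, "same", "different")
print(fused)      # [ 2. -1.  2. -2.]
print(decisions)  # averaging damps any single examiner's outlier rating
```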

  14. SU-E-I-33: Initial Evaluation of Model-Based Iterative CT Reconstruction Using Standard Image Quality Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gingold, E; Dave, J

    2014-06-01

Purpose: The purpose of this study was to compare a new model-based iterative reconstruction with existing reconstruction methods (filtered backprojection and basic iterative reconstruction) using quantitative analysis of standard image quality phantom images. Methods: An ACR accreditation phantom (Gammex 464) and a CATPHAN600 phantom were scanned using 3 routine clinical acquisition protocols (adult axial brain, adult abdomen, and pediatric abdomen) on a Philips iCT system. Each scan was acquired using default conditions and 75%, 50% and 25% dose levels. Images were reconstructed using standard filtered backprojection (FBP), conventional iterative reconstruction (iDose4) and a prototype model-based iterative reconstruction (IMR). Phantom measurements included CT number accuracy, contrast to noise ratio (CNR), modulation transfer function (MTF), low contrast detectability (LCD), and noise power spectrum (NPS). Results: The choice of reconstruction method had no effect on CT number accuracy, or MTF (p<0.01). The CNR of a 6 HU contrast target was improved by 1–67% with iDose4 relative to FBP, while IMR improved CNR by 145–367% across all protocols and dose levels. Within each scan protocol, the CNR improvement from IMR vs FBP showed a general trend of greater improvement at lower dose levels. NPS magnitude was greatest for FBP and lowest for IMR. The NPS of the IMR reconstruction showed a pronounced decrease with increasing spatial frequency, consistent with the unusual noise texture seen in IMR images. Conclusion: Iterative Model Reconstruction reduces noise and improves contrast-to-noise ratio without sacrificing spatial resolution in CT phantom images. This offers the possibility of radiation dose reduction and improved low contrast detectability compared with filtered backprojection or conventional iterative reconstruction.
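The CNR figures come from region-of-interest statistics on the phantom images: the contrast between a target insert and the background, normalized by the background noise. A sketch of one common definition (the study's exact definition is not given), assuming 2-D image arrays and pre-chosen ROI masks:

```python
import numpy as np

def cnr(image, roi_mask, bg_mask):
    """Contrast-to-noise ratio of a low-contrast insert.

    CNR = |mean(ROI) - mean(background)| / std(background)
    """
    roi, bg = image[roi_mask], image[bg_mask]
    return abs(roi.mean() - bg.mean()) / bg.std()

# Hypothetical 6 HU insert on a noisy water-equivalent background.
rng = np.random.default_rng(2)
img = rng.normal(0.0, 5.0, (128, 128))           # background noise, HU
img[40:60, 40:60] += 6.0                         # 6 HU contrast target
roi = np.zeros_like(img, bool); roi[45:55, 45:55] = True
bg = np.zeros_like(img, bool);  bg[90:110, 90:110] = True
print(f"CNR = {cnr(img, roi, bg):.2f}")          # ~1.2 for these settings
```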

  15. Improved bacterial identification directly from urine samples with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Kitagawa, Koichi; Shigemura, Katsumi; Onuma, Ken-Ichiro; Nishida, Masako; Fujiwara, Mayu; Kobayashi, Saori; Yamasaki, Mika; Nakamura, Tatsuya; Yamamichi, Fukashi; Shirakawa, Toshiro; Tokimatsu, Issei; Fujisawa, Masato

    2018-03-01

Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) contributes to rapid identification of pathogens in the clinic but has not yet performed especially well for Gram-positive cocci (GPC) causing complicated urinary tract infection (UTI). The goal of this study was to investigate the possible clinical use of MALDI-TOF MS as a rapid method for bacterial identification directly from urine in complicated UTI. MALDI-TOF MS was applied to urine samples gathered from 142 suspected complicated UTI patients in 2015-2017. We modified the standard procedure (Method 1) for sample preparation by adding an initial 10 minutes of ultrasonication followed by centrifugation at 500 g for 1 minute to remove debris such as epithelial cells and leukocytes from the urine (Method 2). Among the 133 urine culture-positive samples, the rate of agreement with urine culture for GPC identified by MALDI-TOF MS directly in urine was 16.7% with the standard sample preparation (Method 1), but the modified sample preparation (Method 2) significantly improved that rate to 52.2% (P=.045). Method 2 also improved the identification accuracy for Gram-negative rods (GNR) from 77.1% to 94.2% (P=.022). The modified Method 2 significantly improved the average MALDI score from 1.408±0.153 to 2.166±0.045 (P=.000) for GPC and slightly improved the score from 2.107±0.061 to 2.164±0.037 for GNR. The modified sample preparation for MALDI-TOF MS can improve identification accuracy for the causative bacteria of complicated UTI. This simple modification offers rapid and accurate routine diagnosis for UTI and may possibly substitute for urine cultures. © 2017 Wiley Periodicals, Inc.

  16. An Evaluation of the Predictive Validity of Confidence Ratings in Identifying Functional Behavioral Assessment Hypothesis Statements

    ERIC Educational Resources Information Center

    Borgmeier, Chris; Horner, Robert H.

    2006-01-01

    Faced with limited resources, schools require tools that increase the accuracy and efficiency of functional behavioral assessment. Yarbrough and Carr (2000) provided evidence that informant confidence ratings of the likelihood of problem behavior in specific situations offered a promising tool for predicting the accuracy of function-based…

  17. Using the Instructional Level as a Criterion to Target Reading Interventions

    ERIC Educational Resources Information Center

    Parker, David C.; Burns, Matthew K.

    2014-01-01

    The instructional hierarchy offers a useful framework for targeting academic interventions. Within this framework, the accuracy with which a student reads might function as an indicator that the student should receive an intervention that focuses either on accuracy or on fluency. The current study examined whether the instructional level for…

  18. Interest of the MICROSTAR Accelerometer to improve the GRASP Mission.

    NASA Astrophysics Data System (ADS)

    Perrot, E.; Lebat, V.; Foulon, B.; Christophe, B.; Liorzou, F.; Huynh, P. A.

    2015-12-01

The Geodetic Reference Antenna in Space (GRASP) is a microsatellite mission concept proposed by JPL to improve the definition of the Terrestrial Reference Frame (TRF). GRASP collocates GPS, SLR, VLBI, and DORIS sensors on a dedicated spacecraft in order to establish precise and stable ties between the key geodetic techniques used to define and disseminate the TRF. GRASP also offers a space-based reference antenna for present and future Global Navigation Satellite Systems (GNSS). By taking advantage of the new testing possibilities offered by the catapult facility at the ZARM drop tower, ONERA's space accelerometer team proposes an updated version, called MICROSTAR, of its ultra-sensitive electrostatic accelerometers, which have contributed to the success of the recent Earth gravity missions GRACE and GOCE. Built around a cubic proof mass, it provides the 3 linear accelerations with a resolution better than 10⁻¹¹ m·s⁻²/√Hz in a measurement bandwidth between 10⁻³ Hz and 0.1 Hz, and the 3 angular accelerations about its 3 orthogonal axes with a resolution of 5×10⁻¹⁰ rad·s⁻²/√Hz. Integrated at the centre of mass of the satellite, MICROSTAR improves Precise Orbit Determination (POD) by accurate measurement of the non-gravitational forces acting on the satellite. It also offers the possibility to calibrate the change in the position of the satellite's center of mass with an accuracy better than 100 μm, as demonstrated in the GRACE mission. Assuming a sufficiently rigid structure between the antennas and the accelerometer, its data can contribute to reaching the mission objective of 1 mm precision for the TRF position.

  19. PeakRanger: A cloud-enabled peak caller for ChIP-seq data

    PubMed Central

    2011-01-01

Background Chromatin immunoprecipitation (ChIP), coupled with massively parallel short-read sequencing (seq), is used to probe chromatin dynamics. Although there are many algorithms to call peaks from ChIP-seq datasets, most are tuned either to handle punctate sites, such as transcription factor binding sites, or broad regions, such as histone modification marks; few can do both. Other algorithms are limited in their configurability, performance on large data sets, and ability to distinguish closely-spaced peaks. Results In this paper, we introduce PeakRanger, a peak caller software package that works equally well on punctate and broad sites, can resolve closely-spaced peaks, has excellent performance, and is easily customized. In addition, PeakRanger can be run in a parallel cloud computing environment to obtain extremely high performance on very large data sets. We present a series of benchmarks to evaluate PeakRanger against 10 other peak callers, and demonstrate the performance of PeakRanger on both real and synthetic data sets. We also present real-world usages of PeakRanger, including peak-calling in the modENCODE project. Conclusions Compared to other peak callers tested, PeakRanger offers improved resolution in distinguishing extremely closely-spaced peaks. PeakRanger has above-average spatial accuracy in terms of identifying the precise location of binding events. PeakRanger also has excellent sensitivity and specificity in all benchmarks evaluated. In addition, PeakRanger offers significant improvements in run time when running on a single processor system, and very marked improvements when allowed to take advantage of the MapReduce parallel environment offered by a cloud computing resource. PeakRanger can be downloaded at the official site of the modENCODE project: http://www.modencode.org/software/ranger/ PMID:21554709

  20. LLNL Location and Detection Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S C; Harris, D B; Anderson, M L

    2003-07-16

We present two LLNL research projects in the topical areas of location and detection. The first project assesses epicenter accuracy using a multiple-event location algorithm, and the second project employs waveform subspace correlation to detect and identify events at Fennoscandian mines. Accurately located seismic events are the basis of location calibration. A well-characterized set of calibration events enables new Earth model development, empirical calibration, and validation of models. In a recent study, Bondar et al. (2003) develop network coverage criteria for assessing the accuracy of event locations that are determined using single-event, linearized inversion methods. These criteria are conservative and are meant for application to large bulletins, where emphasis is on catalog completeness and any given event location may be improved through detailed analysis or application of advanced algorithms. Relative event location techniques are touted as advancements that may improve absolute location accuracy by (1) ensuring an internally consistent dataset, (2) constraining a subset of events to known locations, and (3) taking advantage of station and event correlation structure. Here we present the preliminary phase of this work, in which we use Nevada Test Site (NTS) nuclear explosions, with known locations, to test the effect of travel-time model accuracy on relative location accuracy. Like previous studies, we find that reference velocity-model accuracy and relative-location accuracy are highly correlated. We also find that metrics based on the travel-time residuals of relocated events are not reliable for assessing either velocity-model or relative-location accuracy. In the topical area of detection, we develop specialized correlation (subspace) detectors for the principal mines surrounding the ARCES station located in the European Arctic. Our objective is to provide efficient screens for explosions occurring in the mines of the Kola Peninsula (Kovdor, Zapolyarny, Olenogorsk, Khibiny) and the major iron mines of northern Sweden (Malmberget, Kiruna). In excess of 90% of the events detected by the ARCES station are mining explosions, and a significant fraction are from these northern mining groups. The primary challenge in developing waveform correlation detectors is the degree of variation in the source time histories of the shots, which can result in poor correlation among events even in close proximity. Our approach to solving this problem is to use lagged subspace correlation detectors, which offer some prospect of compensating for variation and uncertainty in source time functions.
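The subspace detectors described above generalize the single-template correlation detector, which slides a normalized template over continuous data and declares a detection when the correlation crosses a threshold; the subspace extension projects onto several basis waveforms instead of one. A toy single-template version with synthetic data (illustrative only):

```python
import numpy as np

def correlation_detector(data, template, threshold=0.7):
    """Return (offset, correlation) pairs where the normalized
    cross-correlation of the template with the data exceeds threshold."""
    n = template.size
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(data.size - n + 1):
        w = data[i:i + n]
        c = np.dot(t, (w - w.mean()) / w.std()) / n   # in [-1, 1]
        if c > threshold:
            hits.append((i, c))
    return hits

# Hypothetical: a decaying blast-like waveform buried in noise.
rng = np.random.default_rng(3)
template = np.sin(np.linspace(0, 20, 200)) * np.exp(-np.linspace(0, 4, 200))
data = rng.normal(0, 0.3, 2000)
data[700:900] += template                 # event hidden at offset 700
print(correlation_detector(data, template)[:3])
```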

  1. Reduction in Thyroid Nodule Biopsies and Improved Accuracy with American College of Radiology Thyroid Imaging Reporting and Data System.

    PubMed

    Hoang, Jenny K; Middleton, William D; Farjat, Alfredo E; Langer, Jill E; Reading, Carl C; Teefey, Sharlene A; Abinanti, Nicole; Boschini, Fernando J; Bronner, Abraham J; Dahiya, Nirvikar; Hertzberg, Barbara S; Newman, Justin R; Scanga, Daniel; Vogler, Robert C; Tessler, Franklin N

    2018-04-01

Purpose To compare the biopsy rate and diagnostic accuracy before and after applying the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TI-RADS) criteria for thyroid nodule evaluation. Materials and Methods In this retrospective study, eight radiologists with 3-32 years of experience in thyroid ultrasonography (US) reviewed US features of 100 thyroid nodules that were cytologically proven, pathologically proven, or both in December 2016. The radiologists evaluated nodule features in five US categories and provided biopsy recommendations based on their own practice patterns without knowledge of ACR TI-RADS criteria. Another three expert radiologists served as the reference standard readers for the imaging findings. ACR TI-RADS criteria were retrospectively applied to the features assigned by the eight radiologists to produce biopsy recommendations. Comparison was made for biopsy rate, sensitivity, specificity, and accuracy. Results Fifteen of the 100 nodules (15%) were malignant. The mean number of nodules recommended for biopsy by the eight radiologists was 80 ± 16 (standard deviation) (range, 38-95 nodules) based on their own practice patterns and 57 ± 11 (range, 37-73 nodules) with retrospective application of ACR TI-RADS criteria. Without ACR TI-RADS criteria, readers had an overall sensitivity, specificity, and accuracy of 95% (95% confidence interval [CI]: 83%, 99%), 20% (95% CI: 16%, 25%), and 28% (95% CI: 21%, 37%), respectively. After applying ACR TI-RADS criteria, overall sensitivity, specificity, and accuracy were 92% (95% CI: 68%, 98%), 44% (95% CI: 33%, 56%), and 52% (95% CI: 40%, 63%), respectively. Although fewer malignancies were recommended for biopsy with ACR TI-RADS criteria, the majority met the criteria for follow-up US, with only three of 120 (2.5%) malignancy encounters requiring no follow-up or biopsy. Expert consensus recommended biopsy in 55 of 100 nodules with ACR TI-RADS criteria. Their sensitivity, specificity, and accuracy were 87% (95% CI: 48%, 98%), 51% (95% CI: 40%, 62%), and 56% (95% CI: 46%, 66%), respectively. Conclusion ACR TI-RADS criteria offer a meaningful reduction in the number of thyroid nodules recommended for biopsy and significantly improve the accuracy of recommendations for nodule management. © RSNA, 2018 Online supplemental material is available for this article.

  2. Global investigations of the satellite-based Fugro OmniSTAR HP service

    NASA Astrophysics Data System (ADS)

    Pflugmacher, Andreas; Heister, Hansbert; Heunecke, Otto

    2009-12-01

OmniSTAR is one of the world's leading suppliers of satellite-based augmentation services for onshore and offshore GNSS applications. OmniSTAR currently offers three services: VBS, HP and XP. OmniSTAR VBS is the code-based service, suitable for sub-metre positioning accuracy. The HP and XP services provide sub-decimetre accuracy; the HP service is based on a precise differential methodology, while the XP service uses precise absolute positioning. The sub-decimetre HP and XP services both have distinctive convergence behaviour, and the positioning task is essentially a time-dependent process during which the accuracy of the estimated coordinates continuously improves over time. To validate the capabilities of the OmniSTAR services, and in particular the HP (High Performance) service, globally distributed measurement campaigns were performed. The results of these investigations confirm that the HP service satisfies its high accuracy specification, but only after a sufficient initialisation phase. Two kinds of disturbance can handicap HP operation: lack of GNSS observations and outages of the augmentation signal. The former is the more serious: within a few seconds the achieved convergence level is completely lost. Outages in the reception of augmentation data merely affect the relevant period of the outage - the accuracy during the outage is degraded. Only longer interruptions lead to a loss of the HP solution. When HP convergence is lost, the HP process has to be re-initialized. If known points (so-called “seed points”) are available, a shortened “kick-start” initialization is possible. With the aid of seed points it only takes a few minutes to restore convergence.

  3. Investigation of the interpolation method to improve the distributed strain measurement accuracy in optical frequency domain reflectometry systems.

    PubMed

    Cui, Jiwen; Zhao, Shiyuan; Yang, Di; Ding, Zhenyang

    2018-02-20

We use a spectrum interpolation technique to improve the distributed strain measurement accuracy in a Rayleigh-scatter-based optical frequency domain reflectometry sensing system. We demonstrate that strain accuracy is not limited by the "uncertainty principle" that exists in time-frequency analysis. Different interpolation methods are investigated and used to improve the accuracy of the peak position of the cross-correlation and, therefore, the accuracy of the strain. Interpolation implemented by padding zeros on one side of the windowed data in the spatial domain, before the inverse fast Fourier transform, is found to have the best accuracy. Using this method, the strain accuracy and resolution are both improved without decreasing the spatial resolution. A strain of 3 μϵ within a spatial resolution of 1 cm at a position of 21.4 m is distinguished, and the measurement uncertainty is 3.3 μϵ.
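Fourier-domain zero padding is the standard way to interpolate a signal onto a finer grid without adding information: pad the spectrum with zeros, inverse-transform, and read off a sub-sample peak position. A generic numpy sketch of the idea applied to cross-correlation peak finding (the paper pads the windowed spatial-domain data before its inverse FFT; this illustration pads the correlation spectrum, which rests on the same principle):

```python
import numpy as np

def upsampled_xcorr_peak(a, b, factor=16):
    """Locate the circular cross-correlation peak of a and b on a grid
    `factor` x finer by zero-padding the spectrum before the inverse FFT."""
    n = a.size
    spec = np.conj(np.fft.fft(a)) * np.fft.fft(b)   # peaks at the lag of b
    padded = np.zeros(n * factor, dtype=complex)
    half = n // 2
    padded[:half] = spec[:half]        # keep low-frequency halves;
    padded[-half:] = spec[-half:]      # zeros in between = interpolation
    xc = np.fft.ifft(padded).real
    return np.argmax(xc) / factor      # lag in original-sample units (mod n)

# Hypothetical: b is a copy of a shifted by a fractional 2.3 samples.
t = np.arange(256)
a = np.exp(-0.5 * ((t - 100.0) / 6.0) ** 2)
b = np.exp(-0.5 * ((t - 102.3) / 6.0) ** 2)
print(upsampled_xcorr_peak(a, b))      # ~2.3, despite integer sampling
```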

  4. Predicting treatment outcome of drug-susceptible tuberculosis patients using machine-learning models.

    PubMed

    Hussain, Owais A; Junejo, Khurum N

    2018-02-20

Tuberculosis (TB) is a deadly contagious disease and a serious global health problem. It is curable, but due to its lengthy treatment process, a patient is likely to leave the treatment incomplete, leading to a more lethal, drug-resistant form of the disease. The World Health Organization (WHO) propagates Directly Observed Therapy Short-course (DOTS) as an effective way to stop the spread of TB in communities with a high burden. But DOTS also adds a significant burden to the financial feasibility of the program. We aim to facilitate TB programs by predicting the outcome of the treatment of a particular patient at the start of treatment, so that health workers can be utilized in a targeted and cost-effective way. The problem was modeled as a classification problem, and the outcome of treatment was predicted using state-of-the-art implementations of 3 machine learning algorithms. 4213 patients were evaluated, of whom 64.37% completed their treatment. Results were evaluated using 4 performance measures: accuracy, precision, sensitivity, and specificity. The models offer an improvement of more than 12% in accuracy over the baseline prediction. Empirical results also revealed some insights for improving TB programs. Overall, our proposed methodology may help teams running TB programs manage their human resources more effectively, thus saving more lives.
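Framed as binary classification (treatment completed vs. not), the four reported measures come straight off the confusion matrix. A sketch using scikit-learn, with a random forest standing in for the paper's unspecified algorithms and synthetic stand-in data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

# Synthetic stand-in for patient features and treatment outcome.
rng = np.random.default_rng(4)
X = rng.normal(size=(4213, 12))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 4213) > -0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pred = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"accuracy    = {(tp + tn) / (tp + tn + fp + fn):.3f}")
print(f"precision   = {tp / (tp + fp):.3f}")
print(f"sensitivity = {tp / (tp + fn):.3f}")
print(f"specificity = {tn / (tn + fp):.3f}")
```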

  5. Combining bimodal presentation schemes and buzz groups improves clinical reasoning and learning at morning report.

    PubMed

    Balslev, Thomas; Rasmussen, Astrid Bruun; Skajaa, Torjus; Nielsen, Jens Peter; Muijtjens, Arno; De Grave, Willem; Van Merriënboer, Jeroen

    2014-12-11

Morning reports offer opportunities for intensive work-based learning. In this controlled study, we measured learning processes and outcomes during the morning report of paediatric emergency room patients. Twelve specialists and 12 residents were randomised into four groups and discussed the same two paediatric cases. The groups differed in their presentation modality (verbal only vs. verbal + text) and the use of buzz groups (with vs. without). The verbal interactions were analysed for clinical reasoning processes. Perceptions of learning and judgment of learning were reported in a questionnaire. Diagnostic accuracy was assessed by a 20-item multiple-choice test. Combining bimodal presentation and buzz groups increased the odds of clinical reasoning occurring in the discussion of cases by a factor of 1.90 (p = 0.013), indicating superior reasoning for buzz groups working with bimodal materials. For specialists, a positive effect of bimodal presentation was found on perceptions of learning (p < 0.05), and for residents, a positive effect of buzz groups was found on judgment of learning (p < 0.005). A positive effect of bimodal presentation on diagnostic accuracy was noted in the specialists (p < 0.05). Combined bimodal presentation and buzz group discussion of emergency cases improves clinicians' clinical reasoning and learning.

  6. Development of a multilocus-based approach for sponge (phylum Porifera) identification: refinement and limitations.

    PubMed

    Yang, Qi; Franco, Christopher M M; Sorokin, Shirley J; Zhang, Wei

    2017-02-02

    For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3-D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers.
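The core decision rule, as described, is to query each locus against the reference database and take the hit with the highest bit score, then check agreement across loci. A schematic sketch over hypothetical BLAST-style hit lists; the majority-vote consensus at the end is an illustration, since the protocol itself validates agreement phylogenetically:

```python
from collections import Counter

# Hypothetical BLAST-style hits per locus: (species, bit_score).
hits = {
    "COI": [("Aplysina sp.", 612.0), ("Ircinia sp.", 540.5)],
    "28S": [("Aplysina sp.", 880.2), ("Dysidea sp.", 700.1)],
    "ITS": [("Ircinia sp.", 310.0), ("Aplysina sp.", 305.5)],
}

# Per-locus identity = hit with the highest bit score.
calls = {locus: max(h, key=lambda x: x[1])[0] for locus, h in hits.items()}
print(calls)  # COI/28S agree; the weaker ITS locus disagrees

# Simple cross-locus consensus (majority vote, for illustration).
consensus, votes = Counter(calls.values()).most_common(1)[0]
print(f"consensus identity: {consensus} ({votes}/{len(calls)} loci)")
```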

  7. Development of a multilocus-based approach for sponge (phylum Porifera) identification: refinement and limitations

    PubMed Central

    Yang, Qi; Franco, Christopher M. M.; Sorokin, Shirley J.; Zhang, Wei

    2017-01-01

    For sponges (phylum Porifera), there is no reliable molecular protocol available for species identification. To address this gap, we developed a multilocus-based Sponge Identification Protocol (SIP) validated by a sample of 37 sponge species belonging to 10 orders from South Australia. The universal barcode COI mtDNA, 28S rRNA gene (D3–D5), and the nuclear ITS1-5.8S-ITS2 region were evaluated for their suitability and capacity for sponge identification. The highest Bit Score was applied to infer the identity. The reliability of SIP was validated by phylogenetic analysis. The 28S rRNA gene and COI mtDNA performed better than the ITS region in classifying sponges at various taxonomic levels. A major limitation is that the databases are not well populated and possess low diversity, making it difficult to conduct the molecular identification protocol. The identification is also impacted by the accuracy of the morphological classification of the sponges whose sequences have been submitted to the database. Re-examination of the morphological identification further demonstrated and improved the reliability of sponge identification by SIP. Integrated with morphological identification, the multilocus-based SIP offers an improved protocol for more reliable and effective sponge identification, by coupling the accuracy of different DNA markers. PMID:28150727

  8. Improving diagnostic recognition of primary hyperparathyroidism with machine learning.

    PubMed

    Somnay, Yash R; Craven, Mark; McCoy, Kelly L; Carty, Sally E; Wang, Tracy S; Greenberg, Caprice C; Schneider, David F

    2017-04-01

Parathyroidectomy offers the only cure for primary hyperparathyroidism, but today only 50% of primary hyperparathyroidism patients are referred for operation, in large part because the condition is widely under-recognized. The diagnosis of primary hyperparathyroidism can be especially challenging with mild biochemical indices. Machine learning is a collection of methods in which computers build predictive algorithms based on labeled examples. With the aim of facilitating diagnosis, we tested the ability of machine learning to distinguish primary hyperparathyroidism from normal physiology using clinical and laboratory data. This retrospective cohort study used a labeled training set and 10-fold cross-validation to evaluate accuracy of the algorithm. Measures of accuracy included area under the receiver operating characteristic curve, precision (sensitivity), and positive and negative predictive value. Several different algorithms and ensembles of algorithms were tested using the Weka platform. Among 11,830 patients managed operatively at 3 high-volume endocrine surgery programs from March 2001 to August 2013, 6,777 underwent parathyroidectomy for confirmed primary hyperparathyroidism, and 5,053 control patients without primary hyperparathyroidism underwent thyroidectomy. Test-set accuracies for machine learning models were determined using 10-fold cross-validation. Age, sex, and serum levels of preoperative calcium, phosphate, parathyroid hormone, vitamin D, and creatinine were defined as potential predictors of primary hyperparathyroidism. Mild primary hyperparathyroidism was defined as primary hyperparathyroidism with normal preoperative calcium or parathyroid hormone levels. After testing a variety of machine learning algorithms, Bayesian network models proved most accurate, correctly classifying 95.2% of all primary hyperparathyroidism patients (area under the receiver operating characteristic curve = 0.989). Omitting parathyroid hormone from the model did not decrease the accuracy significantly (area under the receiver operating characteristic curve = 0.985). In mild disease cases, however, the Bayesian network model correctly classified 71.1% of patients with normal calcium and 92.1% with normal parathyroid hormone levels preoperatively. Combining Bayesian networking with AdaBoost improved the accuracy to 97.2% of cases overall (area under the receiver operating characteristic curve = 0.994) and to 91.9% in primary hyperparathyroidism patients with mild disease. This was significantly improved relative to Bayesian networking alone (P < .0001). Machine learning can accurately diagnose primary hyperparathyroidism without human input, even in mild disease. Incorporation of this tool into electronic medical record systems may aid in recognition of this under-diagnosed disorder. Copyright © 2016 Elsevier Inc. All rights reserved.
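The classifier's inputs are routine labs plus demographics, and the output is a probability of primary hyperparathyroidism. The study used Bayesian networks in Weka; as an illustrative stand-in, here is a naive Bayes model (the simplest Bayesian-network structure) over synthetic values chosen only to be clinically plausible:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Features: age, sex, calcium, phosphate, PTH, vitamin D, creatinine.
rng = np.random.default_rng(5)
n = 1000
healthy = np.column_stack([
    rng.normal(55, 12, n), rng.integers(0, 2, n),
    rng.normal(9.4, 0.4, n), rng.normal(3.5, 0.5, n),
    rng.normal(45, 12, n), rng.normal(30, 8, n), rng.normal(0.9, 0.2, n)])
phpt = np.column_stack([
    rng.normal(62, 12, n), rng.integers(0, 2, n),
    rng.normal(10.8, 0.5, n), rng.normal(2.7, 0.5, n),
    rng.normal(110, 35, n), rng.normal(24, 8, n), rng.normal(1.0, 0.2, n)])

X = np.vstack([healthy, phpt])
y = np.array([0] * n + [1] * n)          # 1 = primary hyperparathyroidism
model = GaussianNB().fit(X, y)

# Probability of disease for one new patient (hypothetical values).
patient = [[60, 1, 10.9, 2.6, 95, 22, 0.9]]
print(f"P(pHPT) = {model.predict_proba(patient)[0, 1]:.2f}")
```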

  9. Ensemble support vector machine classification of dementia using structural MRI and mini-mental state examination.

    PubMed

    Sørensen, Lauge; Nielsen, Mads

    2018-05-15

    The International Challenge for Automated Prediction of MCI from MRI data offered independent, standardized comparison of machine learning algorithms for multi-class classification of normal control (NC), mild cognitive impairment (MCI), converting MCI (cMCI), and Alzheimer's disease (AD) using brain imaging and general cognition. We proposed to use an ensemble of support vector machines (SVMs) that combined bagging without replacement and feature selection. SVM is the most commonly used algorithm in multivariate classification of dementia, and it was therefore valuable to evaluate the potential benefit of ensembling this type of classifier. The ensemble SVM, using either a linear or a radial basis function (RBF) kernel, achieved multi-class classification accuracies of 55.6% and 55.0% in the challenge test set (60 NC, 60 MCI, 60 cMCI, 60 AD), resulting in a third place in the challenge. Similar feature subset sizes were obtained for both kernels, and the most frequently selected MRI features were the volumes of the two hippocampal subregions left presubiculum and right subiculum. Post-challenge analysis revealed that enforcing a minimum number of selected features and increasing the number of ensemble classifiers improved classification accuracy up to 59.1%. The ensemble SVM outperformed single SVM classifications consistently in the challenge test set. Ensemble methods using bagging and feature selection can improve the performance of the commonly applied SVM classifier in dementia classification. This resulted in competitive classification accuracies in the International Challenge for Automated Prediction of MCI from MRI data. Copyright © 2018 Elsevier B.V. All rights reserved.
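"Bagging without replacement plus feature selection" maps directly onto standard tooling: subsample the training set without replacement for each base SVM, select features inside each fit, and aggregate the votes. A minimal scikit-learn sketch of that recipe; kernel, feature count, and data sizes are illustrative, not the challenge configuration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for MRI volumes plus cognitive scores, 4 classes
# (NC / MCI / cMCI / AD in the challenge).
X, y = make_classification(n_samples=400, n_features=60, n_informative=15,
                           n_classes=4, n_clusters_per_class=1,
                           random_state=0)

# Feature selection happens inside each bagged fit, on that bag's subsample.
base = make_pipeline(SelectKBest(f_classif, k=20), SVC(kernel="linear"))
ensemble = BaggingClassifier(estimator=base,      # sklearn >= 1.2 argument name
                             n_estimators=50,
                             max_samples=0.8,
                             bootstrap=False,     # sampling WITHOUT replacement
                             random_state=0)
print(f"train accuracy: {ensemble.fit(X, y).score(X, y):.2f}")
```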

  10. A portable blood plasma clot micro-elastometry device based on resonant acoustic spectroscopy

    NASA Astrophysics Data System (ADS)

    Krebs, C. R.; Li, Ling; Wolberg, Alisa S.; Oldenburg, Amy L.

    2015-07-01

    Abnormal blood clot stiffness is an important indicator of coagulation disorders arising from a variety of cardiovascular diseases and drug treatments. Here, we present a portable instrument for elastometry of microliter volume blood samples based upon the principle of resonant acoustic spectroscopy, where a sample of well-defined dimensions exhibits a fundamental longitudinal resonance mode proportional to the square root of the Young's modulus. In contrast to commercial thromboelastography, the resonant acoustic method offers improved repeatability and accuracy due to the high signal-to-noise ratio of the resonant vibration. We review the measurement principles and the design of a magnetically actuated microbead force transducer applying between 23 pN and 6.7 nN, providing a wide dynamic range of elastic moduli (3 Pa-27 kPa) appropriate for measurement of clot elastic modulus (CEM). An automated and portable device, the CEMport, is introduced and implemented using a 2 nm resolution displacement sensor with demonstrated accuracy and precision of 3% and 2%, respectively, of CEM in biogels. Importantly, the small strains (<0.13%) and low strain rates (<1/s) employed by the CEMport maintain a linear stress-to-strain relationship which provides a perturbative measurement of the Young's modulus. Measurements of blood plasma CEM versus heparin concentration show that CEMport is sensitive to heparin levels below 0.050 U/ml, which suggests future applications in sensing heparin levels of post-surgical cardiopulmonary bypass patients. The portability, high accuracy, and high precision of this device enable new clinical and animal studies for associating CEM with blood coagulation disorders, potentially leading to improved diagnostics and therapeutic monitoring.
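The governing relation is the fundamental longitudinal resonance of a rod-like sample; one standard form is f₁ = (1/2L)·√(E/ρ), consistent with the abstract's statement that the mode frequency scales with the square root of the Young's modulus. Inverting it gives E = ρ(2Lf₁)². A quick worked inversion with hypothetical clot-scale numbers (the device's actual sample geometry is not specified here):

```python
def youngs_modulus(f1_hz, length_m, density_kg_m3):
    """Invert the fundamental longitudinal resonance f1 = (1/2L) sqrt(E/rho)."""
    return density_kg_m3 * (2.0 * length_m * f1_hz) ** 2

# Hypothetical microliter clot column: 2 mm long, roughly water density.
E = youngs_modulus(f1_hz=500.0, length_m=2e-3, density_kg_m3=1050.0)
print(f"E ~ {E:.0f} Pa")   # 4200 Pa, inside the stated 3 Pa - 27 kPa range
```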

  11. Accuracy of genomic prediction in switchgrass ( Panicum virgatum L.) improved by accounting for linkage disequilibrium

    DOE PAGES

    Ramstein, Guillaume P.; Evans, Joseph; Kaeppler, Shawn M.; ...

    2016-02-11

Switchgrass is a relatively high-yielding and environmentally sustainable biomass crop, but further genetic gains in biomass yield must be achieved to make it an economically viable bioenergy feedstock. Genomic selection (GS) is an attractive technology to generate rapid genetic gains in switchgrass, and meet the goals of a substantial displacement of petroleum use with biofuels in the near future. In this study, we empirically assessed prediction procedures for genomic selection in two different populations, consisting of 137 and 110 half-sib families of switchgrass, tested in two locations in the United States for three agronomic traits: dry matter yield, plant height, and heading date. Marker data were produced for the families’ parents by exome capture sequencing, generating up to 141,030 polymorphic markers with available genomic-location and annotation information. We evaluated prediction procedures that varied not only by learning schemes and prediction models, but also by the way the data were preprocessed to account for redundancy in marker information. More complex genomic prediction procedures were generally not significantly more accurate than the simplest procedure, likely due to limited population sizes. Nevertheless, a highly significant gain in prediction accuracy was achieved by transforming the marker data through a marker correlation matrix. Our results suggest that marker-data transformations and, more generally, the account of linkage disequilibrium among markers, offer valuable opportunities for improving prediction procedures in GS. Furthermore, some of the achieved prediction accuracies should motivate implementation of GS in switchgrass breeding programs.
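Working from the abstract's description, the key step is transforming the marker matrix through a marker correlation matrix before fitting a whole-genome regression. A schematic ridge-regression sketch of that preprocessing; the paper's exact transformation and prediction model are not specified here, and all data below are synthetic:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(6)
n_fam, n_mark = 137, 500                    # families x markers (toy sizes)
M = rng.integers(0, 3, (n_fam, n_mark)).astype(float)   # 0/1/2 allele dosages
y = M[:, :20].sum(axis=1) + rng.normal(0, 2.0, n_fam)   # synthetic phenotype

# Transform markers through their correlation matrix so each predictor
# aggregates information from markers in linkage disequilibrium with it.
Mc = M - M.mean(axis=0)
R = np.corrcoef(M, rowvar=False)            # marker x marker correlations
X = Mc @ R                                  # LD-aware transformed predictors

model = Ridge(alpha=10.0).fit(X, y)         # whole-genome ridge regression
print(f"fit R^2 = {model.score(X, y):.2f}")
```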

  12. A portable blood plasma clot micro-elastometry device based on resonant acoustic spectroscopy.

    PubMed

    Krebs, C R; Li, Ling; Wolberg, Alisa S; Oldenburg, Amy L

    2015-07-01

    Abnormal blood clot stiffness is an important indicator of coagulation disorders arising from a variety of cardiovascular diseases and drug treatments. Here, we present a portable instrument for elastometry of microliter volume blood samples based upon the principle of resonant acoustic spectroscopy, where a sample of well-defined dimensions exhibits a fundamental longitudinal resonance mode proportional to the square root of the Young's modulus. In contrast to commercial thromboelastography, the resonant acoustic method offers improved repeatability and accuracy due to the high signal-to-noise ratio of the resonant vibration. We review the measurement principles and the design of a magnetically actuated microbead force transducer applying between 23 pN and 6.7 nN, providing a wide dynamic range of elastic moduli (3 Pa-27 kPa) appropriate for measurement of clot elastic modulus (CEM). An automated and portable device, the CEMport, is introduced and implemented using a 2 nm resolution displacement sensor with demonstrated accuracy and precision of 3% and 2%, respectively, of CEM in biogels. Importantly, the small strains (<0.13%) and low strain rates (<1/s) employed by the CEMport maintain a linear stress-to-strain relationship which provides a perturbative measurement of the Young's modulus. Measurements of blood plasma CEM versus heparin concentration show that CEMport is sensitive to heparin levels below 0.050 U/ml, which suggests future applications in sensing heparin levels of post-surgical cardiopulmonary bypass patients. The portability, high accuracy, and high precision of this device enable new clinical and animal studies for associating CEM with blood coagulation disorders, potentially leading to improved diagnostics and therapeutic monitoring.

  13. Accuracy of genomic prediction in switchgrass ( Panicum virgatum L.) improved by accounting for linkage disequilibrium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramstein, Guillaume P.; Evans, Joseph; Kaeppler, Shawn M.

Switchgrass is a relatively high-yielding and environmentally sustainable biomass crop, but further genetic gains in biomass yield must be achieved to make it an economically viable bioenergy feedstock. Genomic selection (GS) is an attractive technology to generate rapid genetic gains in switchgrass, and meet the goals of a substantial displacement of petroleum use with biofuels in the near future. In this study, we empirically assessed prediction procedures for genomic selection in two different populations, consisting of 137 and 110 half-sib families of switchgrass, tested in two locations in the United States for three agronomic traits: dry matter yield, plant height, and heading date. Marker data were produced for the families’ parents by exome capture sequencing, generating up to 141,030 polymorphic markers with available genomic-location and annotation information. We evaluated prediction procedures that varied not only by learning schemes and prediction models, but also by the way the data were preprocessed to account for redundancy in marker information. More complex genomic prediction procedures were generally not significantly more accurate than the simplest procedure, likely due to limited population sizes. Nevertheless, a highly significant gain in prediction accuracy was achieved by transforming the marker data through a marker correlation matrix. Our results suggest that marker-data transformations and, more generally, the account of linkage disequilibrium among markers, offer valuable opportunities for improving prediction procedures in GS. Furthermore, some of the achieved prediction accuracies should motivate implementation of GS in switchgrass breeding programs.

  14. Numerical Integration Techniques for Curved-Element Discretizations of Molecule–Solvent Interfaces

    PubMed Central

    Bardhan, Jaydeep P.; Altman, Michael D.; Willis, David J.; Lippow, Shaun M.; Tidor, Bruce; White, Jacob K.

    2012-01-01

Surface formulations of biophysical modeling problems offer attractive theoretical and computational properties. Numerical simulations based on these formulations usually begin with discretization of the surface under consideration; often, the surface is curved, possessing complicated structure and possibly singularities. Numerical simulations commonly are based on approximate, rather than exact, discretizations of these surfaces. To assess the strength of the dependence of simulation accuracy on the fidelity of surface representation, we have developed methods to model several important surface formulations using exact surface discretizations. Following and refining Zauhar’s work (J. Comp.-Aid. Mol. Des. 9:149-159, 1995), we define two classes of curved elements that can exactly discretize the van der Waals, solvent-accessible, and solvent-excluded (molecular) surfaces. We then present numerical integration techniques that can accurately evaluate nonsingular and singular integrals over these curved surfaces. After validating the exactness of the surface discretizations and demonstrating the correctness of the presented integration methods, we present a set of calculations that compare the accuracy of approximate, planar-triangle-based discretizations and exact, curved-element-based simulations of surface-generalized-Born (sGB), surface-continuum van der Waals (scvdW), and boundary-element method (BEM) electrostatics problems. Results demonstrate that continuum electrostatic calculations with BEM using curved elements, piecewise-constant basis functions, and centroid collocation are nearly ten times more accurate than planar-triangle BEM for basis sets of comparable size. The sGB and scvdW calculations give exceptional accuracy even for the coarsest obtainable discretized surfaces. The extra accuracy is attributed to the exact representation of the solute–solvent interface; in contrast, commonly used planar-triangle discretizations can only offer improved approximations with increasing discretization and associated increases in computational resources. The results clearly demonstrate that our methods for approximate integration on an exact geometry are far more accurate than exact integration on an approximate geometry. A MATLAB implementation of the presented integration methods and sample data files containing curved-element discretizations of several small molecules are available online at http://web.mit.edu/tidor. PMID:17627358

  15. Testing the discrimination and detection limits of WorldView-2 imagery on a challenging invasive plant target

    NASA Astrophysics Data System (ADS)

    Robinson, T. P.; Wardell-Johnson, G. W.; Pracilio, G.; Brown, C.; Corner, R.; van Klinken, R. D.

    2016-02-01

    Invasive plants pose significant threats to biodiversity and ecosystem function globally, leading to costly monitoring and management effort. While remote sensing promises cost-effective, robust and repeatable monitoring tools to support intervention, it has been largely restricted to airborne platforms that have higher spatial and spectral resolutions, but which lack the coverage and versatility of satellite-based platforms. This study tests the ability of the WorldView-2 (WV2) eight-band satellite sensor for detecting the invasive shrub mesquite (Prosopis spp.) in the north-west Pilbara region of Australia. Detectability was challenged by the target taxa being largely defoliated by a leaf-tying biological control agent (Gelechiidae: Evippe sp. #1) and the presence of other shrubs and trees. Variable importance in projection (VIP) scores identified the bands offering the greatest capacity for discrimination as those covering the near-infrared, red, and red-edge wavelengths. Wavelengths between 400 nm and 630 nm (coastal blue, blue, green, yellow) were not useful for species-level discrimination in this case. Classification accuracy was tested on three band sets (simulated standard multispectral, all bands, and bands with VIP scores ≥1). Overall accuracies were comparable amongst all band-sets (Kappa = 0.71-0.77). However, mesquite omission rates were unacceptably high (21.3%) when using all eight bands relative to the simulated standard multispectral band-set (9.5%) and the band-set informed by VIP scores (11.9%). An incremental cover evaluation on the latter identified most omissions to be for objects <16 m². Mesquite omissions reduced to 2.6% and overall accuracy significantly improved (Kappa = 0.88) when these objects were left out of the confusion matrix calculations. Very high mapping accuracy for objects >16 m² allows application for mapping mesquite shrubs and coalesced stands, the former not previously possible even with 3 m resolution hyperspectral imagery. WV2 imagery offers excellent portability potential for detecting other species where spectral/spatial resolution or coverage has been an impediment. New generation satellite sensors are removing barriers previously preventing widespread adoption of remote sensing technologies in natural resource management.
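
    The VIP-based band selection lends itself to a compact sketch; the formula below is the standard VIP definition, applied to invented toy reflectances (not the WV2 data):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      def vip_scores(pls):
          """Variable importance in projection for a fitted PLSRegression;
          bands with VIP >= 1 are conventionally retained as discriminating."""
          W = pls.x_weights_            # (n_features, n_components)
          T = pls.x_scores_             # (n_samples, n_components)
          Q = pls.y_loadings_           # (n_targets, n_components)
          p = W.shape[0]
          ss = (T ** 2).sum(axis=0) * (Q ** 2).sum(axis=0)  # y-variance per component
          w2 = (W / np.linalg.norm(W, axis=0)) ** 2
          return np.sqrt(p * (w2 @ ss) / ss.sum())

      rng = np.random.default_rng(0)
      X = rng.random((200, 8))                       # 8 band reflectances (toy)
      y = (X[:, 5] + X[:, 6] > 1.0).astype(float)    # toy mesquite indicator
      pls = PLSRegression(n_components=2).fit(X, y)
      print(vip_scores(pls))                         # bands 6 and 7 score highest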

  16. Multiscale sagebrush rangeland habitat modeling in southwest Wyoming

    USGS Publications Warehouse

    Homer, Collin G.; Aldridge, Cameron L.; Meyer, Debra K.; Coan, Michael J.; Bowen, Zachary H.

    2009-01-01

    Sagebrush-steppe ecosystems in North America have experienced dramatic elimination and degradation since European settlement. As a result, sagebrush-steppe dependent species have experienced drastic range contractions and population declines. Coordinated ecosystem-wide research, integrated with monitoring and management activities, would improve the ability to maintain existing sagebrush habitats. However, current data only identify resource availability locally, with rigorous spatial tools and models that accurately model and map sagebrush habitats over large areas still unavailable. Here we report on an effort to produce a rigorous large-area sagebrush-habitat classification and inventory with statistically validated products and estimates of precision in the State of Wyoming. This research employs a combination of significant new tools, including (1) modeling sagebrush rangeland as a series of independent continuous field components that can be combined and customized by any user at multiple spatial scales; (2) collecting ground-measured plot data on 2.4-meter imagery in the same season the satellite imagery is acquired; (3) effective modeling of ground-measured data on 2.4-meter imagery to maximize subsequent extrapolation; (4) acquiring multiple seasons (spring, summer, and fall) of an additional two spatial scales of imagery (30 meter and 56 meter) for optimal large-area modeling; (5) using regression tree classification technology that optimizes data mining of multiple image dates, ratios, and bands with ancillary data to extrapolate ground training data to coarser resolution sensors; and (6) employing rigorous accuracy assessment of model predictions to enable users to understand the inherent uncertainties. First-phase results modeled eight rangeland components (four primary targets and four secondary targets) as continuous field predictions. The primary targets included percent bare ground, percent herbaceousness, percent shrub, and percent litter. The four secondary targets included percent sagebrush (Artemisia spp.), percent big sagebrush (Artemisia tridentata), percent Wyoming sagebrush (Artemisia tridentata wyomingensis), and sagebrush height (centimeters). Results were validated by an independent accuracy assessment with root mean square error (RMSE) values ranging from 6.38 percent for bare ground to 2.99 percent for sagebrush at the QuickBird scale and RMSE values ranging from 12.07 percent for bare ground to 6.34 percent for sagebrush at the full Landsat scale. Subsequent project phases are now in progress, with plans to deliver products that improve accuracies of existing components, model new components, complete models over larger areas, track changes over time (from 1988 to 2007), and ultimately model wildlife population trends against these changes. We believe these results offer significant improvement in sagebrush rangeland quantification at multiple scales and offer users products that have been rigorously validated.
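
    A minimal sketch of the extrapolation-and-validation pattern described above, with invented data and scikit-learn's random forest standing in for the regression-tree technology used in the study:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import mean_squared_error
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      # Toy stand-ins: predictors are multi-date bands, ratios and ancillary
      # data; the response is a ground-measured continuous component such as
      # percent bare ground.
      X = rng.random((1000, 12))
      y = 100 * rng.random(1000)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

      # Independent accuracy assessment, reported as RMSE in percent cover.
      rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
      print(f"RMSE: {rmse:.2f} percent")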

  17. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    NASA Astrophysics Data System (ADS)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream measurements.
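
    A toy sketch of the hierarchical (partial-pooling) idea: region-level coefficients drawn from a common hyperprior, written with the PyMC library purely as an example of the newer generation of Bayesian tools (the SPARROW structure and data are not reproduced):

      import numpy as np
      import pymc as pm

      rng = np.random.default_rng(2)
      n_regions, n_sites = 4, 200
      region = rng.integers(0, n_regions, n_sites)   # hydrological region index
      x = rng.normal(size=n_sites)                   # e.g., log drainage area
      y = 1.0 + (0.5 + 0.1 * region) * x + rng.normal(0, 0.3, n_sites)

      with pm.Model():
          # Regional slopes share a common distribution, so sparsely gauged
          # regions borrow strength from the others.
          mu_b = pm.Normal("mu_b", 0.0, 1.0)
          sigma_b = pm.HalfNormal("sigma_b", 1.0)
          b = pm.Normal("b", mu_b, sigma_b, shape=n_regions)
          a = pm.Normal("a", 0.0, 5.0)
          sigma = pm.HalfNormal("sigma", 1.0)        # residual (model) error
          pm.Normal("obs", a + b[region] * x, sigma, observed=y)
          idata = pm.sample(1000, tune=1000, chains=2)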

  18. Physical activity discrimination improvement using accelerometers and wireless sensor network localization - biomed 2013.

    PubMed

    Bashford, Gregory R; Burnfield, Judith M; Perez, Lance C

    2013-01-01

    Automating documentation of physical activity data (e.g., duration and speed of walking or propelling a wheelchair) into the electronic medical record (EMR) offers promise for improving efficiency of documentation and understanding of best practices in the rehabilitation and home health settings. Commercially available devices which could be used to automate documentation of physical activities are either cumbersome to wear or lack the specificity required to differentiate activities. We have designed a novel system to differentiate and quantify physical activities, using inexpensive accelerometer-based biomechanical data technology and wireless sensor networks, a technology combination that has not been used in a rehabilitation setting to date. As a first step, a feasibility study was performed where 14 healthy young adults (mean age = 22.6 ± 2.5 years, mean height = 173 ± 10.0 cm, mean mass = 70.7 ± 11.3 kg) carried out eight different activities while wearing a biaxial accelerometer sensor. Activities were performed at each participant’s self-selected pace during a single testing session in a controlled environment. Linear discriminant analysis was performed by extracting spectral parameters from the subjects’ accelerometer patterns. It is shown that physical activity classification alone results in an average accuracy of 49.5%, but when combined with rule-based constraints using a wireless sensor network with localization capabilities in an in silico simulated room, accuracy improves to 99.3%. When fully implemented, our technology package is expected to improve goal setting, treatment interventions and patient outcomes by enhancing clinicians’ understanding of patients’ physical performance within a day and across the rehabilitation program.
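
    A minimal sketch of the two-stage idea: spectral features feed a linear discriminant classifier, and localization then vetoes activities that are impossible in the sensed room. All names and the masking rule are invented for illustration:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def spectral_features(window, n_bins=8):
          """Summarize one accelerometer window by log power in n_bins
          frequency bands (one plausible form of 'spectral parameters')."""
          psd = np.abs(np.fft.rfft(window)) ** 2
          return np.log1p([band.sum() for band in np.array_split(psd, n_bins)])

      rng = np.random.default_rng(3)
      windows = rng.normal(size=(400, 256))     # toy accelerometer windows
      labels = rng.integers(0, 8, 400)          # eight activities

      X = np.array([spectral_features(w) for w in windows])
      clf = LinearDiscriminantAnalysis().fit(X, labels)

      # Rule-based constraint from the localization network: in this room,
      # say, only activities 0 and 1 are possible; mask and renormalize.
      proba = clf.predict_proba(X)
      allowed = np.zeros(8, dtype=bool)
      allowed[[0, 1]] = True                    # hypothetical room rule
      masked = proba * allowed
      masked /= masked.sum(axis=1, keepdims=True)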

  19. A Hybrid Remote Sensing Approach for Detecting the Florida Red Tide

    NASA Astrophysics Data System (ADS)

    Carvalho, G. A.; Minnett, P. J.; Banzon, V.; Baringer, W.

    2008-12-01

    Harmful algal blooms (HABs) have caused major worldwide economic losses commonly linked with health problems for humans and wildlife. In the Eastern Gulf of Mexico the toxic marine dinoflagellate Karenia brevis is responsible for nearly annual, massive red tides causing fish kills, shellfish poisoning, and acute respiratory irritation in humans: the so-called Florida Red Tide. Near real-time satellite measurements could be an effective method for identifying HABs. The use of space-borne data would be a highly desired, low-cost technique offering the remote and accurate detection of K. brevis blooms over the West Florida Shelf, bringing tremendous societal benefits to the general public, scientific community, resource managers and medical health practitioners. An extensive in situ database provided by the Florida Fish and Wildlife Conservation Commission's Research Institute was used to examine the long-term accuracy of two satellite-based algorithms at detecting the Florida Red Tide. Using MODIS data from 2002 to 2006, the two algorithms are optimized and their accuracy assessed. It has been found that the sequential application of the algorithms results in improved predictability characteristics, correctly identifying ~80% of the cases (for both sensitivity and specificity, as well as overall accuracy), and exhibiting strong positive (70%) and negative (86%) predictive values.
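
    The reported metrics follow from a standard confusion-matrix calculation once the two algorithms are combined; a sketch with invented flags (one simple reading of "sequential application"):

      import numpy as np

      def detection_metrics(truth, pred):
          """Sensitivity, specificity, predictive values and overall accuracy
          from paired boolean arrays (truth from in situ K. brevis counts)."""
          tp = np.sum(truth & pred)
          tn = np.sum(~truth & ~pred)
          fp = np.sum(~truth & pred)
          fn = np.sum(truth & ~pred)
          return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
                  "ppv": tp / (tp + fp), "npv": tn / (tn + fn),
                  "accuracy": (tp + tn) / truth.size}

      # Sequential application: flag a bloom only where the first algorithm
      # flags it and the second, applied to those pixels, agrees.
      flag_a = np.array([True, True, False, True, False, True])
      flag_b = np.array([True, False, False, True, False, True])
      truth = np.array([True, False, True, True, False, False])
      print(detection_metrics(truth, flag_a & flag_b))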

  20. The importance of atmospheric correction for airborne hyperspectral remote sensing of shallow waters: application to depth estimation

    NASA Astrophysics Data System (ADS)

    Castillo-López, Elena; Dominguez, Jose Antonio; Pereda, Raúl; de Luis, Julio Manuel; Pérez, Ruben; Piña, Felipe

    2017-10-01

    Accurate determination of water depth is indispensable in multiple aspects of civil engineering (dock construction, dikes, submarine outfalls, trench control, etc.). Different applications demand different accuracies, so the type of atmospheric correction most appropriate for depth estimation must be determined. The accuracy of bathymetric information is highly dependent on the atmospheric correction applied to the imagery. The reduction of effects such as glint and cross-track illumination in homogeneous shallow-water areas improves the results of the depth estimations. The aim of this work is to assess the best atmospheric correction method for the estimation of depth in shallow waters, considering that reflectance values cannot be greater than 1.5% because otherwise the bottom would not be seen. This paper addresses the use of hyperspectral imagery for quantitative bathymetric mapping and explores one of the most common problems when attempting to extract depth information in conditions of variable water types and bottom reflectances. The current work assesses the accuracy of some classical bathymetric algorithms (Polcyn-Lyzenga, Philpot, Benny-Dawson, Hamilton, principal component analysis) when four different atmospheric correction methods are applied and water depth is derived. No single atmospheric correction method is valid for all types of coastal waters, but in heterogeneous shallow water the 6S atmospheric correction model offers good results.
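
    For orientation, the Polcyn-Lyzenga family of algorithms mentioned above fits a log-linear relation between depth and deep-water-corrected band reflectances; a sketch with invented calibration soundings:

      import numpy as np

      def lyzenga_depth_model(reflectance, depths, deep_water):
          """Fit z ~ a0 + sum_i a_i * ln(R_i - R_deep_i), the classic
          Polcyn-Lyzenga log-linear bathymetry model, by least squares."""
          Xlog = np.log(reflectance - deep_water)   # (n_points, n_bands)
          A = np.column_stack([np.ones(len(Xlog)), Xlog])
          coef, *_ = np.linalg.lstsq(A, depths, rcond=None)
          return coef

      rng = np.random.default_rng(4)
      deep = np.array([0.002, 0.001])               # deep-water signal per band
      R = deep + rng.uniform(0.005, 0.02, (50, 2))  # corrected reflectances (toy)
      z = rng.uniform(1.0, 10.0, 50)                # sounding depths in meters
      print(lyzenga_depth_model(R, z, deep))        # [a0, a1, a2]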

  1. Multi-scale graph-cut algorithm for efficient water-fat separation.

    PubMed

    Berglund, Johan; Skorpil, Mikael

    2017-09-01

    To improve the accuracy and robustness to noise of water-fat separation, the multi-scale and graph-cut based approaches to B0 correction are unified. A previously proposed water-fat separation algorithm that corrects for B0 field inhomogeneity in 3D by a single quadratic pseudo-Boolean optimization (QPBO) graph cut was incorporated into a multi-scale framework, where field map solutions are propagated from coarse to fine scales for voxels that are not resolved by the graph cut. The accuracy of the single-scale and multi-scale QPBO algorithms was evaluated against benchmark reference datasets. The robustness to noise was evaluated by adding noise to the input data prior to water-fat separation. Both algorithms achieved the highest accuracy when compared with seven previously published methods, while computation times were acceptable for implementation in clinical routine. The multi-scale algorithm was more robust to noise than the single-scale algorithm, while causing only a small increase (+10%) in reconstruction time. The proposed 3D multi-scale QPBO algorithm offers accurate water-fat separation, robustness to noise, and fast reconstruction. The software implementation is freely available to the research community. Magn Reson Med 78:941-949, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  2. Perceptual experience and posttest improvements in perceptual accuracy and consistency.

    PubMed

    Wagman, Jeffrey B; McBride, Dawn M; Trefzger, Amanda J

    2008-08-01

    Two experiments investigated the relationship between perceptual experience (during practice) and posttest improvements in perceptual accuracy and consistency. Experiment 1 investigated the potential relationship between how often knowledge of results (KR) is provided during a practice session and posttest improvements in perceptual accuracy. Experiment 2 investigated the potential relationship between how often practice (PR) is provided during a practice session and posttest improvements in perceptual consistency. The results of both experiments are consistent with previous findings that perceptual accuracy improves only when practice includes KR and that perceptual consistency improves regardless of whether practice includes KR. In addition, the results showed that although there is a relationship between how often KR is provided during a practice session and posttest improvements in perceptual accuracy, there is no relationship between how often PR is provided during a practice session and posttest improvements in consistency.

  3. Interoceptive Accuracy in Youth with Tic Disorders: Exploring Links with Premonitory Urge, Anxiety and Quality of Life.

    PubMed

    Pile, Victoria; Lau, Jennifer Y F; Topor, Marta; Hedderly, Tammy; Robinson, Sally

    2018-05-18

    Aberrant interoceptive accuracy could contribute to the co-occurrence of anxiety and premonitory urge in chronic tic disorders (CTD). If it can be manipulated through intervention, it would offer a transdiagnostic treatment target for tics and anxiety. Interoceptive accuracy was first assessed consistent with previous protocols and then re-assessed following an instruction attempting to experimentally enhance awareness. The CTD group demonstrated lower interoceptive accuracy than controls but, importantly, this group difference was no longer significant following instruction. In the CTD group, better interoceptive accuracy was associated with higher anxiety and lower quality of life, but not with premonitory urge. Aberrant interoceptive accuracy may represent an underlying trait in CTD that can be manipulated, and relates to anxiety and quality of life.
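
    In such protocols, interoceptive accuracy is commonly the Schandry heartbeat-counting index; a minimal sketch assuming that standard index (the abstract does not spell out the exact formula used):

      import numpy as np

      def interoceptive_accuracy(recorded, counted):
          """Heartbeat-counting accuracy: 1 minus the mean normalized counting
          error across intervals (1.0 = perfect interoceptive accuracy)."""
          recorded = np.asarray(recorded, dtype=float)
          counted = np.asarray(counted, dtype=float)
          return 1.0 - np.mean(np.abs(recorded - counted) / recorded)

      # Example: three counting intervals of increasing length.
      print(interoceptive_accuracy([28, 38, 50], [24, 33, 41]))   # ~0.85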

  4. Research on signal processing method for total organic carbon of water quality online monitor

    NASA Astrophysics Data System (ADS)

    Ma, R.; Xie, Z. X.; Chu, D. Z.; Zhang, S. W.; Cao, X.; Wu, N.

    2017-08-01

    At present, there is no rapid, stable and effective approach to total organic carbon (TOC) measurement in the field of marine environmental online monitoring. Therefore, this paper proposes a chemiluminescence signal processing method for an online TOC monitor. The weak optical signal detected by the photomultiplier tube is enhanced and converted by a series of signal processing modules: a phase-locked amplifier module, a fourth-order band-pass filter module and an AD conversion module. Long-term comparison tests show that, relative to the traditional method and with sufficient accuracy preserved, this chemiluminescence signal processing method offers greatly improved measuring speed and high practicability for online monitoring.
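
    The phase-locked (lock-in) stage is the heart of the chain; a toy software analogue, with invented frequencies and noise levels, shows how a weak periodic signal is pulled out of much larger noise:

      import numpy as np
      from scipy.signal import butter, sosfiltfilt

      def lock_in(signal, f_ref, fs, cutoff_hz=2.0):
          """Digital phase-sensitive detection: demodulate at the reference
          frequency, low-pass both quadratures, and return the amplitude."""
          t = np.arange(signal.size) / fs
          sos = butter(4, cutoff_hz, fs=fs, output="sos")
          i = sosfiltfilt(sos, signal * np.cos(2 * np.pi * f_ref * t))
          q = sosfiltfilt(sos, signal * np.sin(2 * np.pi * f_ref * t))
          mag = 2 * np.hypot(i, q)
          return mag[mag.size // 4:-mag.size // 4].mean()  # skip filter edges

      fs, f_ref = 10000.0, 137.0
      t = np.arange(0, 2.0, 1 / fs)
      weak = 0.01 * np.sin(2 * np.pi * f_ref * t)          # weak PMT signal
      noisy = weak + np.random.normal(0.0, 0.05, t.size)   # buried in noise
      print(lock_in(noisy, f_ref, fs))                     # recovers ~0.01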

  5. Diagnostic Dilemmas and Cultural Diversity in Emergency Rooms

    PubMed Central

    Weaver, Charlotte; Sklar, David

    1980-01-01

    Language and cultural beliefs play an extremely important role in the interaction between patients from diverse cultural groups and physicians. Especially in emergency rooms, there are many dangers in missed communications. A patient from a foreign culture, especially one who does not speak English, often expresses symptoms in ways that are unfamiliar to many American physicians. Specific areas of cultural vulnerability can be identified for the major ethnic groups in the United States as they interact with the scientific medical system. A short review of folk medical beliefs and recommendations for improving diagnostic accuracy and treatment may assist emergency room staffs in offering care that is culturally acceptable to patients of diverse ethnic backgrounds. PMID:7347053

  6. Real-time robot deliberation by compilation and monitoring of anytime algorithms

    NASA Technical Reports Server (NTRS)

    Zilberstein, Shlomo

    1994-01-01

    Anytime algorithms are algorithms whose quality of results improves gradually as computation time increases. Certainty, accuracy, and specificity are metrics useful in anytime algorithm construction. It is widely accepted that a successful robotic system must trade off between decision quality and the computational resources used to produce it. Anytime algorithms were designed to offer such a trade-off. A model of the compilation and monitoring mechanisms needed to build robots that can efficiently control their deliberation time is presented. This approach simplifies the design and implementation of complex intelligent robots, mechanizes the composition and monitoring processes, and provides independent real-time robotic systems that automatically adjust resource allocation to yield optimum performance.
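
    The defining contract of an anytime algorithm is easy to demonstrate; a toy Python sketch in which result quality improves with the allotted deliberation time and a monitor could interrupt at any point:

      import random
      import time

      def anytime_pi(budget_s):
          """Monte Carlo estimate of pi that improves (in expectation) with
          computation time; interruptible at any moment with a usable result."""
          inside = total = 0
          estimate = 0.0
          deadline = time.monotonic() + budget_s
          while time.monotonic() < deadline:
              x, y = random.random(), random.random()
              inside += x * x + y * y <= 1.0
              total += 1
              estimate = 4.0 * inside / total
          return estimate, total

      for budget in (0.01, 0.1, 1.0):    # more deliberation, better quality
          print(budget, anytime_pi(budget))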

  7. Simplified Analysis of Pulse Detonation Rocket Engine Blowdown Gasdynamics and Performance

    NASA Technical Reports Server (NTRS)

    Morris, C. I.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Pulse detonation rocket engines (PDREs) offer potential performance improvements over conventional designs, but represent a challenging modeling task. A simplified model for an idealized, straight-tube, single-shot PDRE blowdown process and thrust determination is described and implemented. In order to form an assessment of the accuracy of the model, the flowfield time history is compared to experimental data from Stanford University. Parametric studies of the effect of mixture stoichiometry, initial fill temperature, and blowdown pressure ratio on the performance of a PDRE are performed using the model. PDRE performance is also compared with a conventional steady-state rocket engine over a range of pressure ratios using similar gasdynamic assumptions.

  8. Evaluation of constraint stabilization procedures for multibody dynamical systems

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Chiou, J. C.

    1987-01-01

    Comparative numerical studies of four constraint treatment techniques for the simulation of general multibody dynamic systems are presented, and results are presented for the example of a classical crank mechanism and for a simplified version of the seven-link manipulator deployment problem. The staggered stabilization technique (Park, 1986) is found to yield improved accuracy and robustness over Baumgarte's (1972) technique, the singular decomposition technique (Walton and Steeves, 1969), and the penalty technique (Lotstedt, 1979). Furthermore, the staggered stabilization technique offers software modularity, and the only data each solution module needs to exchange with the other is a set of vectors plus a common module to generate the gradient matrix of the constraints, B.
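
    Of the techniques compared above, Baumgarte's is the simplest to sketch: the constraint Phi = 0 is replaced by the stabilized form Phi'' + 2*alpha*Phi' + beta^2*Phi = 0. A toy planar pendulum treated as a constrained particle (invented parameters; the staggered technique itself is not reproduced here):

      import numpy as np
      from scipy.integrate import solve_ivp

      def pendulum_rhs(t, s, m=1.0, L=1.0, g=9.81, alpha=5.0, beta=5.0):
          """Constrained particle with Phi = r.r - L**2 and Baumgarte terms."""
          r, v = s[:2], s[2:]
          f = np.array([0.0, -m * g])               # gravity
          phi = r @ r - L ** 2
          dphi = 2.0 * r @ v
          # Constraint force is lam * grad(Phi) = 2 * lam * r; solve the
          # stabilized acceleration equation for lam.
          lam = -(2 * v @ v + 2 * r @ f / m + 2 * alpha * dphi
                  + beta ** 2 * phi) / (4 * r @ r / m)
          rdd = f / m + (2 * lam / m) * r
          return np.concatenate([v, rdd])

      s0 = np.array([1.0, 0.0, 0.0, 0.0])           # horizontal, at rest
      sol = solve_ivp(pendulum_rhs, (0.0, 10.0), s0, rtol=1e-8)
      drift = np.abs(sol.y[0] ** 2 + sol.y[1] ** 2 - 1.0).max()
      print("max constraint drift:", drift)         # stays small when stabilized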

  9. Seeing It All: Evaluating Supervised Machine Learning Methods for the Classification of Diverse Otariid Behaviours

    PubMed Central

    Slip, David J.; Hocking, David P.; Harcourt, Robert G.

    2016-01-01

    Constructing activity budgets for marine animals when they are at sea and cannot be directly observed is challenging, but recent advances in bio-logging technology offer solutions to this problem. Accelerometers can potentially identify a wide range of behaviours for animals based on unique patterns of acceleration. However, when analysing data derived from accelerometers, there are many statistical techniques available which when applied to different data sets produce different classification accuracies. We investigated a selection of supervised machine learning methods for interpreting behavioural data from captive otariids (fur seals and sea lions). We conducted controlled experiments with 12 seals, where their behaviours were filmed while they were wearing 3-axis accelerometers. From video we identified 26 behaviours that could be grouped into one of four categories (foraging, resting, travelling and grooming) representing key behaviour states for wild seals. We used data from 10 seals to train four predictive classification models: stochastic gradient boosting (GBM), random forests, support vector machine using four different kernels and a baseline model: penalised logistic regression. We then took the best parameters from each model and cross-validated the results on the two seals unseen so far. We also investigated the influence of feature statistics (describing some characteristic of the seal), testing the models both with and without these. Cross-validation accuracies were lower than training accuracy, but the SVM with a polynomial kernel was still able to classify seal behaviour with high accuracy (>70%). Adding feature statistics improved accuracies across all models tested. Most categories of behaviour (resting, grooming and feeding) were predicted with reasonable accuracy (52-81%) by the SVM, while travelling was poorly categorised (31-41%). These results show that model selection is important when classifying behaviour and that by using animal characteristics we can strengthen the overall accuracy. PMID:28002450
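
    The model-comparison loop described above maps naturally onto a few lines of scikit-learn; toy features and labels stand in for the accelerometer windows and the seals' feature statistics:

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)
      X_accel = rng.normal(size=(600, 20))   # acceleration-derived features
      X_seal = rng.normal(size=(600, 2))     # per-animal "feature statistics"
      y = rng.integers(0, 4, 600)            # foraging/resting/travelling/grooming

      models = {
          "GBM": GradientBoostingClassifier(),
          "RF": RandomForestClassifier(),
          "SVM-poly": make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3)),
      }
      for name, model in models.items():
          for X in (X_accel, np.hstack([X_accel, X_seal])):  # without/with stats
              print(name, X.shape[1], cross_val_score(model, X, y, cv=5).mean())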

  10. Comparison of Deep Brain Stimulation Lead Targeting Accuracy and Procedure Duration between 1.5- and 3-Tesla Interventional Magnetic Resonance Imaging Systems: An Initial 12-Month Experience.

    PubMed

    Southwell, Derek G; Narvid, Jared A; Martin, Alastair J; Qasim, Salman E; Starr, Philip A; Larson, Paul S

    2016-01-01

    Interventional magnetic resonance imaging (iMRI) allows deep brain stimulator lead placement under general anesthesia. While the accuracy of lead targeting has been described for iMRI systems utilizing 1.5-tesla magnets, a similar assessment of 3-tesla iMRI procedures has not been performed. To compare targeting accuracy, the number of lead targeting attempts, and surgical duration between procedures performed on 1.5- and 3-tesla iMRI systems. Radial targeting error, the number of targeting attempts, and procedure duration were compared between surgeries performed on 1.5- and 3-tesla iMRI systems (SmartFrame and ClearPoint systems). During the first year of operation of each system, 26 consecutive leads were implanted using the 1.5-tesla system, and 23 consecutive leads were implanted using the 3-tesla system. There was no significant difference in radial error (Mann-Whitney test, p = 0.26), number of lead placements that required multiple targeting attempts (Fisher's exact test, p = 0.59), or bilateral procedure durations between surgeries performed with the two systems (p = 0.15). Accurate DBS lead targeting can be achieved with iMRI systems utilizing either 1.5- or 3-tesla magnets. The use of a 3-tesla magnet, however, offers improved visualization of the target structures and allows comparable accuracy and efficiency of placement at the selected targets. © 2016 S. Karger AG, Basel.
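
    The two statistical comparisons reported above reduce to standard tests; a sketch with invented errors and counts (not the study's data):

      import numpy as np
      from scipy.stats import fisher_exact, mannwhitneyu

      rng = np.random.default_rng(6)
      err_15t = rng.gamma(2.0, 0.4, 26)    # toy radial errors (mm), 1.5 T system
      err_30t = rng.gamma(2.0, 0.4, 23)    # toy radial errors (mm), 3 T system
      print(mannwhitneyu(err_15t, err_30t))          # radial error comparison

      # Leads needing multiple targeting attempts: [multiple, single] per system.
      print(fisher_exact([[4, 22], [3, 20]]))        # hypothetical counts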

  11. Corner-corrected diagonal-norm summation-by-parts operators for the first derivative with increased order of accuracy

    NASA Astrophysics Data System (ADS)

    Del Rey Fernández, David C.; Boom, Pieter D.; Zingg, David W.

    2017-02-01

    Combined with simultaneous approximation terms, summation-by-parts (SBP) operators offer a versatile and efficient methodology that leads to consistent, conservative, and provably stable discretizations. However, diagonal-norm operators with a repeating interior-point operator that have thus far been constructed suffer from a loss of accuracy. While on the interior, these operators are of degree 2p, at a number of nodes near the boundaries, they are of degree p, and therefore of global degree p, meaning the highest degree monomial for which the operators are exact at all nodes. This implies that for hyperbolic problems and operators of degree greater than unity they lead to solutions with a global order of accuracy lower than the degree of the interior-point operator. In this paper, we develop a procedure to construct diagonal-norm first-derivative SBP operators that are of degree 2p at all nodes and therefore can lead to solutions of hyperbolic problems of order 2p + 1. This is accomplished by adding nonzero entries in the upper-right and lower-left corners of SBP operator matrices with a repeating interior-point operator. This modification necessitates treating these new operators as elements, where mesh refinement is accomplished by increasing the number of elements in the mesh rather than increasing the number of nodes. The significant improvements in accuracy of this new family, for the same repeating interior-point operator, are demonstrated in the context of the linear convection equation.
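
    The accuracy limitation described above is easy to see in the classical diagonal-norm operator; a sketch constructing the p = 1 operator D = H^{-1} Q and checking the SBP property Q + Q^T = B (the paper's corner-corrected operators are not reproduced here):

      import numpy as np

      def sbp_d1(n, h):
          """Classical diagonal-norm SBP first derivative: second-order
          interior stencil with first-order boundary closures (p = 1)."""
          H = h * np.eye(n)
          H[0, 0] = H[-1, -1] = h / 2
          Q = 0.5 * (np.eye(n, k=1) - np.eye(n, k=-1))
          Q[0, 0], Q[-1, -1] = -0.5, 0.5
          return np.linalg.solve(H, Q), H, Q

      n, h = 11, 0.1
      D, H, Q = sbp_d1(n, h)
      B = np.zeros((n, n))
      B[0, 0], B[-1, -1] = -1.0, 1.0
      assert np.allclose(Q + Q.T, B)       # the summation-by-parts property
      x = np.linspace(0.0, 1.0, n)
      print(np.abs(D @ x - 1.0).max())     # exact on degree-1 monomials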

  12. Improved Motor-Timing: Effects of Synchronized Metro-Nome Training on Golf Shot Accuracy

    PubMed Central

    Sommer, Marius; Rönnqvist, Louise

    2009-01-01

    This study investigates the effect of synchronized metronome training (SMT) on motor timing and how this training might affect golf shot accuracy. Twenty-six experienced male golfers participated (mean age 27 years; mean golf handicap 12.6) in this study. Pre- and post-test investigations of golf shots made with three different clubs were conducted by use of a golf simulator. The golfers were randomized into two groups: a SMT group and a Control group. After the pre-test, the golfers in the SMT group completed a 4-week SMT program designed to improve their motor timing; the golfers in the Control group merely trained their golf swings during the same time period. No differences between the two groups were found from the pre-test outcomes, either for motor timing scores or for golf shot accuracy. However, the post-test results after the 4-week SMT showed evident motor timing improvements. Additionally, significant improvements in golf shot accuracy were found for the SMT group, with less variability in their performance. No such improvements were found for the golfers in the Control group. As with previous studies that used a SMT program, this study’s results provide further evidence that motor timing can be improved by SMT and that such timing improvement also improves golf accuracy. Key points: This study investigates the effect of synchronized metronome training (SMT) on motor timing and how this training might affect golf shot accuracy. A randomized control group design was used. The 4-week SMT intervention showed significant improvements in motor timing and golf shot accuracy, and led to less variability. We conclude that this study’s results provide further evidence that motor timing can be improved by SMT training and that such timing improvement also improves golf accuracy. PMID:24149608

  13. Towards a ternary NIRS-BCI: single-trial classification of verbal fluency task, Stroop task and unconstrained rest

    NASA Astrophysics Data System (ADS)

    Schudlo, Larissa C.; Chau, Tom

    2015-12-01

    Objective. The majority of near-infrared spectroscopy (NIRS) brain-computer interface (BCI) studies have investigated binary classification problems. Limited work has considered differentiation of more than two mental states, or multi-class differentiation of higher-level cognitive tasks using measurements outside of the anterior prefrontal cortex. Improvements in accuracies are needed to deliver effective communication with a multi-class NIRS system. We investigated the feasibility of a ternary NIRS-BCI that supports mental states corresponding to verbal fluency task (VFT) performance, Stroop task performance, and unconstrained rest using prefrontal and parietal measurements. Approach. Prefrontal and parietal NIRS signals were acquired from 11 able-bodied adults during rest and performance of the VFT or Stroop task. Classification was performed offline using bagging with a linear discriminant base classifier trained on a 10-dimensional feature set. Main results. VFT, Stroop task and rest were classified at an average accuracy of 71.7% ± 7.9%. The ternary classification system provided a statistically significant improvement in information transfer rate relative to a binary system controlled by either mental task (0.87 ± 0.35 bits/min versus 0.73 ± 0.24 bits/min). Significance. These results suggest that effective communication can be achieved with a ternary NIRS-BCI that supports VFT, Stroop task and rest via measurements from the frontal and parietal cortices. Further development of such a system is warranted. Accurate ternary classification can enhance communication rates offered by NIRS-BCIs, improving the practicality of this technology.
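
    The information-transfer-rate comparison uses the standard Wolpaw formula; a sketch (the decision rate below is invented for illustration, not the study's trial timing):

      import numpy as np

      def wolpaw_itr(n_classes, accuracy, decisions_per_min):
          """Wolpaw ITR in bits/min for an n-class system at a given accuracy."""
          p, n = accuracy, n_classes
          bits = np.log2(n)
          if 0.0 < p < 1.0:
              bits += p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
          return bits * decisions_per_min

      print(wolpaw_itr(3, 0.717, 2.0))   # ~0.88 bits/min (ternary)
      print(wolpaw_itr(2, 0.80, 2.0))    # ~0.56 bits/min (binary)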

  14. Deep learning with coherent nanophotonic circuits

    NASA Astrophysics Data System (ADS)

    Shen, Yichen; Harris, Nicholas C.; Skirlo, Scott; Prabhu, Mihika; Baehr-Jones, Tom; Hochberg, Michael; Sun, Xin; Zhao, Shijie; Larochelle, Hugo; Englund, Dirk; Soljačić, Marin

    2017-07-01

    Artificial neural networks are computational network models inspired by signal processing in the brain. These models have dramatically improved performance for many machine-learning tasks, including speech and image recognition. However, today's computing hardware is inefficient at implementing neural networks, in large part because much of it was designed for von Neumann computing schemes. Significant effort has been made towards developing electronic architectures tuned to implement artificial neural networks that exhibit improved computational speed and accuracy. Here, we propose a new architecture for a fully optical neural network that, in principle, could offer an enhancement in computational speed and power efficiency over state-of-the-art electronics for conventional inference tasks. We experimentally demonstrate the essential part of the concept using a programmable nanophotonic processor featuring a cascaded array of 56 programmable Mach-Zehnder interferometers in a silicon photonic integrated circuit and show its utility for vowel recognition.
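
    The core mathematical idea, that an arbitrary weight matrix factors into unitaries (MZI meshes) and a diagonal scaling, is a one-line SVD; a toy numerical check:

      import numpy as np

      rng = np.random.default_rng(7)
      W = rng.normal(size=(4, 4))        # a trained fully connected layer (toy)

      # W = U @ diag(s) @ Vh: U and Vh map onto Mach-Zehnder interferometer
      # meshes (unitaries), diag(s) onto optical attenuation/amplification.
      U, s, Vh = np.linalg.svd(W)
      x = rng.normal(size=4)             # input amplitudes (toy)
      assert np.allclose(W @ x, U @ (s * (Vh @ x)))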

  15. Particle size reduction to the nanometer range: a promising approach to improve buccal absorption of poorly water-soluble drugs

    PubMed Central

    Rao, Shasha; Song, Yunmei; Peddie, Frank; Evans, Allan M

    2011-01-01

    Poorly water-soluble drugs, such as phenylephrine, offer challenging problems for buccal drug delivery. In order to overcome these problems, particle size reduction (to the nanometer range) and cyclodextrin complexation were investigated for permeability enhancement. The apparent solubility in water and the buccal permeation of the original phenylephrine coarse powder, a phenylephrine–cyclodextrin complex and phenylephrine nanosuspensions were characterized. The particle size and particle surface properties of phenylephrine nanosuspensions were used to optimize the size reduction process. The optimized phenylephrine nanosuspension was then freeze dried and incorporated into a multi-layered buccal patch, consisting of a small tablet adhered to a mucoadhesive film, yielding a phenylephrine buccal product with good dosage accuracy and improved mucosal permeability. The design of the buccal patch allows for drug incorporation without the need to change the mucoadhesive component, and is potentially suited to a range of poorly water-soluble compounds. PMID:21753876

  17. Improvement of the Assignment Methodology of the Approach Embankment Design to Highway Structures in Difficult Conditions

    NASA Astrophysics Data System (ADS)

    Chistyy, Y.; Kuzakhmetova, E.; Fazilova, Z.; Tsukanova, O.

    2018-03-01

    Design issues at the junction of bridges and overpasses with the approach embankment are studied. The reasons for the formation of deformations in the road structure are indicated. Measures to ensure the stability and to accelerate the settlement of a weak subgrade beneath the approach embankment are listed. The necessity of taking into account the man-made impact of the approach embankment on subgrade behavior is demonstrated. Modern stabilizing agents to improve the properties of the soils used in the embankment and subgrade are suggested. A clarified methodology for determining the active zone of compression in the subgrade under the load from the weight of the embankment is described. As an addition to the existing methodology for establishing the lower bound of the active zone of compression, it is proposed to take into account the accuracy of the soil compressibility evaluation and of the settlement determination.
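
    One conventional way to bound an active zone of compression is the depth where load-spread stress falls below a fraction of geostatic stress; a sketch assuming the common 2:1 spreading rule and a 20% criterion (the paper's clarified methodology is not reproduced):

      def active_zone_depth(q, B, gamma=19.0, ratio=0.2, dz=0.1):
          """Depth (m) where embankment-induced stress, estimated by 2:1 load
          spreading as q*B/(B + z), drops below ratio * gamma * z."""
          z = dz
          while q * B / (B + z) > ratio * gamma * z:
              z += dz
          return z

      # Embankment load q = 100 kPa over width B = 10 m on soil with
      # unit weight 19 kN/m^3: the active zone extends to roughly 12 m.
      print(active_zone_depth(q=100.0, B=10.0))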

  18. An Improved Lattice Boltzmann Model for Non-Newtonian Flows with Applications to Solid-Fluid Interactions in External Flows

    NASA Astrophysics Data System (ADS)

    Adam, Saad; Premnath, Kannan

    2016-11-01

    The fluid mechanics of non-Newtonian fluids, which arise in numerous settings, is characterized by non-linear constitutive models that pose certain unique challenges for computational methods. Here, we consider the lattice Boltzmann method (LBM), which offers some computational advantages due to its kinetic basis and its simpler stream-and-collide procedure enabling efficient simulations. However, further improvements are necessary to improve its numerical stability and accuracy for computations involving broader parameter ranges. Hence, in this study, we extend the cascaded LBM formulation by modifying its moment equilibria and relaxation parameters to handle a variety of non-Newtonian constitutive equations, including power-law and Bingham fluids, with improved stability. In addition, we include corrections to the moment equilibria to obtain an inertial-frame-invariant scheme without cubic-velocity defects. After performing a validation study for various benchmark flows, we study the physics of non-Newtonian flow over pairs of circular and square cylinders in a tandem arrangement, especially the wake structure interactions and their effects on the resulting forces on each cylinder, and elucidate the effect of the various characteristic parameters.
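
    In generic LBM practice, a non-Newtonian constitutive law enters through a locally variable relaxation time; a sketch for a power-law fluid (a textbook recipe, not the paper's cascaded-moment formulation):

      import numpy as np

      def power_law_tau(strain_rate_mag, K=0.01, n=0.7, dt=1.0, cs2=1.0 / 3.0):
          """Local relaxation time in lattice units: apparent viscosity
          nu = K * |gamma_dot|**(n - 1), then tau = nu / cs2 + dt / 2."""
          gd = np.maximum(strain_rate_mag, 1e-12)   # guard the power law at rest
          nu = K * gd ** (n - 1.0)
          return nu / cs2 + 0.5 * dt

      # Shear-thinning (n < 1): tau falls as the local strain rate grows.
      print(power_law_tau(np.array([1e-4, 1e-2, 1.0])))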

  19. Robotic Surgery in Gynecology

    PubMed Central

    Bouquet de Joliniere, Jean; Librino, Armando; Dubuisson, Jean-Bernard; Khomsi, Fathi; Ben Ali, Nordine; Fadhlaoui, Anis; Ayoubi, J. M.; Feki, Anis

    2016-01-01

    Minimally invasive surgery (MIS) can be considered the greatest surgical innovation of the past 30 years. It revolutionized surgical practice with well-proven advantages over traditional open surgery: reduced surgical trauma and incision-related complications, such as surgical-site infections, postoperative pain and hernia, reduced hospital stay, and improved cosmetic outcome. Nonetheless, proficiency in MIS can be technically challenging, as conventional laparoscopy is associated with several limitations, such as the reduced depth perception on a two-dimensional (2D) monitor, camera instability, a limited range of motion, and steep learning curves. The surgeon has low force feedback, which allows simple gestures, respect for tissues, and more effective treatment of complications. Since the 1980s, several computer science and robotics projects have been set up to overcome the difficulties encountered with conventional laparoscopy, to augment the surgeon’s skills, achieve accuracy and high precision during complex surgery, and facilitate the widespread adoption of MIS. Surgical instruments are guided by haptic interfaces that replicate and filter hand movements. Robotically assisted technology offers advantages that include improved three-dimensional stereoscopic vision, wristed instruments that improve dexterity, and tremor-canceling software that improves surgical precision. PMID:27200358

  20. Evaluating the accuracy of SHAPE-directed RNA secondary structure predictions

    PubMed Central

    Sükösd, Zsuzsanna; Swenson, M. Shel; Kjems, Jørgen; Heitsch, Christine E.

    2013-01-01

    Recent advances in RNA structure determination include using data from high-throughput probing experiments to improve thermodynamic prediction accuracy. We evaluate the extent and nature of improvements in data-directed predictions for a diverse set of 16S/18S ribosomal sequences using a stochastic model of experimental SHAPE data. The average accuracy for 1000 data-directed predictions always improves over the original minimum free energy (MFE) structure. However, the amount of improvement varies with the sequence, exhibiting a correlation with MFE accuracy. Further analysis of this correlation shows that accurate MFE base pairs are typically preserved in a data-directed prediction, whereas inaccurate ones are not. Thus, the positive predictive value of common base pairs is consistently higher than the directed prediction accuracy. Finally, we confirm sequence dependencies in the directability of thermodynamic predictions and investigate the potential for greater accuracy improvements in the worst performing test sequence. PMID:23325843
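
    The accuracy measures discussed above (prediction accuracy and positive predictive value of base pairs) have compact definitions; a sketch with invented structures, each given as a set of paired indices:

      def basepair_accuracy(predicted, reference):
          """Sensitivity (fraction of true pairs found) and positive predictive
          value (fraction of predicted pairs that are true)."""
          tp = len(predicted & reference)
          return tp / len(reference), tp / len(predicted)

      reference = {(1, 20), (2, 19), (3, 18), (5, 15)}
      predicted = {(1, 20), (2, 19), (6, 14)}
      print(basepair_accuracy(predicted, reference))   # (0.5, 0.667)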

  1. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve the accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
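
    For context, dPCR quantification rests on a Poisson correction of the positive-partition fraction; a minimal sketch of that standard calculation (not the study's protocol):

      import numpy as np

      def dpcr_concentration(n_positive, n_total, partition_volume_ul):
          """Copies per microliter: mean copies per partition is
          lambda = -ln(fraction negative); divide by partition volume."""
          lam = -np.log(1.0 - n_positive / n_total)
          return lam / partition_volume_ul

      # Example: 7,000 of 20,000 partitions positive, 0.85 nL partitions.
      print(dpcr_concentration(7000, 20000, 0.85e-3))   # ~507 copies/uL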

  2. Numerical Simulation of Transitional, Hypersonic Flows using a Hybrid Particle-Continuum Method

    NASA Astrophysics Data System (ADS)

    Verhoff, Ashley Marie

    Analysis of hypersonic flows requires consideration of multiscale phenomena due to the range of flight regimes encountered, from rarefied conditions in the upper atmosphere to fully continuum flow at low altitudes. At transitional Knudsen numbers there are likely to be localized regions of strong thermodynamic nonequilibrium effects that invalidate the continuum assumptions of the Navier-Stokes equations. Accurate simulation of these regions, which include shock waves, boundary and shear layers, and low-density wakes, requires a kinetic theory-based approach where no prior assumptions are made regarding the molecular distribution function. Because of the nature of these types of flows, there is much to be gained in terms of both numerical efficiency and physical accuracy by developing hybrid particle-continuum simulation approaches. The focus of the present research effort is the continued development of the Modular Particle-Continuum (MPC) method, where the Navier-Stokes equations are solved numerically using computational fluid dynamics (CFD) techniques in regions of the flow field where continuum assumptions are valid, and the direct simulation Monte Carlo (DSMC) method is used where strong thermodynamic nonequilibrium effects are present. Numerical solutions of transitional, hypersonic flows are thus obtained with increased physical accuracy relative to CFD alone, and improved numerical efficiency is achieved in comparison to DSMC alone because this more computationally expensive method is restricted to those regions of the flow field where it is necessary to maintain physical accuracy. In this dissertation, a comprehensive assessment of the physical accuracy of the MPC method is performed, leading to the implementation of a non-vacuum supersonic outflow boundary condition in particle domains, and more consistent initialization of DSMC simulator particles along hybrid interfaces. The relative errors between MPC and full DSMC results are greatly reduced as a direct result of these improvements. Next, a new parameter for detecting rotational nonequilibrium effects is proposed and shown to offer advantages over other continuum breakdown parameters, achieving further accuracy gains. Lastly, the capabilities of the MPC method are extended to accommodate multiple chemical species in rotational nonequilibrium, each of which is allowed to equilibrate independently, enabling application of the MPC method to more realistic atmospheric flows.
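
    A common continuum-breakdown detector of the kind discussed above is the gradient-length-local Knudsen number; a generic sketch with an invented shock-like profile (the dissertation's new rotational-nonequilibrium parameter is not reproduced):

      import numpy as np

      def knudsen_gl(q, x, mean_free_path):
          """Kn_GL = lambda * |dQ/dx| / Q; cells above a cutoff (often ~0.05)
          are assigned to DSMC, the rest to the Navier-Stokes solver."""
          return mean_free_path * np.abs(np.gradient(q, x)) / np.abs(q)

      x = np.linspace(0.0, 1.0, 200)
      T = 300 + 900 / (1 + np.exp(-(x - 0.5) / 0.01))   # shock-like jump (toy)
      kn = knudsen_gl(T, x, mean_free_path=0.002)
      print((kn > 0.05).sum(), "of", x.size, "cells flagged for DSMC")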

  3. The role of laryngoscopy in the diagnosis of spasmodic dysphonia.

    PubMed

    Daraei, Pedram; Villari, Craig R; Rubin, Adam D; Hillel, Alexander T; Hapner, Edie R; Klein, Adam M; Johns, Michael M

    2014-03-01

    Spasmodic dysphonia (SD) can be difficult to diagnose, and patients often see multiple physicians for many years before diagnosis. Improving the speed of diagnosis for individuals with SD may decrease the time to treatment and improve patient quality of life more quickly. To assess whether the diagnosis of SD can be accurately predicted through auditory cues alone without the assistance of visual cues offered by laryngoscopic examination. Single-masked, case-control study at a specialized referral center that included patients who underwent laryngoscopic examination as part of a multidisciplinary workup for dysphonia. Twenty-two patients were selected in total: 10 with SD, 5 with vocal tremor, and 7 controls without SD or vocal tremor. The laryngoscopic examination was recorded, deidentified, and edited to make 3 media clips for each patient: video alone, audio alone, and combined video and audio. These clips were randomized and presented to 3 fellowship-trained laryngologist raters (A.D.R., A.T.H., and A.M.K.), who established the most probable diagnosis for each clip. Intrarater and interrater reliability were evaluated using repeat clips incorporated in the presentations. We measured diagnostic accuracy for video-only, audio-only, and combined multimedia clips. These measures were established before data collection. Data analysis was accomplished with analysis of variance and Tukey honestly significant differences. Of patients with SD, diagnostic accuracy was 10%, 73%, and 73% for video-only, audio-only, and combined, respectively (P < .001, df = 2). Of patients with vocal tremor, diagnostic accuracy was 93%, 73%, and 100% for video-only, audio-only, and combined, respectively (P = .05, df = 2). Of the controls, diagnostic accuracy was 81%, 19%, and 62% for video-only, audio-only, and combined, respectively (P < .001, df = 2). The diagnosis of SD during examination is based primarily on auditory cues. Viewing combined audio and video clips afforded no change in diagnostic accuracy compared with audio alone. Laryngoscopy serves an important role in the diagnosis of SD by excluding other pathologic causes and identifying vocal tremor.

  4. Video image analysis in the Australian meat industry - precision and accuracy of predicting lean meat yield in lamb carcasses.

    PubMed

    Hopkins, D L; Safari, E; Thompson, J M; Smith, C R

    2004-06-01

    A wide selection of lamb types of mixed sex (ewes and wethers) were slaughtered at a commercial abattoir and during this process images of 360 carcasses were obtained online using the VIAScan® system developed by Meat and Livestock Australia. Soft tissue depth at the GR site (thickness of tissue over the 12th rib 110 mm from the midline) was measured by an abattoir employee using the AUS-MEAT sheep probe (PGR). Another measure of this thickness was taken in the chiller using a GR knife (NGR). Each carcass was subsequently broken down to a range of trimmed boneless retail cuts and the lean meat yield determined. The current industry model for predicting meat yield uses hot carcass weight (HCW) and tissue depth at the GR site. A low level of accuracy and precision was found when HCW and PGR were used to predict lean meat yield (R² = 0.19, r.s.d. = 2.80%), which could be improved markedly when PGR was replaced by NGR (R² = 0.41, r.s.d. = 2.39%). If the GR measures were replaced by 8 VIAScan® measures then greater prediction accuracy could be achieved (R² = 0.52, r.s.d. = 2.17%). A similar result was achieved when the model was based on principal components (PCs) computed from the 8 VIAScan® measures (R² = 0.52, r.s.d. = 2.17%). The use of PCs also improved the stability of the model compared to a regression model based on HCW and NGR. The transportability of the models was tested by randomly dividing the data set and comparing coefficients and the level of accuracy and precision. Those models based on PCs were superior to those based on regression. It is demonstrated that with the appropriate modeling the VIAScan® system offers a workable method for predicting lean meat yield automatically.
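
    Principal component regression, the stabilizing step described above, is a short pipeline; toy measurements stand in for the VIAScan and carcass data:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(8)
      X = rng.normal(size=(360, 9))     # 8 image measures + hot carcass weight
      y = 55 + rng.normal(0, 2.5, 360)  # lean meat yield (%)

      # Regressing on the leading principal components guards against the
      # collinearity of the raw image measurements.
      pcr = make_pipeline(StandardScaler(), PCA(n_components=4), LinearRegression())
      pcr.fit(X, y)
      resid = y - pcr.predict(X)
      print("r.s.d. =", resid.std(ddof=1))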

  5. CdSe/ZnS quantum dot fluorescence spectra shape-based thermometry via neural network reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, Troy; Liu, Liwang

    As a system of interest gets small, thermal characterization by means of contact temperature measurements becomes cumbersome due to the influence of the sensor mass and heat leaks through the sensor contacts. Non-contact temperature measurement offers a suitable alternative, provided a reliable relationship between the temperature and the detected signal is available. In this work, exploiting the temperature dependence of their fluorescence spectrum, the use of quantum dots as thermomarkers on the surface of a fiber of interest is demonstrated. The performance is assessed of a series of neural networks that use different spectral shape characteristics as inputs (peak-based: peak intensity and peak wavelength; shape-based: integrated intensity, their ratio, full-width half maximum, peak-normalized intensity at certain wavelengths, and summation of intensity over several spectral bands) and that yield at their output the fiber temperature in the optically probed area on a spider silk fiber. Starting from neural networks trained on fluorescence spectra acquired in steady-state temperature conditions, numerical simulations are performed to assess the quality of the reconstruction of dynamical temperature changes that are photothermally induced by illuminating the fiber with periodically intensity-modulated light. Comparison of the five neural networks investigated with multiple types of curve fits showed that using neural networks trained on a combination of the spectral characteristics improves the accuracy over the use of a single independent input, with the greatest accuracy observed for inputs that included both intensity-based measurements (peak intensity) and shape-based measurements (normalized intensity at multiple wavelengths), with an ultimate accuracy of 0.29 K via numerical simulation based on experimental observations. The implications are that quantum dots can be used as a more stable and accurate fluorescence thermometer for solid materials and that the use of neural networks for temperature reconstruction improves the accuracy of the measurement.
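
    A minimal sketch of the reconstruction idea: a small neural network maps spectral-shape characteristics to temperature. The feature-temperature relations below are invented monotone trends, not the measured quantum-dot behavior:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(9)
      T = rng.uniform(290, 330, 500)                 # fiber temperature (K)
      features = np.column_stack([
          1.0 - 0.004 * (T - 290),                   # peak intensity falls
          620 + 0.1 * (T - 290),                     # peak wavelength red-shifts
          30 - 0.05 * (T - 290),                     # integrated intensity
          25 + 0.03 * (T - 290),                     # FWHM broadens
      ]) + rng.normal(0, 0.01, (500, 4))

      net = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                       random_state=0))
      net.fit(features[:400], T[:400])
      err = np.abs(net.predict(features[400:]) - T[400:])
      print("mean |error| =", err.mean(), "K")       # held-out reconstruction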

  6. Comparing a paper based monitoring and evaluation system to a mHealth system to support the national community health worker programme, South Africa: an evaluation

    PubMed Central

    2014-01-01

    Background: In an attempt to address a complex disease burden, including improving progress towards MDGs 4 and 5, South Africa recently introduced a re-engineered Primary Health Care (PHC) strategy, which has led to the development of a national community health worker (CHW) programme. The present study explored the development of a cell phone-based and paper-based monitoring and evaluation (M&E) system to support the work of the CHWs. Methods: One sub-district in the North West province was identified for the evaluation. One outreach team comprising ten CHWs maintained both the paper forms and the mHealth system to record household data on community-based services. A comparative analysis was done to calculate the correspondence between the paper and phone records. A focus group discussion was conducted with the CHWs. Clinical referrals, data accuracy and supervised visits were compared and analysed for the paper and phone systems. Results: Compared to the mHealth system where data accuracy was assured, 40% of the CHWs showed a consistently high level (>90% correspondence) of data transfer accuracy on paper. Overall, there was an improvement over time, and by the fifth month, all CHWs achieved a correspondence of 90% or above between phone and paper data. The most common error that occurred was summing the total number of visits and/or activities across the five household activity indicators. Few supervised home visits were recorded in either system and there was no evidence of the team leader following up on the automatic notifications received on their cell phones. Conclusions: The evaluation emphasizes the need for regular supervision for both systems and rigorous and ongoing assessments of data quality for the paper system. Formalization of a mHealth M&E system for PHC outreach teams delivering community-based services could offer greater accuracy of M&E and enhance supervision systems for CHWs. PMID:25106499

  7. Potential Improvements to Remote Primary Productivity Estimation in the Southern California Current System

    NASA Astrophysics Data System (ADS)

    Jacox, M.; Edwards, C. A.; Kahru, M.; Rudnick, D. L.; Kudela, R. M.

    2012-12-01

    A 26-year record of depth-integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite PP estimates. The ratio of integrated primary productivity to surface chlorophyll correlates strongly with surface chlorophyll concentration (chl0). However, chl0 does not correlate with chlorophyll-specific productivity, and appears to be a proxy for vertical phytoplankton distribution rather than phytoplankton physiology. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by empirical parameterization of photosynthetic efficiency in the Vertically Generalized Production Model. Much larger improvements are enabled by improving the accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model, substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r² and total log10 root mean squared difference, while inclusion of in situ chlorophyll and light profiles improves these metrics significantly. Autonomous underwater gliders, capable of measuring subsurface fluorescence on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for improved PP estimation in coastal upwelling systems.

  8. The potential for improving remote primary productivity estimates through subsurface chlorophyll and irradiance measurement

    NASA Astrophysics Data System (ADS)

    Jacox, Michael G.; Edwards, Christopher A.; Kahru, Mati; Rudnick, Daniel L.; Kudela, Raphael M.

    2015-02-01

    A 26-year record of depth-integrated primary productivity (PP) in the Southern California Current System (SCCS) is analyzed with the goal of improving satellite net primary productivity estimates. Modest improvements in PP model performance are achieved by tuning existing algorithms for the SCCS, particularly by parameterizing carbon fixation rate in the vertically generalized production model as a function of surface chlorophyll concentration and distance from shore. Much larger improvements are enabled by improving the accuracy of subsurface chlorophyll and light profiles. In a simple vertically resolved production model for the SCCS (VRPM-SC), substitution of in situ surface data for remote sensing estimates offers only marginal improvements in model r2 (from 0.54 to 0.56) and total log10 root mean squared difference (from 0.22 to 0.21), while inclusion of in situ chlorophyll and light profiles improves these metrics to 0.77 and 0.15, respectively. Autonomous underwater gliders, capable of measuring subsurface properties on long-term, long-range deployments, significantly improve PP model fidelity in the SCCS. We suggest their use (and that of other autonomous profilers such as Argo floats) in conjunction with satellites as a way forward for large-scale improvements in PP estimation.
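
    The two skill metrics used above (model r2 and total log10 root mean squared difference) can be computed directly from paired modeled and observed PP values. A minimal Python sketch, with made-up numbers standing in for real matchups:

        import numpy as np

        def pp_skill(pp_model, pp_insitu):
            """r^2 and total log10 RMSD between modeled and observed
            depth-integrated primary productivity (linear units in)."""
            y = np.log10(np.asarray(pp_model))
            x = np.log10(np.asarray(pp_insitu))
            r2 = np.corrcoef(x, y)[0, 1] ** 2
            rmsd = np.sqrt(np.mean((y - x) ** 2))
            return r2, rmsd

        # Hypothetical matchups, mg C m^-2 d^-1:
        r2, rmsd = pp_skill([820.0, 410.0, 1300.0], [760.0, 500.0, 1150.0])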

  9. Object-oriented classification using quasi-synchronous multispectral images (optical and radar) over agricultural surface

    NASA Astrophysics Data System (ADS)

    Marais Sicre, Claire; Baup, Frederic; Fieuzal, Remy

    2015-04-01

    In the context of climate change (with consequences for temperature and precipitation patterns), those involved in agricultural management face the imperative of combining sufficient productivity (in response to rising food demand) with durability of resources (limiting waste of water and fertilizer and environmental damage). To this end, a detailed knowledge of land use will improve the management of food and water while preserving ecosystems. Among the wide range of available monitoring tools, numerous studies have demonstrated the value of satellite images for agricultural mapping. Recently, the launch of several radar and optical sensors (Terrasar-X, Radarsat-2, Sentinel-1, Landsat-8…) has offered new perspectives for multi-wavelength crop monitoring, allowing surface surveys whatever the cloud conditions. Previous studies have demonstrated the interest of multi-temporal approaches for crop classification, which require several images for suitable classification results. Unfortunately, these approaches are limited by the satellite orbit cycle and require waiting several days, weeks or months before an accurate land use map can be produced. The objective of this study is to compare the accuracy of object-oriented classification (a random forest algorithm combined with a vector layer coming from segmentation) for mapping winter crops (barley, rapeseed, grasslands and wheat) and soil states (bare soils with different surface roughness) using quasi-synchronous images. Satellite data are composed of multi-frequency and multi-polarization (HH, VV, HV and VH) images acquired near the 14th of April, 2010, over a study area (90 km²) located close to Toulouse in France. This is a region of alluvial plains and hills, mostly under mixed farming and governed by a temperate climate. Remote sensing images are provided by Formosat-2 (04/18), Radarsat-2 (C-band, 04/15), Terrasar-X (X-band, 04/14) and ALOS (L-band, 04/14). Ground data were collected over 214 plots during the MCM'10 experiment conducted by the CESBIO laboratory in 2010. Classification performances were evaluated in two cases: using only one frequency in the optical or microwave domain, or using a combination of several frequencies (mixing optical and microwave). In the first case, the best results were obtained using optical wavelengths, with a mean overall accuracy (OA) of 84%, followed by Terrasar-X (HH) and Radarsat-2 (HV or VH), which respectively offered overall accuracies of 77% and 73%. Concerning the vegetation, wheat was well classified whatever the wavelength used (OA > 93%). Barley was more complicated to classify and could be confused with wheat or grassland; the best results were obtained using the green, red, blue, X-band or L-band wavelengths, offering an OA above 45%. Radar images were clearly well adapted to identifying rapeseed (OA > 83%), especially at C-band (VV, HH and HV) and X-band (HH). The accuracy of grassland classification never exceeded 79% and results were stable across frequencies (except at L-band: 51%). The three soil roughness states were quite well classified whatever the wavelength, and performance decreased with increasing soil roughness. The combined use of multiple frequencies increased classification performance: overall accuracy reached 83% and 96% for the C-band full-polarization and Formosat-2 multispectral approaches, respectively.
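
    As a rough illustration of the classification step described above (a random forest applied to per-object features, scored by overall accuracy), the following sketch uses scikit-learn with synthetic features and labels; it is not the authors' processing chain, and the feature construction is hypothetical:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        y = rng.integers(0, 5, size=214)            # 5 classes: crops + bare soils
        X = rng.normal(size=(214, 8)) + y[:, None]  # 8 per-object spectral features

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_tr, y_tr)
        oa = accuracy_score(y_te, clf.predict(X_te))   # overall accuracy (OA)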

  10. Comparing the auscultatory accuracy of health care professionals using three different brands of stethoscopes on a simulator

    PubMed Central

    Mehmood, Mansoor; Abu Grara, Hazem L; Stewart, Joshua S; Khasawneh, Faisal A

    2014-01-01

    Background It is considered standard practice to use disposable or patient-dedicated stethoscopes to prevent cross-contamination between patients in contact precautions and others in their vicinity. The literature offers very little information regarding the quality of currently used stethoscopes. This study assessed the fidelity with which acoustics were perceived by a broad range of health care professionals using three brands of stethoscopes. Methods This prospective study used a simulation center and volunteer health care professionals to test the sound quality offered by three brands of commonly used stethoscopes. Each volunteer's proficiency in identifying five basic auscultatory sounds (wheezing, stridor, crackles, holosystolic murmur, and hyperdynamic bowel sounds) was tested as well. Results A total of 84 health care professionals (ten attending physicians, 35 resident physicians, and 39 intensive care unit [ICU] nurses) participated in the study. The higher-end stethoscope was more reliable than the lower-end stethoscopes in facilitating the diagnosis of the auscultatory sounds, especially stridor and crackles. Our volunteers detected all tested sounds correctly in about 69% of cases. As expected, attending physicians performed the best, followed by resident physicians and then ICU nurses. Neither years of experience nor background noise seemed to affect performance. Postgraduate training continues to offer very little to improve our trainees' auscultation skills. Conclusion The results of this study indicate that using low-end stethoscopes to care for patients in contact precautions could compromise the identification of important auscultatory findings. Furthermore, there continues to be an opportunity to improve our physicians' and ICU nurses' auscultation skills. PMID:25152636

  11. Robotic System for MRI-Guided Stereotactic Neurosurgery

    PubMed Central

    Li, Gang; Cole, Gregory A.; Shang, Weijian; Harrington, Kevin; Camilo, Alex; Pilitsis, Julie G.; Fischer, Gregory S.

    2015-01-01

    Stereotaxy is a neurosurgical technique that can take several hours to reach a specific target, typically utilizing a mechanical frame and guided by preoperative imaging. An error in any one of the numerous steps, or deviation of the target anatomy from the preoperative plan such as brain shift (up to 20 mm), may affect the targeting accuracy and thus the treatment effectiveness. Moreover, because the procedure is typically performed through a small burr-hole opening in the skull that prevents tissue visualization, the intervention is essentially “blind” for the operator, with limited means of intraoperative confirmation, which may result in reduced accuracy and safety. The presented system is intended to address the clinical needs for enhanced efficiency, accuracy, and safety of image-guided stereotactic neurosurgery for Deep Brain Stimulation (DBS) lead placement. The work describes a magnetic resonance imaging (MRI)-guided, robotically actuated stereotactic neural intervention system for the deep brain stimulation procedure, which offers the potential of reducing procedure duration while improving targeting accuracy and enhancing safety. This is achieved through simultaneous robotic manipulation of the instrument and interactively updated in situ MRI guidance that enables visualization of the anatomy and the interventional instrument. During simultaneous actuation and imaging, the system demonstrated less than 15% signal-to-noise ratio (SNR) variation and less than 0.20% geometric distortion artifact without affecting the usability of the imaging to visualize and guide the procedure. Optical tracking and MRI phantom experiments streamlined the clinical workflow of the prototype system and corroborated its targeting accuracy, with a 3-axis root mean square error of 1.38 ± 0.45 mm in tip position and 2.03 ± 0.58° in insertion angle. PMID:25376035
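
    The targeting metric quoted above (3-axis root mean square error in tip position) reduces to simple vector arithmetic on planned versus measured tip coordinates; a minimal sketch with hypothetical trial data:

        import numpy as np

        def tip_error_stats(planned, measured):
            """Per-trial 3-axis (Euclidean) tip errors and their mean +/- SD."""
            d = np.asarray(measured) - np.asarray(planned)   # shape (n_trials, 3)
            err = np.linalg.norm(d, axis=1)
            return err.mean(), err.std(ddof=1)

        rng = np.random.default_rng(0)
        planned = np.zeros((8, 3))                           # mm, illustrative
        measured = planned + rng.normal(0.0, 0.8, size=(8, 3))
        mean_err, sd_err = tip_error_stats(planned, measured)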

  12. Combining accuracy assessment of land-cover maps with environmental monitoring programs

    USGS Publications Warehouse

    Stehman, S.V.; Czaplewski, R.L.; Nusser, S.M.; Yang, L.; Zhu, Z.

    2000-01-01

    A scientifically valid accuracy assessment of a large-area, land-cover map is expensive. Environmental monitoring programs offer a potential source of data to partially defray the cost of accuracy assessment while still maintaining the statistical validity. In this article, three general strategies for combining accuracy assessment and environmental monitoring protocols are described. These strategies range from a fully integrated accuracy assessment and environmental monitoring protocol, to one in which the protocols operate nearly independently. For all three strategies, features critical to using monitoring data for accuracy assessment include compatibility of the land-cover classification schemes, precisely co-registered sample data, and spatial and temporal compatibility of the map and reference data. Two monitoring programs, the National Resources Inventory (NRI) and the Forest Inventory and Monitoring (FIM), are used to illustrate important features for implementing a combined protocol.

  13. A systematic review of the PTSD Checklist's diagnostic accuracy studies using QUADAS.

    PubMed

    McDonald, Scott D; Brown, Whitney L; Benesek, John P; Calhoun, Patrick S

    2015-09-01

    Despite the popularity of the PTSD Checklist (PCL) as a clinical screening test, there has been no comprehensive quality review of studies evaluating its diagnostic accuracy. A systematic quality assessment of 22 diagnostic accuracy studies of the English-language PCL using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) assessment tool was conducted to examine (a) the quality of diagnostic accuracy studies of the PCL, and (b) whether quality has improved since the 2003 STAndards for the Reporting of Diagnostic accuracy studies (STARD) initiative regarding reporting guidelines for diagnostic accuracy studies. Three raters independently applied the QUADAS tool to each study, and a consensus among the 4 authors is reported. Findings indicated that although studies generally met standards in several quality areas, there is still room for improvement. Areas for improvement include establishing representativeness, adequately describing clinical and demographic characteristics of the sample, and presenting better descriptions of important aspects of test and reference standard execution. Only 2 studies met each of the 14 quality criteria. In addition, study quality has not appreciably improved since the publication of the STARD Statement in 2003. Recommendations for the improvement of diagnostic accuracy studies of the PCL are discussed. (c) 2015 APA, all rights reserved.

  14. Research on material removal accuracy analysis and correction of removal function during ion beam figuring

    NASA Astrophysics Data System (ADS)

    Wu, Weibin; Dai, Yifan; Zhou, Lin; Xu, Mingjin

    2016-09-01

    Material removal accuracy has a direct impact on the machining precision and efficiency of ion beam figuring. By analyzing the factors suppressing the improvement of material removal accuracy, we conclude that correcting the removal function deviation and reducing the amount of material removed during each iterative process could help to improve material removal accuracy. The removal function correction principle can effectively compensate for the removal function deviation between the actual figuring and simulated processes, while experiments indicate that material removal accuracy decreases with long machining times, so removing a small amount of material in each iterative process is suggested. However, more clamping and measuring steps will be introduced this way, which will also generate machining errors and suppress the improvement of material removal accuracy. On this account, a free-measurement iterative process method is put forward to improve material removal accuracy and figuring efficiency by using fewer measuring and clamping steps. Finally, an experiment on a φ 100-mm Zerodur planar is performed, which shows that, in similar figuring time, three free-measurement iterative processes could improve the material removal accuracy and the surface error convergence rate by 62.5% and 17.6%, respectively, compared with a single iterative process.

  15. Material elemental decomposition in dual and multi-energy CT via a sparsity-dictionary approach for proton stopping power ratio calculation.

    PubMed

    Shen, Chenyang; Li, Bin; Chen, Liyuan; Yang, Ming; Lou, Yifei; Jia, Xun

    2018-04-01

    Accurate calculation of proton stopping power ratio (SPR) relative to water is crucial to proton therapy treatment planning, since SPR affects prediction of beam range. Current standard practice derives SPR using a single CT scan. Recent studies showed that dual-energy CT (DECT) offers advantages in accurately determining SPR. One method to further improve accuracy is to incorporate prior knowledge of human tissue composition through a dictionary approach. In addition, it has also been suggested that using CT images with multiple (more than two) energy channels, i.e., multi-energy CT (MECT), can further improve accuracy. In this paper, we proposed a sparse dictionary-based method to convert CT numbers of DECT or MECT to elemental composition (EC) and relative electron density (rED) for SPR computation. A dictionary was constructed to include materials generated based on human tissues of known compositions. For a voxel with CT numbers from different energy channels, its EC and rED are determined subject to the constraint that the resulting EC is a linear non-negative combination of only a few tissues in the dictionary. We formulated this as a non-convex optimization problem and designed a novel algorithm to solve it. The proposed method has a unified structure to handle both DECT and MECT with different numbers of channels. We tested our method in both simulation and experimental studies. Average errors of SPR in the experimental studies were 0.70% for DECT, 0.53% for MECT with three energy channels, and 0.45% for MECT with four channels. We also studied the impact of parameter values and established appropriate parameter values for our method. The proposed method can accurately calculate SPR using DECT and MECT. The results suggest that using more energy channels may improve the SPR estimation accuracy. © 2018 American Association of Physicists in Medicine.
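
    The dictionary step can be pictured as expressing each voxel's multi-energy CT numbers as a non-negative combination of a few reference tissues. The sketch below substitutes plain non-negative least squares plus truncation to the largest coefficients for the authors' non-convex solver, and the tissue CT numbers are invented for illustration:

        import numpy as np
        from scipy.optimize import nnls

        # Rows: energy channels (3-channel MECT); columns: reference tissues.
        D = np.array([[40.0, 55.0, 980.0],
                      [35.0, 60.0, 870.0],
                      [30.0, 65.0, 760.0]])
        ct_voxel = np.array([37.0, 59.5, 900.0])

        w, _ = nnls(D, ct_voxel)            # non-negative tissue weights
        keep = np.argsort(w)[-2:]           # crude sparsity: keep two tissues
        w_sparse = np.zeros_like(w)
        w_sparse[keep] = w[keep]
        w_sparse /= w_sparse.sum()          # mixture fractions (simplification)

    The voxel's elemental composition and relative electron density then follow as the same weighted combination of the retained tissues' known properties.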

  16. Double peak-induced distance error in short-time-Fourier-transform-Brillouin optical time domain reflectometers event detection and the recovery method.

    PubMed

    Yu, Yifei; Luo, Linqing; Li, Bo; Guo, Linfeng; Yan, Jize; Soga, Kenichi

    2015-10-01

    The measured distance error caused by double peaks in BOTDR (Brillouin optical time domain reflectometer) systems is a kind of Brillouin scattering spectrum (BSS) deformation, discussed and simulated for the first time in this paper, to the best of the authors' knowledge. The double peak, as a kind of Brillouin spectrum deformation, is important for enhancing spatial resolution, measurement accuracy, and crack detection. Due to variations of the peak powers of the BSS along the fiber, the measured starting point of a step-shape frequency transition region is shifted, resulting in distance errors. A zero-padded short-time Fourier transform (STFT) can restore the transition-induced double peaks in the asymmetric and deformed BSS, thus offering more accurate and quicker measurements than the conventional Lorentz-fitting method. The recovery method based on double-peak detection and the corresponding BSS deformation can be applied to calculate the real starting point, which improves the distance accuracy of STFT-based BOTDR systems.
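
    The zero-padded STFT at the heart of the recovery method is available in standard libraries; a minimal sketch on a synthetic signal (all parameters illustrative, not taken from the paper):

        import numpy as np
        from scipy.signal import stft

        fs = 1.0e9                            # sample rate (illustrative)
        t = np.arange(4096) / fs
        x = np.sin(2*np.pi*1.1e8*t)           # stand-in for a Brillouin beat signal

        # nfft > nperseg zero-pads each segment, refining the frequency grid
        f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=192, nfft=1024)
        peak_freq = f[np.abs(Z).argmax(axis=0)]   # per-segment peak track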

  17. Automation of fluorescent differential display with digital readout.

    PubMed

    Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng

    2006-01-01

    Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarray, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shot-gun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.

  18. Gaze-independent ERP-BCIs: augmenting performance through location-congruent bimodal stimuli

    PubMed Central

    Thurlings, Marieke E.; Brouwer, Anne-Marie; Van Erp, Jan B. F.; Werkhoven, Peter

    2014-01-01

    Gaze-independent event-related potential (ERP) based brain-computer interfaces (BCIs) yield relatively low BCI performance and traditionally employ unimodal stimuli. Bimodal ERP-BCIs may increase BCI performance due to multisensory integration or summation in the brain. An additional advantage of bimodal BCIs may be that the user can choose which modality or modalities to attend to. We studied bimodal, visual-tactile, gaze-independent BCIs and investigated whether or not ERP components’ tAUCs and subsequent classification accuracies are increased for (1) bimodal vs. unimodal stimuli; (2) location-congruent vs. location-incongruent bimodal stimuli; and (3) attending to both modalities vs. to either one modality. We observed an enhanced bimodal (compared to unimodal) P300 tAUC, which appeared to be positively affected by location-congruency (p = 0.056) and resulted in higher classification accuracies. Attending either to one or to both modalities of the bimodal location-congruent stimuli resulted in differences between ERP components, but not in classification performance. We conclude that location-congruent bimodal stimuli improve ERP-BCIs, and offer the user the possibility to switch the attended modality without losing performance. PMID:25249947

  19. Enhancing Ear and Hearing Health Access for Children With Technology and Connectivity.

    PubMed

    Swanepoel, De Wet

    2017-10-12

    Technology and connectivity advances are demonstrating increasing potential to improve access to service delivery for persons with hearing loss. This article demonstrates use cases from community-based hearing screening and automated diagnosis of ear disease. This brief report reviews recent evidence for school- and home-based hearing testing in underserved communities using smartphone technologies paired with calibrated headphones. Another area of potential impact facilitated by technology and connectivity is the use of feature-extraction algorithms to automate diagnosis of the most common ear conditions from video-otoscopic images. Smartphone hearing screening using calibrated headphones demonstrated equivalent sensitivity and specificity for school-based hearing screening. Automating test sequences with a forced-choice response paradigm allowed persons with minimal training to offer screening in underserved communities. The automated image analysis and diagnosis system for ear disease demonstrated an overall accuracy of 80.6%, which matches or exceeds accuracy rates previously reported for general practitioners and pediatricians. The emergence of these tools that capitalize on technology and connectivity advances enables affordable and accessible models of service delivery for community-based ear and hearing care.

  20. Taking error into account when fitting models using Approximate Bayesian Computation.

    PubMed

    van der Vaart, Elske; Prangle, Dennis; Sibly, Richard M

    2018-03-01

    Stochastic computer simulations are often the only practical way of answering questions relating to ecological management. However, due to their complexity, such models are difficult to calibrate and evaluate. Approximate Bayesian Computation (ABC) offers an increasingly popular approach to this problem, widely applied across a variety of fields. However, ensuring the accuracy of ABC's estimates has been difficult. Here, we obtain more accurate estimates by incorporating estimation of error into the ABC protocol. We show how this can be done where the data consist of repeated measures of the same quantity and errors may be assumed to be normally distributed and independent. We then derive the correct acceptance probabilities for a probabilistic ABC algorithm, and update the coverage test with which accuracy is assessed. We apply this method, which we call error-calibrated ABC, to a toy example and a realistic 14-parameter simulation model of earthworms that is used in environmental risk assessment. A comparison with exact methods and the diagnostic coverage test show that our approach improves estimation of parameter values and their credible intervals for both models. © 2017 by the Ecological Society of America.
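
    For orientation, a bare-bones rejection-ABC loop is sketched below; it uses plain rejection on a summary-statistic distance rather than the authors' probabilistic acceptance rule, and the model, prior, and tolerance are toy choices:

        import numpy as np

        rng = np.random.default_rng(1)
        obs = rng.normal(2.0, 0.5, size=10)     # repeated measures of one quantity

        def simulate(theta):                    # toy model with Gaussian error
            return rng.normal(theta, 0.5, size=10)

        accepted = []
        for _ in range(20000):
            theta = rng.uniform(0.0, 5.0)       # draw from the prior
            if abs(simulate(theta).mean() - obs.mean()) < 0.05:
                accepted.append(theta)
        posterior = np.array(accepted)          # approximate posterior sample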

  1. EXPLICIT LEAST-DEGREE BOUNDARY FILTERS FOR DISCONTINUOUS GALERKIN.

    PubMed

    Nguyen, Dang-Manh; Peters, Jörg

    2017-01-01

    Convolving the output of Discontinuous Galerkin (DG) computations using spline filters can improve both smoothness and accuracy of the output. At domain boundaries, these filters have to be one-sided for non-periodic boundary conditions. Recently, position-dependent smoothness-increasing accuracy-preserving (PSIAC) filters were shown to be a superset of the well-known one-sided RLKV and SRV filters. Since PSIAC filters can be formulated symbolically, PSIAC filtering amounts to forming linear products with local DG output and so offers a more stable and efficient implementation. The paper introduces a new class of PSIAC filters NP0 that have small support and are piecewise constant. Extensive numerical experiments for the canonical hyperbolic test equation show NP0 filters outperform the more complex known boundary filters. NP0 filters typically reduce the L∞ error in the boundary region below that of the interior where optimally superconvergent symmetric filters of the same support are applied. NP0 filtering can be implemented as forming linear combinations of the data with short rational weights. Exact derivatives of the convolved output are easy to compute.
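
    In the interior of the domain, this style of post-processing amounts to convolving the DG output with a compact B-spline-based kernel. The sketch below builds a discrete cubic B-spline by repeated box convolution and applies it to sampled data; it illustrates symmetric interior filtering only, not the one-sided NP0 boundary filters the paper constructs:

        import numpy as np

        def bspline_weights(degree, n=11):
            """Discrete B-spline of given degree via repeated box convolution."""
            box = np.ones(n) / n
            w = box
            for _ in range(degree):
                w = np.convolve(w, box)
            return w / w.sum()

        x = np.linspace(0, 2*np.pi, 400, endpoint=False)
        u = np.sin(x) + 0.02*np.sign(np.sin(17*x))   # smooth field + jumps
        k = bspline_weights(degree=3)
        u_filtered = np.convolve(u, k, mode='same')
        # Near the ends of the array this padding is wrong -- exactly where
        # one-sided filters such as the NP0 family are needed.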

  2. A minimalistic approach to static and dynamic electron correlations: Amending generalized valence bond method with extended random phase approximation correlation correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Koushik; Jawulski, Konrad; Pastorczak, Ewa

    A perfect-pairing generalized valence bond (GVB) approximation is known to be one of the simplest approximations, which allows one to capture the essence of static correlation in molecular systems. In spite of its attractive feature of being relatively computationally efficient, this approximation misses a large portion of dynamic correlation and does not offer sufficient accuracy to be generally useful for studying electronic structure of molecules. We propose to correct the GVB model and alleviate some of its deficiencies by amending it with the correlation energy correction derived from the recently formulated extended random phase approximation (ERPA). On the examples of systems of diverse electronic structures, we show that the resulting ERPA-GVB method greatly improves upon the GVB model. ERPA-GVB recovers most of the electron correlation and it yields energy barrier heights of excellent accuracy. Thanks to a balanced treatment of static and dynamic correlation, ERPA-GVB stays reliable when one moves from systems dominated by dynamic electron correlation to those for which the static correlation comes into play.

  3. EXPLICIT LEAST-DEGREE BOUNDARY FILTERS FOR DISCONTINUOUS GALERKIN*

    PubMed Central

    Nguyen, Dang-Manh; Peters, Jörg

    2017-01-01

    Convolving the output of Discontinuous Galerkin (DG) computations using spline filters can improve both smoothness and accuracy of the output. At domain boundaries, these filters have to be one-sided for non-periodic boundary conditions. Recently, position-dependent smoothness-increasing accuracy-preserving (PSIAC) filters were shown to be a superset of the well-known one-sided RLKV and SRV filters. Since PSIAC filters can be formulated symbolically, PSIAC filtering amounts to forming linear products with local DG output and so offers a more stable and efficient implementation. The paper introduces a new class of PSIAC filters NP0 that have small support and are piecewise constant. Extensive numerical experiments for the canonical hyperbolic test equation show NP0 filters outperform the more complex known boundary filters. NP0 filters typically reduce the L∞ error in the boundary region below that of the interior where optimally superconvergent symmetric filters of the same support are applied. NP0 filtering can be implemented as forming linear combinations of the data with short rational weights. Exact derivatives of the convolved output are easy to compute. PMID:29081643

  4. SoFoCles: feature filtering for microarray classification based on gene ontology.

    PubMed

    Papachristoudis, Georgios; Diplaris, Sotiris; Mitkas, Pericles A

    2010-02-01

    Marker gene selection has been an important research topic in the classification analysis of gene expression data. Current methods try to reduce the "curse of dimensionality" by using statistical intra-feature set calculations, or classifiers that are based on the given dataset. In this paper, we present SoFoCles, an interactive tool that enables semantic feature filtering in microarray classification problems with the use of external, well-defined knowledge retrieved from the Gene Ontology. The notion of semantic similarity is used to derive genes that are involved in the same biological path during the microarray experiment, by enriching a feature set that has been initially produced with legacy methods. Among its other functionalities, SoFoCles offers a large repository of semantic similarity methods that are used in order to derive feature sets and marker genes. The structure and functionality of the tool are discussed in detail, as well as its ability to improve classification accuracy. Through experimental evaluation, SoFoCles is shown to outperform other classification schemes in terms of classification accuracy in two real datasets using different semantic similarity computation approaches.

  5. Interactive lesion segmentation with shape priors from offline and online learning.

    PubMed

    Shepherd, Tony; Prince, Simon J D; Alexander, Daniel C

    2012-09-01

    In medical image segmentation, tumors and other lesions demand the highest levels of accuracy but still call for the highest levels of manual delineation. One factor holding back automatic segmentation is the exemption of pathological regions from shape modelling techniques that rely on high-level shape information not offered by lesions. This paper introduces two new statistical shape models (SSMs) that combine radial shape parameterization with machine learning techniques from the field of nonlinear time series analysis. We then develop two dynamic contour models (DCMs) using the new SSMs as shape priors for tumor and lesion segmentation. From training data, the SSMs learn the lower level shape information of boundary fluctuations, which we prove to be nevertheless highly discriminant. One of the new DCMs also uses online learning to refine the shape prior for the lesion of interest based on user interactions. Classification experiments reveal superior sensitivity and specificity of the new shape priors over those previously used to constrain DCMs. User trials with the new interactive algorithms show that the shape priors are directly responsible for improvements in accuracy and reductions in user demand.

  6. Worldwide differential GPS for Space Shuttle landing operations

    NASA Technical Reports Server (NTRS)

    Loomis, Peter V. W.; Denaro, Robert P.; Saunders, Penny

    1990-01-01

    Worldwide differential Global Positioning System (WWDGPS) is viewed as an effective method of offering continuous high-quality navigation worldwide. The concept utilizes a network with as few as 33 ground stations to observe most of the error sources of GPS and provide error corrections to users on a worldwide basis. The WWDGPS real-time GPS tracking concept promises a threefold or fourfold improvement in accuracy for authorized dual-frequency users, and in addition maintains an accurate and current ionosphere model for single-frequency users. A real-time global tracking network also has the potential to reverse declarations of poor health on marginal satellites, increasing the number of satellites in the constellation and lessening the probability of GPS navigation outage. For Space Shuttle operations, the use of WWDGPS-aided P-code equipment promises performance equal to or better than other current landing guidance systems in terms of accuracy and reliability. This performance comes at significantly less cost to NASA, which will participate as a customer in a system designed as a commercial operation serving the global civil navigation community.

  7. Methods for assessment of keel bone damage in poultry.

    PubMed

    Casey-Trott, T; Heerkens, J L T; Petrik, M; Regmi, P; Schrader, L; Toscano, M J; Widowski, T

    2015-10-01

    Keel bone damage (KBD) is a critical issue facing the laying hen industry today as a result of the likely pain leading to compromised welfare and the potential for reduced productivity. Recent reports suggest that damage, while highly variable and likely dependent on a host of factors, extends to all systems (including battery cages, furnished cages, and non-cage systems), genetic lines, and management styles. Despite the extent of the problem, the research community remains uncertain as to the causes and influencing factors of KBD. Although progress has been made investigating these factors, the overall effort is hindered by several issues related to the assessment of KBD, including quality and variation in the methods used between research groups. These issues prevent effective comparison of studies, as well as difficulties in identifying the presence of damage leading to poor accuracy and reliability. The current manuscript seeks to resolve these issues by offering precise definitions for types of KBD, reviewing methods for assessment, and providing recommendations that can improve the accuracy and reliability of those assessments. © 2015 Poultry Science Association Inc.

  8. Classifying coastal resources by integrating optical and radar imagery and color infrared photography

    USGS Publications Warehouse

    Ramsey, Elijah W.; Nelson, Gene A.; Sapkota, Sijan

    1998-01-01

    A progressive classification of a marsh and forest system using Landsat Thematic Mapper (TM), color infrared (CIR) photography, and ERS-1 synthetic aperture radar (SAR) data improved classification accuracy when compared to classification using solely TM reflective band data. The classification resulted in a detailed identification of differences within a nearly monotypic black needlerush marsh. Accuracy percentages of these classes were surprisingly high given the complexities of classification. The detailed classification resulted in a more accurate portrayal of the marsh transgressive sequence than was obtainable with TM data alone. Individual sensor contribution to the improved classification was compared to that using only the six reflective TM bands. Individually, the green reflective CIR and SAR data identified broad categories of water, marsh, and forest. In combination with TM, SAR and the green CIR band improved overall accuracy by about 3% and 15%, respectively. The SAR data improved the TM classification accuracy mostly in the marsh classes. The green CIR data also improved the marsh classification accuracy and accuracies in some water classes. The final combination of all sensor data improved almost all class accuracies by 2% to 70%, with an overall improvement of about 20% over TM data alone. Not only was the identification of vegetation types improved, but the spatial detail of the classification approached 10 m in some areas.

  9. Pharmacometabolomics Informs Quantitative Radiomics for Glioblastoma Diagnostic Innovation.

    PubMed

    Katsila, Theodora; Matsoukas, Minos-Timotheos; Patrinos, George P; Kardamakis, Dimitrios

    2017-08-01

    Applications of omics systems biology technologies have enormous promise for radiology and diagnostics in surgical fields. In this context, the emerging fields of radiomics (a systems scale approach to radiology using a host of technologies, including omics) and pharmacometabolomics (use of metabolomics for patient and disease stratification and guiding precision medicine) offer much synergy for diagnostic innovation in surgery, particularly in neurosurgery. This synthesis of omics fields and applications is timely because diagnostic accuracy in central nervous system tumors still challenges decision-making. Considering the vast heterogeneity in brain tumors, disease phenotypes, and interindividual variability in surgical and chemotherapy outcomes, we believe that diagnostic accuracy can be markedly improved by quantitative radiomics coupled to pharmacometabolomics and related health information technologies while optimizing economic costs of traditional diagnostics. In this expert review, we present an innovation analysis on a systems-level multi-omics approach toward diagnostic accuracy in central nervous system tumors. For this, we suggest that glioblastomas serve as a useful application paradigm. We performed a literature search on PubMed for articles published in English between 2006 and 2016. We used the search terms "radiomics," "glioblastoma," "biomarkers," "pharmacogenomics," "pharmacometabolomics," "pharmacometabonomics/pharmacometabolomics," "collaborative informatics," and "precision medicine." A list of the top 4 insights we derived from this literature analysis is presented in this study. For example, we found that (i) tumor grading needs to be better refined, (ii) diagnostic precision should be improved, (iii) standardization in radiomics is lacking, and (iv) quantitative radiomics needs to prove clinical implementation. We conclude with an interdisciplinary call to the metabolomics, pharmacy/pharmacology, radiology, and surgery communities that pharmacometabolomics coupled to information technologies (chemoinformatics tools, databases, collaborative systems) can inform quantitative radiomics, thus translating Big Data and information growth to knowledge growth, rational drug development and diagnostics innovation for glioblastomas, and possibly in other brain tumors.

  10. A portable blood plasma clot micro-elastometry device based on resonant acoustic spectroscopy

    PubMed Central

    Krebs, C. R.; Li, Ling; Wolberg, Alisa S.; Oldenburg, Amy L.

    2015-01-01

    Abnormal blood clot stiffness is an important indicator of coagulation disorders arising from a variety of cardiovascular diseases and drug treatments. Here, we present a portable instrument for elastometry of microliter-volume blood samples based upon the principle of resonant acoustic spectroscopy, where a sample of well-defined dimensions exhibits a fundamental longitudinal resonance mode whose frequency is proportional to the square root of the Young’s modulus. In contrast to commercial thromboelastography, the resonant acoustic method offers improved repeatability and accuracy due to the high signal-to-noise ratio of the resonant vibration. We review the measurement principles and the design of a magnetically actuated microbead force transducer applying forces between 23 pN and 6.7 nN, providing a wide dynamic range of elastic moduli (3 Pa–27 kPa) appropriate for measurement of clot elastic modulus (CEM). An automated and portable device, the CEMport, is introduced and implemented using a 2 nm resolution displacement sensor with demonstrated accuracy and precision of 3% and 2%, respectively, of CEM in biogels. Importantly, the small strains (<0.13%) and low strain rates (<1/s) employed by the CEMport maintain a linear stress-to-strain relationship, which provides a perturbative measurement of the Young’s modulus. Measurements of blood plasma CEM versus heparin concentration show that CEMport is sensitive to heparin levels below 0.050 U/ml, which suggests future applications in sensing heparin levels of post-surgical cardiopulmonary bypass patients. The portability, high accuracy, and high precision of this device enable new clinical and animal studies for associating CEM with blood coagulation disorders, potentially leading to improved diagnostics and therapeutic monitoring. PMID:26233406
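
    The stated proportionality gives the modulus directly from the measured resonance. Assuming an idealized rod-shaped sample with free ends (the instrument's actual geometry factor will differ), the fundamental longitudinal mode is f1 = sqrt(E/rho)/(2L), so:

        import numpy as np

        def youngs_modulus(f1, L, rho):
            """E from the fundamental longitudinal resonance of an idealized rod
            (f1 in Hz, L in m, rho in kg/m^3)."""
            return rho * (2.0 * L * f1) ** 2

        # Illustrative numbers only: a 5 mm sample at clot-like density
        E = youngs_modulus(f1=150.0, L=5e-3, rho=1050.0)   # about 2.4 kPa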

  11. Iterative Most-Likely Point Registration (IMLP): A Robust Algorithm for Computing Optimal Shape Alignment

    PubMed Central

    Billings, Seth D.; Boctor, Emad M.; Taylor, Russell H.

    2015-01-01

    We present a probabilistic registration algorithm that robustly solves the problem of rigid-body alignment between two shapes with high accuracy, by aptly modeling measurement noise in each shape, whether isotropic or anisotropic. For point-cloud shapes, the probabilistic framework additionally enables modeling locally-linear surface regions in the vicinity of each point to further improve registration accuracy. The proposed Iterative Most-Likely Point (IMLP) algorithm is formed as a variant of the popular Iterative Closest Point (ICP) algorithm, which iterates between point-correspondence and point-registration steps. IMLP’s probabilistic framework is used to incorporate a generalized noise model into both the correspondence and the registration phases of the algorithm, hence its name as a most-likely point method rather than a closest-point method. To efficiently compute the most-likely correspondences, we devise a novel search strategy based on a principal direction (PD)-tree search. We also propose a new approach to solve the generalized total-least-squares (GTLS) sub-problem of the registration phase, wherein the point correspondences are registered under a generalized noise model. Our GTLS approach has improved accuracy, efficiency, and stability compared to prior methods presented for this problem and offers a straightforward implementation using standard least squares. We evaluate the performance of IMLP relative to a large number of prior algorithms including ICP, a robust variant on ICP, Generalized ICP (GICP), and Coherent Point Drift (CPD), as well as drawing close comparison with the prior anisotropic registration methods of GTLS-ICP and A-ICP. The performance of IMLP is shown to be superior with respect to these algorithms over a wide range of noise conditions, outliers, and misalignments using both mesh and point-cloud representations of various shapes. PMID:25748700
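
    IMLP generalizes the basic ICP iteration sketched below, which alternates nearest-point matching with a least-squares rigid update under implicit isotropic noise and no outlier handling -- precisely the simplifications IMLP's probabilistic framework removes:

        import numpy as np
        from scipy.spatial import cKDTree

        def icp(src, tgt, iters=30):
            """Point-to-point ICP; src (N,3) is registered to tgt (M,3)."""
            R, t = np.eye(3), np.zeros(3)
            cur = src.copy()
            tree = cKDTree(tgt)
            for _ in range(iters):
                match = tgt[tree.query(cur)[1]]           # correspondence step
                mu_s, mu_m = cur.mean(0), match.mean(0)
                H = (cur - mu_s).T @ (match - mu_m)       # registration step
                U, _, Vt = np.linalg.svd(H)
                Ri = Vt.T @ U.T                           # Kabsch rotation
                if np.linalg.det(Ri) < 0:                 # guard against reflection
                    Vt[-1] *= -1
                    Ri = Vt.T @ U.T
                ti = mu_m - Ri @ mu_s
                cur = cur @ Ri.T + ti
                R, t = Ri @ R, Ri @ t + ti                # accumulate transform
            return R, t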

  12. Development of variable pathlength UV-vis spectroscopy combined with partial-least-squares regression for wastewater chemical oxygen demand (COD) monitoring.

    PubMed

    Chen, Baisheng; Wu, Huanan; Li, Sam Fong Yau

    2014-03-01

    To overcome the challenging task of selecting an appropriate pathlength for wastewater chemical oxygen demand (COD) monitoring with high accuracy by UV-vis spectroscopy in the wastewater treatment process, a variable pathlength approach combined with partial-least-squares regression (PLSR) was developed in this study. Two new strategies were proposed to extract relevant information from the UV-vis spectral data of variable pathlength measurements. The first strategy was data fusion, with two fusion levels: low-level data fusion (LLDF) and mid-level data fusion (MLDF). Predictive accuracy was found to improve, indicated by lower root-mean-square errors of prediction (RMSEP) compared with those obtained for single-pathlength measurements. Both fusion levels were found to deliver very robust PLSR models, with residual predictive deviations (RPD) greater than 3 (i.e., 3.22 and 3.29, respectively). The second strategy involved calculating the slopes of absorbance against pathlength at each wavelength to generate slope-derived spectra. Without the requirement to select an optimal pathlength, the predictive accuracy (RMSEP) was improved by 20-43% compared to single-pathlength spectroscopy. Compared to the nine-factor models from the fusion strategy, the PLSR model from slope-derived spectroscopy was found to be more parsimonious, with only five factors, and more robust, with a residual predictive deviation (RPD) of 3.72. It also offered excellent correlation of predicted and measured COD values, with R(2) of 0.936. In sum, variable pathlength spectroscopy with the two proposed data analysis strategies proved successful in enhancing prediction performance of COD in wastewater and showed high potential for application in on-line water quality monitoring. Copyright © 2013 Elsevier B.V. All rights reserved.
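
    The slope-derived-spectra strategy can be sketched as a per-wavelength linear fit of absorbance against pathlength (by Beer-Lambert, the slope is absorbance per unit pathlength), followed by PLSR against the COD reference values. All arrays below are synthetic placeholders:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(2)
        pathlengths = np.array([1.0, 2.0, 5.0, 10.0])            # mm
        # Absorbance cube: 40 samples x 4 pathlengths x 200 wavelengths
        A = rng.random((40, 4, 200)) * 0.01 * pathlengths[None, :, None]
        cod = rng.uniform(50.0, 400.0, size=40)                  # reference COD, mg/L

        # Slope of absorbance vs pathlength at each wavelength
        X = np.stack([np.polyfit(pathlengths, A[i], 1)[0] for i in range(40)])

        pls = PLSRegression(n_components=5).fit(X, cod)          # five-factor model
        rmsep = np.sqrt(np.mean((pls.predict(X).ravel() - cod) ** 2))
        # In practice RMSEP comes from an independent test set or cross-validation.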

  13. The availability of prior ECGs improves paramedic accuracy in recognizing ST-segment elevation myocardial infarction.

    PubMed

    O'Donnell, Daniel; Mancera, Mike; Savory, Eric; Christopher, Shawn; Schaffer, Jason; Roumpf, Steve

    2015-01-01

    Early and accurate identification of ST-elevation myocardial infarction (STEMI) by prehospital providers has been shown to significantly improve door-to-balloon times and improve patient outcomes. Previous studies have shown that paramedic accuracy in reading 12-lead ECGs can range from 86% to 94%. However, recent studies have demonstrated that accuracy diminishes for the more uncommon STEMI presentations (e.g. lateral). Unlike hospital physicians, paramedics rarely have the ability to review previous ECGs for comparison. Whether or not a prior ECG can improve paramedic accuracy is not known. We hypothesized that the availability of prior ECGs improves paramedic accuracy in ECG interpretation. A total of 130 paramedics were given a single clinical scenario. They were then randomly assigned 12 computerized prehospital ECGs, 6 with and 6 without an accompanying prior ECG. All ECGs were obtained from a local STEMI registry. For each ECG, paramedics were asked to determine whether or not there was a STEMI and to rate their confidence in their interpretation. To determine if the prior ECGs improved accuracy, we used a mixed-effects logistic regression model to calculate p-values between the control and intervention. The addition of a previous ECG improved the accuracy of identifying STEMIs from 75.5% to 80.5% (p=0.015). A previous ECG also increased paramedic confidence in their interpretation (p=0.011). The availability of previous ECGs improves paramedic accuracy and enhances their confidence in interpreting STEMIs. Further studies are needed to evaluate this impact in a clinical setting. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. A Method for Improving Hotspot Directional Signatures in BRDF Models Used for MODIS

    NASA Technical Reports Server (NTRS)

    Jiao, Ziti; Schaaf, Crystal B.; Dong, Yadong; Roman, Miguel; Hill, Michael J.; Chen, Jing M.; Wang, Zhuosen; Zhang, Hu; Saenz, Edward; Poudyal, Rajesh

    2016-01-01

    The semi-empirical, kernel-driven, linear RossThick-LiSparseReciprocal (RTLSR) Bidirectional Reflectance Distribution Function (BRDF) model is used to generate the routine MODIS BRDF/Albedo product due to its global applicability and the underlying physics. A challenge of this model in regard to surface reflectance anisotropy effects comes from its underestimation of the directional reflectance signatures near the Sun illumination direction, also known as the hotspot effect. In this study, a method has been developed for improving the ability of the RTLSR model to simulate the magnitude and width of the hotspot effect. The method corrects the volumetric scattering component of the RTLSR model using an exponential approximation of a physical hotspot kernel, which recreates the hotspot magnitude and width using two free parameters (C1 and C2, respectively). The approach allows one to reconstruct, with reasonable accuracy, the hotspot effect by adjusting or using the prior values of these two hotspot variables. Our results demonstrate that: (1) significant improvements in capturing the hotspot effect can be made by using the inverted hotspot parameters; (2) the reciprocal nature allows this method to be more adaptive for simulating the hotspot height and width with high accuracy, especially in cases where hotspot signatures are available; and (3) while the new approach is consistent with the heritage RTLSR model inversion used to estimate intrinsic narrowband and broadband albedos, it presents some differences for vegetation clumping index (CI) retrievals. With the hotspot-related model parameters determined a priori, this method offers improved performance for various ecological remote sensing applications, including the estimation of canopy structure parameters.
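
    A plausible reading of the correction is a RossThick volumetric kernel multiplied by an exponential hotspot factor with magnitude C1 and angular width C2. The sketch below uses the standard RossThick form; the exact placement of the factor is an assumption and should be checked against the paper:

        import numpy as np

        def ross_thick_hotspot(theta_s, theta_v, phi, C1=0.6, C2=0.1):
            """RossThick kernel with an exponential hotspot factor (radians)."""
            cos_xi = (np.cos(theta_s)*np.cos(theta_v)
                      + np.sin(theta_s)*np.sin(theta_v)*np.cos(phi))
            xi = np.arccos(np.clip(cos_xi, -1.0, 1.0))     # phase angle
            base = (((np.pi/2 - xi)*np.cos(xi) + np.sin(xi))
                    / (np.cos(theta_s) + np.cos(theta_v)))
            return base * (1.0 + C1*np.exp(-xi/C2)) - np.pi/4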

  15. Multilevel summation with B-spline interpolation for pairwise interactions in molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardy, David J., E-mail: dhardy@illinois.edu; Schulten, Klaus; Wolff, Matthew A.

    2016-03-21

    The multilevel summation method for calculating electrostatic interactions in molecular dynamics simulations constructs an approximation to a pairwise interaction kernel and its gradient, which can be evaluated at a cost that scales linearly with the number of atoms. The method smoothly splits the kernel into a sum of partial kernels of increasing range and decreasing variability with the longer-range parts interpolated from grids of increasing coarseness. Multilevel summation is especially appropriate in the context of dynamics and minimization, because it can produce continuous gradients. This article explores the use of B-splines to increase the accuracy of the multilevel summation method (for nonperiodic boundaries) without incurring additional computation other than a preprocessing step (whose cost also scales linearly). To obtain accurate results efficiently involves technical difficulties, which are overcome by a novel preprocessing algorithm. Numerical experiments demonstrate that the resulting method offers substantial improvements in accuracy and that its performance is competitive with an implementation of the fast multipole method in general and markedly better for Hamiltonian formulations of molecular dynamics. The improvement is great enough to establish multilevel summation as a serious contender for calculating pairwise interactions in molecular dynamics simulations. In particular, the method appears to be uniquely capable for molecular dynamics in two situations, nonperiodic boundary conditions and massively parallel computation, where the fast Fourier transform employed in the particle–mesh Ewald method falls short.

  16. Multilevel summation with B-spline interpolation for pairwise interactions in molecular dynamics simulations.

    PubMed

    Hardy, David J; Wolff, Matthew A; Xia, Jianlin; Schulten, Klaus; Skeel, Robert D

    2016-03-21

    The multilevel summation method for calculating electrostatic interactions in molecular dynamics simulations constructs an approximation to a pairwise interaction kernel and its gradient, which can be evaluated at a cost that scales linearly with the number of atoms. The method smoothly splits the kernel into a sum of partial kernels of increasing range and decreasing variability with the longer-range parts interpolated from grids of increasing coarseness. Multilevel summation is especially appropriate in the context of dynamics and minimization, because it can produce continuous gradients. This article explores the use of B-splines to increase the accuracy of the multilevel summation method (for nonperiodic boundaries) without incurring additional computation other than a preprocessing step (whose cost also scales linearly). To obtain accurate results efficiently involves technical difficulties, which are overcome by a novel preprocessing algorithm. Numerical experiments demonstrate that the resulting method offers substantial improvements in accuracy and that its performance is competitive with an implementation of the fast multipole method in general and markedly better for Hamiltonian formulations of molecular dynamics. The improvement is great enough to establish multilevel summation as a serious contender for calculating pairwise interactions in molecular dynamics simulations. In particular, the method appears to be uniquely capable for molecular dynamics in two situations, nonperiodic boundary conditions and massively parallel computation, where the fast Fourier transform employed in the particle-mesh Ewald method falls short.
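
    The kernel splitting at the core of the method can be demonstrated in one dimension: 1/r is separated into a short-range part summed exactly within a cutoff a, and a smooth long-range part evaluated on a coarse grid and interpolated at the targets. The sketch below uses a quartic (Taylor) softening and linear interpolation where the real method uses nested grids and B-spline interpolation:

        import numpy as np

        def gamma(rho):
            """Smooth continuation of 1/rho inside the unit ball."""
            return np.where(rho < 1.0,
                            15/8 - (5/4)*rho**2 + (3/8)*rho**4,
                            1.0/np.maximum(rho, 1e-12))

        def potential(targets, sources, q, a=0.5, h=0.25):
            """Two-level summation of sum_j q_j/|x - x_j| on the unit interval."""
            r = np.abs(targets[:, None] - sources[None, :])
            k_long = gamma(r/a)/a                      # smooth everywhere
            k_short = np.where(r < a, 1.0/np.maximum(r, 1e-12) - k_long, 0.0)
            phi_short = (k_short*q).sum(axis=1)        # exact near field
            grid = np.arange(0.0, 1.0 + h, h)          # coarse grid
            rg = np.abs(grid[:, None] - sources[None, :])
            phi_grid = (gamma(rg/a)/a*q).sum(axis=1)   # far field on the grid
            return phi_short + np.interp(targets, grid, phi_grid)

        rng = np.random.default_rng(3)
        xs, q = rng.random(50), rng.normal(size=50)
        xt = rng.random(20)
        exact = (q/np.abs(xt[:, None] - xs[None, :])).sum(axis=1)
        approx = potential(xt, xs, q)   # matches up to coarse-grid interpolation error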

  17. Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.

    PubMed

    Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan

    2017-01-01

    Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, compared with qualitative MR imaging, which relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. The article presents a review of the basic physical concepts used in MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging. Subsequently, the article reviews the hierarchical organization of major applicable pulse sequences used in this article, with the sequences organized into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. While this new concept offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences used for quantitative MR imaging, with a focus on the hierarchical organization of these sequences. © RSNA, 2017.
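
    As one concrete example of the quantitative approach (not drawn from the article itself), a T2 map can be computed from a multi-echo acquisition by fitting S(TE) = S0*exp(-TE/T2) voxel-wise; a log-linear least-squares sketch with hypothetical echo times and a noise-free signal:

        import numpy as np

        te = np.array([10.0, 30.0, 50.0, 70.0, 90.0]) * 1e-3   # echo times (s)
        t2_true = np.array([[0.05, 0.08], [0.10, 0.30]])       # 2x2 voxel grid (s)
        S = np.exp(-te[:, None, None] / t2_true)               # S0 = 1

        # ln S = ln S0 - TE/T2, fitted per voxel
        y = np.log(S.reshape(len(te), -1))
        slope, intercept = np.polyfit(te, y, 1)
        t2_map = (-1.0/slope).reshape(2, 2)                    # recovers t2_true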

  18. First-order analytic propagation of satellites in the exponential atmosphere of an oblate planet

    NASA Astrophysics Data System (ADS)

    Martinusi, Vladimir; Dell'Elce, Lamberto; Kerschen, Gaëtan

    2017-04-01

    The paper offers the fully analytic solution to the motion of a satellite orbiting under the influence of the two major perturbations, due to the oblateness and the atmospheric drag. The solution is presented in a time-explicit form, and takes into account an exponential distribution of the atmospheric density, an assumption that is reasonably close to reality. The approach involves two essential steps. The first one concerns a new approximate mathematical model that admits a closed-form solution with respect to a set of new variables. The second step is the determination of an infinitesimal contact transformation that allows to navigate between the new and the original variables. This contact transformation is obtained in exact form, and afterwards a Taylor series approximation is proposed in order to make all the computations explicit. The aforementioned transformation accommodates both perturbations, improving the accuracy of the orbit predictions by one order of magnitude with respect to the case when the atmospheric drag is absent from the transformation. Numerical simulations are performed for a low Earth orbit starting at an altitude of 350 km, and they show that the incorporation of drag terms into the contact transformation generates an error reduction by a factor of 7 in the position vector. The proposed method aims at improving the accuracy of analytic orbit propagation and transforming it into a viable alternative to the computationally intensive numerical methods.
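
    The drag side of such a propagator rests on an exponential density law and the standard drag acceleration; a minimal sketch with representative (not mission-calibrated) constants:

        import numpy as np

        def density(h, h0=350e3, rho0=9.5e-12, H=55e3):
            """Exponential atmosphere about reference altitude h0 (SI units);
            rho0 and H are rough values for ~350 km, not a calibrated model."""
            return rho0 * np.exp(-(h - h0) / H)

        def drag_accel(v_rel, h, Cd=2.2, A=1.0, m=500.0):
            """a_drag = -0.5 * Cd * (A/m) * rho(h) * |v_rel| * v_rel."""
            return -0.5 * Cd * (A/m) * density(h) * np.linalg.norm(v_rel) * v_rel

        a = drag_accel(np.array([7700.0, 0.0, 0.0]), h=350e3)   # m/s^2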

  19. Multilevel summation with B-spline interpolation for pairwise interactions in molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Hardy, David J.; Wolff, Matthew A.; Xia, Jianlin; Schulten, Klaus; Skeel, Robert D.

    2016-03-01

    The multilevel summation method for calculating electrostatic interactions in molecular dynamics simulations constructs an approximation to a pairwise interaction kernel and its gradient, which can be evaluated at a cost that scales linearly with the number of atoms. The method smoothly splits the kernel into a sum of partial kernels of increasing range and decreasing variability with the longer-range parts interpolated from grids of increasing coarseness. Multilevel summation is especially appropriate in the context of dynamics and minimization, because it can produce continuous gradients. This article explores the use of B-splines to increase the accuracy of the multilevel summation method (for nonperiodic boundaries) without incurring additional computation other than a preprocessing step (whose cost also scales linearly). To obtain accurate results efficiently involves technical difficulties, which are overcome by a novel preprocessing algorithm. Numerical experiments demonstrate that the resulting method offers substantial improvements in accuracy and that its performance is competitive with an implementation of the fast multipole method in general and markedly better for Hamiltonian formulations of molecular dynamics. The improvement is great enough to establish multilevel summation as a serious contender for calculating pairwise interactions in molecular dynamics simulations. In particular, the method appears to be uniquely capable for molecular dynamics in two situations, nonperiodic boundary conditions and massively parallel computation, where the fast Fourier transform employed in the particle-mesh Ewald method falls short.

  20. PhyLM: A Mission Design Concept for an Optical/Lidar Instrument to Measure Ocean Productivity and Aerosols from Space

    NASA Technical Reports Server (NTRS)

    Gervin, Janette C.; Behrenfeld, Michael; McClain, Charles R.; Spinhirne, James; Purves, Lloyd; Wood, H. John; Roberto, Michael R.

    2004-01-01

    The Physiology Lidar-Multispectral Mission (PhyLM) is intended to explore the complex ecosystems of our global oceans. New "inversion" methods and improved understanding of marine optics have opened the door to quantifying a range of critical ocean properties. This new information could revolutionize our understanding of global ocean processes, such as phytoplankton growth, harmful algal blooms, carbon fluxes between major pools, and the productivity equation. The new science requires new measurements not addressed by currently planned space missions. PhyLM will combine active and advanced passive remote sensing technologies to quantify standing stocks and fluxes of climate-critical components of the ocean carbon cycle. To meet these science requirements, it will provide multispectral bands from the far UV through the near infrared (340-1250 nm) at a ground resolution of 250 m. Improved detectors, filters, mirrors, digitization, and focal plane design will offer an overall higher-quality data product. The unprecedented accuracy and precision of the absolute water-leaving radiances will support inversion-based quantification of an expanded set of ocean carbon cycle components. The dual-wavelength (532 and 1064 nm) Nd:YAG lidar will enhance the accuracy and precision of the passive data by providing aerosol profiles for atmospheric correction and coincident active measurements of backscattering. The lidar will also examine dark-side fluorescence as an additional approach to quantifying phytoplankton biomass in highly productive regions.

  1. Prospects for higher spatial resolution quantitative X-ray analysis using transition element L-lines

    NASA Astrophysics Data System (ADS)

    Statham, P.; Holland, J.

    2014-03-01

    Lowering the electron beam kV reduces electron scattering and improves the spatial resolution of X-ray analysis. However, a previous round-robin analysis of steels at 5-6 kV using the Lα-lines of the first-row transition elements gave poor accuracies. Our experiments on SS63 steel using Lα-lines show similar biases in Cr and Ni that cannot be corrected by changes to self-absorption coefficients or carbon coating. The inaccuracy may be caused by different probabilities for emission and anomalous self-absorption of the Lα-line between the specimen and the pure-element standard. Analysis using the Ll (L3-M1) lines gives more accurate results for SS63, plausibly because the M1 shell is not as vulnerable to the atomic environment as the unfilled M4,5 shell. However, Ll intensities are very weak and WDS analysis may be impractical for some applications. EDS with a large-area SDD offers orders-of-magnitude faster analysis and achieves results similar to WDS analysis with Lα-lines, but its poorer energy resolution precludes the use of Ll-lines in most situations. EDS analysis of K-lines at low overvoltage is an alternative strategy for improving spatial resolution that could give higher accuracy. The trade-off between low kV and low overvoltage is explored in terms of sensitivity for element detection for different elements.

  2. Is multiple-sequence alignment required for accurate inference of phylogeny?

    PubMed

    Höhl, Michael; Ragan, Mark A

    2007-04-01

    The process of inferring phylogenetic trees from molecular sequences almost always starts with a multiple alignment of these sequences but can also be based on methods that do not involve multiple sequence alignment. Very little is known about the accuracy with which such alignment-free methods recover the correct phylogeny or about the potential for increasing their accuracy. We conducted a large-scale comparison of ten alignment-free methods, among them one new approach that does not calculate distances and a faster variant of our pattern-based approach; all distance-based alignment-free methods are freely available from http://www.bioinformatics.org.au (as Python package decaf+py). We show that most methods exhibit a higher overall reconstruction accuracy in the presence of high among-site rate variation. Under all conditions that we considered, variants of the pattern-based approach were significantly better than the other alignment-free methods. The new pattern-based variant achieved a speed-up of an order of magnitude in the distance calculation step, accompanied by a small loss of tree reconstruction accuracy. A method of Bayesian inference from k-mers did not improve on classical alignment-free (and distance-based) methods but may still offer other advantages due to its Bayesian nature. We found the optimal word length k of word-based methods to be stable across various data sets, and we provide parameter ranges for two different alphabets. The influence of these alphabets was analyzed to reveal a trade-off in reconstruction accuracy between long and short branches. We have mapped the phylogenetic accuracy for many alignment-free methods, among them several recently introduced ones, and increased our understanding of their behavior in response to biologically important parameters. In all experiments, the pattern-based approach emerged as superior, at the expense of higher resource consumption. Nonetheless, no alignment-free method that we examined recovers the correct phylogeny as accurately as does an approach based on maximum-likelihood distance estimates of multiply aligned sequences.
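    A minimal sketch of one common word-based, alignment-free distance (normalized k-mer frequency profiles compared by Euclidean distance); it illustrates the general family discussed above, not the specific pattern-based method of the paper.

```python
from collections import Counter
from itertools import product
from math import sqrt

def kmer_profile(seq, k):
    """Normalized k-mer frequency vector over the DNA alphabet."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: counts[w] / total for w in map("".join, product("ACGT", repeat=k))}

def euclidean_kmer_distance(s1, s2, k=5):
    """A simple word-based, alignment-free distance between two sequences."""
    p1, p2 = kmer_profile(s1, k), kmer_profile(s2, k)
    return sqrt(sum((p1[w] - p2[w]) ** 2 for w in p1))

print(euclidean_kmer_distance("ACGTACGTGGTACCA", "ACGTTCGTGGAACCA", k=3))
```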

  3. Optimized hardware framework of MLP with random hidden layers for classification applications

    NASA Astrophysics Data System (ADS)

    Zyarah, Abdullah M.; Ramesh, Abhishek; Merkel, Cory; Kudithipudi, Dhireesha

    2016-05-01

    Multilayer Perceptron networks with random hidden layers are very efficient at automatic feature extraction and offer significant performance improvements in the training process. They essentially employ a large collection of fixed, random features and are expedient for form-factor-constrained embedded platforms. In this work, a reconfigurable and scalable architecture is proposed for MLPs with random hidden layers, with a customized building block based on the CORDIC algorithm. The proposed architecture also exploits fixed-point operations for area efficiency. The design is validated for classification on two different datasets: an accuracy of ~90% was observed for the MNIST dataset and 75% for gender classification on the LFW dataset. The hardware achieves a 299× speed-up over the corresponding software realization.
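    A minimal software sketch of the underlying idea, fixing a random hidden layer and training only the linear readout by least squares; the data, tanh activation, and layer sizes are illustrative, and the paper's CORDIC-based fixed-point hardware blocks are not modeled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a nonlinear two-class problem (stand-in for MNIST/LFW features)
X = rng.normal(size=(200, 8))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# Fixed random hidden layer: weights are drawn once and never trained
n_hidden = 64
W = rng.normal(size=(8, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)            # random feature expansion

# Training reduces to a linear least-squares solve for the output weights
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
accuracy = np.mean((H @ beta > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```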

  4. High quality optically polished aluminum mirror and process for producing

    NASA Technical Reports Server (NTRS)

    Lyons, III, James J. (Inventor); Zaniewski, John J. (Inventor)

    2005-01-01

    A new technical advancement in the field of precision aluminum optics permits high-quality optical polishing of an aluminum monolith, which, in the field of optics, offers numerous benefits because of its machinability, light weight, and low cost. This invention combines diamond turning and conventional polishing, along with India ink, a newly adopted polishing material, to accomplish a significant improvement in the surface precision of aluminum monoliths for optical purposes. The invention guarantees precise optical polishing of typical bare aluminum monoliths to a surface roughness of less than about 30 angstroms rms, and preferably about 5 angstroms rms, while maintaining a surface figure accuracy, in terms of surface figure error, of not more than one-fifteenth of a wave peak-to-valley.

  5. High quality optically polished aluminum mirror and process for producing

    NASA Technical Reports Server (NTRS)

    Lyons, III, James J. (Inventor); Zaniewski, John J. (Inventor)

    2002-01-01

    A new technical advancement in the field of precision aluminum optics permits high-quality optical polishing of an aluminum monolith, which, in the field of optics, offers numerous benefits because of its machinability, light weight, and low cost. This invention combines diamond turning and conventional polishing, along with India ink, a newly adopted polishing material, to accomplish a significant improvement in the surface precision of aluminum monoliths for optical purposes. The invention guarantees precise optical polishing of typical bare aluminum monoliths to a surface roughness of less than about 30 angstroms rms, and preferably about 5 angstroms rms, while maintaining a surface figure accuracy, in terms of surface figure error, of not more than one-fifteenth of a wave peak-to-valley.

  6. NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Marks, Virginia B. (Compiler); Keckler, Claude R. (Compiler)

    1994-01-01

    Although significant advances have been made in modeling and controlling flexible systems, there remains a need for improvements in model accuracy and in control performance. The finite element models of flexible systems are unduly complex and nearly intractable for optimum parameter estimation when refining against experimental data. Distributed parameter or continuum modeling offers some advantages, and some challenges, in both modeling and control. Continuum models often involve a significantly reduced number of model parameters, thereby enabling optimum parameter estimation. The dynamic equations of motion of continuum models allow the control system dynamics to be embedded, thus forming a complete set of system dynamics. The continuum model approach also provides increased insight.

  7. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, flow characteristics, and human thermal comfort related to the PLSS performance. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operations at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  8. Protein Function Prediction: Problems and Pitfalls.

    PubMed

    Pearson, William R

    2015-09-03

    The characterization of new genomes based on their protein sets has been revolutionized by new sequencing technologies, but biologists seeking to exploit new sequence information are often frustrated by the challenges associated with accurately assigning biological functions to newly identified proteins. Here, we highlight some of the challenges in functional inference from sequence similarity. Investigators can improve the accuracy of function prediction by (1) being conservative about the evolutionary distance to a protein of known function; (2) considering the ambiguous meaning of "functional similarity"; and (3) being aware of the limitations of annotations in functional databases. Protein function prediction does not offer "one-size-fits-all" solutions. Prediction strategies work better when the idiosyncrasies of function and functional annotation are better understood. Copyright © 2015 John Wiley & Sons, Inc.

  9. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Avik; Sun, Xin; Sundaresan, Sankaran

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.

  10. Feature instructions improve face-matching accuracy

    PubMed Central

    Bindemann, Markus

    2018-01-01

    Identity comparisons of photographs of unfamiliar faces are prone to error but important for applied settings, such as person identification at passport control. Finding techniques to improve face-matching accuracy is therefore an important contemporary research topic. This study investigated whether matching accuracy can be improved by instruction to attend to specific facial features. Experiment 1 showed that instruction to attend to the eyebrows enhanced matching accuracy for optimized same-day same-race face pairs but not for other-race faces. By contrast, accuracy was unaffected by instruction to attend to the eyes, and declined with instruction to attend to ears. Experiment 2 replicated the eyebrow-instruction improvement with a different set of same-race faces, comprising both optimized same-day and more challenging different-day face pairs. These findings suggest that instruction to attend to specific features can enhance face-matching accuracy, but feature selection is crucial and generalization across face sets may be limited. PMID:29543822

  11. Laser marking as a result of applying reverse engineering

    NASA Astrophysics Data System (ADS)

    Mihalache, Andrei; Nagîţ, Gheorghe; Rîpanu, Marius Ionuţ; Slǎtineanu, Laurenţiu; Dodun, Oana; Coteaţǎ, Margareta

    2018-05-01

    The elaboration of a modern manufacturing technology needs a certain quantum of information concerning the part to be obtained. When the technology must be elaborated for an existing object, such information can be ensured by applying the principles of reverse engineering. Essentially, in this method, the analysis of the surfaces and other characteristics of the part must offer enough information for the elaboration of the part manufacturing technology. On the other hand, laser marking is a processing method able to transfer various inscriptions or drawings onto a part. Sometimes the laser marking can be based on the analysis of an existing object, whose image can be used to generate the same object or an improved one. Many groups of factors can affect the results of the laser marking process. A theoretical analysis was proposed to show that the heights of triangles obtained by means of CNC marking equipment depend on the width of the line generated by the laser spot on the workpiece surface. An experimental study was designed and carried out to highlight the influence exerted by the line width and the angle of line intersections on the accuracy of the marking process. By mathematical processing of the experimental results, empirical mathematical models were determined. The power-type model, and the graphical representation elaborated on its basis, offered an image of the influences exerted by the considered input factors on the marking process accuracy.
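    A minimal sketch of fitting a power-type empirical model by log-linear least squares, the standard way such models are obtained; the line-width/angle/response values below are hypothetical placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical measurements: line width w (mm), intersection angle theta (deg),
# and a marking-accuracy response y -- placeholders, not the paper's data.
w     = np.array([0.10, 0.10, 0.20, 0.20, 0.30, 0.30])
theta = np.array([30.0, 60.0, 30.0, 60.0, 30.0, 60.0])
y     = np.array([0.088, 0.108, 0.201, 0.247, 0.327, 0.402])

# Power-type model y = C * w^a * theta^b becomes linear after taking logs:
# ln y = ln C + a ln w + b ln theta
A = np.column_stack([np.ones_like(w), np.log(w), np.log(theta)])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
C, a, b = np.exp(coef[0]), coef[1], coef[2]
print(f"y ~= {C:.3f} * w^{a:.2f} * theta^{b:.2f}")
```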

  12. Modelling compression sensing in ionic polymer metal composites

    NASA Astrophysics Data System (ADS)

    Volpini, Valentina; Bardella, Lorenzo; Rodella, Andrea; Cha, Youngsu; Porfiri, Maurizio

    2017-03-01

    Ionic polymer metal composites (IPMCs) consist of an ionomeric membrane, including mobile counterions, sandwiched between two thin noble metal electrodes. IPMCs find application as sensors and actuators, where an imposed mechanical loading generates a voltage across the electrodes, and, vice versa, an imposed electric field causes deformation. Here, we present a predictive modelling approach to elucidate the dynamic sensing response of IPMCs subject to a time-varying through-the-thickness compression (‘compression sensing’). The model relies on the continuum theory recently developed by Porfiri and co-workers, which couples finite deformations to the modified Poisson-Nernst-Planck (PNP) system governing the IPMC electrochemistry. For the ‘compression sensing’ problem we establish a perturbative closed-form solution along with a finite element (FE) solution. The systematic comparison between these two solutions is a central contribution of this study, offering insight on accuracy and mathematical complexity. The method of matched asymptotic expansions is employed to find the analytical solution. To this end, we uncouple the force balance from the modified PNP system and separately linearise the PNP equations in the ionomer bulk and in the boundary layers at the ionomer-electrode interfaces. Comparison with FE results for the fully coupled nonlinear system demonstrates the accuracy of the analytical solution to describe IPMC sensing for moderate deformation levels. We finally demonstrate the potential of the modelling scheme to accurately reproduce experimental results from the literature. The proposed model is expected to aid in the design of IPMC sensors, contribute to an improved understanding of IPMC electrochemomechanical response, and offer insight into the role of nonlinear phenomena across mechanics and electrochemistry.
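    For orientation, the standard Poisson-Nernst-Planck system for a single mobile counterion species (concentration c, potential ψ); the paper's modified PNP and its coupling to finite deformations add terms beyond this sketch.

```latex
% Standard Poisson-Nernst-Planck system for one mobile ionic species;
% the paper's modified PNP adds steric and finite-deformation couplings
% not shown here.
\begin{align}
  \frac{\partial c}{\partial t}
    &= \nabla \cdot \left[ D \left( \nabla c
       + \frac{zF}{RT}\, c\, \nabla \psi \right) \right]
       && \text{(Nernst--Planck mass balance)} \\
  \nabla \cdot \left( \varepsilon \nabla \psi \right)
    &= -F z \left( c - c_0 \right)
       && \text{(Poisson equation, fixed charge } c_0\text{)}
\end{align}
```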

  13. Improved accuracy and precision of tracer kinetic parameters by joint fitting to variable flip angle and dynamic contrast enhanced MRI data.

    PubMed

    Dickie, Ben R; Banerji, Anita; Kershaw, Lucy E; McPartlin, Andrew; Choudhury, Ananya; West, Catharine M; Rose, Chris J

    2016-10-01

    To improve the accuracy and precision of tracer kinetic model parameter estimates for use in dynamic contrast enhanced (DCE) MRI studies of solid tumors. Quantitative DCE-MRI requires an estimate of precontrast T1, which is obtained prior to fitting a tracer kinetic model. As the T1 mapping and tracer kinetic signal models are both a function of precontrast T1, it was hypothesized that joint estimation would improve the accuracy and precision of both precontrast T1 and the tracer kinetic model parameters. Accuracy and/or precision of two-compartment exchange model (2CXM) parameters were evaluated for standard and joint fitting methods in well-controlled synthetic data and for 36 bladder cancer patients. Methods were compared under a number of experimental conditions. In synthetic data, joint estimation led to statistically significant improvements in the accuracy of estimated parameters in 30 of 42 conditions (improvements between 1.8% and 49%). Reduced accuracy was observed in 7 of the remaining 12 conditions. Significant improvements in precision were observed in 35 of 42 conditions (between 4.7% and 50%). In clinical data, significant improvements in precision were observed in 18 of 21 conditions (between 4.6% and 38%). Accuracy and precision of DCE-MRI parameter estimates are improved when signal models are fit jointly rather than sequentially. Magn Reson Med 76:1270-1281, 2016. © 2015 Wiley Periodicals, Inc.
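    A minimal toy illustration of joint versus sequential fitting: two signal models share a parameter T (standing in for precontrast T1), and stacking their residuals lets a single least-squares solve estimate everything jointly. The models and data are illustrative, not the paper's VFA or 2CXM equations.

```python
import numpy as np
from scipy.optimize import least_squares

# Two toy experiments share the parameter T; data are synthetic.
rng = np.random.default_rng(1)
t1, t2 = np.linspace(0.1, 5, 20), np.linspace(0.1, 5, 20)
T_true, A_true, B_true = 1.5, 2.0, 0.8
y1 = A_true * np.exp(-t1 / T_true) + rng.normal(0, 0.02, t1.size)
y2 = B_true * (1 - np.exp(-t2 / T_true)) + rng.normal(0, 0.02, t2.size)

def joint_residuals(p):
    """Stack residuals of both experiments so T is estimated jointly."""
    A, B, T = p
    r1 = A * np.exp(-t1 / T) - y1
    r2 = B * (1 - np.exp(-t2 / T)) - y2
    return np.concatenate([r1, r2])

fit = least_squares(joint_residuals, x0=[1.0, 1.0, 1.0])
print(dict(zip(["A", "B", "T"], np.round(fit.x, 3))))
```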

  14. Systematic reviews of diagnostic tests in endocrinology: an audit of methods, reporting, and performance.

    PubMed

    Spencer-Bonilla, Gabriela; Singh Ospina, Naykky; Rodriguez-Gutierrez, Rene; Brito, Juan P; Iñiguez-Ariza, Nicole; Tamhane, Shrikant; Erwin, Patricia J; Murad, M Hassan; Montori, Victor M

    2017-07-01

    Systematic reviews provide clinicians and policymakers estimates of diagnostic test accuracy and their usefulness in clinical practice. We identified all available systematic reviews of diagnosis in endocrinology, summarized the diagnostic accuracy of the tests included, and assessed the credibility and clinical usefulness of the methods and reporting. We searched Ovid MEDLINE, EMBASE, and Cochrane CENTRAL from inception to December 2015 for systematic reviews and meta-analyses reporting accuracy measures of diagnostic tests in endocrinology. Experienced reviewers independently screened for eligible studies and collected data. We summarized the results, methods, and reporting of the reviews. We performed subgroup analyses to categorize diagnostic tests as most useful based on their accuracy. We identified 84 systematic reviews; half of the tests included were classified as helpful when positive, one-fourth as helpful when negative. Most authors adequately reported how studies were identified and selected and how their trustworthiness (risk of bias) was judged. Only one in three reviews, however, reported an overall judgment about trustworthiness, and one in five reported using adequate meta-analytic methods. One in four reported contacting authors for further information, and about half included only patients with diagnostic uncertainty. Up to half of the diagnostic endocrine tests for which a likelihood ratio was calculated or provided are likely to be helpful in practice when positive, as are one-quarter when negative. Most diagnostic systematic reviews in endocrinology lack methodological rigor and protection against bias, and offer limited credibility. Substantial efforts therefore seem necessary to improve the quality of diagnostic systematic reviews in endocrinology.

  15. Absorption Mode FT-ICR Mass Spectrometry Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Donald F.; Kilgour, David P.; Konijnenburg, Marco

    2013-12-03

    Fourier transform ion cyclotron resonance mass spectrometry offers the highest mass resolving power for molecular imaging experiments. This high mass resolving power ensures that closely spaced peaks at the same nominal mass are resolved for proper image generation. Typically, higher magnetic fields are used to increase mass resolving power. However, a gain in mass resolving power can also be realized by phase correction of the data for absorption mode display. In addition to mass resolving power, absorption mode offers higher mass accuracy and signal-to-noise ratio than the conventional magnitude mode. Here we present the first use of absorption mode for Fourier transform ion cyclotron resonance mass spectrometry imaging. The Autophaser algorithm is used to phase correct each spectrum (pixel) in the image, and these parameters are then used by the Chameleon workflow-based data processing software to generate absorption mode "Datacubes" for image and spectral viewing. Absorption mode reveals new mass and spatial features that are not resolved in magnitude mode and results in improved selected-ion image contrast.

  16. Public-use blood pressure measurement: the kiosk quandary.

    PubMed

    Alpert, Bruce S; Dart, Richard A; Sica, Domenic A

    2014-10-01

    It is important to note the opportunity that validated public-use kiosks offer the U.S. healthcare system in terms of ease of public access, reduced cost of screening/monitoring, and the opportunity to support coordinated care between physicians, pharmacists, and patients. It is equally important to recognize that not all public-use BP kiosks are equivalent. Members of the AAMI Sphygmomanometer Committee and other "concerned citizens" are working with FDA officials to try to improve both the device validation and the cuff range performance of these devices. In reality, regulatory changes will be slow to take effect, and for the foreseeable future the burden of device accuracy assessment lies with the private sector and the public. There is a device currently available that has undergone full validation testing and offers a wide-range cuff validated for almost all US adult arms. We recognize the importance of innovation in out-of-office BP measurement. Therefore, in the interest of public health, we strongly urge those business professionals buying such devices, and those health professionals advising patients on their use, to become better informed and more discriminating in their device selection.

  17. Researches on the Orbit Determination and Positioning of the Chinese Lunar Exploration Program

    NASA Astrophysics Data System (ADS)

    Li, P. J.

    2015-07-01

    This dissertation studies the precise orbit determination (POD) and positioning of the Chinese lunar exploration spacecraft, emphasizing the variety of VLBI (very long baseline interferometry) technologies applied to deep-space exploration and their contributions to the methods and accuracies of precise orbit determination and positioning. In summary, the main contents are as follows. Using real-time data measured from the CE-2 (Chang'E-2) probe, the accuracy of orbit determination is analyzed for a domestic lunar probe under present conditions, and the role played by the VLBI tracking data is reassessed through precision orbit determination experiments for CE-2. The short-arc orbit determination experiments for the lunar probe show that combining ranging and VLBI data over an arc of 15 minutes improves the accuracy by 1-1.5 orders of magnitude compared with using only ranging data over an arc of 3 hours. The orbital accuracy is assessed through orbital overlap analysis, and the results show that the VLBI data contribute to CE-2's long-arc POD especially in the along-track and orbit-normal directions. For CE-2's 100 km × 100 km lunar orbit, the position errors are better than 30 meters, and for its 15 km × 100 km orbit, better than 45 meters. Observational data from the delta differential one-way ranging (ΔDOR) experiments of CE-2's X-band monitoring and control system are analyzed. It is concluded that the accuracy of the ΔDOR delay is dramatically improved, with a noise level better than 0.1 ns, and that the systematic errors are well calibrated. Although the CE-2 tracking data cannot support the development of an independent lunar gravity model, they allow different lunar gravity models to be evaluated through POD, with accuracies examined in terms of orbit-to-orbit solution differences. It is found that for the 100 km × 100 km lunar orbit, with a degree and order expansion up to 165, JPL's gravity model LP165P does not show noticeable improvement over Japan's SGM series models (100 × 100), but for the 15 km × 100 km lunar orbit a higher degree-order model can significantly improve the orbit accuracy. After accomplishing its nominal mission, CE-2 carried out extended missions involving the L2 mission and the 4179 Toutatis mission. During the extended missions, the flight regime offers very little dynamical information and thus requires an extensive amount of time and tracking data to attain a solution. The overlap errors are computed, and the results indicate that the use of VLBI measurements increases the accuracy and reduces the total tracking time. An orbit determination method based on polynomial fitting is proposed for CE-3's planned lunar soft-landing mission. In this method, dynamic modeling of the spacecraft is not necessary, and by making full use of all-arc observational data its noise reduction is expected to be better than that of the point positioning method. Simulation experiments and real data processing showed that the optimal description of CE-1's free-fall landing trajectory is a set of fifth-order polynomial functions for each of the position and velocity components in J2000.0.
The combination of VLBI delay, delay-rate data, and USB (unified S-band) ranging data significantly improved the accuracy compared with using USB data alone. In order to determine the position of the CE-3 lunar lander, a kinematic statistical method is proposed. This method uses both ranging and VLBI measurements of the lander over a continuous arc, combined with precise knowledge of the Moon's motion as provided by planetary ephemerides, to estimate the lander's position on the lunar surface with high accuracy. Applying a lunar digital elevation model (DEM) as a constraint in the lander positioning is helpful. Positioning methods for the lunar rover's traverse are also investigated; the integrated delay-rate method achieves more precise positioning results than the point positioning method, broadening the application of the VLBI data. The automated sample-return mission involves lunar-orbit rendezvous and docking, so precise orbit determination using same-beam VLBI (SBI) measurements of two spacecraft at the same time is analyzed. The simulation results showed that SBI data can improve the absolute and relative orbit accuracy for two targets by 1-2 orders of magnitude. In order to verify the simulation results and test the two-target POD software developed by SHAO (Shanghai Astronomical Observatory), real SBI data from SELENE (Selenological and Engineering Explorer) are processed. The POD results for Rstar and Vstar showed that incorporating SBI data significantly improves the accuracy for both spacecraft, especially for Vstar, which has less ranging data; its POD accuracy improves by approximately one order of magnitude, reaching the POD accuracy of Rstar.
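    A minimal sketch of the polynomial-fitting idea described above for the landing trajectory: a fifth-order polynomial is fit to each position component versus time, and velocity follows from the same fit. The trajectory below is synthetic, not CE-1/CE-3 tracking data.

```python
import numpy as np

# Synthetic descent-like trajectory component (m) versus time (s)
t = np.linspace(0.0, 600.0, 121)
x = 1.5e6 - 1.6e3 * t + 0.4 * t**2
x_noisy = x + np.random.default_rng(2).normal(0, 5.0, t.size)

coeffs = np.polyfit(t, x_noisy, deg=5)      # one polynomial per component
x_fit = np.polyval(coeffs, t)
v_fit = np.polyval(np.polyder(coeffs), t)   # velocity from the same fit

print(f"rms position residual: {np.std(x_fit - x_noisy):.2f} m")
```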

  18. Disaster damage detection through synergistic use of deep learning and 3D point cloud features derived from very high resolution oblique aerial images, and multiple-kernel-learning

    NASA Astrophysics Data System (ADS)

    Vetrivel, Anand; Gerke, Markus; Kerle, Norman; Nex, Francesco; Vosselman, George

    2018-06-01

    Oblique aerial images offer views of both building roofs and façades, and thus have been recognized as a potential source to detect severe building damages caused by destructive disaster events such as earthquakes. Therefore, they represent an important source of information for first responders or other stakeholders involved in the post-disaster response process. Several automated methods based on supervised learning have already been demonstrated for damage detection using oblique airborne images. However, they often do not generalize well when data from new unseen sites need to be processed, hampering their practical use. Reasons for this limitation include image and scene characteristics, though the most prominent one relates to the image features being used for training the classifier. Recently features based on deep learning approaches, such as convolutional neural networks (CNNs), have been shown to be more effective than conventional hand-crafted features, and have become the state-of-the-art in many domains, including remote sensing. Moreover, often oblique images are captured with high block overlap, facilitating the generation of dense 3D point clouds - an ideal source to derive geometric characteristics. We hypothesized that the use of CNN features, either independently or in combination with 3D point cloud features, would yield improved performance in damage detection. To this end we used CNN and 3D features, both independently and in combination, using images from manned and unmanned aerial platforms over several geographic locations that vary significantly in terms of image and scene characteristics. A multiple-kernel-learning framework, an effective way for integrating features from different modalities, was used for combining the two sets of features for classification. The results are encouraging: while CNN features produced an average classification accuracy of about 91%, the integration of 3D point cloud features led to an additional improvement of about 3% (i.e. an average classification accuracy of 94%). The significance of 3D point cloud features becomes more evident in the model transferability scenario (i.e., training and testing samples from different sites that vary slightly in the aforementioned characteristics), where the integration of CNN and 3D point cloud features significantly improved the model transferability accuracy up to a maximum of 7% compared with the accuracy achieved by CNN features alone. Overall, an average accuracy of 85% was achieved for the model transferability scenario across all experiments. Our main conclusion is that such an approach qualifies for practical use.
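    A minimal sketch of combining two feature modalities through a weighted kernel sum, the core operation in multiple-kernel-learning; the feature matrices, RBF kernels, and fixed weight below are illustrative stand-ins (a full MKL solver would learn the weight).

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Hypothetical CNN and 3D point cloud feature matrices for 120 patches
rng = np.random.default_rng(3)
X_cnn, X_3d = rng.normal(size=(120, 64)), rng.normal(size=(120, 16))
y = (X_cnn[:, 0] + X_3d[:, 0] > 0).astype(int)   # toy damage labels
train, test = np.arange(90), np.arange(90, 120)

K_cnn = rbf_kernel(X_cnn, X_cnn[train], gamma=0.01)
K_3d = rbf_kernel(X_3d, X_3d[train], gamma=0.05)
w = 0.7                                  # modality weight; MKL learns this
K = w * K_cnn + (1 - w) * K_3d           # a weighted kernel sum is still a kernel

clf = SVC(kernel="precomputed").fit(K[train], y[train])
print("test accuracy:", clf.score(K[test], y[test]))
```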

  19. Improved Space Object Orbit Determination Using CMOS Detectors

    NASA Astrophysics Data System (ADS)

    Schildknecht, T.; Peltonen, J.; Sännti, T.; Silha, J.; Flohrer, T.

    2014-09-01

    CMOS sensors, or in general active pixel sensors (APS), are rapidly replacing CCDs in the consumer camera market. Due to significant technological advances during the past years, these devices are starting to compete with CCDs for demanding scientific imaging applications as well, in particular in the astronomy community. CMOS detectors offer a series of inherent advantages over CCDs, due to the structure of their basic pixel cells, each of which contains its own amplifier and readout electronics. The most prominent advantages for space object observations are the extremely fast and flexible readout capabilities, the feasibility of electronic shuttering and precise epoch registration, and the potential to perform image processing operations on-chip and in real time. The major challenges and design drivers for ground-based and space-based optical observation strategies have been analyzed. CMOS detector characteristics were critically evaluated and compared with the established CCD technology, especially with respect to the above-mentioned observations. Similarly, the desirable on-chip processing functionalities that would further enhance object detection and image segmentation were identified. Finally, we simulated several observation scenarios for ground- and space-based sensors assuming different observation and sensor properties. We introduce end-to-end simulations of the ground- and space-based strategies in order to investigate the orbit determination accuracy and its sensitivity to different values of the frame rate, pixel scale, and astrometric and epoch registration accuracies. Two cases were simulated: a survey using a ground-based sensor to observe objects in LEO for surveillance applications, and a statistical survey with a space-based sensor orbiting in LEO observing small-size debris in LEO. The ground-based LEO survey uses a dynamical fence close to the Earth shadow a few hours after sunset. For the space-based scenario, a sensor in a sun-synchronous LEO orbit, always pointing in the anti-sun direction to achieve optimum illumination conditions for small LEO debris, was simulated. For the space-based scenario the simulations showed a 20-130% improvement in the accuracy of all orbital parameters when varying the frame rate from 1/3 fps, the fastest rate for a typical CCD detector, to 50 fps, the highest rate of scientific CMOS cameras. Changing the epoch registration accuracy from a typical 20.0 ms for a mechanical shutter to 0.025 ms, the theoretical value for the electronic shutter of a CMOS camera, improved the orbit accuracy by 4 to 190%. The ground-based scenario also benefits from the specific CMOS characteristics, but to a lesser extent.

  20. Accuracy of MSCT Coronary Angiography with 64 Row CT Scanner—Facing the Facts

    PubMed Central

    Wehrschuetz, M.; Wehrschuetz, E.; Schuchlenz, H.; Schaffler, G.

    2010-01-01

    Improvements in multislice computed tomography (MSCT) angiography of the coronary vessels have enabled the minimally invasive detection of coronary artery stenoses, while quantitative coronary angiography (QCA) remains the accepted reference standard for their evaluation. Sixteen-slice MSCT showed promising diagnostic accuracy in detecting haemodynamically significant coronary artery stenoses, and the subsequent introduction of 64-slice scanners promised excellent and fast results for coronary artery studies. This prompted us to evaluate the diagnostic accuracy, sensitivity, specificity, and the negative and positive predictive values of 64-slice MSCT in the detection of haemodynamically significant coronary artery stenoses. Thirty-seven consecutive subjects with suspected coronary artery disease were evaluated with MSCT angiography and the results compared with QCA. All vessels were considered for the assessment of significant coronary artery stenosis (diameter reduction ≥ 50%). Thirteen patients (35%) were identified as having significant coronary artery stenoses on QCA, with 6.3% (35/555) of segments affected. None of the coronary segments were excluded from analysis. Overall, the sensitivity of 64-slice MSCT for classifying stenoses was 69%, specificity was 92%, positive predictive value was 38%, and negative predictive value was 98%. The interobserver variability for detection of significant lesions had a kappa value of 0.43. Sixty-four-slice MSCT offers the diagnostic potential to detect coronary artery disease, to quantify haemodynamically significant coronary artery stenoses, and to avoid unnecessary invasive coronary artery examinations. PMID:20567636
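    As a check, the reported predictive values follow from the stated sensitivity, specificity, and per-segment prevalence via Bayes' rule:

```python
# Reproducing the reported predictive values from sensitivity, specificity,
# and the 6.3% per-segment prevalence given in the abstract (Bayes' rule).
sens, spec, prev = 0.69, 0.92, 35 / 555

ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")   # ~37% and ~98%, matching ~38%/98%
```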

  1. Evaluation of accelerated iterative x-ray CT image reconstruction using floating point graphics hardware.

    PubMed

    Kole, J S; Beekman, F J

    2006-02-21

    Statistical reconstruction methods offer possibilities to improve image quality as compared with analytical methods, but current reconstruction times prohibit routine application in clinical and micro-CT. In particular, for cone-beam x-ray CT, the use of graphics hardware has been proposed to accelerate the forward and back-projection operations, in order to reduce reconstruction times. In the past, wide application of this texture hardware mapping approach was hampered owing to limited intrinsic accuracy. Recently, however, floating point precision has become available in the latest generation commodity graphics cards. In this paper, we utilize this feature to construct a graphics hardware accelerated version of the ordered subset convex reconstruction algorithm. The aims of this paper are (i) to study the impact of using graphics hardware acceleration for statistical reconstruction on the reconstructed image accuracy and (ii) to measure the speed increase one can obtain by using graphics hardware acceleration. We compare the unaccelerated algorithm with the graphics hardware accelerated version, and for the latter we consider two different interpolation techniques. A simulation study of a micro-CT scanner with a mathematical phantom shows that at almost preserved reconstructed image accuracy, speed-ups of a factor 40 to 222 can be achieved, compared with the unaccelerated algorithm, and depending on the phantom and detector sizes. Reconstruction from physical phantom data reconfirms the usability of the accelerated algorithm for practical cases.
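    For orientation, a minimal sketch of an iterative statistical reconstruction loop (classical MLEM rather than the paper's ordered subset convex algorithm), showing the repeated forward and back projections that the graphics hardware accelerates; the dense toy system matrix stands in for real ray tracing.

```python
import numpy as np

# Tiny dense stand-in for the projection operator (real CT uses ray tracing)
rng = np.random.default_rng(4)
A = rng.uniform(0.0, 1.0, size=(64, 16))   # 64 rays x 16 image pixels
x_true = rng.uniform(0.5, 1.5, size=16)
y = A @ x_true                             # noiseless measurements

# MLEM: x <- x / (A^T 1) * A^T (y / (A x)); forward and back projections
# dominate the cost, which is what graphics hardware accelerates.
x = np.ones(16)
norm = A.T @ np.ones(64)
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / norm

print("max relative error:", np.max(np.abs(x - x_true) / x_true))
```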

  2. Breast Cancer Detection by B7-H3-Targeted Ultrasound Molecular Imaging.

    PubMed

    Bachawal, Sunitha V; Jensen, Kristin C; Wilson, Katheryne E; Tian, Lu; Lutz, Amelie M; Willmann, Jürgen K

    2015-06-15

    Ultrasound complements mammography as an imaging modality for breast cancer detection, especially in patients with dense breast tissue, but its utility is limited by low diagnostic accuracy. One emerging molecular tool to address this limitation involves contrast-enhanced ultrasound using microbubbles targeted to molecular signatures on tumor neovasculature. In this study, we illustrate how tumor vascular expression of B7-H3 (CD276), a member of the B7 family of ligands for T-cell coregulatory receptors, can be incorporated into an ultrasound method that can distinguish normal, benign, precursor, and malignant breast pathologies for diagnostic purposes. Through an IHC analysis of 248 human breast specimens, we found that vascular expression of B7-H3 was selectively and significantly higher in breast cancer tissues. B7-H3 immunostaining on blood vessels distinguished benign/precursors from malignant lesions with high diagnostic accuracy in human specimens. In a transgenic mouse model of cancer, the B7-H3-targeted ultrasound imaging signal was increased significantly in breast cancer tissues and highly correlated with ex vivo expression levels of B7-H3 on quantitative immunofluorescence. Our findings offer a preclinical proof of concept for the use of B7-H3-targeted ultrasound molecular imaging as a tool to improve the diagnostic accuracy of breast cancer detection in patients. ©2015 American Association for Cancer Research.

  3. Kinematic-PPP using Single/Dual Frequency Observations from (GPS, GLONASS and GPS/GLONASS) Constellations for Hydrography

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf

    2018-03-01

    Global Positioning System (GPS) technology is ideally suited for inshore and offshore positioning because of its high accuracy and the short observation time required for a position fix. Precise point positioning (PPP) is a technique used for position computation with a high accuracy using a single GNSS receiver. It relies on highly accurate satellite position and clock data that can be acquired from different sources such as the International GNSS Service (IGS). PPP precision varies based on positioning technique (static or kinematic), observations type (single or dual frequency) and the duration of observations among other factors. PPP offers comparable accuracy to differential GPS with safe in cost and time. For many years, PPP users depended on GPS (American system) which considered the solely reliable system. GLONASS's contribution in PPP techniques was limited due to fail in maintaining full constellation. Yet, GLONASS limited observations could be integrated into GPS-based PPP to improve availability and precision. As GLONASS reached its full constellation early 2013, there is a wide interest in PPP systems based on GLONASS only and independent of GPS. This paper investigates the performance of kinematic PPP solution for the hydrographic applications in the Nile river (Aswan, Egypt) based on GPS, GLONASS and GPS/GLONASS constellations. The study investigates also the effect of using two different observation types; single-frequency and dual frequency observations from the tested constellations.

  4. Accuracy and Efficiency of Recording Pediatric Early Warning Scores Using an Electronic Physiological Surveillance System Compared With Traditional Paper-Based Documentation.

    PubMed

    Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E D

    2017-05-01

    Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined "norm." Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team.

  5. (18)F-Fluorodeoxyglucose PET/MR Imaging in Head and Neck Cancer.

    PubMed

    Platzek, Ivan

    2016-10-01

    (18)F-fluorodeoxyglucose (FDG) PET/MR imaging does not offer significant additional information in initial staging of squamous cell carcinoma of the head and neck when compared with standalone MR imaging. In patients with suspected tumor recurrence, FDG PET/MR imaging has higher sensitivity than MR imaging, although its accuracy is equivalent to the accuracy of FDG PET/CT. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Acceptability of NHS 111 the telephone service for urgent health care: cross sectional postal survey of users’ views

    PubMed Central

    O’Cathain, Alicia

    2014-01-01

    Background. In 2010, a new telephone service, NHS 111, was piloted to improve access to urgent care in England. A unique feature is the use of non-clinical call takers who triage calls with computerized decision support and have access to clinical advisors when necessary. Aim. To explore users’ acceptability of NHS 111. Design. Cross-sectional postal survey. Setting. Four pilot sites in England. Method. A postal survey of recent users of NHS 111. Results. The response rate was 41% (1769/4265), with 49% offering written comments (872/1769). Sixty-five percent indicated the advice given had been very helpful and 28% quite helpful. The majority of respondents (86%) indicated that they fully complied with advice. Seventy-three percent was very satisfied and 19% quite satisfied with the service overall. Users were less satisfied with the relevance of questions asked, and the accuracy and appropriateness of advice given, than with other aspects of the service. Users who were autorouted to NHS 111 from services such as GP out-of-hours services were less satisfied than direct callers. Conclusion. In pilot services in the first year of operation, NHS 111 appeared to be acceptable to the majority of users. Acceptability could be improved by reassessing the necessity of triage questions used and auditing the accuracy and appropriateness of advice given. User acceptability should be viewed in the context of findings from the wider evaluation, which identified that the NHS 111 pilot services did not improve access to urgent care and indeed increased the use of emergency ambulance services. PMID:24334420

  7. An Improved Pathological Brain Detection System Based on Two-Dimensional PCA and Evolutionary Extreme Learning Machine.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-12-07

    Pathological brain detection has made notable strides in the past years, and as a consequence many pathological brain detection systems (PBDSs) have been proposed. However, the accuracy of these systems still needs significant improvement to meet the demands of real-world diagnostic situations. In this paper, an efficient PBDS based on MR images is proposed that markedly improves on recent results. The proposed system uses contrast limited adaptive histogram equalization (CLAHE) to enhance the quality of the input MR images. Thereafter, a two-dimensional PCA (2DPCA) strategy is employed to extract features, and subsequently a PCA+LDA approach is used to generate a compact and discriminative feature set. Finally, a new learning algorithm called MDE-ELM is suggested that combines modified differential evolution (MDE) and the extreme learning machine (ELM) to classify MR images as pathological or healthy. The MDE is utilized to optimize the input weights and hidden biases of single-hidden-layer feedforward neural networks (SLFNs), whereas an analytical method is used to determine the output weights. The proposed algorithm performs optimization based on both the root mean squared error (RMSE) and the norm of the output weights of the SLFNs. The suggested scheme is benchmarked on three standard datasets and the results are compared against other competent schemes. The experimental outcomes show that the proposed scheme offers superior results compared to its counterparts. Further, the proposed MDE-ELM classifier obtains better accuracy with a more compact network architecture than conventional algorithms.
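    A minimal sketch of the evolutionary-ELM idea: differential evolution searches over the input weights and hidden biases while the output weights come from an analytical least-squares solve. The data, network size, and plain-RMSE objective are simplifications of the paper's MDE-ELM.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy regression data standing in for the extracted image features
rng = np.random.default_rng(5)
X = rng.normal(size=(150, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

n_in, n_hid = 4, 6

def rmse(flat):
    """Decode candidate input weights/biases, solve output weights, score."""
    W = flat[: n_in * n_hid].reshape(n_in, n_hid)
    b = flat[n_in * n_hid :]
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytical output weights
    return np.sqrt(np.mean((H @ beta - y) ** 2))

bounds = [(-1.0, 1.0)] * (n_in * n_hid + n_hid)
result = differential_evolution(rmse, bounds, maxiter=30, seed=5)
print(f"optimized training RMSE: {result.fun:.4f}")
```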

  8. Improved scatter correction using adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Sun, M.; Star-Lack, J. M.

    2010-11-01

    Accurate scatter correction is required to produce high-quality reconstructions of x-ray cone-beam computed tomography (CBCT) scans. This paper describes new scatter kernel superposition (SKS) algorithms for deconvolving scatter from projection data. The algorithms are designed to improve upon the conventional approach whose accuracy is limited by the use of symmetric kernels that characterize the scatter properties of uniform slabs. To model scatter transport in more realistic objects, nonstationary kernels, whose shapes adapt to local thickness variations in the projection data, are proposed. Two methods are introduced: (1) adaptive scatter kernel superposition (ASKS) requiring spatial domain convolutions and (2) fast adaptive scatter kernel superposition (fASKS) where, through a linearity approximation, convolution is efficiently performed in Fourier space. The conventional SKS algorithm, ASKS, and fASKS, were tested with Monte Carlo simulations and with phantom data acquired on a table-top CBCT system matching the Varian On-Board Imager (OBI). All three models accounted for scatter point-spread broadening due to object thickening, object edge effects, detector scatter properties and an anti-scatter grid. Hounsfield unit (HU) errors in reconstructions of a large pelvis phantom with a measured maximum scatter-to-primary ratio over 200% were reduced from -90 ± 58 HU (mean ± standard deviation) with no scatter correction to 53 ± 82 HU with SKS, to 19 ± 25 HU with fASKS and to 13 ± 21 HU with ASKS. HU accuracies and measured contrast were similarly improved in reconstructions of a body-sized elliptical Catphan phantom. The results show that the adaptive SKS methods offer significant advantages over the conventional scatter deconvolution technique.
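    A minimal sketch of scatter estimation by kernel convolution with FFTs and subtraction; a single stationary Gaussian kernel and scalar amplitude stand in for the adaptive, thickness-dependent kernels of ASKS/fASKS.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_kernel(size=31, sigma=8.0):
    """Broad, normalized 2D kernel standing in for a scatter point spread."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def scatter_correct(measured, amplitude=0.3, n_iter=3):
    """Iteratively refine: primary = measured - amplitude * (primary * G)."""
    kernel = gaussian_kernel()
    primary = measured.copy()
    for _ in range(n_iter):
        scatter = amplitude * fftconvolve(primary, kernel, mode="same")
        primary = measured - scatter
    return primary

projection = np.random.default_rng(6).uniform(0.5, 1.0, size=(128, 128))
corrected = scatter_correct(projection)
```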

  9. Research on navigation of satellite constellation based on an asynchronous observation model using X-ray pulsar

    NASA Astrophysics Data System (ADS)

    Guo, Pengbin; Sun, Jian; Hu, Shuling; Xue, Ju

    2018-02-01

    Pulsar navigation is a promising navigation method for high-altitude-orbit space missions and deep space exploration. At present, an important factor restricting the development of pulsar navigation is the limited navigation accuracy caused by the slow update rate of the measurements. In order to improve the accuracy of pulsar navigation, an asynchronous observation model that increases the update rate of the measurements is proposed on the basis of a satellite constellation, which has broad prospects for development because of its visibility and reliability. The simulation results show that the asynchronous observation model improves the positioning accuracy by 31.48% and the velocity accuracy by 24.75% relative to the synchronous observation model. With the new Doppler effect compensation method for the asynchronous observation model proposed in this paper, the positioning accuracy is improved by 32.27% and the velocity accuracy by 34.07% relative to the traditional method. The simulations also show that neglecting the clock error results in filter divergence.

  10. Multi-Target Tracking Using an Improved Gaussian Mixture CPHD Filter.

    PubMed

    Si, Weijian; Wang, Liwei; Qu, Zhiyu

    2016-11-23

    The cardinalized probability hypothesis density (CPHD) filter is an alternative approximation to the full multi-target Bayesian filter for tracking multiple targets. However, although the joint propagation of the posterior intensity and cardinality distribution in its recursion allows more reliable estimates of the target number than the PHD filter, the CPHD filter suffers from the spooky effect where there exists arbitrary PHD mass shifting in the presence of missed detections. To address this issue in the Gaussian mixture (GM) implementation of the CPHD filter, this paper presents an improved GM-CPHD filter, which incorporates a weight redistribution scheme into the filtering process to modify the updated weights of the Gaussian components when missed detections occur. In addition, an efficient gating strategy that can adaptively adjust the gate sizes according to the number of missed detections of each Gaussian component is also presented to further improve the computational efficiency of the proposed filter. Simulation results demonstrate that the proposed method offers favorable performance in terms of both estimation accuracy and robustness to clutter and detection uncertainty over the existing methods.

  11. Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hyun Woo; Rhee, Young Min, E-mail: ymrhee@postech.ac.kr

    2014-05-14

    Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long-time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior especially in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest-style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss strengths and weaknesses of our scheme with a view toward future prospects.

  12. Bringing influenza vaccines into the 21st century

    PubMed Central

    Settembre, Ethan C; Dormitzer, Philip R; Rappuoli, Rino

    2014-01-01

    The recent H7N9 influenza outbreak in China highlights the need for influenza vaccine production systems that are robust and can quickly generate substantial quantities of vaccines that target new strains for pandemic and seasonal immunization. Although the influenza vaccine system, a public-private partnership, has been effective in providing vaccines, there are areas for improvement. Technological advances such as mammalian cell culture production and synthetic vaccine seeds provide a means to increase the speed and accuracy of targeting new influenza strains with mass-produced vaccines by dispensing with the need for egg isolation, adaptation, and reassortment of vaccine viruses. New influenza potency assays that no longer require the time-consuming step of generating sheep antisera could further speed vaccine release. Adjuvants that increase the breadth of the elicited immune response and allow dose sparing provide an additional means to increase the number of available vaccine doses. Together these technologies can improve the influenza vaccination system in the near term. In the longer term, disruptive technologies, such as RNA-based flu vaccines and ‘universal’ flu vaccines, offer a promise of a dramatically improved influenza vaccine system. PMID:24378716

  14. Deep Logic Networks: Inserting and Extracting Knowledge From Deep Belief Networks.

    PubMed

    Tran, Son N; d'Avila Garcez, Artur S

    2018-02-01

    Developments in deep learning have seen the use of layerwise unsupervised learning combined with supervised learning for fine-tuning. With this layerwise approach, a deep network can be seen as a more modular system that lends itself well to learning representations. In this paper, we investigate whether such modularity can be useful for the insertion of background knowledge into deep networks and whether it can improve learning performance when such knowledge is available, as well as for the extraction of knowledge from trained deep networks, and whether extraction can offer a better understanding of the representations learned by such networks. To this end, we use a simple symbolic language, a set of logical rules that we call confidence rules, and show that it is suitable for the representation of quantitative reasoning in deep networks. We show by knowledge extraction that confidence rules can offer a low-cost representation for layerwise networks (or restricted Boltzmann machines). We also show that layerwise extraction can produce an improvement in the accuracy of deep belief networks. Furthermore, the proposed symbolic characterization of deep networks provides a novel method for the insertion of prior knowledge and training of deep networks. With the use of this method, a deep neural-symbolic system is proposed and evaluated, with the experimental results indicating that modularity through the use of confidence rules and knowledge insertion can be beneficial to network performance.

  15. Continuous Glucose Monitoring in Subjects with Type 1 Diabetes: Improvement in Accuracy by Correcting for Background Current

    PubMed Central

    Youssef, Joseph El; Engle, Julia M.; Massoud, Ryan G.; Ward, W. Kenneth

    2010-01-01

    Background: A cause of suboptimal accuracy in amperometric glucose sensors is the presence of a background current (current produced in the absence of glucose) that is not accounted for. We hypothesized that a mathematical correction for the estimated background current of a commercially available sensor would lead to greater accuracy compared to a situation in which we assumed the background current to be zero. We also tested whether increasing the frequency of sensor calibration would improve sensor accuracy. Methods: This report includes analysis of 20 sensor datasets from seven human subjects with type 1 diabetes. Data were divided into a training set for algorithm development and a validation set on which the algorithm was tested. A range of potential background currents was tested. Results: Use of a background current correction of 4 nA led to a substantial improvement in accuracy (improvement of absolute relative difference or absolute difference of 3.5–5.5 units). An increase in calibration frequency led to a modest accuracy improvement, with an optimum at every 4 h. Conclusions: Compared to no correction, a correction for the estimated background current of a commercially available glucose sensor led to greater accuracy and better detection of hypoglycemia and hyperglycemia. The accuracy-optimizing scheme presented here can be implemented in real time. PMID:20879968
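
    The correction described above amounts to subtracting an estimated constant background from the measured current before converting to glucose. Below is a minimal sketch of that idea, assuming a simple linear sensor model; the function names, units, and numbers are hypothetical illustrations, not the authors' algorithm.

    ```python
    # Illustrative background-current correction for an amperometric glucose
    # sensor, assuming current = sensitivity * glucose + background.
    # All names and numbers are hypothetical.

    def calibrate_sensitivity(current_nA, ref_glucose_mgdl, background_nA=4.0):
        """Estimate sensitivity (nA per mg/dL) at a calibration point."""
        return (current_nA - background_nA) / ref_glucose_mgdl

    def estimate_glucose(current_nA, sensitivity, background_nA=4.0):
        """Invert the sensor model to estimate glucose from measured current."""
        return (current_nA - background_nA) / sensitivity

    # Calibration: reference glucose 120 mg/dL while the sensor reads 22 nA.
    s = calibrate_sensitivity(22.0, 120.0)   # 0.15 nA per mg/dL
    print(estimate_glucose(13.0, s))         # later 13 nA reading -> 60.0 mg/dL
    ```

    Re-running the calibration step every few hours corresponds to the calibration-frequency experiment in the abstract.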

  16. Improving critical thinking and clinical reasoning with a continuing education course.

    PubMed

    Cruz, Dina Monteiro; Pimenta, Cibele Mattos; Lunney, Margaret

    2009-03-01

    Continuing education courses related to critical thinking and clinical reasoning are needed to improve the accuracy of diagnosis. This study evaluated a 4-day, 16-hour continuing education course conducted in Brazil. Thirty-nine nurses completed a pretest and a posttest consisting of two written case studies designed to measure the accuracy of nurses' diagnoses. There were significant differences in accuracy from pretest to posttest for case 1 (p = .008) and case 2 (p = .042) and overall (p = .001). Continuing education courses should be implemented to improve the accuracy of nurses' diagnoses.

  17. Using methods from the data mining and machine learning literature for disease classification and prediction: A case study examining classification of heart failure sub-types

    PubMed Central

    Austin, Peter C.; Tu, Jack V.; Ho, Jennifer E.; Levy, Daniel; Lee, Douglas S.

    2014-01-01

    Objective Physicians classify patients into those with or without a specific disease. Furthermore, there is often interest in classifying patients according to disease etiology or subtype. Classification trees are frequently used to classify patients according to the presence or absence of a disease. However, classification trees can suffer from limited accuracy. In the data mining and machine learning literature, alternate classification schemes have been developed. These include bootstrap aggregation (bagging), boosting, random forests, and support vector machines. Study Design and Setting We compared the performance of these classification methods with that of conventional classification trees to classify patients with heart failure according to the following sub-types: heart failure with preserved ejection fraction (HFPEF) vs. heart failure with reduced ejection fraction (HFREF). We also compared the ability of these methods to predict the probability of the presence of HFPEF with that of conventional logistic regression. Results We found that modern, flexible tree-based methods from the data mining literature offer substantial improvement in prediction and classification of heart failure sub-type compared to conventional classification and regression trees. However, conventional logistic regression had superior performance for predicting the probability of the presence of HFPEF compared to the methods proposed in the data mining literature. Conclusion The use of tree-based methods offers superior performance over conventional classification and regression trees for predicting and classifying heart failure subtypes in a population-based sample of patients from Ontario. However, these methods do not offer substantial improvements over logistic regression for predicting the presence of HFPEF. PMID:23384592
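
    As a concrete illustration of this kind of comparison, the sketch below trains a tree ensemble and a logistic regression on synthetic data and compares discrimination. The study's clinical dataset is not public, so everything here is illustrative only.

    ```python
    # Illustrative comparison on synthetic data (not the study's dataset):
    # a tree-ensemble classifier versus logistic regression.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for name, model in [("random forest", RandomForestClassifier(random_state=0)),
                        ("logistic regression", LogisticRegression(max_iter=1000))]:
        model.fit(X_tr, y_tr)
        prob = model.predict_proba(X_te)[:, 1]   # predicted P(class = 1)
        print(name, round(roc_auc_score(y_te, prob), 3))
    ```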

  18. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  20. Pediatric Disaster Triage: Multiple Simulation Curriculum Improves Prehospital Care Providers' Assessment Skills.

    PubMed

    Cicero, Mark Xavier; Whitfill, Travis; Overly, Frank; Baird, Janette; Walsh, Barbara; Yarzebski, Jorge; Riera, Antonio; Adelgais, Kathleen; Meckler, Garth D; Baum, Carl; Cone, David Christopher; Auerbach, Marc

    2017-01-01

    Paramedics and emergency medical technicians (EMTs) triage pediatric disaster victims infrequently. The objective of this study was to measure the effect of a multiple-patient, multiple-simulation curriculum on accuracy of pediatric disaster triage (PDT). Paramedics, paramedic students, and EMTs from three sites were enrolled. Triage accuracy was measured three times (Time 0, Time 1 [2 weeks later], and Time 2 [6 months later]) during a disaster simulation, in which high and low fidelity manikins and actors portrayed 10 victims. Accuracy was determined by participant triage decision concordance with predetermined expected triage level (RED [Immediate], YELLOW [Delayed], GREEN [Ambulatory], BLACK [Deceased]) for each victim. Between Time 0 and Time 1, participants completed an interactive online module, and after each simulation there was an individual debriefing. Associations between participant level of training, years of experience, and enrollment site were determined, as were instances of the most dangerous mistriage, when RED and YELLOW victims were triaged BLACK. The study enrolled 331 participants, and the analysis included 261 (78.9%) participants who completed the study: 123 from the Connecticut site, 83 from Rhode Island, and 55 from Massachusetts. Triage accuracy improved significantly from Time 0 to Time 1, after the educational interventions (first simulation with debriefing, and an interactive online module), with a median 10% overall improvement (p < 0.001). Subgroup analyses showed that between Time 0 and Time 1, paramedics and paramedic students improved more than EMTs (p = 0.002). Analysis of triage accuracy showed the greatest improvement in overall accuracy for YELLOW triage patients (Time 0: 50% accurate, Time 1: 100%), followed by RED patients (Time 0: 80%, Time 1: 100%). There was no significant difference in accuracy between Time 1 and Time 2 (p = 0.073). This study shows that the multiple-victim, multiple-simulation curriculum yields a durable 10% improvement in simulated triage accuracy. Future iterations of the curriculum can target greater improvements in EMT triage accuracy.
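
    The accuracy metric here is simple concordance with the expected triage level per victim, plus a flag for the most dangerous mistriage. A minimal sketch, with an invented 10-victim scenario and one participant's invented decisions:

    ```python
    # Concordance-based triage accuracy, as described in the abstract.
    # Both lists below are hypothetical examples, not study data.
    expected  = ["RED", "YELLOW", "GREEN", "BLACK", "RED", "YELLOW", "GREEN",
                 "RED", "YELLOW", "GREEN"]     # predetermined expected levels
    decisions = ["RED", "YELLOW", "GREEN", "BLACK", "YELLOW", "YELLOW", "GREEN",
                 "BLACK", "YELLOW", "GREEN"]   # one participant's choices

    accuracy = sum(d == e for d, e in zip(decisions, expected)) / len(expected)
    dangerous = sum(d == "BLACK" and e in ("RED", "YELLOW")
                    for d, e in zip(decisions, expected))
    print(f"accuracy={accuracy:.0%}, dangerous mistriages={dangerous}")
    ```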

  1. RAPID COMMUNICATION: Improving prediction accuracy of GPS satellite clocks with periodic variation behaviour

    NASA Astrophysics Data System (ADS)

    Heo, Youn Jeong; Cho, Jeongho; Heo, Moon Beom

    2010-07-01

    The broadcast ephemeris and IGS ultra-rapid predicted (IGU-P) products are primarily available for use in real-time GPS applications. The IGU orbit precision has improved remarkably since late 2007, but the clock products have not shown acceptably high-quality prediction performance. One reason is that satellite atomic clocks in space are easily influenced by various factors, such as temperature and the space environment, which leads to complicated behaviour like periodic variations that are not sufficiently described by conventional models. A more reliable prediction model is thus proposed in this paper, designed in particular to describe this periodic variation behaviour satisfactorily. The proposed model adds cyclic terms to capture the periodic effects and adopts delay-coordinate embedding, which offers access to linear or nonlinear coupling characteristics of the satellite behaviour. The simulation results show that the proposed prediction model outperforms the IGU-P solutions at least on a daily basis.
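
    To make "adding cyclic terms" concrete, the sketch below fits a clock-bias series with a linear trend plus sinusoids at assumed periods and extrapolates forward. The periods and data are invented, and the paper's delay-coordinate embedding is omitted for brevity.

    ```python
    # Sketch: fit bias(t) = a + b*t + sum_k [c_k sin(2*pi*t/P_k)
    # + d_k cos(2*pi*t/P_k)] by linear least squares, then extrapolate.
    import numpy as np

    PERIODS_H = (11.97, 5.98)                  # assumed near-12 h / 6 h periods

    def design(t, periods=PERIODS_H):
        cols = [np.ones_like(t), t]
        for p in periods:
            cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
        return np.column_stack(cols)

    t = np.arange(0.0, 48.0, 0.25)             # hours of fit data
    bias = 5.0 + 0.02 * t + 0.3 * np.sin(2 * np.pi * t / 11.97)  # synthetic clock
    coef, *_ = np.linalg.lstsq(design(t), bias, rcond=None)

    t_pred = np.arange(48.0, 72.0, 0.25)       # predict the following day
    prediction = design(t_pred) @ coef
    ```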

  2. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    NASA Astrophysics Data System (ADS)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of gas turbines, a distributed collaborative probabilistic design method based on support vector regression (DCSRM) is proposed by integrating the distributed collaborative response surface method with a support vector regression model. The mathematical model of DCSRM is established and its probabilistic design idea is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results yield the optimal static blade-tip clearance of the HPT for BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM offers high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
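
    The core idea of a surrogate-based probabilistic design can be sketched as follows: train a support vector regression model on a few expensive simulation runs, then reuse it for cheap Monte Carlo analysis. The response function, variable names, and threshold below are invented, not the paper's turbine model.

    ```python
    # Hedged sketch of the surrogate idea behind DCSRM (synthetic response).
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X_train = rng.uniform(-1, 1, size=(200, 3))   # e.g., load, temperature, speed
    y_train = X_train[:, 0]**2 + 0.5 * X_train[:, 1] - 0.2 * X_train[:, 2]

    surrogate = SVR(kernel="rbf", C=10.0).fit(X_train, y_train)

    X_mc = rng.normal(0.0, 0.3, size=(20_000, 3))  # cheap Monte Carlo samples
    clearance = surrogate.predict(X_mc)
    print("P(clearance < 0) ~", np.mean(clearance < 0.0))
    ```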

  3. 1990 censuses to increase use of automation.

    PubMed

    Ward, S E

    1988-12-01

    This article summarizes information from selected reports presented at the 12th Population Census Conference. Ward reports that plans for the 1990 census in many countries of Asia and the Pacific call for increased use of automation, with applications ranging from the use of computer-generated maps of enumeration areas and optical mark readers for data processing to desktop publishing and electronic mail for disseminating the results. Recent advances in automation offer opportunities for improved accuracy and speed of census operations while reducing the need for clerical personnel. Most of the technologies discussed at the 12th Population Census Conference are designed to make the planning, editing, processing, analysis, and publication of census data more reliable and efficient. However, technology alone cannot overcome high rates of illiteracy that preclude having respondents complete the census forms themselves. It nevertheless enables even China, India, Indonesia, and Pakistan, countries with huge populations and limited financial resources, to make significant improvements in their forthcoming censuses.

  4. Middle-ear microsurgery simulation to improve new robotic procedures.

    PubMed

    Kazmitcheff, Guillaume; Nguyen, Yann; Miroir, Mathieu; Péan, Fabien; Ferrary, Evelyne; Cotin, Stéphane; Sterkers, Olivier; Duriez, Christian

    2014-01-01

    Otological microsurgery is delicate and requires high dexterity under poor ergonomic conditions. To assist surgeons in these indications, a teleoperated system called RobOtol has been developed. This robot enhances gesture accuracy and handiness and allows exploration of new procedures for middle ear surgery. To plan new procedures that exploit the capacities offered by the robot, a surgical simulator was developed. The simulation reproduces the behavior of the anatomical structures with high fidelity and can also be used as a training tool to ease surgeons' control of the robot. In this paper, we introduce the middle ear surgical simulation and then virtually perform two challenging procedures with the robot. We show how interactive simulation can assist in analyzing the benefits of robotics in the case of complex manipulations or ergonomics studies and allow the development of innovative surgical procedures. New robot-based microsurgical procedures are investigated, and the improvement offered by RobOtol is evaluated and discussed.

  6. Temperature-controlled electrothermal atomization-atomic absorption spectrometry using a pyrometric feedback system in conjunction with a background monitoring device

    NASA Astrophysics Data System (ADS)

    Van Deijck, W.; Roelofsen, A. M.; Pieters, H. J.; Herber, R. F. M.

    The construction of a temperature-controlled feedback system for electrothermal atomization-atomic absorption spectrometry (ETA-AAS) using an optical pyrometer applied to the atomization stage is described. The system was used in conjunction with a fast-response background monitoring device. The heating rate of the furnace amounted to 1400° s⁻¹ with a reproducibility better than 1%. The precision of the temperature control at a steady-state temperature of 2000°C was 0.1%. The analytical improvements offered by the present system have been demonstrated by the determination of cadmium and lead in blood and finally by the determination of lead in serum. Both the sensitivity and the precision of the method have been improved. The accuracy of the method was checked by determining the lead content of a number of serum samples both by ETA-AAS and by differential pulse anodic stripping voltammetry (DPASV) and proved to be satisfactory.

  7. Enhancing speech recognition using improved particle swarm optimization based hidden Markov model.

    PubMed

    Selvaraj, Lokesh; Ganesan, Balakrishnan

    2014-01-01

    Enhancing speech recognition is the primary intention of this work. In this paper a novel speech recognition method based on vector quantization and improved particle swarm optimization (IPSO) is suggested. The suggested methodology contains four stages, namely, (i) denoising, (ii) feature mining, (iii) vector quantization, and (iv) an IPSO-based hidden Markov model (HMM) technique (IP-HMM). At first, the speech signals are denoised using a median filter. Next, characteristics such as peak, pitch spectrum, Mel-frequency cepstral coefficients (MFCC), mean, standard deviation, and minimum and maximum of the signal are extracted from the denoised signal. Following that, to accomplish the training process, the extracted characteristics are given to genetic-algorithm-based codebook generation in vector quantization. The initial populations are created by selecting random code vectors from the training set for the codebooks of the genetic algorithm process, and IP-HMM performs the recognition. The novelty here lies in the crossover genetic operation. The proposed speech recognition technique offers 97.14% accuracy.

  8. Cadastral Database Positional Accuracy Improvement

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Ramli, S. N. M.; Omar, K. M.; Din, N.

    2017-10-01

    Positional Accuracy Improvement (PAI) is the process of refining the geometry of features in a geospatial dataset to better match their actual positions. The actual position relates to the absolute position in a specific coordinate system and to the relation with neighborhood features. With the growth of spatial technologies, especially Geographical Information Systems (GIS) and Global Navigation Satellite Systems (GNSS), PAI campaigns are inevitable, especially for legacy cadastral databases. Integrating a legacy dataset with a higher-accuracy dataset such as GNSS observations is a potential solution for improving the legacy dataset. However, merely integrating both datasets will distort the relative geometry; the improved dataset should be further treated to minimize inherent errors and to fit the new, accurate dataset. The main focus of this study is to describe a method of angular-based Least Squares Adjustment (LSA) for the PAI process of a legacy dataset. The existing high-accuracy dataset, known as the National Digital Cadastral Database (NDCDB), is then used as a benchmark to validate the results. It was found that the proposed technique is highly suitable for the positional accuracy improvement of legacy spatial datasets.

  9. An Innovative Approach to Improving the Accuracy of Delirium Assessments Using the Confusion Assessment Method for the Intensive Care Unit.

    PubMed

    DiLibero, Justin; O'Donoghue, Sharon C; DeSanto-Madeya, Susan; Felix, Janice; Ninobla, Annalyn; Woods, Allison

    2016-01-01

    Delirium occurs in up to 80% of intensive care unit (ICU) patients. Despite its prevalence in this population, there continues to be inaccuracies in delirium assessments. In the absence of accurate delirium assessments, delirium in critically ill ICU patients will remain unrecognized and will lead to negative clinical and organizational outcomes. The goal of this quality improvement project was to facilitate sustained improvement in the accuracy of delirium assessments among all ICU patients including those who were sedate or agitated. A pretest-posttest design was used to evaluate the effectiveness of a program to improve the accuracy of delirium screenings among patients admitted to a medical ICU or coronary care unit. Two hundred thirty-six delirium assessment audits were completed during the baseline period and 535 during the postintervention period. Compliance with performing at least 1 delirium assessment every shift was 85% at baseline and improved to 99% during the postintervention period. Baseline assessment accuracy was 70.31% among all patients and 53.49% among sedate and agitated patients. Postintervention assessment accuracy improved to 95.51% for all patients and 89.23% among sedate and agitated patients. The results from this project suggest the effectiveness of the program in improving assessment accuracy among difficult-to-assess patients. Further research is needed to demonstrate the effectiveness of this model across other critical care units, patient populations, and organizations.

  10. Forecasting daily patient volumes in the emergency department.

    PubMed

    Jones, Spencer S; Thomas, Alun; Evans, R Scott; Welch, Shari J; Haug, Peter J; Snow, Gregory L

    2008-02-01

    Shifts in the supply of and demand for emergency department (ED) resources make the efficient allocation of ED resources increasingly important. Forecasting is a vital activity that guides decision-making in many areas of economic, industrial, and scientific planning, but has gained little traction in the health care industry. There are few studies that explore the use of forecasting methods to predict patient volumes in the ED. The goals of this study are to explore and evaluate the use of several statistical forecasting methods to predict daily ED patient volumes at three diverse hospital EDs and to compare the accuracy of these methods to the accuracy of a previously proposed forecasting method. Daily patient arrivals at three hospital EDs were collected for the period January 1, 2005, through March 31, 2007. The authors evaluated the use of seasonal autoregressive integrated moving average, time series regression, exponential smoothing, and artificial neural network models to forecast daily patient volumes at each facility. Forecasts were made for horizons ranging from 1 to 30 days in advance. The forecast accuracy achieved by the various forecasting methods was compared to the forecast accuracy achieved when using a benchmark forecasting method already available in the emergency medicine literature. All time series methods considered in this analysis provided improved in-sample model goodness of fit. However, post-sample analysis revealed that time series regression models that augment linear regression models by accounting for serial autocorrelation offered only small improvements in terms of post-sample forecast accuracy, relative to multiple linear regression models, while seasonal autoregressive integrated moving average, exponential smoothing, and artificial neural network forecasting models did not provide consistently accurate forecasts of daily ED volumes. This study confirms the widely held belief that daily demand for ED services is characterized by seasonal and weekly patterns. The authors compared several time series forecasting methods to a benchmark multiple linear regression model. The results suggest that the existing methodology proposed in the literature, multiple linear regression based on calendar variables, is a reasonable approach to forecasting daily patient volumes in the ED. However, the authors conclude that regression-based models that incorporate calendar variables, account for site-specific special-day effects, and allow for residual autocorrelation provide a more appropriate, informative, and consistently accurate approach to forecasting daily ED patient volumes.
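
    A minimal version of the calendar-variable regression the authors endorse might look like the sketch below, on simulated daily arrivals. A production model would add site-specific special-day indicators and allow for residual autocorrelation, as the abstract notes.

    ```python
    # Illustrative daily-volume regression on calendar indicators
    # (simulated data, not the study's ED records).
    import numpy as np
    import pandas as pd

    dates = pd.date_range("2005-01-01", "2007-03-31", freq="D")
    dow = dates.dayofweek.to_numpy()
    doy = dates.dayofyear.to_numpy()
    rng = np.random.default_rng(1)
    volume = (180 + 15 * (dow == 0) - 10 * (dow >= 5)
              + 5 * np.sin(2 * np.pi * doy / 365.25)
              + rng.normal(0, 8, len(dates)))

    X = pd.get_dummies(pd.DataFrame({"dow": dow, "month": dates.month}),
                       columns=["dow", "month"], drop_first=True, dtype=float)
    X.insert(0, "intercept", 1.0)
    beta, *_ = np.linalg.lstsq(X.to_numpy(), volume, rcond=None)
    fitted = X.to_numpy() @ beta               # in-sample daily volume fit
    ```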

  11. Concept Mapping Improves Metacomprehension Accuracy among 7th Graders

    ERIC Educational Resources Information Center

    Redford, Joshua S.; Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2012-01-01

    Two experiments explored concept map construction as a useful intervention to improve metacomprehension accuracy among 7th grade students. In the first experiment, metacomprehension was marginally better for a concept mapping group than for a rereading group. In the second experiment, metacomprehension accuracy was significantly greater for a…

  12. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
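
    Two metrics commonly used to quantify "error" when software predictions are compared to measured data are the normalized mean bias error and the coefficient of variation of the RMSE. The sketch below shows one common formulation (normalization conventions vary across guidelines), with invented monthly energy values.

    ```python
    # Common prediction-vs-measurement error metrics (illustrative data).
    import numpy as np

    def nmbe(measured, predicted):
        """Normalized mean bias error: average signed error over mean measured."""
        m, p = np.asarray(measured, float), np.asarray(predicted, float)
        return (p - m).mean() / m.mean()

    def cv_rmse(measured, predicted):
        """Coefficient of variation of the RMSE."""
        m, p = np.asarray(measured, float), np.asarray(predicted, float)
        return np.sqrt(((p - m) ** 2).mean()) / m.mean()

    measured_kwh  = [820, 760, 690, 610, 540, 500]   # hypothetical monthly use
    predicted_kwh = [800, 790, 700, 580, 560, 520]   # hypothetical model output
    print(nmbe(measured_kwh, predicted_kwh), cv_rmse(measured_kwh, predicted_kwh))
    ```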

  13. Randomized controlled pilot trial of mindfulness-based stress reduction for breast and colorectal cancer survivors: effects on cancer-related cognitive impairment.

    PubMed

    Johns, Shelley A; Von Ah, Diane; Brown, Linda F; Beck-Coon, Kathleen; Talib, Tasneem L; Alyea, Jennifer M; Monahan, Patrick O; Tong, Yan; Wilhelm, Laura; Giesler, R Brian

    2016-06-01

    Cancer-related cognitive impairment (CRCI) is a common, fatigue-related symptom that disrupts cancer survivors' quality of life. Few interventions for CRCI exist. As part of a randomized pilot study targeting cancer-related fatigue, the effects of mindfulness-based stress reduction (MBSR) on survivors' cognitive outcomes were investigated. Breast and colorectal cancer survivors (n = 71) with moderate-to-severe fatigue were randomized to MBSR (n = 35) or a fatigue education and support (ES; n = 36) condition. The Attentional Function Index (AFI) and the Stroop test were used to assess survivors' cognitive function at baseline (T1), after the 8-week intervention period (T2), and 6 months later (T3) using intent-to-treat analysis. Mediation analyses were performed to explore mechanisms of intervention effects on cognitive functioning. MBSR participants reported significantly greater improvement on the AFI total score compared to ES participants at T2 (d = 0.83, p = 0.001) and T3 (d = 0.55, p = 0.021). MBSR also significantly outperformed ES on most AFI subscales, although both groups improved over time. MBSR produced greater Stroop accuracy rates relative to ES at T2 (r = 0.340, p = 0.005) and T3 (r = 0.280, p = 0.030), with improved accuracy over time only for the MBSR group. There were no significant differences in Stroop reaction time between groups. Improvements in mindfulness mediated the effect of group (e.g., MBSR vs. ES) on AFI total score at T2 and T3. Additional randomized trials with more comprehensive cognitive measures are warranted to definitively assess the efficacy of MBSR for CRCI. This pilot study has important implications for all cancer survivors as it is the first published trial to show that MBSR offers robust and durable improvements in CRCI.

  14. Note: An improved calibration system with phase correction for electronic transformers with digital output.

    PubMed

    Cheng, Han-miao; Li, Hong-bin

    2015-08-01

    The existing electronic transformer calibration systems employing data acquisition cards cannot satisfy some practical applications, because the calibration systems have phase measurement errors when they work in the mode of receiving external synchronization signals. This paper proposes an improved calibration system scheme with phase correction to improve the phase measurement accuracy. We employ NI PCI-4474 to design a calibration system, and the system has the potential to receive external synchronization signals and reach extremely high accuracy classes. Accuracy verification has been carried out in the China Electric Power Research Institute, and results demonstrate that the system surpasses the accuracy class 0.05. Furthermore, this system has been used to test the harmonics measurement accuracy of all-fiber optical current transformers. In the same process, we have used an existing calibration system, and a comparison of the test results is presented. The system after improvement is suitable for the intended applications.

  15. Understanding the delayed-keyword effect on metacomprehension accuracy.

    PubMed

    Thiede, Keith W; Dunlosky, John; Griffin, Thomas D; Wiley, Jennifer

    2005-11-01

    The typical finding from research on metacomprehension is that accuracy is quite low. However, recent studies have shown robust accuracy improvements when judgments follow certain generation tasks (summarizing or keyword listing) but only when these tasks are performed at a delay rather than immediately after reading (K. W. Thiede & M. C. M. Anderson, 2003; K. W. Thiede, M. C. M. Anderson, & D. Therriault, 2003). The delayed and immediate conditions in these studies confounded the delay between reading and generation tasks with other task lags, including the lag between multiple generation tasks and the lag between generation tasks and judgments. The first 2 experiments disentangle these confounded manipulations and provide clear evidence that the delay between reading and keyword generation is the only lag critical to improving metacomprehension accuracy. The 3rd and 4th experiments show that not all delayed tasks produce improvements and suggest that delayed generative tasks provide necessary diagnostic cues about comprehension for improving metacomprehension accuracy.

  16. Techniques for improving the accuracy of cyrogenic temperature measurement in ground test programs

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Fabik, Richard H.

    1993-01-01

    The performance of a sensor is often evaluated by determining to what degree of accuracy a measurement can be made using this sensor. The absolute accuracy of a sensor is an important parameter considered when choosing the type of sensor to use in research experiments. Tests were performed to improve the accuracy of cryogenic temperature measurements by calibration of the temperature sensors when installed in their experimental operating environment. The calibration information was then used to correct for temperature sensor measurement errors by adjusting the data acquisition system software. This paper describes a method to improve the accuracy of cryogenic temperature measurements using corrections in the data acquisition system software such that the uncertainty of an individual temperature sensor is improved from plus or minus 0.90 deg R to plus or minus 0.20 deg R over a specified range.
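
    A hedged sketch of the software-correction idea: fit a low-order polynomial to in-situ calibration residuals, then apply it to later readings inside the data acquisition software. The calibration points below are invented (deg R), not the paper's data.

    ```python
    # Illustrative in-situ calibration correction applied in software.
    import numpy as np

    indicated = np.array([140.2, 160.5, 180.9, 200.6, 220.8])  # sensor readout
    reference = np.array([139.6, 160.0, 180.1, 200.0, 220.1])  # in-situ standard

    # Fit a quadratic correction to the residuals (reference - indicated).
    correction = np.polyfit(indicated, reference - indicated, deg=2)

    def corrected(reading):
        """Return the reading with the fitted calibration correction applied."""
        return reading + np.polyval(correction, reading)

    print(corrected(190.0))
    ```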

  17. Improving the accuracy of camber predictions for precast pretensioned concrete beams : [tech transfer summary].

    DOT National Transportation Integrated Search

    2015-07-01

    Implementing the recommendations of this study is expected to significantly improve the accuracy of camber measurements and predictions and to ultimately help reduce construction delays, improve bridge serviceability, and decrease costs.

  18. Number-Density Measurements of CO2 in Real Time with an Optical Frequency Comb for High Accuracy and Precision

    NASA Astrophysics Data System (ADS)

    Scholten, Sarah K.; Perrella, Christopher; Anstie, James D.; White, Richard T.; Al-Ashwal, Waddah; Hébert, Nicolas Bourbeau; Genest, Jérôme; Luiten, Andre N.

    2018-05-01

    Real-time and accurate measurements of gas properties are highly desirable for numerous real-world applications. Here, we use an optical-frequency comb to demonstrate absolute number-density and temperature measurements of a sample gas with state-of-the-art precision and accuracy. The technique is demonstrated by measuring the number density of 12C16O2 with an accuracy of better than 1% and a precision of 0.04% in a measurement and analysis cycle of less than 1 s. This technique is transferable to numerous molecular species, thus offering an avenue for near-universal gas concentration measurements.
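
    The underlying conversion from an absorption measurement to a number density can be illustrated with a simple Beer-Lambert inversion; the comb technique itself is far more involved, and the values below are hypothetical.

    ```python
    # Illustrative Beer-Lambert inversion: number density n from
    # transmitted/incident intensity through a cell of length L,
    # given an absorption cross section sigma at line center.
    import math

    def number_density(i_transmitted, i_incident, sigma_cm2, length_cm):
        """n = -ln(I/I0) / (sigma * L), in molecules per cm^3."""
        return -math.log(i_transmitted / i_incident) / (sigma_cm2 * length_cm)

    # Hypothetical values: 10% absorption over a 50 cm cell, sigma = 1e-18 cm^2.
    print(f"{number_density(0.90, 1.00, 1e-18, 50.0):.3e}")  # ~2.107e+15 cm^-3
    ```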

  19. Deep into the Brain: Artificial Intelligence in Stroke Imaging

    PubMed Central

    Lee, Eun-Jae; Kim, Yong-Hwan; Kim, Namkug; Kang, Dong-Wha

    2017-01-01

    Artificial intelligence (AI), a computer system aiming to mimic human intelligence, is gaining increasing interest and is being incorporated into many fields, including medicine. Stroke medicine is one such area of application of AI, for improving the accuracy of diagnosis and the quality of patient care. For stroke management, adequate analysis of stroke imaging is crucial. Recently, AI techniques have been applied to decipher the data from stroke imaging and have demonstrated some promising results. In the very near future, such AI techniques may play a pivotal role in determining the therapeutic methods and predicting the prognosis for stroke patients in an individualized manner. In this review, we offer a glimpse at the use of AI in stroke imaging, specifically focusing on its technical principles, clinical application, and future perspectives. PMID:29037014

  20. Geometric registration of remotely sensed data with SAMIR

    NASA Astrophysics Data System (ADS)

    Gianinetto, Marco; Barazzetti, Luigi; Dini, Luigi; Fusiello, Andrea; Toldo, Roberto

    2015-06-01

    The commercial market offers several software packages for the registration of remotely sensed data through standard one-to-one image matching. Although very rapid and simple, this strategy does not take into consideration all the interconnections among the images of a multi-temporal data set. This paper presents a new scientific software package, called Satellite Automatic Multi-Image Registration (SAMIR), able to extend the traditional registration approach towards multi-image global processing. Tests carried out with high-resolution optical (IKONOS) and high-resolution radar (COSMO-SkyMed) data showed that SAMIR can improve the registration phase with a more rigorous and robust workflow without initial approximations, user interaction, or limitations on spatial/spectral data size. The validation highlighted sub-pixel accuracy in image co-registration for the considered imaging technologies, including optical and radar imagery.

  1. Challenges in implementing electronic hand hygiene monitoring systems.

    PubMed

    Conway, Laurie J

    2016-05-02

    Electronic hand hygiene (HH) monitoring systems offer the exciting prospect of a more precise, less biased measure of HH performance than direct observation. However, electronic systems are challenging to implement. Selecting a system that minimizes disruption to the physical infrastructure and to clinician workflow, and that fits with the organization's culture and budget, is challenging. Getting front-line workers' buy-in and addressing concerns about the accuracy of the system and how the data will be used are also difficult challenges. Finally, ensuring information from the system reaches front-line workers and is used by them to improve HH practice is a complex challenge. We describe these challenges in detail and suggest ways to overcome them. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  3. Twisted Acoustics: Metasurface-Enabled Multiplexing and Demultiplexing.

    PubMed

    Jiang, Xue; Liang, Bin; Cheng, Jian-Chun; Qiu, Cheng-Wei

    2018-05-01

    Metasurfaces are used to enable acoustic orbital angular momentum (a-OAM)-based multiplexing in real-time, postprocess-free, and sensor-scanning-free fashions to improve the bandwidth of acoustic communication, with intrinsic compatibility and expandability to cooperate with other multiplexing schemes. The metasurface-based communication relying on encoding information onto twisted beams is numerically and experimentally demonstrated by realizing real-time picture transfer, which differs from existing static data transfer by encoding data onto OAM states. With the advantages of real-time transmission, passive and instantaneous data decoding, vanishingly low loss, compact size, and high transmitting accuracy, the study of a-OAM-based information transfer with metasurfaces offers a new route to boosting the capacity of acoustic communication and great potential to profoundly advance relevant fields. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. A robust omnifont open-vocabulary Arabic OCR system using pseudo-2D-HMM

    NASA Astrophysics Data System (ADS)

    Rashwan, Abdullah M.; Rashwan, Mohsen A.; Abdel-Hameed, Ahmed; Abdou, Sherif; Khalil, A. H.

    2012-01-01

    Recognizing old documents is highly desirable since the demand for quickly searching millions of archived documents has recently increased. Using Hidden Markov Models (HMMs) has proven to be a good solution to tackle the main problems of recognizing typewritten Arabic characters. Although these attempts achieved remarkable success for omnifont OCR under very favorable conditions, they did not achieve the same performance in practical conditions, i.e., on noisy documents. In this paper we present an omnifont, large-vocabulary Arabic OCR system using a Pseudo Two-Dimensional Hidden Markov Model (P2DHMM), which is a generalization of the HMM. The P2DHMM offers a more efficient way to model Arabic characters; such a model offers both minimal dependency on font size/style (omnifont) and a high level of robustness against noise. The evaluation results of this system are very promising compared to a baseline HMM system and the best OCRs available on the market (Sakhr and NovoDynamics). The recognition accuracy of the P2DHMM classifier was measured against the classic HMM classifier: the average word accuracy rates for the P2DHMM and HMM classifiers are 79% and 66%, respectively. The overall system accuracy was measured against the Sakhr and NovoDynamics OCR systems: the average word accuracy rates for P2DHMM, NovoDynamics, and Sakhr are 74%, 71%, and 61%, respectively.

  5. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  6. Improvements in force variability and structure from vision- to memory-guided submaximal isometric knee extension in subacute stroke.

    PubMed

    Chow, John W; Stokic, Dobrivoje S

    2018-03-01

    We examined changes in variability, accuracy, frequency composition, and temporal regularity of force signal from vision-guided to memory-guided force-matching tasks in 17 subacute stroke and 17 age-matched healthy subjects. Subjects performed a unilateral isometric knee extension at 10, 30, and 50% of peak torque [maximum voluntary contraction (MVC)] for 10 s (3 trials each). Visual feedback was removed at the 5-s mark in the first two trials (feedback withdrawal), and 30 s after the second trial the subjects were asked to produce the target force without visual feedback (force recall). The coefficient of variation and constant error were used to quantify force variability and accuracy. Force structure was assessed by the median frequency, relative spectral power in the 0-3-Hz band, and sample entropy of the force signal. At 10% MVC, the force signal in subacute stroke subjects became steadier, more broadband, and temporally more irregular after the withdrawal of visual feedback, with progressively larger error at higher contraction levels. Also, the lack of modulation in the spectral frequency at higher force levels with visual feedback persisted in both the withdrawal and recall conditions. In terms of changes from the visual feedback condition, the feedback withdrawal produced a greater difference between the paretic, nonparetic, and control legs than the force recall. The overall results suggest improvements in force variability and structure from vision- to memory-guided force control in subacute stroke despite decreased accuracy. Different sensory-motor memory retrieval mechanisms seem to be involved in the feedback withdrawal and force recall conditions, which deserves further study. NEW & NOTEWORTHY We demonstrate that in the subacute phase of stroke, force signals during a low-level isometric knee extension become steadier, more broadband in spectral power, and more complex after removal of visual feedback. Larger force errors are produced when recalling target forces than immediately after withdrawing visual feedback. Although visual feedback offers better accuracy, it worsens force variability and structure in subacute stroke. The feedback withdrawal and force recall conditions seem to involve different memory retrieval mechanisms.
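
    The signal metrics named above are straightforward to compute. The sketch below does so on a synthetic 10-s force record (target force, modulation, and noise levels are invented; sample entropy is omitted for brevity).

    ```python
    # Coefficient of variation, constant error, median frequency, and relative
    # 0-3 Hz spectral power for a synthetic isometric force signal.
    import numpy as np
    from scipy.signal import welch

    fs, target = 1000, 50.0                    # Hz, newtons (hypothetical)
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 1 / fs)
    force = target + 1.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.5, t.size)

    cv = 100 * force.std() / force.mean()      # variability, %
    constant_error = force.mean() - target     # accuracy, N

    f, pxx = welch(force - force.mean(), fs=fs, nperseg=4 * fs)
    cumulative = np.cumsum(pxx) / pxx.sum()
    median_freq = f[np.searchsorted(cumulative, 0.5)]
    rel_power_0_3hz = pxx[f <= 3.0].sum() / pxx.sum()
    print(cv, constant_error, median_freq, rel_power_0_3hz)
    ```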

  7. Multiparametric plasma EV profiling facilitates diagnosis of pancreatic malignancy.

    PubMed

    Yang, Katherine S; Im, Hyungsoon; Hong, Seonki; Pergolini, Ilaria; Del Castillo, Andres Fernandez; Wang, Rui; Clardy, Susan; Huang, Chen-Han; Pille, Craig; Ferrone, Soldano; Yang, Robert; Castro, Cesar M; Lee, Hakho; Del Castillo, Carlos Fernandez; Weissleder, Ralph

    2017-05-24

    Pancreatic ductal adenocarcinoma (PDAC) is usually detected late in the disease process. Clinical workup through imaging and tissue biopsies is often complex and expensive due to a paucity of reliable biomarkers. We used an advanced multiplexed plasmonic assay to analyze circulating tumor-derived extracellular vesicles (tEVs) in more than 100 clinical populations. Using EV-based protein marker profiling, we identified a signature of five markers (PDAC EV signature) for PDAC detection. In our prospective cohort, the accuracy for the PDAC EV signature was 84% [95% confidence interval (CI), 69 to 93%] but only 63 to 72% for single-marker screening. One of the best markers, GPC1 alone, had a sensitivity of 82% (CI, 60 to 95%) and a specificity of 52% (CI, 30 to 74%), whereas the PDAC EV signature showed a sensitivity of 86% (CI, 65 to 97%) and a specificity of 81% (CI, 58 to 95%). The PDAC EV signature of tEVs offered higher sensitivity, specificity, and accuracy than the existing serum marker (CA 19-9) or single-tEV marker analyses. This approach should improve the diagnosis of pancreatic cancer. Copyright © 2017, American Association for the Advancement of Science.
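
    For readers unfamiliar with how the reported metrics relate, the sketch below derives sensitivity, specificity, and accuracy from a 2x2 confusion table; the counts are invented, not the study's data.

    ```python
    # Diagnostic metrics from a 2x2 table (hypothetical counts).
    def diagnostics(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)           # fraction of cases detected
        specificity = tn / (tn + fp)           # fraction of controls cleared
        accuracy = (tp + tn) / (tp + fn + tn + fp)
        return sensitivity, specificity, accuracy

    sens, spec, acc = diagnostics(tp=19, fn=3, tn=17, fp=4)
    print(f"sensitivity={sens:.0%} specificity={spec:.0%} accuracy={acc:.0%}")
    ```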

  8. Potential transducers based man-tailored biomimetic sensors for selective recognition of dextromethorphan as an antitussive drug.

    PubMed

    El-Naby, Eman H; Kamel, Ayman H

    2015-09-01

    A biomimetic potentiometric sensor for specific recognition of dextromethorphan (DXM), a drug classified by the Drug Enforcement Administration (DEA) as a "drug of concern", is designed and characterized. A molecularly imprinted polymer (MIP), with special molecular recognition properties for DXM, was prepared by thermal polymerization in which DXM acted as the template molecule and methacrylic acid (MAA) and acrylonitrile (AN) acted as functional monomers in the presence of ethylene glycol dimethacrylate (EGDMA) as a crosslinker. The sensors showed a high selectivity and a sensitive response to the template in aqueous systems. Electrochemical evaluation of these sensors revealed near-Nernstian responses with slopes of 49.6±0.5 and 53.4±0.5 mV decade⁻¹ and detection limits of 1.9×10⁻⁶ and 1.0×10⁻⁶ mol L⁻¹ DXM for the MIP/MAA and MIP/AN membrane-based sensors, respectively. Significantly improved accuracy, precision, response time, stability, selectivity, and sensitivity were offered by these simple and cost-effective potentiometric sensors compared with other standard techniques. The method has the requisite accuracy, sensitivity, and precision to assay DXM in pharmaceutical products. Copyright © 2015 Elsevier B.V. All rights reserved.
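
    A near-Nernstian slope like those quoted above is typically estimated by a linear fit of electrode potential against log10(concentration) over the linear range. The readings below are synthetic, mimicking a ~50 mV/decade response; they are not the paper's data.

    ```python
    # Estimating a Nernstian slope from calibration standards (synthetic data).
    import numpy as np

    conc = np.array([1e-5, 1e-4, 1e-3, 1e-2])     # mol/L standards
    emf = np.array([112.0, 161.5, 211.2, 260.8])  # measured potentials, mV

    slope, intercept = np.polyfit(np.log10(conc), emf, deg=1)
    print(f"slope = {slope:.1f} mV/decade")       # ~49.6 here
    ```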

  9. Overlay accuracy with respect to device scaling

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Laidler, David; Cheng, Shaunee

    2012-03-01

    Overlay metrology performance is usually reported as repeatability, matching between tools, or optics aberrations distorting the measurement (tool-induced shift, or TIS). Over the last few years, improvement of these metrics by the tool suppliers has been impressive. But what about accuracy? Using different target types, we have already reported small differences in the mean value as well as in the fingerprint [1]. These differences make the correctables questionable. Which target is correct, and therefore which translation, scaling, etc. values should be fed back to the scanner? In this paper we investigate the sources of these differences, using several approaches. First, we measure the response of different targets to offsets programmed in a test vehicle. Second, we check the response of the same overlay targets to overlay errors programmed into the scanner. We compare overlay target designs; what is the contribution of the size of the features that make up the target? We use different overlay measurement techniques; is DBO (Diffraction Based Overlay) more accurate than IBO (Image Based Overlay)? We measure overlay on several stacks; what is the stack contribution to inaccuracy? In conclusion, we offer an explanation for the observed differences and propose a solution to reduce them.

  10. Inherent Limitations of Hydraulic Tomography

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Butler, J.J.

    2010-01-01

    We offer a cautionary note in response to an increasing level of enthusiasm regarding high-resolution aquifer characterization with hydraulic tomography. We use synthetic examples based on two recent field experiments to demonstrate that a high degree of nonuniqueness remains in estimates of hydraulic parameter fields even when those estimates are based on simultaneous analysis of a number of carefully controlled hydraulic tests. We must, therefore, be careful not to oversell the technique to the community of practicing hydrogeologists, promising a degree of accuracy and resolution that, in many settings, will remain unattainable, regardless of the amount of effort invested in the field investigation. No practically feasible amount of hydraulic tomography data will ever remove the need to regularize or bias the inverse problem in some fashion in order to obtain a unique solution. Thus, along with improving the resolution of hydraulic tomography techniques, we must also strive to couple those techniques with procedures for experimental design and uncertainty assessment and with other more cost-effective field methods, such as geophysical surveying and, in unconsolidated formations, direct-push profiling, in order to develop methods for subsurface characterization with the resolution and accuracy needed for practical field applications. Copyright © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.

  11. Theoretical and Experimental Studies of Epidermal Heat Flux Sensors for Measurements of Core Body Temperature

    PubMed Central

    Zhang, Yihui; Webb, Richard Chad; Luo, Hongying; Xue, Yeguang; Kurniawan, Jonas; Cho, Nam Heon; Krishnan, Siddharth; Li, Yuhang; Huang, Yonggang

    2016-01-01

    Long-term, continuous measurement of core body temperature is of high interest, due to the widespread use of this parameter as a key biomedical signal for clinical judgment and patient management. Traditional approaches rely on devices or instruments in rigid and planar forms, not readily amenable to intimate or conformable integration with soft, curvilinear, time-dynamic, surfaces of the skin. Here, materials and mechanics designs for differential temperature sensors are presented which can attach softly and reversibly onto the skin surface, and also sustain high levels of deformation (e.g., bending, twisting, and stretching). A theoretical approach, together with a modeling algorithm, yields core body temperature from multiple differential measurements from temperature sensors separated by different effective distances from the skin. The sensitivity, accuracy, and response time are analyzed by finite element analyses (FEA) to provide guidelines for relationships between sensor design and performance. Four sets of experiments on multiple devices with different dimensions and under different convection conditions illustrate the key features of the technology and the analysis approach. Finally, results indicate that thermally insulating materials with cellular structures offer advantages in reducing the response time and increasing the accuracy, while improving the mechanics and breathability. PMID:25953120
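
    A minimal sketch of the heat-flux principle such sensors rely on (this is my simplification, not the paper's multi-sensor, FEA-calibrated algorithm): a skin-side temperature sensor and one above an insulating layer allow extrapolation through the tissue to core temperature.

    ```python
    # Single-heat-flux core-temperature estimate (illustrative assumption).
    def core_temperature(t_skin, t_top, k):
        """Tc = Ts + K*(Ts - Ttop); K is a calibration constant reflecting the
        ratio of tissue thermal resistance to insulator resistance (assumed)."""
        return t_skin + k * (t_skin - t_top)

    # Hypothetical readings in deg C with an assumed calibration constant.
    print(core_temperature(36.2, 35.4, 0.6))   # -> 36.68
    ```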

  12. Monitoring Agricultural Production in Primary Export Countries within the framework of the GEOGLAM Initiative

    NASA Astrophysics Data System (ADS)

    Becker-Reshef, I.; Justice, C. O.; Vermote, E.

    2012-12-01

    Up-to-date, reliable, global information on crop production prospects is indispensable for informing and regulating grain markets and for instituting effective agricultural policies. The recent price surges in the global grain markets were in large part triggered by extreme weather events in primary grain export countries. These events raise important questions about the accuracy of current production forecasts and their role in market fluctuations, and highlight the deficiencies in the state of global agricultural monitoring. Satellite-based Earth observations are increasingly utilized as a tool for monitoring agricultural production, as they offer cost-effective, daily, global information on crop growth and extent, and their utility for crop production forecasting has long been demonstrated. Within this context, the Group on Earth Observations developed the Global Agricultural Monitoring (GEOGLAM) initiative, which was adopted by the G20 as part of the action plan on food price volatility and agriculture. The goal of GEOGLAM is to enhance agricultural production estimates through the use of Earth observations. This talk will explore the potential contribution of EO-based methods for improving the accuracy of early production estimates of main export countries within the framework of GEOGLAM.

  13. Assessment of Direct-to-Consumer Genetic Testing Policy in Korea Based on Consumer Preference.

    PubMed

    Jeong, Gicheol

    2017-01-01

    In June 2016, Korea permitted direct-to-consumer genetic testing (DTC-GT) on 42 genes. However, both the market and industry have not yet been fully activated. Against this background, this study assesses the Korean DTC-GT policy based on a consumer preference analysis using a discrete choice experiment. In August 2016, a web-based survey was conducted to collect data from 1,200 respondents. The estimation results show that consumers prefer a DTC-GT product that is cheap, tests various items or genes, offers accurate test results, and guarantees the confidentiality of all information. However, consumers are not entirely satisfied by current DTC-GT products due to insufficient and/or inadequate policies. First, the permitted testing of 42 genes is insufficient to satisfy consumers' curiosity regarding their genes. Second, the accuracy of the DTC-GT products has not been fully verified, assessed, and communicated to consumers. Finally, regulatory loopholes that allow information leaks in the DTC-GT process can occur. These findings imply that DTC-GT requires an improvement in government policy-making criteria and the implementation of practical measures to guarantee test accuracy and the security of genetic information. © 2017 S. Karger AG, Basel.

  14. Wireless Wearable Multisensory Suite and Real-Time Prediction of Obstructive Sleep Apnea Episodes.

    PubMed

    Le, Trung Q; Cheng, Changqing; Sangasoongsong, Akkarapol; Wongdhamma, Woranat; Bukkapatnam, Satish T S

    2013-01-01

    Obstructive sleep apnea (OSA) is a common sleep disorder found in 24% of adult men and 9% of adult women. Although continuous positive airway pressure (CPAP) has emerged as a standard therapy for OSA, a majority of patients are not tolerant to this treatment, largely because of the uncomfortable nasal air delivery during their sleep. Recent advances in wireless communication and advanced ("big data") predictive analytics technologies offer radically new point-of-care treatment approaches for OSA episodes with unprecedented comfort and affordability. We introduce a Dirichlet process-based mixture Gaussian process (DPMG) model to predict the onset of sleep apnea episodes based on analyzing complex cardiorespiratory signals gathered from a custom-designed wireless wearable multisensory suite. Extensive testing with signals from the multisensory suite as well as PhysioNet's OSA database suggests that the accuracy of offline OSA classification is 88%, and accuracy for predicting an OSA episode 1-min ahead is 83% and 3-min ahead is 77%. Such accurate prediction of an impending OSA episode can be used to adaptively adjust CPAP airflow (toward improving the patient's adherence) or the torso posture (e.g., minor chin adjustments to maintain steady levels of the airflow).
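
    As a schematic sketch only (the authors' DPMG model is considerably richer), a plain Gaussian process regressor can be fit to a short history of a cardiorespiratory feature and queried one minute ahead; the series below is synthetic:

      # Toy sketch: ahead-of-time prediction of a cardiorespiratory feature with a
      # plain Gaussian process regressor (the paper's DPMG model is far richer).
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      t = np.arange(0, 300, 5.0)                        # seconds (hypothetical rate)
      feature = np.sin(2 * np.pi * t / 120) + 0.1 * rng.standard_normal(t.size)

      gp = GaussianProcessRegressor(kernel=RBF(60.0) + WhiteKernel(0.01))
      gp.fit(t[:, None], feature)
      mean, std = gp.predict(np.array([[t[-1] + 60.0]]), return_std=True)  # 1 min ahead
      print(f"1-min-ahead feature: {mean[0]:.2f} +/- {std[0]:.2f}")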

  15. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.

    PubMed

    Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window. We can thus conclude that iQSA is better suited to develop real-time applications.
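
    A minimal sketch of the window-then-boost pipeline follows; simple per-window statistics stand in for the quaternion-based QSA features, and synthetic two-class signals stand in for EEG:

      # Sketch of the window-then-boost pipeline on synthetic two-class signals.
      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.model_selection import cross_val_score

      fs = 128                                   # Hz, hypothetical EEG sampling rate
      win = int(0.5 * fs)                        # 0.5 s analysis window
      rng = np.random.default_rng(1)

      def make_trial(label, n=3 * fs):           # 3 s synthetic "motor imagery" trial
          f = 10 if label else 20                # class-dependent dominant rhythm
          t = np.arange(n) / fs
          return np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(n)

      X, y = [], []
      for label in (0, 1):
          for _ in range(60):
              sig = make_trial(label)
              for s in range(0, sig.size - win + 1, win):
                  w = sig[s:s + win]
                  X.append([w.mean(), w.var(), np.abs(np.diff(w)).mean()])
                  y.append(label)

      clf = AdaBoostClassifier(n_estimators=50)  # boosted decision trees
      print("CV accuracy:", cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())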

  16. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method

    PubMed Central

    Batres-Mendoza, Patricia; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Montoro-Sanjose, Carlos R.

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window. We can thus conclude that iQSA is better suited to develop real-time applications. PMID:29348744

  17. Development of an integrated sub-picometric SWIFTS-based wavelength meter

    NASA Astrophysics Data System (ADS)

    Duchemin, Céline; Thomas, Fabrice; Martin, Bruno; Morino, Eric; Puget, Renaud; Oliveres, Robin; Bonneville, Christophe; Gonthiez, Thierry; Valognes, Nicolas

    2017-02-01

    SWIFTS™ technology has been known for over five years to offer compact and high-resolution laser spectrum analyzers. The increasing demand for wavelength monitoring with ever better accuracy and resolution has pushed the development of a wavelength meter based on SWIFTS™ technology, named LW-10. As a reminder, the SWIFTS™ principle consists of a waveguide in which a stationary wave is created, sampled, and read out by a linear image sensor array. Due to its inherent properties (non-uniform subsampling) and signal aliasing (as described by the Shannon-Nyquist criterion), the system offers short spectral window bandwidths and thus needs a priori knowledge of the working wavelength as well as thermal monitoring. Although SWIFTS™-based devices are barely sensitive to atmospheric pressure, temperature control is a key factor in mastering both high accuracy and wavelength meter resolution. Temperature control went from passive (temperature probing only) to active control (Peltier thermoelectric cooler) with milli-degree accuracy. On the software side, the Fourier-like transform is dropped in favor of a least-squares method applied directly to the interference pattern. Moreover, the consideration of the system's chromatic behavior provides a "signature" for automated wavelength detection and discrimination. This new SWIFTS™-based device - the LW-10 - shows outstanding results in terms of absolute accuracy, wavelength meter resolution and calibration robustness within a compact device, compared to other existing technologies. On the 630 - 1100 nm range, the final device configuration allows pulsed or CW laser monitoring with 20 MHz resolution and 200 MHz absolute accuracy. Non-exhaustive applications include tunable laser control and frequency-locking experiments.
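
    The least-squares idea can be sketched in a few lines: fit a sinusoidal fringe model directly to the sampled interference pattern, starting from an a priori wavelength guess as the abstract notes the system requires (the model and all numbers are schematic, not the LW-10 implementation):

      # Illustrative wavelength retrieval: least squares directly on a fringe pattern.
      import numpy as np
      from scipy.optimize import curve_fit

      def fringe(x, offset, amp, wavelength, phase):
          # Standing-wave intensity along the waveguide; the fringe period is
          # wavelength/2 (any effective index is folded into "wavelength" here).
          return offset + amp * np.cos(4 * np.pi * x / wavelength + phase)

      x = np.linspace(0, 10e-6, 256)                 # sampler positions along guide (m)
      rng = np.random.default_rng(2)
      signal = fringe(x, 1.1, 0.8, 0.8e-6, 0.3) + 0.02 * rng.standard_normal(x.size)

      p0 = [1.0, 0.7, 0.798e-6, 0.0]                 # includes a priori wavelength guess
      popt, _ = curve_fit(fringe, x, signal, p0=p0)
      print(f"estimated wavelength: {popt[2] * 1e9:.3f} nm")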

  18. Build Angle: Does It Influence the Accuracy of 3D-Printed Dental Restorations Using Digital Light-Processing Technology?

    PubMed

    Osman, Reham B; Alharbi, Nawal; Wismeijer, Daniel

    The aim of this study was to evaluate the effect of the build orientation/build angle on the dimensional accuracy of full-coverage dental restorations manufactured using digital light-processing technology (DLP-AM). A full dental crown was digitally designed and 3D-printed using DLP-AM. Nine build angles were used: 90, 120, 135, 150, 180, 210, 225, 240, and 270 degrees. The specimens were digitally scanned using a high-resolution optical surface scanner (IScan D104i, Imetric). Dimensional accuracy was evaluated using the digital subtraction technique. The 3D digital files of the scanned printed crowns (test model) were exported in standard tessellation language (STL) format and superimposed on the STL file of the designed crown (reference model) using Geomagic Studio 2014 (3D Systems). The root mean square estimate (RMSE) values were evaluated, and the deviation patterns on the color maps were further assessed. The build angle influenced the dimensional accuracy of 3D-printed restorations. The lowest RMSE was recorded for the 135-degree and 210-degree build angles. However, the overall deviation pattern on the color map was more favorable with the 135-degree build angle in contrast with the 210-degree build angle, where the deviation was observed around the critical marginal area. Within the limitations of this study, the recommended build angle using the current DLP system was 135 degrees. Among the selected build angles, it offers the highest dimensional accuracy and the most favorable deviation pattern. It also offers a self-supporting crown geometry throughout the building process.
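
    The digital-subtraction metric reduces to nearest-neighbor distances between the superimposed point sets; a minimal sketch follows (hypothetical points, with alignment assumed to have been done beforehand, e.g. by ICP):

      # Sketch of the digital-subtraction metric: RMSE of nearest-neighbor distances
      # from the scanned (test) points to the designed (reference) points.
      import numpy as np
      from scipy.spatial import cKDTree

      reference = np.random.default_rng(3).uniform(0, 10, size=(5000, 3))  # designed crown
      test = reference + np.random.default_rng(4).normal(0, 0.05, reference.shape)

      d, _ = cKDTree(reference).query(test)   # closest reference point per test point
      print(f"RMSE: {np.sqrt(np.mean(d ** 2)):.3f} (same units as the coordinates)")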

  19. Serious gaming technology in major incident triage training: a pragmatic controlled trial.

    PubMed

    Knight, James F; Carley, Simon; Tregunna, Bryan; Jarvis, Steve; Smithies, Richard; de Freitas, Sara; Dunwell, Ian; Mackway-Jones, Kevin

    2010-09-01

    By exploiting video games technology, serious games strive to deliver affordable, accessible and usable interactive virtual worlds, supporting applications in training, education, marketing and design. The aim of the present study was to evaluate the effectiveness of such a serious game in the teaching of major incident triage by comparing it with traditional training methods. Pragmatic controlled trial. During Major Incident Medical Management and Support Courses, 91 learners were randomly distributed into one of two training groups: 44 participants practiced triage sieve protocol using a card-sort exercise, whilst the remaining 47 participants used a serious game. Following the training sessions, each participant undertook an evaluation exercise, whereby they were required to triage eight casualties in a simulated live exercise. Performance was assessed in terms of tagging accuracy (assigning the correct triage tag to the casualty), step accuracy (following correct procedure) and time taken to triage all casualties. Additionally, the usability of both the card-sort exercise and video game were measured using a questionnaire. Tagging accuracy by participants who underwent the serious game training was significantly higher than that of those who undertook the card-sort exercise (χ²=13.126, p=0.02). Step accuracy was also higher in the serious game group, but only for the numbers of participants that followed correct procedure when triaging all eight casualties (χ²=5.45, p=0.0196). There was no significant difference in time to triage all casualties (card-sort=435±74 s vs video game=456±62 s, p=0.155). Serious game technologies offer the potential to enhance learning and improve subsequent performance when compared to traditional educational methods. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
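
    The tagging-accuracy comparison is a chi-square test on a contingency table of correct versus incorrect tags per group; a sketch with hypothetical counts (not the trial's raw data):

      # Sketch of the group comparison: chi-square test on a 2x2 contingency table.
      from scipy.stats import chi2_contingency

      #         correct  incorrect   (tags pooled across all eight casualties)
      table = [[310, 66],            # serious game group (hypothetical counts)
               [270, 82]]            # card-sort group (hypothetical counts)
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")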

  20. An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition.

    PubMed

    Rasouli, Mahdi; Chen, Yi; Basu, Arindam; Kukreja, Sunil L; Thakor, Nitish V

    2018-04-01

    Despite significant advances in computational algorithms and the development of tactile sensors, artificial tactile sensing is strikingly less efficient and capable than human tactile perception. Inspired by the efficiency of biological systems, we aim to develop a neuromorphic system for tactile pattern recognition. We particularly target texture recognition as it is one of the most necessary and challenging tasks for artificial sensory systems. Our system consists of a piezoresistive fabric material as the sensor to emulate skin, an interface that produces spike patterns to mimic neural signals from mechanoreceptors, and an extreme learning machine (ELM) chip to analyze spiking activity. Benefiting from the intrinsic advantages of biologically inspired event-driven systems and the massively parallel and energy-efficient processing capabilities of the ELM chip, the proposed architecture offers a fast and energy-efficient alternative for processing tactile information. Moreover, it provides the opportunity for the development of low-cost tactile modules for large-area applications by integration of sensors and processing circuits. We demonstrate the recognition capability of our system in a texture discrimination task, where it achieves a classification accuracy of 92% for categorization of ten graded textures. Our results confirm that there exists a tradeoff between response time and classification accuracy (and information transfer rate). A faster decision can be achieved at early time steps or by using a shorter time window. This, however, results in deterioration of the classification accuracy and information transfer rate. We further observe that there exists a tradeoff between the classification accuracy and the input spike rate (and thus energy consumption). Our work substantiates the importance of developing efficient sparse codes for encoding sensory data to improve energy efficiency. These results have significance for a wide range of wearable, robotic, prosthetic, and industrial applications.
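
    The ELM itself is compact enough to sketch: a fixed random hidden layer followed by a least-squares-trained linear readout (synthetic features below stand in for the chip's spike-derived inputs):

      # Minimal extreme learning machine: random hidden layer + least-squares readout.
      import numpy as np

      rng = np.random.default_rng(5)
      n_in, n_hidden, n_classes = 64, 256, 10

      X = rng.standard_normal((1000, n_in))          # stand-in feature vectors
      y = np.argmax(X[:, :n_classes], axis=1)        # a learnable synthetic labeling
      T = np.eye(n_classes)[y]                       # one-hot targets

      W = rng.standard_normal((n_in, n_hidden)) / np.sqrt(n_in)  # random, never trained
      b = rng.standard_normal(n_hidden)
      H = np.tanh(X @ W + b)                         # hidden-layer activations

      beta, *_ = np.linalg.lstsq(H, T, rcond=None)   # only the readout is learned
      accuracy = (np.argmax(H @ beta, axis=1) == y).mean()
      print("training accuracy:", round(float(accuracy), 2))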

  1. Comparison of Marker-Based Genomic Estimated Breeding Values and Phenotypic Evaluation for Selection of Bacterial Spot Resistance in Tomato.

    PubMed

    Liabeuf, Debora; Sim, Sung-Chur; Francis, David M

    2018-03-01

    Bacterial spot affects tomato crops (Solanum lycopersicum) grown under humid conditions. Major genes and quantitative trait loci (QTL) for resistance have been described, and multiple loci from diverse sources need to be combined to improve disease control. We investigated genomic selection (GS) prediction models for resistance to Xanthomonas euvesicatoria and experimentally evaluated the accuracy of these models. The training population consisted of 109 families combining resistance from four sources and directionally selected from a population of 1,100 individuals. The families were evaluated on a plot basis in replicated inoculated trials and genotyped with single nucleotide polymorphisms (SNP). We compared the prediction ability of models developed with 14 to 387 SNP. Genomic estimated breeding values (GEBV) were derived using Bayesian least absolute shrinkage and selection operator regression (BL) and ridge regression (RR). Evaluations were based on leave-one-out cross validation and on empirical observations in replicated field trials using the next generation of inbred progeny and a hybrid population resulting from selections in the training population. Prediction ability was evaluated based on correlations between GEBV and phenotypes (r_g), percentage of coselection between genomic and phenotypic selection, and relative efficiency of selection (r_g/r_p). Results were similar with BL and RR models. Models using only markers previously identified as significantly associated with resistance but weighted based on GEBV and mixed models with markers associated with resistance treated as fixed effects and markers distributed in the genome treated as random effects offered greater accuracy and a high percentage of coselection. The accuracy of these models to predict the performance of progeny and hybrids exceeded the accuracy of phenotypic selection.
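
    A minimal rrBLUP-style sketch of the GEBV workflow, with simulated genotypes and phenotypes and leave-one-out cross-validation as in the study (the Bayesian LASSO variant is omitted):

      # rrBLUP-style sketch: ridge regression on SNP genotypes; prediction ability
      # is the correlation between cross-validated GEBV and phenotypes.
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(6)
      n_families, n_snps = 109, 387                     # sizes echo the study
      X = rng.integers(0, 3, size=(n_families, n_snps)).astype(float)  # 0/1/2 coding
      effects = np.zeros(n_snps)
      effects[rng.choice(n_snps, 15, replace=False)] = rng.normal(0, 0.5, 15)
      y = X @ effects + rng.normal(0, 1.0, n_families)  # simulated resistance phenotype

      gebv = cross_val_predict(Ridge(alpha=10.0), X, y, cv=LeaveOneOut())
      print("prediction ability r:", np.corrcoef(gebv, y)[0, 1].round(2))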

  2. Sources, Sinks, and Model Accuracy

    EPA Science Inventory

    Spatial demographic models are a necessary tool for understanding how to manage landscapes sustainably for animal populations. These models, therefore, must offer precise and testable predictions about animal population dynamics and how animal demographic parameters respond to ...

  3. Will the Measurement Robots Take Our Jobs? An Update on the State of Automated M&V for Energy Efficiency Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Taylor, Cody

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of “smart” meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches, and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
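
    Two widely used M&V accuracy metrics, CV(RMSE) and NMBE, illustrate the kind of performance measures discussed; the simplified forms below use an n-1 denominator in the spirit of ASHRAE Guideline 14-style practice, and the consumption figures are hypothetical:

      # Sketch of two common baseline-model accuracy metrics for M&V.
      import numpy as np

      def cv_rmse(actual, predicted):
          """Coefficient of variation of RMSE, as a fraction of mean consumption."""
          n = actual.size
          return np.sqrt(np.sum((actual - predicted) ** 2) / (n - 1)) / actual.mean()

      def nmbe(actual, predicted):
          """Normalized mean bias error."""
          return np.sum(actual - predicted) / ((actual.size - 1) * actual.mean())

      actual = np.array([120.0, 135.0, 150.0, 160.0, 140.0, 125.0])    # kWh, hypothetical
      predicted = np.array([118.0, 139.0, 146.0, 158.0, 144.0, 128.0])
      print(f"CV(RMSE) = {cv_rmse(actual, predicted):.1%}, NMBE = {nmbe(actual, predicted):.1%}")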

  4. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes

    PubMed Central

    2016-01-01

    Background The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Objective Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. Methods After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients’ true scores following a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. Results We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. Conclusions With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access. PMID:26935793
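
    The adaptive loop can be sketched with a dichotomous Rasch model (the study used the partial credit model): administer the most informative remaining item at the current ability estimate, then re-estimate the measure by maximum likelihood. All values below are simulated:

      # Toy CAT loop under a dichotomous Rasch model (simplification of the study's
      # partial credit model); item difficulties and responses are simulated.
      import numpy as np

      rng = np.random.default_rng(7)
      difficulties = rng.normal(0, 1, 70)          # one parameter per item (70 items)
      true_theta = 0.8                             # simulated patient measure
      grid = np.linspace(-4, 4, 161)               # grid for ML estimation of theta

      asked, responses, theta = [], [], 0.0
      for _ in range(15):                          # fixed-length stop rule for brevity
          p = 1 / (1 + np.exp(-(theta - difficulties)))
          info = p * (1 - p)                       # Rasch item information at theta
          info[asked] = 0.0                        # never repeat an item
          item = int(np.argmax(info))
          p_true = 1 / (1 + np.exp(-(true_theta - difficulties[item])))
          asked.append(item)
          responses.append(rng.random() < p_true)  # simulated dichotomous response
          loglik = np.zeros(grid.size)             # re-estimate theta on the grid
          for i, r in zip(asked, responses):
              pi = 1 / (1 + np.exp(-(grid - difficulties[i])))
              loglik += np.log(pi) if r else np.log(1 - pi)
          theta = float(grid[np.argmax(loglik)])

      print(f"estimate after {len(asked)} items: {theta:.2f} (true value {true_theta})")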

  5. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    PubMed

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program using the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios of answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT can be more efficient for patients answering questions (ie, fewer items to respond to) than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access.

  6. The best prostate biopsy scheme is dictated by the gland volume: a monocentric study.

    PubMed

    Dell'Atti, L

    2015-08-01

    The accuracy of a biopsy scheme depends on several parameters. Prostate-specific antigen (PSA) level and digital rectal examination (DRE) influence the detection rate and suggest the biopsy scheme with which to approach each patient. Another parameter is the prostate volume. Sampling accuracy tends to decrease progressively with an increasing prostate volume. We prospectively observed the cancer detection rate in patients with suspected prostate cancer (PCa) and improved it by applying a biopsy protocol according to prostate volume (PV). Clinical data and pathological features of the 1356 patients were analysed and included in this study. This protocol is a combined scheme that includes transrectal (TR) 12-core PBx (TR12PBx) for PV ≤ 30 cc, TR 14-core PBx (TR14PBx) for PV > 30 cc but < 60 cc, and TR 18-core PBx (TR18PBx) for PV ≥ 60 cc. Out of a total of 1356 patients, PCa was identified in 111 (8.2%) through the TR12PBx scheme, in 198 (14.6%) through the TR14PBx scheme and in 253 (18.6%) through the TR18PBx scheme. The PCa detection rate was increased by 44% by adding two TZ cores (TR14PBx scheme). The TR18PBx scheme increased this rate by a further 21.7% vs. the TR14PBx scheme. The diagnostic yield offered by TR18PBx was statistically significantly higher than the detection rate offered by the TR14PBx scheme (p < 0.003). The biopsy Gleason score and the percentage of core involvement were comparable between PCa detected by the TR14PBx scheme and those detected by the TR18PBx scheme (p = 0.362). In our opinion, the PV parameter alone can be decisive in choosing the best biopsy scheme to adopt in a first setting of biopsies, increasing the PCa detection rate.

  7. Social Power Increases Interoceptive Accuracy

    PubMed Central

    Moeini-Jazani, Mehrad; Knoeferle, Klemens; de Molière, Laura; Gatti, Elia; Warlop, Luk

    2017-01-01

    Building on recent psychological research showing that power increases self-focused attention, we propose that having power increases accuracy in perception of bodily signals, a phenomenon known as interoceptive accuracy. Consistent with our proposition, participants in a high-power experimental condition outperformed those in the control and low-power conditions in the Schandry heartbeat-detection task. We demonstrate that the effect of power on interoceptive accuracy is not explained by participants’ physiological arousal, affective state, or general intention for accuracy. Rather, consistent with our reasoning that experiencing power shifts attentional resources inward, we show that the effect of power on interoceptive accuracy is dependent on individuals’ chronic tendency to focus on their internal sensations. Moreover, we demonstrate that individuals’ chronic sense of power also predicts interoceptive accuracy similar to, and independent of, how their situationally induced feeling of power does. We therefore provide further support for the relation between power and enhanced perception of bodily signals. Our findings offer a novel perspective, a psychophysiological account, on how power might affect judgments and behavior. We highlight and discuss some of these intriguing possibilities for future research. PMID:28824501

  8. Can NMR solve some significant challenges in metabolomics?

    PubMed Central

    Gowda, G.A. Nagana; Raftery, Daniel

    2015-01-01

    The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact biospecimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. PMID:26476597

  9. Accurate Damage Location in Complex Composite Structures and Industrial Environments using Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Eaton, M.; Pearson, M.; Lee, W.; Pullin, R.

    2015-07-01

    The ability to accurately locate damage in any given structure is a highly desirable attribute for an effective structural health monitoring system and could help to reduce operating costs and improve safety. This becomes a far greater challenge in complex geometries and materials, such as modern composite airframes. The poor translation of promising laboratory-based SHM demonstrators to industrial environments forms a barrier to commercial uptake of the technology. The acoustic emission (AE) technique is a passive NDT method that detects elastic stress waves released by the growth of damage. It offers very sensitive damage detection, using a sparse array of sensors to detect and globally locate damage within a structure. However, its application to complex structures commonly yields poor accuracy due to anisotropic wave propagation and the interruption of wave propagation by structural features such as holes and thickness changes. This work adopts an empirical mapping technique for AE location, known as Delta T Mapping, which uses experimental training data to account for such structural complexities. The technique is applied to a complex-geometry composite aerospace structure undergoing certification testing. The component consists of a carbon fibre composite tube with varying wall thickness and multiple holes, which was loaded under bending. The damage location was validated using X-ray CT scanning and the Delta T Mapping technique was shown to improve location accuracy when compared with commercial algorithms. The onset and progression of damage were monitored throughout the test and used to inform future design iterations.
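
    The core of Delta T Mapping is a lookup against training data: each grid node stores the arrival-time differences (delta-T) measured for a known source position, and a new event is located at the node that best matches its measured delta-Ts. A minimal sketch with synthetic values:

      # Sketch of the Delta T Mapping lookup (training map and event are synthetic).
      import numpy as np

      rng = np.random.default_rng(8)
      n_nodes, n_pairs = 400, 6                            # hypothetical training map
      grid_xy = rng.uniform(0, 300, size=(n_nodes, 2))     # node positions (mm)
      training_dt = rng.normal(0, 20, size=(n_nodes, n_pairs))  # delta-T map (us)

      true_node = 123                                      # simulate a new AE event
      measured_dt = training_dt[true_node] + rng.normal(0, 1.0, n_pairs)

      mismatch = np.sum((training_dt - measured_dt) ** 2, axis=1)
      best = int(np.argmin(mismatch))                      # best-matching node
      print("located at node", best, "position (mm):", np.round(grid_xy[best], 1))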

  10. Can NMR solve some significant challenges in metabolomics?

    NASA Astrophysics Data System (ADS)

    Nagana Gowda, G. A.; Raftery, Daniel

    2015-11-01

    The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact bio-specimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory.

  11. 23 CFR 1200.22 - State traffic safety information system improvements grants.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...

  12. 23 CFR 1200.22 - State traffic safety information system improvements grants.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... measures to be used to demonstrate quantitative progress in the accuracy, completeness, timeliness... to implement, provides an explanation. (d) Requirement for quantitative improvement. A State shall demonstrate quantitative improvement in the data attributes of accuracy, completeness, timeliness, uniformity...

  13. Accuracy and precision of patient positioning for pelvic MR-only radiation therapy using digitally reconstructed radiographs

    NASA Astrophysics Data System (ADS)

    Kemppainen, R.; Vaara, T.; Joensuu, T.; Kiljunen, T.

    2018-03-01

    Background and Purpose. Magnetic resonance imaging (MRI) has in recent years emerged as an imaging modality to drive precise contouring of targets and organs at risk in external beam radiation therapy. Moreover, recent advances in MRI enable treatment of cancer without computed tomography (CT) simulation. A commercially available MR-only solution, MRCAT, offers a single-modality approach that provides density information for dose calculation and generation of positioning reference images. We evaluated the accuracy of patient positioning based on MRCAT digitally reconstructed radiographs (DRRs) by comparing to the standard CT-based workflow. Materials and Methods. Twenty consecutive prostate cancer patients being treated with external beam radiation therapy were included in the study. DRRs were generated for each patient based on the planning CT and MRCAT. The accuracy assessment was performed by manually registering the DRR images to planar kV setup images using bony landmarks. A Bayesian linear mixed effects model was used to separate systematic and random components (inter- and intra-observer variation) in the assessment. In addition, method agreement was assessed using a Bland-Altman analysis. Results. The systematic differences between MRCAT- and CT-based patient positioning, averaged over the study population, were found to be (mean [95% CI]) -0.49 [-0.85 to -0.13] mm, 0.11 [-0.33 to +0.57] mm and -0.05 [-0.23 to +0.36] mm in the vertical, longitudinal and lateral directions, respectively. The increases in total random uncertainty were estimated to be below 0.5 mm for all directions when using the MR-only workflow instead of CT. Conclusions. The MRCAT pseudo-CT method provides clinically acceptable accuracy and precision for patient positioning for pelvic radiation therapy based on planar DRR images. Furthermore, due to the reduction of geometric uncertainty compared to a dual-modality workflow, the approach is likely to improve the total geometric accuracy of pelvic radiation therapy.
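
    The Bland-Altman agreement analysis mentioned above reduces to the bias (mean difference) and its 95% limits of agreement; a sketch with simulated setup corrections (the offset roughly mimics the reported vertical bias):

      # Sketch of a Bland-Altman analysis between two positioning methods.
      import numpy as np

      rng = np.random.default_rng(9)
      ct = rng.normal(0, 1.0, 20)                    # CT-based corrections (mm), simulated
      mrcat = ct - 0.49 + rng.normal(0, 0.4, 20)     # MRCAT-based corrections (mm)

      diff = mrcat - ct
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)                  # half-width of the 95% limits
      print(f"bias {bias:.2f} mm, limits of agreement {bias - loa:.2f} to {bias + loa:.2f} mm")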

  14. A comparison of genomic selection models across time in interior spruce (Picea engelmannii × glauca) using unordered SNP imputation methods

    PubMed Central

    Ratcliffe, B; El-Dien, O G; Klápště, J; Porth, I; Chen, C; Jaquish, B; El-Kassaby, Y A

    2015-01-01

    Genomic selection (GS) potentially offers an unparalleled advantage over traditional pedigree-based selection (TS) methods by reducing the time commitment required to carry out a single cycle of tree improvement. This quality is particularly appealing to tree breeders, where lengthy improvement cycles are the norm. We explored the prospect of implementing GS for interior spruce (Picea engelmannii × glauca) utilizing a genotyped population of 769 trees belonging to 25 open-pollinated families. A series of repeated tree height measurements through ages 3–40 years permitted the testing of GS methods temporally. The genotyping-by-sequencing (GBS) platform was used for single nucleotide polymorphism (SNP) discovery in conjunction with three unordered imputation methods applied to a data set with 60% missing information. Further, three diverse GS models were evaluated based on predictive accuracy (PA), and their marker effects. Moderate levels of PA (0.31–0.55) were observed and were of sufficient capacity to deliver improved selection response over TS. Additionally, PA varied substantially through time, in accordance with spatial competition among trees. As expected, temporal PA was well correlated with age-age genetic correlation (r=0.99), and decreased substantially with increasing difference in age between the training and validation populations (0.04–0.47). Moreover, our imputation comparisons indicate that k-nearest neighbor and singular value decomposition yielded a greater number of SNPs and gave higher predictive accuracies than imputing with the mean. Furthermore, the ridge regression (rrBLUP) and BayesCπ (BCπ) models both yielded equal PA, which was better than that of the generalized ridge regression heteroscedastic effect model for the traits evaluated. PMID:26126540
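
    The imputation comparison can be sketched end to end: mask 60% of simulated SNP calls, impute with the mean or with k-nearest neighbours, and score each by the downstream prediction ability of a ridge model (a stand-in for the study's rrBLUP/BCπ models):

      # Sketch: mean vs kNN imputation of missing SNP calls, scored by downstream
      # ridge-regression prediction ability (all data simulated; 60% entries masked).
      import numpy as np
      from sklearn.impute import KNNImputer, SimpleImputer
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(10)
      X = rng.integers(0, 3, size=(300, 200)).astype(float)   # 0/1/2 genotype coding
      effects = rng.normal(0, 0.3, 200)
      y = X @ effects + rng.normal(0, 2.0, 300)               # simulated tree height

      X_missing = X.copy()
      X_missing[rng.random(X.shape) < 0.6] = np.nan           # 60% missing information

      for name, imputer in [("mean", SimpleImputer()), ("kNN", KNNImputer(n_neighbors=5))]:
          X_imp = imputer.fit_transform(X_missing)
          gebv = cross_val_predict(Ridge(alpha=50.0), X_imp, y, cv=5)
          print(name, "prediction ability:", np.corrcoef(gebv, y)[0, 1].round(2))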

  15. A comparison of genomic selection models across time in interior spruce (Picea engelmannii × glauca) using unordered SNP imputation methods.

    PubMed

    Ratcliffe, B; El-Dien, O G; Klápště, J; Porth, I; Chen, C; Jaquish, B; El-Kassaby, Y A

    2015-12-01

    Genomic selection (GS) potentially offers an unparalleled advantage over traditional pedigree-based selection (TS) methods by reducing the time commitment required to carry out a single cycle of tree improvement. This quality is particularly appealing to tree breeders, where lengthy improvement cycles are the norm. We explored the prospect of implementing GS for interior spruce (Picea engelmannii × glauca) utilizing a genotyped population of 769 trees belonging to 25 open-pollinated families. A series of repeated tree height measurements through ages 3-40 years permitted the testing of GS methods temporally. The genotyping-by-sequencing (GBS) platform was used for single nucleotide polymorphism (SNP) discovery in conjunction with three unordered imputation methods applied to a data set with 60% missing information. Further, three diverse GS models were evaluated based on predictive accuracy (PA), and their marker effects. Moderate levels of PA (0.31-0.55) were observed and were of sufficient capacity to deliver improved selection response over TS. Additionally, PA varied substantially through time, in accordance with spatial competition among trees. As expected, temporal PA was well correlated with age-age genetic correlation (r=0.99), and decreased substantially with increasing difference in age between the training and validation populations (0.04-0.47). Moreover, our imputation comparisons indicate that k-nearest neighbor and singular value decomposition yielded a greater number of SNPs and gave higher predictive accuracies than imputing with the mean. Furthermore, the ridge regression (rrBLUP) and BayesCπ (BCπ) models both yielded equal PA, which was better than that of the generalized ridge regression heteroscedastic effect model for the traits evaluated.

  16. Statistical models for incorporating data from routine HIV testing of pregnant women at antenatal clinics into HIV/AIDS epidemic estimates.

    PubMed

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B; Gregson, Simon; Eaton, Jeffrey W; Bao, Le

    2017-04-01

    HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women and can be used to improve estimates of national and subnational HIV prevalence trends. We develop methods to incorporate this new data source into the Joint United Nations Programme on HIV/AIDS (UNAIDS) Estimation and Projection Package in Spectrum 2017. We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (site-level) or regionally (census-level), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends and should be tested as more data become available from national ANC-RT programs.

  17. Statistical Models for Incorporating Data from Routine HIV Testing of Pregnant Women at Antenatal Clinics into HIV/AIDS Epidemic Estimates

    PubMed Central

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B.; Gregson, Simon; Eaton, Jeffrey W.; Bao, Le

    2017-01-01

    Objective HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women, and can be used to improve estimates of national and sub-national HIV prevalence trends. We develop methods to incorporate this new data source into the UNAIDS Estimation and Projection Package (EPP) in Spectrum 2017. Methods We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (‘site-level’) or regionally (‘census-level’), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. Results We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. Conclusion We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends, and should be tested as more data become available from national ANC-RT programs. PMID:28296804

  18. Evaluation of sampling frequency, window size and sensor position for classification of sheep behaviour.

    PubMed

    Walton, Emily; Casey, Christy; Mitsch, Jurgen; Vázquez-Diosdado, Jorge A; Yan, Juan; Dottorini, Tania; Ellis, Keith A; Winterlich, Anthony; Kaler, Jasmeet

    2018-02-01

    Automated behavioural classification and identification through sensors has the potential to improve the health and welfare of animals. The position of a sensor, the sampling frequency and the window size of segmented signal data have a major impact on classification accuracy in activity recognition and on the energy needs of the sensor, yet there are no studies in precision livestock farming that have evaluated the effect of all these factors simultaneously. The aim of this study was to evaluate the effects of position (ear and collar), sampling frequency (8, 16 and 32 Hz) of a triaxial accelerometer and gyroscope sensor and window size (3, 5 and 7 s) on the classification of important behaviours in sheep such as lying, standing and walking. Behaviours were classified using a random forest approach with 44 feature characteristics. The best performance for walking, standing and lying classification in sheep (accuracy 95%, F-score 91%-97%) was obtained using combinations of 32 Hz, 7 s and 32 Hz, 5 s for both ear and collar sensors, although results obtained with a 16 Hz, 7 s window were comparable, with accuracy of 91%-93% and F-score 88%-95%. Energy efficiency was best at a 7 s window. This suggests that sampling at 16 Hz with a 7 s window will offer benefits in a real-time behavioural monitoring system for sheep due to reduced energy needs.
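
    The windowing-and-classification step can be sketched as follows; a handful of per-window features on synthetic triaxial accelerometer streams stand in for the 44 features used in the study:

      # Sketch of window features + random forest on synthetic accelerometer data.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      fs, win_s = 16, 7                        # 16 Hz, 7 s window, as recommended
      win = fs * win_s
      rng = np.random.default_rng(11)

      def window_features(w):                  # w: (win, 3) accelerometer samples
          mag = np.linalg.norm(w, axis=1)
          return [mag.mean(), mag.std(), np.abs(np.diff(mag)).mean(),
                  w[:, 0].mean(), w[:, 1].mean(), w[:, 2].mean()]

      X, y = [], []
      for label, motion in enumerate([0.02, 0.1, 0.5]):   # lying, standing, walking
          for _ in range(80):
              w = rng.normal(0, motion, (win, 3)) + [0.0, 0.0, 1.0]  # gravity on z
              X.append(window_features(w))
              y.append(label)

      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      print("CV accuracy:", cross_val_score(clf, np.array(X), y, cv=5).mean().round(2))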

  19. Evaluation of sampling frequency, window size and sensor position for classification of sheep behaviour

    PubMed Central

    Walton, Emily; Casey, Christy; Mitsch, Jurgen; Vázquez-Diosdado, Jorge A.; Yan, Juan; Dottorini, Tania; Ellis, Keith A.; Winterlich, Anthony

    2018-01-01

    Automated behavioural classification and identification through sensors has the potential to improve the health and welfare of animals. The position of a sensor, the sampling frequency and the window size of segmented signal data have a major impact on classification accuracy in activity recognition and on the energy needs of the sensor, yet there are no studies in precision livestock farming that have evaluated the effect of all these factors simultaneously. The aim of this study was to evaluate the effects of position (ear and collar), sampling frequency (8, 16 and 32 Hz) of a triaxial accelerometer and gyroscope sensor and window size (3, 5 and 7 s) on the classification of important behaviours in sheep such as lying, standing and walking. Behaviours were classified using a random forest approach with 44 feature characteristics. The best performance for walking, standing and lying classification in sheep (accuracy 95%, F-score 91%–97%) was obtained using combinations of 32 Hz, 7 s and 32 Hz, 5 s for both ear and collar sensors, although results obtained with a 16 Hz, 7 s window were comparable, with accuracy of 91%–93% and F-score 88%–95%. Energy efficiency was best at a 7 s window. This suggests that sampling at 16 Hz with a 7 s window will offer benefits in a real-time behavioural monitoring system for sheep due to reduced energy needs. PMID:29515862

  20. Accuracy and Efficiency of Recording Pediatric Early Warning Scores Using an Electronic Physiological Surveillance System Compared With Traditional Paper-Based Documentation

    PubMed Central

    Sefton, Gerri; Lane, Steven; Killen, Roger; Black, Stuart; Lyon, Max; Ampah, Pearl; Sproule, Cathryn; Loren-Gosling, Dominic; Richards, Caitlin; Spinty, Jean; Holloway, Colette; Davies, Coral; Wilson, April; Chean, Chung Shen; Carter, Bernie; Carrol, E.D.

    2017-01-01

    Pediatric Early Warning Scores are advocated to assist health professionals to identify early signs of serious illness or deterioration in hospitalized children. Scores are derived from the weighting applied to recorded vital signs and clinical observations reflecting deviation from a predetermined “norm.” Higher aggregate scores trigger an escalation in care aimed at preventing critical deterioration. Process errors made while recording these data, including plotting or calculation errors, have the potential to impede the reliability of the score. To test this hypothesis, we conducted a controlled study of documentation using five clinical vignettes. We measured the accuracy of vital sign recording, score calculation, and time taken to complete documentation using a handheld electronic physiological surveillance system, VitalPAC Pediatric, compared with traditional paper-based charts. We explored the user acceptability of both methods using a Web-based survey. Twenty-three staff participated in the controlled study. The electronic physiological surveillance system improved the accuracy of vital sign recording, 98.5% versus 85.6%, P < .02, Pediatric Early Warning Score calculation, 94.6% versus 55.7%, P < .02, and saved time, 68 versus 98 seconds, compared with paper-based documentation, P < .002. Twenty-nine staff completed the Web-based survey. They perceived that the electronic physiological surveillance system offered safety benefits by reducing human error while providing instant visibility of recorded data to the entire clinical team. PMID:27832032

  1. Towards automated spectroscopic tissue classification in thyroid and parathyroid surgery.

    PubMed

    Schols, Rutger M; Alic, Lejla; Wieringa, Fokko P; Bouvy, Nicole D; Stassen, Laurents P S

    2017-03-01

    In (para-)thyroid surgery, iatrogenic parathyroid injury should be prevented. To aid the surgeons' eye, a camera system enabling parathyroid-specific image enhancement would be useful. Hyperspectral camera technology might work, provided that the spectral signature of parathyroid tissue offers enough specific features to be reliably and automatically distinguished from surrounding tissues. As a first step to investigate this, we examined the feasibility of wide-band diffuse reflectance spectroscopy (DRS) for automated spectroscopic tissue classification, using silicon (Si) and indium-gallium-arsenide (InGaAs) sensors. DRS (350-1830 nm) was performed during (para-)thyroid resections. From the acquired spectra 36 features at predefined wavelengths were extracted. The best features for classification of parathyroid from adipose or thyroid were assessed by binary logistic regression for Si- and InGaAs-sensor ranges. Classification performance was evaluated by leave-one-out cross-validation. In 19 patients 299 spectra were recorded (62 tissue sites: thyroid = 23, parathyroid = 21, adipose = 18). Classification accuracy of parathyroid-adipose was, respectively, 79% (Si), 82% (InGaAs) and 97% (Si/InGaAs combined). Parathyroid-thyroid classification accuracies were 80% (Si), 75% (InGaAs), 82% (Si/InGaAs combined). Si and InGaAs sensors are fairly accurate for automated spectroscopic classification of parathyroid, adipose and thyroid tissues. Combination of both sensor technologies improves accuracy. Follow-up research aimed towards hyperspectral imaging seems justified. Copyright © 2016 John Wiley & Sons, Ltd.
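
    The classification step is binary logistic regression scored by leave-one-out cross-validation; a sketch with synthetic class-shifted spectra standing in for the 36 wavelength features:

      # Sketch: binary logistic regression with leave-one-out cross-validation
      # (synthetic features; sample sizes echo the parathyroid/adipose sites).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      rng = np.random.default_rng(12)
      n_para, n_adipose = 21, 18
      X = np.vstack([rng.normal(0.0, 1.0, (n_para, 36)),
                     rng.normal(0.7, 1.0, (n_adipose, 36))])   # class-shifted spectra
      y = np.array([1] * n_para + [0] * n_adipose)             # parathyroid vs adipose

      acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
      print("LOOCV accuracy:", acc.mean().round(2))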

  2. WorldDEM - A Novel Global Foundation Layer

    NASA Astrophysics Data System (ADS)

    Riegler, G.; Hennig, S. D.; Weber, M.

    2015-03-01

    Airbus Defence and Space's WorldDEM™ provides a global Digital Elevation Model of unprecedented quality, accuracy, and coverage. The product will feature a vertical accuracy of 2 m (relative) and better than 6 m (absolute) in a 12 m x 12 m raster. The accuracy will surpass that of any global satellite-based elevation model available. WorldDEM is a game-changing disruptive technology and will define a new standard in global elevation models. The German radar satellites TerraSAR-X and TanDEM-X form a high-precision radar interferometer in space and acquire the data basis for the WorldDEM. This mission is performed jointly with the German Aerospace Center (DLR). Airbus DS refines the Digital Surface Model (e.g., editing of acquisition and processing artefacts and of water surfaces) or generates a Digital Terrain Model. Three product levels are offered: WorldDEMcore (the output of the processing, with no editing applied), WorldDEM™ (guarantees a void-free terrain description and hydrological consistency) and WorldDEM DTM (represents bare-Earth elevation). Precise elevation data is the initial foundation of any accurate geospatial product, particularly when the integration of multi-source imagery and data is performed based upon it. Fused data provides improved reliability, increased confidence and reduced ambiguity. This paper presents the current status of product development activities, including the methodologies and tools used to generate the products, such as terrain and water-body editing and DTM generation. In addition, studies on verification and validation of the WorldDEM products will be presented.

  3. An automated approach to measuring child movement and location in the early childhood classroom.

    PubMed

    Irvin, Dwight W; Crutchfield, Stephen A; Greenwood, Charles R; Kearns, William D; Buzhardt, Jay

    2018-06-01

    Children's movement is an important issue in child development and outcome in early childhood research, intervention, and practice. Digital sensor technologies offer improvements in naturalistic movement measurement and analysis. We conducted validity and feasibility testing of a real-time, indoor mapping and location system (Ubisense, Inc.) within a preschool classroom. Real-time indoor mapping has several implications with respect to efficiently and conveniently: (a) determining the activity areas where children are spending the most and least time per day (e.g., music); and (b) mapping a focal child's atypical real-time movements (e.g., lapping behavior). We calibrated the accuracy of Ubisense point-by-point location estimates (i.e., X and Y coordinates) against laser rangefinder measurements using several stationary points and atypical movement patterns as reference standards. Our results indicate that activity areas occupied and atypical movement patterns could be plotted with an accuracy of 30.48 cm (1 ft) using a Ubisense transponder tag attached to the participating child's shirt. The accuracy parallels findings of other researchers employing Ubisense to study atypical movement patterns in individuals at risk for dementia in an assisted living facility. The feasibility of Ubisense was tested in an approximately 90-min assessment of two children, one typically developing and one with Down syndrome, during natural classroom activities, and the results proved positive. Implications for employing Ubisense in early childhood classrooms as a data-based decision-making tool to support children's development and its potential integration with other wearable sensor technologies are discussed.

  4. Robotic Stereotaxy in Cranial Neurosurgery: A Qualitative Systematic Review.

    PubMed

    Fomenko, Anton; Serletis, Demitre

    2017-12-14

    Modern-day stereotactic techniques have evolved to tackle the neurosurgical challenge of accurately and reproducibly accessing specific brain targets. Neurosurgical advances have been made in synergy with sophisticated technological developments and engineering innovations such as automated robotic platforms. Robotic systems offer a unique combination of dexterity, durability, indefatigability, and precision. To perform a systematic review of robotic integration for cranial stereotactic guidance in neurosurgery. Specifically, we comprehensively analyze the strengths and weaknesses of a spectrum of robotic technologies, past and present, including details pertaining to each system's kinematic specifications and targeting accuracy profiles. Eligible articles on human clinical applications of cranial robotic-guided stereotactic systems between 1985 and 2017 were extracted from several electronic databases, with a focus on stereotactic biopsy procedures, stereoelectroencephalography, and deep brain stimulation electrode insertion. Cranial robotic stereotactic systems feature serial or parallel architectures with 4 to 7 degrees of freedom, and frame-based or frameless registration. Indications for robotic assistance are diversifying, and include stereotactic biopsy, deep brain stimulation and stereoelectroencephalography electrode placement, ventriculostomy, and ablation procedures. Complication rates are low, and mainly consist of hemorrhage. Newer systems benefit from increasing targeting accuracy, intraoperative imaging ability, improved safety profiles, and reduced operating times. We highlight emerging future directions pertaining to the integration of robotic technologies into future neurosurgical procedures. Notably, a trend toward miniaturization, cost-effectiveness, frameless registration, and increasing safety and accuracy characterize successful stereotactic robotic technologies. Copyright © 2017 by the Congress of Neurological Surgeons

  5. Avulsion research using flume experiments and highly accurate and temporal-rich SfM datasets

    NASA Astrophysics Data System (ADS)

    Javernick, L.; Bertoldi, W.; Vitti, A.

    2017-12-01

    SfM's ability to produce high-quality, large-scale digital elevation models (DEMs) of complicated and rapidly evolving systems has made it a valuable technique for low-budget researchers and practitioners. While SfM has provided valuable datasets that capture single-flood-event DEMs, there is an increasing scientific need to capture datasets at higher temporal resolution that can quantify the evolutionary processes instead of pre- and post-flood snapshots. However, the dangerous field conditions during flood events and image-matching challenges (e.g. wind, rain) prevent quality SfM image acquisition. Conversely, flume experiments offer opportunities to document flood events, but achieving consistent and accurate DEMs to detect subtle changes in dry and inundated areas remains a challenge for SfM (e.g. parabolic error signatures). This research aimed to investigate the impact of naturally occurring and manipulated avulsions on braided river morphology and on the encroachment of floodplain vegetation, using laboratory experiments. This required DEMs with millimeter accuracy and precision, at a temporal resolution sufficient to capture the processes. SfM was chosen as it offered the most practical method. Through redundant local network design and a meticulous ground control point (GCP) survey with a Leica Total Station in red laser configuration (reported 2 mm accuracy), the SfM residual errors compared to separate ground-truthing data produced mean errors of 1.5 mm (accuracy) and standard deviations of 1.4 mm (precision) without parabolic error signatures. Lighting conditions in the flume were limited to uniform, oblique, and filtered LED strips, which removed glint and thus improved bed elevation mean errors to 4 mm; errors were further reduced by means of open-source software for refraction correction. The obtained datasets have provided the ability to quantify how small flood events with avulsion can have similar morphologic and vegetation impacts as large flood events without avulsion. Further, this research highlights the potential application of SfM in the laboratory and its ability to document physical and biological processes at greater spatial and temporal resolution. Marie Sklodowska-Curie Individual Fellowship: River-HMV, 656917
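
    For through-water surfaces, a common first-order refraction correction simply scales apparent depths by the refractive index of water; the open-source tool used in the study may implement a fuller per-ray model, so the sketch below is only the small-angle approximation:

      # Small-angle refraction correction for through-water SfM bathymetry:
      # apparent depths are multiplied by the refractive index of water.
      import numpy as np

      N_WATER = 1.337                          # refractive index of clear water

      def correct_depths(apparent_depths_m):
          return np.asarray(apparent_depths_m) * N_WATER

      print(correct_depths([0.10, 0.25, 0.40]))   # corrected depths in metres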

  6. Tightly Coupled Integration of GPS Ambiguity Fixed Precise Point Positioning and MEMS-INS through a Troposphere-Constrained Adaptive Kalman Filter

    PubMed Central

    Han, Houzeng; Xu, Tianhe; Wang, Jian

    2016-01-01

    Precise Point Positioning (PPP) makes use of undifferenced pseudorange and carrier phase measurements with ionospheric-free (IF) combinations to achieve centimeter-level positioning accuracy. Conventionally, the IF ambiguities are estimated as float values. To improve PPP positioning accuracy and shorten the convergence time, the integer phase clock model with between-satellites single-difference (BSSD) operation is used to recover the integer property. However, the continuity and availability of stand-alone PPP are largely restricted by the observation environment; positioning performance degrades significantly when GPS operates in challenging environments where fewer than five satellites are visible. A commonly used remedy is to integrate a low-cost inertial sensor to improve positioning performance and robustness. In this study, a tightly coupled (TC) algorithm is implemented by integrating PPP with an inertial navigation system (INS) using an extended Kalman filter (EKF), in which the navigation states, inertial sensor errors and GPS error states are estimated together. A troposphere-constrained approach, which utilizes external tropospheric delay as a virtual observation, is applied to further improve the ambiguity-fixed height positioning accuracy, and an improved adaptive filtering strategy is implemented to improve the covariance modelling in view of realistic noise effects. A field vehicular test with a geodetic GPS receiver and a low-cost inertial sensor was conducted to validate the improvement in positioning performance. The results show that positioning accuracy improves with inertial aiding: centimeter-level positioning accuracy is achievable during the test, and the PPP/INS TC integration re-converges quickly after signal outages. For troposphere-constrained solutions, a significant improvement in the height component is obtained; the overall positioning accuracies of the height component improve by 30.36%, 16.95% and 24.07% for three different convergence times (60, 50 and 30 min, respectively). The ambiguity-fixed horizontal positioning accuracy also improves significantly: compared with the conventional PPP solution, position accuracies improve by 19.51%, 61.11% and 23.53% for the north, east and height components, respectively, after one hour of convergence with the troposphere-constraint fixed PPP/INS and adaptive covariance model. PMID:27399721
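    To make the troposphere-constrained update concrete, here is a minimal sketch of a Kalman measurement update in which an external zenith tropospheric delay enters as a virtual observation of one state. The state layout, delay value, and noise levels are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def kf_update(x, P, z, H, R):
        """Standard Kalman measurement update for a linear observation."""
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # gain
        x = x + K @ (z - H @ x)           # state correction
        P = (np.eye(len(x)) - K @ H) @ P  # covariance update
        return x, P

    # Hypothetical state vector with the zenith wet delay (ZWD) last
    n = 10
    x, P = np.zeros(n), np.eye(n)

    # Virtual observation: an external ZWD product observes the last state
    H_trop = np.zeros((1, n)); H_trop[0, -1] = 1.0
    z_trop = np.array([0.12])             # external ZWD in metres (illustrative)
    R_trop = np.array([[0.01 ** 2]])      # 1 cm sigma (illustrative)

    x, P = kf_update(x, P, z_trop, H_trop, R_trop)
    ```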

  7. Crown-Level Tree Species Classification Using Integrated Airborne Hyperspectral and LIDAR Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Wu, J.; Wang, Y.; Kong, X.; Bao, H.; Ni, Y.; Ma, L.; Jin, J.

    2018-05-01

    Mapping tree species is essential for sustainable planning and for improving our understanding of the role different trees play as providers of different ecological services. However, automatic crown-level tree species classification is challenging due to the spectral similarity among diverse tree species, fine-scale spatial variation, shadow, and underlying objects within a crown. Advanced remote sensing data such as airborne Light Detection and Ranging (LiDAR) and hyperspectral imagery offer a great opportunity to derive crown spectral, structural and canopy physiological information at the individual crown scale, which can be useful for mapping tree species. In this paper, an approach was developed for tree species classification at the crown level, utilizing LiDAR data for individual tree crown delineation and morphological structure extraction, and Compact Airborne Spectrographic Imager (CASI) hyperspectral imagery for pure crown-scale spectral extraction. The method comprised four steps: 1) a weighted mean filtering method was developed to improve the accuracy of the smoothed Canopy Height Model (CHM) derived from LiDAR data; 2) a marker-controlled watershed segmentation algorithm was employed to delineate tree-level canopies from the CHM image, from which individual tree heights and crowns were calculated; 3) spectral features within 3 × 3 neighborhood regions centered on treetops detected by the treetop detection algorithm were derived from the spectrally normalized CASI imagery; 4) shape characteristics related to crown diameters and heights were established, and crown-level tree species were classified using the combination of spectral and shape characteristics. Analysis of the results suggests that the developed classification strategy (OA = 85.12 %, Kc = 0.90) performed better than the LiDAR-metrics method (OA = 79.86 %, Kc = 0.81) and the spectral-metrics method (OA = 71.26 %, Kc = 0.69) in terms of classification accuracy, indicating that advanced data processing and selection of sensitive features are critical for improving the accuracy of crown-level tree species classification.
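    A minimal sketch of steps 1-2 (CHM smoothing and marker-controlled watershed crown delineation) is given below. It substitutes a plain Gaussian filter for the paper's weighted mean filter, and the height threshold and minimum treetop spacing are illustrative.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def delineate_crowns(chm, min_height=2.0, min_distance=5):
        """Delineate tree crowns from a canopy height model (CHM).

        Treetops (local maxima above min_height) become markers; crowns
        grow outward/downhill by flooding the inverted CHM."""
        smoothed = ndi.gaussian_filter(chm, sigma=1.0)  # stand-in smoother
        peaks = peak_local_max(smoothed, min_distance=min_distance,
                               threshold_abs=min_height)
        markers = np.zeros(chm.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-smoothed, markers, mask=smoothed > min_height)
        return labels, peaks
    ```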

  8. Ensemble-based prediction of RNA secondary structures.

    PubMed

    Aghaeepour, Nima; Hoos, Holger H

    2013-04-24

    Accurate structure prediction methods play an important role in understanding RNA function. Energy-based, pseudoknot-free secondary structure prediction is one of the most widely used and versatile approaches, and improved methods for this task have received much attention over the past five years. Despite the impressive progress that has been achieved in this area, existing evaluations of the prediction accuracy achieved by various algorithms do not provide a comprehensive, statistically sound assessment. Furthermore, while there is increasing evidence that no prediction algorithm consistently outperforms all others, no work has been done to exploit the complementary strengths of multiple approaches. In this work, we present two contributions to the area of RNA secondary structure prediction. Firstly, we use state-of-the-art, resampling-based statistical methods together with a previously published and increasingly widely used dataset of high-quality RNA structures to conduct a comprehensive evaluation of existing RNA secondary structure prediction procedures. The results from this evaluation clarify the performance relationships among ten well-known energy-based pseudoknot-free RNA secondary structure prediction methods and clearly demonstrate the progress that has been achieved in recent years. Secondly, we introduce AveRNA, a generic and powerful method for combining a set of existing secondary structure prediction procedures into an ensemble-based method that achieves significantly higher prediction accuracies than any of its component procedures. Our new, ensemble-based method, AveRNA, improves the state of the art for energy-based, pseudoknot-free RNA secondary structure prediction by exploiting the complementary strengths of multiple existing prediction procedures, as demonstrated using a state-of-the-art statistical resampling approach. In addition, AveRNA allows an intuitive and effective control of the trade-off between false negative and false positive base pair predictions. Finally, AveRNA can make use of arbitrary sets of secondary structure prediction procedures and can therefore be used to leverage improvements in prediction accuracy offered by algorithms and energy models developed in the future. Our data, MATLAB software and a web-based version of AveRNA are publicly available at http://www.cs.ubc.ca/labs/beta/Software/AveRNA.
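    The abstract does not spell out AveRNA's combination rule, so the sketch below shows only a generic way to ensemble base-pair predictions: vote over pairs and keep those above a threshold, which directly exposes the false-positive/false-negative trade-off mentioned above. All names are illustrative.

    ```python
    from collections import Counter

    def ensemble_pairs(predictions, threshold=0.5):
        """Combine base-pair sets from several predictors by voting.

        predictions: list of sets of (i, j) base pairs, one per method.
        Raising `threshold` trades false positives for false negatives."""
        votes = Counter(p for pred in predictions for p in pred)
        need = threshold * len(predictions)
        kept = [p for p, n in votes.items() if n >= need]
        # Resolve conflicts greedily so each base joins at most one pair
        used, structure = set(), set()
        for pair in sorted(kept, key=lambda p: -votes[p]):
            i, j = pair
            if i not in used and j not in used:
                structure.add(pair)
                used.update(pair)
        return structure

    print(ensemble_pairs([{(1, 8), (2, 7)}, {(1, 8)}, {(1, 8), (3, 6)}]))
    ```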

  9. Attitude-correlated frames approach for a star sensor to improve attitude accuracy under highly dynamic conditions.

    PubMed

    Ma, Liheng; Zhan, Dejun; Jiang, Guangwen; Fu, Sihua; Jia, Hui; Wang, Xingshu; Huang, Zongsheng; Zheng, Jiaxing; Hu, Feng; Wu, Wei; Qin, Shiqiao

    2015-09-01

    The attitude accuracy of a star sensor decreases rapidly when star images become motion-blurred under dynamic conditions. Existing techniques concentrate on a single frame of star images to solve this problem, and improvements have been obtained to a certain extent. An attitude-correlated frames (ACF) approach, which exploits the attitude transforms between adjacent star image frames, is proposed to improve upon existing techniques. The attitude transforms between different star image frames are measured precisely by the strap-down gyro unit. With the ACF method, a much larger effective star image frame is obtained through the combination of adjacent frames. As a result, the degradation of attitude accuracy caused by motion blurring is compensated for. The improvement in attitude accuracy is approximately proportional to the square root of the number of correlated star image frames. Simulations and experimental results indicate that the ACF approach is effective in removing random noise and improving the attitude determination accuracy of the star sensor under highly dynamic conditions.
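    The square-root law quoted above is the familiar averaging effect; the quick numerical check below shows the standard deviation of an averaged centroid estimate shrinking roughly as 1/sqrt(N) with the number of combined frames. The noise level is synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sigma = 0.1  # per-frame centroid noise, pixels (illustrative)

    for n_frames in (1, 4, 16, 64):
        # Combining n gyro-aligned frames averages down the random noise
        est = rng.normal(0.0, sigma, (10000, n_frames)).mean(axis=1)
        print(n_frames, round(est.std(), 4))  # ~ sigma / sqrt(n_frames)
    ```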

  10. Knowledge Retrieval Solutions.

    ERIC Educational Resources Information Center

    Khan, Kamran

    1998-01-01

    Excalibur RetrievalWare offers true knowledge retrieval solutions. Its fundamental technologies, Adaptive Pattern Recognition Processing and Semantic Networks, have capabilities for knowledge discovery and knowledge management of full-text, structured and visual information. The software delivers a combination of accuracy, extensibility,…

  11. Method of the active contour for segmentation of bone systems on bitmap images

    NASA Astrophysics Data System (ADS)

    Vu, Hai Anh; Safonov, Roman A.; Kolesnikova, Anna S.; Kirillova, Irina V.; Kossovich, Leonid U.

    2018-02-01

    Within the active contour framework, an approach is developed that extracts the contour of an object during image segmentation. The approach is faster than the parametric active contour method while conceding nothing to it in accuracy. The proposed approach allows object contours to be extracted from images with high accuracy and more quickly than the parametric active contour method.
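    For orientation, the parametric baseline the authors compare against looks roughly like the following scikit-image snake, fitted from a circular initialization; the smoothing and elasticity parameters are assumptions borrowed from typical usage, not the paper's settings.

    ```python
    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def snake_segment(image, center, radius, n_points=200):
        """Fit a parametric active contour (snake) around a roughly
        circular object such as a bone cross-section.

        alpha penalizes stretching of the contour, beta penalizes bending."""
        s = np.linspace(0, 2 * np.pi, n_points)
        init = np.column_stack([center[0] + radius * np.sin(s),
                                center[1] + radius * np.cos(s)])
        smoothed = gaussian(image, sigma=3, preserve_range=True)
        return active_contour(smoothed, init, alpha=0.015, beta=10, gamma=0.001)
    ```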

  12. Accuracy Improvement of Neutron Nuclear Data on Minor Actinides

    NASA Astrophysics Data System (ADS)

    Harada, Hideo; Iwamoto, Osamu; Iwamoto, Nobuyuki; Kimura, Atsushi; Terada, Kazushi; Nakao, Taro; Nakamura, Shoji; Mizuyama, Kazuhito; Igashira, Masayuki; Katabuchi, Tatsuya; Sano, Tadafumi; Takahashi, Yoshiyuki; Takamiya, Koichi; Pyeon, Cheol Ho; Fukutani, Satoshi; Fujii, Toshiyuki; Hori, Jun-ichi; Yagi, Takahiro; Yashima, Hiroshi

    2015-05-01

    Improvement of the accuracy of neutron nuclear data for minor actinides (MAs) and long-lived fission products (LLFPs) is required for developing innovative nuclear systems that transmute these nuclides. To meet this requirement, the project entitled "Research and development for Accuracy Improvement of neutron nuclear data on Minor ACtinides (AIMAC)" was started as one of the "Innovative Nuclear Research and Development Program" projects in Japan in October 2013. The AIMAC project team is composed of researchers in four different fields: differential nuclear data measurement, integral nuclear data measurement, nuclear chemistry, and nuclear data evaluation. By integrating the forefront knowledge and techniques of these fields, the team aims at improving the accuracy of the data. The background and research plan of the AIMAC project are presented.

  13. Far infrared diagnostics of electron concentration in combustion MHD plasmas using interferometry and Faraday rotation

    NASA Astrophysics Data System (ADS)

    Kuzmenko, P. J.

    1985-12-01

    The plasma electrical conductivity is a key parameter in determining the efficiency of a magnetohydrodynamic (MHD) generator. Electromagnetic waves offer an accurate, non-intrusive probe: the electron concentration and mobility may be deduced from the refractive index and absorption coefficient measured with an interferometer. The first experiment used an HCOOH laser at 393.6 microns feeding a Michelson interferometer mounted around a combustor duct with open ports. Simultaneous measurements of positive ion density and plasma temperature made with a Langmuir probe and line-reversal apparatus verified the operation of the interferometer. With a magnetic field present, measurement of the polarization rotation and induced ellipticity in a wave traveling along the field provides information on the plasma conductivity. Compared to interferometry, diagnostic apparatus based on Faraday rotation offers simpler optics and requires far less stringent mechanical stability, at the cost of lower sensitivity. An advanced detection scheme using a polarizing beam splitter improved the sensitivity to a level comparable to that of interferometry. Interferometry is the preferred technique for small-scale, high-accuracy measurements, with Faraday rotation reserved for large systems or measurements within a working generator.
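    Both diagnostics rest on standard underdense-plasma (omega >> omega_p) scalings; only the proportionalities are shown below, since the prefactors depend on the unit system.

    ```latex
    % Interferometric phase shift accumulated along the line of sight
    \Delta\phi \;\propto\; \lambda \int n_e \, dl
    % Faraday rotation of a wave propagating along the magnetic field
    \theta_F \;\propto\; \lambda^{2} \int n_e \, B_{\parallel} \, dl
    ```

    The lambda versus lambda-squared dependence is one reason far-infrared wavelengths suit both techniques: long enough to give measurable phase shifts and rotation angles, yet short enough to remain well above the plasma frequency.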

  14. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    PubMed

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time-domain analysis. Two versions of PSO are used: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework seeks the combination of available features that offers good peak detection and a high classification rate in the conducted experiments. The evaluation results indicate that peak detection accuracy can be improved up to 99.90% and 98.59% for training and testing, respectively, compared to the framework without feature selection adaptation. Additionally, the framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model.
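    A compact sketch of the standard (synchronous) binary-PSO half of the framework follows; RA-PSO differs in its asynchronous updates and is not reproduced here. The fitness function is a toy correlation-based stand-in for the paper's peak-detection classifier, and all hyperparameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(mask, X, y):
        """Toy fitness: reward label-correlated features, penalize size."""
        idx = np.where(mask > 0.5)[0]
        if idx.size == 0:
            return 0.0
        corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in idx]
        return float(np.mean(corr)) - 0.01 * idx.size

    def binary_pso(X, y, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
        n_feat = X.shape[1]
        pos = (rng.random((n_particles, n_feat)) < 0.5).astype(float)
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_f = np.array([fitness(p, X, y) for p in pos])
        gbest = pbest[pbest_f.argmax()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            # Sigmoid transfer turns velocities into bit probabilities
            pos = (rng.random(vel.shape) < 1 / (1 + np.exp(-vel))).astype(float)
            f = np.array([fitness(p, X, y) for p in pos])
            better = f > pbest_f
            pbest[better], pbest_f[better] = pos[better], f[better]
            gbest = pbest[pbest_f.argmax()].copy()
        return gbest > 0.5, pbest_f.max()

    # Synthetic demo: two informative features out of ten
    X = rng.normal(size=(200, 10))
    y = (X[:, 2] + X[:, 7] > 0).astype(float)
    print(binary_pso(X, y))
    ```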

  15. How Urban Parks Offer Opportunities for Physical Activity in Dublin, Ireland.

    PubMed

    Burrows, Eve; O'Mahony, Margaret; Geraghty, Dermot

    2018-04-21

    Parks are an important part of the urban fabric of cities. They offer people the opportunity to connect with nature, engage in physical activity, find a haven away from the city noise, or spend time alone or with family and friends. This study examines the relative importance of park and park-visit characteristics for 865 survey participants in Dublin, Ireland. The data are analyzed using a multinomial logistic regression model, which can distinguish the relative importance of attributes. The model results demonstrate an improvement over proportional-by-chance accuracy, indicating that the model is useful. The results suggest that when and why individuals go to the park, along with the proximity of their residence to the park, influence visit frequency more than their age and gender and more than their impression of the sound levels in the park. The contribution of the results, in terms of their potential usefulness to planners, suggests that the priority should be on the provision of park space close to residential areas, so that individuals can engage in activities such as walking and relaxation, and that the quality of that space, in the context of noise levels at least, is less important.

  16. Note: An improved calibration system with phase correction for electronic transformers with digital output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Han-miao, E-mail: chenghanmiao@hust.edu.cn; Li, Hong-bin, E-mail: lihongbin@hust.edu.cn; State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Wuhan 430074

    The existing electronic transformer calibration systems employing data acquisition cards cannot satisfy some practical applications, because the calibration systems have phase measurement errors when they work in the mode of receiving external synchronization signals. This paper proposes an improved calibration system scheme with phase correction to improve the phase measurement accuracy. We employ NI PCI-4474 to design a calibration system, and the system has the potential to receive external synchronization signals and reach extremely high accuracy classes. Accuracy verification has been carried out in the China Electric Power Research Institute, and results demonstrate that the system surpasses the accuracy class 0.05. Furthermore, this system has been used to test the harmonics measurement accuracy of all-fiber optical current transformers. In the same process, we have used an existing calibration system, and a comparison of the test results is presented. The system after improvement is suitable for the intended applications.

  17. Impact of ambiguity resolution and application of transformation parameters obtained by regional GNSS network in Precise Point Positioning

    NASA Astrophysics Data System (ADS)

    Gandolfi, S.; Poluzzi, L.; Tavasci, L.

    2012-12-01

    Precise Point Positioning (PPP) is one of the possible approaches for GNSS data processing. This technique is faster and more flexible than differenced approaches and constitutes a reliable method for accurate positioning of remote GNSS stations, even in remote areas such as Antarctica. Until a few years ago, one of the major limits of the method was the impossibility of resolving ambiguities as integers, but nowadays several methods are available to address this. The first software package permitting a PPP solution was GIPSY OASIS, developed and maintained by JPL (NASA). JPL also produces orbits and files ready to be used with GIPSY; recently, using these products, it became possible to resolve ambiguities, improving the stability of solutions. PPP estimates positions in the reference frame of the orbits (IGS); when coordinates in other reference frames, such as ITRF, are needed, a transformation must be applied. Among its products, JPL offers, for each day, a global 7-parameter transformation that locates the solution in the ITRF reference frame. In some cases it is also possible to set up a customized process and obtain analogous parameters using a local/regional network of stations whose coordinates are available in the desired reference frame. In this work, accuracy tests were carried out comparing different PPP solutions obtained with the same software package (GIPSY) while varying the ambiguity resolution and the use of global versus regional transformation parameters. Two test areas were considered, the first located in Antarctica and the second in Italy. The aim of the work is to evaluate the impact of ambiguity resolution and of local/regional transformation parameters on the final solutions. Tests showed that ambiguity resolution improves precision, especially in the east component, with a scatter reduction of about 8%; that global transformation parameters improve accuracy by about 59%, 63% and 29% in the three components (N, E, U); and that regional transformation parameters improve accuracy by 67%, 71% and 53%. (Figure: example of the impact of global vs. regional transformation parameters on a GPS time series.)
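    For context, the 7-parameter (Helmert) transformation mentioned above maps coordinates between reference frames via three translations, three small rotations, and a scale factor. A small-angle sketch is below; the parameter values are purely illustrative, not JPL's daily products.

    ```python
    import numpy as np

    def helmert7(xyz, t, rot, scale_ppb):
        """Small-angle 7-parameter transformation of ECEF coordinates.

        xyz: (N, 3) positions (m); t: translations (m);
        rot: (rx, ry, rz) small rotations (rad); scale_ppb: parts per billion."""
        rx, ry, rz = rot
        s = scale_ppb * 1e-9
        R = np.array([[0.0, -rz,  ry],
                      [ rz, 0.0, -rx],
                      [-ry,  rx, 0.0]])
        return xyz + t + s * xyz + xyz @ R.T

    # Illustrative parameters only
    pts = np.array([[4027894.0, 307045.0, 4919474.0]])
    print(helmert7(pts, t=np.array([0.010, -0.020, 0.005]),
                   rot=(5e-9, -2e-9, 8e-9), scale_ppb=1.2))
    ```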

  18. Improved Accuracy of Continuous Glucose Monitoring Systems in Pediatric Patients with Diabetes Mellitus: Results from Two Studies.

    PubMed

    Laffel, Lori

    2016-02-01

    This study was designed to evaluate accuracy, performance, and safety of the Dexcom (San Diego, CA) G4® Platinum continuous glucose monitoring (CGM) system (G4P) compared with the Dexcom G4 Platinum with Software 505 algorithm (SW505) when used as adjunctive management to blood glucose (BG) monitoring over a 7-day period in youth, 2-17 years of age, with diabetes. Youth wore either one or two sensors placed on the abdomen or upper buttocks for 7 days, calibrating the device twice daily with a uniform BG meter. Participants had one in-clinic session on Day 1, 4, or 7, during which fingerstick BG measurements (self-monitoring of blood glucose [SMBG]) were obtained every 30 ± 5 min for comparison with CGM, and in youth 6-17 years of age, reference YSI glucose measurements were obtained from arterialized venous blood collected every 15 ± 5 min for comparison with CGM. The sensor was removed by the participant/family after 7 days. In comparison of 2,922 temporally paired points of CGM with the reference YSI measurement for G4P and 2,262 paired points for SW505, the mean absolute relative difference (MARD) was 17% for G4P versus 10% for SW505 (P < 0.0001). In comparison of 16,318 temporally paired points of CGM with SMBG for G4P and 4,264 paired points for SW505, MARD was 15% for G4P versus 13% for SW505 (P < 0.0001). Similarly, error grid analyses indicated superior performance with SW505 compared with G4P in comparison of CGM with YSI and CGM with SMBG results, with greater percentages of SW505 results falling within error grid Zone A or the combined Zones A plus B. There were no serious adverse events or device-related serious adverse events for either the G4P or the SW505, and there was no sensor breakoff. The updated algorithm offers substantial improvements in accuracy and performance in pediatric patients with diabetes. Use of CGM with improved performance has potential to increase glucose time in range and improve glycemic outcomes for youth.

  19. Development of a three-dimensional multistage inverse design method for aerodynamic matching of axial compressor blading

    NASA Astrophysics Data System (ADS)

    van Rooij, Michael P. C.

    Current turbomachinery design systems increasingly rely on multistage Computational Fluid Dynamics (CFD) as a means to assess performance of designs. However, design weaknesses attributed to improper stage matching are addressed using often ineffective strategies involving a costly iterative loop between blading modification, revision of design intent, and evaluation of aerodynamic performance. A design methodology is presented which greatly improves the process of achieving design-point aerodynamic matching. It is based on a three-dimensional viscous inverse design method which generates the blade camber surface based on prescribed pressure loading, thickness distribution and stacking line. This inverse design method has been extended to allow blading analysis and design in a multi-blade row environment. Blade row coupling was achieved through a mixing plane approximation. Parallel computing capability in the form of MPI has been implemented to reduce the computational time for multistage calculations. Improvements have been made to the flow solver to reach the level of accuracy required for multistage calculations. These include inclusion of heat flux, temperature-dependent treatment of viscosity, and improved calculation of stress components and artificial dissipation near solid walls. A validation study confirmed that the obtained accuracy is satisfactory at design point conditions. Improvements have also been made to the inverse method to increase robustness and design fidelity. These include the possibility to exclude spanwise sections of the blade near the endwalls from the design process, and a scheme that adjusts the specified loading area for changes resulting from the leading and trailing edge treatment. Furthermore, a pressure loading manager has been developed. Its function is to automatically adjust the pressure loading area distribution during the design calculation in order to achieve a specified design objective. Possible objectives are overall mass flow and compression ratio, and radial distribution of exit flow angle. To supplement the loading manager, mass flow inlet and exit boundary conditions have been implemented. Through appropriate combination of pressure or mass flow inflow/outflow boundary conditions and loading manager objectives, increased control over the design intent can be obtained. The three-dimensional multistage inverse design method with pressure loading manager was demonstrated to offer greatly enhanced blade row matching capabilities. Multistage design allows for simultaneous design of blade rows in a mutually interacting environment, which permits the redesigned blading to adapt to changing aerodynamic conditions resulting from the redesign. This ensures that the obtained blading geometry and performance implied by the prescribed pressure loading distribution are consistent with operation in the multi-blade row environment. The developed methodology offers high aerodynamic design quality and productivity, and constitutes a significant improvement over existing approaches used to address design-point aerodynamic matching.

  20. Analytical-Based Partial Volume Recovery in Mouse Heart Imaging

    NASA Astrophysics Data System (ADS)

    Dumouchel, Tyler; deKemp, Robert A.

    2011-02-01

    Positron emission tomography (PET) is a powerful imaging modality that has the ability to yield quantitative images of tracer activity. Physical phenomena such as photon scatter, photon attenuation, random coincidences and spatial resolution limit quantification potential and must be corrected to preserve the accuracy of reconstructed images. This study focuses on correcting the partial volume effects that arise in mouse heart imaging when resolution is insufficient to resolve the true tracer distribution in the myocardium. The correction algorithm is based on fitting 1D profiles through the myocardium in gated PET images to derive myocardial contours along with blood, background and myocardial activity. This information is interpolated onto a 2D grid and convolved with the tomograph's point spread function to derive regional recovery coefficients enabling partial volume correction. The point spread function was measured by placing a line source inside a small animal PET scanner. PET simulations were created based on noise properties measured from a reconstructed PET image and on the digital MOBY phantom. The algorithm can estimate the myocardial activity to within 5% of the truth when different wall thicknesses, backgrounds and noise properties are encountered that are typical of healthy FDG mouse scans. The method also significantly improves partial volume recovery in simulated infarcted tissue. The algorithm offers a practical solution to the partial volume problem without the need for co-registered anatomic images and offers a basis for improved quantitative 3D heart imaging.
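    The core idea reduces, in one dimension, to blurring a model wall profile with the scanner PSF and reading off how much peak activity survives. The toy sketch below does exactly that; the wall width and PSF FWHM are illustrative of mouse imaging, not the study's measured values.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    def recovery_coefficient(wall_mm, fwhm_mm, dx=0.05):
        """Peak activity recovery for a boxcar 'wall' blurred by a
        Gaussian PSF: returns max(blurred) / max(true)."""
        x = np.arange(-20, 20, dx)                      # spatial grid (mm)
        profile = (np.abs(x) < wall_mm / 2).astype(float)
        sigma = fwhm_mm / 2.355 / dx                    # FWHM -> sigma (samples)
        blurred = gaussian_filter1d(profile, sigma)
        return blurred.max() / profile.max()

    # ~1 mm mouse LV wall imaged at ~1.8 mm FWHM resolution (illustrative)
    print(recovery_coefficient(1.0, 1.8))  # well below 1: partial volume loss
    ```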

  1. Analysis of the secrecy of the running key in quantum encryption channels using coherent states of light

    NASA Astrophysics Data System (ADS)

    Nikulin, Vladimir V.; Hughes, David H.; Malowicki, John; Bedi, Vijit

    2015-05-01

    Free-space optical communication channels offer secure links with low probability of interception and detection. Despite their point-to-point topology, additional security features may be required in privacy-critical applications. Encryption can be achieved at the physical layer by using quantized values of photons, which makes exploitation of such quantum communication links extremely difficult. One example of such technology is keyed communication in quantum noise, a novel quantum modulation protocol that offers ultra-secure communication with competitive performance characteristics. Its utilization relies on specific coherent measurements to decrypt the signal. The process of measurements is complicated by the inherent and irreducible quantum noise of coherent states. This problem is different from traditional laser communication with coherent detection; therefore continuous efforts are being made to improve the measurement techniques. Quantum-based encryption systems that use the phase of the signal as the information carrier impose aggressive requirements on the accuracy of the measurements when an unauthorized party attempts intercepting the data stream. Therefore, analysis of the secrecy of the data becomes extremely important. In this paper, we present the results of a study that had a goal of assessment of potential vulnerability of the running key. Basic results of the laboratory measurements are combined with simulation studies and statistical analysis that can be used for both conceptual improvement of the encryption approach and for quantitative comparison of secrecy of different quantum communication protocols.

  2. Quantifying Network Dynamics and Information Flow Across Chinese Social Media During the African Ebola Outbreak.

    PubMed

    Feng, Shihui; Hossain, Liaquat; Crawford, John W; Bossomaier, Terry

    2018-02-01

    Social media provides us with a new platform on which to explore how the public responds to disasters and, of particular importance, how they respond to the emergence of infectious diseases such as Ebola. Provided it is appropriately informed, social media offers a potentially powerful means of supporting both early detection and effective containment of communicable diseases, which is essential for improving disaster medicine and public health preparedness. The 2014 West African Ebola outbreak is a particularly relevant contemporary case study on account of the large number of annual arrivals from Africa, including Chinese employees engaged in projects in Africa. Weibo (Weibo Corp, Beijing, China) is China's most popular social media platform, with more than 2 billion users and over 300 million daily posts, and offers great opportunity to monitor early detection and promotion of public health awareness. We present a proof-of-concept study of a subset of Weibo posts during the outbreak demonstrating potential and identifying priorities for improving the efficacy and accuracy of information dissemination. We quantify the evolution of the social network topology within Weibo relating to the efficacy of information sharing. We show how relatively few nodes in the network can have a dominant influence over both the quality and quantity of the information shared. These findings make an important contribution to disaster medicine and public health preparedness from theoretical and methodological perspectives for dealing with epidemics. (Disaster Med Public Health Preparedness. 2018;12:26-37).

  3. Conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information.

    PubMed

    Baxter, Suzanne Domel; Smith, Albert F; Hardin, James W; Nichols, Michele D

    2007-04-01

    Validation study data are used to illustrate that conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews (ie, time) depend on the analytic approach for comparing reported and reference information: conventional, which disregards accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten), and amounts as corresponding or overreported. Children were observed eating school meals on 1 day (n=12), or 2 (n=13) or 3 (n=79) nonconsecutive days separated by ≥25 days, and interviewed in the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (ie, protein, carbohydrate, and fat), and compared. Main outcome measures: for energy and each macronutrient, report rates (reported/reference), correspondence rates (genuine accuracy measures), and inflation ratios (error measures). Statistical analyses: mixed-model analyses. Using the conventional approach for analyzing energy and macronutrients, report rates did not vary systematically over interviews (all four P values >0.61). Using the reporting-error-sensitive approach for analyzing energy and macronutrients, correspondence rates increased over interviews (all four P values <0.04), indicating that reporting accuracy improved over time; inflation ratios decreased, although not significantly, over interviews, also suggesting that reporting accuracy improved over time. Correspondence rates were lower than report rates, indicating that reporting accuracy was worse than implied by conventional measures. When analyzed using the reporting-error-sensitive approach, children's dietary reporting accuracy for energy and macronutrients improved over time, but the conventional approach masked improvements and overestimated accuracy. The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients.

  4. Conclusions about children’s reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information

    PubMed Central

    Baxter, Suzanne Domel; Smith, Albert F.; Hardin, James W.; Nichols, Michele D.

    2008-01-01

    Objective Validation-study data are used to illustrate that conclusions about children’s reporting accuracy for energy and macronutrients over multiple interviews (ie, time) depend on the analytic approach for comparing reported and reference information—conventional, which disregards accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten), and amounts as corresponding or overreported. Subjects and design Children were observed eating school meals on one day (n = 12), or two (n = 13) or three (n = 79) nonconsecutive days separated by ≥25 days, and interviewed in the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (protein, carbohydrate, fat), and compared. Main outcome measures For energy and each macronutrient: report rates (reported/reference), correspondence rates (genuine accuracy measures), inflation ratios (error measures). Statistical analyses Mixed-model analyses. Results Using the conventional approach for analyzing energy and macronutrients, report rates did not vary systematically over interviews (Ps > .61). Using the reporting-error-sensitive approach for analyzing energy and macronutrients, correspondence rates increased over interviews (Ps < .04), indicating that reporting accuracy improved over time; inflation ratios decreased, although not significantly, over interviews, also suggesting that reporting accuracy improved over time. Correspondence rates were lower than report rates, indicating that reporting accuracy was worse than implied by conventional measures. Conclusions When analyzed using the reporting-error-sensitive approach, children’s dietary reporting accuracy for energy and macronutrients improved over time, but the conventional approach masked improvements and overestimated accuracy. Applications The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients. PMID:17383265

  5. Improving coding accuracy in an academic practice.

    PubMed

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small group case review, and large group discussion. Main outcome measures: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects who were able to participate longitudinally. Statistical tests used: average coding accuracy for the population; paired t test to assess improvement between 2 intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t(24)=-0.127, P=.90. Didactic teaching and small group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.

  6. Diagnostic reasoning and underlying knowledge of students with preclinical patient contacts in PBL.

    PubMed

    Diemers, Agnes D; van de Wiel, Margje W J; Scherpbier, Albert J J A; Baarveld, Frank; Dolmans, Diana H J M

    2015-12-01

    Medical experts have access to elaborate and integrated knowledge networks consisting of biomedical and clinical knowledge. These coherent knowledge networks enable them to generate more accurate diagnoses in a shorter time. However, students' knowledge networks are less organised and students have difficulties linking theory and practice and transferring acquired knowledge. Therefore we wanted to explore the development and transfer of knowledge of third-year preclinical students on a problem-based learning (PBL) course with real patient contacts. Before and after a 10-week PBL course with real patients, third-year medical students were asked to think out loud while diagnosing four types of paper patient problems (two course cases and two transfer cases), and explain the underlying pathophysiological mechanisms of the patient features. Diagnostic accuracy and time needed to think through the cases were measured. The think-aloud protocols were transcribed verbatim and different types of knowledge were coded and quantitatively analysed. The written pathophysiological explanations were translated into networks of concepts. Both the concepts and the links between concepts in students' networks were compared to model networks. Over the course diagnostic accuracy increased, case-processing time decreased, and students used less biomedical and clinical knowledge during diagnostic reasoning. The quality of the pathophysiological explanations increased: the students used more concepts, especially more model concepts, and they used fewer wrong concepts and links. The findings differed across course and transfer cases. The effects were generally less strong for transfer cases. Students' improved diagnostic accuracy and the improved quality of their knowledge networks suggest that integration of biomedical and clinical knowledge took place during a 10-week course. The differences between course and transfer cases demonstrate that transfer is complex and time-consuming. We therefore suggest offering students many varied patient contacts with the same underlying pathophysiological mechanism and encouraging students to link biomedical and clinical knowledge. © 2015 John Wiley & Sons Ltd.

  7. Cardiac gating with a pulse oximeter for dual-energy imaging

    NASA Astrophysics Data System (ADS)

    Shkumat, N. A.; Siewerdsen, J. H.; Dhanantwari, A. C.; Williams, D. B.; Paul, N. S.; Yorkston, J.; Van Metter, R.

    2008-11-01

    The development and evaluation of a prototype cardiac gating system for double-shot dual-energy (DE) imaging is described. By acquiring both low- and high-kVp images during the resting phase of the cardiac cycle (diastole), heart misalignment between images can be reduced, thereby decreasing the magnitude of cardiac motion artifacts. For this initial implementation, a fingertip pulse oximeter was employed to measure the peripheral pulse waveform ('plethysmogram'), offering potential logistic, cost and workflow advantages compared to an electrocardiogram. A gating method was developed that accommodates temporal delays due to physiological pulse propagation, oximeter waveform processing and the imaging system (software, filter-wheel, anti-scatter Bucky-grid and flat-panel detector). Modeling the diastolic period allowed the calculation of an implemented delay, t_imp, required to trigger correctly during diastole at any patient heart rate (HR). The model suggests a triggering scheme characterized by two HR regimes, separated by a threshold, HR_thresh. For rates at or below HR_thresh, sufficient time exists to expose on the same heartbeat as the plethysmogram pulse [t_imp(HR) = 0]. Above HR_thresh, a characteristic t_imp(HR) delays exposure to the subsequent heartbeat, accounting for all fixed and variable system delays. Performance was evaluated in terms of accuracy and precision of diastole-trigger coincidence and quantitative evaluation of artifact severity in gated and ungated DE images. Initial implementation indicated 85% accuracy in diastole-trigger coincidence. Through the identification of an improved HR estimation method (modified temporal smoothing of the oximeter waveform), trigger accuracy of 100% could be achieved with improved precision. To quantify the effect of the gating system on DE image quality, human observer tests were conducted to measure the magnitude of cardiac artifact under conditions of successful and unsuccessful diastolic gating. Six observers independently measured the artifact in 111 patient DE images. The data indicate that successful diastolic gating results in a statistically significant reduction (p < 0.001) in the magnitude of cardiac motion artifact, with residual artifact attributed primarily to gross patient motion.
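    A schematic of the two-regime trigger logic is sketched below. The threshold, fixed system delay, and diastolic fraction are invented placeholders to show the shape of t_imp(HR), not the paper's calibrated values.

    ```python
    def implemented_delay(hr_bpm, hr_thresh=75.0, t_fixed=0.35,
                          diastole_frac=0.55):
        """Two-regime trigger delay t_imp(HR) (all constants illustrative).

        At or below hr_thresh there is time to expose within the same
        beat (t_imp = 0); above it, exposure is delayed so that it lands
        in diastole of the next beat, net of fixed system delays."""
        period = 60.0 / hr_bpm  # cardiac period (s)
        if hr_bpm <= hr_thresh:
            return 0.0
        return max(period + (1.0 - diastole_frac) * period - t_fixed, 0.0)

    for hr in (60, 75, 90, 110):
        print(hr, round(implemented_delay(hr), 3))
    ```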

  8. An on-line calibration algorithm for external parameters of visual system based on binocular stereo cameras

    NASA Astrophysics Data System (ADS)

    Wang, Liqiang; Liu, Zhen; Zhang, Zhonghua

    2014-11-01

    Stereo vision is key to visual measurement, robot vision, and autonomous navigation. Before operating a stereo vision system, the intrinsic parameters of each camera and the external parameters of the system must be calibrated. In engineering practice, the intrinsic parameters remain unchanged after camera calibration, but the positional relationship between the cameras can change because of vibration, knocks and pressure in environments such as railways or motor workshops. Especially for large baselines, even minute changes in translation or rotation can affect the epipolar geometry and scene triangulation to such a degree that the visual system becomes unusable. A technology providing both real-time checking and on-line recalibration of the external parameters of a stereo system therefore becomes particularly important. This paper presents an on-line method for checking and recalibrating the positional relationship between stereo cameras. In epipolar geometry, the external parameters of the cameras can be obtained by factorization of the fundamental matrix, which offers a way to compute them without any special targets: if the intrinsic camera parameters are known, the external parameters of the system can be calculated from a number of matched points. The process is: (i) estimating the fundamental matrix from feature point correspondences; (ii) computing the essential matrix from the fundamental matrix; (iii) obtaining the external parameters by decomposition of the essential matrix. In the fundamental matrix estimation step, traditional methods are sensitive to noise and cannot guarantee accuracy. We consider the distribution of features in actual scene images and introduce a regionally weighted normalization algorithm to improve the accuracy of the fundamental matrix estimation. In contrast to traditional algorithms, experiments on simulated data show that the method improves the robustness and accuracy of fundamental matrix estimation. Finally, an experiment computing the relative pose of a pair of stereo cameras demonstrates the accuracy of the algorithm.
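    A minimal OpenCV sketch of steps (i)-(iii) follows. The paper's regionally weighted normalization is not reproduced; RANSAC stands in as the robust estimator, and pose recovery assumes the two cameras have similar intrinsics.

    ```python
    import numpy as np
    import cv2

    def stereo_extrinsics(pts1, pts2, K1, K2):
        """Recover R, t between two calibrated cameras from matches.

        pts1, pts2: (N, 2) float arrays of matched pixel coordinates."""
        # (i) Fundamental matrix, robust to outliers via RANSAC
        F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC,
                                            ransacReprojThreshold=1.0)
        # (ii) Essential matrix from F and the known intrinsics
        E = K2.T @ F @ K1
        # (iii) Decompose E, keeping the (R, t) that places points in
        # front of both cameras; t is recovered only up to scale
        mask = inliers.ravel().astype(bool)
        _, R, t, _ = cv2.recoverPose(E, pts1[mask], pts2[mask], K1)
        return R, t
    ```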

  9. Application of Numerical Integration and Data Fusion in Unit Vector Method

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2012-01-01

    The Unit Vector Method (UVM) is a family of orbit determination methods designed at Purple Mountain Observatory (PMO) that has been applied extensively. It obtains the conditional equations for different kinds of data by projecting the basic equation onto different unit vectors, and it lends itself to weighting different kinds of data, so that high-precision data can play a major role in orbit determination and accuracy improves markedly. The improved UVM (PUVM2) extended the UVM from initial orbit determination to orbit improvement, unified the two dynamically, and further improved precision and efficiency. In this thesis, further research has been carried out based on the UVM. Firstly, as observing methods and techniques have improved, the types and precision of observational data have improved substantially, demanding corresponding improvements in orbit determination; analytical perturbation theory cannot meet this requirement. Numerical integration for calculating the perturbations has therefore been introduced into the UVM, so that the accuracy of the dynamical model matches the accuracy of the real data, and the condition equations of the UVM have been modified accordingly, further improving orbit determination accuracy. Secondly, a data fusion method has been introduced into the UVM. The convergence mechanism and the defects of the weighting strategy in the original UVM have been clarified; in the new method, the calculation of the approximate state transition matrix is simplified and the weighting strategy is improved for data of different dimensions and different precision. Orbit determination results from simulated and real data show that this work is effective: (1) with numerical integration introduced into the UVM, orbit determination accuracy improves markedly and suits the high-accuracy data of available observing apparatus, while computation is also markedly faster than classical differential improvement with numerical integration; (2) with data fusion introduced into the UVM, weights are distributed rationally according to the accuracy of the different kinds of data, all data are fully used, and the new method also exhibits good numerical stability.
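    The abstract does not name a specific integrator or force model; the sketch below shows the flavor of numerical propagation involved, classical fourth-order Runge-Kutta over two-body dynamics plus the dominant J2 oblateness term. The physical constants are standard Earth values; the initial state is illustrative.

    ```python
    import numpy as np

    MU = 3.986004418e14        # Earth's GM (m^3/s^2)
    J2, RE = 1.08263e-3, 6378137.0

    def accel(r):
        """Two-body acceleration plus the J2 perturbation."""
        x, y, z = r
        rn = np.linalg.norm(r)
        a = -MU * r / rn**3
        k = 1.5 * J2 * MU * RE**2 / rn**5
        a += k * np.array([x * (5 * z**2 / rn**2 - 1),
                           y * (5 * z**2 / rn**2 - 1),
                           z * (5 * z**2 / rn**2 - 3)])
        return a

    def rk4_step(r, v, dt):
        """One classical RK4 step of the orbital equations of motion."""
        f = lambda s: np.hstack([s[3:], accel(s[:3])])
        s = np.hstack([r, v])
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        return s[:3], s[3:]

    # One 10 s step from a ~700 km circular orbit (illustrative)
    r1, v1 = rk4_step(np.array([RE + 700e3, 0.0, 0.0]),
                      np.array([0.0, 7504.0, 0.0]), 10.0)
    ```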

  10. Accuracy improvement of multimodal measurement of speed of sound based on image processing

    NASA Astrophysics Data System (ADS)

    Nitta, Naotaka; Kaya, Akio; Misawa, Masaki; Hyodo, Koji; Numano, Tomokazu

    2017-07-01

    Since the speed of sound (SOS) reflects tissue characteristics and is expected to serve as an evaluation index of elasticity and water content, noninvasive measurement of SOS is eagerly anticipated. However, it is difficult to measure the SOS using an ultrasound device alone. Therefore, we have presented a noninvasive SOS measurement method using ultrasound (US) and magnetic resonance (MR) images, in which the longitudinal SOS is determined from a thickness measurement in the MR image and a time-of-flight (TOF) measurement in the US image. The accuracy of the SOS measurement is affected by the accuracy of image registration and by the accuracy of the thickness measurements in the MR and US images. In this study, we address the latter and present an image-processing-based method for improving the accuracy of the thickness measurement. The method was investigated using in vivo data obtained from a tissue-engineered cartilage implanted in the back of a rat, where tissue boundaries are unclear.

  11. Lunar Reconnaissance Orbiter Orbit Determination Accuracy Analysis

    NASA Technical Reports Server (NTRS)

    Slojkowski, Steven E.

    2014-01-01

    Results from operational OD produced by the NASA Goddard Flight Dynamics Facility for the LRO nominal and extended mission are presented. During the LRO nominal mission, when LRO flew in a low circular orbit, orbit determination requirements were met nearly 100% of the time. When the extended mission began, LRO returned to a more elliptical frozen orbit where gravity and other modeling errors caused numerous violations of mission accuracy requirements. Prediction accuracy is particularly challenged during periods when LRO is in full-Sun. A series of improvements to LRO orbit determination are presented, including implementation of new lunar gravity models, improved spacecraft solar radiation pressure modeling using a dynamic multi-plate area model, a shorter orbit determination arc length, and a constrained plane method for estimation. The analysis presented in this paper shows that updated lunar gravity models improved accuracy in the frozen orbit, and a multiplate dynamic area model improves prediction accuracy during full-Sun orbit periods. Implementation of a 36-hour tracking data arc and plane constraints during edge-on orbit geometry also provide benefits. A comparison of the operational solutions to precision orbit determination solutions shows agreement on a 100- to 250-meter level in definitive accuracy.

  12. High-precision radiometric tracking for planetary approach and encounter in the inner solar system

    NASA Technical Reports Server (NTRS)

    Christensen, C. S.; Thurman, S. W.; Davidson, J. M.; Finger, M. H.; Folkner, W. M.

    1989-01-01

    The benefits of improved radiometric tracking data have been studied for planetary approach within the inner Solar System using the Mars Rover Sample Return trajectory as a model. It was found that the benefit of improved data to approach and encounter navigation was highly dependent on the a priori uncertainties assumed for several non-estimated parameters, including those for frame-tie, Earth orientation, troposphere delay, and station locations. With these errors at their current levels, navigational performance was found to be insensitive to enhancements in data accuracy. However, when expected improvements in these errors are modeled, performance with current-accuracy data significantly improves, with substantial further improvements possible with enhancements in data accuracy.

  13. IMPROVING THE ACCURACY OF HISTORIC SATELLITE IMAGE CLASSIFICATION BY COMBINING LOW-RESOLUTION MULTISPECTRAL DATA WITH HIGH-RESOLUTION PANCHROMATIC DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Getman, Daniel J

    2008-01-01

    Many attempts to observe changes in terrestrial systems over time would be significantly enhanced if it were possible to improve the accuracy of classifications of low-resolution historic satellite data. In an effort to examine improving the accuracy of historic satellite image classification by combining satellite and air photo data, two experiments were undertaken in which low-resolution multispectral data and high-resolution panchromatic data were combined and then classified using the ECHO spectral-spatial image classification algorithm and the Maximum Likelihood technique. The multispectral data consisted of 6 multispectral channels (30-meter pixel resolution) from Landsat 7. These data were augmented with panchromatic data (15m pixel resolution) from Landsat 7 in the first experiment, and with a mosaic of digital aerial photography (1m pixel resolution) in the second. The addition of the Landsat 7 panchromatic data provided a significant improvement in the accuracy of classifications made using the ECHO algorithm. Although the inclusion of aerial photography provided an improvement in accuracy, this improvement was only statistically significant at a 40-60% level. These results suggest that once error levels associated with combining aerial photography and multispectral satellite data are reduced, this approach has the potential to significantly enhance the precision and accuracy of classifications made using historic remotely sensed data, as a way to extend the time range of efforts to track temporal changes in terrestrial systems.

  14. Nanotechnology applications in thoracic surgery

    PubMed Central

    Hofferberth, Sophie C.; Grinstaff, Mark W.; Colson, Yolonda L.

    2016-01-01

    Nanotechnology is an emerging, rapidly evolving field with the potential to significantly impact care across the full spectrum of cancer therapy. Of note, several recent nanotechnological advances show particular promise to improve outcomes for thoracic surgical patients. A variety of nanotechnologies are described that offer possible solutions to existing challenges encountered in the detection, diagnosis and treatment of lung cancer. Nanotechnology-based imaging platforms have the ability to improve the surgical care of patients with thoracic malignancies through technological advances in intraoperative tumour localization, lymph node mapping and accuracy of tumour resection. Moreover, nanotechnology is poised to revolutionize adjuvant lung cancer therapy. Common chemotherapeutic drugs, such as paclitaxel, docetaxel and doxorubicin, are being formulated using various nanotechnologies to improve drug delivery, whereas nanoparticle (NP)-based imaging technologies can monitor the tumour microenvironment and facilitate molecularly targeted lung cancer therapy. Although early nanotechnology-based delivery systems show promise, the next frontier in lung cancer therapy is the development of ‘theranostic’ multifunctional NPs capable of integrating diagnosis, drug monitoring, tumour targeting and controlled drug release into various unifying platforms. This article provides an overview of key existing and emerging nanotechnology platforms that may find clinical application in thoracic surgery in the near future. PMID:26843431

  15. Accelerated treatment protocols: full arch treatment with interim and definitive prostheses.

    PubMed

    Drago, Carl

    2012-01-01

    With the advent of titanium root-form implants and osseointegration, dental treatment has undergone a metamorphosis in recent years. These new techniques enable dentists to provide anchorage for various kinds of prostheses that improve masticatory function, esthetics, and comfort for patients. Implant treatment protocols have improved with respect to implant macro- and micro-geometries, surgical and prosthetic components, and treatment times. Over the past 20 years, immediate occlusal function (also known as loading) has been established as a predictable treatment modality, provided certain specific criteria are met. In many cases, edentulous patients, crippled by the loss of their teeth, can undergo outpatient surgical and prosthetic procedures and return to near-normal masticatory function, sometimes after only one day of surgical and prosthetic treatment. This treatment option is also available for patients with advanced, generalized periodontal disease. Computer-assisted design/computer-assisted manufacturing (CAD/CAM) has transformed how dental prostheses are made, offering improved accuracy, longevity, and biocompatibility, along with reduced labor costs and fewer complications than casting technologies. This article reviews the principles associated with immediate occlusal loading and illustrates one specific accelerated prosthodontic treatment protocol used to treat edentulous and partially edentulous patients with interim and definitive prostheses.

  16. Swing arm profilometer: high accuracy testing for large reaction-bonded silicon carbide optics with a capacitive probe

    NASA Astrophysics Data System (ADS)

    Xiong, Ling; Luo, Xiao; Hu, Hai-xiang; Zhang, Zhi-yu; Zhang, Feng; Zheng, Li-gong; Zhang, Xue-jun

    2017-08-01

    A feasible way to improve the manufacturing efficiency of large reaction-bonded silicon carbide optics is to increase the processing accuracy in the grinding stage before polishing, which requires high accuracy metrology. A swing arm profilometer (SAP) has been used to measure large optics during the grinding stage. A method has been developed for improving the measurement accuracy of SAP by using a capacitive probe and implementing calibrations. The experimental result, compared with an interferometer test, shows an accuracy of 0.068 μm root-mean-square (RMS), and maps reconstructed from 37 low-order Zernike terms show an accuracy of 0.048 μm RMS, demonstrating a powerful capability to provide a major input to high-precision grinding.
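
    Reporting surface accuracy as the RMS of a low-order fit, as in the 37-term Zernike decomposition above, amounts to a least-squares projection onto a basis. The sketch below uses a few simple polynomial terms over a unit disk as stand-ins for Zernike polynomials; the surface map and noise level are synthetic assumptions.

```python
# Sketch: expressing a measured surface-error map in a low-order basis by
# least squares and reporting RMS figures, analogous to the Zernike
# decomposition above. Simple polynomial terms stand in for Zernikes.
import numpy as np

rng = np.random.default_rng(6)
x, y = rng.uniform(-1, 1, (2, 2000))
mask = x**2 + y**2 <= 1.0                        # unit-disk aperture
x, y = x[mask], y[mask]

# Low-order basis: piston, tip, tilt, defocus-like term
B = np.column_stack([np.ones_like(x), x, y, 2 * (x**2 + y**2) - 1])
surface = (0.05 * x - 0.02 * y + 0.03 * (2 * (x**2 + y**2) - 1)
           + rng.normal(0, 0.005, x.size))       # microns, synthetic

coeffs, *_ = np.linalg.lstsq(B, surface, rcond=None)
fit = B @ coeffs
print(f"low-order RMS: {np.sqrt(np.mean(fit**2)):.3f} um")
print(f"residual RMS:  {np.sqrt(np.mean((surface - fit)**2)):.3f} um")
```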

  17. STARD 2015: An Updated List of Essential Items for Reporting Diagnostic Accuracy Studies.

    PubMed

    Bossuyt, Patrick M; Reitsma, Johannes B; Bruns, David E; Gatsonis, Constantine A; Glasziou, Paul P; Irwig, Les; Lijmer, Jeroen G; Moher, David; Rennie, Drummond; de Vet, Henrica C W; Kressel, Herbert Y; Rifai, Nader; Golub, Robert M; Altman, Douglas G; Hooft, Lotty; Korevaar, Daniël A; Cohen, Jérémie F

    2015-12-01

    Incomplete reporting has been identified as a major source of avoidable waste in biomedical research. Essential information is often not provided in study reports, impeding the identification, critical appraisal, and replication of studies. To improve the quality of reporting of diagnostic accuracy studies, the Standards for Reporting of Diagnostic Accuracy Studies (STARD) statement was developed. Here we present STARD 2015, an updated list of 30 essential items that should be included in every report of a diagnostic accuracy study. This update incorporates recent evidence about sources of bias and variability in diagnostic accuracy and is intended to facilitate the use of STARD. As such, STARD 2015 may help to improve completeness and transparency in reporting of diagnostic accuracy studies.

  18. Software Defined GPS API: Development and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations

    DTIC Science & Technology

    2014-05-18

    It has been the team mission to implement new and improved techniques with the intention of offering improved software libraries for GNSS signal acquisition.

  19. Generating Keywords Improves Metacomprehension and Self-Regulation in Elementary and Middle School Children

    ERIC Educational Resources Information Center

    de Bruin, Anique B. H.; Thiede, Keith W.; Camp, Gino; Redford, Joshua

    2011-01-01

    The ability to monitor understanding of texts, usually referred to as metacomprehension accuracy, is typically quite poor in adult learners; however, recently interventions have been developed to improve accuracy. In two experiments, we evaluated whether generating delayed keywords prior to judging comprehension improved metacomprehension accuracy…

  20. An electromechanical, patient positioning system for head and neck radiotherapy

    NASA Astrophysics Data System (ADS)

    Ostyn, Mark; Dwyer, Thomas; Miller, Matthew; King, Paden; Sacks, Rachel; Cruikshank, Ross; Rosario, Melvin; Martinez, Daniel; Kim, Siyong; Yeo, Woon-Hong

    2017-09-01

    In cancer treatment with radiation, accurate patient setup is critical for proper dose delivery. Improper arrangement can lead to disease recurrence, permanent organ damage, or lack of disease control. While current immobilization equipment often helps for patient positioning, manual adjustment is required, involving iterative, time-consuming steps. Here, we present an electromechanical robotic system for improving patient setup in radiotherapy, specifically targeting head and neck cancer. This positioning system offers six degrees of freedom for a variety of applications in radiation oncology. An analytical calculation of inverse kinematics serves as fundamental criteria to design the system. Computational mechanical modeling and experimental study of radiotherapy compatibility and x-ray-based imaging demonstrates the device feasibility and reliability to be used in radiotherapy. An absolute positioning accuracy test in a clinical treatment room supports the clinical feasibility of the system.

  1. Characteristic Analysis and Experiment of a Dynamic Flow Balance Valve

    NASA Astrophysics Data System (ADS)

    Bin, Li; Song, Guo; Xuyao, Mao; Chao, Wu; Deman, Zhang; Jin, Shang; Yinshui, Liu

    2017-12-01

    Comprehensive characteristics of a dynamic flow balance valve for water systems were analysed. The valve changes its drag coefficient automatically according to the condition of the system, so that the controlled flowrate remains constant across the working pressure range. The structure of the flow balance valve is introduced, and a theoretical formula for the variable opening of the valve core is derived. A rated pressure range of 20 kPa to 200 kPa and a rated flowrate of 10 m3/h were used in the numerical work. Static and CFX fluid analyses show good behaviour: through optimization of the valve core structure and improved design of the compression spring, the dynamic flow balance valve stabilizes the system flowrate markedly. Experiments show that the flow control accuracy is within 5%.

  2. Translocations, inversions and other chromosome rearrangements.

    PubMed

    Morin, Scott J; Eccles, Jennifer; Iturriaga, Amanda; Zimmerman, Rebekah S

    2017-01-01

    Chromosomal rearrangements have long been known to significantly impact fertility and miscarriage risk. Advancements in molecular diagnostics are challenging contemporary clinicians and patients in accurately characterizing the reproductive risk of a given abnormality. Initial attempts at preimplantation genetic diagnosis were limited by the inability to simultaneously evaluate aneuploidy, and missed up to 70% of aneuploidy in chromosomes unrelated to the rearrangement. Contemporary platforms are more accurate and less susceptible to technical errors. These techniques also offer the ability to improve outcomes through diagnosis of uniparental disomy and may soon be able to consistently distinguish between normal and balanced translocation karyotypes. Although an accurate projection of the anticipated number of unbalanced embryos is not possible at present, confirmation of normal/balanced status results in high pregnancy rates (PRs) and diagnostic accuracy. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  3. Interactive numerical flow visualization using stream surfaces

    NASA Technical Reports Server (NTRS)

    Hultquist, J. P. M.

    1990-01-01

    Particle traces and ribbons are often used to depict the structure of three-dimensional flowfields, but images produced using these models can be ambiguous. Stream surfaces offer a more visually intuitive method for the depiction of flowfields, but interactive response is needed to allow the user to place surfaces which reveal the essential features of a given flowfield. FLORA, a software package which supports the interactive calculation and display of stream surfaces on Silicon Graphics workstations, is described. Alternative methods for the integration of particle traces are examined, and calculation through computational space is found to provide rapid results with accuracy adequate for most purposes. Rapid calculation of traces is teamed with progressive refinement of approximated surfaces. An initial approximation provides immediate user feedback, and subsequent improvement of the surface ensures that the final image is an accurate representation of the flowfield.

  4. COSMO-SkyMed Spotlight interferometry over rural areas: the Slumgullion landslide in Colorado, USA

    USGS Publications Warehouse

    Milillo, Pietro; Fielding, Eric J.; Schulz, William H.; Delbridge, Brent; Burgmann, Roland

    2014-01-01

    In the last 7 years, spaceborne synthetic aperture radar (SAR) data with resolution better than a meter, acquired by satellites in spotlight mode, have offered an unprecedented improvement in SAR interferometry (InSAR). Most attention has been focused on monitoring urban areas and man-made infrastructure, exploiting the geometric accuracy, stability, and phase fidelity of the spotlight mode. In this paper, we explore the potential application of the COSMO-SkyMed® Spotlight mode to rural areas, where decorrelation is substantial and rapidly increases with time. We focus on the rapid repeat times, as short as one day, that are possible with the COSMO-SkyMed® constellation. We further present a qualitative analysis of spotlight interferometry over the Slumgullion landslide in southwest Colorado, which moves at rates of more than 1 cm/day.

  5. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    NASA Astrophysics Data System (ADS)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-10-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance.
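
    The reported relationship between initial growth rate and population density can be illustrated with a simple rank-correlation test; the sketch below uses synthetic administrative divisions rather than the study's West African data.

```python
# Sketch: testing whether an outbreak's initial growth rate declines with
# population density, as the abstract describes for EVD. Data synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
density = rng.uniform(10, 500, size=40)              # people/km^2 (synthetic)
growth_rate = 0.3 - 0.0004 * density + rng.normal(0, 0.02, size=40)

rho, p_value = stats.spearmanr(density, growth_rate)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
# A significantly negative rho would support the reported density effect.
```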

  6. Neurological prognostication of outcome in patients in coma after cardiac arrest.

    PubMed

    Rossetti, Andrea O; Rabinstein, Alejandro A; Oddo, Mauro

    2016-05-01

    Management of coma after cardiac arrest has improved during the past decade, allowing an increasing proportion of patients to survive; thus, prognostication has become an integral part of post-resuscitation care. Neurologists are increasingly confronted with raised expectations of next of kin and the necessity to provide early predictions of long-term prognosis. During the past decade, as technology and clinical evidence have evolved, post-cardiac arrest prognostication has moved towards a multimodal paradigm combining clinical examination with additional methods, consisting of electrophysiology, blood biomarkers, and brain imaging, to optimise prognostic accuracy. Prognostication should never be based on a single indicator; although some variables have very low false positive rates for poor outcome, multimodal assessment provides reassurance about the reliability of a prognostic estimate by offering concordant evidence. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Versatile electrophoresis-based self-test platform.

    PubMed

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example, to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721.]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications are included, such as use of the platform for veterinary purposes and for sodium and creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Multiplex titration RT-PCR: rapid determination of gene expression patterns for a large number of genes

    NASA Technical Reports Server (NTRS)

    Nebenfuhr, A.; Lomax, T. L.

    1998-01-01

    We have developed an improved method for determination of gene expression levels with RT-PCR. The procedure is rapid and does not require extensive optimization or densitometric analysis. Since the detection of individual transcripts is PCR-based, small amounts of tissue samples are sufficient for the analysis of expression patterns in large gene families. Using this method, we were able to rapidly screen nine members of the Aux/IAA family of auxin-responsive genes and identify those genes which vary in message abundance in a tissue- and light-specific manner. While not offering the accuracy of conventional semi-quantitative or competitive RT-PCR, our method allows quick screening of large numbers of genes in a wide range of RNA samples with just a thermal cycler and standard gel analysis equipment.

  9. [Image fusion, virtual reality, robotics and navigation. Effects on surgical practice].

    PubMed

    Maresceaux, J; Soler, L; Ceulemans, R; Garcia, A; Henri, M; Dutson, E

    2002-05-01

    In the new minimally invasive surgical era, virtual reality, robotics, and image merging have become topics on their own, offering the potential to revolutionize current surgical treatment and assessment. Improved patient care in the digital age seems to be the primary impetus for continued efforts in the field of telesurgery. The progress in endoscopic surgery with regard to telesurgery is manifested by digitization of the pre-, intra-, and postoperative interaction with the patients' surgical disease via computer system integration: so-called Computer Assisted Surgery (CAS). The preoperative assessment can be improved by 3D organ reconstruction, as in virtual colonoscopy or cholangiography, and by planning and practicing surgery using virtual or simulated organs. By integrating all of the data recorded during this preoperative stage, an enhanced reality becomes possible, improving intraoperative patient interactions. CAS allows for increased three-dimensional accuracy, improved precision, and reproducibility of procedures. The ability to store the actions of the surgeon as digitized information also allows for universal, rapid distribution: i.e., the surgeon's activity can be transmitted to the other side of the operating room or to a remote site via high-speed communications links, as was recently demonstrated by our own team during the Lindbergh operation. Furthermore, the surgeon will be able to share his expertise and skill through teleconsultation and telemanipulation, bringing the patient closer to the expert surgical team through electronic means and opening the way to advanced and continuous surgical learning. Finally, for postoperative interaction, virtual reality and simulation can provide us with four-dimensional images, with time as the fourth dimension. This should allow physicians to have a better idea of the disease process in evolution, and treatment modifications based on this view can be anticipated. We are presently determining the accuracy and efficacy of four-dimensional imaging compared with conventional evaluations.

  10. Phylo: A Citizen Science Approach for Improving Multiple Sequence Alignment

    PubMed Central

    Kam, Alfred; Kwak, Daniel; Leung, Clarence; Wu, Chu; Zarour, Eleyine; Sarmenta, Luis; Blanchette, Mathieu; Waldispühl, Jérôme

    2012-01-01

    Background Comparative genomics, or the study of the relationships of genome structure and function across different species, offers a powerful tool for studying evolution, annotating genomes, and understanding the causes of various genetic disorders. However, aligning multiple sequences of DNA, an essential intermediate step for most types of analyses, is a difficult computational task. In parallel, citizen science, an approach that takes advantage of the fact that the human brain is exquisitely tuned to solving specific types of problems, is becoming increasingly popular. There, instances of hard computational problems are dispatched to a crowd of non-expert human game players and solutions are sent back to a central server. Methodology/Principal Findings We introduce Phylo, a human-based computing framework applying “crowd sourcing” techniques to solve the Multiple Sequence Alignment (MSA) problem. The key idea of Phylo is to convert the MSA problem into a casual game that can be played by ordinary web users with minimal prior knowledge of the biological context. We applied this strategy to improve the alignment of the promoters of disease-related genes from up to 44 vertebrate species. Since the launch in November 2010, we have received more than 350,000 solutions submitted from more than 12,000 registered users. Our results show that the solutions submitted contributed to improving the accuracy of up to 70% of the alignment blocks considered. Conclusions/Significance We demonstrate that, combined with classical algorithms, crowd computing techniques can be successfully used to help improve the accuracy of MSA. More importantly, we show that an NP-hard computational problem can be embedded in a casual game easily played by people without significant scientific training. This suggests that citizen science approaches can be used to exploit the billions of “human-brain peta-flops” of computation that are spent every day playing games. Phylo is available at: http://phylo.cs.mcgill.ca. PMID:22412834

  11. Qualitative differences in offline improvement of procedural memory by daytime napping and overnight sleep: An fMRI study.

    PubMed

    Sugawara, Sho K; Koike, Takahiko; Kawamichi, Hiroaki; Makita, Kai; Hamano, Yuki H; Takahashi, Haruka K; Nakagawa, Eri; Sadato, Norihiro

    2017-09-20

    Daytime napping offers various benefits for healthy adults, including enhancement of motor skill learning. It remains controversial whether napping can provide the same enhancement as overnight sleep, and if so, whether the same neural underpinning is recruited. To investigate this issue, we conducted functional MRI during motor skill learning, before and after a short day-nap, in 13 participants, and compared them with a larger group (n=47) who were tested following regular overnight sleep. Training in a sequential finger-tapping task required participants to press a keyboard in the MRI scanner with their non-dominant left hand as quickly and accurately as possible. The nap group slept for 60min in the scanner after the training run, and the previously trained skill was subsequently re-tested. The whole-night sleep group went home after the training, and was tested the next day. Offline improvement of speed was observed in both groups, whereas accuracy was significantly improved only in the whole-night sleep group. Correspondingly, the offline increment in task-related activation was significant in the putamen of the whole-night group. This finding reveals a qualitative difference in the offline improvement effect between daytime napping and overnight sleep. Copyright © 2017. Published by Elsevier B.V.

  12. New technologies in the management of risk and violence in forensic settings.

    PubMed

    Tully, John; Larkin, Fintan; Fahy, Thomas

    2015-06-01

    Novel technological interventions are increasingly used in mental health settings. In this article, we describe 3 novel technological strategies in use for management of risk and violence in 2 forensic psychiatry settings in the United Kingdom: electronic monitoring by GPS-based tracking devices of patients on leave from a medium secure service in London, and closed circuit television (CCTV) monitoring and motion sensor technology at Broadmoor high secure hospital. A common theme is the use of these technologies to improve the completeness and accuracy of data used by clinicians to make clinical decisions. Another common thread is that each of these strategies supports and improves current clinical approaches rather than drastically changing them. The technologies offer a broad range of benefits. These include less restrictive options for patients, improved accountability of both staff and patients, less invasive testing, improved automated record-keeping, and better assurance reporting. Services utilizing these technologies also need to be aware of limitations. Technologies may be seen as unduly restrictive by patients and advocates, and technical issues may reduce effectiveness. It is vital that the types of technological innovations described in this article should be subject to thorough evaluation that addresses cost effectiveness, qualitative analysis of patients' attitudes, safety, and ethical considerations.

  13. Improved correlation corrections to the local-spin-density approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Painter, G.S.

    1981-10-15

    The accurate correlation energies for the para- and ferromagnetic states of the electron liquid calculated by Ceperley and Alder were recently used by Vosko, Wilk, and Nusair to produce a new correlation-energy density of increased accuracy and proper limiting behavior in the metallic density regime (r_s ≤ 6). In the present work, the correlation potential in the local-spin-density approximation (LSDA) is derived from the correlation-energy-density representation of Vosko et al. Characteristics of the new exchange-correlation model are compared with those of the LSDA model of Gunnarsson and Lundqvist. Specific comparison is made between these models and exact results in the treatment of atomic and molecular hydrogen. Since the new treatment of correlation primarily affects the region of small r_s, which is exchange dominated, correlation corrections are small compared with errors in the exchange energy. Thus, in light atoms the improved correlation model leads to a reduced cancellation of error between exchange and correlation energies, emphasizing the necessity for improved exchange treatment. For more homogeneous systems, the model should offer real improvement. The present results obtained with precise treatment of correlation within the prescription of Vosko et al. serve to define the present limitations of the LSDA and indicate the importance of nonlocal corrections, particularly for atoms.

  14. Enhanced Impact Resistance of Three-Dimensional-Printed Parts with Structured Filaments.

    PubMed

    Peng, Fang; Zhao, Zhiyang; Xia, Xuhui; Cakmak, Miko; Vogt, Bryan D

    2018-05-09

    Net-shape manufacture of customizable objects through three-dimensional (3D) printing offers tremendous promise for personalization to improve the fit, performance, and comfort associated with devices and tools used in our daily lives. However, the application of 3D printing in structural objects has been limited by their poor mechanical performance that manifests from the layer-by-layer process by which the part is produced. Here, this interfacial weakness is overcome using a structured, core-shell polymer filament where a polycarbonate (PC) core solidifies quickly to define the shape, whereas an olefin ionomer shell contains functionality (crystallinity and ionic) that strengthen the interface between the printed layers. This structured filament leads to improved dimensional accuracy and impact resistance in comparison to the individual components. The impact resistance from structured filaments containing 45 vol % shell can exceed 800 J/m. The origins of this improved impact resistance are probed using X-ray microcomputed tomography. Energy is dissipated by delamination of the shell from PC near the crack tip, whereas PC remains intact to provide stability to the part after impact. This structured filament provides tremendous improvements in the critical properties for manufacture and represents a major leap forward in the impact properties obtainable for 3D-printed parts.

  15. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    NASA Astrophysics Data System (ADS)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the tested approaches for designing such systems, object-based image analysis (OBIA) has stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object-based accuracy assessment is conducted by quantitatively comparing extracted landslide objects with landslide polygons that were visually interpreted by local experts. The applicability and transferability of the mapping system are evaluated by comparing initial accuracies with those achieved for the following two tests: first, usage of a SPOT image from the same year, but for a different area within the Baichi catchment; second, usage of SPOT images from multiple years for the same region. The integration of the common knowledge via digital landslide signatures is new in object-based landslide studies. In combination with strategies to optimize image segmentation this may lead to a more objective, transferable and stable knowledge-based system for the mapping of landslides from optical satellite data and DEMs.

  16. Global Optimization Ensemble Model for Classification Methods

    PubMed Central

    Anwar, Hina; Qamar, Usman; Muzaffar Qureshi, Abdul Wahab

    2014-01-01

    Supervised learning is the process of data mining for deducing rules from training datasets. A broad array of supervised learning algorithms exists, every one of them with its own advantages and drawbacks. There are some basic issues that affect the accuracy of a classifier while solving a supervised learning problem, like the bias-variance tradeoff, dimensionality of the input space, and noise in the input data space. All these problems affect the accuracy of a classifier and are the reason that there is no globally optimal method for classification. There is no generalized improvement method that can increase the accuracy of any classifier while addressing all the problems stated above. This paper proposes a global optimization ensemble model for classification methods (GMC) that can improve the overall accuracy for supervised learning problems. The experimental results on various public datasets showed that the proposed model improved the accuracy of the classification models from 1% to 30% depending upon the algorithm complexity. PMID:24883382
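
    The abstract's premise, that no single classifier is globally optimal, is the usual motivation for ensembling. The sketch below is a generic soft-voting ensemble, not the paper's GMC algorithm; the dataset and member models are arbitrary choices for illustration.

```python
# Sketch: a generic soft-voting ensemble over heterogeneous classifiers,
# illustrating how combining models can lift accuracy over single members.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
members = [("lr", LogisticRegression(max_iter=5000)),
           ("nb", GaussianNB()),
           ("dt", DecisionTreeClassifier(max_depth=5, random_state=0))]
ensemble = VotingClassifier(members, voting="soft")

for name, model in members + [("ensemble", ensemble)]:
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:>8}: {acc:.3f}")
```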

  17. Application of Quasi-Linearization Techniques to Rail Vehicle Dynamic Analyses

    DOT National Transportation Integrated Search

    1978-11-01

    The objective of the work reported here was to define methods for applying the describing function technique to realistic models of nonlinear rail cars. The describing function method offers a compromise between the accuracy of nonlinear digital simu...

  18. Mapping of taiga forest units using AIRSAR data and/or optical data, and retrieval of forest parameters

    NASA Technical Reports Server (NTRS)

    Rignot, Eric; Williams, Cynthia; Way, Jobea; Viereck, Leslie

    1993-01-01

    A maximum a posteriori Bayesian classifier for multifrequency polarimetric SAR data is used to perform a supervised classification of forest types in the floodplains of Alaska. The image classes include white spruce, balsam poplar, black spruce, alder, non-forests, and open water. The authors investigate the effect on classification accuracy of changing environmental conditions, and of frequency and polarization of the signal. The highest classification accuracy (86 percent correctly classified forest pixels, and 91 percent overall) is obtained by combining fully polarimetric L- and C-band frequencies on a date when the forest is just recovering from flooding. The forest map compares favorably with a vegetation map assembled from digitized aerial photos which took five years to complete and addresses the state of the forest in 1978, ignoring subsequent fires, changes in the course of the river, clear-cutting of trees, and tree growth. HV-polarization is the most useful polarization at L- and C-band for classification. C-band VV (ERS-1 mode) and L-band HH (J-ERS-1 mode) alone or combined yield unsatisfactory classification accuracies. Additional data acquired in the winter season during thawed and frozen days yield classification accuracies 20 percent and 30 percent lower, respectively, due to greater confusion between conifers and deciduous trees. Data acquired at the peak of flooding in May 1991 also yield classification accuracies 10 percent lower because of dominant trunk-ground interactions which mask out finer differences in radar backscatter between tree species. Combination of several of these dates does not improve classification accuracy. For comparison, panchromatic optical data acquired by SPOT in the summer season of 1991 are used to classify the same area. The classification accuracy (78 percent for the forest types and 90 percent if open water is included) is lower than that obtained with AIRSAR, although conifers and deciduous trees are better separated due to the presence of leaves on the deciduous trees. Optical data do not separate black spruce and white spruce as well as SAR data, cannot separate alder from balsam poplar, and are of course limited by the frequent cloud cover in the polar regions. Yet, combining SPOT and AIRSAR offers better chances to identify vegetation types independent of ground truth information using a combination of NDVI indexes from SPOT, biomass numbers from AIRSAR, and a segmentation map from either one.

  19. Investigations of interference between electromagnetic transponders and wireless MOSFET dosimeters: a phantom study.

    PubMed

    Su, Zhong; Zhang, Lisha; Ramakrishnan, V; Hagan, Michael; Anscher, Mitchell

    2011-05-01

    To evaluate both the Calypso System's (Calypso Medical Technologies, Inc., Seattle, WA) localization accuracy in the presence of wireless metal-oxide-semiconductor field-effect transistor (MOSFET) dosimeters of the dose verification system (DVS, Sicel Technologies, Inc., Morrisville, NC) and the dosimeters' reading accuracy in the presence of wireless electromagnetic transponders inside a phantom. A custom-made, solid-water phantom was fabricated with space for transponders and dosimeters. Two inserts were machined with positioning grooves precisely matching the dimensions of the transponders and dosimeters and were arranged in orthogonal and parallel orientations, respectively. To test the transponder localization accuracy with/without the presence of dosimeters (hypothesis 1), multivariate analyses were performed on transponder-derived localization data with and without dosimeters at each preset distance to detect statistically significant localization differences between the control and test sets. To test dosimeter dose-reading accuracy with/without the presence of transponders (hypothesis 2), an approach of alternating the transponder presence in seven identical fraction dose (100 cGy) deliveries and measurements was implemented. Two-way analysis of variance was performed to examine statistically significant dose-reading differences between the two groups and the different fractions. A relative-dose analysis method was also used to evaluate transponder impact on dose-reading accuracy after the dose-fading effect was removed by a second-order polynomial fit. Multivariate analysis indicated that hypothesis 1 was false; there was a statistically significant difference between the localization data from the control and test sets. However, the upper and lower bounds of the 95% confidence intervals of the localized positional differences between the control and test sets were less than 0.1 mm, which was significantly smaller than the minimum clinical localization resolution of 0.5 mm. For hypothesis 2, analysis of variance indicated that there was no statistically significant difference between the dosimeter readings with and without the presence of transponders. Both orthogonal and parallel configurations showed differences between polynomial-fit and measured dose values within 1.75%. The phantom study indicated that the Calypso System's localization accuracy was not clinically affected by the presence of DVS wireless MOSFET dosimeters and the dosimeter-measured doses were not affected by the presence of transponders. Thus, the same patients could be implanted with both transponders and dosimeters to benefit from the improved accuracy of radiotherapy treatments offered by combined use of the two systems.

  20. The Effect of Written Corrective Feedback on Grammatical Accuracy of EFL Students: An Improvement over Previous Unfocused Designs

    ERIC Educational Resources Information Center

    Khanlarzadeh, Mobin; Nemati, Majid

    2016-01-01

    The effectiveness of written corrective feedback (WCF) in the improvement of language learners' grammatical accuracy has been a topic of interest in SLA studies for the past couple of decades. The present study reports the findings of a three-month study investigating the effect of direct unfocused WCF on the grammatical accuracy of elementary…

  1. Finite element analysis of transonic flows in cascades: Importance of computational grids in improving accuracy and convergence

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Akay, H. U.

    1981-01-01

    The finite element method is applied for the solution of transonic potential flows through a cascade of airfoils. Convergence characteristics of the solution scheme are discussed. Accuracy of the numerical solutions is investigated for various flow regions in the transonic flow configuration. The design of an efficient finite element computational grid is discussed for improving accuracy and convergence.

  2. [Design and accuracy analysis of upper slicing system of MSCT].

    PubMed

    Jiang, Rongjian

    2013-05-01

    The upper slicing system is one of the main components of the optical system in MSCT. This paper focuses on the design of the upper slicing system and its accuracy analysis, with the aim of improving imaging accuracy. The errors in slice thickness and ray center introduced by the bearings, screw and control system were analyzed and tested. The measured accumulated error is less than 1 μm, and the measured absolute error is less than 10 μm. Improving the accuracy of the upper slicing system contributes to appropriate treatment methods and a higher treatment success rate.

  3. Method to improve accuracy of positioning object by eLoran system with applying standard Kalman filter

    NASA Astrophysics Data System (ADS)

    Grunin, A. P.; Kalinov, G. A.; Bolokhovtsev, A. V.; Sai, S. V.

    2018-05-01

    This article reports on a novel method to improve the accuracy of positioning an object with a low-frequency hyperbolic radio navigation system such as eLoran. The method is based on the application of the standard Kalman filter. Investigations of the effect of the filter parameters and of the type of movement on the accuracy of the vehicle position estimate are carried out. The accuracy of the method was evaluated by separating data from the semi-empirical movement model into different types of movement.
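
    A minimal version of the idea, smoothing noisy position fixes with a standard Kalman filter, is sketched below for a 1-D constant-velocity target. The state model, noise covariances, and measurement variance are illustrative assumptions, not the paper's tuned values.

```python
# Sketch: a minimal 1-D constant-velocity Kalman filter smoothing noisy
# position fixes, in the spirit of the eLoran method described above.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (position, velocity)
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 0.01 * np.eye(2)                     # process noise (assumed)
R = np.array([[25.0]])                   # measurement noise variance (assumed)

x = np.zeros((2, 1))                     # initial state
P = np.eye(2) * 100.0                    # initial uncertainty

def kalman_step(z, x, P):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(1)
truth = np.cumsum(np.full(50, 2.0))      # vehicle moving at 2 m/s
for z in truth + rng.normal(0, 5, 50):   # noisy eLoran-like fixes
    x, P = kalman_step(np.array([[z]]), x, P)
print(f"final position estimate: {x[0, 0]:.1f} (truth {truth[-1]:.1f})")
```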

  4. Systematic Calibration for Ultra-High Accuracy Inertial Measurement Units.

    PubMed

    Cai, Qingzhong; Yang, Gongliu; Song, Ningfang; Liu, Yiliang

    2016-06-22

    An inertial navigation system (INS) has been widely used in challenging GPS environments. With the rapid development of modern physics, an atomic gyroscope will come into use in the near future, with a predicted accuracy of 5 × 10^-6 °/h or better. However, existing calibration methods and devices cannot satisfy the accuracy requirements of future ultra-high accuracy inertial sensors. In this paper, an improved calibration model is established by introducing gyro g-sensitivity errors, accelerometer cross-coupling errors and lever arm errors. A systematic calibration method is proposed based on a 51-state Kalman filter and smoother. Simulation results show that the proposed calibration method can realize the estimation of all the parameters using a common dual-axis turntable. Laboratory and sailing tests prove that the position accuracy in a five-day inertial navigation can be improved by about 8% with the proposed calibration method. The accuracy can be improved by at least 20% when the position accuracy of the atomic gyro INS reaches a level of 0.1 nautical miles/5 d. Compared with the existing calibration methods, the proposed method, with more error sources and high-order small error parameters calibrated for ultra-high accuracy inertial measurement units (IMUs) using common turntables, has great application potential in future atomic gyro INSs.

  5. Systematic review of discharge coding accuracy

    PubMed Central

    Burns, E.M.; Rigby, E.; Mamidanna, R.; Bottle, A.; Aylin, P.; Ziprin, P.; Faiz, O.D.

    2012-01-01

    Introduction: Routinely collected data sets are increasingly used for research, financial reimbursement and health service planning. High quality data are necessary for reliable analysis. This study aims to assess the published accuracy of routinely collected data sets in Great Britain. Methods: Systematic searches of the EMBASE, PUBMED, OVID and Cochrane databases were performed from 1989 to the present using defined search terms. Included studies were those that compared routinely collected data sets with case or operative note review and those that compared routinely collected data with clinical registries. Results: Thirty-two studies were included. Twenty-five studies compared routinely collected data with case or operation notes. Seven studies compared routinely collected data with clinical registries. The overall median accuracy (routinely collected data sets versus case notes) was 83.2% (IQR: 67.3–92.1%). The median diagnostic accuracy was 80.3% (IQR: 63.3–94.1%) with a median procedure accuracy of 84.2% (IQR: 68.7–88.7%). There was considerable variation in accuracy rates between studies (50.5–97.8%). Since the 2002 introduction of Payment by Results, accuracy has improved in some respects, for example primary diagnoses accuracy has improved from 73.8% (IQR: 59.3–92.1%) to 96.0% (IQR: 89.3–96.3%), P = 0.020. Conclusion: Accuracy rates are improving. Current levels of reported accuracy suggest that routinely collected data are sufficiently robust to support their use for research and managerial decision-making. PMID:21795302

  6. The study of vehicle classification equipment with solutions to improve accuracy in Oklahoma.

    DOT National Transportation Integrated Search

    2014-12-01

    The accuracy of vehicle counting and classification data is vital for appropriate future highway and road : design, including determining pavement characteristics, eliminating traffic jams, and improving safety. : Organizations relying on vehicle cla...

  7. Learning Linear Spatial-Numeric Associations Improves Accuracy of Memory for Numbers

    PubMed Central

    Thompson, Clarissa A.; Opfer, John E.

    2016-01-01

    Memory for numbers improves with age and experience. One potential source of improvement is a logarithmic-to-linear shift in children’s representations of magnitude. To test this, Kindergartners and second graders estimated the location of numbers on number lines and recalled numbers presented in vignettes (Study 1). Accuracy at number-line estimation predicted memory accuracy on a numerical recall task after controlling for the effect of age and ability to approximately order magnitudes (mapper status). To test more directly whether linear numeric magnitude representations caused improvements in memory, half of children were given feedback on their number-line estimates (Study 2). As expected, learning linear representations was again linked to memory for numerical information even after controlling for age and mapper status. These results suggest that linear representations of numerical magnitude may be a causal factor in development of numeric recall accuracy. PMID:26834688

  8. Learning Linear Spatial-Numeric Associations Improves Accuracy of Memory for Numbers.

    PubMed

    Thompson, Clarissa A; Opfer, John E

    2016-01-01

    Memory for numbers improves with age and experience. One potential source of improvement is a logarithmic-to-linear shift in children's representations of magnitude. To test this, Kindergartners and second graders estimated the location of numbers on number lines and recalled numbers presented in vignettes (Study 1). Accuracy at number-line estimation predicted memory accuracy on a numerical recall task after controlling for the effect of age and ability to approximately order magnitudes (mapper status). To test more directly whether linear numeric magnitude representations caused improvements in memory, half of children were given feedback on their number-line estimates (Study 2). As expected, learning linear representations was again linked to memory for numerical information even after controlling for age and mapper status. These results suggest that linear representations of numerical magnitude may be a causal factor in development of numeric recall accuracy.
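
    The logarithmic-to-linear shift can be made concrete by fitting both models to a child's number-line estimates and comparing fit quality. In the sketch below the estimates are synthetic, generated from a compressed (log-like) representation.

```python
# Sketch: comparing linear vs. logarithmic fits to number-line estimates,
# the representational shift discussed above. Estimates are synthetic.
import numpy as np

numbers = np.array([2, 5, 18, 34, 56, 78, 100])
# A child with compressed (log-like) magnitude representations:
estimates = 100 * np.log(numbers) / np.log(100)

def r_squared(pred, obs):
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1 - ss_res / ss_tot

lin = np.polyval(np.polyfit(numbers, estimates, 1), numbers)
log = np.polyval(np.polyfit(np.log(numbers), estimates, 1), np.log(numbers))
print(f"linear fit R^2: {r_squared(lin, estimates):.3f}")
print(f"log fit R^2:    {r_squared(log, estimates):.3f}")
# A higher R^2 for the log model indicates an immature representation.
```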

  9. Utilizing a language model to improve online dynamic data collection in P300 spellers.

    PubMed

    Mainsah, Boyla O; Colwell, Kenneth A; Collins, Leslie M; Throckmorton, Chandra S

    2014-07-01

    P300 spellers provide a means of communication for individuals with severe physical limitations, especially those with locked-in syndrome, such as that caused by amyotrophic lateral sclerosis. However, P300 speller use is still limited by relatively low communication rates due to the multiple data measurements that are required to improve the signal-to-noise ratio of event-related potentials for increased accuracy. Therefore, the amount of data collection has competing effects on accuracy and spelling speed. Adaptively varying the amount of data collection prior to character selection has been shown to improve spelling accuracy and speed. The goal of this study was to optimize a previously developed dynamic stopping algorithm that uses a Bayesian approach to control data collection by incorporating a priori knowledge via a language model. Participants (n = 17) completed online spelling tasks using the dynamic stopping algorithm, with and without a language model. The addition of the language model resulted in improved participant performance from a mean theoretical bit rate of 46.12 bits/min at 88.89% accuracy to 54.42 bits/min at 90.36% accuracy.
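
    The core of such a dynamic-stopping scheme is a Bayesian update that starts from a language-model prior over characters and stops once the posterior clears a confidence threshold. The sketch below uses hypothetical likelihood values in place of real EEG classifier scores.

```python
# Sketch: Bayesian dynamic stopping with a language-model prior, as the
# abstract describes for P300 spellers. Likelihood values are hypothetical.
import numpy as np

chars = ["A", "B", "C", "D"]
lm_prior = np.array([0.55, 0.25, 0.15, 0.05])   # from a language model (assumed)
posterior = lm_prior.copy()
threshold = 0.95                                # stop when this confident

rng = np.random.default_rng(2)
for flash in range(100):
    # Hypothetical per-character likelihoods; in practice these come from
    # EEG classifier scores for each stimulus flash.
    likelihood = np.array([0.8, 0.4, 0.3, 0.2]) + rng.normal(0, 0.05, 4)
    posterior *= np.clip(likelihood, 1e-6, None)
    posterior /= posterior.sum()
    if posterior.max() >= threshold:
        print(f"selected '{chars[posterior.argmax()]}' after {flash + 1} flashes")
        break
```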

  10. Accuracy in inference of nursing diagnoses in heart failure patients.

    PubMed

    Pereira, Juliana de Melo Vellozo; Cavalcanti, Ana Carla Dantas; Lopes, Marcos Venícios de Oliveira; da Silva, Valéria Gonçalves; de Souza, Rosana Oliveira; Gonçalves, Ludmila Cuzatis

    2015-01-01

    Heart failure (HF) is a common cause of hospitalization and requires accuracy in clinical judgment and appropriate nursing diagnoses. The objective was to determine the accuracy of the nursing diagnoses of fatigue, intolerance to activity and decreased cardiac output in hospitalized HF patients. This was a descriptive study applied to nurses with experience in NANDA-I and/or HF nursing diagnoses. Evaluation and accuracy were determined by calculating efficacy (E), false negative (FN), false positive (FP) and trend (T) measures; nurses who showed acceptable inspection for two diagnoses were selected. The nursing diagnosis of fatigue was the diagnosis most commonly mistaken by the nursing evaluators. The search for improved diagnostic accuracy reaffirms the need for continuous, specific training to strengthen the diagnostic capability of nurses; such training allowed the exercise of clinical judgment and better diagnostic accuracy.

  11. STARD 2015: An Updated List of Essential Items for Reporting Diagnostic Accuracy Studies.

    PubMed

    Bossuyt, Patrick M; Reitsma, Johannes B; Bruns, David E; Gatsonis, Constantine A; Glasziou, Paul P; Irwig, Les; Lijmer, Jeroen G; Moher, David; Rennie, Drummond; de Vet, Henrica C W; Kressel, Herbert Y; Rifai, Nader; Golub, Robert M; Altman, Douglas G; Hooft, Lotty; Korevaar, Daniël A; Cohen, Jérémie F

    2015-12-01

    Incomplete reporting has been identified as a major source of avoidable waste in biomedical research. Essential information is often not provided in study reports, impeding the identification, critical appraisal, and replication of studies. To improve the quality of reporting of diagnostic accuracy studies, the Standards for Reporting of Diagnostic Accuracy Studies (STARD) statement was developed. Here we present STARD 2015, an updated list of 30 essential items that should be included in every report of a diagnostic accuracy study. This update incorporates recent evidence about sources of bias and variability in diagnostic accuracy and is intended to facilitate the use of STARD. As such, STARD 2015 may help to improve completeness and transparency in reporting of diagnostic accuracy studies. © 2015 American Association for Clinical Chemistry.

  12. Diagnostic accuracy of routine blood examinations and CSF lactate level for post-neurosurgical bacterial meningitis.

    PubMed

    Zhang, Yang; Xiao, Xiong; Zhang, Junting; Gao, Zhixian; Ji, Nan; Zhang, Liwei

    2017-06-01

    To evaluate the diagnostic accuracy of routine blood examinations and Cerebrospinal Fluid (CSF) lactate level for Post-neurosurgical Bacterial Meningitis (PBM) in a large sample of post-neurosurgical patients. The diagnostic accuracies of routine blood examinations and CSF lactate level to distinguish between post-neurosurgical aseptic meningitis (PAM) and PBM were evaluated with the values of the Area Under the Curve of the Receiver Operating Characteristic (AUC-ROC) by retrospectively analyzing the datasets of post-neurosurgical patients in the clinical information databases. The diagnostic accuracy of routine blood examinations was relatively low (AUC-ROC < 0.7). The CSF lactate level achieved rather high diagnostic accuracy (AUC-ROC = 0.891; 95% CI, 0.852-0.922). The variables of patient age, operation duration, surgical diagnosis and postoperative days (the interval between the neurosurgery and the examinations) were shown to affect the diagnostic accuracy of these examinations. The variables were integrated with routine blood examinations and CSF lactate level by Fisher discriminant analysis to improve their diagnostic accuracy. As a result, the diagnostic accuracy of blood examinations and CSF lactate level was significantly improved, with AUC-ROC values of 0.760 (95% CI, 0.737-0.782) and 0.921 (95% CI, 0.887-0.948), respectively. The PBM diagnostic accuracy of routine blood examinations was relatively low, whereas the accuracy of CSF lactate level was high. Some variables that are involved in the incidence of PBM can also affect the diagnostic accuracy for PBM. Taking the effects of these variables into account significantly improves the diagnostic accuracies of routine blood examinations and CSF lactate level. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
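
    A biomarker's diagnostic accuracy can be quantified, as in the study, by the AUC-ROC of its raw values. The sketch below computes this for synthetic CSF lactate values; the group sizes and distributions are invented, not the study's data.

```python
# Sketch: quantifying a biomarker's diagnostic accuracy with AUC-ROC,
# as done above for CSF lactate. Values below are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
# 1 = bacterial meningitis, 0 = aseptic; lactate tends higher in bacterial cases
labels = np.r_[np.ones(60), np.zeros(140)]
lactate = np.r_[rng.normal(6.0, 1.5, 60), rng.normal(3.0, 1.2, 140)]

auc = roc_auc_score(labels, lactate)
print(f"AUC-ROC for CSF lactate (synthetic): {auc:.3f}")
```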

  13. Tracking accuracy assessment for concentrator photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Norton, Matthew S. H.; Anstey, Ben; Bentley, Roger W.; Georghiou, George E.

    2010-10-01

    The accuracy to which a concentrator photovoltaic (CPV) system can track the sun is an important parameter that influences a number of measurements that indicate the performance efficiency of the system. This paper presents work carried out into determining the tracking accuracy of a CPV system, and illustrates the steps involved in gaining an understanding of the tracking accuracy. A Trac-Stat SL1 accuracy monitor has been used in the determination of pointing accuracy and has been integrated into the outdoor CPV module test facility at the Photovoltaic Technology Laboratories in Nicosia, Cyprus. Results from this work are provided to demonstrate how important performance indicators may be presented, and how the reliability of results is improved through the deployment of such accuracy monitors. Finally, recommendations on the use of such sensors are provided as a means to improve the interpretation of real outdoor performance.

  14. An improved Multimodel Approach for Global Sea Surface Temperature Forecasts

    NASA Astrophysics Data System (ADS)

    Khan, M. Z. K.; Mehrotra, R.; Sharma, A.

    2014-12-01

    The concept of ensemble combinations for formulating improved climate forecasts has gained popularity in recent years. However, many climate models share similar physics or modeling processes, which may lead to similar (or strongly correlated) forecasts. Recent approaches for combining forecasts that take into consideration differences in model accuracy over space and time have either ignored the similarity of forecasts among the models or followed a pairwise dynamic combination approach. Here we present a basis for combining model predictions, illustrating the improvements that can be achieved if procedures for factoring in inter-model dependence are utilised. The utility of the approach is demonstrated by combining sea surface temperature (SST) forecasts from five climate models over the period 1960-2005. The variable of interest, the monthly global sea surface temperature anomaly (SSTA) on a 5° × 5° latitude-longitude grid, is predicted three months in advance to demonstrate the utility of the proposed algorithm. Results indicate that the proposed approach offers consistent and significant improvements for the majority of grid points compared to the case where the dependence among the models is ignored. Therefore, the proposed approach of combining multiple models by taking into account their interdependence provides an attractive alternative for obtaining improved climate forecasts. In addition, an approach to combine seasonal forecasts from multiple climate models with varying periods of availability is also demonstrated.
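
    One standard way to weight models while accounting for inter-model error dependence, consistent with (though not necessarily identical to) the approach described above, is to derive minimum-variance weights from the inter-model error covariance matrix. The sketch below uses synthetic, deliberately correlated model errors.

```python
# Sketch: minimum-variance combination weights that account for error
# dependence among models, one standard route to the idea in the abstract
# (not necessarily the authors' exact algorithm). Errors are synthetic.
import numpy as np

rng = np.random.default_rng(4)
common = rng.normal(0, 0.5, 200)                 # shared error -> dependence
errors = np.column_stack([common + rng.normal(0, s, 200)
                          for s in (0.2, 0.25, 0.6, 0.7, 0.8)])

Sigma = np.cov(errors, rowvar=False)             # inter-model error covariance
ones = np.ones(Sigma.shape[0])
w = np.linalg.solve(Sigma, ones)
w /= w.sum()                                     # weights sum to 1
print("weights:", np.round(w, 3))
# Treating Sigma as diagonal (ignoring dependence) would mis-weight the
# strongly correlated members.
```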

  15. Diagnosis of hydronephrosis: comparison of radionuclide scanning and sonography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malave, S.R.; Neiman, H.L.; Spies, S.M.

    1980-12-01

    Diagnostic sonographic and radioisotope scanning techniques have been shown to be useful in the diagnosis of obstructive uropathy. The accuracy of both methods was compared and sonography was found to provide the more accurate data (sensitivity, 90%; specificity, 98%; accuracy, 97%). Sonography provides excellent anatomic information and enables one to grade the degree of dilatation. Renal radionuclide studies were less sensitive in detecting obstruction, particularly in the presence of chronic renal disease, but offered additional information regarding relative renal blood flow, total effective renal plasma flow, and interval change in renal parenchymal function.

  16. Can NMR solve some significant challenges in metabolomics?

    PubMed

    Nagana Gowda, G A; Raftery, Daniel

    2015-11-01

    The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact bio-specimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Classification of bifurcations regions in IVOCT images using support vector machine and artificial neural network models

    NASA Astrophysics Data System (ADS)

    Porto, C. D. N.; Costa Filho, C. F. F.; Macedo, M. M. G.; Gutierrez, M. A.; Costa, M. G. F.

    2017-03-01

    Studies in intravascular optical coherence tomography (IV-OCT) have demonstrated the importance of coronary bifurcation regions in intravascular medical imaging analysis, as plaques are more likely to accumulate in this region, leading to coronary disease. A typical IV-OCT pullback acquires hundreds of frames, so developing an automated tool to classify the OCT frames as bifurcation or non-bifurcation can be an important step to speed up OCT pullback analysis and assist automated methods for atherosclerotic plaque quantification. In this work, we evaluate the performance of two state-of-the-art classifiers, SVM and Neural Networks, in the bifurcation classification task. The study included IV-OCT frames from 9 patients. In order to improve classification performance, we trained and tested the SVM with different parameters by means of a grid search, and different stopping criteria were applied to the Neural Network classifier: mean square error, early stopping and regularization. Different sets of features were tested using feature selection techniques: PCA, LDA and scalar feature selection with correlation. Training and testing were performed on sets with a maximum of 1460 OCT frames. We quantified our results in terms of false positive rate, true positive rate, accuracy, specificity, precision, false alarm, f-measure and area under the ROC curve. Neural networks obtained the best classification accuracy, 98.83%, surpassing the results found in the literature. Our methods appear to offer a robust and reliable automated classification of OCT frames that might assist physicians by indicating potential frames to analyze. Methods for improving neural network generalization further increased the classification performance.
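
    The grid search over SVM parameters described above can be sketched with scikit-learn's GridSearchCV; here generic synthetic features stand in for the IV-OCT frame features, and the parameter grid is an assumed example.

```python
# Sketch: tuning an SVM by grid search as described above. Generic data
# stand in for the IV-OCT frame features, which are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1460, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

grid = GridSearchCV(SVC(),
                    param_grid={"C": [0.1, 1, 10, 100],
                                "gamma": ["scale", 0.01, 0.001]},
                    cv=5, scoring="accuracy")
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_)
print(f"test accuracy: {grid.score(X_te, y_te):.3f}")
```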

  18. Fragment assignment in the cloud with eXpress-D

    PubMed Central

    2013-01-01

    Background Probabilistic assignment of ambiguously mapped fragments produced by high-throughput sequencing experiments has been demonstrated to greatly improve accuracy in the analysis of RNA-Seq and ChIP-Seq, and is an essential step in many other sequence census experiments. A maximum likelihood method using the expectation-maximization (EM) algorithm for optimization is commonly used to solve this problem. However, batch EM-based approaches do not scale well with the size of sequencing datasets, which have been increasing dramatically over the past few years. Thus, current approaches to fragment assignment rely on heuristics or approximations for tractability. Results We present an implementation of a distributed EM solution to the fragment assignment problem using Spark, a data analytics framework that can scale by leveraging compute clusters within datacenters, "the cloud". We demonstrate that our implementation easily scales to billions of sequenced fragments, while providing the exact maximum likelihood assignment of ambiguous fragments. The accuracy of the method is shown to be an improvement over the most widely used tools available, and the method can be run in a constant amount of time when cluster resources are scaled linearly with the amount of input data. Conclusions The cloud offers one solution for the difficulties faced in the analysis of massive high-throughput sequencing data, which continue to grow rapidly. Researchers in bioinformatics must follow developments in distributed systems, such as new frameworks like Spark, for ways to port existing methods to the cloud and help them scale to the datasets of the future. Our software, eXpress-D, is freely available at: http://github.com/adarob/express-d. PMID:24314033
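
    The serial analogue of the distributed EM update that eXpress-D runs on Spark can be sketched in a few lines; the fragment-to-transcript mappings below are hypothetical, and the real system adds fragment-level likelihood terms beyond this toy multinomial model.

```python
# Toy batch EM for probabilistic fragment assignment (the serial analogue of
# the distributed update eXpress-D runs on Spark). Each fragment lists the
# transcripts it maps to; we estimate transcript abundances rho.
import numpy as np

# fragment -> indices of compatible transcripts (hypothetical mappings)
fragments = [[0], [0, 1], [1, 2], [0, 2], [2]]
n_transcripts = 3
rho = np.full(n_transcripts, 1.0 / n_transcripts)  # uniform start

for _ in range(100):
    counts = np.zeros(n_transcripts)
    for hits in fragments:                 # E-step: split each fragment
        w = rho[hits] / rho[hits].sum()    # posterior over its candidates
        counts[hits] += w
    new_rho = counts / counts.sum()        # M-step: renormalize
    if np.abs(new_rho - rho).max() < 1e-9:
        break
    rho = new_rho

print("estimated abundances:", rho.round(3))
```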

  19. Informed Source Separation of Atmospheric and Surface Signal Contributions in Shortwave Hyperspectral Imagery using Non-negative Matrix Factorization

    NASA Astrophysics Data System (ADS)

    Wright, L.; Coddington, O.; Pilewskie, P.

    2015-12-01

    Current challenges in Earth remote sensing require improved instrument spectral resolution, spectral coverage, and radiometric accuracy. Hyperspectral instruments, deployed on both aircraft and spacecraft, are a growing class of Earth observing sensors designed to meet these challenges. They collect large amounts of spectral data, allowing thorough characterization of both atmospheric and surface properties. The higher accuracy and increased spectral and spatial resolutions of new imagers require new numerical approaches for processing imagery and separating surface and atmospheric signals. One potential approach is source separation, which allows us to determine the underlying physical causes of observed changes. Improved signal separation will allow hyperspectral instruments to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. In this work, we investigate a Non-negative Matrix Factorization (NMF) method for the separation of atmospheric and land surface signal sources. NMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which allows the method to be tailored to hyperspectral source separation. We adapt our NMF algorithm to distinguish between contributions from different physically distinct sources by introducing constraints on spectral and spatial variability and by using library spectra to inform separation. We evaluate our NMF algorithm with simulated hyperspectral images as well as hyperspectral imagery from several instruments, including the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), the NASA Hyperspectral Imager for the Coastal Ocean (HICO), and the National Ecological Observatory Network (NEON) Imaging Spectrometer.
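
    A minimal sketch of the factorization at the core of this approach, assuming the image is flattened to a (pixels x bands) matrix: scikit-learn's NMF recovers non-negative source spectra and per-pixel strengths. The constrained, library-informed variant described above is more elaborate.

```python
# Sketch of the non-negative factorization behind the method: approximate a
# (pixels x bands) radiance matrix V as W @ H with W, H >= 0, so each row of
# H acts as a non-negative source spectrum. Data are synthetic; the real
# algorithm adds spatial/spectral constraints and library spectra.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
true_spectra = rng.random((2, 50))          # two sources, 50 bands
abundances = rng.random((500, 2))           # 500 pixels
V = abundances @ true_spectra               # mixed hyperspectral image

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(V)                  # per-pixel source strengths
H = model.components_                       # recovered source spectra
print("reconstruction error:", model.reconstruction_err_)
```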

  20. Determining Directional Dependency in Causal Associations

    PubMed Central

    Pornprasertmanit, Sunthud; Little, Todd D.

    2014-01-01

    Directional dependency is a method to determine the likely causal direction of effect between two variables. This article aims to critique and improve upon the use of directional dependency as a technique to infer causal associations. We comment on several issues raised by von Eye and DeShon (2012), including: encouraging the use of the signs of skewness and excessive kurtosis of both variables, discouraging the use of D’Agostino’s K2, and encouraging the use of directional dependency to compare variables only within time points. We offer improved steps for determining directional dependency that fix the problems we note. Next, we discuss how to integrate directional dependency into longitudinal data analysis with two variables. We also examine the accuracy of directional dependency evaluations when several regression assumptions are violated. Directional dependency can suggest the direction of a relation if (a) the regression error in population is normal, (b) an unobserved explanatory variable correlates with any variables equal to or less than .2, (c) a curvilinear relation between both variables is not strong (standardized regression coefficient ≤ .2), (d) there are no bivariate outliers, and (e) both variables are continuous. PMID:24683282
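
    The skewness cue behind directional dependency can be illustrated with a small simulation (a sketch of the principle, not the authors' full decision procedure): for standardized variables with y = ρx + e and normal e, skew(y) = ρ³·skew(x), so the dependent variable is the less skewed of the two.

```python
# Simulation of the skewness cue: generate a skewed cause x, an effect y with
# normal error, and compare skewness of the standardized variables. The less
# skewed variable points to the dependent side (x -> y here).
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(2)
x = rng.exponential(size=100_000) - 1.0      # skewed cause
y = 0.6 * x + rng.normal(size=100_000)       # effect with normal error

zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()
print("skew(x):", round(skew(zx), 3), " skew(y):", round(skew(zy), 3))
# |skew(y)| < |skew(x)| suggests the x -> y direction.
```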

  1. Good practices in free-energy calculations.

    PubMed

    Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christophe

    2010-08-19

    As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in a wide range of research areas. Yet, the reliability of these calculations can often be improved significantly if a number of precepts, or good practices, are followed. Although the theory upon which these good practices rely has largely been known for many years, it is often overlooked or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. In this contribution, the current best practices for carrying out free-energy calculations using free energy perturbation and nonequilibrium work methods are discussed, demonstrating that at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. Monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision.
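
    For concreteness, a minimal sketch of the bidirectional free energy perturbation estimate the article advocates monitoring, using exponential averaging on synthetic Gaussian energy differences rather than real simulation output:

```python
# Minimal free energy perturbation (FEP) estimate via exponential averaging,
# computed bidirectionally as recommended; the energy differences here are
# synthetic Gaussian samples, not real simulation output.
import numpy as np

kT = 0.596                                  # kcal/mol at ~300 K
rng = np.random.default_rng(3)
dU_fwd = rng.normal(1.0, 0.5, size=50_000)  # U_B - U_A sampled in state A
dU_rev = rng.normal(-1.0, 0.5, size=50_000) # U_A - U_B sampled in state B

dF_fwd = -kT * np.log(np.mean(np.exp(-dU_fwd / kT)))
dF_rev = +kT * np.log(np.mean(np.exp(-dU_rev / kT)))  # sign flipped back

print(f"forward estimate: {dF_fwd:.3f} kcal/mol")
print(f"reverse estimate: {dF_rev:.3f} kcal/mol")
# A large gap between the two estimates signals poor phase-space overlap.
```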

  2. Cryo-balloon catheter localization in fluoroscopic images

    NASA Astrophysics Data System (ADS)

    Kurzendorfer, Tanja; Brost, Alexander; Jakob, Carolin; Mewes, Philip W.; Bourier, Felix; Koch, Martin; Kurzidim, Klaus; Hornegger, Joachim; Strobel, Norbert

    2013-03-01

    Minimally invasive catheter ablation has become the preferred treatment option for atrial fibrillation. Although the standard ablation procedure involves ablation points set by radio-frequency catheters, cryo-balloon catheters have been reported to be more advantageous in certain cases. As electro-anatomical mapping systems do not support cryo-balloon ablation procedures, X-ray guidance is needed. However, current methods to provide support for cryo-balloon catheters in fluoroscopically guided ablation procedures rely heavily on manual user interaction. To improve this, we propose a first method for automatic cryo-balloon catheter localization in fluoroscopic images based on a blob detection algorithm. Our method is evaluated on 24 clinical images from 17 patients. The method successfully detected the cryo-balloon in 22 out of 24 images, yielding a success rate of 91.6 %. The successful localizations achieved an accuracy of 1.00 mm +/- 0.44 mm. Even though our method currently fails in 8.4 % of the available images, it still offers a significant improvement over manual methods. Furthermore, detecting a landmark point along the cryo-balloon catheter can be a very important step for additional post-processing operations.
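
    A hedged sketch of the blob-detection idea is given below, using a Laplacian-of-Gaussian detector from scikit-image on a synthetic disk rather than clinical fluoroscopy data.

```python
# Sketch of blob detection for balloon localization: a Laplacian-of-Gaussian
# detector on a synthetic "fluoroscopic" frame. Real inputs would be clinical
# X-ray images, not this toy disk.
import numpy as np
from skimage.draw import disk
from skimage.feature import blob_log

img = np.zeros((256, 256))
rr, cc = disk((120, 140), 25)       # hypothetical balloon silhouette
img[rr, cc] = 1.0
img += np.random.default_rng(4).normal(0, 0.05, img.shape)

blobs = blob_log(img, min_sigma=10, max_sigma=40, threshold=0.1)
for y, x, sigma in blobs:
    print(f"blob at ({y:.0f}, {x:.0f}), radius ~ {sigma * np.sqrt(2):.1f} px")
```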

  3. Potential Subjective Effectiveness of Active Interior Noise Control in Propeller Airplanes

    NASA Technical Reports Server (NTRS)

    Powell, Clemans A.; Sullivan, Brenda M.

    2000-01-01

    Active noise control technology offers the potential for weight-efficient aircraft interior noise reduction, particularly for propeller aircraft. However, there is little information on how passengers respond to this type of interior noise control. This paper presents results of two experiments that use sound quality engineering practices to determine the subjective effectiveness of hypothetical active noise control (ANC) systems in a range of propeller aircraft. The two experiments differed by the type of judgments made by the subjects: pair comparisons based on preference in the first and numerical category scaling of noisiness in the second. Although the results of the two experiments were in general agreement that the hypothetical active control measures improved the interior noise environments, the pair comparison method appears to be more sensitive to subtle changes in the characteristics of the sounds which are related to passenger preference. The reductions in subjective response due to the ANC conditions were predicted with reasonable accuracy by reductions in measured loudness level. Inclusion of corrections for the sound quality characteristics of tonality and fluctuation strength in multiple regression models improved the prediction of the ANC effects.

  4. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  5. Development and validation of a paediatric long-bone fracture classification. A prospective multicentre study in 13 European paediatric trauma centres

    PubMed Central

    2011-01-01

    Background The aim of this study was to develop a child-specific classification system for long bone fractures and to examine its reliability and validity on the basis of a prospective multicentre study. Methods Using the sequentially developed classification system, three samples of between 30 and 185 paediatric limb fractures from a pool of 2308 fractures documented in two multicenter studies were analysed in a blinded fashion by eight orthopaedic surgeons, on a total of 5 occasions. Intra- and interobserver reliability and accuracy were calculated. Results The reliability improved with successive simplification of the classification. The final version resulted in an overall interobserver agreement of κ = 0.71 with no significant difference between experienced and less experienced raters. Conclusions In conclusion, the evaluation of the newly proposed classification system resulted in a reliable and routinely applicable system, for which training in its proper use may further improve the reliability. It can be recommended as a useful tool for clinical practice and offers the option for developing treatment recommendations and outcome predictions in the future. PMID:21548939

  6. Improving the accuracy and usability of Iowa falling weight deflectometer data.

    DOT National Transportation Integrated Search

    2013-05-01

    This study aims to improve the accuracy and usability of Iowa Falling Weight Deflectometer (FWD) data by incorporating significant : enhancements into the fully-automated software system for rapid processing of the FWD data. These enhancements includ...

  7. Numerical solution of boundary-integral equations for molecular electrostatics.

    PubMed

    Bardhan, Jaydeep P

    2009-03-07

    Numerous molecular processes, such as ion permeation through channel proteins, are governed by relatively small changes in energetics. As a result, theoretical investigations of these processes require accurate numerical methods. In the present paper, we evaluate the accuracy of two approaches to simulating boundary-integral equations for continuum models of the electrostatics of solvation. The analysis emphasizes boundary-element method simulations of the integral-equation formulation known as the apparent-surface-charge (ASC) method or polarizable-continuum model (PCM). In many numerical implementations of the ASC/PCM model, one forces the integral equation to be satisfied exactly at a set of discrete points on the boundary. We demonstrate in this paper that this approach to discretization, known as point collocation, is significantly less accurate than an alternative approach known as qualocation. Furthermore, the qualocation method offers this improvement in accuracy without increasing simulation time. Numerical examples demonstrate that the electrostatic part of the solvation free energy, when calculated using the collocation and qualocation methods, can differ significantly; for a polypeptide, the answers can differ by as much as 10 kcal/mol (approximately 4% of the total electrostatic contribution to solvation). The applicability of the qualocation discretization to other integral-equation formulations is also discussed, and two equivalences between integral-equation methods are derived.

  8. Influence of single particle orbital sets and configuration selection on multideterminant wavefunctions in quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Raymond C.; Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, California 94550; Morales, Miguel A., E-mail: moralessilva2@llnl.gov

    2015-06-21

    Multideterminant wavefunctions, while having a long history in quantum chemistry, are increasingly being used in highly accurate quantum Monte Carlo calculations. Since the accuracy of QMC is ultimately limited by the quality of the trial wavefunction, multi-Slater determinant wavefunctions offer an attractive alternative to Slater-Jastrow and more sophisticated wavefunction ansatz for several reasons. They can be efficiently calculated, straightforwardly optimized, and systematically improved by increasing the number of included determinants. In spite of their potential, however, the convergence properties of multi-Slater determinant wavefunctions with respect to orbital set choice and excited determinant selection are poorly understood, which hinders the application of these wavefunctions to large systems and solids. In this paper, by performing QMC calculations on the equilibrium and stretched carbon dimer, we find that convergence of the recovered correlation energy with respect to the number of determinants can depend quite strongly on the basis set and determinant selection method, especially where there is strong correlation. We demonstrate that properly chosen orbital sets and determinant selection techniques from quantum chemistry methods can dramatically reduce the required number of determinants (and thus the computational cost) to reach a given accuracy, which we argue shows a clear need for an automatic QMC-only method for selecting determinants and generating optimal orbital sets.

  9. Glass ceramic ZERODUR enabling nanometer precision

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Kunisch, Clemens; Nieder, Johannes; Westerhoff, Thomas

    2014-03-01

    The IC lithography roadmap foresees manufacturing of devices with critical dimensions of < 20 nm. Single-digit-nanometer overlay specifications demand nanometer positioning accuracy, which in turn requires sub-nanometer position measurement accuracy. The glass ceramic ZERODUR® is a well-established material for critical components of microlithography wafer steppers and is offered with an extremely low coefficient of thermal expansion (CTE), the tightest tolerance available on the market. SCHOTT is continuously improving its manufacturing processes and its methods for measuring and characterizing the CTE behavior of ZERODUR® to fulfill the ever tighter CTE specifications for wafer stepper components. In this paper we present the ZERODUR® lithography roadmap for CTE metrology and tolerance. Additionally, simulation calculations based on a physical model are presented that predict the long-term CTE behavior of ZERODUR® components in order to optimize the dimensional stability of precision positioning devices. CTE data of several low-thermal-expansion materials are compared with regard to their temperature dependence between -50°C and +100°C. ZERODUR® TAILORED 22°C fulfills the tight CTE tolerance of +/- 10 ppb/K within the broadest temperature interval of all materials in this investigation. The data presented explicitly demonstrate the capability of ZERODUR® to enable the nanometer precision required for future generations of lithography equipment and processes.

  10. Non-invasive diagnosis of liver fibrosis in chronic hepatitis C

    PubMed Central

    Schiavon, Leonardo de Lucca; Narciso-Schiavon, Janaína Luz; de Carvalho-Filho, Roberto José

    2014-01-01

    Assessment of liver fibrosis in chronic hepatitis C virus (HCV) infection is considered a relevant part of patient care and key for decision making. Although liver biopsy has been considered the gold standard for staging liver fibrosis, it is an invasive technique and subject to sampling errors and significant intra- and inter-observer variability. Over the last decade, several noninvasive markers were proposed for liver fibrosis diagnosis in chronic HCV infection, with variable performance. Besides the clear advantage of being noninvasive, a more objective interpretation of test results may overcome the mentioned intra- and inter-observer variability of liver biopsy. In addition, these tests can theoretically offer a more accurate view of fibrogenic events occurring in the entire liver with the advantage of providing frequent fibrosis evaluation without additional risk. However, in general, these tests show low accuracy in discriminating between intermediate stages of fibrosis and may be influenced by several hepatic and extra-hepatic conditions. These methods are either serum markers (usually combined in a mathematical model) or imaging modalities that can be used separately or combined in algorithms to improve accuracy. In this review we will discuss the different noninvasive methods that are currently available for the evaluation of liver fibrosis in chronic hepatitis C, their advantages, limitations and application in clinical practice. PMID:24659877

  11. Profile of Instrumentation Laboratory's HemosIL® AcuStar HIT-Ab(PF4-H) assay for diagnosis of heparin-induced thrombocytopenia.

    PubMed

    Nagler, Michael; Cuker, Adam

    2017-05-01

    Immunoassays play an essential role in the diagnosis of heparin-induced thrombocytopenia (HIT). The objective of this article is to review HemosIL® AcuStar HIT-Ab(PF4-H) (Instrumentation Laboratory, Bedford, MA, USA), a new chemiluminescent immunoassay for HIT. Areas covered: The authors searched the published literature for evaluation studies of HemosIL® AcuStar HIT-Ab(PF4-H) and sought information from the manufacturer. In this paper, the authors discuss the analytical principle and technical aspects of the assay; describe its diagnostic performance in validation studies; report on its reproducibility, cost-effectiveness, and regulatory status; and discuss the implications of the assay on clinical practice and means of integrating it in diagnostic pathways. HemosIL® AcuStar HIT-Ab(PF4-H) is compared with other rapid assays and widely used enzyme-linked immunoassays for the diagnosis of HIT. Expert commentary: HemosIL® AcuStar HIT-Ab(PF4-H) is automatable, can be performed 24 h per day, offers a rapid turnaround time, and appears to have favorable diagnostic accuracy, particularly at thresholds above that listed in the label. These advantages could lead to improved patient outcomes through rapid provision of results at the point of care, enhancing the accuracy of initial diagnosis.

  12. Design considerations for a novel MRI compatible manipulator for prostate cryoablation.

    PubMed

    Abdelaziz, S; Esteveny, L; Renaud, P; Bayle, B; Barbé, L; De Mathelin, M; Gangi, A

    2011-11-01

    Prostate carcinoma is a commonly diagnosed cancer in men. Nonsurgical treatment of early stage prostate cancer is an important alternative. The use of MRI for tumor cryoablation is of particular interest: it offers lower morbidity compared with other localized techniques. However, the current manual procedure is very time-consuming and has limited accuracy. A novel robotic assistant is therefore designed for prostate cancer cryotherapy treatment under MRI guidance to improve efficiency and accuracy. Gesture definition was achieved based on actions of interventional radiologists at University Hospital of Strasbourg. A transperineal approach with a semiautonomous prostatic cryoprobe localization procedure was developed where the needle axis is automatically positioned before manual insertion. The workflow was developed simultaneously with the robotic assistant used for needle positioning. The design and the associated workflow of an original wire-driven manipulator were developed. The device is compact and has a low weight: its overall dimensions in the scanner are 100 × 100 × 40 mm with a weight of 120 g. Very good MRI compatibility was demonstrated. A novel cryoablation procedure based on the use of a robotic assistant is proposed. The device design was presented with demonstration of MRI compatibility. Further developments include automatic registration and in vivo experimental testing.

  13. A novel harmony search-K means hybrid algorithm for clustering gene expression data

    PubMed Central

    Nazeer, KA Abdul; Sebastian, MP; Kumar, SD Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns from large quantities of expression data that in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications. But the original k-means algorithm has several drawbacks. It is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find out near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms. PMID:23390351
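
    A compact sketch of the hybrid idea, under the simplifying assumption that harmony-search candidates are full centroid sets scored by within-cluster SSE and that the best harmony seeds a final k-means refinement (the published HSKH algorithm differs in detail):

```python
# Simplified harmony-search seeding for k-means: candidate centroid sets are
# "harmonies", scored by within-cluster SSE; the best one initializes a final
# k-means refinement. A sketch of the hybrid idea, not the authors' code.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=5)
k, hms, iters, hmcr = 3, 10, 200, 0.9
rng = np.random.default_rng(5)

def sse(centroids):
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).sum()

# harmony memory: each entry is a set of k candidate centroids
memory = [X[rng.choice(len(X), k, replace=False)] for _ in range(hms)]
scores = [sse(m) for m in memory]

for _ in range(iters):
    new = np.empty((k, X.shape[1]))
    for j in range(k):
        if rng.random() < hmcr:                       # draw from memory
            new[j] = memory[rng.integers(hms)][j]
            new[j] += rng.normal(0, 0.1, X.shape[1])  # pitch adjustment
        else:                                         # random consideration
            new[j] = X[rng.integers(len(X))]
    worst = int(np.argmax(scores))
    if sse(new) < scores[worst]:                      # replace worst harmony
        memory[worst], scores[worst] = new, sse(new)

best = memory[int(np.argmin(scores))]
km = KMeans(n_clusters=k, init=best, n_init=1).fit(X)
print("final SSE:", round(km.inertia_, 2))
```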

  14. A novel harmony search-K means hybrid algorithm for clustering gene expression data.

    PubMed

    Nazeer, Ka Abdul; Sebastian, Mp; Kumar, Sd Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns from large quantities of expression data that in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications. But the original k-means algorithm has several drawbacks. It is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find out near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms.

  15. "FLIPSY"—A New Solvent-Suppression Sequence for Nonexchanging Solutes Offering Improved Integral Accuracy Relative to 1D NOESY

    NASA Astrophysics Data System (ADS)

    Neuhaus, David; Ismail, Ismail M.; Chung, Chun-Wa

    A new method of solvent suppression is described, based on presaturation in combination with volume selection; the name "FLIPSY" is proposed for this sequence. A low-flip-angle pulse is used for excitation, immediately followed by two 180° pulses, each of which is independently phase cycled through Exorcycle. The phase-cycled inversion pulses achieve volume selection in a way similar to the widely used 1D NOESY sequence, thereby largely eliminating any residual "hump" signal from the solvent. The two 180° pulses combine to produce a net 360° rotation for z magnetization and either a 180° or a 360° rotation for transverse magnetization, depending on the step in the phase cycle. This allows the overall flip angle of the sequence to be controlled by adjusting the length of the initial excitation pulse. It is demonstrated that this property allows one to choose freely a suitable compromise between signal strength and integral accuracy when using FLIPSY, just as when using single-pulse excitation. Such a choice cannot be made when using 1D NOESY, since the effective flip angle in that experiment is always 90°. The application of FLIPSY to recording LC-NMR spectra is demonstrated.

  16. Theoretical and Experimental Studies of Epidermal Heat Flux Sensors for Measurements of Core Body Temperature.

    PubMed

    Zhang, Yihui; Webb, Richard Chad; Luo, Hongying; Xue, Yeguang; Kurniawan, Jonas; Cho, Nam Heon; Krishnan, Siddharth; Li, Yuhang; Huang, Yonggang; Rogers, John A

    2016-01-07

    Long-term, continuous measurement of core body temperature is of high interest, due to the widespread use of this parameter as a key biomedical signal for clinical judgment and patient management. Traditional approaches rely on devices or instruments in rigid and planar forms, not readily amenable to intimate or conformable integration with soft, curvilinear, time-dynamic, surfaces of the skin. Here, materials and mechanics designs for differential temperature sensors are presented which can attach softly and reversibly onto the skin surface, and also sustain high levels of deformation (e.g., bending, twisting, and stretching). A theoretical approach, together with a modeling algorithm, yields core body temperature from multiple differential measurements from temperature sensors separated by different effective distances from the skin. The sensitivity, accuracy, and response time are analyzed by finite element analyses (FEA) to provide guidelines for relationships between sensor design and performance. Four sets of experiments on multiple devices with different dimensions and under different convection conditions illustrate the key features of the technology and the analysis approach. Finally, results indicate that thermally insulating materials with cellular structures offer advantages in reducing the response time and increasing the accuracy, while improving the mechanics and breathability. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Pulse Transit Time Based Continuous Cuffless Blood Pressure Estimation: A New Extension and A Comprehensive Evaluation.

    PubMed

    Ding, Xiaorong; Yan, Bryan P; Zhang, Yuan-Ting; Liu, Jing; Zhao, Ni; Tsang, Hon Ki

    2017-09-14

    Cuffless techniques enable continuous blood pressure (BP) measurement in an unobtrusive manner and thus have the potential to revolutionize conventional cuff-based approaches. This study extends the pulse transit time (PTT) based cuffless BP measurement method by introducing a new indicator, the photoplethysmogram (PPG) intensity ratio (PIR). The performance of the models with PTT and PIR was comprehensively evaluated in comparison with six models based on PTT alone. The validation was conducted on 33 subjects with and without hypertension, at rest, under various maneuvers with induced BP changes, and over an extended calibration interval. The results showed that, compared to the PTT-only models, the proposed methods achieved better accuracy for each subject group at rest and over a 24-hour calibration interval. Although the BP estimation errors under dynamic maneuvers and over the extended calibration interval increased significantly for all methods, the proposed methods still outperformed the compared methods in the latter situation. These findings suggest that an additional BP-related indicator beyond PTT has added value for improving the accuracy of cuffless BP measurement. This study also offers insights for future research on cuffless BP measurement for tracking dynamic BP changes and over extended periods of time.
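
    A generic per-subject calibration sketch using both indicators is shown below; the 1/PTT² and linear PIR terms are a common functional form in this literature and are assumptions here, not necessarily the exact model of the paper.

```python
# Generic calibration sketch for cuffless BP estimation from PTT and PIR: fit
# per-subject regression coefficients on calibration beats, then estimate BP
# for new beats. All values are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(6)
ptt = rng.uniform(0.15, 0.30, 100)           # s, calibration beats
pir = rng.uniform(1.5, 2.5, 100)
sbp = 2.0 / ptt**2 + 20.0 * pir + rng.normal(0, 2, 100)  # synthetic truth

A = np.column_stack([1.0 / ptt**2, pir, np.ones_like(ptt)])
coef, *_ = np.linalg.lstsq(A, sbp, rcond=None)   # per-subject calibration

def estimate_sbp(ptt_new, pir_new):
    return coef @ np.array([1.0 / ptt_new**2, pir_new, 1.0])

print("estimated SBP:", round(float(estimate_sbp(0.22, 2.0)), 1), "mmHg")
```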

  18. Non-Invasive Monitoring of Cardiac Output in Critical Care Medicine.

    PubMed

    Nguyen, Lee S; Squara, Pierre

    2017-01-01

    Critically ill patients require close hemodynamic monitoring to titrate treatment on a regular basis. It allows administering fluids with parsimony and adjusting inotropes and vasoactive drugs when necessary. Although invasive monitoring is considered the reference method, non-invasive monitoring presents the obvious advantage of being associated with fewer complications, at the expense of accuracy, precision, and step-response change. A great many methods and devices are now used around the world, and this article focuses on several of them, providing a brief review of the underlying physical principles and an analysis of validation articles. Reviewed methods include electrical bioimpedance and bioreactance, respiratory-derived cardiac output (CO) monitoring techniques, pulse wave transit time, ultrasound CO monitoring, multimodal algorithmic estimation, and inductance thoracocardiography. The quality criteria against which devices were reviewed included: accuracy (closeness of agreement between a measured value and the true value of the measured quantity), precision (closeness of agreement between replicate measurements on the same or similar objects under specified conditions), and step-response change (delay between a physiological change and its indication). Our conclusion is that the offer of non-invasive monitoring has improved in the past few years, even though further developments are needed to provide clinicians with devices sufficiently accurate for routine use as an alternative to invasive monitoring devices.

  19. Bilinear Convolutional Neural Networks for Fine-grained Visual Recognition.

    PubMed

    Lin, Tsung-Yu; RoyChowdhury, Aruni; Maji, Subhransu

    2017-07-04

    We present a simple and effective architecture for fine-grained recognition called Bilinear Convolutional Neural Networks (B-CNNs). These networks represent an image as a pooled outer product of features derived from two CNNs and capture localized feature interactions in a translationally invariant manner. B-CNNs are related to orderless texture representations built on deep features but can be trained in an end-to-end manner. Our most accurate model obtains 84.1%, 79.4%, 84.5% and 91.3% per-image accuracy on the Caltech-UCSD Birds [66], NABirds [63], FGVC Aircraft [42], and Stanford Cars [33] datasets, respectively, and runs at 30 frames per second on an NVIDIA Titan X GPU. We then present a systematic analysis of these networks and show that (1) the bilinear features are highly redundant and can be reduced by an order of magnitude in size without significant loss in accuracy, (2) they are also effective for other image classification tasks such as texture and scene recognition, and (3) they can be trained from scratch on the ImageNet dataset, offering consistent improvements over the baseline architecture. Finally, we present visualizations of these models on various datasets using top activations of neural units and gradient-based inversion techniques. The source code for the complete system is available at http://vis-www.cs.umass.edu/bcnn.
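
    The core bilinear pooling operation is easy to state: the descriptor is the outer product of two feature maps summed over spatial locations, followed by signed square-root and L2 normalization. A numpy sketch with random stand-in feature maps:

```python
# Core of bilinear pooling as described: the image descriptor is the outer
# product of two CNN feature maps, sum-pooled over locations, then
# signed-sqrt and L2 normalized. Feature maps here are random stand-ins.
import numpy as np

rng = np.random.default_rng(7)
fa = rng.normal(size=(28 * 28, 512))    # stream A: locations x channels
fb = rng.normal(size=(28 * 28, 256))    # stream B

bilinear = fa.T @ fb                    # sum of outer products over locations
x = bilinear.flatten()
x = np.sign(x) * np.sqrt(np.abs(x))     # signed square-root normalization
x = x / np.linalg.norm(x)               # L2 normalization
print("descriptor length:", x.size)     # 512 * 256 = 131072
```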

  20. Practical approach to subject-specific estimation of knee joint contact force.

    PubMed

    Knarr, Brian A; Higginson, Jill S

    2015-08-20

    Compressive forces experienced at the knee can significantly contribute to cartilage degeneration. Musculoskeletal models enable predictions of the internal forces experienced at the knee, but validation is often not possible, as experimental data detailing loading at the knee joint is limited. Recently available data reporting compressive knee force through direct measurement using instrumented total knee replacements offer a unique opportunity to evaluate the accuracy of models. Previous studies have highlighted the importance of subject-specificity in increasing the accuracy of model predictions; however, these techniques may be unrealistic outside of a research setting. Therefore, the goal of our work was to identify a practical approach for accurate prediction of tibiofemoral knee contact force (KCF). Four methods for prediction of knee contact force were compared: (1) standard static optimization, (2) uniform muscle coordination weighting, (3) subject-specific muscle coordination weighting and (4) subject-specific strength adjustments. Walking trials for three subjects with instrumented knee replacements were used to evaluate the accuracy of model predictions. Predictions utilizing subject-specific muscle coordination weighting yielded the best agreement with experimental data; however this method required in vivo data for weighting factor calibration. Including subject-specific strength adjustments improved models' predictions compared to standard static optimization, with errors in peak KCF less than 0.5 body weight for all subjects. Overall, combining clinical assessments of muscle strength with standard tools available in the OpenSim software package, such as inverse kinematics and static optimization, appears to be a practical method for predicting joint contact force that can be implemented for many applications. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Practical approach to subject-specific estimation of knee joint contact force

    PubMed Central

    Knarr, Brian A.; Higginson, Jill S.

    2015-01-01

    Compressive forces experienced at the knee can significantly contribute to cartilage degeneration. Musculoskeletal models enable predictions of the internal forces experienced at the knee, but validation is often not possible, as experimental data detailing loading at the knee joint is limited. Recently available data reporting compressive knee force through direct measurement using instrumented total knee replacements offer a unique opportunity to evaluate the accuracy of models. Previous studies have highlighted the importance of subject-specificity in increasing the accuracy of model predictions; however, these techniques may be unrealistic outside of a research setting. Therefore, the goal of our work was to identify a practical approach for accurate prediction of tibiofemoral knee contact force (KCF). Four methods for prediction of knee contact force were compared: (1) standard static optimization, (2) uniform muscle coordination weighting, (3) subject-specific muscle coordination weighting and (4) subject-specific strength adjustments. Walking trials for three subjects with instrumented knee replacements were used to evaluate the accuracy of model predictions. Predictions utilizing subject-specific muscle coordination weighting yielded the best agreement with experimental data, however this method required in vivo data for weighting factor calibration. Including subject-specific strength adjustments improved models’ predictions compared to standard static optimization, with errors in peak KCF less than 0.5 body weight for all subjects. Overall, combining clinical assessments of muscle strength with standard tools available in the OpenSim software package, such as inverse kinematics and static optimization, appears to be a practical method for predicting joint contact force that can be implemented for many applications. PMID:25952546

  2. Experimental Evaluation of UWB Indoor Positioning for Sport Postures

    PubMed Central

    Defraye, Jense; Steendam, Heidi; Gerlo, Joeri; De Clercq, Dirk; De Poorter, Eli

    2018-01-01

    Radio frequency (RF)-based indoor positioning systems (IPSs) use wireless technologies (including Wi-Fi, Zigbee, Bluetooth, and ultra-wide band (UWB)) to estimate the location of persons in areas where no Global Positioning System (GPS) reception is available, for example in indoor stadiums or sports halls. Of the above-mentioned forms of radio frequency (RF) technology, UWB is considered one of the most accurate approaches because it can provide positioning estimates with centimeter-level accuracy. However, it is not yet known whether UWB can also offer such accurate position estimates during strenuous dynamic activities in which moves are characterized by fast changes in direction and velocity. To answer this question, this paper investigates the capabilities of UWB indoor localization systems for tracking athletes during their complex (and most of the time unpredictable) movements. To this end, we analyze the impact of on-body tag placement locations and human movement patterns on localization accuracy and communication reliability. Moreover, two localization algorithms (particle filter and Kalman filter) with different optimizations (bias removal, non-line-of-sight (NLoS) detection, and path determination) are implemented. It is shown that although the optimal choice of optimization depends on the type of movement patterns, some of the improvements can reduce the localization error by up to 31%. Overall, depending on the selected optimization and on-body tag placement, our algorithms show good results in terms of positioning accuracy, with average errors in position estimates of 20 cm. This makes UWB a suitable approach for tracking dynamic athletic activities. PMID:29315267
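
    One of the two evaluated filters is easy to sketch: a constant-velocity Kalman filter over noisy 2-D position fixes. The noise levels and trajectory below are made up for illustration.

```python
# Minimal constant-velocity Kalman filter of the kind evaluated for the UWB
# traces: state = (x, y, vx, vy), observations = noisy 2-D position fixes.
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1.0]])
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0.0]])
Q = 0.05 * np.eye(4)                     # process noise
R = 0.04 * np.eye(2)                     # ~20 cm measurement std

x, P = np.zeros(4), np.eye(4)
rng = np.random.default_rng(8)
for t in range(50):
    truth = np.array([t * dt, 0.5 * t * dt])      # hypothetical straight run
    z = truth + rng.normal(0, 0.2, 2)             # UWB position fix
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (z - H @ x)                       # update
    P = (np.eye(4) - K @ H) @ P

print("final position estimate:", x[:2].round(2))
```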

  3. A new computer aided diagnosis system for evaluation of chronic liver disease with ultrasound shear wave elastography imaging.

    PubMed

    Gatos, Ilias; Tsantis, Stavros; Spiliopoulos, Stavros; Karnabatidis, Dimitris; Theotokas, Ioannis; Zoumpoulis, Pavlos; Loupas, Thanasis; Hazle, John D; Kagadis, George C

    2016-03-01

    Classify chronic liver disease (CLD) from ultrasound shear-wave elastography (SWE) imaging by means of a computer aided diagnosis (CAD) system. The proposed algorithm employs an inverse mapping technique (red-green-blue to stiffness) to quantify 85 SWE images (54 healthy and 31 with CLD). Texture analysis is then applied involving the automatic calculation of 330 first and second order textural features from every transformed stiffness value map to determine functional features that characterize liver elasticity and describe liver condition for all available stages. Consequently, a stepwise regression analysis feature selection procedure is utilized toward a reduced feature subset that is fed into the support vector machines (SVMs) classification algorithm in the design of the CAD system. With regard to the mapping procedure accuracy, the stiffness map values had an average difference of 0.01 ± 0.001 kPa compared to the quantification results derived from the color-box provided by the built-in software of the ultrasound system. Highest classification accuracy from the SVM model was 87.0% with sensitivity and specificity values of 83.3% and 89.1%, respectively. Receiver operating characteristic curves analysis gave an area under the curve value of 0.85 with [0.77-0.89] confidence interval. The proposed CAD system employing color to stiffness mapping and classification algorithms offered superior results, comparing the already published clinical studies. It could prove to be of value to physicians improving the diagnostic accuracy of CLD and can be employed as a second opinion tool for avoiding unnecessary invasive procedures.

  4. Lidar detection of underwater objects using a neuro-SVM-based architecture.

    PubMed

    Mitra, Vikramjit; Wang, Chia-Jiu; Banerjee, Satarupa

    2006-05-01

    This paper presents a neural network architecture using a support vector machine (SVM) as an inference engine (IE) for classification of light detection and ranging (Lidar) data. Lidar data gives a sequence of laser backscatter intensities obtained from laser shots generated from an airborne object at various altitudes above the earth surface. Lidar data is pre-filtered to remove high frequency noise. As the Lidar shots are taken from above the earth surface, it has some air backscatter information, which is of no importance for detecting underwater objects. Because of these, the air backscatter information is eliminated from the data and a segment of this data is subsequently selected to extract features for classification. This is then encoded using linear predictive coding (LPC) and polynomial approximation. The coefficients thus generated are used as inputs to the two branches of a parallel neural architecture. The decisions obtained from the two branches are vector multiplied and the result is fed to an SVM-based IE that presents the final inference. Two parallel neural architectures using multilayer perception (MLP) and hybrid radial basis function (HRBF) are considered in this paper. The proposed structure fits the Lidar data classification task well due to the inherent classification efficiency of neural networks and accurate decision-making capability of SVM. A Bayesian classifier and a quadratic classifier were considered for the Lidar data classification task but they failed to offer high prediction accuracy. Furthermore, a single-layered artificial neural network (ANN) classifier was also considered and it failed to offer good accuracy. The parallel ANN architecture proposed in this paper offers high prediction accuracy (98.9%) and is found to be the most suitable architecture for the proposed task of Lidar data classification.

  5. Dimensional accuracy of jaw scans performed on alginate impressions or stone models: A practice-oriented study.

    PubMed

    Vogel, Annike B; Kilic, Fatih; Schmidt, Falko; Rübel, Sebastian; Lapatki, Bernd G

    2015-07-01

    Digital jaw models offer more extensive possibilities for analysis than casts and make it easier to share and archive relevant information. The aim of this study was to compare the dimensional accuracy of scans performed on alginate impressions and on stone models to reference scans performed on underlying resin models. Precision spheres 5 mm in diameter were occlusally fitted to the sites of the first premolars and first molars on a pair of jaw models fabricated from resin. A structured-light scanner was used for digitization. Once the two reference models had been scanned, alginate impressions were taken and scanned after no later than 1 h. A third series of scans was performed on type III stone models derived from the impressions. All scans were analyzed by performing five repeated measurements to determine the distances between the various sphere centers. Compared to the reference scans, the stone-model scans were larger by a mean of 73.6 µm (maxilla) or 65.2 µm (mandible). The impression scans were only larger by 7.7 µm (maxilla) or smaller by 0.7 µm (mandible). Median standard deviations over the five repeated measurements of 1.0 µm for the reference scans, 2.35 µm for the impression scans, and 2.0 µm for the stone-model scans indicate that the values measured in this study were adequately reproducible. Alginate impressions can be suitably digitized by structured-light scanning and offer considerably better dimensional accuracy than stone models. Apparently, however, both impression scans and stone-model scans can offer adequate precision for orthodontic purposes. The main issue of impression scans (which is incomplete representation of model surfaces) is being systematically explored in a follow-up study.

  6. A Method for Improving the Pose Accuracy of a Robot Manipulator Based on Multi-Sensor Combined Measurement and Data Fusion

    PubMed Central

    Liu, Bailing; Zhang, Fumin; Qu, Xinghua

    2015-01-01

    An improvement method for the pose accuracy of a robot manipulator by using a multiple-sensor combination measuring system (MCMS) is presented. It is composed of a visual sensor, an angle sensor and a serial robot. The visual sensor is utilized to measure the position of the manipulator in real time, and the angle sensor is rigidly attached to the manipulator to obtain its orientation. To exploit the higher accuracy of the multiple sensors, two efficient data fusion approaches, the Kalman filter (KF) and the multi-sensor optimal information fusion algorithm (MOIFA), are used to fuse the position and orientation of the manipulator. The simulation and experimental results show that the pose accuracy of the robot manipulator is improved dramatically, by 38%∼78%, with the multi-sensor data fusion. Compared with reported pose accuracy improvement methods, the primary advantage of this method is that it does not require solving complex kinematic parameter equations, adding motion constraints, or the complicated procedures of traditional vision-based methods. It makes robot processing more autonomous and accurate. To improve the reliability and accuracy of the pose measurements of the MCMS, the visual sensor repeatability was experimentally studied. An optimal range of 1 × 0.8 × 1 ∼ 2 × 0.8 × 1 m in the field of view (FOV) is indicated by the experimental results. PMID:25850067
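
    In the spirit of the optimal information fusion step (a toy sketch, not the MOIFA algorithm itself), two independent unbiased pose measurements can be combined by inverse-variance weighting, the minimum-variance linear fusion:

```python
# Toy fusion of two position measurements by inverse-variance weighting (the
# minimum-variance linear fusion for independent unbiased sensors). Values
# are illustrative only, not from the paper.
import numpy as np

pos_vision = np.array([100.2, 50.1, 30.4])   # mm, visual sensor
var_vision = 0.20**2
pos_angle = np.array([100.6, 49.8, 30.1])    # mm, angle-sensor pipeline
var_angle = 0.50**2

w_v = (1 / var_vision) / (1 / var_vision + 1 / var_angle)
fused = w_v * pos_vision + (1 - w_v) * pos_angle
var_fused = 1 / (1 / var_vision + 1 / var_angle)

print("fused position:", fused.round(3), " std:", round(var_fused**0.5, 3))
```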

  7. Effect of recent popularity on heat-conduction based recommendation models

    NASA Astrophysics Data System (ADS)

    Li, Wen-Jun; Dong, Qiang; Shi, Yang-Bo; Fu, Yan; He, Jia-Lin

    2017-05-01

    Accuracy and diversity are two important measures in evaluating the performance of recommender systems. It has been demonstrated that the recommendation model inspired by the heat conduction process has high diversity yet low accuracy. Many variants have been introduced to improve the accuracy while keeping high diversity, most of which regard the current node-degree of an item as its popularity. However in this way, a few outdated items of large degree may be recommended to an enormous number of users. In this paper, we take the recent popularity (recently increased item degrees) into account in the heat-conduction based methods, and propose accordingly the improved recommendation models. Experimental results on two benchmark data sets show that the accuracy can be largely improved while keeping the high diversity compared with the original models.
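
    The underlying heat-conduction scoring is compact enough to sketch on a toy user-item matrix (the recent-popularity weighting proposed above would replace the plain item degrees in this baseline):

```python
# Bare-bones heat-conduction (HeatS) recommendation on a user-item matrix A:
# the item-to-item kernel is W[a, b] = (1/deg(item a)) * sum_u
# A[u, a] * A[u, b] / deg(user u); scores = W @ profile. Tiny toy data; the
# paper's variants would weight by recently increased item degrees instead.
import numpy as np

A = np.array([[1, 1, 0, 0],    # users x items interaction matrix
              [1, 0, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1.0]])
ku = A.sum(axis=1)             # user degrees
ki = A.sum(axis=0)             # item degrees

W = (A / ku[:, None]).T @ A / ki[:, None]   # heat-conduction kernel
scores = W @ A[0]                           # scores for user 0's profile
scores[A[0] > 0] = -np.inf                  # mask already-collected items
print("recommend item:", int(np.argmax(scores)))
```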

  8. 13 Years of TOPEX/POSEIDON Precision Orbit Determination and the 10-fold Improvement in Expected Orbit Accuracy

    NASA Technical Reports Server (NTRS)

    Lemoine, F. G.; Zelensky, N. P.; Luthcke, S. B.; Rowlands, D. D.; Beckley, B. D.; Klosko, S. M.

    2006-01-01

    Launched in the summer of 1992, TOPEX/POSEIDON (T/P) was a joint mission between NASA and the Centre National d'Etudes Spatiales (CNES), the French Space Agency, to make precise radar altimeter measurements of the ocean surface. After 13 remarkably successful years of mapping the ocean surface, T/P lost its ability to maneuver and was decommissioned in January 2006. T/P revolutionized the study of the Earth's oceans by vastly exceeding pre-launch estimates of surface height accuracy recoverable from radar altimeter measurements. The precision orbit lies at the heart of the altimeter measurement, providing the reference frame from which the radar altimeter measurements are made. The expected quality of orbit knowledge had limited the measurement accuracy expectations of past altimeter missions, and still remains a major component in the error budget of all altimeter missions. This paper describes critical improvements made to the T/P orbit time series over the 13 years of precise orbit determination (POD) provided by the GSFC Space Geodesy Laboratory. The POD improvements from the pre-launch T/P expectation of radial orbit accuracy and mission requirement of 13 cm to an expected accuracy of about 1.5 cm with today's latest orbits will be discussed. The latest orbits, with 1.5 cm RMS radial accuracy, represent a significant improvement over the 2.0 cm accuracy orbits currently available on the T/P Geophysical Data Record (GDR) altimeter product.

  9. Optical vector network analyzer with improved accuracy based on polarization modulation and polarization pulling.

    PubMed

    Li, Wei; Liu, Jian Guo; Zhu, Ning Hua

    2015-04-15

    We report a novel optical vector network analyzer (OVNA) with improved accuracy based on polarization modulation and stimulated Brillouin scattering (SBS) assisted polarization pulling. The beating between adjacent higher-order optical sidebands which are generated because of the nonlinearity of an electro-optic modulator (EOM) introduces considerable error to the OVNA. In our scheme, the measurement error is significantly reduced by removing the even-order optical sidebands using polarization discrimination. The proposed approach is theoretically analyzed and experimentally verified. The experimental results show that the accuracy of the OVNA is greatly improved compared to a conventional OVNA.

  10. Flight assessment of the onboard propulsion system model for the Performance Seeking Control algorithm on an F-15 aircraft

    NASA Technical Reports Server (NTRS)

    Orme, John S.; Schkolnik, Gerard S.

    1995-01-01

    Performance Seeking Control (PSC), an onboard, adaptive, real-time optimization algorithm, relies upon an onboard propulsion system model. Flight results illustrated propulsion system performance improvements as calculated by the model. These improvements were subject to uncertainty arising from modeling error. Thus to quantify uncertainty in the PSC performance improvements, modeling accuracy must be assessed. A flight test approach to verify PSC-predicted increases in thrust (FNP) and absolute levels of fan stall margin is developed and applied to flight test data. Application of the excess thrust technique shows that increases of FNP agree to within 3 percent of full-scale measurements for most conditions. Accuracy to these levels is significant because uncertainty bands may now be applied to the performance improvements provided by PSC. Assessment of PSC fan stall margin modeling accuracy was completed with analysis of in-flight stall tests. Results indicate that the model overestimates the stall margin by between 5 to 10 percent. Because PSC achieves performance gains by using available stall margin, this overestimation may represent performance improvements to be recovered with increased modeling accuracy. Assessment of thrust and stall margin modeling accuracy provides a critical piece for a comprehensive understanding of PSC's capabilities and limitations.

  11. Response Latency as a Predictor of the Accuracy of Children's Reports

    ERIC Educational Resources Information Center

    Ackerman, Rakefet; Koriat, Asher

    2011-01-01

    Researchers have explored various diagnostic cues to the accuracy of information provided by child eyewitnesses. Previous studies indicated that children's confidence in their reports predicts the relative accuracy of these reports, and that the confidence-accuracy relationship generally improves as children grow older. In this study, we examined…

  12. Text Mining Genotype-Phenotype Relationships from Biomedical Literature for Database Curation and Precision Medicine.

    PubMed

    Singhal, Ayush; Simmons, Michael; Lu, Zhiyong

    2016-11-01

    The practice of precision medicine will ultimately require databases of genes and mutations for healthcare providers to reference in order to understand the clinical implications of each patient's genetic makeup. Although the highest quality databases require manual curation, text mining tools can facilitate the curation process, increasing accuracy, coverage, and productivity. However, to date there are no available text mining tools that offer high-accuracy performance for extracting disease-gene-variant triplets from biomedical literature. In this paper we propose a high-performance machine learning approach to automate the extraction of disease-gene-variant triplets from biomedical literature. Our approach is unique because we identify the genes and protein products associated with each mutation from not just the local text content, but from a global context as well (from the Internet and from all literature in PubMed). Our approach also incorporates protein sequence validation and disease association using a novel text-mining-based machine learning approach. We extract disease-gene-variant triplets from all abstracts in PubMed related to a set of ten important diseases (breast cancer, prostate cancer, pancreatic cancer, lung cancer, acute myeloid leukemia, Alzheimer's disease, hemochromatosis, age-related macular degeneration (AMD), diabetes mellitus, and cystic fibrosis). We then evaluate our approach in two ways: (1) a direct comparison with the state of the art using benchmark datasets; (2) a validation study comparing the results of our approach with entries in a popular human-curated database (UniProt) for each of the previously mentioned diseases. In the benchmark comparison, our full approach achieves a 28% improvement in F1-measure (from 0.62 to 0.79) over the state-of-the-art results. For the validation study with UniProt Knowledgebase (KB), we present a thorough analysis of the results and errors. Across all diseases, our approach returned 272 triplets (disease-gene-variant) that overlapped with entries in UniProt and 5,384 triplets without overlap in UniProt. Analysis of the overlapping triplets and of a stratified sample of the non-overlapping triplets revealed accuracies of 93% and 80% for the respective categories (cumulative accuracy, 77%). We conclude that our process represents an important and broadly applicable improvement to the state of the art for curation of disease-gene-variant relationships.

  13. [Accuracy improvement of spectral classification of crop using microwave backscatter data].

    PubMed

    Jia, Kun; Li, Qiang-Zi; Tian, Yi-Chen; Wu, Bing-Fang; Zhang, Fei-Fei; Meng, Ji-Hua

    2011-02-01

    In the present study, the use of VV-polarization microwave backscatter data to improve the accuracy of spectral crop classification is investigated. Classification accuracies obtained with different classifiers on fused HJ satellite multi-spectral and Envisat ASAR VV backscatter data are compared. The results indicate that the fused data take full advantage of the spectral information in the HJ multi-spectral data and the structural sensitivity of the ASAR VV polarization data. Fusion enlarges the spectral differences among classes and improves crop classification accuracy: classification accuracy with the fused data is about 5 percent higher than with the HJ data alone. Furthermore, ASAR VV polarization data are sensitive to non-agrarian areas within planted fields, so including them in the classification helps to distinguish field borders effectively. Combining VV polarization data with multi-spectral data for crop classification thus broadens the application of satellite data and shows potential for wider use in agriculture.
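
    A minimal sketch of the fusion idea, stacking multi-spectral bands with a SAR VV backscatter channel into one feature vector per pixel before classification; synthetic arrays and a generic random forest stand in for the HJ/ASAR data and the classifiers compared in the paper.

    ```python
    # Illustrative sketch of pixel-level data fusion for crop classification:
    # optical (multi-spectral) bands are stacked with a SAR VV backscatter
    # channel into one feature vector per pixel. Synthetic data and a generic
    # random forest stand in for the HJ/ASAR data and classifiers in the paper;
    # with real imagery, the fused features are what drive the reported gain.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    n_pixels = 1000
    optical = rng.normal(size=(n_pixels, 4))    # four multi-spectral bands
    vv = rng.normal(size=(n_pixels, 1))         # VV backscatter channel
    labels = rng.integers(0, 3, size=n_pixels)  # three crop classes

    fused = np.hstack([optical, vv])            # the fusion step
    X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print("fused-data accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    ```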

  14. A novel rotational matrix and translation vector algorithm: geometric accuracy for augmented reality in oral and maxillofacial surgeries.

    PubMed

    Murugesan, Yahini Prabha; Alsadoon, Abeer; Manoranjan, Paul; Prasad, P W C

    2018-06-01

    Augmented reality-based surgeries have not been successfully implemented in oral and maxillofacial areas due to limitations in geometric accuracy and image registration. This paper aims to improve the accuracy and depth perception of the augmented video. The proposed system consists of a rotational matrix and translation vector algorithm to reduce the geometric error, and it improves depth perception by including 2 stereo cameras and a translucent mirror in the operating room. The results on the mandible/maxilla area show that the new algorithm improves the video accuracy by 0.30-0.40 mm (in terms of overlay error) and raises the processing rate to 10-13 frames/s, compared with 7-10 frames/s in existing systems. The depth perception increased by 90-100 mm. The proposed system concentrates on reducing the geometric error. Thus, this study provides an acceptable range of accuracy with a shorter operating time, giving surgeons a smooth surgical flow. Copyright © 2018 John Wiley & Sons, Ltd.
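
    A minimal sketch of the core geometric step such systems rely on, mapping model points into the camera frame with a rotation matrix R and translation vector t and then measuring overlay error; the points, pose, and noise level below are invented for illustration and do not reproduce the paper's registration pipeline.

    ```python
    # Minimal sketch of the rigid-transform step at the heart of such
    # registration: p_camera = R @ p_model + t, followed by an overlay-error
    # check. The points, pose, and noise below are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)

    def rigid_transform(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
        """Apply rotation R (3x3) and translation t (3,) to Nx3 points."""
        return points @ R.T + t

    theta = np.deg2rad(5.0)  # small rotation about the z-axis
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    t = np.array([0.10, -0.20, 0.05])  # translation in mm, hypothetical

    model_pts = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
    # Simulated tracked points: the true pose plus ~0.3 mm of measurement noise.
    tracked_pts = rigid_transform(model_pts, R, t) + 0.3 * rng.standard_normal((3, 3))

    # Overlay error: mean Euclidean distance between projected and tracked points.
    overlay = rigid_transform(model_pts, R, t)
    err = np.linalg.norm(overlay - tracked_pts, axis=1).mean()
    print(f"mean overlay error: {err:.2f} mm")
    ```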

  15. Enhancing Visual Perception and Motor Accuracy among School Children through a Mindfulness and Compassion Program

    PubMed Central

    Tarrasch, Ricardo; Margalit-Shalom, Lilach; Berger, Rony

    2017-01-01

    The present study assessed the effects of the mindfulness/compassion cultivating program “Call to Care-Israel” on performance in visual perception (VP) and motor accuracy, as well as on anxiety levels and self-reported mindfulness, among 4th and 5th grade students. One hundred and thirty-eight children participated in the program for 24 weekly sessions, while 78 children served as controls. Repeated-measures ANOVAs yielded significant interactions between time of measurement and group for VP, motor accuracy, reported mindfulness, and anxiety. Post hoc tests revealed significant improvements in the four aforementioned measures in the experimental group only. In addition, significant correlations were obtained between the improvement in motor accuracy and both the reduction in anxiety and the increase in mindfulness. Since VP and motor accuracy are basic skills associated with quantifiable academic characteristics, such as reading and mathematical abilities, the results may suggest that mindfulness practice can improve academic achievement. PMID:28286492

  16. Roles of an Upper-Body Compression Garment on Athletic Performances.

    PubMed

    Hooper, David R; Dulkis, Lexie L; Secola, Paul J; Holtzum, Gabriel; Harper, Sean P; Kalkowski, Ryan J; Comstock, Brett A; Szivak, Tunde K; Flanagan, Shawn D; Looney, David P; DuPont, William H; Maresh, Carl M; Volek, Jeff S; Culley, Kevin P; Kraemer, William J

    2015-09-01

    Compression garments (CGs) have previously been shown to enhance proprioception; however, this benefit has not previously been shown to transfer to improved performance in sports skills. The purpose of this study was to assess whether enhanced proprioception and comfort can be manifested in improved sports performance of high-level athletes. Eleven Division I collegiate pitchers (age: 21.0 ± 2.9 years; height: 181.0 ± 4.6 cm; weight: 89.0 ± 13.0 kg; body fat: 12.0 ± 4.1%) and 10 Division I collegiate golfers (age: 20.0 ± 1.3 years; height: 178.1 ± 3.9 cm; weight: 76.4 ± 8.3 kg; body fat: 11.8 ± 2.6%) participated in the study. A counterbalanced within-group design was used. Subjects performed the respective baseball or golf protocol wearing either a typical noncompressive (NC) garment or the experimental CG. Golfers participated in an assessment of driving distance and accuracy, as well as approach shot, chipping, and putting accuracy. Pitchers were assessed for fastball accuracy and velocity. In pitchers, there was a significant (p ≤ 0.05) improvement in fastball accuracy (NC: 0.30 ± 0.04 vs. CG: 0.21 ± 0.07 cm). There were no differences in pitching velocity. In golfers, there were significant (p ≤ 0.05) improvements in driving accuracy (NC: 86.7 ± 30.6 vs. CG: 68.9 ± 18.5 feet), as well as approach shot accuracy (NC: 26.6 ± 11.9 vs. CG: 22.1 ± 8.2 feet) and chipping accuracy (NC: 2.9 ± 0.6 vs. CG: 2.3 ± 0.6 inch). There was also a significant (p ≤ 0.05) increase in comfort for the golfers (NC: 3.7 ± 0.8 vs. CG: 4.5 ± 1.0). These results demonstrate that comfort and performance can be improved with the use of CGs in high-level athletes, most likely mediated by improved proprioceptive cues during upper-body movements.

  17. Eight-Week Battle Rope Training Improves Multiple Physical Fitness Dimensions and Shooting Accuracy in Collegiate Basketball Players.

    PubMed

    Chen, Wei-Han; Wu, Huey-June; Lo, Shin-Liang; Chen, Hui; Yang, Wen-Wen; Huang, Chen-Fu; Liu, Chiang

    2018-05-28

    Chen, WH, Wu, HJ, Lo, SL, Chen, H, Yang, WW, Huang, CF, and Liu, C. Eight-week battle rope training improves multiple physical fitness dimensions and shooting accuracy in collegiate basketball players. J Strength Cond Res XX(X): 000-000, 2018-Basketball players must possess optimally developed physical fitness in multiple dimensions and shooting accuracy. This study investigated whether battle rope (BR) training enhances multiple physical fitness dimensions, including aerobic capacity (AC), upper-body anaerobic power (AnP), upper-body and lower-body power, agility, and core muscle endurance, as well as shooting accuracy, in basketball players, and compared its effects with those of regular training (shuttle run [SR]). Thirty male collegiate basketball players were randomly assigned to the BR or SR group (n = 15 per group). Both groups received 8 weeks of interval training, 3 sessions per week; the protocols consisted of the same number of sets, exercise time, and rest interval time. The BR group exhibited significant improvements in AC (Progressive Aerobic Cardiovascular Endurance Run laps: 17.6%), upper-body AnP (mean power: 7.3%), upper-body power (basketball chest pass speed: 4.8%), lower-body power (jump height: 2.6%), core muscle endurance (flexion: 37.0%, extension: 22.8%, and right side bridge: 23.0%), and shooting accuracy (free throw: 14.0% and dynamic shooting: 36.2%). The SR group, however, exhibited improvements only in AC (12.0%) and upper-body power (3.8%) (p < 0.05). The BR group demonstrated larger pre-post improvements in upper-body AnP (fatigue index) and dynamic shooting accuracy than the SR group did (p < 0.05). The BR group also showed higher post-training performance in upper-body AnP (mean power and fatigue index) than the SR group did (p < 0.05). Thus, BR training effectively improves multiple physical fitness dimensions and shooting accuracy in collegiate basketball players.

  18. Computerized analysis and duplication of mandibular motion.

    PubMed

    Knap, F J; Abler, J H; Richardson, B L

    1975-05-01

    A new digital system has been devised to analyze and duplicate jaw motion. The arrangement of the electronic system offers a range of versatility that includes graphic as well as numerical data analysis. The duplicator linkage is identical to the sensor linkage, which, together with an accurate model transfer system, results in an encouraging level of accuracy in jaw-motion duplication. The data collected from normal subjects should offer new knowledge of the normal motions of the mandible as well as establish a reference for comparison with abnormal masticatory function.

  19. Scribed transparency microplates mounted on a modified standard microplate.

    PubMed

    Cheong, Brandon Huey-Ping; Chua, Wei Seong; Liew, Oi Wah; Ng, Tuck Wah

    2014-08-01

    The immense cost-effectiveness of using transparencies as analyte-handling implements in microplate instrumentation offers the possibility of application even in resource-limited laboratories. In this work, a standard microplate was adapted to serve as the permanent base for disposable scribed transparencies. The approach is shown to mitigate evaporation, which can affect assay accuracy when analytes must be incubated for some time. It also offers assurance against fluorescence measurement errors due to cross-talk between samples in adjacent wells. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Aiming Instruments On The Space Station

    NASA Technical Reports Server (NTRS)

    Estus, Jay M.; Laskin, Robert; Lin, Yu-Hwan

    1989-01-01

    Report discusses capabilities and requirements for aiming scientific instruments carried aboard proposed Space Station. Addresses two issues: whether system envisioned for pointing instruments at celestial targets offers sufficiently low jitter, high accuracy, and high stability to meet scientific requirements; and whether it can do so even in presence of many vibrations and other disturbances on Space Station. Salient conclusion of study is recommendation to develop pointing-actuator system including mechanical/fluid base isolator underneath reactionless gimbal subsystem. This kind of system offers greatest promise of high performance, cost-effectiveness, and modularity for job at hand.
