Science.gov

Sample records for analysis increases accuracy

  1. Increasing Accuracy in Environmental Measurements

    NASA Astrophysics Data System (ADS)

    Jacksier, Tracey; Fernandes, Adelino; Matthew, Matt; Lehmann, Horst

    2016-04-01

    Human activity is increasing the concentrations of greenhouse gases (GHG) in the atmosphere, which results in temperature increases. High precision is a key requirement of atmospheric measurements to study the global carbon cycle and its effect on climate change. Natural air containing stable isotopes is used in GHG monitoring to calibrate analytical equipment. This presentation will examine the natural air and isotopic mixture preparation process, for both molecular and isotopic concentrations, for a range of components and delta values. The role of precisely characterized source material will be presented. Analysis of individual cylinders within multiple batches will be presented to demonstrate the ability to dynamically fill multiple cylinders with identical compositions without isotopic fractionation. Additional emphasis will focus on the ability to adjust isotope ratios to more closely bracket sample types without relying on combusting naturally occurring materials, thereby improving analytical accuracy.

  2. Reporting Data with "Over-the-Counter" Data Analysis Supports Increases Educators' Analysis Accuracy

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant

    2013-01-01

    There is extensive research on the benefits of making data-informed decisions to improve learning, but these benefits rely on the data being effectively interpreted. Despite educators' above-average intellect and education levels, there is evidence many educators routinely misinterpret student data. Data analysis problems persist even at…

  3. Joint Analysis of Psychiatric Disorders Increases Accuracy of Risk Prediction for Schizophrenia, Bipolar Disorder, and Major Depressive Disorder

    PubMed Central

    Maier, Robert; Moser, Gerhard; Chen, Guo-Bo; Ripke, Stephan; Absher, Devin; Agartz, Ingrid; Akil, Huda; Amin, Farooq; Andreassen, Ole A.; Anjorin, Adebayo; Anney, Richard; Arking, Dan E.; Asherson, Philip; Azevedo, Maria H.; Backlund, Lena; Badner, Judith A.; Bailey, Anthony J.; Banaschewski, Tobias; Barchas, Jack D.; Barnes, Michael R.; Barrett, Thomas B.; Bass, Nicholas; Battaglia, Agatino; Bauer, Michael; Bayés, Mònica; Bellivier, Frank; Bergen, Sarah E.; Berrettini, Wade; Betancur, Catalina; Bettecken, Thomas; Biederman, Joseph; Binder, Elisabeth B.; Black, Donald W.; Blackwood, Douglas H.R.; Bloss, Cinnamon S.; Boehnke, Michael; Boomsma, Dorret I.; Breen, Gerome; Breuer, René; Bruggeman, Richard; Buccola, Nancy G.; Buitelaar, Jan K.; Bunney, William E.; Buxbaum, Joseph D.; Byerley, William F.; Caesar, Sian; Cahn, Wiepke; Cantor, Rita M.; Casas, Miguel; Chakravarti, Aravinda; Chambert, Kimberly; Choudhury, Khalid; Cichon, Sven; Cloninger, C. Robert; Collier, David A.; Cook, Edwin H.; Coon, Hilary; Cormand, Bru; Cormican, Paul; Corvin, Aiden; Coryell, William H.; Craddock, Nicholas; Craig, David W.; Craig, Ian W.; Crosbie, Jennifer; Cuccaro, Michael L.; Curtis, David; Czamara, Darina; Daly, Mark J.; Datta, Susmita; Dawson, Geraldine; Day, Richard; De Geus, Eco J.; Degenhardt, Franziska; Devlin, Bernie; Djurovic, Srdjan; Donohoe, Gary J.; Doyle, Alysa E.; Duan, Jubao; Dudbridge, Frank; Duketis, Eftichia; Ebstein, Richard P.; Edenberg, Howard J.; Elia, Josephine; Ennis, Sean; Etain, Bruno; Fanous, Ayman; Faraone, Stephen V.; Farmer, Anne E.; Ferrier, I. Nicol; Flickinger, Matthew; Fombonne, Eric; Foroud, Tatiana; Frank, Josef; Franke, Barbara; Fraser, Christine; Freedman, Robert; Freimer, Nelson B.; Freitag, Christine M.; Friedl, Marion; Frisén, Louise; Gallagher, Louise; Gejman, Pablo V.; Georgieva, Lyudmila; Gershon, Elliot S.; Geschwind, Daniel H.; Giegling, Ina; Gill, Michael; Gordon, Scott D.; Gordon-Smith, Katherine; Green, Elaine K.; Greenwood, Tiffany A.; Grice, Dorothy E.; Gross, Magdalena; Grozeva, Detelina; Guan, Weihua; Gurling, Hugh; De Haan, Lieuwe; Haines, Jonathan L.; Hakonarson, Hakon; Hallmayer, Joachim; Hamilton, Steven P.; Hamshere, Marian L.; Hansen, Thomas F.; Hartmann, Annette M.; Hautzinger, Martin; Heath, Andrew C.; Henders, Anjali K.; Herms, Stefan; Hickie, Ian B.; Hipolito, Maria; Hoefels, Susanne; Holmans, Peter A.; Holsboer, Florian; Hoogendijk, Witte J.; Hottenga, Jouke-Jan; Hultman, Christina M.; Hus, Vanessa; Ingason, Andrés; Ising, Marcus; Jamain, Stéphane; Jones, Ian; Jones, Lisa; Kähler, Anna K.; Kahn, René S.; Kandaswamy, Radhika; Keller, Matthew C.; Kelsoe, John R.; Kendler, Kenneth S.; Kennedy, James L.; Kenny, Elaine; Kent, Lindsey; Kim, Yunjung; Kirov, George K.; Klauck, Sabine M.; Klei, Lambertus; Knowles, James A.; Kohli, Martin A.; Koller, Daniel L.; Konte, Bettina; Korszun, Ania; Krabbendam, Lydia; Krasucki, Robert; Kuntsi, Jonna; Kwan, Phoenix; Landén, Mikael; Långström, Niklas; Lathrop, Mark; Lawrence, Jacob; Lawson, William B.; Leboyer, Marion; Ledbetter, David H.; Lee, Phil H.; Lencz, Todd; Lesch, Klaus-Peter; Levinson, Douglas F.; Lewis, Cathryn M.; Li, Jun; Lichtenstein, Paul; Lieberman, Jeffrey A.; Lin, Dan-Yu; Linszen, Don H.; Liu, Chunyu; Lohoff, Falk W.; Loo, Sandra K.; Lord, Catherine; Lowe, Jennifer K.; Lucae, Susanne; MacIntyre, Donald J.; Madden, Pamela A.F.; Maestrini, Elena; Magnusson, Patrik K.E.; Mahon, Pamela B.; Maier, Wolfgang; Malhotra, Anil K.; Mane, Shrikant M.; Martin, Christa L.; Martin, Nicholas G.; Mattheisen, 
Manuel; Matthews, Keith; Mattingsdal, Morten; McCarroll, Steven A.; McGhee, Kevin A.; McGough, James J.; McGrath, Patrick J.; McGuffin, Peter; McInnis, Melvin G.; McIntosh, Andrew; McKinney, Rebecca; McLean, Alan W.; McMahon, Francis J.; McMahon, William M.; McQuillin, Andrew; Medeiros, Helena; Medland, Sarah E.; Meier, Sandra; Melle, Ingrid; Meng, Fan; Meyer, Jobst; Middeldorp, Christel M.; Middleton, Lefkos; Milanova, Vihra; Miranda, Ana; Monaco, Anthony P.; Montgomery, Grant W.; Moran, Jennifer L.; Moreno-De-Luca, Daniel; Morken, Gunnar; Morris, Derek W.; Morrow, Eric M.; Moskvina, Valentina; Mowry, Bryan J.; Muglia, Pierandrea; Mühleisen, Thomas W.; Müller-Myhsok, Bertram; Murtha, Michael; Myers, Richard M.; Myin-Germeys, Inez; Neale, Benjamin M.; Nelson, Stan F.; Nievergelt, Caroline M.; Nikolov, Ivan; Nimgaonkar, Vishwajit; Nolen, Willem A.; Nöthen, Markus M.; Nurnberger, John I.; Nwulia, Evaristus A.; Nyholt, Dale R.; O’Donovan, Michael C.; O’Dushlaine, Colm; Oades, Robert D.; Olincy, Ann; Oliveira, Guiomar; Olsen, Line; Ophoff, Roel A.; Osby, Urban; Owen, Michael J.; Palotie, Aarno; Parr, Jeremy R.; Paterson, Andrew D.; Pato, Carlos N.; Pato, Michele T.; Penninx, Brenda W.; Pergadia, Michele L.; Pericak-Vance, Margaret A.; Perlis, Roy H.; Pickard, Benjamin S.; Pimm, Jonathan; Piven, Joseph; Posthuma, Danielle; Potash, James B.; Poustka, Fritz; Propping, Peter; Purcell, Shaun M.; Puri, Vinay; Quested, Digby J.; Quinn, Emma M.; Ramos-Quiroga, Josep Antoni; Rasmussen, Henrik B.; Raychaudhuri, Soumya; Rehnström, Karola; Reif, Andreas; Ribasés, Marta; Rice, John P.; Rietschel, Marcella; Ripke, Stephan; Roeder, Kathryn; Roeyers, Herbert; Rossin, Lizzy; Rothenberger, Aribert; Rouleau, Guy; Ruderfer, Douglas; Rujescu, Dan; Sanders, Alan R.; Sanders, Stephan J.; Santangelo, Susan L.; Schachar, Russell; Schalling, Martin; Schatzberg, Alan F.; Scheftner, William A.; Schellenberg, Gerard D.; Scherer, Stephen W.; Schork, Nicholas J.; Schulze, Thomas G.; Schumacher, Johannes; Schwarz, Markus; Scolnick, Edward; Scott, Laura J.; Sergeant, Joseph A.; Shi, Jianxin; Shilling, Paul D.; Shyn, Stanley I.; Silverman, Jeremy M.; Sklar, Pamela; Slager, Susan L.; Smalley, Susan L.; Smit, Johannes H.; Smith, Erin N.; Smoller, Jordan W.; Sonuga-Barke, Edmund J.S.; St Clair, David; State, Matthew; Steffens, Michael; Steinhausen, Hans-Christoph; Strauss, John S.; Strohmaier, Jana; Stroup, T. Scott; Sullivan, Patrick F.; Sutcliffe, James; Szatmari, Peter; Szelinger, Szabocls; Thapar, Anita; Thirumalai, Srinivasa; Thompson, Robert C.; Todorov, Alexandre A.; Tozzi, Federica; Treutlein, Jens; Tzeng, Jung-Ying; Uhr, Manfred; van den Oord, Edwin J.C.G.; Van Grootheest, Gerard; Van Os, Jim; Vicente, Astrid M.; Vieland, Veronica J.; Vincent, John B.; Visscher, Peter M.; Walsh, Christopher A.; Wassink, Thomas H.; Watson, Stanley J.; Weiss, Lauren A.; Weissman, Myrna M.; Werge, Thomas; Wienker, Thomas F.; Wiersma, Durk; Wijsman, Ellen M.; Willemsen, Gonneke; Williams, Nigel; Willsey, A. Jeremy; Witt, Stephanie H.; Wray, Naomi R.; Xu, Wei; Young, Allan H.; Yu, Timothy W.; Zammit, Stanley; Zandi, Peter P.; Zhang, Peng; Zitman, Frans G.; Zöllner, Sebastian; Coryell, William; Potash, James B.; Scheftner, William A.; Shi, Jianxin; Weissman, Myrna M.; Hultman, Christina M.; Landén, Mikael; Levinson, Douglas F.; Kendler, Kenneth S.; Smoller, Jordan W.; Wray, Naomi R.; Lee, S. Hong

    2015-01-01

    Genetic risk prediction has several potential applications in medical research and clinical practice and could be used, for example, to stratify a heterogeneous population of patients by their predicted genetic risk. However, for polygenic traits, such as psychiatric disorders, the accuracy of risk prediction is low. Here we use a multivariate linear mixed model and apply multi-trait genomic best linear unbiased prediction for genetic risk prediction. This method exploits correlations between disorders and simultaneously evaluates individual risk for each disorder. We show that the multivariate approach significantly increases the prediction accuracy for schizophrenia, bipolar disorder, and major depressive disorder in the discovery as well as in independent validation datasets. By grouping SNPs based on genome annotation and fitting multiple random effects, we show that the prediction accuracy could be further improved. The gain in prediction accuracy of the multivariate approach is equivalent to an increase in sample size of 34% for schizophrenia, 68% for bipolar disorder, and 76% for major depressive disorders using single trait models. Because our approach can be readily applied to any number of GWAS datasets of correlated traits, it is a flexible and powerful tool to maximize prediction accuracy. With current sample size, risk predictors are not useful in a clinical setting but already are a valuable research tool, for example in experimental designs comparing cases with high and low polygenic risk. PMID:25640677

  4. Joint analysis of psychiatric disorders increases accuracy of risk prediction for schizophrenia, bipolar disorder, and major depressive disorder.

    PubMed

    Maier, Robert; Moser, Gerhard; Chen, Guo-Bo; Ripke, Stephan; Coryell, William; Potash, James B; Scheftner, William A; Shi, Jianxin; Weissman, Myrna M; Hultman, Christina M; Landén, Mikael; Levinson, Douglas F; Kendler, Kenneth S; Smoller, Jordan W; Wray, Naomi R; Lee, S Hong

    2015-02-01

    Genetic risk prediction has several potential applications in medical research and clinical practice and could be used, for example, to stratify a heterogeneous population of patients by their predicted genetic risk. However, for polygenic traits, such as psychiatric disorders, the accuracy of risk prediction is low. Here we use a multivariate linear mixed model and apply multi-trait genomic best linear unbiased prediction for genetic risk prediction. This method exploits correlations between disorders and simultaneously evaluates individual risk for each disorder. We show that the multivariate approach significantly increases the prediction accuracy for schizophrenia, bipolar disorder, and major depressive disorder in the discovery as well as in independent validation datasets. By grouping SNPs based on genome annotation and fitting multiple random effects, we show that the prediction accuracy could be further improved. The gain in prediction accuracy of the multivariate approach is equivalent to an increase in sample size of 34% for schizophrenia, 68% for bipolar disorder, and 76% for major depressive disorders using single trait models. Because our approach can be readily applied to any number of GWAS datasets of correlated traits, it is a flexible and powerful tool to maximize prediction accuracy. With current sample size, risk predictors are not useful in a clinical setting but already are a valuable research tool, for example in experimental designs comparing cases with high and low polygenic risk. PMID:25640677
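    A minimal NumPy sketch of the multi-trait genomic BLUP idea the study builds on, using synthetic genotypes and assumed genetic (G0) and residual (R0) covariance matrices; the real analysis estimates these quantities from large GWAS datasets, so nothing here reproduces the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 100, 500                        # individuals, SNPs (synthetic sizes)

# Genomic relationship matrix K from standardized genotypes (hypothetical data).
geno = rng.binomial(2, 0.3, size=(n, m)).astype(float)
Z = (geno - geno.mean(axis=0)) / geno.std(axis=0)
K = Z @ Z.T / m

# Assumed genetic (G0) and residual (R0) covariances between two disorders.
G0 = np.array([[0.5, 0.3],
               [0.3, 0.4]])
R0 = np.array([[0.5, 0.0],
               [0.0, 0.6]])

# Stacked, mean-centred phenotypes for trait 1 then trait 2 (synthetic).
y = rng.normal(size=2 * n)

# Multi-trait GBLUP with known variance components:
#   E[g | y] = kron(G0, K) @ inv(kron(G0, K) + kron(R0, I)) @ y
V_g = np.kron(G0, K)
g_hat = V_g @ np.linalg.solve(V_g + np.kron(R0, np.eye(n)), y)
print("predicted genetic values for trait 1 (first 5 individuals):",
      np.round(g_hat[:5], 3))
```

    The gain over a single-trait predictor comes from the off-diagonal elements of G0: phenotypes of a genetically correlated disorder contribute information about the target disorder's genetic values.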

  5. Analysis of spatial variability of near-surface soil moisture to increase rainfall-runoff modelling accuracy in SW Hungary

    NASA Astrophysics Data System (ADS)

    Hegedüs, P.; Czigány, S.; Pirkhoffer, E.; Balatonyi, L.; Hickey, R.

    2015-04-01

    Between September 5, 2008 and September 5, 2009, near-surface soil moisture time series were collected in the northern part of a 1.7 km2 watershed in SW Hungary at 14 monitoring locations using a portable TDR-300 soil moisture sensor. The objectives of this study are to increase the accuracy of soil moisture measurement at the watershed scale, to improve flood forecasting accuracy, and to optimize soil moisture sensor density. According to our results, in 10 of 13 cases a strong correlation exists between the measured soil moisture data of Station 5 and all other monitoring stations; Station 5 is therefore considered representative of the entire watershed. Logically, the selection of the location of the representative measurement point(s) is essential for obtaining representative and accurate soil moisture values for a given watershed. This could be done by (i) employing a larger number of monitoring stations during the exploratory phase of the monitoring, (ii) mapping soil physical properties at the watershed scale, and (iii) running cross-relational statistical analyses on the obtained data. Our findings indicate that increasing the number of soil moisture data points available for interpolation increases the accuracy of watershed-scale soil moisture estimation. The data set used for interpolation (and for estimating mean antecedent soil moisture values) could be improved (i.e., given a higher number of data points) by selecting points with properties similar to the measurement points from the DEM and soil databases. By using a higher number of data points for interpolation, both interpolation accuracy and spatial resolution increased for the measured soil moisture values for the Pósa Valley.
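    A minimal sketch of the cross-correlation screening step described above, run on a hypothetical stations-by-dates soil moisture array; the 0.8 "strong correlation" threshold and the demo data are assumptions, not values from the study:

```python
import numpy as np

def representative_station(series: np.ndarray, threshold: float = 0.8) -> int:
    """Pick the station whose soil-moisture series agrees most strongly with
    all others, a candidate 'representative' point for the watershed.

    series: shape (n_stations, n_observations); threshold is an assumption."""
    r = np.corrcoef(series)            # pairwise Pearson correlations
    np.fill_diagonal(r, np.nan)        # ignore self-correlation
    mean_r = np.nanmean(r, axis=1)     # average agreement with the other stations
    best = int(np.argmax(mean_r))
    n_strong = int(np.sum(r[best] >= threshold))
    print(f"Station {best}: correlation >= {threshold} with {n_strong} other stations")
    return best

# Hypothetical demo: 14 stations, 50 sampling dates (volumetric water content, %).
rng = np.random.default_rng(0)
demo = 25 + rng.normal(0, 1, size=(14, 50)).cumsum(axis=1)
representative_station(demo)
```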

  6. Increasing Accuracy in Computed Inviscid Boundary Conditions

    NASA Technical Reports Server (NTRS)

    Dyson, Roger

    2004-01-01

    A technique has been devised to increase the accuracy of computational simulations of flows of inviscid fluids by increasing the accuracy with which surface boundary conditions are represented. This technique is expected to be especially beneficial for computational aeroacoustics, wherein it enables proper accounting, not only for acoustic waves, but also for vorticity and entropy waves, at surfaces. Heretofore, inviscid nonlinear surface boundary conditions have been limited to third-order accuracy in time for stationary surfaces and to first-order accuracy in time for moving surfaces. For steady-state calculations, it may be possible to achieve higher accuracy in space, but high accuracy in time is needed for efficient simulation of multiscale unsteady flow phenomena. The present technique is the first surface treatment that provides the needed high accuracy through proper accounting of higher-order time derivatives. The present technique is founded on a method known in the art as the Hermitian modified solution approximation (MESA) scheme. This is because high time accuracy at a surface depends upon, among other things, correction of the spatial cross-derivatives of flow variables, and many of these cross-derivatives are included explicitly on the computational grid in the MESA scheme. (Alternatively, a related method other than the MESA scheme could be used, as long as the method involves consistent application of the effects of the cross-derivatives.) While the mathematical derivation of the present technique is too lengthy and complex to fit within the space available for this article, the technique itself can be characterized in relatively simple terms: the technique involves correction of surface-normal spatial pressure derivatives at a boundary surface to satisfy the governing equations and the boundary conditions, thereby achieving arbitrarily high orders of time accuracy in special cases. The boundary conditions can now include a potentially infinite number

  7. Final Technical Report: Increasing Prediction Accuracy.

    SciTech Connect

    King, Bruce Hardison; Hansen, Clifford; Stein, Joshua

    2015-12-01

    PV performance models are used to quantify the value of PV plants in a given location. They combine the performance characteristics of the system, the measured or predicted irradiance and weather at a site, and the system configuration and design into a prediction of the amount of energy that will be produced by a PV system. These predictions must be as accurate as possible in order for finance charges to be minimized. Higher accuracy equals lower project risk. The Increasing Prediction Accuracy project at Sandia focuses on quantifying and reducing uncertainties in PV system performance models.

  8. Do saccharide doped PAGAT dosimeters increase accuracy?

    NASA Astrophysics Data System (ADS)

    Berndt, B.; Skyt, P. S.; Holloway, L.; Hill, R.; Sankar, A.; De Deene, Y.

    2015-01-01

    To improve the dosimetric accuracy of normoxic polyacrylamide gelatin (PAGAT) gel dosimeters, the addition of saccharides (glucose and sucrose) has been suggested. An increase in R2-response sensitivity upon irradiation will result in smaller uncertainties in the derived dose if all other uncertainties are conserved. However, temperature variations during the magnetic resonance scanning of polymer gels result in one of the highest contributions to dosimetric uncertainties. The purpose of this project was to study the dose sensitivity against the temperature sensitivity. The overall dose uncertainty of PAGAT gel dosimeters with different concentrations of saccharides (0, 10 and 20%) was investigated. For high concentrations of glucose or sucrose, a clear improvement of the dose sensitivity was observed. For doses up to 6 Gy, the overall dose uncertainty was reduced up to 0.3 Gy for all saccharide loaded gels compared to PAGAT gel. Higher concentrations of glucose and sucrose deteriorate the accuracy of PAGAT dosimeters for doses above 9 Gy.

  9. Classification Accuracy Increase Using Multisensor Data Fusion

    NASA Astrophysics Data System (ADS)

    Makarau, A.; Palubinskas, G.; Reinartz, P.

    2011-09-01

    The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, Quickbird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison to full spectral imaging. These limitations may lead to confusion of materials such as different roofs, pavements, roads, etc., and therefore may result in wrong interpretation and use of classification products. Employing hyperspectral data is another solution, but its low spatial resolution (compared to multispectral data) restricts its usage for many applications. A further improvement can be achieved by fusion of multisensor data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach for fusing very high resolution SAR and multispectral data for automatic classification in urban areas. Single-polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, consisting of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework allows a relevant way of combining multisource data following consensus theory. The classification is not influenced by the limitations of dimensionality, and the calculation complexity primarily depends on the dimensionality-reduction step. Fusion of single-polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy. A comparison to classification results of WorldView-2 multispectral data (8 spectral bands) is provided, along with a numerical evaluation of the method in comparison to
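    The INFOFUSE chain itself is not reproduced here, but the underlying pattern of fusing sensors by stacking per-pixel features before classification can be sketched as follows; the feature arrays, class labels, and the random-forest classifier are stand-ins for the actual TerraSAR-X/WorldView-2/DSM processing and the Bayesian or neural-network aggregation used in the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_pixels = 2000

# Hypothetical per-pixel features from three sources (placeholders, not real data).
msi = rng.normal(size=(n_pixels, 8))        # 8 WorldView-2-like spectral bands
sar = rng.normal(size=(n_pixels, 1))        # single-polarization SAR backscatter
dsm = rng.normal(size=(n_pixels, 1))        # surface-height feature

# Synthetic labels that depend on all three sources, so fusion has something to gain.
signal = 0.5 * msi[:, 0] + sar[:, 0] + dsm[:, 0]
labels = np.digitize(signal, bins=[-1.0, 0.0, 1.0])   # 4 synthetic urban classes

fused = np.hstack([msi, sar, dsm])          # feature stacking = simplest fusion

for name, X in [("multispectral only", msi), ("fused MSI+SAR+DSM", fused)]:
    Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
    print(name, "accuracy:", round(accuracy_score(yte, clf.predict(Xte)), 3))
```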

  10. Combining Multiple Gyroscope Outputs for Increased Accuracy

    NASA Technical Reports Server (NTRS)

    Bayard, David S.

    2003-01-01

    A proposed method of processing the outputs of multiple gyroscopes to increase the accuracy of rate (that is, angular-velocity) readings has been developed theoretically and demonstrated by computer simulation. Although the method is applicable, in principle, to any gyroscopes, it is intended especially for application to gyroscopes that are parts of microelectromechanical systems (MEMS). The method is based on the concept that the collective performance of multiple, relatively inexpensive, nominally identical devices can be better than that of one of the devices considered by itself. The method would make it possible to synthesize the readings of a single, more accurate gyroscope (a virtual gyroscope) from the outputs of a large number of microscopic gyroscopes fabricated together on a single MEMS chip. The big advantage would be that the combination of the MEMS gyroscope array and the processing circuitry needed to implement the method would be smaller, lighter in weight, and less power-hungry, relative to a conventional gyroscope of equal accuracy. The method (see figure) is one of combining and filtering the digitized outputs of multiple gyroscopes to obtain minimum-variance estimates of rate. In the combining-and-filtering operations, measurement data from the gyroscopes would be weighted and smoothed with respect to each other according to the gain matrix of a minimum- variance filter. According to Kalman-filter theory, the gain matrix of the minimum-variance filter is uniquely specified by the filter covariance, which propagates according to a matrix Riccati equation. The present method incorporates an exact analytical solution of this equation.
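    The full Kalman-filter/Riccati machinery cannot be condensed into an abstract, but the simplest static case, fusing N independent gyro readings by inverse-variance weighting, already shows why an array of cheap, noisy devices can outperform a single one. A hedged sketch with made-up noise levels, not the algorithm described above:

```python
import numpy as np

def fuse_rates(readings: np.ndarray, variances: np.ndarray) -> tuple[float, float]:
    """Minimum-variance (inverse-variance weighted) estimate of angular rate
    from N independent gyroscopes, and the variance of the fused estimate."""
    w = 1.0 / variances
    rate = float(np.sum(w * readings) / np.sum(w))
    fused_var = float(1.0 / np.sum(w))      # never larger than min(variances)
    return rate, fused_var

rng = np.random.default_rng(2)
true_rate = 0.10       # rad/s, hypothetical
sigma = 0.05           # per-device noise standard deviation (assumption)
n = 64                 # number of gyros on one MEMS chip (assumption)

readings = true_rate + rng.normal(0.0, sigma, size=n)
rate, var = fuse_rates(readings, np.full(n, sigma**2))
print(f"fused rate = {rate:.4f} rad/s, std = {var**0.5:.4f} (single gyro: {sigma})")
```

    With N identical gyros the fused standard deviation shrinks roughly by a factor of sqrt(N); the minimum-variance filter described in the abstract generalizes this static weighting to time-varying rate estimates.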

  11. Diagnostic accuracy of deep vein thrombosis is increased by analysis using combined optimal cut-off values of postoperative plasma D-dimer levels

    PubMed Central

    Jiang, Yong; Li, Jie; Liu, Yang; Zhang, Weiguo

    2016-01-01

    The present study aimed to evaluate the accuracy of analysis using optimal cut-off values of plasma D-dimer levels in the diagnosis of deep vein thrombosis (DVT). A total of 175 orthopedic patients with DVT and 162 patients without DVT were included in the study. Ultrasonic color Doppler imaging was performed on lower limb veins prior to and following orthopedic surgery in order to determine the types of orthopedic conditions that were present. An enzyme-linked fluorescent assay was performed to detect the expression levels of D-dimer in plasma, and receiver operating characteristic analysis was performed to predict the occurrence of DVT on the basis of the expression levels of D-dimer. After surgery, the expression levels of D-dimer in the plasma of DVT patients were significantly higher in comparison with those in orthopedic patients without DVT (P<0.05). When the patients were divided into subgroups according to the underlying orthopedic condition, the expression levels of D-dimer in the plasma of each subgroup were higher 1 day after orthopedic surgery in comparison to those prior to surgery (P<0.05). The diagnostic accuracy achieved using combined optimal cut-off values at 1 and 3 days post-surgery was significantly higher than the accuracy when using a single optimal cut-off value (P<0.05). In conclusion, detection of D-dimer expression levels at 1 day post-orthopedic surgery may be important in predicting DVT. In addition, the diagnostic accuracy of DVT is significantly increased by analysis using combined optimal cut-off values of D-dimer plasma expression levels. PMID:27168793
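    As an illustration of how a single "optimal cut-off" is usually derived from a receiver operating characteristic curve (for example via Youden's index), here is a hedged sketch on synthetic D-dimer values; the study's combined day-1/day-3 cut-off rule is not reproduced:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
# Hypothetical post-operative plasma D-dimer levels (arbitrary units), not study data.
ddimer_no_dvt = rng.lognormal(mean=0.5, sigma=0.4, size=162)
ddimer_dvt = rng.lognormal(mean=1.1, sigma=0.4, size=175)

values = np.concatenate([ddimer_no_dvt, ddimer_dvt])
labels = np.concatenate([np.zeros(162), np.ones(175)])   # 1 = DVT

fpr, tpr, thresholds = roc_curve(labels, values)
best = np.argmax(tpr - fpr)                 # Youden's J = sensitivity + specificity - 1
print(f"AUC = {roc_auc_score(labels, values):.3f}")
print(f"optimal cut-off = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```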

  12. Accuracy in Quantitative 3D Image Analysis

    PubMed Central

    Bassel, George W.

    2015-01-01

    Quantitative 3D imaging is becoming an increasingly popular and powerful approach to investigate plant growth and development. With the increased use of 3D image analysis, standards to ensure the accuracy and reproducibility of these data are required. This commentary highlights how image acquisition and postprocessing can introduce artifacts into 3D image data and proposes steps to increase both the accuracy and reproducibility of these analyses. It is intended to aid researchers entering the field of 3D image processing of plant cells and tissues and to help general readers in understanding and evaluating such data. PMID:25804539

  13. Hydraulic servo system increases accuracy in fatigue testing

    NASA Technical Reports Server (NTRS)

    Dixon, G. V.; Kibler, K. S.

    1967-01-01

    Hydraulic servo system increases accuracy in applying fatigue loading to a specimen under test. An error sensing electronic control loop, coupled to the hydraulic proportional closed loop cyclic force generator, provides an accurately controlled peak force to the specimen.

  14. Portable, high intensity isotopic neutron source provides increased experimental accuracy

    NASA Technical Reports Server (NTRS)

    Mohr, W. C.; Stewart, D. C.; Wahlgren, M. A.

    1968-01-01

    Small portable, high intensity isotopic neutron source combines twelve curium-americium beryllium sources. This high intensity of neutrons, with a flux which slowly decreases at a known rate, provides for increased experimental accuracy.

  15. Accuracy analysis of distributed simulation systems

    NASA Astrophysics Data System (ADS)

    Lin, Qi; Guo, Jing

    2010-08-01

    Existing simulation work tends to emphasize procedural verification, placing too much focus on the simulation models rather than on the simulation itself. As a result, research on improving simulation accuracy has been limited to individual aspects. Because accuracy is the key to simulation credibility assessment and fidelity studies, an all-round discussion of the accuracy of distributed simulation systems themselves is important. First, the major elements of distributed simulation systems are summarized, providing a concrete basis for defining, classifying, and describing their accuracy. In Part 2, a comprehensive framework for the accuracy of distributed simulation systems is presented, which makes it easier to analyze and assess the uncertainty of such systems. In Part 3, the concept of accuracy of distributed simulation systems is decomposed into four factors, each analyzed in turn. In Part 4, based on the formalized description of the accuracy-analysis framework, a practical approach is put forward that can be applied to study unexpected or inaccurate simulation results. A real HLA-based distributed simulation system is then taken as an example to verify the usefulness of the proposed approach. The results show that the method works well and is applicable to accuracy analysis of distributed simulation systems.

  16. Simultaneous analysis of multiple enzymes increases accuracy of pulsed-field gel electrophoresis in assigning genetic relationships among homogeneous Salmonella strains.

    PubMed

    Zheng, Jie; Keys, Christine E; Zhao, Shaohua; Ahmed, Rafiq; Meng, Jianghong; Brown, Eric W

    2011-01-01

    Due to a highly homogeneous genetic composition, the subtyping of Salmonella enterica serovar Enteritidis strains to an epidemiologically relevant level remains intangible for pulsed-field gel electrophoresis (PFGE). We reported previously on a highly discriminatory PFGE-based subtyping scheme for S. enterica serovar Enteritidis that relies on a single combined cluster analysis of multiple restriction enzymes. However, the ability of a subtyping method to correctly infer genetic relatedness among outbreak strains is also essential for effective molecular epidemiological traceback. In this study, genetic and phylogenetic analyses were performed to assess whether concatenated enzyme methods can cluster closely related salmonellae into epidemiologically relevant hierarchies. PFGE profiles were generated by use of six restriction enzymes (XbaI, BlnI, SpeI, SfiI, PacI, and NotI) for 74 strains each of S. enterica serovar Enteritidis and S. enterica serovar Typhimurium. Correlation analysis of Dice similarity coefficients for all pairwise strain comparisons underscored the importance of combining multiple enzymes for the accurate assignment of genetic relatedness among Salmonella strains. The mean correlation increased from 81% and 41% for single-enzyme PFGE up to 99% and 96% for five-enzyme combined PFGE for S. enterica serovar Enteritidis and S. enterica serovar Typhimurium strains, respectively. Data regressions approached 100% correlation among Dice similarities for S. enterica serovar Enteritidis and S. enterica serovar Typhimurium strains when a minimum of six enzymes were concatenated. Phylogenetic congruence measures singled out XbaI, BlnI, SfiI, and PacI as most concordant for S. enterica serovar Enteritidis, while XbaI, BlnI, and SpeI were most concordant among S. enterica serovar Typhimurium strains. Together, these data indicate that PFGE coupled with sufficient enzyme numbers and combinations is capable of discerning accurate genetic relationships among
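    A small sketch of the Dice similarity calculation that underlies such PFGE comparisons, on hypothetical band presence/absence patterns; combining enzymes amounts to concatenating the band vectors of the individual enzymes before computing the coefficient:

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity between two binary band-presence vectors."""
    shared = np.sum((a == 1) & (b == 1))
    return 2.0 * shared / (a.sum() + b.sum())

# Hypothetical band patterns (1 = band present) for two strains, per enzyme.
strain1 = {"XbaI": np.array([1, 1, 0, 1, 0, 1]), "BlnI": np.array([1, 0, 1, 1, 1, 0])}
strain2 = {"XbaI": np.array([1, 1, 0, 0, 0, 1]), "BlnI": np.array([1, 0, 1, 1, 0, 0])}

for enzyme in strain1:                      # single-enzyme similarities
    print(enzyme, "Dice =", round(dice(strain1[enzyme], strain2[enzyme]), 3))

# Combined (concatenated-enzyme) analysis: one long band vector per strain.
combined1 = np.concatenate([strain1[e] for e in strain1])
combined2 = np.concatenate([strain2[e] for e in strain1])
print("combined Dice =", round(dice(combined1, combined2), 3))
```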

  17. Accuracy analysis of automatic distortion correction

    NASA Astrophysics Data System (ADS)

    Kolecki, Jakub; Rzonca, Antoni

    2015-06-01

    The paper addresses the problem of automatic distortion removal from images acquired with a non-metric SLR camera equipped with prime lenses. From the photogrammetric point of view, the following question arises: is the accuracy of the distortion control data provided by the manufacturer for a certain lens model (not an individual lens) sufficient to achieve the demanded accuracy? To obtain a reliable answer, two kinds of tests were carried out for three lens models. First, a multi-variant camera calibration was conducted using software that provides a full accuracy analysis. Second, an accuracy analysis using check points was performed. The check points were measured in images resampled on the basis of the estimated distortion model, or in distortion-free images acquired directly in the automatic distortion-removal mode. Extensive conclusions regarding the practical application of each calibration approach are given, and rules for applying automatic distortion removal in photogrammetric measurements are suggested.

  18. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores.

    PubMed

    Vilhjálmsson, Bjarni J; Yang, Jian; Finucane, Hilary K; Gusev, Alexander; Lindström, Sara; Ripke, Stephan; Genovese, Giulio; Loh, Po-Ru; Bhatia, Gaurav; Do, Ron; Hayeck, Tristan; Won, Hong-Hee; Kathiresan, Sekar; Pato, Michele; Pato, Carlos; Tamimi, Rulla; Stahl, Eli; Zaitlen, Noah; Pasaniuc, Bogdan; Belbin, Gillian; Kenny, Eimear E; Schierup, Mikkel H; De Jager, Philip; Patsopoulos, Nikolaos A; McCarroll, Steve; Daly, Mark; Purcell, Shaun; Chasman, Daniel; Neale, Benjamin; Goddard, Michael; Visscher, Peter M; Kraft, Peter; Patterson, Nick; Price, Alkes L

    2015-10-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to association statistics, but this discards information and can reduce predictive accuracy. We introduce LDpred, a method that infers the posterior mean effect size of each marker by using a prior on effect sizes and LD information from an external reference panel. Theory and simulations show that LDpred outperforms the approach of pruning followed by thresholding, particularly at large sample sizes. Accordingly, predicted R(2) increased from 20.1% to 25.3% in a large schizophrenia dataset and from 9.8% to 12.0% in a large multiple sclerosis dataset. A similar relative improvement in accuracy was observed for three additional large disease datasets and for non-European schizophrenia samples. The advantage of LDpred over existing methods will grow as sample sizes increase. PMID:26430803
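    For intuition, the infinitesimal-model special case of LDpred has a closed form: the posterior mean effects are a shrinkage of the marginal GWAS effects through the LD matrix. A minimal NumPy sketch in which the LD matrix, marginal effects, sample size, and heritability are all made-up inputs, not values from the paper:

```python
import numpy as np

def ldpred_inf(beta_marginal: np.ndarray, ld: np.ndarray,
               n: int, h2: float) -> np.ndarray:
    """LDpred-inf posterior mean effects:
    beta_post = (LD + (M / (N * h2)) * I)^(-1) @ beta_marginal,
    with M markers, GWAS sample size N, and heritability h2 of the region."""
    m = len(beta_marginal)
    return np.linalg.solve(ld + (m / (n * h2)) * np.eye(m), beta_marginal)

# Tiny made-up example: three correlated markers from a reference panel.
ld = np.array([[1.0, 0.6, 0.2],
               [0.6, 1.0, 0.4],
               [0.2, 0.4, 1.0]])
beta_hat = np.array([0.05, 0.04, 0.01])     # marginal GWAS effects (synthetic)
print(ldpred_inf(beta_hat, ld, n=50_000, h2=0.3))
```

    The full LDpred method additionally places a point-normal prior on effect sizes (a fraction of causal markers) and samples the posterior, but the shrinkage-through-LD structure is the same.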

  19. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores

    PubMed Central

    Vilhjálmsson, Bjarni J.; Yang, Jian; Finucane, Hilary K.; Gusev, Alexander; Lindström, Sara; Ripke, Stephan; Genovese, Giulio; Loh, Po-Ru; Bhatia, Gaurav; Do, Ron; Hayeck, Tristan; Won, Hong-Hee; Ripke, Stephan; Neale, Benjamin M.; Corvin, Aiden; Walters, James T.R.; Farh, Kai-How; Holmans, Peter A.; Lee, Phil; Bulik-Sullivan, Brendan; Collier, David A.; Huang, Hailiang; Pers, Tune H.; Agartz, Ingrid; Agerbo, Esben; Albus, Margot; Alexander, Madeline; Amin, Farooq; Bacanu, Silviu A.; Begemann, Martin; Belliveau, Richard A.; Bene, Judit; Bergen, Sarah E.; Bevilacqua, Elizabeth; Bigdeli, Tim B.; Black, Donald W.; Bruggeman, Richard; Buccola, Nancy G.; Buckner, Randy L.; Byerley, William; Cahn, Wiepke; Cai, Guiqing; Campion, Dominique; Cantor, Rita M.; Carr, Vaughan J.; Carrera, Noa; Catts, Stanley V.; Chambert, Kimberly D.; Chan, Raymond C.K.; Chen, Ronald Y.L.; Chen, Eric Y.H.; Cheng, Wei; Cheung, Eric F.C.; Chong, Siow Ann; Cloninger, C. Robert; Cohen, David; Cohen, Nadine; Cormican, Paul; Craddock, Nick; Crowley, James J.; Curtis, David; Davidson, Michael; Davis, Kenneth L.; Degenhardt, Franziska; Del Favero, Jurgen; DeLisi, Lynn E.; Demontis, Ditte; Dikeos, Dimitris; Dinan, Timothy; Djurovic, Srdjan; Donohoe, Gary; Drapeau, Elodie; Duan, Jubao; Dudbridge, Frank; Durmishi, Naser; Eichhammer, Peter; Eriksson, Johan; Escott-Price, Valentina; Essioux, Laurent; Fanous, Ayman H.; Farrell, Martilias S.; Frank, Josef; Franke, Lude; Freedman, Robert; Freimer, Nelson B.; Friedl, Marion; Friedman, Joseph I.; Fromer, Menachem; Genovese, Giulio; Georgieva, Lyudmila; Gershon, Elliot S.; Giegling, Ina; Giusti-Rodrguez, Paola; Godard, Stephanie; Goldstein, Jacqueline I.; Golimbet, Vera; Gopal, Srihari; Gratten, Jacob; Grove, Jakob; de Haan, Lieuwe; Hammer, Christian; Hamshere, Marian L.; Hansen, Mark; Hansen, Thomas; Haroutunian, Vahram; Hartmann, Annette M.; Henskens, Frans A.; Herms, Stefan; Hirschhorn, Joel N.; Hoffmann, Per; Hofman, Andrea; Hollegaard, Mads V.; Hougaard, David M.; Ikeda, Masashi; Joa, Inge; Julia, Antonio; Kahn, Rene S.; Kalaydjieva, Luba; Karachanak-Yankova, Sena; Karjalainen, Juha; Kavanagh, David; Keller, Matthew C.; Kelly, Brian J.; Kennedy, James L.; Khrunin, Andrey; Kim, Yunjung; Klovins, Janis; Knowles, James A.; Konte, Bettina; Kucinskas, Vaidutis; Kucinskiene, Zita Ausrele; Kuzelova-Ptackova, Hana; Kahler, Anna K.; Laurent, Claudine; Keong, Jimmy Lee Chee; Lee, S. Hong; Legge, Sophie E.; Lerer, Bernard; Li, Miaoxin; Li, Tao; Liang, Kung-Yee; Lieberman, Jeffrey; Limborska, Svetlana; Loughland, Carmel M.; Lubinski, Jan; Lnnqvist, Jouko; Macek, Milan; Magnusson, Patrik K.E.; Maher, Brion S.; Maier, Wolfgang; Mallet, Jacques; Marsal, Sara; Mattheisen, Manuel; Mattingsdal, Morten; McCarley, Robert W.; McDonald, Colm; McIntosh, Andrew M.; Meier, Sandra; Meijer, Carin J.; Melegh, Bela; Melle, Ingrid; Mesholam-Gately, Raquelle I.; Metspalu, Andres; Michie, Patricia T.; Milani, Lili; Milanova, Vihra; Mokrab, Younes; Morris, Derek W.; Mors, Ole; Mortensen, Preben B.; Murphy, Kieran C.; Murray, Robin M.; Myin-Germeys, Inez; Mller-Myhsok, Bertram; Nelis, Mari; Nenadic, Igor; Nertney, Deborah A.; Nestadt, Gerald; Nicodemus, Kristin K.; Nikitina-Zake, Liene; Nisenbaum, Laura; Nordin, Annelie; O’Callaghan, Eadbhard; O’Dushlaine, Colm; O’Neill, F. 
Anthony; Oh, Sang-Yun; Olincy, Ann; Olsen, Line; Van Os, Jim; Pantelis, Christos; Papadimitriou, George N.; Papiol, Sergi; Parkhomenko, Elena; Pato, Michele T.; Paunio, Tiina; Pejovic-Milovancevic, Milica; Perkins, Diana O.; Pietilinen, Olli; Pimm, Jonathan; Pocklington, Andrew J.; Powell, John; Price, Alkes; Pulver, Ann E.; Purcell, Shaun M.; Quested, Digby; Rasmussen, Henrik B.; Reichenberg, Abraham; Reimers, Mark A.; Richards, Alexander L.; Roffman, Joshua L.; Roussos, Panos; Ruderfer, Douglas M.; Salomaa, Veikko; Sanders, Alan R.; Schall, Ulrich; Schubert, Christian R.; Schulze, Thomas G.; Schwab, Sibylle G.; Scolnick, Edward M.; Scott, Rodney J.; Seidman, Larry J.; Shi, Jianxin; Sigurdsson, Engilbert; Silagadze, Teimuraz; Silverman, Jeremy M.; Sim, Kang; Slominsky, Petr; Smoller, Jordan W.; So, Hon-Cheong; Spencer, Chris C.A.; Stahl, Eli A.; Stefansson, Hreinn; Steinberg, Stacy; Stogmann, Elisabeth; Straub, Richard E.; Strengman, Eric; Strohmaier, Jana; Stroup, T. Scott; Subramaniam, Mythily; Suvisaari, Jaana; Svrakic, Dragan M.; Szatkiewicz, Jin P.; Sderman, Erik; Thirumalai, Srinivas; Toncheva, Draga; Tooney, Paul A.; Tosato, Sarah; Veijola, Juha; Waddington, John; Walsh, Dermot; Wang, Dai; Wang, Qiang; Webb, Bradley T.; Weiser, Mark; Wildenauer, Dieter B.; Williams, Nigel M.; Williams, Stephanie; Witt, Stephanie H.; Wolen, Aaron R.; Wong, Emily H.M.; Wormley, Brandon K.; Wu, Jing Qin; Xi, Hualin Simon; Zai, Clement C.; Zheng, Xuebin; Zimprich, Fritz; Wray, Naomi R.; Stefansson, Kari; Visscher, Peter M.; Adolfsson, Rolf; Andreassen, Ole A.; Blackwood, Douglas H.R.; Bramon, Elvira; Buxbaum, Joseph D.; Børglum, Anders D.; Cichon, Sven; Darvasi, Ariel; Domenici, Enrico; Ehrenreich, Hannelore; Esko, Tonu; Gejman, Pablo V.; Gill, Michael; Gurling, Hugh; Hultman, Christina M.; Iwata, Nakao; Jablensky, Assen V.; Jonsson, Erik G.; Kendler, Kenneth S.; Kirov, George; Knight, Jo; Lencz, Todd; Levinson, Douglas F.; Li, Qingqin S.; Liu, Jianjun; Malhotra, Anil K.; McCarroll, Steven A.; McQuillin, Andrew; Moran, Jennifer L.; Mortensen, Preben B.; Mowry, Bryan J.; Nthen, Markus M.; Ophoff, Roel A.; Owen, Michael J.; Palotie, Aarno; Pato, Carlos N.; Petryshen, Tracey L.; Posthuma, Danielle; Rietschel, Marcella; Riley, Brien P.; Rujescu, Dan; Sham, Pak C.; Sklar, Pamela; St. Clair, David; Weinberger, Daniel R.; Wendland, Jens R.; Werge, Thomas; Daly, Mark J.; Sullivan, Patrick F.; O’Donovan, Michael C.; Kraft, Peter; Hunter, David J.; Adank, Muriel; Ahsan, Habibul; Aittomäki, Kristiina; Baglietto, Laura; Berndt, Sonja; Blomquist, Carl; Canzian, Federico; Chang-Claude, Jenny; Chanock, Stephen J.; Crisponi, Laura; Czene, Kamila; Dahmen, Norbert; Silva, Isabel dos Santos; Easton, Douglas; Eliassen, A. 
Heather; Figueroa, Jonine; Fletcher, Olivia; Garcia-Closas, Montserrat; Gaudet, Mia M.; Gibson, Lorna; Haiman, Christopher A.; Hall, Per; Hazra, Aditi; Hein, Rebecca; Henderson, Brian E.; Hofman, Albert; Hopper, John L.; Irwanto, Astrid; Johansson, Mattias; Kaaks, Rudolf; Kibriya, Muhammad G.; Lichtner, Peter; Lindström, Sara; Liu, Jianjun; Lund, Eiliv; Makalic, Enes; Meindl, Alfons; Meijers-Heijboer, Hanne; Müller-Myhsok, Bertram; Muranen, Taru A.; Nevanlinna, Heli; Peeters, Petra H.; Peto, Julian; Prentice, Ross L.; Rahman, Nazneen; Sánchez, María José; Schmidt, Daniel F.; Schmutzler, Rita K.; Southey, Melissa C.; Tamimi, Rulla; Travis, Ruth; Turnbull, Clare; Uitterlinden, Andre G.; van der Luijt, Rob B.; Waisfisz, Quinten; Wang, Zhaoming; Whittemore, Alice S.; Yang, Rose; Zheng, Wei; Kathiresan, Sekar; Pato, Michele; Pato, Carlos; Tamimi, Rulla; Stahl, Eli; Zaitlen, Noah; Pasaniuc, Bogdan; Belbin, Gillian; Kenny, Eimear E.; Schierup, Mikkel H.; De Jager, Philip; Patsopoulos, Nikolaos A.; McCarroll, Steve; Daly, Mark; Purcell, Shaun; Chasman, Daniel; Neale, Benjamin; Goddard, Michael; Visscher, Peter M.; Kraft, Peter; Patterson, Nick; Price, Alkes L.

    2015-01-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to association statistics, but this discards information and can reduce predictive accuracy. We introduce LDpred, a method that infers the posterior mean effect size of each marker by using a prior on effect sizes and LD information from an external reference panel. Theory and simulations show that LDpred outperforms the approach of pruning followed by thresholding, particularly at large sample sizes. Accordingly, predicted R2 increased from 20.1% to 25.3% in a large schizophrenia dataset and from 9.8% to 12.0% in a large multiple sclerosis dataset. A similar relative improvement in accuracy was observed for three additional large disease datasets and for non-European schizophrenia samples. The advantage of LDpred over existing methods will grow as sample sizes increase. PMID:26430803

  20. Using Transponders on the Moon to Increase Accuracy of GPS

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin; Chui, Talso

    2008-01-01

    It has been proposed to place laser or radio transponders at suitably chosen locations on the Moon to increase the accuracy achievable using the Global Positioning System (GPS) or other satellite-based positioning system. The accuracy of GPS position measurements depends on the accuracy of determination of the ephemerides of the GPS satellites. These ephemerides are determined by means of ranging to and from Earth-based stations and consistency checks among the satellites. Unfortunately, ranging to and from Earth is subject to errors caused by atmospheric effects, notably including unpredictable variations in refraction. The proposal is based on exploitation of the fact that ranging between a GPS satellite and another object outside the atmosphere is not subject to error-inducing atmospheric effects. The Moon is such an object and is a convenient place for a ranging station. The ephemeris of the Moon is well known and, unlike a GPS satellite, the Moon is massive enough that its orbit is not measurably affected by the solar wind and solar radiation. According to the proposal, each GPS satellite would repeatedly send a short laser or radio pulse toward the Moon and the transponder(s) would respond by sending back a pulse and delay information. The GPS satellite could then compute its distance from the known position(s) of the transponder(s) on the Moon. Because the same hemisphere of the Moon faces the Earth continuously, any transponders placed there would remain continuously or nearly continuously accessible to GPS satellites, and so only a relatively small number of transponders would be needed to provide continuous coverage. Assuming that the transponders would depend on solar power, it would be desirable to use at least two transponders, placed at diametrically opposite points on the edges of the Moon disk as seen from Earth, so that all or most of the time, at least one of them would be in sunlight.

  1. Lunar Reconnaissance Orbiter Orbit Determination Accuracy Analysis

    NASA Technical Reports Server (NTRS)

    Slojkowski, Steven E.

    2014-01-01

    Results from operational OD produced by the NASA Goddard Flight Dynamics Facility for the LRO nominal and extended mission are presented. During the LRO nominal mission, when LRO flew in a low circular orbit, orbit determination requirements were met nearly 100% of the time. When the extended mission began, LRO returned to a more elliptical frozen orbit where gravity and other modeling errors caused numerous violations of mission accuracy requirements. Prediction accuracy is particularly challenged during periods when LRO is in full-Sun. A series of improvements to LRO orbit determination are presented, including implementation of new lunar gravity models, improved spacecraft solar radiation pressure modeling using a dynamic multi-plate area model, a shorter orbit determination arc length, and a constrained plane method for estimation. The analysis presented in this paper shows that updated lunar gravity models improved accuracy in the frozen orbit, and a multiplate dynamic area model improves prediction accuracy during full-Sun orbit periods. Implementation of a 36-hour tracking data arc and plane constraints during edge-on orbit geometry also provide benefits. A comparison of the operational solutions to precision orbit determination solutions shows agreement on a 100- to 250-meter level in definitive accuracy.

  2. Using satellite data to increase accuracy of PMF calculations

    SciTech Connect

    Mettel, M.C.

    1992-03-01

    The accuracy of a flood severity estimate depends on the data used. The more detailed and precise the data, the more accurate the estimate. Earth observation satellites gather detailed data for determining the probable maximum flood at hydropower projects.

  3. Accuracy considerations in the computational analysis of jet noise

    NASA Technical Reports Server (NTRS)

    Scott, James N.

    1993-01-01

    The application of computational fluid dynamics methods to the analysis of problems in aerodynamic noise has resulted in the extension and adaptation of conventional CFD to the discipline now referred to as computational aeroacoustics (CAA). In the analysis of jet noise accurate resolution of a wide range of spatial and temporal scales in the flow field is essential if the acoustic far field is to be predicted. The numerical simulation of unsteady jet flow has been successfully demonstrated and many flow features have been computed with reasonable accuracy. Grid refinement and increased solution time are discussed as means of improving accuracy of Navier-Stokes solutions of unsteady jet flow. In addition various properties of different numerical procedures which influence accuracy are examined with particular emphasis on dispersion and dissipation characteristics. These properties are investigated by using selected schemes to solve model problems for the propagation of a shock wave and a sinusoidal disturbance. The results are compared for the different schemes.
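    The dissipation and dispersion properties mentioned above can already be seen on the linear advection equation u_t + a u_x = 0, a standard model problem: a first-order upwind scheme damps a propagating sine wave (dissipation), while the second-order Lax-Wendroff scheme preserves amplitude much better at the cost of phase error (dispersion). A hedged model-problem sketch, not the CAA schemes examined in the paper:

```python
import numpy as np

# Linear advection u_t + a u_x = 0 on a periodic domain, sinusoidal initial data.
nx, a, cfl = 200, 1.0, 0.5
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dt = cfl * (x[1] - x[0]) / a
u0 = np.sin(2 * np.pi * x)

def upwind_step(u):        # first-order upwind: strongly dissipative
    return u - cfl * (u - np.roll(u, 1))

def lax_wendroff_step(u):  # second-order: mainly dispersive (phase) error
    return (u - 0.5 * cfl * (np.roll(u, -1) - np.roll(u, 1))
              + 0.5 * cfl**2 * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)))

u_up, u_lw = u0.copy(), u0.copy()
for _ in range(int(round(1.0 / dt))):       # advect for one transit of the domain
    u_up = upwind_step(u_up)
    u_lw = lax_wendroff_step(u_lw)

print("amplitude after one transit: upwind %.3f, Lax-Wendroff %.3f, exact 1.000"
      % (np.max(np.abs(u_up)), np.max(np.abs(u_lw))))
```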

  4. Lunar Reconnaissance Orbiter Orbit Determination Accuracy Analysis

    NASA Technical Reports Server (NTRS)

    Slojkowski, Steven E.

    2014-01-01

    LRO definitive and predictive accuracy requirements were easily met in the nominal mission orbit using the LP150Q lunar gravity model. Accuracy of the LP150Q model is poorer in the extended mission elliptical orbit. Later lunar gravity models, in particular GSFC-GRAIL-270, improve OD accuracy in the extended mission. Implementation of a constrained plane when the orbit is within 45 degrees of the Earth-Moon line improves cross-track accuracy. Prediction accuracy is still challenged during full-Sun periods due to coarse spacecraft area modeling; implementation of a multi-plate area model with definitive attitude input can eliminate prediction violations, and the FDF is evaluating analytic and predicted attitude modeling to improve full-Sun prediction accuracy. Comparison of the FDF ephemeris file to high-precision ephemeris files provides gross confirmation that overlap comparisons properly assess orbit accuracy.

  5. Holter triage ambulatory ECG analysis. Accuracy and time efficiency.

    PubMed

    Cooper, D H; Kennedy, H L; Lyyski, D S; Sprague, M K

    1996-01-01

    Triage ambulatory electrocardiographic (ECG) analysis permits relatively unskilled office workers to submit 24-hour ambulatory ECG Holter tapes to an automatic instrument (model 563, Del Mar Avionics, Irvine, CA) for interpretation. The instrument system "triages" what it is capable of automatically interpreting and rejects those tapes (with high ventricular arrhythmia density) requiring thorough analysis. Nevertheless, a trained cardiovascular technician ultimately edits what is accepted for analysis. This study examined the clinical validity of one manufacturer's triage instrumentation with regard to accuracy and time efficiency for interpreting ventricular arrhythmia. A database of 50 Holter tapes stratified for frequency of ventricular ectopic beats (VEBs) was examined by triage, conventional, and full-disclosure hand-count Holter analysis. Half of the tapes were found to be automatically analyzable by the triage method. Comparison of the VEB accuracy of triage versus conventional analysis using the full-disclosure hand count as the standard showed that triage analysis overall appeared as accurate as conventional Holter analysis but had limitations in detecting ventricular tachycardia (VT) runs. Overall sensitivity, positive predictive accuracy, and false positive rate for the triage ambulatory ECG analysis were 96, 99, and 0.9%, respectively, for isolated VEBs, 92, 93, and 7%, respectively, for ventricular couplets, and 48, 93, and 7%, respectively, for VT. Error in VT detection by triage analysis occurred on a single tape. Of the remaining 11 tapes containing VT runs, accuracy was significantly increased, with a sensitivity of 86%, positive predictive accuracy of 90%, and false positive rate of 10%. Stopwatch-recorded time efficiency was carefully logged during both triage and conventional ambulatory ECG analysis and divided into five time phases: secretarial, machine, analysis, editing, and total time. Triage analysis was significantly (P < .05) more time

  6. Increasing the range accuracy of three-dimensional ghost imaging ladar using optimum slicing number method

    NASA Astrophysics Data System (ADS)

    Yang, Xu; Zhang, Yong; Xu, Lu; Yang, Cheng-Hua; Wang, Qiang; Liu, Yue-Hao; Zhao, Yuan

    2015-12-01

    The range accuracy of three-dimensional (3D) ghost imaging is derived. Based on the derived range accuracy equation, the relationship between the slicing number and the range accuracy is analyzed and an optimum slicing number (OSN) is determined. According to the OSN, an improved 3D ghost imaging algorithm is proposed to increase the range accuracy. Experimental results indicate that the slicing number can affect the range accuracy significantly and the highest range accuracy can be achieved if the 3D ghost imaging system works with OSN. Project supported by the Young Scientist Fund of the National Natural Science Foundation of China (Grant No. 61108072).

  7. Measurement Accuracy Limitation Analysis on Synchrophasors

    SciTech Connect

    Zhao, Jiecheng; Zhan, Lingwei; Liu, Yilu; Qi, Hairong; Gracia, Jose R; Ewing, Paul D

    2015-01-01

    This paper analyzes the theoretical accuracy limitations of synchrophasor measurements of the phase angle and frequency of the power grid. Factors that cause measurement error are analyzed, including error sources in the instruments and in the power grid signal. Different scenarios for these factors are evaluated according to normal operating conditions of power grid measurement. Based on the evaluation and simulation, the phase angle and frequency errors caused by each factor are calculated and discussed.

  8. Dust trajectory sensor: accuracy and data analysis.

    PubMed

    Xie, J; Sternovsky, Z; Grün, E; Auer, S; Duncan, N; Drake, K; Le, H; Horanyi, M; Srama, R

    2011-10-01

    The Dust Trajectory Sensor (DTS) instrument is developed for the measurement of the velocity vector of cosmic dust particles. The trajectory information is imperative in determining the particles' origin and distinguishing dust particles from different sources. The velocity vector also reveals information on the history of interaction between the charged dust particle and the magnetospheric or interplanetary space environment. The DTS operational principle is based on measuring the induced charge from the dust on an array of wire electrodes. In recent work, the DTS geometry has been optimized [S. Auer, E. Grün, S. Kempf, R. Srama, A. Srowig, Z. Sternovsky, and V Tschernjawski, Rev. Sci. Instrum. 79, 084501 (2008)] and a method of triggering was developed [S. Auer, G. Lawrence, E. Grün, H. Henkel, S. Kempf, R. Srama, and Z. Sternovsky, Nucl. Instrum. Methods Phys. Res. A 622, 74 (2010)]. This article presents the method of analyzing the DTS data and results from a parametric study on the accuracy of the measurements. A laboratory version of the DTS has been constructed and tested with particles in the velocity range of 2-5 km/s using the Heidelberg dust accelerator facility. Both the numerical study and the analyzed experimental data show that the accuracy of the DTS instrument is better than about 1% in velocity and 1° in direction. PMID:22047326

  9. Dust trajectory sensor: Accuracy and data analysis

    NASA Astrophysics Data System (ADS)

    Xie, J.; Sternovsky, Z.; Grün, E.; Auer, S.; Duncan, N.; Drake, K.; Le, H.; Horanyi, M.; Srama, R.

    2011-10-01

    The Dust Trajectory Sensor (DTS) instrument is developed for the measurement of the velocity vector of cosmic dust particles. The trajectory information is imperative in determining the particles' origin and distinguishing dust particles from different sources. The velocity vector also reveals information on the history of interaction between the charged dust particle and the magnetospheric or interplanetary space environment. The DTS operational principle is based on measuring the induced charge from the dust on an array of wire electrodes. In recent work, the DTS geometry has been optimized [S. Auer, E. Grün, S. Kempf, R. Srama, A. Srowig, Z. Sternovsky, and V Tschernjawski, Rev. Sci. Instrum. 79, 084501 (2008), 10.1063/1.2960566] and a method of triggering was developed [S. Auer, G. Lawrence, E. Grün, H. Henkel, S. Kempf, R. Srama, and Z. Sternovsky, Nucl. Instrum. Methods Phys. Res. A 622, 74 (2010), 10.1016/j.nima.2010.06.091]. This article presents the method of analyzing the DTS data and results from a parametric study on the accuracy of the measurements. A laboratory version of the DTS has been constructed and tested with particles in the velocity range of 2-5 km/s using the Heidelberg dust accelerator facility. Both the numerical study and the analyzed experimental data show that the accuracy of the DTS instrument is better than about 1% in velocity and 1° in direction.

  10. Increasing Assignment Completion and Accuracy Using a Daily Report Card Procedure.

    ERIC Educational Resources Information Center

    Drew, Barry M.; And Others

    1982-01-01

    Examined the effects of daily report cards designed to increase the completion and accuracy of in-class assignments in two youngsters described as having a behavioral history of difficulty in completing seat work. Use of the procedure produced immediate significant changes in rates of both completion and accuracy. (Author)

  11. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    NASA Astrophysics Data System (ADS)

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-05-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging.

  12. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis.

    PubMed

    Litjens, Geert; Sánchez, Clara I; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce 'deep learning' as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30-40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that 'deep learning' holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  13. Meta-analysis of diagnostic accuracy studies in mental health

    PubMed Central

    Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J

    2015-01-01

    Objectives: To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods: We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results: The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold, while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions: Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
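    As a simplified numerical illustration of pooling study-level accuracy estimates (the counts are invented, and only sensitivity is pooled here; the recommended bivariate model additionally models specificity and its correlation with sensitivity):

    ```python
    import numpy as np

    # Hypothetical per-study counts: true positives and total diseased
    tp = np.array([45, 30, 60, 25])
    diseased = np.array([50, 40, 75, 35])

    # Logit-transformed sensitivities and their approximate variances
    sens = tp / diseased
    y = np.log(sens / (1 - sens))
    v = 1 / tp + 1 / (diseased - tp)

    # DerSimonian-Laird estimate of the between-study variance tau^2
    w = 1 / v
    y_fix = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fix) ** 2)
    tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects pooled logit-sensitivity, back-transformed to a proportion
    w_re = 1 / (v + tau2)
    pooled_logit = np.sum(w_re * y) / np.sum(w_re)
    print("pooled sensitivity ~", 1 / (1 + np.exp(-pooled_logit)))
    ```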

  14. Accuracy analysis of optical ranging in atmosphere

    NASA Astrophysics Data System (ADS)

    Yuan, Hong-wu; Huang, Yin-bo; Mei, Hai-ping; Rao, Rui-zhong

    2009-07-01

    Optical ranging is one of the most precise techniques for distance measurement. The effects of the density variation of the atmosphere, aerosols, and clouds on optical ranging precision are generally considered, and a new method is proposed for calculating the ranging precision in the presence of aerosol particles and clouds. The size distribution spectrum models for aerosols and clouds in the Optical Properties of Aerosols and Clouds Package (OPAC) are adopted. Results show that aerosols and clouds could introduce errors of several centimeters to several tens of meters into the ranging. The relationship between the ranging precision and the relative humidity, the zenith angle of the ranging direction, and the optical wavelength is also analyzed. The ranging error does not have an obvious relationship with the wavelength, but depends on the zenith angle, especially for zenith angles larger than 70 degrees. The ranging error depends on the relative humidity as well. The ranging error induced by aerosols increases gradually with the increase of the relative humidity when the relative humidity is less than 80%, but it increases rapidly when the relative humidity is larger than 80%. Our results could provide a theoretical basis and reference for the application of optical ranging.

  15. Accuracy Analysis of the PIC Method

    NASA Astrophysics Data System (ADS)

    Verboncoeur, J. P.; Cartwright, K. L.

    2000-10-01

    The discretization errors for many steps of the classical Particle-in-Cell (PIC) model have been well-studied (C. K. Birdsall and A. B. Langdon, Plasma Physics via Computer Simulation, McGraw-Hill, New York, NY (1985).) (R. W. Hockney and J. W. Eastwood, Computer Simulation Using Particles, McGraw-Hill, New York, NY (1981).). In this work, the errors in the interpolation algorithms, which provide the connection between continuum particles and discrete fields, are described in greater detail. In addition, the coupling of errors between steps in the method is derived. The analysis is carried out for both electrostatic and electromagnetic PIC models, and the results are demonstrated using a bounded one-dimensional electrostatic PIC code (J. P. Verboncoeur et al., J. Comput. Phys. 104, 321-328 (1993).), as well as a bounded two-dimensional electromagnetic PIC code (J. P. Verboncoeur et al., Comp. Phys. Comm. 87, 199-211 (1995).).
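    For readers unfamiliar with the interpolation step being analyzed, a minimal sketch of linear (cloud-in-cell) particle-to-grid weighting on a 1D periodic grid follows; it is a generic illustration, not code from the simulations cited above:

    ```python
    import numpy as np

    def deposit_charge(x_p, q_p, dx, n_cells):
        """Linear-weighting (CIC) charge deposition of particles onto a 1D periodic grid."""
        rho = np.zeros(n_cells)
        j = np.floor(x_p / dx).astype(int)          # index of the left grid point
        w = x_p / dx - j                            # fractional distance to it
        np.add.at(rho, j % n_cells, q_p * (1 - w))  # share each charge between the two
        np.add.at(rho, (j + 1) % n_cells, q_p * w)  # neighbouring grid points
        return rho / dx

    # Two unit charges on a 16-cell grid with spacing dx = 1
    rho = deposit_charge(np.array([3.25, 7.5]), np.array([1.0, 1.0]), 1.0, 16)
    print(rho[3:5], rho[7:9])  # [0.75 0.25] [0.5 0.5]
    ```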

  16. Accuracy of remotely sensed data: Sampling and analysis procedures

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Oderwald, R. G.; Mead, R. A.

    1982-01-01

    A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, along with a listing of the computer program written to implement these techniques. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, together with the error matrices resulting from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, as is a proposed method for determining the reliability of change detection between two maps of the same area produced at different times.
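    For context, the basic quantities produced by such an accuracy assessment (overall accuracy and the kappa coefficient of agreement) can be computed from an error matrix as in this small sketch; the matrix itself is invented:

    ```python
    import numpy as np

    # Hypothetical error (confusion) matrix: rows = map classes, columns = reference data
    m = np.array([[65,  4,  2],
                  [ 6, 81,  5],
                  [ 0,  7, 55]])

    n = m.sum()
    overall = np.trace(m) / n                           # overall accuracy
    p_chance = (m.sum(axis=1) @ m.sum(axis=0)) / n**2   # chance agreement
    kappa = (overall - p_chance) / (1 - p_chance)       # kappa coefficient of agreement
    print(f"overall = {overall:.3f}, kappa = {kappa:.3f}")
    ```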

  17. Increasing the accuracy of measurements based on the solution of Pauli's quantum equation

    NASA Astrophysics Data System (ADS)

    Ermishin, Sergey; Korol, Alexandra

    2013-05-01

    A measurement principle is described that increases measurement accuracy by exploiting redundant measurements. The main properties of the solution are a discrete method with a surge of probability within the parent entity, and a comparison of the probability distribution graph for the diffraction grids with the graph of the probability density function. The method is based on an analog of the solution of the Pauli equation. Electronic reference measurements with quantum computing applied to the mathematical data processing make it possible to greatly increase the credibility and accuracy of measurements at low cost, which is confirmed by simulation.

  18. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    PubMed Central

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen - van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  19. Accuracy analysis and design of A3 parallel spindle head

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan

    2016-03-01

    As functional components of machine tools, parallel mechanisms are widely used in high efficiency machining of aviation components, and accuracy is one of the critical technical indexes. Many researchers have focused on the accuracy problem of parallel mechanisms, but in terms of controlling the errors and improving the accuracy in the stage of design and manufacturing, further efforts are required. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established by using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrices, the compensatable and uncompensatable error sources which affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a larger effect on the accuracy of the end-effector. Based on the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. By utilizing a genetic algorithm, the allocation of the tolerances on each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.

  20. Using Self-Monitoring to Increase Attending to Task and Academic Accuracy in Children with Autism

    ERIC Educational Resources Information Center

    Holifield, Cassandra; Goodman, Janet; Hazelkorn, Michael; Heflin, L. Juane

    2010-01-01

    This study was conducted to investigate the effectiveness of a self-monitoring procedure on increasing attending to task and academic accuracy in two elementary students with autism in their self-contained classroom. A multiple baseline across participants in two academic subject areas was used to assess the effectiveness of the intervention. Both…

  1. Bilingual Language Assessment: A Meta-Analysis of Diagnostic Accuracy

    ERIC Educational Resources Information Center

    Dollaghan, Christine A.; Horner, Elizabeth A.

    2011-01-01

    Purpose: To describe quality indicators for appraising studies of diagnostic accuracy and to report a meta-analysis of measures for diagnosing language impairment (LI) in bilingual Spanish-English U.S. children. Method: The authors searched electronically and by hand to locate peer-reviewed English-language publications meeting inclusion criteria;…

  2. Range accuracy analysis of streak tube imaging lidar systems

    NASA Astrophysics Data System (ADS)

    Ye, Guangchao; Fan, Rongwei; Chen, Zhaodong; Yuan, Wei; Chen, Deying; He, Ping

    2016-02-01

    Streak tube imaging lidar (STIL) is an active imaging system that has a high range accuracy and a wide range gate with the use of a pulsed laser transmitter and streak tube receiver to produce 3D range images. This work investigates the range accuracy performance of STIL systems based on a peak detection algorithm, taking into account the effects of blurring of the image. A theoretical model of the time-resolved signal distribution, including the static blurring width in addition to the laser pulse width, is presented, resulting in a modified range accuracy analysis. The model indicates that the static blurring width has a significant effect on the range accuracy, which is validated by both the simulation and experimental results. By using the optimal static blurring width, the range accuracies are enhanced in both indoor and outdoor experiments, with stand-off distances of 10 m and 1700 m, respectively; corresponding best range errors of 0.06 m and 0.25 m were achieved in a daylight environment.
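    A minimal sketch of the peak-detection step is given below: a Gaussian whose width lumps together the laser pulse width and the static blurring width is fitted to a synthetic time-resolved return, and the fitted peak time is converted to range. The waveform, widths, and noise level are invented:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    C = 3e8  # speed of light, m/s

    def pulse(t, a, t0, sigma):
        """Gaussian return whose width combines pulse width and static blurring."""
        return a * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

    # Synthetic time-resolved return: true round-trip time 66.7 ns (range = 10 m)
    t = np.linspace(0, 200e-9, 400)
    sig = pulse(t, 1.0, 2 * 10.0 / C, 8e-9)
    sig += 0.02 * np.random.default_rng(0).normal(size=t.size)

    (a, t0, sigma), _ = curve_fit(pulse, t, sig, p0=[1.0, 70e-9, 10e-9])
    print(f"estimated range = {C * t0 / 2:.3f} m")  # close to 10 m
    ```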

  3. Neutron electric dipole moment and possibilities of increasing accuracy of experiments

    NASA Astrophysics Data System (ADS)

    Serebrov, A. P.; Kolomenskiy, E. A.; Pirozhkov, A. N.; Krasnoshchekova, I. A.; Vasiliev, A. V.; Polyushkin, A. O.; Lasakov, M. S.; Murashkin, A. N.; Solovey, V. A.; Fomin, A. K.; Shoka, I. V.; Zherebtsov, O. M.; Aleksandrov, E. B.; Dmitriev, S. P.; Dovator, N. A.; Geltenbort, P.; Ivanov, S. N.; Zimmer, O.

    2016-01-01

    The paper reports the results of an experiment on searching for the neutron electric dipole moment (EDM), performed on the ILL reactor (Grenoble, France). The double-chamber magnetic resonance spectrometer (Petersburg Nuclear Physics Institute (PNPI)) with prolonged holding of ultra cold neutrons has been used. Sources of possible systematic errors are analyzed, and their influence on the measurement results is estimated. The ways and prospects of increasing accuracy of the experiment are discussed.

  4. Development of nonlinear weighted compact schemes with increasingly higher order accuracy

    NASA Astrophysics Data System (ADS)

    Zhang, Shuhai; Jiang, Shufen; Shu, Chi-Wang

    2008-07-01

    In this paper, we design a class of high order accurate nonlinear weighted compact schemes that are higher order extensions of the nonlinear weighted compact schemes proposed by Deng and Zhang [X. Deng, H. Zhang, Developing high-order weighted compact nonlinear schemes, J. Comput. Phys. 165 (2000) 22-44] and the weighted essentially non-oscillatory schemes of Jiang and Shu [G.-S. Jiang, C.-W. Shu, Efficient implementation of weighted ENO schemes, J. Comput. Phys. 126 (1996) 202-228] and Balsara and Shu [D.S. Balsara, C.-W. Shu, Monotonicity preserving weighted essentially non-oscillatory schemes with increasingly high order of accuracy, J. Comput. Phys. 160 (2000) 405-452]. These nonlinear weighted compact schemes are proposed based on the cell-centered compact scheme of Lele [S.K. Lele, Compact finite difference schemes with spectral-like resolution, J. Comput. Phys. 103 (1992) 16-42]. Instead of performing the nonlinear interpolation on the conservative variables as in Deng and Zhang (2000), we propose to directly interpolate the flux on its stencil. Using the Lax-Friedrichs flux splitting and characteristic-wise projection, the resulting interpolation formulae are similar to those of the regular WENO schemes. Hence, the detailed analysis and even many pieces of the code can be directly copied from those of the regular WENO schemes. Through systematic tests and comparisons with the regular WENO schemes, we observe that the nonlinear weighted compact schemes have the same ability to capture strong discontinuities, while the resolution of short waves is improved and numerical dissipation is reduced.
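    For orientation, a minimal sketch of the classical fifth-order WENO reconstruction of Jiang and Shu, on which these compact variants build, is shown below; this is the regular (non-compact) scheme with the standard smoothness indicators and ideal weights, not the new schemes of the paper:

    ```python
    import numpy as np

    def weno5_face(f):
        """Fifth-order WENO reconstruction of f at x_{i+1/2} from f[i-2..i+2]."""
        fm2, fm1, f0, fp1, fp2 = f
        # Smoothness indicators of the three candidate stencils
        b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 1/4*(fm2 - 4*fm1 + 3*f0)**2
        b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 1/4*(fm1 - fp1)**2
        b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 1/4*(3*f0 - 4*fp1 + fp2)**2
        # Nonlinear weights built from the ideal weights (1/10, 6/10, 3/10)
        eps = 1e-6
        a = np.array([0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2])
        w = a / a.sum()
        # Third-order candidate reconstructions on each stencil
        q0 = (2*fm2 - 7*fm1 + 11*f0) / 6
        q1 = (-fm1 + 5*f0 + 2*fp1) / 6
        q2 = (2*f0 + 5*fp1 - fp2) / 6
        return w @ np.array([q0, q1, q2])

    # Smooth (linear) data: the reconstruction returns the face value 1.25
    print(weno5_face(np.array([1.0, 1.1, 1.2, 1.3, 1.4])))
    ```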

  5. Mesoscale modelling methodology based on nudging to increase accuracy in WRA

    NASA Astrophysics Data System (ADS)

    Mylonas Dirdiris, Markos; Barbouchi, Sami; Hermmann, Hugo

    2016-04-01

    Offshore wind energy has recently become a rapidly growing renewable energy resource worldwide, with several offshore wind projects at different planning stages. Despite this, a better understanding of the atmospheric interaction within the marine atmospheric boundary layer (MABL) is needed in order to contribute to a better energy capture and cost-effectiveness. Attention has recently turned to observational nudging as an innovative method to increase the accuracy of wind flow modelling. This particular study focuses on the observational nudging capability of Weather Research and Forecasting (WRF) and ways the uncertainty of wind flow modelling in the wind resource assessment (WRA) can be reduced. Finally, an alternative way to calculate the model uncertainty is pinpointed. Approach: the WRF mesoscale model will be nudged with observations from FINO3 at three different heights. The model simulations with and without applying observational nudging will be verified against FINO1 measurement data at 100 m. In order to evaluate the observational nudging capability of WRF, two ways to derive the model uncertainty will be described: one global uncertainty and an uncertainty per wind speed bin derived using the recommended practice of the IEA, in order to link the model uncertainty to a wind energy production uncertainty. This study assesses the observational data assimilation capability of the WRF model within the same vertical gridded atmospheric column. The principal aim is to investigate whether having observations up to one height could improve the simulation at a higher vertical level. The study will use objective analysis implementing a Cressman scheme interpolation to interpolate the observations in time and in space (keeping the horizontal component constant) to the gridded analysis. Then the WRF model core will incorporate the interpolated variables into the "first guess" to develop a nudged simulation. Consequently, WRF with and without
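    A minimal sketch of the Cressman-type weighting referred to above, assuming observations at known horizontal distances from an analysis point; the radius of influence and the sample values are invented, and real WRF nudging applies such weights inside the model rather than as a post-process:

    ```python
    import numpy as np

    def cressman_weights(r, R):
        """Cressman weights (R^2 - r^2)/(R^2 + r^2) inside the radius R, zero outside."""
        w = (R**2 - r**2) / (R**2 + r**2)
        return np.where(r < R, np.clip(w, 0.0, None), 0.0)

    # Nudge a first-guess wind speed toward two nearby observations (distances in km)
    first_guess = 8.0                 # model value at the analysis point (m/s)
    obs = np.array([9.2, 8.6])        # observed wind speeds (m/s)
    r = np.array([20.0, 45.0])        # observation-to-analysis-point distances (km)
    w = cressman_weights(r, R=60.0)
    analysis = first_guess + np.sum(w * (obs - first_guess)) / np.sum(w)
    print(round(analysis, 2))         # about 9.04 m/s
    ```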

  6. Accuracy Enhancement of Inertial Sensors Utilizing High Resolution Spectral Analysis

    PubMed Central

    Noureldin, Aboelmagd; Armstrong, Justin; El-Shafie, Ahmed; Karamat, Tashfeen; McGaughey, Don; Korenberg, Michael; Hussain, Aini

    2012-01-01

    In both military and civilian applications, the inertial navigation system (INS) and the global positioning system (GPS) are two complementary technologies that can be integrated to provide reliable positioning and navigation information for land vehicles. The accuracy enhancement of INS sensors and the integration of INS with GPS are the subjects of widespread research. Wavelet de-noising of INS sensors has had limited success in removing the long-term (low-frequency) inertial sensor errors. The primary objective of this research is to develop a novel inertial sensor accuracy enhancement technique that can remove both short-term and long-term error components from inertial sensor measurements prior to INS mechanization and INS/GPS integration. A high resolution spectral analysis technique called the fast orthogonal search (FOS) algorithm is used to accurately model the low frequency range of the spectrum, which includes the vehicle motion dynamics and inertial sensor errors. FOS models the spectral components with the most energy first and uses an adaptive threshold to stop adding frequency terms when fitting a term does not reduce the mean squared error more than fitting white noise. The proposed method was developed, tested and validated through road test experiments involving both low-end tactical grade and low cost MEMS-based inertial systems. The results demonstrate that in most cases the position accuracy during GPS outages using FOS de-noised data is superior to the position accuracy using wavelet de-noising.

  7. Increased Genomic Prediction Accuracy in Wheat Breeding Through Spatial Adjustment of Field Trial Data

    PubMed Central

    Lado, Bettina; Matus, Ivan; Rodríguez, Alejandra; Inostroza, Luis; Poland, Jesse; Belzile, François; del Pozo, Alejandro; Quincke, Martín; Castro, Marina; von Zitzewitz, Jarislav

    2013-01-01

    In crop breeding, the interest of predicting the performance of candidate cultivars in the field has increased due to recent advances in molecular breeding technologies. However, the complexity of the wheat genome presents some challenges for applying new technologies in molecular marker identification with next-generation sequencing. We applied genotyping-by-sequencing, a recently developed method to identify single-nucleotide polymorphisms, in the genomes of 384 wheat (Triticum aestivum) genotypes that were field tested under three different water regimes in Mediterranean climatic conditions: rain-fed only, mild water stress, and fully irrigated. We identified 102,324 single-nucleotide polymorphisms in these genotypes, and the phenotypic data were used to train and test genomic selection models intended to predict yield, thousand-kernel weight, number of kernels per spike, and heading date. Phenotypic data showed marked spatial variation. Therefore, different models were tested to correct the trends observed in the field. A mixed-model using moving-means as a covariate was found to best fit the data. When we applied the genomic selection models, the accuracy of predicted traits increased with spatial adjustment. Multiple genomic selection models were tested, and a Gaussian kernel model was determined to give the highest accuracy. The best predictions between environments were obtained when data from different years were used to train the model. Our results confirm that genotyping-by-sequencing is an effective tool to obtain genome-wide information for crops with complex genomes, that these data are efficient for predicting traits, and that correction of spatial variation is a crucial ingredient to increase prediction accuracy in genomic selection models. PMID:24082033
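    A minimal sketch of the two ingredients highlighted here, a moving-mean spatial adjustment of plot phenotypes followed by a Gaussian-kernel prediction, is given below; the field layout, marker matrix, bandwidth, and ridge penalty are all invented and do not reproduce the published analysis:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 60, 200
    y = rng.normal(size=n)                            # plot phenotypes (invented)
    X = rng.choice([0.0, 1.0, 2.0], size=(n, p))      # marker matrix (invented)

    # 1) Moving-mean covariate: average of neighbouring plots along the row
    #    (for simplicity the plot itself is included in the window)
    mov = np.convolve(y, np.ones(5) / 5, mode="same")
    beta = np.polyfit(mov, y, 1)[0]
    y_adj = y - beta * (mov - mov.mean())             # spatially adjusted phenotype

    # 2) Gaussian kernel from marker distances, then kernel ridge prediction
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / d2.mean())
    alpha = np.linalg.solve(K + 1.0 * np.eye(n), y_adj)   # ridge penalty = 1.0
    y_hat = K @ alpha
    print(np.corrcoef(y_adj, y_hat)[0, 1])
    ```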

  8. Increased Throwing Accuracy Improves Children's Catching Performance in a Ball-Catching Task from the Movement Assessment Battery (MABC-2)

    PubMed Central

    Dirksen, Tim; De Lussanet, Marc H. E.; Zentgraf, Karen; Slupinski, Lena; Wagner, Heiko

    2016-01-01

    The Movement Assessment Battery for Children (MABC-2) is a functional test for identifying deficits in the motor performance of children. The test contains a ball-catching task that requires the children to catch a self-thrown ball with one hand. As the task can be executed with a variety of different catching strategies, it is assumed that the task success can also vary considerably. Even though it is not clear, whether the performance merely depends on the catching skills or also to some extent on the throwing skills, the MABC-2 takes into account only the movement outcome. Therefore, the purpose of the current study was to examine (1) to what extent the throwing accuracy has an effect on the children's catching performance and (2) to what extent the throwing accuracy influences their choice of catching strategy. In line with the test manual, the children's catching performance was quantified on basis of the number of correctly caught balls. The throwing accuracy and the catching strategy were quantified by applying a kinematic analysis on the ball's trajectory and the hand movements. Based on linear regression analyses, we then investigated the relation between throwing accuracy, catching performance and catching strategy. The results show that an increased throwing accuracy is significantly correlated with an increased catching performance. Moreover, a higher throwing accuracy is significantly correlated with a longer duration of the hand on the ball's parabola, which indicates that throwing the ball more accurately could enable the children to effectively reduce the requirements on temporal precision. As the children's catching performance and their choice of catching strategy in the ball-catching task of the MABC-2 are substantially determined by their throwing accuracy, the test evaluation should not be based on the movement outcome alone, but should also take into account the children's throwing performance. Our findings could be of particular value for the

  9. Analysis of the Ionospheric Corrections Accuracy of EGNOS System

    NASA Astrophysics Data System (ADS)

    Prats, X.; Orus, R.; Hernandez-Pajares, M.; Juan, M.; Sanz, J.

    2002-01-01

    Satellite Based Augmentation Systems (SBAS) provide Global Navigation Satellite System (GNSS) users with an extra set of information in order to enhance the accuracy and integrity levels of GNSS stand-alone positioning. The ionosphere is one of the main error components in SBAS. Therefore, the analysis of system performances requires a calibration of the broadcast corrections. In this context, different test methods to analyze the performance of these corrections are presented. The first set of tests involves two of the ionospheric calculations that are applied to the Global Ionospheric Maps (GIM) computed by the IGS Associate Analysis Centers: a TEC TOPEX comparison test and the STEC variations test. The second family of tests provides two very accurate analyses based on large-baseline ambiguity resolution techniques, giving accuracies of about 16 cm of L1 and a few millimeters of L1 in the STEC and double-differenced STEC determinations, respectively. These four analyses have been applied to the EGNOS System Test Bed (ESTB) signal, which is the European SBAS provider.

  10. Accuracy analysis of pointing control system of solar power station

    NASA Technical Reports Server (NTRS)

    Hung, J. C.; Peebles, P. Z., Jr.

    1978-01-01

    The first-phase effort concentrated on defining the minimum basic functions that the retrodirective array must perform, identifying circuits that are capable of satisfying the basic functions, and looking at some of the error sources in the system and how they affect accuracy. The initial effort also examined three methods for generating torques for mechanical antenna control, performed a rough analysis of the flexible body characteristics of the solar collector, and defined a control system configuration for mechanical pointing control of the array.

  11. The effectiveness of FE model for increasing accuracy in stretch forming simulation of aircraft skin panels

    NASA Astrophysics Data System (ADS)

    Kono, A.; Yamada, T.; Takahashi, S.

    2013-12-01

    In the aerospace industry, stretch forming has been used to form the outer surface parts of aircraft, which are called skin panels. Empirical methods have been used to correct the springback by measuring the formed panels. However, such methods are impractical and cost prohibitive. Therefore, there is a need to develop simulation technologies to predict the springback caused by stretch forming [1]. This paper reports the results of a study on the influences of the modeling conditions and parameters on the accuracy of an FE analysis simulating the stretch forming of aircraft skin panels. The effects of the mesh aspect ratio, convergence criteria, and integration points are investigated, and better simulation conditions and parameters are proposed.

  12. The Meta-Analysis of Clinical Judgment Project: Effects of Experience on Judgment Accuracy

    ERIC Educational Resources Information Center

    Spengler, Paul M.; White, Michael J.; Aegisdottir, Stefania; Maugherman, Alan S.; Anderson, Linda A.; Cook, Robert S.; Nichols, Cassandra N.; Lampropoulos, Georgios K.; Walker, Blain S.; Cohen, Genna R.; Rush, Jeffrey D.

    2009-01-01

    Clinical and educational experience is one of the most commonly studied variables in clinical judgment research. Contrary to clinicians' perceptions, clinical judgment researchers have generally concluded that accuracy does not improve with increased education, training, or clinical experience. In this meta-analysis, the authors synthesized…

  13. Design and analysis of a high-accuracy flexure hinge.

    PubMed

    Liu, Min; Zhang, Xianmin; Fatikow, Sergej

    2016-05-01

    This paper designs and analyzes a new kind of flexure hinge obtained by using a topology optimization approach, namely, a quasi-V-shaped flexure hinge (QVFH). The flexure hinges are formed by three segments: the left and right segments with convex shapes and the middle segment with a straight line. According to the results of topology optimization, the curve equations of the profiles of the flexure hinges are developed by numerical fitting. The in-plane dimensionless compliance equations of the flexure hinges are derived based on Castigliano's second theorem. The accuracy of rotation, which is characterized by the compliance of the center of rotation that deviates from the midpoint, is derived. The equations for evaluating the maximum stresses are also provided. These dimensionless equations are verified by finite element analysis and experimentation. The analytical results are within 8% uncertainty compared to the finite element analysis results and within 9% uncertainty compared to the experimental measurement data. Compared with the filleted V-shaped flexure hinge, the QVFH has a higher accuracy of rotation and a better ability to preserve the position of the center of rotation, but smaller compliance. PMID:27250469

  14. Analysis of deformable image registration accuracy using computational modeling.

    PubMed

    Zhong, Hualiang; Kim, Jinkoo; Chetty, Indrin J

    2010-03-01

    Computer aided modeling of anatomic deformation, allowing various techniques and protocols in radiation therapy to be systematically verified and studied, has become increasingly attractive. In this study the potential issues in deformable image registration (DIR) were analyzed based on two numerical phantoms: One, a synthesized, low intensity gradient prostate image, and the other a lung patient's CT image data set. Each phantom was modeled with region-specific material parameters with its deformation solved using a finite element method. The resultant displacements were used to construct a benchmark to quantify the displacement errors of the Demons and B-Spline-based registrations. The results show that the accuracy of these registration algorithms depends on the chosen parameters, the selection of which is closely associated with the intensity gradients of the underlying images. For the Demons algorithm, both single resolution (SR) and multiresolution (MR) registrations required approximately 300 iterations to reach an accuracy of 1.4 mm mean error in the lung patient's CT image (and 0.7 mm mean error averaged in the lung only). For the low gradient prostate phantom, these algorithms (both SR and MR) required at least 1600 iterations to reduce their mean errors to 2 mm. For the B-Spline algorithms, best performance (mean errors of 1.9 mm for SR and 1.6 mm for MR, respectively) on the low gradient prostate was achieved using five grid nodes in each direction. Adding more grid nodes resulted in larger errors. For the lung patient's CT data set, the B-Spline registrations required ten grid nodes in each direction for highest accuracy (1.4 mm for SR and 1.5 mm for MR). The numbers of iterations or grid nodes required for optimal registrations depended on the intensity gradients of the underlying images. In summary, the performance of the Demons and B-Spline registrations have been quantitatively evaluated using numerical phantoms. The results show that parameter

  15. Analysis of deformable image registration accuracy using computational modeling

    SciTech Connect

    Zhong Hualiang; Kim, Jinkoo; Chetty, Indrin J.

    2010-03-15

    Computer aided modeling of anatomic deformation, allowing various techniques and protocols in radiation therapy to be systematically verified and studied, has become increasingly attractive. In this study the potential issues in deformable image registration (DIR) were analyzed based on two numerical phantoms: One, a synthesized, low intensity gradient prostate image, and the other a lung patient's CT image data set. Each phantom was modeled with region-specific material parameters with its deformation solved using a finite element method. The resultant displacements were used to construct a benchmark to quantify the displacement errors of the Demons and B-Spline-based registrations. The results show that the accuracy of these registration algorithms depends on the chosen parameters, the selection of which is closely associated with the intensity gradients of the underlying images. For the Demons algorithm, both single resolution (SR) and multiresolution (MR) registrations required approximately 300 iterations to reach an accuracy of 1.4 mm mean error in the lung patient's CT image (and 0.7 mm mean error averaged in the lung only). For the low gradient prostate phantom, these algorithms (both SR and MR) required at least 1600 iterations to reduce their mean errors to 2 mm. For the B-Spline algorithms, best performance (mean errors of 1.9 mm for SR and 1.6 mm for MR, respectively) on the low gradient prostate was achieved using five grid nodes in each direction. Adding more grid nodes resulted in larger errors. For the lung patient's CT data set, the B-Spline registrations required ten grid nodes in each direction for highest accuracy (1.4 mm for SR and 1.5 mm for MR). The numbers of iterations or grid nodes required for optimal registrations depended on the intensity gradients of the underlying images. In summary, the performance of the Demons and B-Spline registrations have been quantitatively evaluated using numerical phantoms. The results show that parameter

  16. Nonparametric meta-analysis for diagnostic accuracy studies.

    PubMed

    Zapf, Antonia; Hoyer, Annika; Kramer, Katharina; Kuss, Oliver

    2015-12-20

    Summarizing the information of many studies using a meta-analysis becomes more and more important, also in the field of diagnostic studies. The special challenge in meta-analysis of diagnostic accuracy studies is that in general sensitivity and specificity are co-primary endpoints. Across the studies both endpoints are correlated, and this correlation has to be considered in the analysis. The standard approach for such a meta-analysis is the bivariate logistic random effects model. An alternative approach is to use marginal beta-binomial distributions for the true positives and the true negatives, linked by copula distributions. In this article, we propose a new, nonparametric approach of analysis, which has greater flexibility with respect to the correlation structure, and always converges. In a simulation study, it becomes apparent that the empirical coverage of all three approaches is in general below the nominal level. Regarding bias, empirical coverage, and mean squared error the nonparametric model is often superior to the standard model, and comparable with the copula model. The three approaches are also applied to two example meta-analyses. PMID:26174020

  17. Increased prediction accuracy in wheat breeding trials using a marker × environment interaction genomic selection model.

    PubMed

    Lopez-Cruz, Marco; Crossa, Jose; Bonnett, David; Dreisigacker, Susanne; Poland, Jesse; Jannink, Jean-Luc; Singh, Ravi P; Autrique, Enrique; de los Campos, Gustavo

    2015-04-01

    Genomic selection (GS) models use genome-wide genetic information to predict genetic values of candidates of selection. Originally, these models were developed without considering genotype × environment interaction (G×E). Several authors have proposed extensions of the single-environment GS model that accommodate G×E using either covariance functions or environmental covariates. In this study, we model G×E using a marker × environment interaction (M×E) GS model; the approach is conceptually simple and can be implemented with existing GS software. We discuss how the model can be implemented by using an explicit regression of phenotypes on markers or using covariance structures (a genomic best linear unbiased prediction-type model). We used the M×E model to analyze three CIMMYT wheat data sets (W1, W2, and W3), where more than 1000 lines were genotyped using genotyping-by-sequencing and evaluated at CIMMYT's research station in Ciudad Obregon, Mexico, under simulated environmental conditions that covered different irrigation levels, sowing dates and planting systems. We compared the M×E model with a stratified (i.e., within-environment) analysis and with a standard (across-environment) GS model that assumes that effects are constant across environments (i.e., ignoring G×E). The prediction accuracy of the M×E model was substantially greater than that of an across-environment analysis that ignores G×E. Depending on the prediction problem, the M×E model had either similar or greater levels of prediction accuracy than the stratified analyses. The M×E model decomposes marker effects and genomic values into components that are stable across environments (main effects) and others that are environment-specific (interactions). Therefore, in principle, the interaction model could shed light on which variants have effects that are stable across environments and which ones are responsible for G×E. The data set and the scripts required to reproduce the analysis are
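    A minimal sketch of the "explicit regression of phenotypes on markers" formulation described above, building one main-effect marker block plus one block per environment and fitting them with ridge regression; the data, dimensions, and penalty are invented, and the published analysis uses dedicated GS software:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_env, n_lines, p = 3, 50, 100
    X = rng.choice([0.0, 1.0, 2.0], size=(n_lines, p))   # marker matrix (invented)
    y = rng.normal(size=n_env * n_lines)                 # phenotypes, stacked by environment

    # Design: [main-effect block | env-1 block | env-2 block | env-3 block]
    main = np.tile(X, (n_env, 1))
    inter = np.zeros((n_env * n_lines, n_env * p))
    for e in range(n_env):
        inter[e * n_lines:(e + 1) * n_lines, e * p:(e + 1) * p] = X
    Z = np.hstack([main, inter])

    # Ridge fit: marker effects split into stable (main) and environment-specific parts
    lam = 10.0
    b = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
    b_main, b_env = b[:p], b[p:].reshape(n_env, p)
    print(b_main.shape, b_env.shape)   # (100,) (3, 100)
    ```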

  18. Increased Prediction Accuracy in Wheat Breeding Trials Using a Marker × Environment Interaction Genomic Selection Model

    PubMed Central

    Lopez-Cruz, Marco; Crossa, Jose; Bonnett, David; Dreisigacker, Susanne; Poland, Jesse; Jannink, Jean-Luc; Singh, Ravi P.; Autrique, Enrique; de los Campos, Gustavo

    2015-01-01

    Genomic selection (GS) models use genome-wide genetic information to predict genetic values of candidates of selection. Originally, these models were developed without considering genotype × environment interaction (G×E). Several authors have proposed extensions of the single-environment GS model that accommodate G×E using either covariance functions or environmental covariates. In this study, we model G×E using a marker × environment interaction (M×E) GS model; the approach is conceptually simple and can be implemented with existing GS software. We discuss how the model can be implemented by using an explicit regression of phenotypes on markers or using covariance structures (a genomic best linear unbiased prediction-type model). We used the M×E model to analyze three CIMMYT wheat data sets (W1, W2, and W3), where more than 1000 lines were genotyped using genotyping-by-sequencing and evaluated at CIMMYT’s research station in Ciudad Obregon, Mexico, under simulated environmental conditions that covered different irrigation levels, sowing dates and planting systems. We compared the M×E model with a stratified (i.e., within-environment) analysis and with a standard (across-environment) GS model that assumes that effects are constant across environments (i.e., ignoring G×E). The prediction accuracy of the M×E model was substantially greater than that of an across-environment analysis that ignores G×E. Depending on the prediction problem, the M×E model had either similar or greater levels of prediction accuracy than the stratified analyses. The M×E model decomposes marker effects and genomic values into components that are stable across environments (main effects) and others that are environment-specific (interactions). Therefore, in principle, the interaction model could shed light on which variants have effects that are stable across environments and which ones are responsible for G×E. The data set and the scripts required to reproduce the analysis

  19. Increasing cutaneous afferent feedback improves proprioceptive accuracy at the knee in patients with sensory ataxia.

    PubMed

    Macefield, Vaughan G; Norcliffe-Kaufmann, Lucy; Goulding, Niamh; Palma, Jose-Alberto; Fuente Mora, Cristina; Kaufmann, Horacio

    2016-02-01

    Hereditary sensory and autonomic neuropathy type III (HSAN III) features disturbed proprioception and a marked ataxic gait. We recently showed that joint angle matching error at the knee is positively correlated with the degree of ataxia. Using intraneural microelectrodes, we also documented that these patients lack functional muscle spindle afferents but have preserved large-diameter cutaneous afferents, suggesting that patients with better proprioception may be relying more on proprioceptive cues provided by tactile afferents. We tested the hypothesis that enhancing cutaneous sensory feedback by stretching the skin at the knee joint using unidirectional elasticity tape could improve proprioceptive accuracy in patients with a congenital absence of functional muscle spindles. Passive joint angle matching at the knee was used to assess proprioceptive accuracy in 25 patients with HSAN III and 9 age-matched control subjects, with and without taping. Angles of the reference and indicator knees were recorded with digital inclinometers and the absolute error, gradient, and correlation coefficient between the two sides calculated. Patients with HSAN III performed poorly on the joint angle matching test [mean matching error 8.0 ± 0.8° (±SE); controls 3.0 ± 0.3°]. Following application of tape bilaterally to the knee in an X-shaped pattern, proprioceptive performance improved significantly in the patients (mean error 5.4 ± 0.7°) but not in the controls (3.0 ± 0.2°). Across patients, but not controls, significant increases in gradient and correlation coefficient were also apparent following taping. We conclude that taping improves proprioception at the knee in HSAN III, presumably via enhanced sensory feedback from the skin. PMID:26655817

  20. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    PubMed

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html. PMID:24254576
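    As a rough illustration of the polynomial (convolution) idea behind such tools: each atom contributes a small abundance polynomial indexed by extra neutrons, and repeated convolution yields a coarse-grained (nominal-mass) distribution. The abundances below are approximate textbook values, and the code is not MIDAs itself:

    ```python
    import numpy as np

    # Approximate nominal-mass isotope distributions, indexed by extra neutrons
    ISOTOPES = {
        "C": np.array([0.9893, 0.0107]),              # 12C, 13C
        "H": np.array([0.999885, 0.000115]),          # 1H, 2H
        "O": np.array([0.99757, 0.00038, 0.00205]),   # 16O, 17O, 18O
    }

    def coarse_distribution(formula):
        """Convolve per-atom distributions, e.g. formula = {'C': 6, 'H': 12, 'O': 6}."""
        dist = np.array([1.0])
        for element, count in formula.items():
            for _ in range(count):
                dist = np.convolve(dist, ISOTOPES[element])
        return dist / dist.sum()

    # Glucose, C6H12O6: probabilities of the monoisotopic, +1, and +2 peaks
    print(coarse_distribution({"C": 6, "H": 12, "O": 6})[:3])
    ```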

  1. Molecular Isotopic Distribution Analysis (MIDAs) with Adjustable Mass Accuracy

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Ogurtsov, Aleksey Y.; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.

  2. Convective Weather Forecast Accuracy Analysis at Center and Sector Levels

    NASA Technical Reports Server (NTRS)

    Wang, Yao; Sridhar, Banavar

    2010-01-01

    This paper presents a detailed convective forecast accuracy analysis at the center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in weather translation capacity models. In general, the vast majority of forecast verification efforts over the past decades have been on the calculation of traditional standard verification measure scores from forecast and observation data analyses on grids. These verification measures, based on binary classification, have been applied in quality assurance of weather forecast products at the national level for many years. Our research focuses on forecasts at the center and sector levels. We first calculate the standard forecast verification measure scores for en-route air traffic centers and sectors, followed by the forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage by multiple sector forecasts is then developed. The severe weather intensity assessment was carried out by using the correlations between forecast and actual observed airspace weather coverage. The weather forecast accuracy on horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and prediction. The analysis used observed and forecast Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities. The forecast accuracy analysis for times under one hour showed that the errors in
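    For reference, the traditional verification scores mentioned above are computed from a 2x2 contingency table of forecast versus observed convection; a minimal sketch with invented gridded data:

    ```python
    import numpy as np

    # Hypothetical gridded forecast/observation pair (1 = convection, 0 = none)
    obs = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])
    fcst = np.array([1, 0, 0, 1, 1, 0, 1, 0, 0, 1])

    hits = np.sum((fcst == 1) & (obs == 1))
    misses = np.sum((fcst == 0) & (obs == 1))
    false_alarms = np.sum((fcst == 1) & (obs == 0))

    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + misses + false_alarms)     # critical success index
    print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f}")
    ```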

  3. A method of increasing test range and accuracy of bioindicators: Geobacillus stearothermophilus spores.

    PubMed

    Lundahl, Gunnel

    2003-01-01

    Spores of Geobacillus stearothermophilus are very sensitive to changes in temperature. When validating sterilizing processes, the most common bioindicator (BI) is spores of Geobacillus stearothermophilus ATCC12980 and ATCC7953, with about 10^6 spores per BI and a D121-value of about 2 minutes in water. Because these spores of Geobacillus stearothermophilus do not survive at an F0-value above 12 minutes, it has not been possible to evaluate the agreement between the biological F-value (F(BIO)) and physical measurements (time and temperature) when the physical F0-value exceeds that limit. However, it has been proven that glycerin substantially increases the heat resistance of the spores, and it is possible to utilize that property when manufacturing BIs suitable for use in processes with longer sterilization times or higher temperatures (above 121 degrees C). With the method described, the sensitivity and durability of Geobacillus stearothermophilus spores can be exploited, with glycerin increasing both the test range and the accuracy. Experience from years of development and validation work with the use of the highly sensitive glycerin-water-spore-suspension sensor (GWS-sensor) is reported. Validation of the steam sterilization process at high temperature has been possible with the use of GWS-sensors. It has also been shown that the spores in suspension keep their characteristics for a period of 19 months when stored cold (8 degrees C). PMID:14558699
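    For orientation, the physical F0-value and the corresponding expected spore log reduction follow a standard lethality calculation; a minimal sketch with an invented temperature record and the D121-value of about 2 minutes quoted above:

    ```python
    import numpy as np

    # Hypothetical hold-phase temperature record: 20 one-minute samples at 122 deg C
    temp = np.full(20, 122.0)   # deg C
    dt = 1.0                    # sampling interval, minutes

    # Physical F0: equivalent minutes at 121.1 deg C, using z = 10 deg C
    z = 10.0
    f0 = np.sum(10 ** ((temp - 121.1) / z)) * dt

    # Expected log reduction for spores with a D121-value of about 2 minutes;
    # a 10^6-spore BI needs at least 6 log reductions to be fully inactivated
    d121 = 2.0
    print(f"F0 = {f0:.1f} min -> about {f0 / d121:.1f} log reductions")
    ```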

  4. Analysis of instrumentation error effects on the identification accuracy of aircraft parameters

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.

    1972-01-01

    An analytical investigation is presented of the effect of unmodeled measurement system errors on the accuracy of aircraft stability and control derivatives identified from flight test data. Such error sources include biases, scale factor errors, instrument position errors, misalignments, and instrument dynamics. Two techniques (ensemble analysis and simulated data analysis) are formulated to determine the quantitative variations to the identified parameters resulting from the unmodeled instrumentation errors. The parameter accuracy that would result from flight tests of the F-4C aircraft with typical quality instrumentation is determined using these techniques. It is shown that unmodeled instrument errors can greatly increase the uncertainty in the value of the identified parameters. General recommendations are made of procedures to be followed to insure that the measurement system associated with identifying stability and control derivatives from flight test provides sufficient accuracy.

  5. Accuracy Analysis on Large Blocks of High Resolution Images

    NASA Technical Reports Server (NTRS)

    Passini, Richardo M.

    2007-01-01

    Although high-frequency attitude effects are removed at the time of basic image generation, low-frequency attitude (yaw) effects are still present in the form of affinity/angular affinity. They are effectively removed by additional parameters. Bundle block adjustment based on properly weighted ephemeris/attitude quaternions (BBABEQ) is not enough to remove the systematic effect. Moreover, due to the narrow FOV of the HRSI, position and attitude are highly correlated, making it almost impossible to separate and remove their systematic effects without extending the geometric model (self-calibration). The systematic effects become evident in the increase of accuracy (in terms of RMSE at GCPs) for looser and relaxed ground control, at the expense of large and strong block deformation with large residuals at check points. Systematic errors are then more freely distributed and their effects propagate over the whole block.

  6. Oxytocin increases bias, but not accuracy, in face recognition line-ups.

    PubMed

    Bate, Sarah; Bennetts, Rachel; Parris, Benjamin A; Bindemann, Markus; Udale, Robert; Bussunt, Amanda

    2015-07-01

    Previous work indicates that intranasal inhalation of oxytocin improves face recognition skills, raising the possibility that it may be used in security settings. However, it is unclear whether oxytocin directly acts upon the core face-processing system itself or indirectly improves face recognition via affective or social salience mechanisms. In a double-blind procedure, 60 participants received either an oxytocin or placebo nasal spray before completing the One-in-Ten task-a standardized test of unfamiliar face recognition containing target-present and target-absent line-ups. Participants in the oxytocin condition outperformed those in the placebo condition on target-present trials, yet were more likely to make false-positive errors on target-absent trials. Signal detection analyses indicated that oxytocin induced a more liberal response bias, rather than increasing accuracy per se. These findings support a social salience account of the effects of oxytocin on face recognition and indicate that oxytocin may impede face recognition in certain scenarios. PMID:25433464
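    A minimal sketch of the signal detection analysis mentioned above, computing d' (discrimination accuracy) and the criterion c (response bias) from hit and false-alarm rates; the rates below are invented, not the study's data:

    ```python
    from scipy.stats import norm

    # Hypothetical hit rate (target-present identifications) and
    # false-alarm rate (identifications in target-absent line-ups)
    hit_rate, fa_rate = 0.70, 0.35

    z_h, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_fa             # discrimination accuracy
    criterion = -0.5 * (z_h + z_fa)  # response bias: negative values = liberal
    print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
    ```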

  7. Nationwide forestry applications program. Analysis of forest classification accuracy

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Mead, R. A.; Oderwald, R. G.; Heinen, J. (Principal Investigator)

    1981-01-01

    The development of LANDSAT classification accuracy assessment techniques, and of a computerized system for assessing wildlife habitat from land cover maps are considered. A literature review on accuracy assessment techniques and an explanation for the techniques development under both projects are included along with listings of the computer programs. The presentations and discussions at the National Working Conference on LANDSAT Classification Accuracy are summarized. Two symposium papers which were published on the results of this project are appended.

  8. Geographic stacking: Decision fusion to increase global land cover map accuracy

    NASA Astrophysics Data System (ADS)

    Clinton, Nicholas; Yu, Le; Gong, Peng

    2015-05-01

    The combination of multiple classifier outputs is an established sub-discipline in data mining, referred to as "stacking," "ensemble classification," or "meta-learning." Here we describe how stacking of geographically allocated classifications can create a map composite of higher accuracy than any of the individual classifiers. We used both voting algorithms and trainable classifiers with a set of validation data to combine individual land cover maps. We describe the generality of this setup in terms of existing algorithms and accuracy assessment procedures. This method has the advantage of not requiring posterior probabilities or levels of support for predicted class labels. We demonstrate the technique using Landsat-based, 30-meter land cover maps, the highest-resolution, globally available product of this kind. We used globally distributed validation samples to composite the maps and compute accuracy. We show that geographic stacking can improve individual map accuracy by up to 6.6%. The voting methods can also achieve higher accuracy than the best of the input classifications. Accuracies from different classifiers, input data, and output types are compared. The results are illustrated on a Landsat scene in California, USA. The compositing technique described here has broad applicability in remote sensing based map production and geographic classification.
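    A minimal sketch of the simplest compositing rule described here, a per-pixel plurality vote across several class-label maps; the label arrays are invented, and the paper also evaluates trainable combiners on validation data:

    ```python
    import numpy as np

    # Three hypothetical 2x4 land cover maps with integer class labels
    maps = np.array([
        [[1, 2, 2, 3], [1, 1, 3, 3]],
        [[1, 2, 1, 3], [2, 1, 3, 1]],
        [[2, 2, 2, 3], [1, 1, 1, 3]],
    ])

    def majority_vote(stack):
        """Per-pixel plurality vote over the first axis (ties go to the lowest label)."""
        n_classes = stack.max() + 1
        counts = np.stack([(stack == c).sum(axis=0) for c in range(n_classes)])
        return counts.argmax(axis=0)

    print(majority_vote(maps))
    # [[1 2 2 3]
    #  [1 1 3 3]]
    ```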

  9. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    EPA Science Inventory

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  10. Predictive accuracy of population viability analysis in conservation biology.

    PubMed

    Brook, B W; O'Grady, J J; Chapman, A P; Burgman, M A; Akçakaya, H R; Frankham, R

    2000-03-23

    Population viability analysis (PVA) is widely applied in conservation biology to predict extinction risks for threatened species and to compare alternative options for their management. It can also be used as a basis for listing species as endangered under World Conservation Union criteria. However, there is considerable scepticism regarding the predictive accuracy of PVA, mainly because of a lack of validation in real systems. Here we conducted a retrospective test of PVA based on 21 long-term ecological studies--the first comprehensive and replicated evaluation of the predictive powers of PVA. Parameters were estimated from the first half of each data set and the second half was used to evaluate the performance of the model. Contrary to recent criticisms, we found that PVA predictions were surprisingly accurate. The risk of population decline closely matched observed outcomes, there was no significant bias, and population size projections did not differ significantly from reality. Furthermore, the predictions of the five PVA software packages were highly concordant. We conclude that PVA is a valid and sufficiently accurate tool for categorizing and managing endangered species. PMID:10746724

  11. Accuracy of 3D scanners in tooth mark analysis.

    PubMed

    Molina, Ana; Martin-de-las-Heras, Stella

    2015-01-01

    The objective of this study was to compare the accuracy of contact and laser 3D scanners in tooth mark analysis. Ten dental casts were scanned with both 3D scanners. Seven linear measurements were made from the 3D images of dental casts and biting edges generated with DentalPrint© software (University of Granada, Granada, Spain). The uncertainty value for contact 3D scanning was 0.833 mm for the upper dental cast and 0.660 mm for the lower cast; similar uncertainty values were found for 3D laser scanning. Slightly higher uncertainty values were obtained for the 3D biting edges generated. The uncertainty values for single measurements ranged from 0.1 to 0.3 mm, with the exception of the intercanine distance, for which higher values were obtained. Knowledge of the error rate in the 3D scanning of dental casts and biting edges is especially relevant for application in practical forensic cases. PMID:25388960

  12. Unconscious Reward Cues Increase Invested Effort, but Do Not Change Speed-Accuracy Tradeoffs

    ERIC Educational Resources Information Center

    Bijleveld, Erik; Custers, Ruud; Aarts, Henk

    2010-01-01

    While both conscious and unconscious reward cues enhance effort to work on a task, previous research also suggests that conscious rewards may additionally affect speed-accuracy tradeoffs. Based on this idea, two experiments explored whether reward cues that are presented above (supraliminal) or below (subliminal) the threshold of conscious…

  13. Analysis and improvement of accuracy, sensitivity, and resolution of the coherent gradient sensing method.

    PubMed

    Dong, Xuelin; Zhang, Changxing; Feng, Xue; Duan, Zhiyin

    2016-06-10

    The coherent gradient sensing (CGS) method, a type of shear interferometry sensitive to surface slope, has been applied to full-field curvature measurement for decades. However, its accuracy, sensitivity, and resolution have not been clearly characterized. In this paper, we analyze the accuracy, sensitivity, and resolution of the CGS method based on a derivation of its working principle. The results show that the sensitivity is related to the grating pitch and distance, while the accuracy and resolution are determined by the wavelength of the laser beam and the diameter of the reflected beam. The sensitivity is proportional to the ratio of grating distance to grating pitch, whereas the accuracy declines as this ratio increases. In addition, we demonstrate that using phase gratings as the shearing element improves the interferogram and enhances accuracy, sensitivity, and resolution. The curvature of a spherical reflector was measured by CGS with Ronchi gratings and phase gratings under different experimental parameters to illustrate this analysis. All of these results are helpful for CGS applications. PMID:27409035

  14. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis.

    PubMed

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, rather than a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article can serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
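    A simplified sketch of pooling sensitivity and specificity on the logit scale with inverse-variance weights; this is not the full bivariate or HSROC model recommended above (those additionally model between-study heterogeneity and the sensitivity-specificity correlation), and the per-study counts are hypothetical.

```python
import numpy as np

def pool_logit(events, totals):
    """Inverse-variance fixed-effect pooling of proportions on the logit scale,
    with a 0.5 continuity correction; a simplification of the hierarchical models
    described in the abstract above."""
    p = (events + 0.5) / (totals + 1.0)
    logit = np.log(p / (1 - p))
    var = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
    w = 1.0 / var
    pooled = (w * logit).sum() / w.sum()
    return 1.0 / (1.0 + np.exp(-pooled))                  # back-transform to a proportion

# hypothetical counts: true positives / diseased, true negatives / non-diseased
tp, diseased = np.array([45, 30, 60]), np.array([50, 40, 70])
tn, healthy  = np.array([80, 55, 90]), np.array([100, 60, 95])
print("pooled sensitivity:", pool_logit(tp, diseased))
print("pooled specificity:", pool_logit(tn, healthy))
```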

  15. Using student-managed interventions to increase homework completion and accuracy

    PubMed Central

    Olympia, Daniel E.; Sheridan, Susan M.; Jenson, William R.; Andrews, Debra

    1994-01-01

    We examined the effectiveness of self-managed individual and group contingency procedures in improving the completion and accuracy rates of daily mathematics homework assignments. A group of sixth-grade students having homework difficulties in mathematics were selected for the study. There was substantial improvement in the amount of homework completed over baseline for a majority of the students, whereas the results for accuracy were mixed. Students who participated in the self-management training made significant gains on standardized measures of academic achievement and curriculum-based measures of classroom performance. Parents also reported significantly fewer problems associated with homework completion following the intervention. Students who were allowed to select their own performance goals made superior improvements in the number of homework assignments returned compared to students who were given a specified goal by the classroom teacher. Parents, subjects, and the classroom teacher responded positively on consumer satisfaction measures following termination of the study. PMID:16795827

  16. Accuracy Analysis of a Box-wing Theoretical SRP Model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui

    2016-07-01

    For the Beidou satellite navigation system (BDS), a high-accuracy solar radiation pressure (SRP) model is necessary for high-precision applications, especially with the establishment of the global BDS in the future, and the accuracy of the BDS broadcast ephemeris needs to be improved. We therefore established a box-wing theoretical SRP model with fine structural detail that includes a conical shadow factor for the Earth and Moon. We verified this SRP model using the GPS Block IIF satellites, with calculations based on data from the PRN 1, 24, 25, and 27 satellites. The results show that the physical SRP model for precise orbit determination (POD) and prediction of the GPS IIF satellites has higher accuracy than the Bern empirical model; the 3D RMS of the orbit is about 20 centimeters. The POD accuracy of the two models is similar, but the prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day and 7-day orbit predictions: the longer the prediction arc, the more significant the improvement. The orbit prediction accuracies with the physical SRP model for 1-day, 3-day and 7-day arcs are 0.4 m, 2.0 m and 10.0 m, respectively, compared with 0.9 m, 5.5 m and 30 m for the Bern empirical model. We then applied this approach to BDS and derived an SRP model for the Beidou satellites, which we tested and verified with one month of Beidou data. Initial results show that the model performs well but needs more data for verification and improvement. The orbit residual RMS is similar to that of our empirical force model, which only estimates forces in the along-track and cross-track directions and a y-bias, but the orbit overlap and SLR observation evaluations show some improvement. The remaining empirical force is reduced significantly for the present Beidou constellation.

  17. Increasing accuracy in the assessment of motion sickness: A construct methodology

    NASA Technical Reports Server (NTRS)

    Stout, Cynthia S.; Cowings, Patricia S.

    1993-01-01

    The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.

  18. A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming

    SciTech Connect

    Meier, Horst; Laurischkat, Roman; Zhu Junhong

    2011-01-17

    One main influence on the dimensional accuracy in robot-based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model-based approach has been developed, consisting of a finite element model to simulate the sheet forming and a multi-body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi-body system model and its included compensation method.

  19. Increasing the precision and accuracy of top-loading balances:  application of experimental design.

    PubMed

    Bzik, T J; Henderson, P B; Hobbs, J P

    1998-01-01

    The traditional method of estimating the weight of multiple objects is to obtain the weight of each object individually. We demonstrate that the precision and accuracy of these estimates can be improved by using a weighing scheme in which multiple objects are simultaneously on the balance. The resulting system of linear equations is solved to yield the weight estimates for the objects. Precision and accuracy improvements can be made by using a weighing scheme without requiring any more weighings than the number of objects when a total of at least six objects are to be weighed. It is also necessary that multiple objects can be weighed with about the same precision as that obtained with a single object, and the scale bias remains relatively constant over the set of weighings. Simulated and empirical examples are given for a system of eight objects in which up to five objects can be weighed simultaneously. A modified Plackett-Burman weighing scheme yields a 25% improvement in precision over the traditional method and implicitly removes the scale bias from seven of the eight objects. Applications of this novel use of experimental design techniques are shown to have potential commercial importance for quality control methods that rely on the mass change rate of an object. PMID:21644600
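    The idea of recovering individual weights from group weighings can be sketched as a linear least-squares problem. The 0/1 design matrix below is an illustrative circulant design (three objects per weighing), not the modified Plackett-Burman scheme used in the paper, and the weights and noise level are invented.

```python
import numpy as np

# Illustrative 8x8 weighing design: each of the 8 weighings places three of the
# 8 objects on the balance (a circulant 0/1 design).
first_row = np.array([1, 1, 1, 0, 0, 0, 0, 0])
X = np.stack([np.roll(first_row, shift) for shift in range(8)])

true_weights = np.array([2.1, 0.8, 1.5, 3.3, 0.4, 2.7, 1.1, 0.9])
rng = np.random.default_rng(1)
readings = X @ true_weights + rng.normal(0.0, 0.01, size=8)   # noisy balance readings

# Recover the individual weights by solving the linear system in a least-squares sense.
estimates, *_ = np.linalg.lstsq(X, readings, rcond=None)
print(np.round(estimates, 3))
```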

  20. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    SciTech Connect

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  1. Vestibular and Oculomotor Assessments May Increase Accuracy of Subacute Concussion Assessment.

    PubMed

    McDevitt, J; Appiah-Kubi, K O; Tierney, R; Wright, W G

    2016-08-01

    In this study, we collected and analyzed preliminary data on the internal consistency of a new condensed model for assessing vestibular and oculomotor impairments following a concussion. We also examined this model's ability to discriminate concussed athletes from healthy controls. Each participant was tested with a concussion assessment protocol that consisted of NeuroCom's Sensory Organization Test (SOT), the Balance Error Scoring System exam, and a series of 8 vestibular and oculomotor assessments. Of these 10 assessments, only the SOT, near point convergence, and the signs and symptoms (S/S) scores collected following optokinetic stimulation, the horizontal eye saccades test, and the gaze stabilization test were significantly correlated with health status, and these were used in further analyses. Multivariate logistic regression for binary outcomes was employed, and the resulting beta weights were used to calculate the area under the receiver operating characteristic curve (AUC). The best model supported by our findings suggests that an exam consisting of the 4 SOT sensory ratios, near point convergence, and the optokinetic stimulation S/S score is sensitive in discriminating concussed athletes from healthy controls (accuracy=98.6%, AUC=0.983). However, an even more parsimonious model consisting of only the optokinetic stimulation and gaze stabilization test S/S scores and near point convergence was also found to discriminate concussed athletes from healthy controls (accuracy=94.4%, AUC=0.951) without the need for expensive equipment. Although more investigation is needed, these findings may provide health professionals with a sensitive and specific battery of simple vestibular and oculomotor assessments for concussion management. PMID:27176886
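    A hedged sketch of the analysis pipeline described above (multivariate logistic regression followed by ROC AUC), using scikit-learn and synthetic stand-in features; the feature names, effect sizes and sample size are assumptions, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical features: a composite SOT ratio, near point convergence (cm) and an
# optokinetic-stimulation symptom score; y = 1 for concussed, 0 for healthy control.
n = 60
y = rng.integers(0, 2, size=n)
X = np.column_stack([
    rng.normal(0.8 - 0.1 * y, 0.05),   # composite SOT ratio (lower when concussed)
    rng.normal(6 + 3 * y, 1.5),        # near point convergence distance
    rng.normal(1 + 4 * y, 1.0),        # symptom provocation score
])

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
print("accuracy:", model.score(X, y))
print("AUC:", roc_auc_score(y, scores))
```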

  2. Radiometric and Geometric Accuracy Analysis of Rasat Pan Imagery

    NASA Astrophysics Data System (ADS)

    Kocaman, S.; Yalcin, I.; Guler, M.

    2016-06-01

    RASAT is the second Turkish Earth observation satellite, launched in 2011. It operates on the pushbroom principle and acquires panchromatic and MS images with 7.5 m and 15 m resolution, respectively; the swath width of the sensor is 30 km. The main aim of this study is to analyse the radiometric and geometric quality of RASAT images. A systematic validation approach for RASAT imagery and its products is being applied. A RASAT image pair acquired over Kesan city in the Edirne province of Turkey is used for the investigations. The raw RASAT data (L0) are processed by the Turkish Space Agency (TUBITAK-UZAY) to produce higher-level image products. The image products include radiometrically processed (L1), georeferenced (L2) and orthorectified (L3) data, as well as pansharpened images. The image quality assessments include visual inspections and noise, MTF and histogram analyses. The geometric accuracy assessment results are only preliminary, and the assessment is performed using the raw images. The geometric accuracy potential is investigated using 3D ground control points extracted from road intersections, which were measured manually in stereo from aerial images with 20 cm resolution and accuracy. The initial results of the study, obtained using one RASAT panchromatic image pair, are presented in this paper.

  3. Utility of an Algorithm to Increase the Accuracy of Medication History in an Obstetrical Setting

    PubMed Central

    Corbel, Aline; Baud, David; Chaouch, Aziz; Beney, Johnny; Csajka, Chantal; Panchaud, Alice

    2016-01-01

    Background In an obstetrical setting, inaccurate medication histories at hospital admission may result in failure to identify potentially harmful treatments for patients and/or their fetus(es). Methods This prospective study was conducted to assess average concordance rates between (1) a medication list obtained with a one-page structured medication history algorithm developed for the obstetrical setting and (2) the medication list reported in medical records and obtained by open-ended questions based on standard procedures. Both lists were converted into concordance rate using a best possible medication history approach as the reference (information obtained by patients, prescribers and community pharmacists’ interviews). Results The algorithm-based method obtained a higher average concordance rate than the standard method, with respectively 90.2% [CI95% 85.8–94.3] versus 24.6% [CI95%15.3–34.4] concordance rates (p<0.01). Conclusion Our algorithm-based method strongly enhanced the accuracy of the medication history in our obstetric population, without using substantial resources. Its implementation is an effective first step to the medication reconciliation process, which has been recognized as a very important component of patients’ drug safety. PMID:26999743

  4. Increasing accuracy of dispersal kernels in grid-based population models

    USGS Publications Warehouse

    Slone, D.H.

    2011-01-01

    Dispersal kernels in grid-based population models specify the proportion, distance and direction of movements within the model landscape. Spatial errors in dispersal kernels can have large compounding effects on model accuracy. Circular Gaussian and Laplacian dispersal kernels at a range of spatial resolutions were investigated, and methods for minimizing errors caused by the discretizing process were explored. Kernels of progressively smaller sizes relative to the landscape grid size were calculated using cell-integration and cell-center methods. These kernels were convolved repeatedly, and the final distribution was compared with a reference analytical solution. For large Gaussian kernels (σ > 10 cells), the total kernel error was <10^-11 compared to analytical results. Using an invasion model that tracked the time a population took to reach a defined goal, the discrete model results were comparable to the analytical reference. With Gaussian kernels that had σ ≤ 0.12 using the cell-integration method, or σ ≤ 0.22 using the cell-center method, the kernel error was greater than 10%, which resulted in invasion times that were orders of magnitude different from theoretical results. A goal-seeking routine was developed to adjust the kernels to minimize overall error. With this, corrections for small kernels were found that decreased overall kernel error to <10^-11 and invasion time error to <5%.
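    A one-dimensional sketch of the cell-center versus cell-integration discretizations discussed above, assuming unit grid cells; the paper's 2-D circular kernels are not reproduced here, but the same idea applies along each axis.

```python
import numpy as np
from scipy.special import erf

def gaussian_kernel_1d(sigma, radius):
    """Discretize a 1-D Gaussian dispersal kernel on unit cells centred at integers."""
    centers = np.arange(-radius, radius + 1)

    # cell-center method: sample the density at each cell midpoint
    center_vals = np.exp(-centers**2 / (2 * sigma**2))
    center_vals /= center_vals.sum()

    # cell-integration method: integrate the density across each cell
    cdf = lambda x: 0.5 * (1 + erf(x / (sigma * np.sqrt(2))))
    integ_vals = cdf(centers + 0.5) - cdf(centers - 0.5)
    integ_vals /= integ_vals.sum()
    return center_vals, integ_vals

# For a kernel that is small relative to the grid (sigma = 0.2 cells) the two
# discretizations differ noticeably; for large sigma they converge.
cc, ci = gaussian_kernel_1d(sigma=0.2, radius=3)
print(np.round(cc, 4))
print(np.round(ci, 4))
```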

  5. Increasing accuracy and throughput in large-scale microsatellite fingerprinting of cacao field germplasm collections

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Microsatellite-based DNA fingerprinting has been increasingly applied in crop genebank management. However, efficiency and cost savings remain a major challenge for large-scale genotyping, even when a medium- or high-throughput genotyping facility is available. In this study we report on increasing the

  6. Increased accuracy of batch fecundity estimates using oocyte stage ratios in Plectropomus leopardus.

    PubMed

    Carter, A B; Williams, A J; Russ, G R

    2009-08-01

    Using the ratio of the number of migratory nuclei to hydrated oocytes to estimate batch fecundity of common coral trout Plectropomus leopardus increases the time over which samples can be collected and, therefore, increases the sample size available and reduces biases in batch fecundity estimates. PMID:20738569

  7. Increasing accuracy of daily evapotranspiration through synergistic use of MSG and MERIS/AATSR

    NASA Astrophysics Data System (ADS)

    Timmermans, Joris; van der Tol, Christiaan; Su, Zhongbo

    2010-05-01

    Daily evapotranspiration estimates are important in many applications. Evapotranspiration plays a significant role in the water, energy and carbon cycles, and through these cycles it is important for monitoring droughts, managing agricultural irrigation, and weather forecast modeling. Drought levels and irrigation needs can be calculated from evapotranspiration because evapotranspiration estimates give a direct indication of the health and growth rate of crops. The evaporation from soil and open water bodies and the transpiration from plants combine as a lower boundary forcing on the atmosphere, affecting local and regional weather patterns. Evapotranspiration can be estimated using different techniques: ground measurements, hydrological modeling, and remote sensing algorithms. The first two techniques are not suitable for large-scale estimation of evapotranspiration: ground measurements are only valid within a small footprint area, and hydrological modelling requires knowledge of too large a number of processes. The advantage of remote sensing algorithms is that they are capable of estimating evapotranspiration over large scales with a limited number of parameters. In remote sensing a trade-off exists between temporal and spatial resolution: geostationary satellites have high temporal resolution but low spatial resolution, whereas near-polar orbiting satellites have high spatial resolution but low temporal resolution. For example, the SEVIRI sensor on the Meteosat Second Generation (MSG) satellite acquires images every 15 minutes with a resolution of 3 km, whereas the AATSR/MERIS combination on the ENVISAT satellite has a revisit time of several days with a 1 km resolution. Combining the advantages of geostationary and polar-orbiting satellites will greatly improve the accuracy of daily evapotranspiration estimates. Estimating daily evapotranspiration from near-polar orbiting satellites requires a method to

  8. The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling

    NASA Astrophysics Data System (ADS)

    Thornes, Tobias; Duben, Peter; Palmer, Tim

    2016-04-01

    At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But the resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating points - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - which represent large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low resolution (single-tier) double-precision models and similar-cost high resolution (two-tier) models in mixed-precision to produce accurate forecasts of this 'truth' are compared. The high resolution models outperform the low resolution ones even when small-scale variables are resolved in half-precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy. If adopted, this new
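    A toy numerical illustration of the scale-selective precision argument above (not the three-tier Lorenz '96 system of the abstract): storing only the weak small-scale component of a field in half precision introduces far less absolute error than truncating the whole field to half precision.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 512, endpoint=False)

# a field with an energetic large-scale component and a weak small-scale component
large_scale = 10.0 * np.sin(x)
small_scale = 0.1 * np.sin(40 * x + rng.uniform(0, 2 * np.pi))
field = large_scale + small_scale

# scale-selective precision: keep the large scale in 64 bits, store the
# small scale in 16 bits (half precision)
mixed = large_scale + small_scale.astype(np.float16).astype(np.float64)

# compare against truncating the whole field to half precision
uniform_half = field.astype(np.float16).astype(np.float64)

print("mixed-precision max error:", np.max(np.abs(field - mixed)))
print("uniform half-precision max error:", np.max(np.abs(field - uniform_half)))
```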

  9. Analysis of cost and accuracy of alternative strategies for Enterobacteriaceae identification.

    PubMed

    Robertson, E A; Macks, G C; MacLowry, J D

    1976-04-01

    Analysis of the cost of the time and materials required for the identification of Enterobacteriaceae isolates indicated that a conventional 17-tube (20-test) setup costs $7.98 per isolate identified. Using the API 20E, a similar identification cost $3.02. A conventional 7-tube (10-test) setup cost $3.60, whereas a comparable approach reduced this cost by 30% while increasing the number of isolates identified correctly by 3%. Other strategies using the API 20E or a deoxyribonuclease test were also evaluated for cost and accuracy. PMID:770498

  10. High Frequency rTMS over the Left Parietal Lobule Increases Non-Word Reading Accuracy

    ERIC Educational Resources Information Center

    Costanzo, Floriana; Menghini, Deny; Caltagirone, Carlo; Oliveri, Massimiliano; Vicari, Stefano

    2012-01-01

    Increasing evidence in the literature supports the usefulness of Transcranial Magnetic Stimulation (TMS) in studying reading processes. Two brain regions are primarily involved in phonological decoding: the left superior temporal gyrus (STG), which is associated with the auditory representation of spoken words, and the left inferior parietal lobe…

  11. Repeating a Monologue under Increasing Time Pressure: Effects on Fluency, Complexity, and Accuracy

    ERIC Educational Resources Information Center

    Thai, Chau; Boers, Frank

    2016-01-01

    Studies have shown that learners' task performance improves when they have the opportunity to repeat the task. Conditions for task repetition vary, however. In the 4/3/2 activity, learners repeat a monologue under increasing time pressure. The purpose is to foster fluency, but it has been suggested in the literature that it also benefits other…

  12. Increasing the accuracy and automation of fractional vegetation cover estimation from digital photographs

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The use of automated methods to estimate canopy cover (CC) from digital photographs has increased in recent years given its potential to produce accurate, fast and inexpensive CC measurements. Wide acceptance has been delayed because of the limitations of these methods. This work introduces a novel ...

  13. The increase of ultrasound measurements accuracy with the use of two-frequency sounding

    NASA Astrophysics Data System (ADS)

    Shulgina, Yu V.; Soldatov, A. I.; Rozanova, Ya V.; Soldatov, A. A.; Shulgin, E. M.

    2015-04-01

    This article considers a new method for determining the temporal position of the received echo signal. The method consists of successively emitting sounding pulses at two frequencies and analyzing the propagation time of the ultrasound to and from the deflector at each frequency. A detailed description of the mathematical tool is presented in the article; it allows the measurement error to be decreased.

  14. Accuracy of the Parallel Analysis Procedure with Polychoric Correlations

    ERIC Educational Resources Information Center

    Cho, Sun-Joo; Li, Feiming; Bandalos, Deborah

    2009-01-01

    The purpose of this study was to investigate the application of the parallel analysis (PA) method for choosing the number of factors in component analysis for situations in which data are dichotomous or ordinal. Although polychoric correlations are sometimes used as input for component analyses, the random data matrices generated for use in PA…
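    A sketch of Horn's parallel analysis using Pearson correlations as a simplified stand-in for the polychoric correlations studied above; the simulated ordinal data, category thresholds and retention rule (95th percentile of random eigenvalues) are illustrative assumptions.

```python
import numpy as np

def parallel_analysis(data, n_iter=500, seed=0):
    """Retain components whose observed eigenvalue exceeds the 95th percentile of
    eigenvalues obtained from random data of the same size (Horn's criterion)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

    rand_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        rand = rng.normal(size=(n, p))
        rand_eigs[i] = np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False))[::-1]

    threshold = np.percentile(rand_eigs, 95, axis=0)
    return int(np.sum(obs_eig > threshold))

# toy ordinal-looking data: two underlying factors, items cut into 4 categories
rng = np.random.default_rng(1)
factors = rng.normal(size=(300, 2))
loadings = rng.uniform(0.5, 0.9, size=(2, 8))
items = np.digitize(factors @ loadings + rng.normal(scale=0.5, size=(300, 8)),
                    bins=[-1.0, 0.0, 1.0])
print("components to retain:", parallel_analysis(items))
```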

  15. Evaluation of precision and accuracy of selenium measurements in biological materials using neutron activation analysis

    SciTech Connect

    Greenberg, R.R.

    1988-01-01

    In recent years, the accurate determination of selenium in biological materials has become increasingly important in view of the essential nature of this element for human nutrition and its possible role as a protective agent against cancer. Unfortunately, the accurate determination of selenium in biological materials is often difficult for most analytical techniques for a variety of reasons, including interferences, complicated selenium chemistry due to the presence of this element in multiple oxidation states and in a variety of different organic species, stability and resistance to destruction of some of these organo-selenium species during acid dissolution, volatility of some selenium compounds, and potential for contamination. Neutron activation analysis (NAA) can be one of the best analytical techniques for selenium determinations in biological materials for a number of reasons. Currently, precision at the 1% level (1s) and overall accuracy at the 1 to 2% level (95% confidence interval) can be attained at the U.S. National Bureau of Standards (NBS) for selenium determinations in biological materials when counting statistics are not limiting (using the ⁷⁵Se isotope). An example of this level of precision and accuracy is summarized. Achieving this level of accuracy, however, requires strict attention to all sources of systematic error. Precise and accurate results can also be obtained after radiochemical separations.

  16. Bayesian approach increases accuracy when selecting cowpea genotypes with high adaptability and phenotypic stability.

    PubMed

    Barroso, L M A; Teodoro, P E; Nascimento, M; Torres, F E; Dos Santos, A; Corrêa, A M; Sagrilo, E; Corrêa, C C G; Silva, F A; Ceccon, G

    2016-01-01

    This study aimed to verify that a Bayesian approach could be used for the selection of upright cowpea genotypes with high adaptability and phenotypic stability, and the study also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 17 upright cowpea genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian approach was effective for selection of upright cowpea genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. PMID:26985961

  17. Increased ephemeris accuracy using attitude-dependent aerodynamic force coefficients for inertially stabilized spacecraft

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Baker, David F.

    1991-01-01

    The FREEMAC program used to generate the aerodynamic coefficients is described, along with associated routines that allow the results to be used in other software. These capabilities are applied in two numerical examples to the short-term orbit prediction of the Gamma Ray Observatory (GRO) and Hubble Space Telescope (HST) spacecraft. Predictions using attitude-dependent aerodynamic coefficients were made with a modified version of the PC-based Ephemeris Generation Program (EPHGEN) and were compared to definitive orbit solutions obtained from actual tracking data. The numerical results show improvement in the predicted semi-major axis and along-track positions that would seem to be worth the added computational effort. Finally, other orbit and attitude analysis applications are noted that could profit from using FREEMAC-calculated aerodynamic coefficients, including orbital lifetime studies, orbit determination methods, attitude dynamics simulators, and spacecraft control system component sizing.

  18. Increasing the accuracy in the application of global ionospheric maps computed from GNSS data

    NASA Astrophysics Data System (ADS)

    Hernadez-Pajarez, Manuel; Juan, Miguel; Sanz, Jaume; Garcia-Rigo, Alberto

    2013-04-01

    Since June 1998 the Technical University of Catalonia (UPC) has been contributing to the International GNSS Service (IGS) by providing global maps of the Vertical Total Electron Content (VTEC) of the ionosphere, computed with global tomographic modelling from dual-frequency GNSS measurements of the global IGS network. Due to IGS requirements, and in order to facilitate the combination of global VTEC products from different analysis centers (computed with different techniques and software) into a common product, these global ionospheric maps have been provided as a two-dimensional (2D) description (VTEC), even though they were computed from the very beginning with a tomographic model that estimates the top- and bottomside electron content separately (see the above-mentioned references). In this work we study the impact of incorporating the raw vertical distribution of electron content (preserved from the original UPC tomographic runs) into the algorithm for retrieving the Slant TEC (STEC) for a given receiver-transmitter line-of-sight and time, as a "companion map" to the original UPC global VTEC map distributed through IGS servers in IONEX format. The performance is evaluated taking as ground truth the very accurate STEC difference values provided by direct GNSS observation in a continuous arc of dual-frequency data (for a given GNSS satellite-receiver pair) for several globally distributed receivers that were not involved in the computation of the global VTEC maps.

  19. Coupled Loads Analysis Accuracy from the Space Vehicle Perspective

    NASA Astrophysics Data System (ADS)

    Dickens, J. M.; Wittbrodt, M. J.; Gate, M. M.; Li, L. H.; Stroeve, A.

    2001-01-01

    Coupled loads analysis (CLA) consists of performing a structural response analysis, usually a time-history response analysis, with reduced dynamic models typically provided by two different companies to obtain the coupled response of a launch vehicle and space vehicle to the launching and staging events required to place the space vehicle into orbit. The CLA is performed by the launch vehicle contractor with a reduced dynamics mathematical model that is coupled to the launch vehicle, or booster, model to determine the coupled loads for each substructure. Recently, the booster and space vehicle contractors have been from different countries. Due to the language differences and governmental restrictions, the verification of the CLA is much more difficult than when working with launch vehicle and space vehicle contractors of the same country. This becomes exceedingly clear when the CLA analysis results do not seem to pass an intuitive judgement. Presented in the sequel are three checks that a space vehicle contractor can perform on the results of a coupled loads analysis to partially verify the analysis.

  20. Cytopathological Analysis of Cyst Fluid Enhances Diagnostic Accuracy of Mucinous Pancreatic Cystic Neoplasms

    PubMed Central

    Utomo, Wesley K.; Braat, Henri; Bruno, Marco J.; van Eijck, Casper H.J.; Koerkamp, Bas Groot; Krak, Nanda C.; van de Vreede, Adriaan; Fuhler, Gwenny M.; Peppelenbosch, Maikel P.; Biermann, Katharina

    2015-01-01

    Widespread use of cross-sectional imaging and the increasing age of the general population have increased the number of detected pancreatic cystic lesions. However, several pathological entities with varying malignant potential have to be discriminated to allow clinical decision making. Discrimination between mucinous pancreatic cystic neoplasms (PCNs) and nonmucinous pancreatic lesions is the primary step in the clinical work-up, as malignant transformation is mostly associated with mucinous PCN. We performed a retrospective analysis of all PCN resected in our tertiary center from 2000 to 2014 to evaluate preoperative diagnostic performance and the results of implementation of the consensus guidelines over time. This was followed by a prospective cohort study of patients with an undefined pancreatic cyst, in which the added value of cytopathological mucin evaluation to carcinoembryonic antigen (CEA) in cyst fluid for the discrimination of mucinous PCN and nonmucinous cysts was investigated. The retrospective analysis included 115 patients operated on for a PCN, with a correct preoperative classification in 96.2% of the patients. High-grade dysplasia or invasive carcinoma was observed in only 32.3% of mucinous PCN. In our prospective cohort (n = 71), 57.7% of patients were classified as having a mucinous PCN. CEA ≥192 ng/mL had an accuracy of 63.4%, and cytopathological mucin evaluation an accuracy of 73.0%. Combining these 2 tests further improved the diagnostic accuracy for a mucinous PCN to 76.8%. CEA level and mucin evaluation were not predictive of the degree of dysplasia. These findings show that adding cytopathology to cyst fluid biochemistry improves discrimination between mucinous PCN and nonmucinous cysts.
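    The gain from combining cyst-fluid CEA with cytopathological mucin evaluation can be illustrated with a toy calculation; the patient values below are invented, and the simple either-test-positive rule is an assumption for illustration, not the study's decision rule.

```python
import numpy as np

def diagnostic_accuracy(prediction, truth):
    """Fraction of cases classified correctly."""
    return float((prediction == truth).mean())

# hypothetical cyst-fluid work-up for 8 patients (values are illustrative only)
cea = np.array([250.0, 10.0, 500.0, 150.0, 30.0, 400.0, 80.0, 5.0])    # ng/mL
mucin_positive = np.array([True, False, False, True, False, True, True, False])
mucinous_truth = np.array([True, False, True, True, False, True, True, False])

cea_only = cea >= 192                    # cut-off reported in the abstract above
combined = cea_only | mucin_positive     # call mucinous if either test is positive

print("CEA alone:", diagnostic_accuracy(cea_only, mucinous_truth))
print("CEA + cytopathology:", diagnostic_accuracy(combined, mucinous_truth))
```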

  1. ssDNA Pairing Accuracy Increases When Abasic Sites Divide Nucleotides into Small Groups

    PubMed Central

    Peacock-Villada, Alexandra; Coljee, Vincent; Danilowicz, Claudia; Prentiss, Mara

    2015-01-01

    Accurate sequence-dependent pairing of single-stranded DNA (ssDNA) molecules plays an important role in gene chips, DNA origami, and polymerase chain reactions. In many assays accurate pairing depends on mismatched sequences melting at lower temperatures than matched sequences; however, for sequences longer than ~10 nucleotides, single mismatches and correct matches have melting temperature differences of less than 3°C. We demonstrate that appropriately grouping 35 bases in ssDNA using abasic sites increases the difference between the melting temperature of correct base pairings and the melting temperature of mismatched base pairings. Importantly, in the presence of appropriately spaced abasic sites, mismatches near one end of a long dsDNA destabilize the annealing at the other end much more effectively than in systems without the abasic sites, suggesting that the dsDNA melts more uniformly in the presence of appropriately spaced abasic sites. In sum, the presence of appropriately spaced abasic sites allows temperature to more accurately discriminate correct base pairings from incorrect ones. PMID:26115175

  2. Increase of Readability and Accuracy of 3d Models Using Fusion of Close Range Photogrammetry and Laser Scanning

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Malarić, I.

    2012-07-01

    The development of laser scanning technology has opened a new page in geodesy and enabled an entirely new way of presenting data. Products obtained by laser scanning are used in many sciences, including archaeology. It should be noted that 3D models of archaeological artefacts obtained by laser scanning are fully measurable, recorded at 1:1 scale and have high accuracy. On the other hand, the texture and RGB values of the object surface obtained by a laser scanner have lower resolution and poorer radiometric characteristics than textures captured with a digital camera. The goal of this research was to increase the accuracy and readability of the 3D model using textures obtained with a digital camera. Laser scanning was performed with a high-accuracy triangulation scanner, the Vivid 9i (Konica Minolta), while a Nikon D90 digital camera with a fixed 20 mm focal length lens was used for the photogrammetric recording. The a posteriori accuracy of the global registration of the point clouds, expressed as a standard deviation, was ±0.136 mm, while the average distance was only ±0.080 mm. The research also showed that projecting high-quality texture onto the model increases its readability. Recording archaeological artefacts and producing photorealistic 3D models of them contributes greatly to archaeology as a science and accelerates the processing and reconstruction of findings. It also allows the presentation of findings to the general public, not just to experts.

  3. Tourniquet Test for Dengue Diagnosis: Systematic Review and Meta-analysis of Diagnostic Test Accuracy

    PubMed Central

    Reid, Hamish; Thomas, Emma; Foster, Charlie; Darton, Thomas C.

    2016-01-01

    Background Dengue fever is a ubiquitous arboviral infection in tropical and sub-tropical regions, whose incidence has increased over recent decades. In the absence of a rapid point of care test, the clinical diagnosis of dengue is complex. The World Health Organisation has outlined diagnostic criteria for making the diagnosis of dengue infection, which includes the use of the tourniquet test (TT). Purpose To assess the quality of the evidence supporting the use of the TT and perform a diagnostic accuracy meta-analysis comparing the TT to antibody response measured by ELISA. Data Sources A comprehensive literature search was conducted in the following databases to April, 2016: MEDLINE (PubMed), EMBASE, Cochrane Central Register of Controlled Trials, BIOSIS, Web of Science, SCOPUS. Study Selection Studies comparing the diagnostic accuracy of the tourniquet test with ELISA for the diagnosis of dengue were included. Data Extraction Two independent authors extracted data using a standardized form. Data Synthesis A total of 16 studies with 28,739 participants were included in the meta-analysis. Pooled sensitivity for dengue diagnosis by TT was 58% (95% Confidence Interval (CI), 43%-71%) and the specificity was 71% (95% CI, 60%-80%). In the subgroup analysis sensitivity for non-severe dengue diagnosis was 55% (95% CI, 52%-59%) and the specificity was 63% (95% CI, 60%-66%), whilst sensitivity for dengue hemorrhagic fever diagnosis was 62% (95% CI, 53%-71%) and the specificity was 60% (95% CI, 48%-70%). Receiver-operator characteristics demonstrated a test accuracy (AUC) of 0.70 (95% CI, 0.66–0.74). Conclusion The tourniquet test is widely used in resource poor settings despite currently available evidence demonstrating only a marginal benefit in making a diagnosis of dengue infection alone. Registration The protocol for this systematic review was registered at PROSPERO: CRD42015020323. PMID:27486661

  4. In pursuit of virtual lead optimization: Pruning ensembles of receptor structures for increased efficiency and accuracy during docking

    PubMed Central

    Bolstad, Erin S. D.; Anderson, Amy C.

    2008-01-01

    Representing receptors as ensembles of protein conformations during docking is a powerful method to approximate protein flexibility and increase the accuracy of the resulting ranked list of compounds. Unfortunately, docking compounds against a large number of ensemble members can increase computational cost and time investment. In this manuscript, we present an efficient method to evaluate and select the most contributive ensemble members prior to docking for targets with a conserved core of residues that bind a ligand moiety. We observed that ensemble members that preserve the geometry of the active site core are most likely to place ligands in the active site with a conserved orientation, generally rank ligands correctly and increase interactions with the receptor. A relative distance approach is used to quantify the preservation of the three-dimensional interatomic distances of the conserved ligand-binding atoms and prune large ensembles quickly. In this study, we investigate dihydrofolate reductase as an example of a protein with a conserved core; however, this method for accurately selecting relevant ensemble members a priori can be applied to any system with a conserved ligand-binding core, including HIV-1 protease, kinases and acetylcholinesterase. Representing a drug target as a pruned ensemble during in silico screening should increase the accuracy and efficiency of high throughput analyses of lead analogs. PMID:18781587
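    A hedged sketch of the relative-distance idea described above: ensemble members whose conserved-core interatomic distances deviate little from a reference geometry are kept, and the rest are pruned. The coordinates, cutoff and deviation metric below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def core_distance_deviation(coords, reference):
    """RMS deviation between the interatomic distance matrices of the conserved
    ligand-binding core atoms of an ensemble member and a reference structure.
    Using internal distances avoids having to superimpose the structures first."""
    d = lambda xyz: np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    diff = d(coords) - d(reference)
    iu = np.triu_indices(len(coords), k=1)
    return np.sqrt(np.mean(diff[iu] ** 2))

# hypothetical core coordinates (Angstroms) for a reference and two ensemble members
reference = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [1.9, 3.2, 0.0], [1.9, 1.1, 2.9]])
rng = np.random.default_rng(0)
member_a = reference + rng.normal(scale=0.05, size=reference.shape)   # core preserved
member_b = reference + rng.normal(scale=0.80, size=reference.shape)   # core distorted

cutoff = 0.3
for name, member in [("A", member_a), ("B", member_b)]:
    dev = core_distance_deviation(member, reference)
    print(f"member {name}: deviation {dev:.2f} A -> {'keep' if dev <= cutoff else 'prune'}")
```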

  5. Spatial and Temporal Analysis on the Distribution of Active Radio-Frequency Identification (RFID) Tracking Accuracy with the Kriging Method

    PubMed Central

    Liu, Xin; Shannon, Jeremy; Voun, Howard; Truijens, Martijn; Chi, Hung-Lin; Wang, Xiangyu

    2014-01-01

    Radio frequency identification (RFID) technology has already been applied in a number of areas to facilitate the tracking process. However, the insufficient tracking accuracy of RFID is one of the problems that impedes its wider application. Previous studies focus on examining the accuracy of discrete points RFID, thereby leaving the tracking accuracy of the areas between the observed points unpredictable. In this study, spatial and temporal analysis is applied to interpolate the continuous distribution of RFID tracking accuracy based on the Kriging method. An implementation trial has been conducted in the loading and docking area in front of a warehouse to validate this approach. The results show that the weak signal area can be easily identified by the approach developed in the study. The optimum distance between two RFID readers and the effect of the sudden removal of readers are also presented by analysing the spatial and temporal variation of RFID tracking accuracy. This study reveals the correlation between the testing time and the stability of RFID tracking accuracy. Experimental results show that the proposed approach can be used to assist the RFID system setup process to increase tracking accuracy. PMID:25356648

  6. Spatial and temporal analysis on the distribution of active radio-frequency identification (RFID) tracking accuracy with the Kriging method.

    PubMed

    Liu, Xin; Shannon, Jeremy; Voun, Howard; Truijens, Martijn; Chi, Hung-Lin; Wang, Xiangyu

    2014-01-01

    Radio frequency identification (RFID) technology has already been applied in a number of areas to facilitate the tracking process. However, the insufficient tracking accuracy of RFID is one of the problems that impedes its wider application. Previous studies focus on examining the accuracy of discrete points RFID, thereby leaving the tracking accuracy of the areas between the observed points unpredictable. In this study, spatial and temporal analysis is applied to interpolate the continuous distribution of RFID tracking accuracy based on the Kriging method. An implementation trial has been conducted in the loading and docking area in front of a warehouse to validate this approach. The results show that the weak signal area can be easily identified by the approach developed in the study. The optimum distance between two RFID readers and the effect of the sudden removal of readers are also presented by analysing the spatial and temporal variation of RFID tracking accuracy. This study reveals the correlation between the testing time and the stability of RFID tracking accuracy. Experimental results show that the proposed approach can be used to assist the RFID system setup process to increase tracking accuracy. PMID:25356648
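    A minimal ordinary-kriging sketch of interpolating RFID tracking accuracy between observed points, assuming a Gaussian variogram with illustrative parameters; the sample locations and accuracy values are invented, and the variogram model is an assumption rather than the one fitted in the study.

```python
import numpy as np

def gaussian_variogram(h, sill=1.0, range_=10.0, nugget=0.01):
    """Assumed Gaussian variogram model; the small nugget also aids numerical stability."""
    return nugget + sill * (1.0 - np.exp(-(h / range_) ** 2))

def ordinary_kriging(points, values, target):
    """Ordinary kriging estimate of tracking accuracy at `target` from sampled points."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))          # bordered kriging matrix (last row/col enforce sum(w)=1)
    A[:n, :n] = gaussian_variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gaussian_variogram(np.linalg.norm(points - target, axis=1))
    weights = np.linalg.solve(A, b)[:n]
    return weights @ values

# hypothetical read-accuracy samples (%) measured at discrete points in the dock area
points = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 2.0]])
accuracy = np.array([92.0, 85.0, 88.0, 70.0, 90.0])
print(ordinary_kriging(points, accuracy, target=np.array([6.0, 6.0])))
```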

  7. Line-shapes analysis with ultra-high accuracy

    NASA Astrophysics Data System (ADS)

    Wcisło, Piotr; Cygan, Agata; Lisak, Daniel; Ciuryło, Roman

    2014-11-01

    We present an analysis of the R7 Q8 O2 B-band rovibronic transition measured with an ultra-high signal-to-noise ratio by Pound-Drever-Hall-locked frequency-stabilized cavity ring-down spectroscopy. For the line-shape calculations, an ab initio-in-spirit approach was used, based on numerical solution of the appropriate transport/relaxation equation. Consequences for the spectroscopic determination of the Boltzmann constant, as well as for precise determination of the line position in Doppler-limited spectroscopy, are indicated.

  8. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    PubMed Central

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  9. Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis

    PubMed Central

    Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.

    2015-01-01

    Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505

  10. Surface Accuracy Analysis of Single Panels for the Shanghai 65-M Radio Telescope

    NASA Astrophysics Data System (ADS)

    Fu, Li; Liu, Guoxi; Jin, Chao; Yan, Feng; An, Tao; Shen, Zhiqiang

    We present surface accuracy measurements of 5 individual panels of the Shanghai 65-m radio telescope, made with a coordinate measuring machine and a laser tracker. The measurement data obtained from the two instruments were analyzed with common-point transformation and CAD surface-fitting techniques, respectively. The derived rms uncertainties of the panel accuracy from the two methods are consistent with each other, and both meet the design specification. Finite element simulations of the effects of manufacturing error, gravity, temperature and wind on the panel surface accuracy suggest that the first two factors are the primary sources of accuracy uncertainty. The panel deformation under a concentrated load was analyzed through finite element analysis and experiment, with a comparison error of 5.6%. No plastic deformation occurs when a person weighing less than 70 kg installs or repairs the panel.

  11. Analysis of visual plotting accuracy and sporadic pollution and consequences for shower association.

    NASA Astrophysics Data System (ADS)

    Koschack, R.

    1991-12-01

    An analysis of the plotting accuracy and of the sporadic pollution for visual meteor observations is given. It is found that both factors limit the observability of minor showers to ZHR ≥ 3. Based on the results of the analysis, rules are developed for minor shower observations.

  12. Accuracy Analysis of a Low-Cost Platform for Positioning and Navigation

    NASA Astrophysics Data System (ADS)

    Hofmann, S.; Kuntzsch, C.; Schulze, M. J.; Eggert, D.; Sester, M.

    2012-07-01

    This paper presents an accuracy analysis of a platform based on low-cost components for landmark-based navigation, intended for research and teaching purposes. The proposed platform includes a LEGO MINDSTORMS NXT 2.0 kit, an Android-based smartphone, and a compact Hokuyo URG-04LX laser scanner. The robot is used in a small indoor environment where GNSS is not available; therefore, a landmark map was produced in advance, with the landmark positions provided to the robot. All steps of the procedure to set up the platform are shown. The main focus of this paper is the achievable positioning accuracy, which was analyzed in this type of scenario as a function of the accuracy of the reference landmarks and the directional and distance measuring accuracy of the laser scanner. Several experiments were carried out, demonstrating the practically achievable positioning accuracy. To evaluate the accuracy, ground truth was acquired using a total station. These results are compared to the theoretically achievable accuracies and the laser scanner's characteristics.
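    A small sketch of landmark-based position estimation from noisy range measurements to known landmarks, solved as a nonlinear least-squares problem with SciPy; the landmark layout, noise level and range-only formulation are assumptions for illustration (the platform above also exploits scan directions).

```python
import numpy as np
from scipy.optimize import least_squares

# known landmark positions in the map frame (metres) -- illustrative values
landmarks = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]])
true_pose = np.array([1.5, 1.0])

# simulated laser range measurements to each landmark, with sensor noise
rng = np.random.default_rng(0)
ranges = np.linalg.norm(landmarks - true_pose, axis=1) + rng.normal(0, 0.01, size=4)

def residuals(pose):
    """Difference between predicted and measured ranges for a candidate position."""
    return np.linalg.norm(landmarks - pose, axis=1) - ranges

estimate = least_squares(residuals, x0=np.array([2.0, 2.0])).x
print("estimated position:", np.round(estimate, 3))
print("error (m):", np.linalg.norm(estimate - true_pose))
```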

  13. Long-term deflections of reinforced concrete elements: accuracy analysis of predictions by different methods

    NASA Astrophysics Data System (ADS)

    Gribniak, Viktor; Bacinskas, Darius; Kacianauskas, Rimantas; Kaklauskas, Gintaris; Torres, Lluis

    2013-08-01

    Long-term deflection response of reinforced concrete flexural members is influenced by the interaction of complex physical phenomena, such as concrete creep, shrinkage and cracking, which makes its prediction difficult. A number of approaches are proposed by design codes with different degrees of simplification and accuracy. This paper statistically investigates the accuracy of long-term deflection predictions made by some of the most widely used design codes (Eurocode 2, ACI 318, ACI 435, and the new Russian code SP 52-101) and a numerical technique proposed by the authors. The accuracy is analyzed using test data of 322 reinforced concrete members from 27 test programs reported in the literature. The predictions of each technique are discussed, and a comparative analysis is made showing the influence of different parameters, such as sustained loading duration, compressive strength of concrete, loading intensity and reinforcement ratio, on the prediction accuracy.

  14. Increased accuracy of species lists developed for alpine lakes using morphology and cytochrome oxidase I for identification of specimens.

    PubMed

    Deiner, Kristy; Knapp, Roland A; Boiano, Daniel M; May, Bernie

    2013-09-01

    The first step in many community ecology studies is to produce a species list from a sample of individuals. Community ecologists now have two viable ways of producing a species list: morphological and barcode identification. In this study, we compared the taxonomic resolution gained by a combined use of both methods and tested whether a change in taxonomic resolution significantly impacted richness estimates for benthic macroinvertebrates sampled from ten lakes in Sequoia National Park, USA. Across all lakes, 77 unique taxa were identified and 42% (32) were reliably identified to species using both barcode and morphological identification. Of the 32 identified to species, 63% (20) were identified solely by comparing the barcode sequence from cytochrome oxidase I to the Barcode of Life reference library. The increased resolution using a combined identification approach compared to identifications based solely on morphology resulted in a significant increase in estimated richness within a lake at the order, family, genus and species levels of taxonomy (P < 0.05). Additionally, young or damaged individuals that could not be identified using morphology were identified using their COI sequences to the genus or species level on average 75% of the time. Our results demonstrate that a combined identification approach improves accuracy of benthic macroinvertebrate species lists in alpine lakes and subsequent estimates of richness. We encourage the use of barcodes for identification purposes and specifically when morphology is insufficient, as in the case of damaged and early life stage specimens of benthic macroinvertebrates. PMID:23773698

  15. Geolocation and Pointing Accuracy Analysis for the WindSat Sensor

    NASA Technical Reports Server (NTRS)

    Meissner, Thomas; Wentz, Frank J.; Purdy, William E.; Gaiser, Peter W.; Poe, Gene; Uliana, Enzo A.

    2006-01-01

    Geolocation and pointing accuracy analyses of the WindSat flight data are presented. The two topics were intertwined in the flight data analysis and are addressed together. WindSat has no unusual geolocation requirements relative to other sensors, but its beam pointing knowledge accuracy is especially critical to support accurate polarimetric radiometry. Pointing accuracy was improved and verified using geolocation analysis in conjunction with scan bias analysis. Two methods were needed to properly identify and differentiate between data time-tagging and pointing knowledge errors. Matchups comparing coastlines indicated in the imagery data with their known geographic locations were used to identify geolocation errors. These coastline matchups showed possible pointing errors, with ambiguities as to the true source of the errors. Scan bias analysis of U, the third Stokes parameter, and of the vertical and horizontal polarizations provided measurements of pointing offsets, resolving ambiguities in the coastline matchup analysis. Several geolocation and pointing bias sources were incrementally eliminated, resulting in pointing knowledge and geolocation accuracy that met all design requirements.

  16. There's a Bug in Your Ear!: Using Technology to Increase the Accuracy of DTT Implementation

    ERIC Educational Resources Information Center

    McKinney, Tracy; Vasquez, Eleazar, III.

    2014-01-01

    Many professionals have successfully implemented discrete trial teaching in the past. However, there have not been extensive studies examining the accuracy of discrete trial teaching implementation. This study investigated the use of Bug in Ear feedback on the accuracy of discrete trial teaching implementation among two pre-service teachers…

  17. Accuracy of mucocutaneous leishmaniasis diagnosis using polymerase chain reaction: systematic literature review and meta-analysis

    PubMed Central

    Gomes, Ciro Martins; Mazin, Suleimy Cristina; dos Santos, Elisa Raphael; Cesetti, Mariana Vicente; Bächtold, Guilherme Albergaria Brízida; Cordeiro, João Henrique de Freitas; Theodoro, Fabrício Claudino Estrela Terra; Damasco, Fabiana dos Santos; Carranza, Sebastián Andrés Vernal; Santos, Adriana de Oliveira; Roselino, Ana Maria; Sampaio, Raimunda Nonata Ribeiro

    2015-01-01

    The diagnosis of mucocutaneous leishmaniasis (MCL) is hampered by the absence of a gold standard. An accurate diagnosis is essential because of the high toxicity of the medications for the disease. This study aimed to assess the ability of polymerase chain reaction (PCR) to identify MCL and to compare these results with clinical research recently published by the authors. A systematic literature review based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA Statement was performed using comprehensive search criteria and communication with the authors. A meta-analysis considering the estimates of the univariate and bivariate models was performed. Specificity near 100% was common among the papers. The primary reason for accuracy differences was sensitivity. The meta-analysis, which was only possible for PCR samples of lesion fragments, revealed a sensitivity of 71% [95% confidence interval (CI) = 0.59; 0.81] and a specificity of 93% (95% CI = 0.83; 0.98) in the bivariate model. The search for measures that could increase the sensitivity of PCR should be encouraged. The quality of the collected material and the optimisation of the amplification of genetic material should be prioritised. PMID:25946238

  18. Inclusion of quality controls on leishmaniases molecular tests to increase diagnostic accuracy in research and reference laboratories.

    PubMed

    da C Gonçalves-de-Albuquerque, Suênia; Pessoa-e-Silva, Rômulo; Trajano-Silva, Lays A M; de Morais, Rayana C S; Brandão-Filho, Sinval P; de Paiva-Cavalcanti, Milena

    2015-04-01

    Early detection of leishmaniases and prompt institution of treatment are paramount for individuals and communities affected by these diseases. To overcome the remaining limitations inherent to molecular methods currently used and to ensure the accuracy of results in leishmaniases diagnosis, two triplex polymerase chain reaction (PCR) assays with quality controls for the reactions were developed. Validity indicators were assessed in 186 dog blood samples from endemic areas in Brazil. The level of agreement between the new tools and their singleplex protocols was assessed by kappa analysis. The triplex PCR for visceral leishmaniasis showed sensitivity (S) = 78.68 %, specificity (E) = 85.29 %, and efficiency (e) = 81.05 %. The cutaneous leishmaniasis protocol showed S = 97.29 %, E = 79.16 %, and e = 90.16 %. Both protocols showed good agreement with gold standards. These new tools enable, in a single reaction, the diagnosis of the diseases and the evaluation of the sample quality and DNA extraction process, thus reducing the cost of reagents and avoiding the eventual need for collecting a second sample. PMID:25428552

  19. Diagnostic accuracy of refractometry for assessing bovine colostrum quality: A systematic review and meta-analysis.

    PubMed

    Buczinski, S; Vandeweerd, J M

    2016-09-01

    Provision of good quality colostrum [i.e., immunoglobulin G (IgG) concentration ≥50g/L] is the first step toward ensuring proper passive transfer of immunity for young calves. Precise quantification of colostrum IgG levels cannot be easily performed on the farm. Assessment of the refractive index using a Brix scale with a refractometer has been described as being highly correlated with IgG concentration in colostrum. The aim of this study was to perform a systematic review of the diagnostic accuracy of Brix refractometry to diagnose good quality colostrum. From 101 references initially obtained, 11 were included in the systematic review and meta-analysis, representing 4,251 colostrum samples. The prevalence of good colostrum samples with IgG ≥50g/L varied from 67.3 to 92.3% (median 77.9%). Specific estimates of accuracy [sensitivity (Se) and specificity (Sp)] were obtained for different reported cut-points using a hierarchical summary receiver operating characteristic curve model. For the cut-point of 22% (n=8 studies), Se=80.2% (95% CI: 71.1-87.0%) and Sp=82.6% (71.4-90.0%). Decreasing the cut-point to 18% increased Se [96.1% (91.8-98.2%)] and decreased Sp [54.5% (26.9-79.6%)]. Modeling the effect of these Brix accuracy estimates using a stochastic simulation and Bayes theorem showed that a positive result with the 22% Brix cut-point can be used to diagnose good quality colostrum [posttest probability of good colostrum: 94.3% (90.7-96.9%)]. The posttest probability of good colostrum with a Brix value <18% was only 22.7% (12.3-39.2%). Based on this study, the 2 cut-points could be alternatively used to select good quality colostrum (sample with Brix ≥22%) or to discard poor quality colostrum (sample with Brix <18%). When sample results are between these 2 values, colostrum supplementation should be considered. PMID:27423958
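
    To make the Bayes-theorem step concrete, the sketch below (not the study's code) reproduces the posttest-probability calculation from the point estimates quoted in the abstract: Se = 80.2% and Sp = 82.6% at the 22% Brix cut-point, with a median pretest prevalence of 77.9% good-quality colostrum.

    ```python
    def posttest_probability(pretest, sensitivity, specificity):
        """P(good colostrum | positive Brix test) via Bayes' theorem."""
        true_pos = pretest * sensitivity
        false_pos = (1.0 - pretest) * (1.0 - specificity)
        return true_pos / (true_pos + false_pos)

    # Point estimates from the abstract; the paper propagates their uncertainty
    # with a stochastic simulation, which this sketch omits.
    p = posttest_probability(pretest=0.779, sensitivity=0.802, specificity=0.826)
    print(f"Posttest probability of good colostrum: {p:.1%}")  # ~94%, as reported
    ```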

  20. ‘Lorentzian’ analysis of the accuracy of modern catalogues of stellar positions

    NASA Astrophysics Data System (ADS)

    Varaksina, N. Y.; Nefedyev, Y. A.; Churkin, K. O.; Zabbarova, R. R.; Demin, S. A.

    2015-12-01

    A new approach is presented for estimating the positional accuracy and proper motions of stars in astrometric catalogues, based on comparing the star positions in the catalogue under study and in the Hipparcos catalogue at different epochs but for a standard equinox. To verify the method, an analysis of the star positions and proper motions in the UCAC2, PPM, ACRS, Tycho-2, ACT, TRC, FON and Tycho catalogues was carried out. The study found that the accuracies of the positions and proper motions of the stars in the Tycho-2 and UCAC2 catalogues are approximately equal. The results of the comparison are presented graphically.

  1. Accuracy analysis of the space shuttle solid rocket motor profile measuring device

    NASA Technical Reports Server (NTRS)

    Estler, W. Tyler

    1989-01-01

    The Profile Measuring Device (PMD) was developed at the George C. Marshall Space Flight Center following the loss of the Space Shuttle Challenger. It is a rotating gauge used to measure the absolute diameters of mating features of redesigned Solid Rocket Motor field joints. Diameter tolerances of these features are typically + or - 0.005 inches, and it is required that the PMD absolute measurement uncertainty be within this tolerance. In this analysis, the absolute accuracy of these measurements was found to be + or - 0.00375 inches, worst case, with a potential accuracy of + or - 0.0021 inches achievable by improved temperature control.

  2. Orbit Determination Accuracy Analysis of the Magnetospheric Multiscale Mission During Perigee Raise

    NASA Technical Reports Server (NTRS)

    Pachura, Daniel A.; Vavrina, Matthew A.; Carpenter, J. R.; Wright, Cinnamon A.

    2014-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Facility (FDF) will provide orbit determination and prediction support for the Magnetospheric Multiscale (MMS) mission during the mission's commissioning period. The spacecraft will launch into a highly elliptical Earth orbit in 2015. Starting approximately four days after launch, a series of five large perigee-raising maneuvers will be executed near apogee on a nearly every-other-orbit cadence. This perigee-raise operations concept requires a high-accuracy estimate of the orbital state within one orbit following the maneuver for performance evaluation and a high-accuracy orbit prediction to correctly plan and execute the next maneuver in the sequence. During early mission design, a linear covariance analysis method was used to study orbit determination and prediction accuracy for this perigee-raising campaign. This paper provides a higher fidelity Monte Carlo analysis using the operational COTS extended Kalman filter implementation that was performed to validate the linear covariance analysis estimates and to better characterize orbit determination performance for actively maneuvering spacecraft in a highly elliptical orbit. The study finds that the COTS extended Kalman filter tool converges on accurate definitive orbit solutions quickly, but prediction accuracy through orbits with very low altitude perigees is degraded by the unpredictability of atmospheric density variation.

  3. Orbit Determination Accuracy Analysis of the Magnetospheric Multiscale Mission During Perigee Raise

    NASA Technical Reports Server (NTRS)

    Pachura, Daniel A.; Vavrina, Matthew A.; Carpenter, J. Russell; Wright, Cinnamon A.

    2014-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Facility (FDF) will provide orbit determination and prediction support for the Magnetospheric Multiscale (MMS) mission during the mission's commissioning period. The spacecraft will launch into a highly elliptical Earth orbit in 2015. Starting approximately four days after launch, a series of five large perigee-raising maneuvers will be executed near apogee on a nearly every-other-orbit cadence. This perigee-raise operations concept requires a high-accuracy estimate of the orbital state within one orbit following the maneuver for performance evaluation and a high-accuracy orbit prediction to correctly plan and execute the next maneuver in the sequence. During early mission design, a linear covariance analysis method was used to study orbit determination and prediction accuracy for this perigee-raising campaign. This paper provides a higher fidelity Monte Carlo analysis using the operational COTS extended Kalman filter implementation that was performed to validate the linear covariance analysis estimates and to better characterize orbit determination performance for actively maneuvering spacecraft in a highly elliptical orbit. The study finds that the COTS extended Kalman filter tool converges on accurate definitive orbit solutions quickly, but prediction accuracy through orbits with very low altitude perigees is degraded by the unpredictability of atmospheric density variation.

  4. Accuracy Analysis for Finite-Volume Discretization Schemes on Irregular Grids

    NASA Technical Reports Server (NTRS)

    Diskin, Boris; Thomas, James L.

    2010-01-01

    A new computational analysis tool, the downscaling (DS) test, is introduced and applied for studying the convergence rates of truncation and discretization errors of finite-volume discretization schemes on general irregular (e.g., unstructured) grids. The study shows that the design-order convergence of discretization errors can be achieved even when truncation errors exhibit a lower-order convergence or, in some cases, do not converge at all. The downscaling test is a general, efficient, accurate, and practical tool, enabling straightforward extension of verification and validation to general unstructured grid formulations. It also allows separate analysis of the interior, boundaries, and singularities that could be useful even in structured-grid settings. There are several new findings arising from the use of the downscaling test analysis. It is shown that the discretization accuracy of a common node-centered finite-volume scheme, known to be second-order accurate for inviscid equations on triangular grids, degenerates to first order for mixed grids. Alternative node-centered schemes are presented and demonstrated to provide second and third order accuracies on general mixed grids. The local accuracy deterioration at intersections of tangency and inflow/outflow boundaries is demonstrated using the DS tests tailored to examining the local behavior of the boundary conditions. The discretization-error order reduction within inviscid stagnation regions is demonstrated. The accuracy deterioration is local, affecting mainly the velocity components, but applies to any order scheme.

  5. [Effect of different distribution of components concentration on the accuracy of quantitative spectral analysis].

    PubMed

    Li, Gang; Zhao, Zhe; Wang, Hui-Quan; Lin, Ling; Zhang, Bao-Ju; Wu, Xiao-Rong

    2012-07-01

    In order to examine the effect of different distributions of component concentrations on the accuracy of quantitative spectral analysis, ideal absorption spectra of samples with three components were established according to the Lambert-Beer law, and Gaussian noise was added to the spectra. Calibration and prediction models were built by partial least squares regression to compare the modeling and prediction results obtained under different component concentration distributions. The results show that, in the case of purely linear absorption, model accuracy is related to the distribution of component concentrations: for both the component of interest and the non-tested components, a calibration set whose concentrations cover a wider range and are more uniformly distributed is key to establishing a universal model with satisfactory accuracy. This research provides theoretical guidance for the reasonable choice of samples with a suitable concentration distribution, which enhances model quality and reduces the prediction error of the prediction set. PMID:23016350
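
    To reproduce the flavor of this experiment, the sketch below (assuming NumPy and scikit-learn, with invented component spectra and concentration ranges rather than the authors' settings) simulates Lambert-Beer mixtures with Gaussian noise and fits a PLS calibration; the width and uniformity of the calibration concentration range is the factor the abstract argues matters.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(400, 900, 200)

    def gaussian_band(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # Hypothetical absorptivity profiles of the three components.
    eps = np.stack([gaussian_band(500, 40), gaussian_band(650, 60), gaussian_band(780, 50)])

    def simulate(n, low, high, noise=0.01):
        """Noisy Lambert-Beer spectra of n mixtures, concentrations uniform in [low, high)."""
        conc = rng.uniform(low, high, size=(n, 3))
        spectra = conc @ eps + rng.normal(0.0, noise, size=(n, wavelengths.size))
        return spectra, conc

    # Calibration set covering a wide, uniform concentration range.
    X_cal, y_cal = simulate(100, 0.0, 1.0)
    X_test, y_test = simulate(50, 0.0, 1.0)

    pls = PLSRegression(n_components=3).fit(X_cal, y_cal[:, 0])
    rmsep = np.sqrt(np.mean((pls.predict(X_test).ravel() - y_test[:, 0]) ** 2))
    print(f"RMSEP for component 1 with a wide calibration range: {rmsep:.4f}")
    ```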

  6. Analysis of the Accuracy and Robustness of the Leap Motion Controller

    PubMed Central

    Weichert, Frank; Bachmann, Daniel; Rudak, Bartholomäus; Fisseler, Denis

    2013-01-01

    The Leap Motion Controller is a new device for hand gesture controlled user interfaces with declared sub-millimeter accuracy. However, up to this point its capabilities in real environments have not been analyzed. Therefore, this paper presents a first study of a Leap Motion Controller. The main focus of attention is on the evaluation of the accuracy and repeatability. For an appropriate evaluation, a novel experimental setup was developed making use of an industrial robot with a reference pen allowing a position accuracy of 0.2 mm. With this setup, a deviation between the desired 3D position and the average measured position of below 0.2 mm was obtained for static setups and of 1.2 mm for dynamic setups. The conclusions of this analysis can inform the development of applications for the Leap Motion Controller in the field of Human-Computer Interaction. PMID:23673678

  7. Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Accuracy Analysis

    NASA Astrophysics Data System (ADS)

    Sarrazin, F.; Pianosi, F.; Hartmann, A. J.; Wagener, T.

    2014-12-01

    Sensitivity analysis aims to characterize the impact that changes in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). It is a valuable diagnostic tool for model understanding and for model improvement, it enhances calibration efficiency, and it supports uncertainty and scenario analysis. It is of particular interest for environmental models because they are often complex, non-linear, non-monotonic and exhibit strong interactions between their parameters. However, sensitivity analysis has to be carefully implemented to produce reliable results at moderate computational cost. For example, sample size can have a strong impact on the results and has to be carefully chosen. Yet, there is little guidance available for this step in environmental modelling. The objective of the present study is to provide guidelines for a robust sensitivity analysis, in order to support modellers in making appropriate choices for its implementation and in interpreting its outcome. We considered hydrological models with increasing level of complexity. We tested four sensitivity analysis methods, Regional Sensitivity Analysis, Method of Morris, a density-based (PAWN) and a variance-based (Sobol) method. The convergence and variability of sensitivity indices were investigated. We used bootstrapping to assess and improve the robustness of sensitivity indices even for limited sample sizes. Finally, we propose a quantitative validation approach for sensitivity analysis based on the Kolmogorov-Smirnov statistics.
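
    As a minimal illustration of one of the four methods mentioned above, the toy sketch below (not the study's code) applies Regional Sensitivity Analysis with a Kolmogorov-Smirnov statistic to an invented two-parameter model and uses bootstrapping to attach a confidence interval to each sensitivity index, in the spirit of the robustness checks described in the abstract.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(1)

    def model(x):                       # stand-in for a hydrological model
        return x[:, 0] ** 2 + 0.1 * x[:, 1]

    n = 2000
    X = rng.uniform(0.0, 1.0, size=(n, 2))    # two uncertain parameters
    y = model(X)
    behavioural = y < np.median(y)            # simple behavioural threshold

    def rsa_index(param_values, mask):
        """KS distance between behavioural and non-behavioural parameter samples."""
        return ks_2samp(param_values[mask], param_values[~mask]).statistic

    for j in range(X.shape[1]):
        boot = [rsa_index(X[idx, j], behavioural[idx])
                for idx in (rng.integers(0, n, n) for _ in range(200))]
        print(f"parameter {j}: KS = {rsa_index(X[:, j], behavioural):.2f} "
              f"(bootstrap 95% CI {np.percentile(boot, 2.5):.2f}-{np.percentile(boot, 97.5):.2f})")
    ```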

  8. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    PubMed Central

    Sun, Ting; Xing, Fei; You, Zheng

    2013-01-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Up to now, research in this field has lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without the complicated theoretical derivation. This approach can determine the error propagation relationships of the star tracker and can be used to build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are proved to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527

  9. Optical system error analysis and calibration method of high-accuracy star trackers.

    PubMed

    Sun, Ting; Xing, Fei; You, Zheng

    2013-01-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Up to now, research in this field has lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without the complicated theoretical derivation. This approach can determine the error propagation relationships of the star tracker and can be used to build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are proved to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers. PMID:23567527

  10. Whole-body predictors of wrist shot accuracy in ice hockey: a kinematic analysis.

    PubMed

    Michaud-Paquette, Yannick; Magee, Patrick; Pearsall, David; Turcotte, René

    2011-03-01

    The purpose of this study was to identify joint angular kinematics that corresponds to shooting accuracy in the stationary ice hockey wrist shot. Twenty-four subjects participated in this study, each performing 10 successful shots on four shooting targets. An eight-camera infra-red motion capture system (240 Hz), along with passive reflective markers, was used to record motion of the joints, hockey stick, and puck throughout the performance of the wrist shot. A multiple regression analysis was carried out to examine whole-body kinematic variables with accuracy scores as the dependent variable. Significant accuracy predictors were identified in the lower limbs, torso and upper limbs. Interpretation of the kinematics suggests that characteristics such as a better stability of the base of support, momentum cancellation, proper trunk orientation and a more dynamic control of the lead arm throughout the wrist shot movement are presented as predictors for the accuracy outcome. These findings are substantial as they not only provide a framework for further analysis of motor control strategies using tools for accurate projection of objects, but more tangibly they may provide a comprehensive evidence-based guide to coaches and athletes for planned training to improve performance. PMID:21560748

  11. Preliminary navigation accuracy analysis for the TDRSS Onboard Navigation System (TONS) experiment on EP/EUVE

    NASA Technical Reports Server (NTRS)

    Gramling, C. J.; Long, A. C.; Lee, T.; Ottenstein, N. A.; Samii, M. V.

    1991-01-01

    A Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) is currently being developed by NASA to provide a high accuracy autonomous navigation capability for users of TDRSS and its successor, the Advanced TDRSS (ATDRSS). The fully autonomous user onboard navigation system will support orbit determination, time determination, and frequency determination, based on observation of a continuously available, unscheduled navigation beacon signal. A TONS experiment will be performed in conjunction with the Explorer Platform (EP) Extreme Ultraviolet Explorer (EUVE) mission to flight qualify TONS Block 1. An overview is presented of TONS and a preliminary analysis of the navigation accuracy anticipated for the TONS experiment. Descriptions of the TONS experiment and the associated navigation objectives, as well as a description of the onboard navigation algorithms, are provided. The accuracy of the selected algorithms is evaluated based on the processing of realistic simulated TDRSS one way forward link Doppler measurements. The analysis process is discussed and the associated navigation accuracy results are presented.

  12. Accuracy of bite mark analysis from food substances: A comparative study

    PubMed Central

    Daniel, M. Jonathan; Pazhani, Ambiga

    2015-01-01

    Aims and Objectives: The aims and objectives of the study were to compare the accuracy of bite mark analysis from three different food substances-apple, cheese and chocolate-using two techniques-the manual docking procedure and the computer assisted overlay generation technique-and to compare the accuracy of the two techniques for bite mark analysis on food substances. Materials and Methods: The individuals who participated in the study were made to bite on three food substances-apple, cheese, and chocolate. Dentate individuals were included in the study. Edentulous individuals and individuals having a missing anterior tooth were excluded from the study. The dental casts of the individual were applied to the positive cast of the bitten food substance to determine docking or matching. Then, computer generated overlays were compared with the bite mark pattern on the foodstuff. Results: The results were tabulated; the comparison of bite mark analysis on the three different food substances was analyzed by the Kruskal-Wallis ANOVA test, and the comparison of the two techniques was analyzed by Spearman's Rho correlation coefficient. Conclusion: On comparing the bite mark analysis from the three food substances-apple, cheese and chocolate-the accuracy was found to be greater for chocolate and cheese than apple. PMID:26816463

  13. Accuracy of urea breath test in Helicobacter pylori infection: Meta-analysis

    PubMed Central

    Ferwana, Mazen; Abdulmajeed, Imad; Alhajiahmed, Ali; Madani, Wedad; Firwana, Belal; Hasan, Rim; Altayar, Osama; Limburg, Paul J; Murad, Mohammad Hassan; Knawy, Bandar

    2015-01-01

    AIM: To quantitatively summarize and appraise the available evidence of urea breath test (UBT) use to diagnose Helicobacter pylori (H. pylori) infection in patients with dyspepsia and provide pooled diagnostic accuracy measures. METHODS: We searched MEDLINE, EMBASE, Cochrane library and other databases for studies addressing the value of UBT in the diagnosis of H. pylori infection. We included cross-sectional studies that evaluated the diagnostic accuracy of UBT in adult patients with dyspeptic symptoms. Risk of bias was assessed using QUADAS (Quality Assessment of Diagnostic Accuracy Studies)-2 tool. Diagnostic accuracy measures were pooled using the random-effects model. Subgroup analysis was conducted by UBT type (13C vs 14C) and by measurement technique (Infrared spectrometry vs Isotope Ratio Mass Spectrometry). RESULTS: Out of 1380 studies identified, only 23 met the eligibility criteria. Fourteen studies (61%) evaluated 13C UBT and 9 studies (39%) evaluated 14C UBT. There was significant variation in the type of reference standard tests used across studies. Pooled sensitivity was 0.96 (95%CI: 0.95-0.97) and pooled specificity was 0.93 (95%CI: 0.91-0.94). Likelihood ratio for a positive test was 12 and for a negative test was 0.05 with an area under the curve of 0.985. Meta-analyses were associated with a significant statistical heterogeneity that remained unexplained after subgroup analysis. The included studies had a moderate risk of bias. CONCLUSION: UBT has high diagnostic accuracy for detecting H. pylori infection in patients with dyspepsia. The reliability of diagnostic meta-analytic estimates however is limited by significant heterogeneity. PMID:25632206
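
    A minimal sketch of the pooling step, assuming a simple DerSimonian-Laird random-effects model on logit-transformed per-study sensitivities; the review itself pooled 23 studies with a random-effects model, and the counts below are placeholders rather than its data.

    ```python
    import numpy as np

    tp = np.array([90, 45, 120, 60])        # true positives per study (hypothetical)
    fn = np.array([4, 3, 6, 2])             # false negatives per study (hypothetical)

    logit = np.log((tp + 0.5) / (fn + 0.5))          # continuity-corrected logit sensitivity
    var = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)        # approximate within-study variance

    w = 1.0 / var                                    # fixed-effect weights
    fixed = np.sum(w * logit) / np.sum(w)
    q = np.sum(w * (logit - fixed) ** 2)             # Cochran's Q
    tau2 = max(0.0, (q - (len(tp) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

    w_re = 1.0 / (var + tau2)                        # random-effects weights
    pooled_logit = np.sum(w_re * logit) / np.sum(w_re)
    pooled_sens = 1.0 / (1.0 + np.exp(-pooled_logit))
    print(f"Pooled sensitivity (random effects): {pooled_sens:.3f}, tau^2 = {tau2:.3f}")
    ```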

  14. Increased Proportion of Variance Explained and Prediction Accuracy of Survival of Breast Cancer Patients with Use of Whole-Genome Multiomic Profiles.

    PubMed

    Vazquez, Ana I; Veturi, Yogasudha; Behring, Michael; Shrestha, Sadeep; Kirst, Matias; Resende, Marcio F R; de Los Campos, Gustavo

    2016-07-01

    Whole-genome multiomic profiles hold valuable information for the analysis and prediction of disease risk and progression. However, integrating high-dimensional multilayer omic data into risk-assessment models is statistically and computationally challenging. We describe a statistical framework, the Bayesian generalized additive model (BGAM), and present software for integrating multilayer high-dimensional inputs into risk-assessment models. We used BGAM and data from The Cancer Genome Atlas for the analysis and prediction of survival after diagnosis of breast cancer. We developed a sequence of studies to (1) compare predictions based on single omics with those based on clinical covariates commonly used for the assessment of breast cancer patients (COV), (2) evaluate the benefits of combining COV and omics, (3) compare models based on (a) COV and gene expression profiles from oncogenes with (b) COV and whole-genome gene expression (WGGE) profiles, and (4) evaluate the impacts of combining multiple omics and their interactions. We report that (1) WGGE profiles and whole-genome methylation (METH) profiles offer more predictive power than any of the COV commonly used in clinical practice (e.g., subtype and stage), (2) adding WGGE or METH profiles to COV increases prediction accuracy, (3) the predictive power of WGGE profiles is considerably higher than that based on expression from large-effect oncogenes, and (4) the gain in prediction accuracy when combining multiple omics is consistent. Our results show the feasibility of omic integration and highlight the importance of WGGE and METH profiles in breast cancer, achieving gains of up to 7 points area under the curve (AUC) over the COV in some cases. PMID:27129736

  15. Increased Proportion of Variance Explained and Prediction Accuracy of Survival of Breast Cancer Patients with Use of Whole-Genome Multiomic Profiles

    PubMed Central

    Vazquez, Ana I.; Veturi, Yogasudha; Behring, Michael; Shrestha, Sadeep; Kirst, Matias; Resende, Marcio F. R.; de los Campos, Gustavo

    2016-01-01

    Whole-genome multiomic profiles hold valuable information for the analysis and prediction of disease risk and progression. However, integrating high-dimensional multilayer omic data into risk-assessment models is statistically and computationally challenging. We describe a statistical framework, the Bayesian generalized additive model (BGAM), and present software for integrating multilayer high-dimensional inputs into risk-assessment models. We used BGAM and data from The Cancer Genome Atlas for the analysis and prediction of survival after diagnosis of breast cancer. We developed a sequence of studies to (1) compare predictions based on single omics with those based on clinical covariates commonly used for the assessment of breast cancer patients (COV), (2) evaluate the benefits of combining COV and omics, (3) compare models based on (a) COV and gene expression profiles from oncogenes with (b) COV and whole-genome gene expression (WGGE) profiles, and (4) evaluate the impacts of combining multiple omics and their interactions. We report that (1) WGGE profiles and whole-genome methylation (METH) profiles offer more predictive power than any of the COV commonly used in clinical practice (e.g., subtype and stage), (2) adding WGGE or METH profiles to COV increases prediction accuracy, (3) the predictive power of WGGE profiles is considerably higher than that based on expression from large-effect oncogenes, and (4) the gain in prediction accuracy when combining multiple omics is consistent. Our results show the feasibility of omic integration and highlight the importance of WGGE and METH profiles in breast cancer, achieving gains of up to 7 points area under the curve (AUC) over the COV in some cases. PMID:27129736

  16. Accuracy and repeatability of Roentgen stereophotogrammetric analysis (RSA) for measuring knee laxity in longitudinal studies.

    PubMed

    Fleming, B C; Peura, G D; Abate, J A; Beynnon, B D

    2001-10-01

    Roentgen stereophotogrammetric analysis (RSA) can be used to assess temporal changes in anterior-posterior (A-P) knee laxity. However, the accuracy and precision of RSA are dependent on many factors and should be independently evaluated for a particular application. The objective of this study was to evaluate the use of RSA for measuring A-P knee laxity. The specific aims were to assess the variation or "noise" inherent to RSA, to determine the reproducibility of RSA for repeated A-P laxity testing, and to assess the accuracy of these measurements. Two experiments were performed. The first experiment utilized three rigid models of the tibiofemoral joint to assess the noise and to compare digitization errors of two independent examiners. No differences were found in the kinematic outputs of the RSA due to examiner, repeated trials, or the model used. In a second experiment, A-P laxity values between the A-P shear load limits of +/-60 N of five cadaver goat knees were measured to assess the error associated with repeated testing. The RSA laxity values were also compared to those obtained from a custom designed linkage system. The mean A-P laxity values with the knee at 30, 60, and 90 degrees of flexion for the ACL-intact goat knee (+/-95% confidence interval) were 0.8 (+/-0.25), 0.9 (+/-0.29), and 0.4 (+/-0.22) mm, respectively. In the ACL-deficient knee, the A-P laxity values increased by an order of magnitude to 8.8 (+/-1.39), 7.6 (+/-1.32), and 3.1 (+/-1.20) mm, respectively. No significant differences were found between the A-P laxity values measured by RSA and the independent measurement technique. A highly significant linear relationship (r(2)=0.83) was also found between these techniques. This study suggests that the RSA method is an accurate and precise means to measure A-P knee laxity for repeated testing over time. PMID:11522316

  17. Assembly accuracy analysis for small components with a planar surface in large-scale metrology

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Huang, Peng; Li, Jiangxiong; Ke, Yinglin; Yang, Bingru; Maropoulos, Paul G.

    2016-04-01

    Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces as key product characteristics are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index of measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated by an inertia moment matrix. Finally, a practical application is introduced for validating the evaluation index.
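
    The propagation idea can be sketched as follows: with all error sources assumed normally distributed and the error transmission function linearized, the output covariance is J Σ Jᵀ. The Jacobian and input standard deviations below are invented for illustration and are not the paper's Head-Up Display example.

    ```python
    import numpy as np

    # Input errors: 3 measurement-point errors and 2 fixture errors (std devs in mm).
    sigma_in = np.diag([0.05, 0.05, 0.05, 0.02, 0.02]) ** 2   # covariance (variances on diagonal)

    # Linearized error transmission: maps input errors to the component's
    # general coordinate vector (x, y, rotation) -- hypothetical numbers.
    J = np.array([[0.60,  0.30, 0.10, 0.80,  0.00],
                  [0.10,  0.50, 0.40, 0.00,  0.80],
                  [0.02, -0.02, 0.01, 0.05, -0.05]])

    sigma_out = J @ sigma_in @ J.T
    std_out = np.sqrt(np.diag(sigma_out))
    print("1-sigma assembly errors (x, y, rot):", np.round(std_out, 4))
    ```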

  18. Analysis of machining accuracy during free form surface milling simulation for different milling strategies

    NASA Astrophysics Data System (ADS)

    Matras, A.; Kowalczyk, R.

    2014-11-01

    This work presents an analysis of machining accuracy after free-form surface milling simulations (based on machining EN AW-7075 alloys) for different machining strategies (Level Z, Radial, Square, Circular). The individual milling simulations were performed using CAD/CAM Esprit software. The accuracy of the obtained allowance is defined as the difference between the theoretical surface of the workpiece (the surface designed in the CAD software) and the machined surface after a milling simulation. The difference between the two surfaces describes the roughness that results from the mapping of the tool shape onto the machined surface. The accuracy of the remaining allowance directly indicates the surface quality after finish machining. The described methodology of using CAD/CAM software can shorten the design time of the machining process for free-form surface milling on a 5-axis CNC milling machine, since the part does not have to be machined on the milling machine in order to measure the machining accuracy for the selected strategies and cutting data.

  19. Accuracy and repeatability of two methods of gait analysis - GaitRite™ and Mobility Lab™ - in subjects with cerebellar ataxia.

    PubMed

    Schmitz-Hübsch, Tanja; Brandt, Alexander U; Pfueller, Caspar; Zange, Leonora; Seidel, Adrian; Kühn, Andrea A; Paul, Friedemann; Minnerop, Martina; Doss, Sarah

    2016-07-01

    Instrumental gait analysis is increasingly recognized as a useful tool for the evaluation of movement disorders. The various assessment devices available to date have mostly been evaluated in healthy populations only. We aimed to explore whether reliability and validity seen in healthy subjects can also be assumed in subjects with cerebellar ataxic gait. Gait was recorded simultaneously with two devices - a sensor-embedded walkway and an inertial sensor based system - to explore test accuracy in two groups of subjects: one with mild to moderate cerebellar ataxia due to a subtype of autosomal-dominantly inherited neurodegenerative disorder (SCA14), the other comprising healthy subjects matched for age and height (CTR). Test precision was assessed by retest within session for each device. In conclusion, accuracy and repeatability of gait measurements were not compromised by ataxic gait disorder. The accuracy of spatial measures was speed-dependent and a direct comparison of stride length from both devices will be most reliably made at comfortable speed. Measures of stride variability had low agreement between methods in CTR and at retest in both groups. However, the marked increase of stride variability in ataxia outweighs the observed amount of imprecision. PMID:27289221

  20. Accuracy analysis of CryoSat-2 SARIn mode data over Antarctica

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Bamber, Jonathan; Cheng, Xiao

    2015-04-01

    In 2010, CryoSat-2 was launched, carrying a unique satellite radar altimetry (SRA) instrument called SAR/Interferometric Radar Altimeter (SIRAL), with the aim of measuring and monitoring sea ice, ice sheets and mountain glaciers. The novel SAR Interferometric mode (SARInM) of CryoSat-2 is designed to improve the accuracy, resolution and geolocation of height measurements over the steeper margins of ice sheets and ice caps. Over these areas, it employs the synthetic aperture radar (SAR) capability to reduce the size of the footprint to effectively 450m along track and ~1km across track, implemented from an airborne prototype originally termed a delay-Doppler altimeter. Additionally, CryoSat-2 used the phase difference between its two antennas to estimate surface slope in the across-track direction and identify the point of closest approach directly. The phase difference is 2pi for a surface slope of approximately 1deg. If the slope is above this threshold, the tracked surface in the returned waveform may not be the point of closest approach, causing an error in the slope correction. For this reason, the analysis was limited to slopes of 1deg or less in this study. We used extensive coverage of Antarctica provided by the ICESat laser altimeter mission between 2003 and 2009 to assess the accuracy of SARInM data. We corrected for changes in elevations due to the interval between the acquisition of the ICESat and CryoSat-2 data (from July 2010 to December 2013). Two methods were used: (1) the ICESat point was compared with a DEM derived from CryoSat-2 data (Point-to-DEM; PtoDEM), and (2) the ICESat point was compared with a CryoSat-2 point directly (Point-to-Point; PtoP). For PtoDEM, CryoSat-2 elevations were interpolated onto a regular 1km polar stereographic grid with a standard parallel of 71°S, using ordinary kriging. For PtoP, the maximum distance between a CryoSat-2 point location and ICESat point location was set to 35m. For the areas with slopes less than 0.2deg, the
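
    A minimal sketch of the PtoP comparison described above, assuming SciPy and coordinates already projected to a polar stereographic grid; the point sets and heights are random placeholders, not mission data.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(2)
    cryosat_xy = rng.uniform(0, 5000, size=(10000, 2))   # projected coords (m), placeholder
    cryosat_h = rng.normal(2000, 50, size=10000)         # surface heights (m), placeholder
    icesat_xy = rng.uniform(0, 5000, size=(3000, 2))
    icesat_h = rng.normal(2000, 50, size=3000)

    # Pair each ICESat footprint with the nearest CryoSat-2 point within 35 m.
    tree = cKDTree(cryosat_xy)
    dist, idx = tree.query(icesat_xy, k=1, distance_upper_bound=35.0)
    paired = np.isfinite(dist)                           # unmatched points return dist = inf

    dh = cryosat_h[idx[paired]] - icesat_h[paired]
    print(f"{paired.sum()} pairs, median dh = {np.median(dh):.2f} m, "
          f"RMS = {np.sqrt(np.mean(dh**2)):.2f} m")
    ```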

  1. Analysis of accuracy in optical motion capture - A protocol for laboratory setup evaluation.

    PubMed

    Eichelberger, Patric; Ferraro, Matteo; Minder, Ursina; Denton, Trevor; Blasimann, Angela; Krause, Fabian; Baur, Heiner

    2016-07-01

    Validity and reliability as scientific quality criteria have to be considered when using optical motion capture (OMC) for research purposes. Literature and standards recommend individual laboratory setup evaluation. However, system characteristics such as trueness, precision and uncertainty are often not addressed in scientific reports on 3D human movement analysis. One reason may be the lack of simple and practical methods for evaluating accuracy parameters of OMC. A protocol was developed for investigating the accuracy of an OMC system (Vicon, volume 5.5×1.2×2.0m(3)) with standard laboratory equipment and by means of trueness and uncertainty of marker distances. The study investigated the effects of number of cameras (6, 8 and 10), measurement height (foot, knee and hip) and movement condition (static and dynamic) on accuracy. Number of cameras, height and movement condition affected system accuracy significantly. For lower body assessment during level walking, the most favorable setting (10 cameras, foot region) revealed mean trueness and uncertainty to be -0.08 and 0.33mm, respectively. Dynamic accuracy cannot be predicted based on static error assessments. Dynamic procedures have to be used instead. The significant influence of the number of cameras and the measurement location suggests that instrumental errors should be evaluated in a laboratory- and task-specific manner. The use of standard laboratory equipment makes the proposed procedure widely applicable and it supports the setup process of OMC by simple functional error assessment. Careful system configuration and thorough measurement process control are needed to produce high-quality data. PMID:27230474
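
    A minimal sketch of the reported accuracy measures, under the assumption that trueness is the mean deviation of measured inter-marker distances from a known reference length and uncertainty is an expanded (approximately 95%) dispersion of those deviations; the paper's exact definitions may differ, and the measurements are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    reference_mm = 500.0                                   # known marker distance (hypothetical)
    measured_mm = reference_mm + rng.normal(-0.08, 0.17, size=2000)  # simulated OMC output

    errors = measured_mm - reference_mm
    trueness = errors.mean()                               # systematic component
    uncertainty = 1.96 * errors.std(ddof=1)                # ~95% coverage of random component
    print(f"trueness = {trueness:.2f} mm, uncertainty = {uncertainty:.2f} mm")
    ```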

  2. The analysis accuracy assessment of CORINE land cover in the Iberian coast

    NASA Astrophysics Data System (ADS)

    Grullón, Yraida R.; Alhaddad, Bahaaeddin; Cladera, Josep R.

    2009-09-01

    Corine land cover 2000 (CLC2000) is a project jointly managed by the Joint Research Centre (JRC) and the European Environment Agency (EEA). Its aim is to update the Corine land cover database in Europe for the year 2000. Landsat-7 Enhanced Thematic Mapper (ETM) satellite images were used for the update and were acquired within the framework of the Image2000 project. Knowledge of land status through CORINE Land Cover mapping is of great importance for studying the interaction of land cover and land use categories at the European scale. This paper presents the accuracy assessment methodology designed and implemented to validate the Iberian Coast CORINE Land Cover 2000 cartography. It presents an implementation of a new methodological concept for land cover data production, object-based classification, and automatic generalization to assess the thematic accuracy of CLC2000 by means of an independent data source, based on the comparison of the land cover database with reference data derived from visual interpretation of high-resolution satellite imagery for sample areas. In our case study, the existing object-based classifications are supported with digital maps and attribute databases. According to the quality tests performed, we computed the overall accuracy and the Kappa coefficient. We focus on the development of a methodology based on classification and generalization analysis for built-up areas that may improve the investigation. This study can be divided into the following fundamental steps: extraction of artificial areas from land use classifications based on Landsat and Spot images; manual interpretation of high-resolution multispectral images; determination of the homogeneity of artificial areas by a generalization process; and quality testing by overall accuracy, Kappa coefficient and a special grid (fishnet) test. Finally, the paper illustrates the accuracy of the CORINE dataset based on the above steps.
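
    For the quality test mentioned above, overall accuracy and the Kappa coefficient follow directly from a confusion matrix of mapped versus reference classes; the sketch below uses a small made-up matrix, not CLC2000 data.

    ```python
    import numpy as np

    # rows = reference class, columns = mapped class (e.g. artificial, agricultural, forest)
    cm = np.array([[120,  10,   5],
                   [  8, 200,  12],
                   [  4,  15, 160]])

    total = cm.sum()
    overall_accuracy = np.trace(cm) / total
    p_e = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / total ** 2   # expected chance agreement
    kappa = (overall_accuracy - p_e) / (1.0 - p_e)
    print(f"overall accuracy = {overall_accuracy:.3f}, kappa = {kappa:.3f}")
    ```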

  3. Diagnostic Accuracy of Xpert Test in Tuberculosis Detection: A Systematic Review and Meta-analysis

    PubMed Central

    Kaur, Ravdeep; Kachroo, Kavita; Sharma, Jitendar Kumar; Vatturi, Satyanarayana Murthy; Dang, Amit

    2016-01-01

    Background: The World Health Organization (WHO) recommends the use of the Xpert MTB/RIF assay for rapid diagnosis of tuberculosis (TB) and detection of rifampicin resistance. This systematic review was done to assess the diagnostic accuracy and cost-effectiveness of the Xpert MTB/RIF assay. Methods: A systematic literature search was conducted in the following databases: Cochrane Central Register of Controlled Trials and Cochrane Database of Systematic Reviews, MEDLINE, PUBMED, Scopus, Science Direct and Google Scholar for relevant studies published between 2010 and December 2014. Studies given in the systematic reviews were accessed separately and used for analysis. Selection of studies, data extraction and assessment of quality of included studies was performed independently by two reviewers. Studies evaluating the diagnostic accuracy of Xpert MTB/RIF assay among adult or predominantly adult patients (≥14 years), presumed to have pulmonary TB with or without HIV infection were included in the review. Also, studies that had assessed the diagnostic accuracy of Xpert MTB/RIF assay using sputum and other respiratory specimens were included. Results: The included studies had a low risk of any form of bias, showing that findings are of high scientific validity and credibility. Quantitative analysis of 37 included studies shows that Xpert MTB/RIF is an accurate diagnostic test for TB and detection of rifampicin resistance. Conclusion: Xpert MTB/RIF assay is a robust, sensitive and specific test for accurate diagnosis of tuberculosis as compared to conventional tests like culture and microscopic examination. PMID:27013842

  4. Accuracy and reproducibility of bending stiffness measurements by mechanical response tissue analysis in artificial human ulnas.

    PubMed

    Arnold, Patricia A; Ellerbrock, Emily R; Bowman, Lyn; Loucks, Anne B

    2014-11-01

    Osteoporosis is characterized by reduced bone strength, but no FDA-approved medical device measures bone strength. Bone strength is strongly associated with bone stiffness, but no FDA-approved medical device measures bone stiffness either. Mechanical Response Tissue Analysis (MRTA) is a non-significant risk, non-invasive, radiation-free, vibration analysis technique for making immediate, direct functional measurements of the bending stiffness of long bones in humans in vivo. MRTA has been used for research purposes for more than 20 years, but little has been published about its accuracy. To begin to investigate its accuracy, we compared MRTA measurements of bending stiffness in 39 artificial human ulna bones to measurements made by Quasistatic Mechanical Testing (QMT). In the process, we also quantified the reproducibility (i.e., precision and repeatability) of both methods. MRTA precision (1.0 ± 1.0%) and repeatability (3.1 ± 3.1%) were not as high as those of QMT (0.2 ± 0.2% and 1.3 ± 1.7%, respectively; both p<10(-4)). The relationship between MRTA and QMT measurements of ulna bending stiffness was indistinguishable from the identity line (p=0.44) and paired measurements by the two methods agreed within a 95% confidence interval of ± 5%. If such accuracy can be achieved on real human ulnas in situ, and if the ulna is representative of the appendicular skeleton, MRTA may prove clinically useful. PMID:25261885

  5. Menu label accuracy at a university's foodservices. An exploratory recipe nutrition analysis.

    PubMed

    Feldman, Charles; Murray, Douglas; Chavarria, Stephanie; Zhao, Hang

    2015-09-01

    The increase in the weight of American adults and children has been positively associated with the prevalence of the consumption of food-away-from-home. The objective was to assess the accuracy of claimed nutritional information of foods purchased in contracted foodservices located on the campus of an institution of higher education. Fifty popular food items were randomly collected from five main dining outlets located on a selected campus in the northeastern United States. The sampling was repeated three times on separate occasions for an aggregate total of 150 food samples. The samples were then weighed and assessed for nutrient composition (protein, cholesterol, fiber, carbohydrates, total fat, calories, sugar, and sodium) using nutrient analysis software. Results were compared with foodservices' published nutrition information. Two group comparisons, claimed and measured, were performed using the paired-sample t-test. Descriptive statistics were used as well. Among the nine nutritional values, six nutrients (total fat, sodium, protein, fiber, cholesterol, and weight) had more than 10% positive average discrepancies between measured and claimed values. Statistical significance of the variance was obtained in four of the eight categories of nutrient content: total fat, sodium, protein, and cholesterol (P < .05). Significance was also reached in the variance of actual portion weight compared to the published claims (P < .001). Significant differences of portion size (weight), total fat, sodium, protein, and cholesterol were found among the sampled values and the foodservices' published claims. The findings from this study raise the concern that if the actual nutritional information does not accurately reflect the declared values on menus, conclusions, decisions and actions based on posted information may not be valid. PMID:25958116
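
    A minimal sketch of the paired comparison described above, assuming SciPy; the claimed and measured values are placeholders for a single nutrient, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import ttest_rel

    claimed_sodium = np.array([450, 700, 300, 820, 510, 640])    # mg, from menu (hypothetical)
    measured_sodium = np.array([520, 780, 340, 900, 560, 710])   # mg, lab analysis (hypothetical)

    # Paired-sample t-test of measured vs. claimed values across the sampled items.
    t_stat, p_value = ttest_rel(measured_sodium, claimed_sodium)
    discrepancy = 100 * (measured_sodium - claimed_sodium).mean() / claimed_sodium.mean()
    print(f"mean discrepancy = {discrepancy:+.1f}%, t = {t_stat:.2f}, p = {p_value:.3f}")
    ```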

  6. Increasing the Accuracy in the Measurement of the Minor Isotopes of Uranium: Care in Selection of Reference Materials, Baselines and Detector Calibration

    NASA Astrophysics Data System (ADS)

    Poths, J.; Koepf, A.; Boulyga, S. F.

    2008-12-01

    The minor isotopes of uranium (U-233, U-234, U-236) are increasingly useful for tracing a variety of processes: movement of anthropogenic nuclides in the environment (ref 1), sources of uranium ores (ref 2), and nuclear material attribution (ref 3). We report on improved accuracy for U-234/238 and U-236/238 by supplementing total evaporation protocol TIMS measurement on Faraday detectors (ref 4) with multiplier measurements for the minor isotopes. Measurement of small signals on Faraday detectors alone is limited by the noise floors of the amplifiers and by the accuracy of the baseline offset measurements. The combined detector approach improves the reproducibility to better than ±1% (relative) for the U-234/238 at natural abundance, and yields a detection limit for U-236/U-238 of <0.2 ppm. We have quantified the contribution of different factors to the uncertainties associated with these peak-jumping measurements on a single detector, with an aim of further improvement. The uncertainties in the certified values for U-234 and U-236 in the uranium standard NBS U005, if used for mass bias correction, dominate the uncertainty in their isotopic ratio measurements. Software limitations in baseline measurement drive the detection limit for the U-236/U-238 ratio. This is a topic for discussion with the instrument manufacturers. Finally, deviation from linearity of the response of the electron multiplier with count rate limits the accuracy and reproducibility of these minor isotope measurements. References: (1) P. Steier et al (2008) Nuc Inst Meth (B), 266, 2246-2250. (2) E. Keegan et al (2008) Appl Geochem 23, 765-777. (3) K. Mayer et al (1998) IAEA-CN-98/11, in Advances in Destructive and Non-destructive Analysis for Environmental Monitoring and Nuclear Forensics. (4) S. Richter and S. Goldberg (2003) Int J Mass Spectrom, 229, 181-197.

  7. Shortening the retention interval of 24-hour dietary recalls increases fourth-grade children’s accuracy for reporting energy and macronutrient intake at school meals

    PubMed Central

    Guinn, Caroline H.; Royer, Julie A.; Hardin, James W.; Mackelprang, Alyssa J.; Smith, Albert F.

    2010-01-01

    Background Accurate information about children’s intake is crucial for national nutrition policy and for research and clinical activities. To analyze accuracy for reporting energy and nutrients, most validation studies utilize the conventional approach which was not designed to capture errors of reported foods and amounts. The reporting-error-sensitive approach captures errors of reported foods and amounts. Objective To extend results to energy and macronutrients for a validation study concerning retention interval (elapsed time between to-be-reported meals and the interview) and accuracy for reporting school-meal intake, the conventional and reporting-error-sensitive approaches were compared. Design and participants/setting Fourth-grade children (n=374) were observed eating two school meals, and interviewed to obtain a 24-hour recall using one of six interview conditions from crossing two target periods (prior-24-hours; previous-day) with three interview times (morning; afternoon; evening). Data were collected in one district during three school years (2004–2005; 2005–2006; 2006–2007). Main outcome measures Report rates (reported/observed), correspondence rates (correctly reported/observed), and inflation ratios (intruded/observed) were calculated for energy and macronutrients. Statistical analyses performed For each outcome measure, mixed-model analysis of variance was conducted with target period, interview time, their interaction, and sex in the model; results were adjusted for school year and interviewer. Results Conventional approach — Report rates for energy and macronutrients did not differ by target period, interview time, their interaction, or sex. Reporting-error-sensitive approach — Correspondence rates for energy and macronutrients differed by target period (four P-values<0.0001) and the target-period by interview-time interaction (four P-values<0.0001); inflation ratios for energy and macronutrients differed by target period (four P

  8. The Accuracy of Diagnostic Methods for Diabetic Retinopathy: A Systematic Review and Meta-Analysis

    PubMed Central

    Martínez-Vizcaíno, Vicente; Cavero-Redondo, Iván; Álvarez-Bueno, Celia; Rodríguez-Artalejo, Fernando

    2016-01-01

    Objective The objective of this study was to evaluate the accuracy of the recommended glycemic measures for diagnosing diabetic retinopathy. Methods We systematically searched MEDLINE, EMBASE, the Cochrane Library, and the Web of Science databases from inception to July 2015 for observational studies comparing the diagnostic accuracy of glycated hemoglobin (HbA1c), fasting plasma glucose (FPG), and 2-hour plasma glucose (2h-PG). Random effects models for the diagnostic odds ratio (dOR) value computed by Moses’ constant for a linear model and 95% CIs were used to calculate the accuracy of the test. Hierarchical summary receiver operating characteristic curves (HSROC) were used to summarize the overall test performance. Results Eleven published studies were included in the meta-analysis. The pooled dOR values for the diagnosis of retinopathy were 16.32 (95% CI 13.86–19.22) for HbA1c and 4.87 (95% CI 4.39–5.40) for FPG. The area under the HSROC was 0.837 (95% CI 0.781–0.892) for HbA1c and 0.735 (95% CI 0.657–0.813) for FPG. The 95% confidence region for the point that summarizes the overall test performance of the included studies occurs where the cut-offs ranged from 6.1% (43.2 mmol/mol) to 7.8% (61.7 mmol/mol) for HbA1c and from 7.8 to 9.3 mmol/L for FPG. In the four studies that provided information regarding 2h-PG, the pooled accuracy estimates for HbA1c were similar to those of 2h-PG; the overall performance for HbA1c was superior to that for FPG. Conclusions The three recommended tests for the diagnosis of type 2 diabetes in nonpregnant adults showed sufficient accuracy for their use in clinical settings, although the overall accuracy for the diagnosis of retinopathy was similar for HbA1c and 2h-PG, which were both more accurate than for FPG. Due to the variability and inconveniences of the glucose level-based methods, HbA1c appears to be the most appropriate method for the diagnosis of diabetic retinopathy. PMID:27123641
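
    As a reminder of what the pooled dOR summarizes, the sketch below computes a diagnostic odds ratio and a log-scale 95% CI from one hypothetical 2x2 table; the paper's pooled values come from random-effects models across studies, not from a single table like this.

    ```python
    import numpy as np

    tp, fp, fn, tn = 85, 20, 15, 180          # hypothetical counts from one study

    dor = (tp * tn) / (fp * fn)                              # diagnostic odds ratio
    se_log = np.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)      # SE of log(dOR)
    ci = np.exp(np.log(dor) + np.array([-1.96, 1.96]) * se_log)
    print(f"dOR = {dor:.1f} (95% CI {ci[0]:.1f}-{ci[1]:.1f})")
    ```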

  9. Superior accuracy of model-based radiostereometric analysis for measurement of polyethylene wear

    PubMed Central

    Stilling, M.; Kold, S.; de Raedt, S.; Andersen, N. T.; Rahbek, O.; Søballe, K.

    2012-01-01

    Objectives The accuracy and precision of two new methods of model-based radiostereometric analysis (RSA) were hypothesised to be superior to a plain radiograph method in the assessment of polyethylene (PE) wear. Methods A phantom device was constructed to simulate three-dimensional (3D) PE wear. Images were obtained consecutively for each simulated wear position for each modality. Three commercially available packages were evaluated: model-based RSA using laser-scanned cup models (MB-RSA), model-based RSA using computer-generated elementary geometrical shape models (EGS-RSA), and PolyWare. Precision (95% repeatability limits) and accuracy (Root Mean Square Errors) for two-dimensional (2D) and 3D wear measurements were assessed. Results The precision for 2D wear measures was 0.078 mm, 0.102 mm, and 0.076 mm for EGS-RSA, MB-RSA, and PolyWare, respectively. For the 3D wear measures the precision was 0.185 mm, 0.189 mm, and 0.244 mm for EGS-RSA, MB-RSA, and PolyWare respectively. Repeatability was similar for all methods within the same dimension, when compared between 2D and 3D (all p > 0.28). For 2D measurements, accuracy was below 0.055 mm for the RSA methods and at least 0.335 mm for PolyWare. For 3D measurements, accuracy was 0.1 mm, 0.2 mm, and 0.3 mm for EGS-RSA, MB-RSA and PolyWare respectively. PolyWare was less accurate compared with RSA methods (p = 0.036). No difference was observed between the RSA methods (p = 0.10). Conclusions For all methods, precision and accuracy were better in 2D, with RSA methods being superior in accuracy. Although less accurate and precise, 3D RSA defines the clinically relevant wear pattern (multidirectional). PolyWare is a good and low-cost alternative to RSA, despite being less accurate and requiring a larger sample size. PMID:23610688
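
    A minimal sketch of the two summary measures used above, under the assumption that accuracy is the root-mean-square error against the phantom's known wear and precision is a 95% repeatability limit from repeated measurements; the study's exact computation may differ, and all numbers are invented.

    ```python
    import numpy as np

    known_wear = np.array([0.0, 0.1, 0.2, 0.3, 0.5])              # phantom settings (mm)
    measured = known_wear + np.array([0.02, -0.03, 0.04, -0.02, 0.03])

    rmse = np.sqrt(np.mean((measured - known_wear) ** 2))         # accuracy (RMSE)

    repeat_1 = measured
    repeat_2 = measured + np.array([0.01, -0.02, 0.03, 0.00, -0.01])
    diff = repeat_1 - repeat_2
    repeatability_95 = 1.96 * np.sqrt(np.mean(diff ** 2))         # 95% repeatability limit
    print(f"accuracy (RMSE) = {rmse:.3f} mm, precision (95% limits) = ±{repeatability_95:.3f} mm")
    ```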

  10. Improved accuracy for finite element structural analysis via a new integrated force method

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Aiello, Robert A.; Berke, Laszlo

    1992-01-01

    A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation method for the mixed method; and GIFT for the integrated force methods. The results indicate that on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which bestow simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.

  11. Improved accuracy for finite element structural analysis via an integrated force method

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Hopkins, D. A.; Aiello, R. A.; Berke, L.

    1992-01-01

    A comparative study was carried out to determine the accuracy of finite element analyses based on the stiffness method, a mixed method, and the new integrated force and dual integrated force methods. The numerical results were obtained with the following software: MSC/NASTRAN and ASKA for the stiffness method; an MHOST implementation method for the mixed method; and GIFT for the integrated force methods. The results indicate that on an overall basis, the stiffness and mixed methods present some limitations. The stiffness method generally requires a large number of elements in the model to achieve acceptable accuracy. The MHOST method tends to achieve a higher degree of accuracy for coarse models than does the stiffness method implemented by MSC/NASTRAN and ASKA. The two integrated force methods, which bestow simultaneous emphasis on stress equilibrium and strain compatibility, yield accurate solutions with fewer elements in a model. The full potential of these new integrated force methods remains largely unexploited, and they hold the promise of spawning new finite element structural analysis tools.

  12. Future dedicated Venus-SGG flight mission: Accuracy assessment and performance analysis

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Hsu, Houtse; Zhong, Min; Yun, Meijuan

    2016-01-01

    This study concentrates principally on the systematic requirements analysis for the future dedicated Venus-SGG (spacecraft gravity gradiometry) flight mission in China, with respect to matching the measurement accuracies of the spacecraft-based scientific instruments and the orbital parameters of the spacecraft. Firstly, we created and verified single and combined analytical error models of the cumulative Venusian geoid height as influenced by the gravity gradient error of the spacecraft-borne atom-interferometer gravity gradiometer (AIGG) and by the orbital position and velocity errors tracked by the deep space network (DSN) on Earth. Secondly, weighing the advantages and disadvantages of the electrostatically suspended gravity gradiometer, the superconducting gravity gradiometer and the AIGG shows that an ultra-high-precision spacecraft-borne AIGG is well placed to make a significant contribution to globally mapping the Venusian gravitational field and modeling the geoid with unprecedented accuracy and spatial resolution. Finally, the future dedicated Venus-SGG spacecraft should adopt optimal matching accuracy indices of 3 × 10⁻¹³/s² in gravity gradient, 10 m in orbital position and 8 × 10⁻⁴ m/s in orbital velocity, together with preferred orbital parameters comprising an orbital altitude of 300 ± 50 km, an observation time of 60 months and a sampling interval of 1 s.

  13. Accuracy Analysis and Validation of the Mars Science Laboratory (MSL) Robotic Arm

    NASA Technical Reports Server (NTRS)

    Collins, Curtis L.; Robinson, Matthew L.

    2013-01-01

    The Mars Science Laboratory (MSL) Curiosity Rover is currently exploring the surface of Mars with a suite of tools and instruments mounted to the end of a five degree-of-freedom robotic arm. To verify and meet a set of end-to-end system level accuracy requirements, a detailed positioning uncertainty model of the arm was developed and exercised over the arm operational workspace. Error sources at each link in the arm kinematic chain were estimated and their effects propagated to the tool frames. A rigorous test and measurement program was developed and implemented to collect data to characterize and calibrate the kinematic and stiffness parameters of the arm. Numerous absolute and relative accuracy and repeatability requirements were validated with a combination of analysis and test data extrapolated to the Mars gravity and thermal environment. Initial results of arm accuracy and repeatability on Mars demonstrate the effectiveness of the modeling and test program as the rover continues to explore the foothills of Mount Sharp.
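
    The error-propagation idea described here (per-link error sources propagated to the tool frame) can be sketched generically with a Monte Carlo simulation of a planar serial arm. The link lengths, joint-error magnitudes and the planar model are all illustrative assumptions, not the MSL arm model.

```python
import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Planar serial-arm forward kinematics: tool (x, y) position."""
    theta = np.cumsum(joint_angles)
    return np.array([np.sum(link_lengths * np.cos(theta)),
                     np.sum(link_lengths * np.sin(theta))])

rng = np.random.default_rng(0)
links = np.array([1.0, 0.8, 0.5, 0.3, 0.2])               # hypothetical link lengths (m)
nominal = np.deg2rad([30.0, -20.0, 45.0, 10.0, -15.0])    # commanded joint angles
sigma_joint = np.deg2rad(0.05)                            # assumed per-joint angular error (1-sigma)
sigma_link = 0.0005                                       # assumed link-length error (m, 1-sigma)

nominal_tool = forward_kinematics(nominal, links)
samples = np.array([
    forward_kinematics(nominal + rng.normal(0, sigma_joint, nominal.size),
                       links + rng.normal(0, sigma_link, links.size))
    for _ in range(20000)
])
err = np.linalg.norm(samples - nominal_tool, axis=1)
print(f"mean tool-position error: {err.mean() * 1000:.2f} mm, "
      f"95th percentile: {np.percentile(err, 95) * 1000:.2f} mm")
```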

  14. Design and accuracy analysis of a metamorphic CNC flame cutting machine for ship manufacturing

    NASA Astrophysics Data System (ADS)

    Hu, Shenghai; Zhang, Manhui; Zhang, Baoping; Chen, Xi; Yu, Wei

    2016-05-01

    Current research on machining large fabrication holes in complex spatial curved surfaces focuses mainly on the design of CNC flame cutting machines for ship hulls in ship manufacturing. However, the existing machines cannot meet the requirement of continuous cutting with variable pass conditions because of their fixed configuration, and they cannot realize high-precision processing because the underlying accuracy theory has not been studied adequately. This paper deals with the structure design and accuracy prediction technology of a novel machine tool aimed at solving the problem of continuous and high-precision cutting. The required variable-trajectory and variable-pose kinematic characteristics of the non-contact cutting tool are determined, and a metamorphic CNC flame cutting machine designed through the metamorphic principle is presented. To analyze the kinematic accuracy of the machine, models of joint clearances, manufacturing tolerances and errors in the input variables, as well as error models considering their combined effects, are derived based on screw theory after establishing the ideal kinematic models. Numerical simulations, a processing experiment and a trajectory tracking experiment are conducted for an eccentric hole with bevels on a cylindrical surface. The resulting cutting pass contour and kinematic error intervals, with the position error ranging from -0.975 mm to +0.628 mm and the orientation error from -0.01 rad to +0.01 rad, indicate that the developed machine can complete the cutting process continuously and effectively, and that the established kinematic error models are effective even though the error interval is relatively large. The results also show the match between the metamorphic principle and variable working tasks, and the mapping between the original design parameters and the kinematic errors of the machine. This research develops a metamorphic CNC flame cutting machine and establishes kinematic error models for the accuracy analysis of machine tools.

  15. A critical analysis of the accuracy of several numerical techniques for combustion kinetic rate equations

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan

    1993-01-01

    A detailed analysis of the accuracy of several techniques recently developed for integrating stiff ordinary differential equations is presented. The techniques include two general-purpose codes EPISODE and LSODE developed for an arbitrary system of ordinary differential equations, and three specialized codes CHEMEQ, CREK1D, and GCKP4 developed specifically to solve chemical kinetic rate equations. The accuracy study is made by application of these codes to two practical combustion kinetics problems. Both problems describe adiabatic, homogeneous, gas-phase chemical reactions at constant pressure, and include all three combustion regimes: induction, heat release, and equilibration. To illustrate the error variation in the different combustion regimes the species are divided into three types (reactants, intermediates, and products), and error versus time plots are presented for each species type and the temperature. These plots show that CHEMEQ is the most accurate code during induction and early heat release. During late heat release and equilibration, however, the other codes are more accurate. A single global quantity, a mean integrated root-mean-square error, that measures the average error incurred in solving the complete problem is used to compare the accuracy of the codes. Among the codes examined, LSODE is the most accurate for solving chemical kinetics problems. It is also the most efficient code, in the sense that it requires the least computational work to attain a specified accuracy level. An important finding is that use of the algebraic enthalpy conservation equation to compute the temperature can be more accurate and efficient than integrating the temperature differential equation.
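
    As a hedged illustration of how such accuracy comparisons are made, the sketch below integrates the Robertson problem (a standard stiff kinetics test case, not one of the combustion mechanisms in the report) with SciPy's BDF solver at loose and tight tolerances and computes an RMS error of the loose run against the tight-tolerance reference.

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson problem: a classic stiff chemical-kinetics test case."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
            0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
            3.0e7 * y2 ** 2]

y0 = [1.0, 0.0, 0.0]
t_eval = np.logspace(-5, 5, 200)

# Tight-tolerance BDF run used as the reference solution
ref = solve_ivp(robertson, (0, 1e5), y0, method="BDF",
                rtol=1e-10, atol=1e-12, t_eval=t_eval)

# Loose-tolerance run whose error we want to quantify
test = solve_ivp(robertson, (0, 1e5), y0, method="BDF",
                 rtol=1e-4, atol=1e-8, t_eval=t_eval)

# Mixed absolute/relative scaling avoids blow-up where a species is near zero
scale = np.abs(ref.y) + 1e-6
rms = np.sqrt(np.mean(((test.y - ref.y) / scale) ** 2))
print(f"mean integrated RMS (scaled) error of the loose run: {rms:.2e}")
```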

  16. Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters

    NASA Astrophysics Data System (ADS)

    Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to carry out lunar surface sampling and to return the samples to the Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a camera rotating platform. Optical images of the sampling area can be obtained by PCAM in the form of two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images. Lunar terrain can then be reconstructed based on photogrammetry. The installation parameters of PCAM with respect to the CE-5 lander are critical for the calculation of the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied for this work. Research contents such as the observation program and the specific solution methods for the installation parameters are then introduced. The parametric solution accuracy is analyzed according to observations obtained during the PCAM scientific validation experiment, which is used to test the authenticity of the PCAM detection process, the ground data processing methods, product quality and so on. The analysis results show that the accuracy of the installation parameters affects the positional accuracy of corresponding image points of PCAM stereo images within 1 pixel. The measurement methods and parameter accuracy studied in this paper therefore meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion

  17. Accuracy of surface tension measurement from drop shapes: the role of image analysis.

    PubMed

    Kalantarian, Ali; Saad, Sameh M I; Neumann, A Wilhelm

    2013-11-01

    Axisymmetric Drop Shape Analysis (ADSA) has been extensively used for surface tension measurement. In essence, ADSA works by matching a theoretical profile of the drop to the extracted experimental profile, taking surface tension as an adjustable parameter. Of the three main building blocks of ADSA, i.e. edge detection, the numerical integration of the Laplace equation for generating theoretical curves and the optimization procedure, only edge detection (that extracts the drop profile line from the drop image) needs extensive study. For the purpose of this article, the numerical integration of the Laplace equation for generating theoretical curves and the optimization procedure will only require a minor effort. It is the aim of this paper to investigate how far the surface tension accuracy of drop shape techniques can be pushed by fine tuning and optimizing edge detection strategies for a given drop image. Two different aspects of edge detection are pursued here: sub-pixel resolution and pixel resolution. The effect of two sub-pixel resolution strategies, i.e. spline and sigmoid, on the accuracy of surface tension measurement is investigated. It is found that the number of pixel points in the fitting procedure of the sub-pixel resolution techniques is crucial, and its value should be determined based on the contrast of the image, i.e. the gray level difference between the drop and the background. On the pixel resolution side, two suitable and reliable edge detectors, i.e. Canny and SUSAN, are explored, and the effect of user-specified parameters of the edge detector on the accuracy of surface tension measurement is scrutinized. Based on the contrast of the image, an optimum value of the user-specified parameter of the edge detector, SUSAN, is suggested. Overall, an accuracy of 0.01 mJ/m² is achievable for the surface tension determination by careful fine tuning of edge detection algorithms. PMID:24018120
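
    The sigmoid sub-pixel strategy mentioned above can be illustrated with a one-dimensional sketch: fit a smoothed step to the intensity profile sampled across the drop edge and take the fitted transition point as the sub-pixel edge location. The profile below is synthetic and the parametrization is an assumption, not the ADSA implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, x0, k, lo, hi):
    """Smoothed step: intensity transition across the drop edge."""
    return lo + (hi - lo) / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical 1-D intensity profile sampled perpendicular to the drop edge,
# with the true edge located at pixel 10.3 and additive noise.
x = np.arange(21, dtype=float)
rng = np.random.default_rng(1)
profile = sigmoid(x, 10.3, 2.0, 20.0, 200.0) + rng.normal(0, 2.0, x.size)

# Initial guess: steepest-gradient pixel for x0, observed min/max for the plateaus
p0 = [x[np.argmax(np.abs(np.gradient(profile)))], 1.0, profile.min(), profile.max()]
popt, _ = curve_fit(sigmoid, x, profile, p0=p0)
print(f"sub-pixel edge position: {popt[0]:.3f} px (pixel-level estimate: {p0[0]:.0f} px)")
```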

  18. Spot detection accuracy analysis in turbulent channel for free space optical communication

    NASA Astrophysics Data System (ADS)

    Liu, Yan-Fei; Dai, Yong-Hong; Yu, Sheng-Lin; Xin, Shan; Chen, Jing; Ai, Yong

    2015-10-01

    High frame rate CMOS cameras have become increasingly important in optical communication acquisition, pointing and tracking (APT) systems owing to their compact structure, ease of development and suitability for beacon light spot detection in the atmospheric channel. As spot position accuracy directly determines the performance of space optical communication, it is very important to design a high precision spot center algorithm. Spot location algorithms usually rely on the gravity center (centroid) algorithm, the shape center algorithm or a self-adaptive threshold algorithm. In our experiments we analyzed the characteristics of spots transmitted through atmospheric turbulence and studied light transmission characteristics in a turbulent channel. We carried out beacon light detection experiments over a distance of 3.4 km, collecting the beacon spots on a CMOS camera together with the signal light power. We calculated the spot position with two different algorithms and compared the calculation accuracy between the field dispersive spot and an ideal Gaussian laser spot. The experimental research shows that the gravity center algorithm is more suitable for the beacon beam spot, with an accuracy improvement of about 1.3 pixels for a Gaussian spot, whereas the shape center algorithm has higher precision. The reasons were analyzed, providing important preparation for subsequent testing.
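
    A minimal sketch of the gravity center (intensity-weighted centroid) algorithm discussed above is given below; the thresholding scheme and the synthetic spot are illustrative assumptions rather than the authors' processing chain.

```python
import numpy as np

def spot_centroid(image, threshold):
    """Intensity-weighted centroid ("gravity center") of a beacon spot:
    pixels below the threshold are ignored to suppress background."""
    img = np.where(image >= threshold, image.astype(float), 0.0)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

# Hypothetical Gaussian-like spot centered at (x=12.4, y=7.8) on a noisy background
rng = np.random.default_rng(2)
rows, cols = np.indices((16, 24))
spot = 200.0 * np.exp(-((cols - 12.4) ** 2 + (rows - 7.8) ** 2) / (2 * 2.0 ** 2))
frame = spot + rng.normal(10.0, 2.0, spot.shape)

print("estimated centroid (x, y):", spot_centroid(frame, threshold=30.0))
```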

  19. Accuracy of finite-element models for the stress analysis of multiple-holed moderator blocks

    SciTech Connect

    Smith, P.D.; Sullivan, R.M.; Lewis, A.C.; Yu, H.J.

    1981-01-01

    Two steps have been taken to quantify and improve the accuracy in the analysis. First, the limitations of various approximation techniques have been studied with the aid of smaller benchmark problems containing fewer holes. Second, a new family of computer programs has been developed for handling such large problems. This paper describes the accuracy studies and the benchmark problems. A review is given of some proposed modeling techniques including local mesh refinement, homogenization, a special-purpose finite element, and substructuring. Some limitations of these approaches are discussed. The new finite element programs and the features that contribute to their efficiency are discussed. These include a standard architecture for out-of-core data processing and an equation solver that operates on a peripheral array processor. The central conclusions of the paper are: (1) modeling approximation methods such as local mesh refinement and homogenization tend to be unreliable, and they should be justified by a fine mesh benchmark analysis; and (2) finite element codes are now available that can achieve accurate solutions at a reasonable cost, and there is no longer a need to employ modeling approximations in the two-dimensional analysis of HTGR fuel elements. 10 figures.

  20. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body and mechanical system kinematic analyses using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) and digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to assess the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.

  1. Accuracy evaluation of a new stereophotogrammetry-based functional method for joint kinematic analysis in biomechanics.

    PubMed

    Galetto, Maurizio; Gastaldi, Laura; Lisco, Giulia; Mastrogiacomo, Luca; Pastorelli, Stefano

    2014-11-01

    Human joint kinematics is an interesting topic in biomechanics and is useful for the analysis of human movement in several fields. A crucial issue regards the assessment of joint parameters, such as axes and centers of rotation, because of their direct influence on human motion patterns. A proper accuracy in the estimation of these parameters is hence required. On the whole, stereophotogrammetry-based predictive methods and, as an alternative, functional ones can be used to this end. This article presents a new functional algorithm for the assessment of knee joint parameters, based on a polycentric hinge model for knee flexion-extension. The proposed algorithm is discussed, identifying its fields of application and its limits. The techniques for estimating the joint parameters are analyzed from the metrological point of view, so as to lay the groundwork for enhancing and eventually replacing the predictive methods currently used in laboratories of human movement analysis. This article also presents an assessment of the accuracy associated with the whole process of measurement and joint parameter estimation. To this end, the presented functional method is tested through both computer simulations and a series of experimental laboratory tests in which swing motions were imposed on a polycentric mechanical analogue and recorded with a stereophotogrammetric system. PMID:25500863
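
    For orientation, the sketch below shows a much simpler functional approach, an algebraic least-squares sphere fit that estimates a fixed center of rotation from marker trajectories; it is not the authors' polycentric-hinge algorithm, and the marker data are simulated.

```python
import numpy as np

def fit_center_of_rotation(points):
    """Algebraic least-squares sphere fit: returns the estimated center of
    rotation and radius from a cloud of marker positions."""
    p = np.asarray(points, float)
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])
    b = np.sum(p ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + np.dot(center, center))
    return center, radius

# Hypothetical marker sweeping a spherical patch of radius 0.25 m about (0.1, 0.2, 0.3)
rng = np.random.default_rng(3)
true_center = np.array([0.1, 0.2, 0.3])
theta = np.linspace(0.3, 1.2, 60)              # flexion-like sweep
phi = 0.3 * np.sin(5 * theta)                  # small out-of-plane motion
pts = true_center + 0.25 * np.column_stack(
    [np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi), np.cos(theta)])
pts = pts + rng.normal(0, 0.0005, pts.shape)   # simulated stereophotogrammetric noise

center, radius = fit_center_of_rotation(pts)
print("estimated center:", np.round(center, 4), " radius:", round(radius, 4))
```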

  2. Combined Scintigraphy and Tumor Marker Analysis Predicts Unfavorable Histopathology of Neuroblastic Tumors with High Accuracy

    PubMed Central

    Fendler, Wolfgang Peter; Wenter, Vera; Thornton, Henriette Ingrid; Ilhan, Harun; von Schweinitz, Dietrich; Coppenrath, Eva; Schmid, Irene; Bartenstein, Peter; Pfluger, Thomas

    2015-01-01

    Objectives Our aim was to improve the prediction of unfavorable histopathology (UH) in neuroblastic tumors through combined imaging and biochemical parameters. Methods 123I-MIBG SPECT and MRI were performed before surgical resection or biopsy in 47 consecutive pediatric patients with neuroblastic tumor. The semi-quantitative tumor-to-liver count-rate ratio (TLCRR), MRI tumor size and margins, urine catecholamine levels and blood levels of neuron specific enolase (NSE) were recorded. The accuracy of single and combined variables for the prediction of UH was tested by ROC analysis with Bonferroni correction. Results 34 of 47 patients had UH based on the International Neuroblastoma Pathology Classification (INPC). TLCRR and serum NSE each predicted UH with moderate accuracy. The optimal cut-off for TLCRR was 2.0, resulting in 68% sensitivity and 100% specificity (AUC-ROC 0.86, p < 0.001). The optimal cut-off for NSE was 25.8 ng/ml, resulting in 74% sensitivity and 85% specificity (AUC-ROC 0.81, p = 0.001). Combination of the TLCRR/NSE criteria reduced false negative findings from 11 and 9, respectively, to only five, with improved sensitivity and specificity of 85% (AUC-ROC 0.85, p < 0.001). Conclusion Strong 123I-MIBG uptake and a high serum level of NSE were each predictive of UH. Combined analysis of both parameters improved the prediction of UH in patients with neuroblastic tumor. MRI parameters and urine catecholamine levels did not predict UH. PMID:26177109
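
    The ROC analysis with an optimal cut-off, as used above, can be sketched as follows; the class sizes mirror the 34/13 split but the scores are synthetic, and cut-off selection by Youden's J is an assumption about the criterion rather than a statement of the authors' method.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical tumor-to-liver count-rate ratios for favorable (0) and
# unfavorable (1) histopathology; not the study's patient data.
rng = np.random.default_rng(4)
y = np.concatenate([np.zeros(13, int), np.ones(34, int)])
scores = np.concatenate([rng.normal(1.3, 0.4, 13), rng.normal(2.6, 0.9, 34)])

auc = roc_auc_score(y, scores)
fpr, tpr, thresholds = roc_curve(y, scores)
j = tpr - fpr                                   # Youden's J statistic
best = np.argmax(j)
print(f"AUC = {auc:.2f}; optimal cut-off = {thresholds[best]:.2f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```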

  3. The Accuracy of the Swallowing Kinematic Analysis at Various Movement Velocities of the Hyoid and Epiglottis

    PubMed Central

    Lee, Seung Hak; Chun, Seong Min; Lee, Jung Chan; Min, Yusun; Bang, Sang-Heum; Kim, Hee Chan; Han, Tai Ryoon

    2013-01-01

    Objective To evaluate the accuracy of swallowing kinematic analysis. Methods To evaluate the accuracy at various velocities of movement, we developed an instrumental model of linear and rotational movement, representing the physiologic movement of the hyoid and epiglottis, respectively. A still image of 8 objects was used for measuring the length of the objects as a basic screening, and 18 movie files of the instrumental model were taken from videofluoroscopy at different velocities. The images and movie files were digitized and analyzed by an experienced examiner, who was blinded to the study. Results The Pearson correlation coefficients between the measured and instrumental reference values were over 0.99 (p<0.001) for all of the analyses. Bland-Altman plots showed narrow ranges of the 95% confidence interval of agreement between the measured and reference values as follows: 0.14 to 0.94 mm for distances in a still image, -0.14 to 1.09 mm/s for linear velocities, and -1.02 to 3.81 degree/s for angular velocities. Conclusion Our findings demonstrate that the distance and velocity measurements obtained by swallowing kinematic analysis are highly valid over a wide range of movement velocities. PMID:23869329
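
    The Bland-Altman agreement analysis referred to above can be reproduced in a few lines; the displacement values below are invented solely to show the computation of the bias and the 95% limits of agreement.

```python
import numpy as np

def bland_altman_limits(measured, reference):
    """Bland-Altman bias and 95% limits of agreement between two methods."""
    d = np.asarray(measured, float) - np.asarray(reference, float)
    bias = d.mean()
    loa = 1.96 * d.std(ddof=1)
    return bias, bias - loa, bias + loa

# Hypothetical hyoid displacement values (mm): kinematic analysis vs. reference
measured  = [10.2, 12.5, 9.8, 14.1, 11.0, 13.3]
reference = [10.0, 12.0, 9.9, 13.7, 10.6, 13.0]
bias, lower, upper = bland_altman_limits(measured, reference)
print(f"bias = {bias:.2f} mm, 95% limits of agreement = [{lower:.2f}, {upper:.2f}] mm")
```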

  4. An analysis of the accuracy of magnetic resonance flip angle measurement methods

    NASA Astrophysics Data System (ADS)

    Morrell, Glen R.; Schabel, Matthias C.

    2010-10-01

    Several methods of flip angle mapping for magnetic resonance imaging have been proposed. We evaluated the accuracy of five methods of flip angle measurement in the presence of measurement noise. Our analysis was performed in a closed form by propagation of probability density functions (PDFs). The flip angle mapping methods compared were (1) the phase-sensitive method, (2) the dual-angle method using gradient recalled echoes (GRE), (3) an extended version of the GRE dual-angle method incorporating phase information, (4) the AFI method and (5) an extended version of the AFI method incorporating phase information. Our analysis took into account differences in required imaging time for these methods in the comparison of noise efficiency. PDFs of the flip angle estimate for each method for each value of true flip angle were calculated. These PDFs completely characterize the performance of each method. Mean bias and standard deviation were computed from these PDFs to more simply quantify the relative accuracy of each method over its range of measurable flip angles. We demonstrate that the phase-sensitive method provides the lowest mean bias and standard deviation of flip angle estimate of the five methods evaluated over a wide range of flip angles.
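
    As context for the dual-angle method compared above, the sketch below implements the idealized dual-angle relation (fully relaxed gradient-echo signals at nominal angles alpha and 2*alpha); it ignores T1 effects and noise-bias corrections, so it is a simplified model rather than any of the five evaluated implementations.

```python
import numpy as np

def dual_angle_flip_map(s1, s2):
    """Idealized dual-angle flip-angle estimate from fully relaxed GRE signals
    acquired at nominal angles alpha and 2*alpha:
        S2 / S1 = sin(2a) / sin(a) = 2 cos(a)  ->  a = arccos(S2 / (2 S1))."""
    ratio = np.clip(s2 / (2.0 * s1), -1.0, 1.0)
    return np.degrees(np.arccos(ratio))

# Hypothetical noisy signal pairs for a true flip angle of 55 degrees
rng = np.random.default_rng(5)
true_alpha = np.deg2rad(55.0)
s1 = np.sin(true_alpha) + rng.normal(0, 0.01, 10000)
s2 = np.sin(2 * true_alpha) + rng.normal(0, 0.01, 10000)
est = dual_angle_flip_map(s1, s2)
print(f"mean estimate: {est.mean():.2f} deg, std: {est.std():.2f} deg")
```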

  5. A Comparative Accuracy Analysis of Classification Methods in Determination of Cultivated Lands with Spot 5 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    kaya, S.; Alganci, U.; Sertel, E.; Ustundag, B.

    2013-12-01

    Cultivated land determination and area estimation are important tasks for agricultural management. The derived information is mostly used in agricultural policies and precision agriculture, specifically in yield estimation, irrigation and fertilization management, and verification of farmers' declarations. The use of satellite images in crop type identification and area estimation has been common for two decades owing to their capability of monitoring large areas, rapid data acquisition and spectral response to crop properties. With the launch of high and very high spatial resolution optical satellites in the last decade, such analyses have gained importance as they provide information at a large scale. With the increasing spatial resolution of satellite images, the image classification methods used to derive information from them have become important, given the increase in spectral heterogeneity within land objects. In this research, pixel-based classification with the maximum likelihood algorithm and object-based classification with the nearest neighbor algorithm were applied to 2012-dated 2.5 m resolution SPOT 5 satellite images in order to investigate the accuracy of these methods in the determination of cotton- and corn-planted lands and their area estimation. The study area was selected in Sanliurfa Province, located in Southeastern Turkey, which contributes to Turkey's agricultural production in a major way. Classification results were compared in terms of crop type identification using

  6. Repetition, not number of sources, increases both susceptibility to misinformation and confidence in the accuracy of eyewitnesses.

    PubMed

    Foster, Jeffrey L; Huthwaite, Thomas; Yesberg, Julia A; Garry, Maryanne; Loftus, Elizabeth F

    2012-02-01

    Are claims more credible when made by multiple sources, or is it the repetition of claims that matters? Some research suggests that claims have more credibility when independent sources make them. Yet, other research suggests that simply repeating information makes it more accessible and encourages reliance on automatic processes-factors known to change people's judgments. In Experiment 1, people took part in a "misinformation" study: people first watched a video of a crime and later read eyewitness reports attributed to one or three different eyewitnesses who made misleading claims in either one report or repeated the same misleading claims across all three reports. In Experiment 2, people who had not seen any videos read those same reports and indicated how confident they were that each claim happened in the original event. People were more misled by-and more confident about-claims that were repeated, regardless of how many eyewitnesses made them. We hypothesize that people interpreted the familiarity of repeated claims as markers of accuracy. These findings fit with research showing that repeating information makes it seem more true, and highlight the power of a single repeated voice. PMID:22257711

  7. Zagreb Amblyopia Preschool Screening Study: near and distance visual acuity testing increase the diagnostic accuracy of screening for amblyopia

    PubMed Central

    Bušić, Mladen; Bjeloš, Mirjana; Petrovečki, Mladen; Kuzmanović Elabjer, Biljana; Bosnar, Damir; Ramić, Senad; Miletić, Daliborka; Andrijašević, Lidija; Kondža Krstonijević, Edita; Jakovljević, Vid; Bišćan Tvrdi, Ana; Predović, Jurica; Kokot, Antonio; Bišćan, Filip; Kovačević Ljubić, Mirna; Motušić Aras, Ranka

    2016-01-01

    Aim To present and evaluate a new screening protocol for amblyopia in preschool children. Methods The Zagreb Amblyopia Preschool Screening (ZAPS) study protocol performed screening for amblyopia by near and distance visual acuity (VA) testing of 15 648 children aged 48-54 months attending kindergartens in the City of Zagreb County between September 2011 and June 2014, using the Lea Symbols in lines test. If VA in either eye was >0.1 logMAR, the child was re-tested; if the child failed the re-test, the child was referred for a comprehensive eye examination at the Eye Clinic. Results 78.04% of children passed the screening test. The estimated prevalence of amblyopia was 8.08%. Testability, sensitivity, and specificity of the ZAPS study protocol were 99.19%, 100.00%, and 96.68%, respectively. Conclusion The ZAPS study used the most discriminative VA test with optotypes in lines, as they do not underestimate amblyopia. The estimated prevalence of amblyopia was considerably higher than reported elsewhere. To the best of our knowledge, the ZAPS study protocol reached the highest sensitivity and specificity reported when evaluating the diagnostic accuracy of VA tests for screening. The pass level defined at ≤0.1 logMAR for 4-year-old children, using Lea Symbols in lines, missed no amblyopia cases, indicating that both near and distance VA testing should be performed when screening for amblyopia. PMID:26935612

  8. Tissue Probability Map Constrained 4-D Clustering Algorithm for Increased Accuracy and Robustness in Serial MR Brain Image Segmentation

    PubMed Central

    Xue, Zhong; Shen, Dinggang; Li, Hai; Wong, Stephen

    2010-01-01

    The traditional fuzzy clustering algorithm and its extensions have been successfully applied in medical image segmentation. However, because of the variability of tissues and anatomical structures, the clustering results might be biased by the tissue population and intensity differences. For example, clustering-based algorithms tend to over-segment the white matter tissues of MR brain images. To solve this problem, we introduce a tissue probability map constrained clustering algorithm and apply it to serial MR brain image segmentation, i.e., a series of 3-D MR brain images of the same subject at different time points. Using the new serial image segmentation algorithm within the CLASSIC framework, which iteratively segments the images and estimates the longitudinal deformations, we improved both accuracy and robustness for serial image computing, and at the same time produced longitudinally consistent segmentations and stable measures. In the algorithm, the tissue probability maps consist of both population-based and subject-specific segmentation priors. An experimental study using both simulated longitudinal MR brain data and the Alzheimer’s Disease Neuroimaging Initiative (ADNI) data confirmed that using both priors yields more accurate and robust segmentation results. The proposed algorithm can be applied in longitudinal follow-up studies of MR brain imaging with subtle morphological changes for neurological disorders. PMID:26566399

  9. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins

    PubMed Central

    Afanasyev, Vsevolod; Buldyrev, Sergey V.; Dunn, Michael J.; Robst, Jeremy; Preston, Mark; Bremner, Steve F.; Briggs, Dirk R.; Brown, Ruth; Adlard, Stacey; Peat, Helen J.

    2015-01-01

    A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus) at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge’s accurate performance and demonstrates how its design is a significant improvement on existing systems. PMID:25894763

  10. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins.

    PubMed

    Afanasyev, Vsevolod; Buldyrev, Sergey V; Dunn, Michael J; Robst, Jeremy; Preston, Mark; Bremner, Steve F; Briggs, Dirk R; Brown, Ruth; Adlard, Stacey; Peat, Helen J

    2015-01-01

    A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus) at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge's accurate performance and demonstrates how its design is a significant improvement on existing systems. PMID:25894763

  11. High Accuracy Liquid Propellant Slosh Predictions Using an Integrated CFD and Controls Analysis Interface

    NASA Technical Reports Server (NTRS)

    Marsell, Brandon; Griffin, David; Schallhorn, Dr. Paul; Roth, Jacob

    2012-01-01

    Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.

  12. Integrated CFD and Controls Analysis Interface for High Accuracy Liquid Propellant Slosh Predictions

    NASA Technical Reports Server (NTRS)

    Marsell, Brandon; Griffin, David; Schallhorn, Paul; Roth, Jacob

    2012-01-01

    Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.

  13. Gaining Precision and Accuracy on Microprobe Trace Element Analysis with the Multipoint Background Method

    NASA Astrophysics Data System (ADS)

    Allaz, J. M.; Williams, M. L.; Jercinovic, M. J.; Donovan, J. J.

    2014-12-01

    Electron microprobe trace element analysis is a significant challenge, but can provide critical data when high spatial resolution is required. Due to the low peak intensity, the accuracy and precision of such analyses rely critically on background measurements, and on the accuracy of any pertinent peak interference corrections. A linear regression between two points selected at appropriate off-peak positions is a classical approach for background characterization in microprobe analysis. However, this approach disallows an accurate assessment of background curvature (usually exponential). Moreover, if present, background interferences can dramatically affect the results if underestimated or ignored. The acquisition of a quantitative WDS scan over the spectral region of interest is still a valuable option for determining the background intensity and curvature from a fitted regression of background portions of the scan, but this technique retains an element of subjectivity as the analyst has to select areas in the scan which appear to represent background. We present here a new method, "Multi-Point Background" (MPB), that allows acquiring up to 24 off-peak background measurements from wavelength positions around the peaks. This method aims to improve the accuracy, precision, and objectivity of trace element analysis. The overall efficiency is improved because no systematic WDS scan needs to be acquired in order to check for the presence of possible background interferences. Moreover, the method is less subjective because "true" backgrounds are selected by the statistical exclusion of erroneous background measurements, reducing the need for analyst intervention. This idea originated from efforts to refine EPMA monazite U-Th-Pb dating, where it was recognised that background errors (peak interference or background curvature) could result in errors of several tens of millions of years in the calculated age. Results obtained on a CAMECA SX-100 "UltraChron" using monazite
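
    The statistical exclusion of erroneous background points can be illustrated with a small sketch: fit an exponential background to multiple off-peak measurements and iteratively reject points whose residuals are too large (for example, points sitting on an unrecognized interference). This is a generic illustration under simplified assumptions, not the MPB implementation used with the microprobe software.

```python
import numpy as np

def mpb_background(positions, counts, peak_pos, n_sigma=2.0, max_iter=5):
    """Estimate the continuum under a peak from multiple off-peak measurements:
    fit an exponential background, ln(I) = a + b*x, and iteratively exclude
    measurements whose residual exceeds n_sigma standard deviations."""
    x = np.asarray(positions, float)
    y = np.log(np.asarray(counts, float))
    keep = np.ones(x.size, dtype=bool)
    for _ in range(max_iter):
        b, a = np.polyfit(x[keep], y[keep], 1)            # slope, intercept in log space
        resid = y - (a + b * x)
        new_keep = np.abs(resid) <= n_sigma * resid[keep].std(ddof=1)
        if new_keep.sum() < 3 or np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return np.exp(a + b * peak_pos), keep

# Hypothetical off-peak positions (spectrometer units) and count rates;
# the point at x = 4 sits on a simulated interference and should be rejected.
rng = np.random.default_rng(6)
pos = np.array([-6, -5, -4, -3, 3, 4, 5, 6], float)
cts = 100.0 * np.exp(-0.05 * pos) * rng.normal(1.0, 0.01, pos.size)
cts[5] *= 1.6                                             # simulated background interference
bkg, used = mpb_background(pos, cts, peak_pos=0.0)
print(f"background under the peak: {bkg:.1f} counts; points used: {used}")
```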

  14. Sensitivity Analysis for Characterizing the Accuracy and Precision of JEM/SMILES Mesospheric O3

    NASA Astrophysics Data System (ADS)

    Esmaeili Mahani, M.; Baron, P.; Kasai, Y.; Murata, I.; Kasaba, Y.

    2011-12-01

    The main purpose of this study is to evaluate the Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) measurements of mesospheric ozone, O3. As a first step, the error due to the impact of Mesospheric Temperature Inversions (MTIs) on ozone retrieval has been determined. The impacts of other parameters, such as pressure variability, solar events, etc., on mesospheric O3 will also be investigated. Ozone is known to be important because the stratospheric O3 layer protects life on Earth by absorbing harmful UV radiation. However, O3 chemistry can be studied relatively cleanly in the mesosphere, without the complications of heterogeneous chemistry and dynamical variations, owing to the short lifetime of O3 in this region. Mesospheric ozone is produced by the photo-dissociation of O2 and the subsequent reaction of O with O2. Diurnal and semi-diurnal variations of mesospheric ozone are associated with variations in solar activity. The amplitude of the diurnal variation increases from a few percent at an altitude of 50 km to about 80 percent at 70 km. Despite the apparent simplicity of this situation, significant disagreements exist between the predictions from existing models and observations, which need to be resolved. SMILES is a highly sensitive radiometer with a precision of a few to several tens of percent from the upper troposphere to the mesosphere. SMILES was developed by the Japan Aerospace Exploration Agency (JAXA) and the National Institute of Information and Communications Technology (NICT) and is located on the Japanese Experiment Module (JEM) of the International Space Station (ISS). SMILES successfully measured the vertical distributions and the diurnal variations of various atmospheric species in the latitude range 38S to 65N from October 2009 to April 2010. A sensitivity analysis is being conducted to investigate the expected precision and accuracy of the mesospheric O3 profiles (from 50 to 90 km height) due to the impact of Mesospheric Temperature

  15. A unification of models for meta-analysis of diagnostic accuracy studies without a gold standard.

    PubMed

    Liu, Yulun; Chen, Yong; Chu, Haitao

    2015-06-01

    Several statistical methods for meta-analysis of diagnostic accuracy studies have been discussed in the presence of a gold standard. However, in practice, the selected reference test may be imperfect due to measurement error, non-existence, invasive nature, or expensive cost of a gold standard. It has been suggested that treating an imperfect reference test as a gold standard can lead to substantial bias in the estimation of diagnostic test accuracy. Recently, two models have been proposed to account for imperfect reference test, namely, a multivariate generalized linear mixed model (MGLMM) and a hierarchical summary receiver operating characteristic (HSROC) model. Both models are very flexible in accounting for heterogeneity in accuracies of tests across studies as well as the dependence between tests. In this article, we show that these two models, although with different formulations, are closely related and are equivalent in the absence of study-level covariates. Furthermore, we provide the exact relations between the parameters of these two models and assumptions under which two models can be reduced to equivalent submodels. On the other hand, we show that some submodels of the MGLMM do not have corresponding equivalent submodels of the HSROC model, and vice versa. With three real examples, we illustrate the cases when fitting the MGLMM and HSROC models leads to equivalent submodels and hence identical inference, and the cases when the inferences from two models are slightly different. Our results generalize the important relations between the bivariate generalized linear mixed model and HSROC model when the reference test is a gold standard. PMID:25358907

  16. TP53 Mutational Analysis Enhances the Prognostic Accuracy of IHC4 and PAM50 Assays

    PubMed Central

    Lin, Ching-Hung; Chen, I-Chiun; Huang, Chiun-Sheng; Hu, Fu-Chang; Kuo, Wen-Hung; Kuo, Kuan-Ting; Wang, Chung-Chieh; Wu, Pei-Fang; Chang, Dwan-Ying; Wang, Ming-Yang; Chang, Chin-Hao; Chen, Wei-Wu; Lu, Yen-Shen; Cheng, Ann-Lii

    2015-01-01

    IHC4 and PAM50 assays have been shown to provide additional prognostic information for patients with early breast cancer. We evaluated whether incorporating TP53 mutation analysis can further enhance their prognostic accuracy. We examined TP53 mutation and the IHC4 score in tumors of 605 patients diagnosed with stage I–III breast cancer at National Taiwan University Hospital (the NTUH cohort). We obtained information regarding TP53 mutation and PAM50 subtypes in 699 tumors from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) cohort. We found that TP53 mutation was significantly associated with high-risk IHC4 group and with luminal B, HER2-enriched, and basal-like subtypes. Despite the strong associations, TP53 mutation independently predicted shorter relapse-free survival (hazard ratio [HR] = 1.63, P = 0.007) in the NTUH cohort and shorter breast cancer-specific survival (HR = 2.35, P < 0.001) in the METABRIC cohort. TP53 mutational analysis added significant prognostic information in addition to the IHC4 score (∆ LR-χ2 = 8.61, P = 0.002) in the NTUH cohort and the PAM50 subtypes (∆ LR-χ2 = 18.9, P < 0.001) in the METABRIC cohort. We conclude that incorporating TP53 mutation analysis can enhance the prognostic accuracy of the IHC4 and PAM50 assays. PMID:26671300

  17. Reconstruction Accuracy Assessment of Surface and Underwater 3D Motion Analysis: A New Approach

    PubMed Central

    de Jesus, Kelly; de Jesus, Karla; Figueiredo, Pedro; Vilas-Boas, João Paulo; Fernandes, Ricardo Jorge; Machado, Leandro José

    2015-01-01

    This study assessed accuracy of surface and underwater 3D reconstruction of a calibration volume with and without homography. A calibration volume (6000 × 2000 × 2500 mm) with 236 markers (64 above and 88 underwater control points—with 8 common points at water surface—and 92 validation points) was positioned on a 25 m swimming pool and recorded with two surface and four underwater cameras. Planar homography estimation for each calibration plane was computed to perform image rectification. Direct linear transformation algorithm for 3D reconstruction was applied, using 1600000 different combinations of 32 and 44 points out of the 64 and 88 control points for surface and underwater markers (resp.). Root Mean Square (RMS) error with homography of control and validations points was lower than without it for surface and underwater cameras (P ≤ 0.03). With homography, RMS errors of control and validation points were similar between surface and underwater cameras (P ≥ 0.47). Without homography, RMS error of control points was greater for underwater than surface cameras (P ≤ 0.04) and the opposite was observed for validation points (P ≤ 0.04). It is recommended that future studies using 3D reconstruction should include homography to improve swimming movement analysis accuracy. PMID:26175796

  18. Diagnostic test accuracy of glutamate dehydrogenase for Clostridium difficile: Systematic review and meta-analysis.

    PubMed

    Arimoto, Jun; Horita, Nobuyuki; Kato, Shingo; Fuyuki, Akiko; Higurashi, Takuma; Ohkubo, Hidenori; Endo, Hiroki; Takashi, Nonaka; Kaneko, Takeshi; Nakajima, Atsushi

    2016-01-01

    We performed this systematic review and meta-analysis to assess the diagnostic accuracy of detecting glutamate dehydrogenase (GDH) for Clostridium difficile infection (CDI) based on the hierarchical model. Two investigators electronically searched four databases. The reference tests were the stool cell cytotoxicity neutralization assay (CCNA) and stool toxigenic culture (TC). To assess the overall accuracy, we calculated the diagnostic odds ratio (DOR) using a DerSimonian-Laird random-effects model and the area under the hierarchical summary receiver operating characteristic curve (AUC) using Holling's proportional hazard models. The summary estimates of the sensitivity and the specificity were obtained using the bivariate model. According to 42 reports consisting of 3055 reference-positive comparisons and 26188 reference-negative comparisons, the DOR was 115 (95%CI: 77-172, I² = 12.0%) and the AUC was 0.970 (95%CI: 0.958-0.982). The summary estimates of sensitivity and specificity were 0.911 (95%CI: 0.871-0.940) and 0.912 (95%CI: 0.892-0.928). The positive and negative likelihood ratios were 10.4 (95%CI 8.4-12.7) and 0.098 (95%CI 0.066-0.142), respectively. Detecting GDH for the diagnosis of CDI had both high sensitivity and specificity. Considering its low cost and prevalence, it is appropriate as a screening test for CDI. PMID:27418431
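
    The per-study quantities pooled in this meta-analysis derive from simple 2x2 algebra; the sketch below computes sensitivity, specificity, likelihood ratios and the diagnostic odds ratio from a single hypothetical GDH-versus-culture table (the counts are invented).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios and diagnostic odds ratio
    from a single 2x2 table (index test vs. reference standard)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1.0 - spec)
    lr_neg = (1.0 - sens) / spec
    dor = lr_pos / lr_neg                      # equals (tp * tn) / (fp * fn)
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical GDH vs. toxigenic-culture counts from a single study
sens, spec, lrp, lrn, dor = diagnostic_metrics(tp=72, fp=18, fn=7, tn=185)
print(f"sensitivity {sens:.3f}, specificity {spec:.3f}, "
      f"LR+ {lrp:.1f}, LR- {lrn:.3f}, DOR {dor:.0f}")
```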

  19. Effects of light refraction on the accuracy of camera calibration and reconstruction in underwater motion analysis.

    PubMed

    Kwon, Young-Hoo; Casebolt, Jeffrey B

    2006-01-01

    One of the most serious obstacles to accurate quantification of the underwater motion of a swimmer's body is image deformation caused by refraction. Refraction occurs at the water-air interface plane (glass) owing to the density difference. Camera calibration-reconstruction algorithms commonly used in aquatic research do not have the capability to correct this refraction-induced nonlinear image deformation and produce large reconstruction errors. The aim of this paper is to provide a thorough review of: the nature of the refraction-induced image deformation and its behaviour in underwater object-space plane reconstruction; the intrinsic shortcomings of the Direct Linear Transformation (DLT) method in underwater motion analysis; experimental conditions that interact with refraction; and alternative algorithms and strategies that can be used to improve the calibration-reconstruction accuracy. Although it is impossible to remove the refraction error completely in conventional camera calibration-reconstruction methods, it is possible to improve the accuracy to some extent by manipulating experimental conditions or calibration frame characteristics. Alternative algorithms, such as the localized DLT and the double-plane method are also available for error reduction. The ultimate solution for the refraction problem is to develop underwater camera calibration and reconstruction algorithms that have the capability to correct refraction. PMID:16521625

  20. Accuracy and Repeatability of the Gait Analysis by the WalkinSense System

    PubMed Central

    de Castro, Marcelo P.; Soares, Denise P.; Borgonovo-Santos, Márcio; Sousa, Filipa; Vilas-Boas, João Paulo

    2014-01-01

    WalkinSense is a new device designed to monitor walking. The aim of this study was to measure the accuracy and repeatability of the gait analysis performed by the WalkinSense system. Descriptions of values recorded by WalkinSense depicting typical gait in adults are also presented. A bench experiment using the Trublu calibration device was conducted to statically test the WalkinSense. Following this, a dynamic test was carried out overlapping the WalkinSense and the Pedar insoles in 40 healthy participants during walking. Pressure peak, pressure peak time, pressure-time integral, and mean pressure at eight foot regions were calculated. In the bench experiments, the repeatability (i) among the WalkinSense sensors (within), (ii) between two WalkinSense devices, and (iii) between the WalkinSense and the Trublu devices was excellent. In the dynamic tests, the repeatability of the WalkinSense (i) between stances in the same trial (within-trial) and (ii) between trials was also excellent (ICC > 0.90). When the eight foot regions were analyzed separately, the within-trial and between-trials repeatability was good-to-excellent in 88% (ICC > 0.80) of the data and fair in 11%. In short, the data suggest that the WalkinSense has good-to-excellent levels of accuracy and repeatability for plantar pressure variables. PMID:24701570

  1. Diagnostic test accuracy of glutamate dehydrogenase for Clostridium difficile: Systematic review and meta-analysis

    PubMed Central

    Arimoto, Jun; Horita, Nobuyuki; Kato, Shingo; Fuyuki, Akiko; Higurashi, Takuma; Ohkubo, Hidenori; Endo, Hiroki; Takashi, Nonaka; Kaneko, Takeshi; Nakajima, Atsushi

    2016-01-01

    We performed this systematic review and meta-analysis to assess the diagnostic accuracy of detecting glutamate dehydrogenase (GDH) for Clostridium difficile infection (CDI) based on the hierarchical model. Two investigators electronically searched four databases. The reference tests were the stool cell cytotoxicity neutralization assay (CCNA) and stool toxigenic culture (TC). To assess the overall accuracy, we calculated the diagnostic odds ratio (DOR) using a DerSimonian-Laird random-effects model and the area under the hierarchical summary receiver operating characteristic curve (AUC) using Holling’s proportional hazard models. The summary estimates of the sensitivity and the specificity were obtained using the bivariate model. According to 42 reports consisting of 3055 reference-positive comparisons and 26188 reference-negative comparisons, the DOR was 115 (95%CI: 77–172, I² = 12.0%) and the AUC was 0.970 (95%CI: 0.958–0.982). The summary estimates of sensitivity and specificity were 0.911 (95%CI: 0.871–0.940) and 0.912 (95%CI: 0.892–0.928). The positive and negative likelihood ratios were 10.4 (95%CI 8.4–12.7) and 0.098 (95%CI 0.066–0.142), respectively. Detecting GDH for the diagnosis of CDI had both high sensitivity and specificity. Considering its low cost and prevalence, it is appropriate as a screening test for CDI. PMID:27418431

  2. Effects of light refraction on the accuracy of camera calibration and reconstruction in underwater motion analysis.

    PubMed

    Kwon, Young-Hoo; Casebolt, Jeffrey B

    2006-07-01

    One of the most serious obstacles to accurate quantification of the underwater motion of a swimmer's body is image deformation caused by refraction. Refraction occurs at the water-air interface plane (glass) owing to the density difference. Camera calibration-reconstruction algorithms commonly used in aquatic research do not have the capability to correct this refraction-induced nonlinear image deformation and produce large reconstruction errors. The aim of this paper is to provide a thorough review of: the nature of the refraction-induced image deformation and its behaviour in underwater object-space plane reconstruction; the intrinsic shortcomings of the Direct Linear Transformation (DLT) method in underwater motion analysis; experimental conditions that interact with refraction; and alternative algorithms and strategies that can be used to improve the calibration-reconstruction accuracy. Although it is impossible to remove the refraction error completely in conventional camera calibration-reconstruction methods, it is possible to improve the accuracy to some extent by manipulating experimental conditions or calibration frame characteristics. Alternative algorithms, such as the localized DLT and the double-plane method are also available for error reduction. The ultimate solution for the refraction problem is to develop underwater camera calibration and reconstruction algorithms that have the capability to correct refraction. PMID:16939159

  3. Diagnostic Accuracy of Noncontrast CT in Detecting Acute Appendicitis: A Meta-analysis of Prospective Studies.

    PubMed

    Xiong, Bing; Zhong, Baishu; Li, Zhenwei; Zhou, Feng; Hu, Ruying; Feng, Zhan; Xu, Shunliang; Chen, Feng

    2015-06-01

    The aim of the study is to evaluate the diagnostic accuracy of noncontrast CT in detecting acute appendicitis. Prospective studies in which noncontrast CT was performed to evaluate acute appendicitis were identified in PubMed, EMBASE, and the Cochrane Library. Pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were assessed. The summary receiver-operating characteristic curve was constructed and the area under the curve was calculated. Seven original studies investigating a total of 845 patients were included in this meta-analysis. The pooled sensitivity and specificity were 0.90 (95% CI: 0.86-0.92) and 0.94 (95% CI: 0.92-0.97), respectively. The pooled positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were 12.90 (95% CI: 4.80-34.67), 0.09 (95% CI: 0.04-0.20), and 162.76 (95% CI: 31.05-853.26), respectively. The summary receiver-operating characteristic curve was symmetrical and the area under the curve was 0.97 (95% CI: 0.95-0.99). In conclusion, noncontrast CT has high diagnostic accuracy in detecting acute appendicitis, which is adequate for clinical decision making. PMID:26031278

  4. Accuracy enhancement of GPS time series using principal component analysis and block spatial filtering

    NASA Astrophysics Data System (ADS)

    He, Xiaoxing; Hua, Xianghong; Yu, Kegen; Xuan, Wei; Lu, Tieding; Zhang, W.; Chen, X.

    2015-03-01

    This paper focuses on performance analysis and accuracy enhancement of long-term position time series of a regional network of GPS stations comprising two nearby sub-blocks, one of 8 stations in the Cascadia region and another of 14 stations in Southern California. We have analyzed the seasonal variations of the 22 IGS site positions between 2004 and 2011. The Green's function approach is used to calculate the station displacements induced by environmental loading due to atmospheric pressure, soil moisture, snow depth and nontidal ocean loading. The analysis reveals that these loading factors can produce position shifts at the centimeter level, that the displacement time series exhibit a periodic pattern explaining about 12.70-21.78% of the seasonal amplitude of the vertical GPS time series, and that the loading effect differs significantly between the two nearby geographical regions. After the loading effect is corrected, the principal component analysis (PCA)-based block spatial filtering is proposed to filter out the remaining common mode error (CME) of the GPS time series. The results show that the PCA-based block spatial filtering extracts the CME more accurately and effectively than the conventional overall filtering method, reducing more of the uncertainty. With the loading correction and block spatial filtering, about 68.34-73.20% of the vertical GPS seasonal power can be separated and removed, improving the reliability of the GPS time series and hence enabling better deformation analysis and higher-precision geodetic applications.
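
    A minimal sketch of the PCA-based common mode error filtering step applied to one block of stations (numpy only; the residual matrix layout, the block definitions and the number of modes removed are illustrative assumptions, not the authors' exact procedure):

      import numpy as np

      def pca_block_filter(residuals, n_modes=1):
          """Remove the leading principal component(s), taken as the common mode
          error (CME), from the residual time series of one block of stations.

          residuals : (n_epochs, n_stations) detrended position residuals of the
                      stations belonging to ONE geographic block
          n_modes   : number of leading modes treated as CME
          """
          centered = residuals - residuals.mean(axis=0)
          # SVD of the centered data matrix: rows of vt are spatial eigenvectors,
          # u*s gives the corresponding temporal principal components.
          u, s, vt = np.linalg.svd(centered, full_matrices=False)
          cme = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes, :]
          return residuals - cme, cme

      # Block spatial filtering = applying this separately to the Cascadia block
      # and to the Southern California block, instead of to all 22 stations at once:
      # filtered_casc, cme_casc = pca_block_filter(res_cascadia)
      # filtered_sc, cme_sc = pca_block_filter(res_socal)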

  5. Modern global models of the Earth's gravity field: analysis of their accuracy and resolution

    NASA Astrophysics Data System (ADS)

    Ganagina, Irina; Karpik, Alexander; Kanushin, Vadim; Goldobin, Denis; Kosareva, Alexandra; Kosarev, Nikolay; Mazurova, Elena

    2015-04-01

    Introduction: Accurate knowledge of the fine structure of the Earth's gravity field extends opportunities in geodynamic problem-solving and high-precision navigation. In the course of our investigations, the resolution and accuracy of 33 modern global models of the Earth's gravity field were analyzed, among them 23 combined models and 10 satellite-only models obtained from the GOCE, GRACE, and CHAMP satellite gravity missions. The geopotential model data, in the form of normalized spherical harmonic coefficients, were taken from the web site of the International Centre for Global Earth Models (ICGEM) in Potsdam. Theory: The accuracy and resolution estimation of global Earth's gravity field models is based on the analysis of the degree variances of the geopotential coefficients and of their errors. For the analyzed models we obtained the dependence of the gravity anomaly approximation errors on the degree of the spherical harmonic expansion of the geopotential, the relative errors of the spherical harmonic coefficients, the degree variances of the coefficients, and the error variances of potential coefficients derived from gravity anomalies. Delphi 7-based software developed by the authors was used for the analysis of the global gravity field models. Experience: The results of the investigations show that the spherical harmonic coefficients of all the models are broadly consistent with one another. Diagrams of the degree variances of the spherical harmonic coefficients and of their errors lead to the conclusion that for most models the degree variances become equal to their error variances at a degree lower than that declared by the developers. The accuracy of the normalized spherical harmonic coefficients of the geopotential models is estimated as 10^-9. This value characterizes both the inherent errors of the models and the differences between the coefficients of the various models, as well as the poorly predicted instability of the geopotential, and the resolution. Furthermore, we compared the gravity anomalies computed by models with those
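
    The degree-variance comparison underlying this analysis is straightforward to reproduce: for each degree n, the signal degree variance is the sum over orders m of the squared normalized coefficients, and the error degree variance is the corresponding sum of squared formal errors. A minimal sketch, assuming the coefficients of any ICGEM model have been read into dense arrays:

      import numpy as np

      def degree_variances(C, S, sigC, sigS):
          """Signal and error degree variances of a geopotential model.

          C, S       : (nmax+1, nmax+1) arrays of normalized coefficients C_nm, S_nm
          sigC, sigS : corresponding formal errors
          Returns c_n (signal) and e_n (error) for n = 0..nmax.
          """
          nmax = C.shape[0] - 1
          c_n = np.array([(C[n, :n + 1]**2 + S[n, :n + 1]**2).sum() for n in range(nmax + 1)])
          e_n = np.array([(sigC[n, :n + 1]**2 + sigS[n, :n + 1]**2).sum() for n in range(nmax + 1)])
          return c_n, e_n

      # The degree at which e_n first reaches c_n gives an estimate of the model's
      # effective resolution, which can then be compared with the developers' claims.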

  6. [Analysis on the accuracy of simple selection method of Fengshi (GB 31)].

    PubMed

    Li, Zhixing; Zhang, Haihua; Li, Suhe

    2015-12-01

    To explore the accuracy of the simple selection method of Fengshi (GB 31). Through the study of ancient and modern data, the analysis and integration of acupuncture books, the comparison of the locations of Fengshi (GB 31) given by doctors of all dynasties, and the integration of modern anatomy, the modern simple selection method of Fengshi (GB 31) is defined, which is the same as the traditional way. It is believed that the simple selection method is in accord with the human-oriented thought of TCM. Treatment by acupoints should be based on the emerging nature and the individual differences of patients. Also, it is proposed that Fengshi (GB 31) should be located through the combination of the simple method and body surface anatomical landmarks. PMID:26964185

  7. Comprehensive Numerical Analysis of Finite Difference Time Domain Methods for Improving Optical Waveguide Sensor Accuracy

    PubMed Central

    Samak, M. Mosleh E. Abu; Bakar, A. Ashrif A.; Kashif, Muhammad; Zan, Mohd Saiful Dzulkifly

    2016-01-01

    This paper discusses numerical analysis methods for different geometrical features that have limited interval values for typically used sensor wavelengths. Compared with existing Finite Difference Time Domain (FDTD) methods, the alternating direction implicit (ADI)-FDTD method reduces the number of sub-steps by a factor of two to three, which represents a 33% time savings in each single run. The local one-dimensional (LOD)-FDTD method has similar numerical equation properties, which should be calculated as in the previous method. Generally, a small number of arithmetic processes, which result in a shorter simulation time, are desired. The alternating direction implicit technique can be considered a significant step forward for improving the efficiency of unconditionally stable FDTD schemes. This comparative study shows that the local one-dimensional method had minimum relative error ranges of less than 40% for analytical frequencies above 42.85 GHz, and the same accuracy was generated by both methods.

  8. Accuracy analysis by using WARIMA model to forecast TEC in China

    NASA Astrophysics Data System (ADS)

    Liu, Lilong; Chen, Jun; Wu, Pituan; Cai, Chenghui; Huang, Liangke

    2015-12-01

    To address the nonlinear and non-stationary characteristics of ionospheric total electron content (TEC), this article incorporates wavelet analysis into the autoregressive integrated moving average (ARIMA) model to forecast the next four days' TEC values, using six days of ionospheric grid observation data over China in 2010 provided by IGS stations. Taking the IGS observations as the true values, the forecasts are compared against them and the forecast accuracies are computed; the results show that the WARIMA model forecasts the ionospheric grid data over China quite well. However, near grid points at geomagnetic latitudes of about +/-20°, the forecast results are somewhat worse than elsewhere, because irregular geomagnetic activity causes the TEC values there to change greatly.
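
    One common way of combining wavelet analysis with ARIMA, sketched below, is to wavelet-denoise the TEC series first and then fit an ARIMA model to the smoothed series for the forecast; the wavelet, threshold rule and ARIMA order here are illustrative choices and not necessarily those used in this study:

      import numpy as np
      import pywt
      from statsmodels.tsa.arima.model import ARIMA

      def warima_forecast(tec_series, steps, wavelet="db4", level=2, order=(1, 1, 1)):
          """Wavelet-denoise a TEC series, then fit ARIMA and forecast `steps` epochs."""
          coeffs = pywt.wavedec(np.asarray(tec_series, float), wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise level estimate
          thr = sigma * np.sqrt(2.0 * np.log(len(tec_series)))  # universal threshold
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          denoised = pywt.waverec(coeffs, wavelet)[: len(tec_series)]
          return ARIMA(denoised, order=order).fit().forecast(steps=steps)

      # e.g. 6 days of 2-hourly IGS grid values at one grid point -> next 4 days:
      # predicted = warima_forecast(tec_6days, steps=48)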

  9. High-accuracy biodistribution analysis of adeno-associated virus variants by double barcode sequencing

    PubMed Central

    Marsic, Damien; Méndez-Gómez, Héctor R; Zolotukhin, Sergei

    2015-01-01

    Biodistribution analysis is a key step in the evaluation of adeno-associated virus (AAV) capsid variants, whether natural isolates or produced by rational design or directed evolution. Indeed, when screening candidate vectors, accurate knowledge about which tissues are infected and how efficiently is essential. We describe the design, validation, and application of a new vector, pTR-UF50-BC, encoding a bioluminescent protein, a fluorescent protein and a DNA barcode, which can be used to visualize localization of transduction at the organism, organ, tissue, or cellular levels. In addition, by linking capsid variants to different barcoded versions of the vector and amplifying the barcode region from various tissue samples using barcoded primers, biodistribution of viral genomes can be analyzed with high accuracy and efficiency. PMID:26793739

  10. Diagnostic Accuracy of Calretinin for Malignant Mesothelioma in Serous Effusions: a Meta-analysis

    PubMed Central

    Li, Diandian; Wang, Bo; Long, Hongyu; Wen, Fuqiang

    2015-01-01

    Numerous studies have investigated the utility of calretinin in differentiating malignant mesothelioma (MM) from metastatic carcinoma (MC) in serous effusions. However, the results remain controversial. The aim of this study is to determine the overall accuracy of calretinin in serous effusions for MM through a meta-analysis of published studies. Publications addressing the accuracy of calretinin in the diagnosis of MM were selected from Medline (Ovid), PubMed, the Cochrane Library Database and the Web of Science. Data from the selected studies were pooled to yield the summary sensitivity, specificity, positive and negative likelihood ratio (LR), diagnostic odds ratio (DOR), and summary receiver operating characteristic (SROC) curve. Statistical analysis was performed with Meta-Disc 1.4 and STATA 12.0 software. Eighteen studies met the inclusion criteria, and the summary estimates for calretinin in the diagnosis of MM were: sensitivity 0.91 (95%CI: 0.87–0.94), specificity 0.96 (95%CI: 0.95–0.96), positive likelihood ratio (PLR) 14.42 (95%CI: 7.92–26.26), negative likelihood ratio (NLR) 0.1 (95%CI: 0.05–0.2) and diagnostic odds ratio 163.03 (95%CI: 54.62–486.63). The SROC curve indicated that the maximum joint sensitivity and specificity (Q-value) was 0.92; the area under the curve was 0.97. Our findings suggest that calretinin may be a useful diagnostic tool for confirming MM in serous effusions. PMID:25821016

  11. Slight pressure imbalances can affect accuracy and precision of dual inlet-based clumped isotope analysis.

    PubMed

    Fiebig, Jens; Hofmann, Sven; Löffler, Niklas; Lüdecke, Tina; Methner, Katharina; Wacker, Ulrike

    2016-01-01

    It is well known that a subtle nonlinearity can occur during clumped isotope analysis of CO2 that - if remaining unaddressed - limits accuracy. The nonlinearity is induced by a negative background on the m/z 47 ion Faraday cup, whose magnitude is correlated with the intensity of the m/z 44 ion beam. The origin of the negative background remains unclear, but is possibly due to secondary electrons. Usually, CO2 gases of distinct bulk isotopic compositions are equilibrated at 1000 °C and measured along with the samples in order to be able to correct for this effect. Alternatively, measured m/z 47 beam intensities can be corrected for the contribution of secondary electrons after monitoring how the negative background on m/z 47 evolves with the intensity of the m/z 44 ion beam. The latter correction procedure seems to work well if the m/z 44 cup exhibits a wider slit width than the m/z 47 cup. Here we show that the negative m/z 47 background affects precision of dual inlet-based clumped isotope measurements of CO2 unless raw m/z 47 intensities are directly corrected for the contribution of secondary electrons. Moreover, inaccurate results can be obtained even if the heated gas approach is used to correct for the observed nonlinearity. The impact of the negative background on accuracy and precision arises from small imbalances in m/z 44 ion beam intensities between reference and sample CO2 measurements. It becomes the more significant the larger the relative contribution of secondary electrons to the m/z 47 signal is and the higher the flux rate of CO2 into the ion source is set. These problems can be overcome by correcting the measured m/z 47 ion beam intensities of sample and reference gas for the contributions deriving from secondary electrons after scaling these contributions to the intensities of the corresponding m/z 49 ion beams. Accuracy and precision of this correction are demonstrated by clumped isotope analysis of three internal carbonate standards. The

  12. Computer-aided analysis of star shot films for high-accuracy radiation therapy treatment units

    NASA Astrophysics Data System (ADS)

    Depuydt, Tom; Penne, Rudi; Verellen, Dirk; Hrbacek, Jan; Lang, Stephanie; Leysen, Katrien; Vandevondel, Iwein; Poels, Kenneth; Reynders, Truus; Gevaert, Thierry; Duchateau, Michael; Tournel, Koen; Boussaer, Marlies; Cosentino, Dorian; Garibaldi, Cristina; Solberg, Timothy; De Ridder, Mark

    2012-05-01

    As mechanical stability of radiation therapy treatment devices has gone beyond sub-millimeter levels, there is a rising demand for simple yet highly accurate measurement techniques to support the routine quality control of these devices. A combination of using high-resolution radiosensitive film and computer-aided analysis could provide an answer. One generally known technique is the acquisition of star shot films to determine the mechanical stability of rotations of gantries and the therapeutic beam. With computer-aided analysis, mechanical performance can be quantified as a radiation isocenter radius size. In this work, computer-aided analysis of star shot film is further refined by applying an analytical solution for the smallest intersecting circle problem, in contrast to the gradient optimization approaches used until today. An algorithm is presented and subjected to a performance test using two different types of radiosensitive film, the Kodak EDR2 radiographic film and the ISP EBT2 radiochromic film. Artificial star shots with a priori known radiation isocenter size are used to determine the systematic errors introduced by the digitization of the film and the computer analysis. The estimated uncertainty on the isocenter size measurement with the presented technique was 0.04 mm (2σ) and 0.06 mm (2σ) for radiographic and radiochromic films, respectively. As an application of the technique, a study was conducted to compare the mechanical stability of O-ring gantry systems with C-arm-based gantries. In total ten systems of five different institutions were included in this study and star shots were acquired for gantry, collimator, ring, couch rotations and gantry wobble. It was not possible to draw general conclusions about differences in mechanical performance between O-ring and C-arm gantry systems, mainly due to differences in the beam-MLC alignment procedure accuracy. Nevertheless, the best performing O-ring system in this study, a BrainLab/MHI Vero system
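
    For intuition, the quantity being computed from a star shot is the smallest circle intersecting all beam spokes, i.e. the centre minimizing the largest point-to-line distance. The sketch below solves this numerically with a generic optimizer; it does not reproduce the analytical solution that is the contribution of the paper.

      # Numerical sketch of the star-shot isocentre problem: each beam spoke is a
      # line a*x + b*y = c fitted to the film trace (with a**2 + b**2 = 1); the
      # radiation isocentre radius is the minimum over candidate centres of the
      # maximum distance from the centre to any spoke line.
      import numpy as np
      from scipy.optimize import minimize

      def isocentre_radius(lines, x0=(0.0, 0.0)):
          """lines: iterable of (a, b, c) normalized line coefficients, one per spoke."""
          lines = np.asarray(lines, dtype=float)

          def max_dist(p):
              return np.max(np.abs(lines[:, 0] * p[0] + lines[:, 1] * p[1] - lines[:, 2]))

          res = minimize(max_dist, x0, method="Nelder-Mead")
          return res.x, res.fun   # optimal centre and isocentre radius

      # Spokes that all pass exactly through one point would give a radius near zero;
      # real films give a small positive radius quantifying mechanical stability.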

  13. Diagnostic Accuracy of PIK3CA Mutation Detection by Circulating Free DNA in Breast Cancer: A Meta-Analysis of Diagnostic Test Accuracy

    PubMed Central

    Zhu, Hanjiang; Lin, Yan; Pan, Bo; Zhang, Xiaohui; Huang, Xin; Xu, Qianqian; Xu, Yali; Sun, Qiang

    2016-01-01

    Mutation of p110 alpha-catalytic subunit of phosphatidylinositol 3-kinase (PIK3CA) has high predictive and prognostic values for breast cancer. Hence, there has been a marked interest in detecting and monitoring PIK3CA genotype with non-invasive technique, such as circulating free DNA (cfDNA). However, the diagnostic accuracy of PIK3CA genotyping by cfDNA is still a problem of controversy. Here, we conducted the first meta-analysis to evaluate overall diagnostic performance of cfDNA for PIK3CA mutation detection. Literature search was performed in Pubmed, Embase and Cochrane Central Register of Controlled Trials databases. Seven cohorts from five studies with 247 patients were included. The pooled sensitivity, specificity, positive and negative likelihood ratio, diagnostic odds ratio and area under summary receiver operating characteristic curve were calculated for accuracy evaluation. The pooled sensitivity and specificity were 0.86 (95% confidence interval [CI] 0.32–0.99) and 0.98 (95% CI 0.86–1.00), respectively; the pooled positive and negative likelihood ratio were 42.8 (95% CI 5.1–356.9) and 0.14 (95% CI 0.02–1.34), respectively; diagnostic odds ratio for evaluating the overall diagnostic performance was 300 (95% CI 8–11867); area under summary receiver operating characteristic curve reached 0.99 (95% CI 0.97–0.99). Subgroup analysis with metastatic breast cancer revealed remarkable improvement in diagnostic performance (sensitivity: 0.86–0.91; specificity: 0.98; diagnostic odds ratio: 300–428). This meta-analysis proved that detecting PIK3CA gene mutation by cfDNA has high diagnostic accuracy in breast cancer, especially for metastatic breast cancer. It may serve as a reliable non-invasive assay for detecting and monitoring PIK3CA mutation status in order to deliver personalized and precise treatment. PMID:27336598

  14. Evaluating the accuracy of molecular diagnostic testing for canine visceral leishmaniasis using latent class analysis.

    PubMed

    Solcà, Manuela da Silva; Bastos, Leila Andrade; Guedes, Carlos Eduardo Sampaio; Bordoni, Marcelo; Borja, Lairton Souza; Larangeira, Daniela Farias; da Silva Estrela Tuy, Pétala Gardênia; Amorim, Leila Denise Alves Ferreira; Nascimento, Eliane Gomes; de Sá Oliveira, Geraldo Gileno; dos-Santos, Washington Luis Conrado; Fraga, Deborah Bittencourt Mothé; Veras, Patrícia Sampaio Tavares

    2014-01-01

    Host tissues affected by Leishmania infantum have differing degrees of parasitism. Previously, the use of different biological tissues to detect L. infantum DNA in dogs has provided variable results. The present study was conducted to evaluate the accuracy of molecular diagnostic testing (qPCR) in dogs from an endemic area for canine visceral leishmaniasis (CVL) by determining which tissue type provided the highest rate of parasite DNA detection. Fifty-one symptomatic dogs were tested for CVL using serological, parasitological and molecular methods. Latent class analysis (LCA) was performed for accuracy evaluation of these methods. qPCR detected parasite DNA in 100% of these animals from at least one of the following tissues: splenic and bone marrow aspirates, lymph node and skin fragments, blood and conjunctival swabs. Using latent variable as gold standard, the qPCR achieved a sensitivity of 95.8% (CI 90.4-100) in splenic aspirate; 79.2% (CI 68-90.3) in lymph nodes; 77.3% (CI 64.5-90.1) in skin; 75% (CI 63.1-86.9) in blood; 50% (CI 30-70) in bone marrow; 37.5% (CI 24.2-50.8) in left-eye; and 29.2% (CI 16.7-41.6) in right-eye conjunctival swabs. The accuracy of qPCR using splenic aspirates was further evaluated in a random larger sample (n = 800), collected from dogs during a prevalence study. The specificity achieved by qPCR was 76.7% (CI 73.7-79.6) for splenic aspirates obtained from the greater sample. The sensitivity accomplished by this technique was 95% (CI 93.5-96.5) that was higher than those obtained for the other diagnostic tests and was similar to that observed in the smaller sampling study. This confirms that the splenic aspirate is the most effective type of tissue for detecting L. infantum infection. Additionally, we demonstrated that LCA could be used to generate a suitable gold standard for comparative CVL testing. PMID:25076494

  15. Evaluating the Accuracy of Molecular Diagnostic Testing for Canine Visceral Leishmaniasis Using Latent Class Analysis

    PubMed Central

    Solcà, Manuela da Silva; Bastos, Leila Andrade; Guedes, Carlos Eduardo Sampaio; Bordoni, Marcelo; Borja, Lairton Souza; Larangeira, Daniela Farias; da Silva Estrela Tuy, Pétala Gardênia; Amorim, Leila Denise Alves Ferreira; Nascimento, Eliane Gomes; de Sá Oliveira, Geraldo Gileno; dos-Santos, Washington Luis Conrado; Fraga, Deborah Bittencourt Mothé; Veras, Patrícia Sampaio Tavares

    2014-01-01

    Host tissues affected by Leishmania infantum have differing degrees of parasitism. Previously, the use of different biological tissues to detect L. infantum DNA in dogs has provided variable results. The present study was conducted to evaluate the accuracy of molecular diagnostic testing (qPCR) in dogs from an endemic area for canine visceral leishmaniasis (CVL) by determining which tissue type provided the highest rate of parasite DNA detection. Fifty-one symptomatic dogs were tested for CVL using serological, parasitological and molecular methods. Latent class analysis (LCA) was performed for accuracy evaluation of these methods. qPCR detected parasite DNA in 100% of these animals from at least one of the following tissues: splenic and bone marrow aspirates, lymph node and skin fragments, blood and conjunctival swabs. Using latent variable as gold standard, the qPCR achieved a sensitivity of 95.8% (CI 90.4–100) in splenic aspirate; 79.2% (CI 68–90.3) in lymph nodes; 77.3% (CI 64.5–90.1) in skin; 75% (CI 63.1–86.9) in blood; 50% (CI 30–70) in bone marrow; 37.5% (CI 24.2–50.8) in left-eye; and 29.2% (CI 16.7–41.6) in right-eye conjunctival swabs. The accuracy of qPCR using splenic aspirates was further evaluated in a random larger sample (n = 800), collected from dogs during a prevalence study. The specificity achieved by qPCR was 76.7% (CI 73.7–79.6) for splenic aspirates obtained from the greater sample. The sensitivity accomplished by this technique was 95% (CI 93.5–96.5) that was higher than those obtained for the other diagnostic tests and was similar to that observed in the smaller sampling study. This confirms that the splenic aspirate is the most effective type of tissue for detecting L. infantum infection. Additionally, we demonstrated that LCA could be used to generate a suitable gold standard for comparative CVL testing. PMID:25076494

  16. Issues of model accuracy and uncertainty evaluation in the context of multi-model analysis

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Foglia, L.; Mehl, S.; Burlando, P.

    2009-12-01

    Thorough consideration of alternative conceptual models is an important and often neglected step in the study of many natural systems, including groundwater systems. This means that many modelling efforts are less useful for system management than they could be because they exclude alternatives considered important by some stakeholders, which makes them more vulnerable to criticism. Important steps include identifying reasonable alternative models and possibly using model discrimination criteria and associated model averaging to improve predictions and measures of prediction uncertainty. Here we use the computer code MMA (Multi-Model Analysis) to: (1) manage the model discrimination statistics produced by many alternative models, (2) manage predictions, and (3) calculate measures of prediction uncertainty. Steps (1) to (3) also assist in understanding the physical processes most important to model fit and to the predictions of interest. We focus on the ability of a groundwater model constructed using MODFLOW to predict heads and flows in the Maggia Valley, Southern Switzerland, where connections between groundwater, surface water and ecology are of interest. Sixty-four alternative models were designed deterministically and differ in how the river, recharge, bedrock topography, and hydraulic conductivity are characterized. None of the models correctly represents heads and flows in the northern and southern parts of the valley simultaneously. A cross-validation experiment was conducted to compare model discrimination results with the ability of the models to predict eight heads and three flows to the stream along three reaches midway along the valley, where ecological consequences and, therefore, model accuracy are of great concern. Results suggest: (1) Model averaging appears to have improved prediction accuracy in the problem considered. (2) The most significant model improvements occurred with the introduction of spatially distributed recharge and improved bedrock topography. (3) The
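
    The model-averaging step can be illustrated with standard information-criterion weighting; the sketch below uses AICc-based Akaike weights as one common choice (MMA supports several discrimination criteria, and all values here are placeholders):

      import numpy as np

      def akaike_weights(aicc):
          """Convert AICc values of alternative models into model-averaging weights."""
          aicc = np.asarray(aicc, dtype=float)
          delta = aicc - aicc.min()
          w = np.exp(-0.5 * delta)
          return w / w.sum()

      # Weighted-average prediction and a simple between-model variance term:
      # aicc  = np.array([...])   # one value per calibrated alternative model
      # preds = np.array([...])   # the same prediction made by each model
      # w = akaike_weights(aicc)
      # p_avg = np.sum(w * preds)
      # between_model_var = np.sum(w * (preds - p_avg) ** 2)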

  17. Summary of Glaucoma Diagnostic Testing Accuracy: An Evidence-Based Meta-Analysis

    PubMed Central

    Ahmed, Saad; Khan, Zainab; Si, Francie; Mao, Alex; Pan, Irene; Yazdi, Fatemeh; Tsertsvadze, Alexander; Hutnik, Cindy; Moher, David; Tingey, David; Trope, Graham E.; Damji, Karim F.; Tarride, Jean-Eric; Goeree, Ron; Hodge, William

    2016-01-01

    Background New glaucoma diagnostic technologies are penetrating clinical care and are changing rapidly. Having a systematic review of these technologies will help clinicians and decision makers and help identify gaps that need to be addressed. This systematic review studied five glaucoma technologies compared to the gold standard of white on white perimetry for glaucoma detection. Methods OVID® interface: MEDLINE® (In-Process & Other Non-Indexed Citations), EMBASE®, BIOSIS Previews®, CINAHL®, PubMed, and the Cochrane Library were searched. A gray literature search was also performed. A technical expert panel, information specialists, systematic review method experts and biostatisticians were used. A PRISMA flow diagram was created and a random effect meta-analysis was performed. Results A total of 2,474 articles were screened. The greatest accuracy was found with frequency doubling technology (FDT) (diagnostic odds ratio (DOR): 57.7) followed by blue on yellow perimetry (DOR: 46.7), optical coherence tomography (OCT) (DOR: 41.8), GDx (DOR: 32.4) and Heidelberg retina tomography (HRT) (DOR: 17.8). Of greatest concern is that tests for heterogeneity were all above 50%, indicating that cutoffs used in these newer technologies were all very varied and not uniform across studies. Conclusions Glaucoma content experts need to establish uniform cutoffs for these newer technologies, so that studies that compare these technologies can be interpreted more uniformly. Nevertheless, synthesized data at this time demonstrate that amongst the newest technologies, OCT has the highest glaucoma diagnostic accuracy followed by GDx and then HRT. PMID:27540437

  18. Analysis of the Accuracy of Ballistic Descent from a Circular Circumterrestrial Orbit

    NASA Astrophysics Data System (ADS)

    Sikharulidze, Yu. G.; Korchagin, A. N.

    2002-01-01

    The problem of transporting the results of experiments and observations back to Earth arises every so often in space research. Its simplest and lowest-cost solution is the use of a small ballistic reentry spacecraft. Such a spacecraft has no system for controlling the descent trajectory in the atmosphere. This can result in a large spread of landing points, which makes it difficult to search for the spacecraft and, very often, to ensure a safe landing. In this work, the choice of a compromise flight scheme is considered, which includes an optimum braking maneuver, adequate conditions of entry into the atmosphere with limited heating and overload, and the possibility of landing within a circle with a radius of 12.5 km. The following disturbing factors were taken into account in the analysis of landing accuracy: the errors of the braking impulse execution, variations of the atmospheric density and the wind, the error in the specification of the ballistic coefficient of the reentry spacecraft, and a displacement of its center of mass from the symmetry axis. It is demonstrated that the optimum maneuver assures the maximum absolute value of the reentry angle and the insensitivity of the descent trajectory to small errors in the orientation of the braking engine in the plane of the orbit. It is also demonstrated that the possible error of the landing point due to the error in the specification of the ballistic coefficient does not depend (in the linear approximation) on its value, but only on the reentry angle and the accuracy of specification of this coefficient. A guided parachute with an aerodynamic efficiency of about two should be used on the last leg of the reentry trajectory. This will allow one to land within the prescribed range and to provide adequate conditions for the interception of the reentry spacecraft by a helicopter in order to prevent a rough landing.

  19. Accuracy of different oxygenation indices in estimating intrapulmonary shunting at increasing infusion rates of dobutamine in horses under general anaesthesia.

    PubMed

    Briganti, A; Portela, D A; Grasso, S; Sgorbini, M; Tayari, H; Bassini, J R Fusar; Vitale, V; Romano, M S; Crovace, A; Breghi, G; Staffieri, F

    2015-06-01

    The aim of this study was to evaluate the correlation of commonly used oxygenation indices with venous admixture (Qs/Qt) in anaesthetised horses under different infusion rates of dobutamine. Six female horses were anaesthetised with acepromazine, xylazine, diazepam, ketamine, and isoflurane, and then intubated and mechanically ventilated with 100% O2. A Swan-Ganz catheter was introduced into the left jugular vein and its tip advanced into the pulmonary artery. Horses received different standardised rates of dobutamine. For each horse, eight samples of arterial and mixed venous blood were simultaneously obtained at fixed times. Arterial and venous haemoglobin (Hb) concentration and O2 saturation, arterial oxygen partial pressure (PaO2), venous oxygen partial pressure (PvO2), and barometric pressure were measured. Arterial (CaO2), mixed venous (CvO2), and capillary (Cc'O2) oxygen contents were calculated using standard formulae. The correlations of F-shunt, the arterial oxygen tension to fraction of inspired oxygen ratio (PaO2/FiO2), the arterial to alveolar oxygen tension ratio (PaO2/PAO2), the alveolar to arterial oxygen tension difference (P[A - a]O2), and the respiratory index (P[A - a]O2/PaO2) with Qs/Qt were tested with linear regression analysis. The goodness-of-fit of each calculated formula was evaluated by means of the coefficient of determination (r(2)). The agreement between Qs/Qt and F-shunt was analysed with the Bland-Altman test. All tested oxygen tension-based indices were weakly correlated (r(2) < 0.2) with Qs/Qt, whereas F-shunt showed a stronger correlation (r(2) = 0.73). F-shunt also showed substantial agreement with Qs/Qt, independent of the dobutamine infusion rate. F-shunt correlated better with Qs/Qt than the other oxygenation indices in isoflurane-anaesthetised horses under different infusion rates of dobutamine. PMID:25920771
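
    For reference, the oxygen-content and shunt relations behind these indices are standard textbook formulae; the sketch below uses commonly quoted constants (1.34 mL O2 per g of haemoglobin, 0.003 mL O2/dL/mmHg of dissolved oxygen, and the fixed 3.5 mL/dL arteriovenous content difference of the F-shunt approximation) rather than values from this study:

      def o2_content(hb, so2, po2):
          """Oxygen content (mL O2/dL): Hb in g/dL, saturation as a fraction, PO2 in mmHg."""
          return 1.34 * hb * so2 + 0.003 * po2

      def qs_qt(cc_o2, ca_o2, cv_o2):
          """Classical venous admixture (shunt) equation."""
          return (cc_o2 - ca_o2) / (cc_o2 - cv_o2)

      def f_shunt(cc_o2, ca_o2):
          """F-shunt: assumes a fixed arteriovenous content difference of 3.5 mL/dL,
          so no mixed venous sample is required."""
          return (cc_o2 - ca_o2) / ((cc_o2 - ca_o2) + 3.5)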

  20. Integrating Landsat and California pesticide exposure estimation at aggregated analysis scales: Accuracy assessment of rurality

    NASA Astrophysics Data System (ADS)

    Vopham, Trang Minh

    Pesticide exposure estimation in epidemiologic studies can be constrained to analysis scales commonly available for cancer data - census tracts and ZIP codes. Research goals included (1) demonstrating the feasibility of modifying an existing geographic information system (GIS) pesticide exposure method using California Pesticide Use Reports (PURs) and land use surveys to incorporate Landsat remote sensing and to accommodate aggregated analysis scales, and (2) assessing the accuracy of two rurality metrics (quality of geographic area being rural), Rural-Urban Commuting Area (RUCA) codes and the U.S. Census Bureau urban-rural system, as surrogates for pesticide exposure when compared to the GIS gold standard. Segments, derived from 1985 Landsat NDVI images, were classified using a crop signature library (CSL) created from 1990 Landsat NDVI images via a sum of squared differences (SSD) measure. Organochlorine, organophosphate, and carbamate Kern County PUR applications (1974-1990) were matched to crop fields using a modified three-tier approach. Annual pesticide application rates (lb/ac), and sensitivity and specificity of each rurality metric were calculated. The CSL (75 land use classes) classified 19,752 segments [median SSD 0.06 NDVI]. Of the 148,671 PUR records included in the analysis, Landsat contributed 3,750 (2.5%) additional tier matches. ZIP Code Tabulation Area (ZCTA) rates ranged between 0 and 1.36 lb/ac and census tract rates between 0 and 1.57 lb/ac. Rurality was a mediocre pesticide exposure surrogate; higher rates were observed among urban areal units. ZCTA-level RUCA codes offered greater specificity (39.1-60%) and sensitivity (25-42.9%). The U.S. Census Bureau metric offered greater specificity (92.9-97.5%) at the census tract level; sensitivity was low (≤6%). The feasibility of incorporating Landsat into a modified three-tier GIS approach was demonstrated. Rurality accuracy is affected by rurality metric, areal aggregation, pesticide chemical

  1. Accuracy in Rietveld quantitative phase analysis: a comparative study of strictly monochromatic Mo and Cu radiations

    PubMed Central

    León-Reina, L.; García-Maté, M.; Álvarez-Pinazo, G.; Santacruz, I.; Vallcorba, O.; De la Torre, A. G.; Aranda, M. A. G.

    2016-01-01

    This study reports 78 Rietveld quantitative phase analyses using Cu Kα1, Mo Kα1 and synchrotron radiations. Synchrotron powder diffraction has been used to validate the most challenging analyses. From the results for three series with increasing contents of an analyte (an inorganic crystalline phase, an organic crystalline phase and a glass), it is inferred that Rietveld analyses from high-energy Mo Kα1 radiation have slightly better accuracies than those obtained from Cu Kα1 radiation. This behaviour has been established from the results of the calibration graphics obtained through the spiking method and also from Kullback–Leibler distance statistic studies. This outcome is explained, in spite of the lower diffraction power for Mo radiation when compared to Cu radiation, as arising because of the larger volume tested with Mo and also because higher energy allows one to record patterns with fewer systematic errors. The limit of detection (LoD) and limit of quantification (LoQ) have also been established for the studied series. For similar recording times, the LoDs in Cu patterns, ∼0.2 wt%, are slightly lower than those derived from Mo patterns, ∼0.3 wt%. The LoQ for a well crystallized inorganic phase using laboratory powder diffraction was established to be close to 0.10 wt% in stable fits with good precision. However, the accuracy of these analyses was poor with relative errors near to 100%. Only contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%. PMID:27275132

  2. The Efficacy of Written Corrective Feedback in Improving L2 Written Accuracy: A Meta-Analysis

    ERIC Educational Resources Information Center

    Kang, EunYoung; Han, Zhaohong

    2015-01-01

    Written corrective feedback has been subject to increasing attention in recent years, in part because of the conceptual controversy surrounding it and in part because of its ubiquitous practice. This study takes a meta-analytic approach to synthesizing extant empirical research, including 21 primary studies. Guiding the analysis are two questions:…

  3. Measuring Speech Recognition Proficiency: A Psychometric Analysis of Speed and Accuracy

    ERIC Educational Resources Information Center

    Rader, Martha H.; Bailey, Glenn A.; Kurth, Linda A.

    2008-01-01

    This study examined the validity of various measures of speed and accuracy for assessing proficiency in speech recognition. The study specifically compared two different word-count indices for speed and accuracy (the 5-stroke word and the 1.4-syllable standard word) on a timing administered to 114 speech recognition students measured at 1-, 2-,…

  4. Accuracy analysis of direct georeferenced UAV images utilising low-cost navigation sensors

    NASA Astrophysics Data System (ADS)

    Briese, Christian; Wieser, Martin; Verhoeven, Geert; Glira, Philipp; Doneus, Michael; Pfeifer, Norbert

    2014-05-01

    control points should be used to improve the estimated values, especially to decrease the amount of systematic errors. For the bundle block adjustment, the calibration of the camera and its temporal stability must additionally be determined. This contribution presents, in addition to the theory, a practical study on the accuracy analysis of directly georeferenced UAV imagery acquired with low-cost navigation sensors. The analysis was carried out within the research project ARAP (automated (ortho)rectification of archaeological aerial photographs). The utilised UAS consists of the airplane "MAJA", manufactured by "Bormatec" (length: 1.2 m, wingspan: 2.2 m), equipped with the autopilot "ArduPilot Mega 2.5". For image acquisition the camera "Ricoh GR Digital IV" is utilised. The autopilot includes a GNSS receiver capable of DGPS (EGNOS), an inertial measurement system (INS), a barometer, and a magnetometer. In the study, the achieved accuracies for the estimated position and orientation of the images are presented. The paper concludes with a summary of the remaining error sources and their possible correction through further improvements to the utilised equipment and the direct georeferencing process.

  5. An Original Stepwise Multilevel Logistic Regression Analysis of Discriminatory Accuracy: The Case of Neighbourhoods and Health

    PubMed Central

    Wagner, Philippe; Ghith, Nermin; Leckie, George

    2016-01-01

    Background and Aim Many multilevel logistic regression analyses of “neighbourhood and health” focus on interpreting measures of associations (e.g., odds ratio, OR). In contrast, multilevel analysis of variance is rarely considered. We propose an original stepwise analytical approach that distinguishes between “specific” (measures of association) and “general” (measures of variance) contextual effects. Performing two empirical examples, we illustrate the methodology, interpret the results and discuss the implications of this kind of analysis in public health. Methods We analyse 43,291 individuals residing in 218 neighbourhoods in the city of Malmö, Sweden in 2006. We study two individual outcomes (psychotropic drug use and choice of private vs. public general practitioner, GP) for which the relative importance of neighbourhood as a source of individual variation differs substantially. In Step 1 of the analysis, we evaluate the OR and the area under the receiver operating characteristic (AUC) curve for individual-level covariates (i.e., age, sex and individual low income). In Step 2, we assess general contextual effects using the AUC. Finally, in Step 3 the OR for a specific neighbourhood characteristic (i.e., neighbourhood income) is interpreted jointly with the proportional change in variance (i.e., PCV) and the proportion of ORs in the opposite direction (POOR) statistics. Results For both outcomes, information on individual characteristics (Step 1) provides low discriminatory accuracy (AUC = 0.616 for psychotropic drug use; AUC = 0.600 for choosing a private GP). Accounting for neighbourhood of residence (Step 2) only improved the AUC for choosing a private GP (+0.295 units). High neighbourhood income (Step 3) was strongly associated with choosing a private GP (OR = 3.50), but the PCV was only 11% and the POOR 33%. Conclusion Applying an innovative stepwise multilevel analysis, we observed that, in Malmö, the neighbourhood context per se had a negligible
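
    Two of the quantities used in this stepwise interpretation are simple to compute once the multilevel models are fitted: the AUC of each model's predicted probabilities and the proportional change in the neighbourhood-level variance (PCV) between consecutive models. A minimal sketch with placeholder inputs (the POOR statistic is omitted):

      from sklearn.metrics import roc_auc_score

      def pcv(var_initial, var_extended):
          """Proportional change (in %) in the neighbourhood-level variance when
          moving from one model to a model with additional covariates."""
          return 100.0 * (var_initial - var_extended) / var_initial

      # auc_step1 = roc_auc_score(y, p_hat_individual_only)     # e.g. ~0.60-0.62
      # auc_step2 = roc_auc_score(y, p_hat_with_neighbourhood)
      # pcv_step3 = pcv(var_nbhd_before, var_nbhd_after)         # e.g. ~11% above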

  6. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    NASA Astrophysics Data System (ADS)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2016-03-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation system (SBAS) (i.e. European Geostationary Navigation Overlay System (EGNOS) and wide area augmentation system (WAAS)), a number of post-process global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcasted in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.

  7. Analysis of Scattering Components from Fully Polarimetric SAR Images for Improving Accuracies of Urban Density Estimation

    NASA Astrophysics Data System (ADS)

    Susaki, J.

    2016-06-01

    In this paper, we analyze probability density functions (PDFs) of scatterings derived from fully polarimetric synthetic aperture radar (SAR) images for improving the accuracies of estimated urban density. We have reported a method for estimating urban density that uses an index Tv+c obtained by normalizing the sum of volume and helix scatterings Pv+c. Validation results showed that estimated urban densities have a high correlation with building-to-land ratios (Kajimoto and Susaki, 2013b; Susaki et al., 2014). While the method is found to be effective for estimating urban density, it is not clear why Tv+c is more effective than indices derived from other scatterings, such as surface or double-bounce scatterings, observed in urban areas. In this research, we focus on PDFs of scatterings derived from fully polarimetric SAR images in terms of scattering normalization. First, we introduce a theoretical PDF that assumes that image pixels have scatterers showing random backscattering. We then generate PDFs of scatterings derived from observations of concrete blocks with different orientation angles, and from a satellite-based fully polarimetric SAR image. The analysis of the PDFs and the derived statistics reveals that the curves of the PDFs of Pv+c are the most similar to the normal distribution among all the scatterings derived from fully polarimetric SAR images. It was found that Tv+c works most effectively because of its similarity to the normal distribution.

  8. The psychology of intelligence analysis: drivers of prediction accuracy in world politics.

    PubMed

    Mellers, Barbara; Stone, Eric; Atanasov, Pavel; Rohrbaugh, Nick; Metz, S Emlen; Ungar, Lyle; Bishop, Michael M; Horowitz, Michael; Merkle, Ed; Tetlock, Philip

    2015-03-01

    This article extends psychological methods and concepts into a domain that is as profoundly consequential as it is poorly understood: intelligence analysis. We report findings from a geopolitical forecasting tournament that assessed the accuracy of more than 150,000 forecasts of 743 participants on 199 events occurring over 2 years. Participants were above average in intelligence and political knowledge relative to the general population. Individual differences in performance emerged, and forecasting skills were surprisingly consistent over time. Key predictors were (a) dispositional variables of cognitive ability, political knowledge, and open-mindedness; (b) situational variables of training in probabilistic reasoning and participation in collaborative teams that shared information and discussed rationales (Mellers, Ungar, et al., 2014); and (c) behavioral variables of deliberation time and frequency of belief updating. We developed a profile of the best forecasters; they were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness. They had greater understanding of geopolitics, training in probabilistic reasoning, and opportunities to succeed in cognitively enriched team environments. Last but not least, they viewed forecasting as a skill that required deliberate practice, sustained effort, and constant monitoring of current affairs. PMID:25581088

  9. Accuracy analysis of mimetic finite volume operators on geodesic grids and a consistent alternative

    NASA Astrophysics Data System (ADS)

    Peixoto, Pedro S.

    2016-04-01

    Many newly developed climate, weather and ocean global models are based on quasi-uniform spherical polygonal grids, aiming for high resolution and better scalability. Thuburn et al. (2009) and Ringler et al. (2010) developed a C staggered finite volume/difference method for arbitrary polygonal spherical grids suitable for these next generation dynamical cores. This method has many desirable mimetic properties and became popular, being adopted in some recent models, in spite of being known to possess low order of accuracy. In this work, we show that, for the nonlinear shallow water equations on non-uniform grids, the method has potentially 3 main sources of inconsistencies (local truncation errors not converging to zero as the grid is refined): (i) the divergence term of the continuity equation, (ii) the perpendicular velocity and (iii) the kinetic energy terms of the vector invariant form of the momentum equations. Although some of these inconsistencies have not impacted the convergence on some standard shallow water test cases up until now, they may constitute a potential problem for high resolution 3D models. Based on our analysis, we propose modifications for the method that will make it first order accurate in the maximum norm. It preserves many of the mimetic properties, albeit having non-steady geostrophic modes on the f-sphere. Experimental results show that the resulting model is a more accurate alternative to the existing formulations and should provide means of having a consistent, computationally cheap and scalable atmospheric or ocean model on C staggered Voronoi grids.

  10. Objective analysis of the Gulf Stream thermal front: methods and accuracy. Technical report

    SciTech Connect

    Tracey, K.L.; Friedlander, A.I.; Watts, R.

    1987-12-01

    The objective-analysis (OA) technique was adapted by Watts and Tracey in order to map the thermal frontal zone of the Gulf Stream. Here, the authors test the robustness of the adapted OA technique to the selection of four control parameters: mean field, standard deviation field, correlation function, and decimation time. Output OA maps of the thermocline depth are most affected by the choice of mean field, with the most-realistic results produced using a time-averaged mean. The choice of the space-time correlation function has a large influence on the size of the estimated error fields, which are associated with the OA maps. The smallest errors occur using the analytic function based on 4 years of inverted echo sounder data collected in the same region of the Gulf Stream. Variations in the selection of the standard deviation field and decimation time have little effect on the output OA maps. Accuracy of the output OA maps is determined by comparing them with independent measurements of the thermal field. Two cases are evaluated: standard maps and high-temporal-resolution maps, with decimation times of 2 days and 1 day, respectively. Standard deviations (STD) between the standard maps at the 15% estimated error level and the XBTs (AXBTs) are determined to be 47-53 m. Comparisons of the high-temporal-resolution maps at the 20% error level with the XBTs (AXBTs) give STD differences of 47 m.

  11. High Accuracy Passive Magnetic Field-Based Localization for Feedback Control Using Principal Component Analysis.

    PubMed

    Foong, Shaohui; Sun, Zhenglong

    2016-01-01

    In this paper, a novel magnetic field-based sensing system employing statistically optimized concurrent multiple sensor outputs for precise field-position association and localization is presented. This method capitalizes on the independence between simultaneous spatial field measurements at multiple locations to induce unique correspondences between field and position. This single-source-multi-sensor configuration is able to achieve accurate and precise localization and tracking of translational motion without contact over large travel distances for feedback control. Principal component analysis (PCA) is used as a pseudo-linear filter to optimally reduce the dimensions of the multi-sensor output space for computationally efficient field-position mapping with artificial neural networks (ANNs). Numerical simulations are employed to investigate the effects of geometric parameters and Gaussian noise corruption on PCA assisted ANN mapping performance. Using a 9-sensor network, the sensing accuracy and closed-loop tracking performance of the proposed optimal field-based sensing system is experimentally evaluated on a linear actuator with a significantly more expensive optical encoder as a comparison. PMID:27529253
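
    A minimal sketch of the PCA-then-ANN field-to-position mapping described above, using scikit-learn; the sensor count is taken from the abstract, but the training data, number of retained components and network size are placeholders rather than the authors' configuration:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline

      # B: (n_samples, 9) magnetic field readings from the 9-sensor network
      # x: (n_samples,) known positions of the moving source along the actuator
      rng = np.random.default_rng(0)
      B = rng.normal(size=(500, 9))          # placeholder training data
      x = rng.uniform(0, 100, size=500)      # placeholder positions (mm)

      model = make_pipeline(
          PCA(n_components=4),               # pseudo-linear filter / dimension reduction
          MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0),
      )
      model.fit(B, x)
      x_hat = model.predict(B[:5])           # field-to-position mapping for new readings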

  12. Diagnostic accuracy of ascitic cholesterol concentration for malignant ascites: a meta-analysis

    PubMed Central

    Zhu, Hong; Shen, Yongchun; Deng, Kai; Liu, Xia; Zhao, Yaqin; Liu, Taiguo; Huang, Ying

    2015-01-01

    Many studies have investigated whether ascitic cholesterol can aid in the diagnosis of malignant-related ascites (MRA), and the results have varied considerably. To obtain a more reliable answer to this question, we meta-analyzed the literature on using ascitic cholesterol as a diagnostic test to help identify MRA. Literature databases were systematically searched for studies examining the accuracy of ascitic cholesterol for diagnosing MRA. Data on sensitivity, specificity, positive/negative likelihood ratio (PLR/NLR), and diagnostic odds ratio (DOR) were pooled using random effects models. Summary receiver operating characteristic (SROC) curves and the area under the curve (AUC) were used to summarize overall test performance. In total, our meta-analysis included 8 studies involving 743 subjects. Summary estimates for ascitic cholesterol in the diagnosis of MRA were as follows: sensitivity, 0.82 (95% CI 0.78 to 0.86); specificity, 0.90 (95% CI 0.87 to 0.93); PLR, 9.24 (95% CI 4.58 to 18.66); NLR, 0.16 (95% CI 0.08 to 0.32); and DOR, 66.96 (95% CI 18.83 to 238.11). The AUC was 0.96. The ascitic cholesterol level is helpful for the diagnosis of MRA. Nevertheless, the results of ascitic cholesterol assays should be interpreted in parallel with the results of traditional tests and clinical information. PMID:26770458

  13. The reliability, validity, and accuracy of self-reported absenteeism from work: a meta-analysis.

    PubMed

    Johns, Gary; Miraglia, Mariella

    2015-01-01

    Because of a variety of access limitations, self-reported absenteeism from work is often employed in research concerning health, organizational behavior, and economics, and it is ubiquitous in large scale population surveys in these domains. Several well established cognitive and social-motivational biases suggest that self-reports of absence will exhibit convergent validity with records-based measures but that people will tend to underreport the behavior. We used meta-analysis to summarize the reliability, validity, and accuracy of absence self-reports. The results suggested that self-reports of absenteeism offer adequate test-retest reliability and that they exhibit reasonably good rank order convergence with organizational records. However, people have a decided tendency to underreport their absenteeism, although such underreporting has decreased over time. Also, self-reports were more accurate when sickness absence rather than absence for any reason was probed. It is concluded that self-reported absenteeism might serve as a valid measure in some correlational research designs. However, when accurate knowledge of absolute absenteeism levels is essential, the tendency to underreport could result in flawed policy decisions. PMID:25181281

  14. Accuracy of non-rigid registration for local analysis of elasticity restrictions of the lungs

    NASA Astrophysics Data System (ADS)

    Stein, Daniel; Tetzlaff, Ralf; Wolf, Ivo; Meinzer, Hans-Peter

    2009-02-01

    Diseases of the lung often begin with regionally limited changes altering the tissue elasticity. Therefore, quantification of regional lung tissue motion would be desirable for early diagnosis, treatment monitoring, and follow-up. Dynamic MRI can capture such changes, but quantification requires non-rigid registration. However, analysis of dynamic MRI data of the lung is challenging due to inherently low image signal and contrast. Towards a computer-assisted quantification for regional lung diseases, we have evaluated two Demons-based registration methods for their accuracy in quantifying local lung motion on dynamic MRI data. The registration methods were applied on masked image data, which were pre-segmented with a graph-cut algorithm. Evaluation was performed on five datasets from healthy humans with nine time frames each. As gold standard, manually defined points (between 8 and 24) on prominent landmarks (essentially vessel structures) were used. The distance between these points and the predicted landmark location as well as the overlap (Dice coefficient) of the segmentations transformed with the deformation field were calculated. We found that the Demons algorithm performed better than the Symmetric Forces Demons algorithm with respect to average landmark distance (6.5 mm +/- 4.1 mm vs. 8.6 mm +/- 6.1 mm), but comparable regarding the Dice coefficient (0.946 +/- 0.018 vs. 0.961 +/- 0.018). Additionally, the Demons algorithm computes the deformation in only 10 seconds, whereas the Symmetric Forces Demons algorithm takes about 12 times longer.
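
    The two evaluation measures used here, mean landmark distance and the Dice coefficient, are simple to compute from the registration output; a minimal sketch with placeholder arrays for the warped landmark positions and segmentation masks:

      import numpy as np

      def mean_landmark_distance(pred_points, ref_points):
          """Mean Euclidean distance (in the coordinates' units, e.g. mm) between
          landmark positions predicted by the deformation field and the manually
          annotated reference positions."""
          diff = np.asarray(pred_points, float) - np.asarray(ref_points, float)
          return np.mean(np.linalg.norm(diff, axis=1))

      def dice(mask_a, mask_b):
          """Dice coefficient between two boolean segmentation masks."""
          mask_a, mask_b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          return 2.0 * np.logical_and(mask_a, mask_b).sum() / (mask_a.sum() + mask_b.sum())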

  15. SU-E-J-37: Feasibility of Utilizing Carbon Fiducials to Increase Localization Accuracy of Lumpectomy Cavity for Partial Breast Irradiation

    SciTech Connect

    Zhang, Y; Hieken, T; Mutter, R; Park, S; Yan, E; Brinkmann, D; Pafundi, D

    2015-06-15

    Purpose To investigate the feasibility of utilizing carbon fiducials to increase localization accuracy of lumpectomy cavity for partial breast irradiation (PBI). Methods Carbon fiducials were placed intraoperatively in the lumpectomy cavity following resection of breast cancer in 11 patients. The patients were scheduled to receive whole breast irradiation (WBI) with a boost or 3D-conformal PBI. WBI patients were initially setup to skin tattoos using lasers, followed by orthogonal kV on-board-imaging (OBI) matching to bone per clinical practice. Cone beam CT (CBCT) was acquired weekly for offline review. For the boost component of WBI and PBI, patients were setup with lasers, followed by OBI matching to fiducials, with final alignment by CBCT matching to fiducials. Using carbon fiducials as a surrogate for the lumpectomy cavity and CBCT matching to fiducials as the gold standard, setup uncertainties to lasers, OBI bone, OBI fiducials, and CBCT breast were compared. Results Minimal imaging artifacts were introduced by fiducials on the planning CT and CBCT. The fiducials were sufficiently visible on OBI for online localization. The mean magnitude and standard deviation of setup errors were 8.4mm ± 5.3 mm (n=84), 7.3mm ± 3.7mm (n=87), 2.2mm ± 1.6mm (n=40) and 4.8mm ± 2.6mm (n=87), for lasers, OBI bone, OBI fiducials and CBCT breast tissue, respectively. Significant migration occurred in one of 39 implanted fiducials in a patient with a large postoperative seroma. Conclusion OBI carbon fiducial-based setup can improve localization accuracy with minimal imaging artifacts. With increased localization accuracy, setup uncertainties can be reduced from 8mm using OBI bone matching to 3mm using OBI fiducial matching for PBI treatment. This work demonstrates the feasibility of utilizing carbon fiducials to increase localization accuracy to the lumpectomy cavity for PBI. This may be particularly attractive for localization in the setting of proton therapy and other scenarios

  16. The effect of spatial resolution on decoding accuracy in fMRI multivariate pattern analysis.

    PubMed

    Gardumi, Anna; Ivanov, Dimo; Hausfeld, Lars; Valente, Giancarlo; Formisano, Elia; Uludağ, Kâmil

    2016-05-15

    Multivariate pattern analysis (MVPA) in fMRI has been used to extract information from distributed cortical activation patterns, which may go undetected in conventional univariate analysis. However, little is known about the physical and physiological underpinnings of MVPA in fMRI as well as about the effect of spatial smoothing on its performance. Several studies have addressed these issues, but their investigation was limited to the visual cortex at 3T with conflicting results. Here, we used ultra-high field (7T) fMRI to investigate the effect of spatial resolution and smoothing on decoding of speech content (vowels) and speaker identity from auditory cortical responses. To that end, we acquired high-resolution (1.1mm isotropic) fMRI data and additionally reconstructed them at 2.2 and 3.3mm in-plane spatial resolutions from the original k-space data. Furthermore, the data at each resolution were spatially smoothed with different 3D Gaussian kernel sizes (i.e. no smoothing or 1.1, 2.2, 3.3, 4.4, or 8.8mm kernels). For all spatial resolutions and smoothing kernels, we demonstrate the feasibility of decoding speech content (vowel) and speaker identity at 7T using support vector machine (SVM) MVPA. In addition, we found that high spatial frequencies are informative for vowel decoding and that the relative contribution of high and low spatial frequencies is different across the two decoding tasks. Moderate smoothing (up to 2.2mm) improved the accuracies for both decoding of vowels and speakers, possibly due to reduction of noise (e.g. residual motion artifacts or instrument noise) while still preserving information at high spatial frequency. In summary, our results show that - even with the same stimuli and within the same brain areas - the optimal spatial resolution for MVPA in fMRI depends on the specific decoding task of interest. PMID:26899782
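
    As a rough illustration of the decoding analysis, the sketch below runs a cross-validated linear support vector machine on trial-by-voxel response patterns with scikit-learn; the array shapes and labels are synthetic placeholders, not the study's data, and the resolution/smoothing manipulation is omitted.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# trials x voxels response patterns and condition labels -- synthetic placeholders,
# standing in for the response estimates extracted from auditory cortex
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 500))
y = np.repeat([0, 1, 2], 40)          # e.g. three vowels (or speaker identities)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(clf, X, y, cv=cv)
print(f"decoding accuracy: {acc.mean():.3f} +/- {acc.std():.3f}")
```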

  17. Digital core based transmitted ultrasonic wave simulation and velocity accuracy analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Shan, Rui

    2016-06-01

    Transmitted ultrasonic wave simulation (TUWS) in a digital core is one of the important elements of digital rock physics and is used to study wave propagation in porous cores and to calculate the equivalent velocity. When simulating wave propagation in a 3D digital core, two additional layers are attached to the two surfaces perpendicular to the wave propagation direction, and one planar wave source and two receiver arrays are installed. After source excitation, the two receivers record the incident and transmitted waves of the digital rock. The wave propagation velocity, which is taken as the velocity of the digital core, is computed from the picked peak-time difference between the two recorded waves. To evaluate the accuracy of TUWS, a digital core was fully saturated with gas, oil, and water in turn to calculate the corresponding velocities. The velocities increase with decreasing wave frequency in the simulated frequency band, which is considered to be the result of scattering. When the pore fluid is varied from gas to oil and finally to water, the velocity-variation characteristics between the different frequencies are similar and approximately follow the variation law of the velocities obtained from linear elastic statics simulation (LESS), which has been widely used, although their absolute values differ. The results of this paper show that the transmitted ultrasonic simulation has high relative precision.
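
    The velocity estimate itself rests on the simple relation v = L / Δt, where L is the propagation length through the digital core and Δt is the picked peak-time difference between the incident and transmitted records; the toy sketch below (synthetic Gaussian pulses and an assumed 2 mm length) illustrates the picking and the division.

```python
import numpy as np

def core_velocity(t, incident, transmitted, travel_length):
    """Equivalent velocity from the peak-time difference of the two recorded waves."""
    dt = t[np.argmax(np.abs(transmitted))] - t[np.argmax(np.abs(incident))]
    return travel_length / dt

# toy traces: a Gaussian pulse and a delayed copy, over an assumed 2 mm propagation length
t = np.linspace(0.0, 4e-6, 4000)                       # time axis in seconds
pulse = lambda t0: np.exp(-((t - t0) / 5e-8) ** 2)
incident, transmitted = pulse(1.0e-6), pulse(1.5e-6)
print(core_velocity(t, incident, transmitted, 2e-3))   # ~4000 m/s
```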

  18. Quantitative analysis of accuracy of seismic wave-propagation codes in 3D random scattering media

    NASA Astrophysics Data System (ADS)

    Galis, Martin; Imperatori, Walter; Mai, P. Martin

    2013-04-01

    Several recent verification studies (e.g. Day et al., 2001; Bielak et al., 2010; Chaljub et al., 2010) have demonstrated the importance of assessing the accuracy of available numerical tools at low frequency in the presence of large-scale features (basins, topography, etc.). The fast progress in high-performance computing, including efficient optimization of numerical codes on petascale supercomputers, has permitted the simulation of 3D seismic wave propagation at frequencies of engineering interest (up to 10 Hz) in highly heterogeneous media (e.g. Hartzell et al., 2010; Imperatori and Mai, 2013). However, high-frequency numerical simulations involving random scattering media, characterized by small-scale heterogeneities, are much more challenging for most numerical methods, and their verification may therefore be even more crucial than in the low-frequency case. Our goal is to quantitatively compare the accuracy and the behavior of three different numerical codes for seismic wave propagation in 3D random scattering media at high frequency. We deploy a point source with an omega-squared spectrum and focus on the near-source region, which is of great interest in strong-motion seismology. We use two codes based on the finite-difference method (FD1 and FD2) and one code based on the support-operator method (SO). Both FD1 and FD2 are 4th-order staggered-grid finite-difference codes (for FD1 see Olsen et al., 2009; for FD2 see Moczo et al., 2007). The FD1 and FD2 codes are characterized by slightly different medium representations, since FD1 uses point values of the material parameters in each FD cell, while FD2 uses effective material parameters at each grid point (Moczo et al., 2002). SO is a 2nd-order support-operator code (Ely et al., 2008). We considered models with random velocity perturbations described by a von Karman correlation function with different correlation lengths and different standard deviations. Our results show significant variability in both phase and amplitude as
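
    For readers unfamiliar with the medium set-up, the sketch below generates a one-dimensional random velocity perturbation with a von Karman spectrum by filtering random phases in the wavenumber domain; the exponent is the 1D form of the von Karman power spectrum, and all parameter values are illustrative rather than those used in the study.

```python
import numpy as np

def von_karman_field_1d(n, dx, corr_len, sigma, hurst=0.5, seed=0):
    """1D random velocity perturbation with a von Karman power spectrum,
    P(k) ~ (1 + k^2 a^2)^-(hurst + 1/2) (1D form), rescaled to std `sigma`."""
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.rfftfreq(n, dx)                     # angular wavenumber
    amp = (1.0 + (k * corr_len) ** 2) ** (-(hurst + 0.5) / 2.0)  # amplitude = sqrt(PSD)
    spec = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, k.size))
    spec[0] = 0.0                                                # zero-mean perturbation
    field = np.fft.irfft(spec, n)
    return sigma * field / field.std()

# 5% velocity perturbations, 500 m correlation length, 50 m grid spacing (illustrative values)
dv = von_karman_field_1d(n=4096, dx=50.0, corr_len=500.0, sigma=0.05)
```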

  19. 3D combinational curves for accuracy and performance analysis of positive biometrics identification

    NASA Astrophysics Data System (ADS)

    Du, Yingzi; Chang, Chein-I.

    2008-06-01

    The receiver operating characteristic (ROC) curve has been widely used as an evaluation criterion to measure the accuracy of a biometrics system. Unfortunately, such an ROC curve provides no indication of the optimum threshold and cost function. In this paper, two kinds of 3D combinational curves are proposed: the 3D combinational accuracy curve and the 3D combinational performance curve. The 3D combinational accuracy curve gives a balanced view of the relationships among FAR (false alarm rate), FRR (false rejection rate), threshold t, and Cost. Six 2D curves can be derived from the 3D combinational accuracy curve: the conventional 2D ROC curve, the 2D curve of (FRR, t), the 2D curve of (FAR, t), the 2D curve of (FRR, Cost), the 2D curve of (FAR, Cost), and the 2D curve of (t, Cost). The 3D combinational performance curve can be derived from the 3D combinational accuracy curve and gives a balanced view among Security, Convenience, threshold t, and Cost. The advantages of using the proposed 3D combinational curves are demonstrated on iris recognition systems, where the experimental results show that the proposed 3D combinational curves can provide more comprehensive information on system accuracy and performance.
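
    A minimal sketch of the ingredients of the proposed curves is given below: sweeping a decision threshold over synthetic genuine and impostor score distributions yields FAR(t), FRR(t) and a weighted cost, from which the listed 2D projections and the 3D combinational accuracy curve can be assembled; the score model and cost weights are assumptions for illustration only.

```python
import numpy as np

def accuracy_curves(genuine, impostor, thresholds, c_fa=1.0, c_fr=1.0):
    """FAR, FRR and a weighted cost as functions of the decision threshold
    (accept when score >= t); c_fa and c_fr weight false accepts and false rejects."""
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    return far, frr, c_fa * far + c_fr * frr

# toy genuine / impostor score distributions
rng = np.random.default_rng(1)
genuine = rng.normal(2.0, 1.0, 1000)
impostor = rng.normal(0.0, 1.0, 1000)
t = np.linspace(-3.0, 5.0, 200)
far, frr, cost = accuracy_curves(genuine, impostor, t)
t_opt = t[np.argmin(cost)]            # operating point minimizing the chosen cost
```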

  20. Accuracy and Feasibility of Video Analysis for Assessing Hamstring Flexibility and Validity of the Sit-and-Reach Test

    ERIC Educational Resources Information Center

    Mier, Constance M.

    2011-01-01

    The accuracy of video analysis of the passive straight-leg raise test (PSLR) and the validity of the sit-and-reach test (SR) were tested in 60 men and women. Computer software measured static hip-joint flexion accurately. High within-session reliability of the PSLR was demonstrated (R greater than 0.97). Test-retest (separate days) reliability for…

  1. The Accuracy of Recidivism Risk Assessments for Sexual Offenders: A Meta-Analysis of 118 Prediction Studies

    ERIC Educational Resources Information Center

    Hanson, R. Karl; Morton-Bourgon, Kelly E.

    2009-01-01

    This review compared the accuracy of various approaches to the prediction of recidivism among sexual offenders. On the basis of a meta-analysis of 536 findings drawn from 118 distinct samples (45,398 sexual offenders, 16 countries), empirically derived actuarial measures were more accurate than unstructured professional judgment for all outcomes…

  2. Diagnostic test accuracy: methods for systematic review and meta-analysis.

    PubMed

    Campbell, Jared M; Klugar, Miloslav; Ding, Sandrine; Carmody, Dennis P; Hakonsen, Sasja J; Jadotte, Yuri T; White, Sarahlouise; Munn, Zachary

    2015-09-01

    Systematic reviews are carried out to provide an answer to a clinical question based on all available evidence (published and unpublished), to critically appraise the quality of studies, and account for and explain variations between the results of studies. The Joanna Briggs Institute specializes in providing methodological guidance for the conduct of systematic reviews and has developed methods and guidance for reviewers conducting systematic reviews of studies of diagnostic test accuracy. Diagnostic tests are used to identify the presence or absence of a condition for the purpose of developing an appropriate treatment plan. Owing to demands for improvements in speed, cost, ease of performance, patient safety, and accuracy, new diagnostic tests are continuously developed, and there are often several tests available for the diagnosis of a particular condition. In order to provide the evidence necessary for clinicians and other healthcare professionals to make informed decisions regarding the optimum test to use, primary studies need to be carried out on the accuracy of diagnostic tests and the results of these studies synthesized through systematic review. The Joanna Briggs Institute and its international collaboration have updated, revised, and developed new guidance for systematic reviews, including systematic reviews of diagnostic test accuracy. This methodological article summarizes that guidance and provides detailed advice on the effective conduct of systematic reviews of diagnostic test accuracy. PMID:26355602

  3. Analysis of accuracy of digital elevation models created from captured data by digital photogrammetry method

    NASA Astrophysics Data System (ADS)

    Hudec, P.

    2011-12-01

    A digital elevation model (DEM) is an important part of many geoinformatic applications. For the creation of a DEM, spatial data collected by geodetic measurements in the field, photogrammetric processing of aerial survey photographs, laser scanning, and secondary sources (analogue maps) are used. From a user's point of view it is very important to know the vertical accuracy of a DEM. The article describes the verification, based on geodetic measurements in the field, of the vertical accuracy of a DEM for the region of Medzibodrožie, which was created using digital photogrammetry for the purposes of water resources management and the modeling and resolution of flood cases.

  4. Comparative analysis of Worldview-2 and Landsat 8 for coastal saltmarsh mapping accuracy assessment

    NASA Astrophysics Data System (ADS)

    Rasel, Sikdar M. M.; Chang, Hsing-Chung; Diti, Israt Jahan; Ralph, Tim; Saintilan, Neil

    2016-05-01

    Coastal saltmarshes and their constituent components and processes are of scientific interest due to their ecological functions and services. However, the heterogeneity and seasonal dynamics of the coastal wetland system make it challenging to map saltmarshes with remotely sensed data. This study selected four important saltmarsh species, Phragmites australis, Sporobolus virginicus, Ficinia nodosa and Schoenoplectus sp., as well as a mangrove and a pine tree species, Avicennia and Casuarina sp. respectively. High spatial resolution Worldview-2 data and coarse spatial resolution Landsat 8 imagery were selected for this study. Among the selected vegetation types, some patches were fragmented and close to the spatial resolution of the Worldview-2 data, while some patches were larger than the 30 m resolution of the Landsat 8 data. This study aims to test the effectiveness of different classifiers for imagery with various spatial and spectral resolutions. Three classification algorithms, Maximum Likelihood Classifier (MLC), Support Vector Machine (SVM) and Artificial Neural Network (ANN), were tested and compared in terms of the mapping accuracy of the results derived from both satellite images. For the Worldview-2 data, SVM gave the highest overall accuracy (92.12%, kappa = 0.90), followed by ANN (90.82%, kappa = 0.89) and MLC (90.55%, kappa = 0.88). For the Landsat 8 data, MLC (82.04%) showed the highest classification accuracy compared to SVM (77.31%) and ANN (75.23%). The producer's accuracy of the classification results is also presented in the paper.
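
    The reported figures are standard confusion-matrix statistics; the short sketch below shows how overall accuracy, the kappa coefficient and producer's accuracy are computed from reference versus mapped labels with scikit-learn, using hypothetical labels rather than the study's validation data.

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

# hypothetical reference vs. mapped labels for validation pixels (not the study's data)
y_ref = ["saltmarsh", "mangrove", "pine", "saltmarsh", "mangrove", "pine"]
y_map = ["saltmarsh", "mangrove", "saltmarsh", "saltmarsh", "mangrove", "pine"]

overall = accuracy_score(y_ref, y_map)        # overall accuracy
kappa = cohen_kappa_score(y_ref, y_map)       # agreement corrected for chance
cm = confusion_matrix(y_ref, y_map)           # rows: reference, columns: mapped
producer = cm.diagonal() / cm.sum(axis=1)     # producer's accuracy per class
```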

  5. The Push for More Challenging Texts: An Analysis of Early Readers' Rate, Accuracy, and Comprehension

    ERIC Educational Resources Information Center

    Amendum, Steven J.; Conradi, Kristin; Liebfreund, Meghan D.

    2016-01-01

    The purpose of the study was to examine the relationship between the challenge level of text and early readers' reading comprehension. This relationship was also examined with consideration to students' word recognition accuracy and reading rate. Participants included 636 students, in Grades 1-3, in a southeastern state. Results suggest that…

  6. The Accuracy of Webcams in 2D Motion Analysis: Sources of Error and Their Control

    ERIC Educational Resources Information Center

    Page, A.; Moreno, R.; Candelas, P.; Belmar, F.

    2008-01-01

    In this paper, we show the potential of webcams as precision measuring instruments in a physics laboratory. Various sources of error appearing in 2D coordinate measurements using low-cost commercial webcams are discussed, quantifying their impact on accuracy and precision, and simple procedures to control these sources of error are presented.…

  7. Multidimensional analysis of suction feeding performance in fishes: fluid speed, acceleration, strike accuracy and the ingested volume of water.

    PubMed

    Higham, Timothy E; Day, Steven W; Wainwright, Peter C

    2006-07-01

    Suction feeding fish draw prey into the mouth using a flow field that they generate external to the head. In this paper we present a multidimensional perspective on suction feeding performance that we illustrate in a comparative analysis of suction feeding ability in two members of Centrarchidae, the largemouth bass (Micropterus salmoides) and bluegill sunfish (Lepomis macrochirus). We present the first direct measurements of maximum fluid speed capacity, and we use this to calculate local fluid acceleration and volumetric flow rate. We also calculated the ingested volume and a novel metric of strike accuracy. In addition, we quantified for each species the effects of gape magnitude, time to peak gape, and swimming speed on features of the ingested volume of water. Digital particle image velocimetry (DPIV) and high-speed video were used to measure the flow in front of the mouths of three fish from each species in conjunction with a vertical laser sheet positioned on the mid-sagittal plane of the fish. From this we quantified the maximum fluid speed (in the earthbound and fish's frame of reference), acceleration and ingested volume. Our method for determining strike accuracy involved quantifying the location of the prey relative to the center of the parcel of ingested water. Bluegill sunfish generated higher fluid speeds in the earthbound frame of reference, accelerated the fluid faster, and were more accurate than largemouth bass. However, largemouth bass ingested a larger volume of water and generated a higher volumetric flow rate than bluegill sunfish. In addition, because largemouth bass swam faster during prey capture, they generated higher fluid speeds in the fish's frame of reference. Thus, while bluegill can exert higher drag forces on stationary prey items, largemouth bass more quickly close the distance between themselves and prey. The ingested volume and volumetric flow rate significantly increased as gape increased for both species, while time to peak

  8. Accuracy Analysis of a Robotic Radionuclide Inspection and Mapping System for Surface Contamination

    SciTech Connect

    Mauer, Georg F.; Kawa, Chris

    2008-01-15

    The mapping of localized regions of radionuclide contamination in a building can be a time-consuming and costly task. Humans moving hand-held radiation detectors over the target areas are subject to fatigue. A contamination map based on manual surveys can contain significant operator-induced inaccuracies. A Fanuc M16i light industrial robot has been configured for installation on a mobile aerial work platform, such as a tall forklift. When positioned in front of a wall or floor surface, the robot can map the radiation levels over a surface area of up to 3 m by 3 m. The robot's end effector is a commercial alpha-beta radiation sensor, augmented with range and collision avoidance sensors to ensure operational safety as well as to maintain a constant gap between surface and radiation sensors. The accuracy and repeatability of the robotically conducted contamination surveys are directly influenced by the sensors and other hardware employed. This paper presents an in-depth analysis of various non-contact sensors for gap measurement, and the means to compensate for predicted systematic errors that arise during the area survey scans. The range sensor should maintain a constant gap between the radiation counter and the surface being inspected. The inspection robot scans the wall surface horizontally, moving down at predefined vertical intervals after each scan in a meandering pattern. A number of non-contact range sensors can be employed for the measurement of the gap between the robot end effector and the wall. The nominal gap width was specified as 10 mm, with variations during a single scan not to exceed ±2 mm. Unfinished masonry or concrete walls typically exhibit irregularities, such as holes, gaps, or indentations in mortar joints. These irregularities can be sufficiently large to indicate a change of the wall contour. The responses of different sensor types to the wall irregularities vary, depending on their underlying principles of operation. We explored

  9. Canonical analysis for increased classification speed and channel selection

    NASA Technical Reports Server (NTRS)

    Eppler, W.

    1976-01-01

    The quadratic form can be expressed as a monotonically increasing sum of squares when the inverse covariance matrix is represented in canonical form. This formulation has the advantage that, in testing a particular class hypothesis, computations can be discontinued when the partial sum exceeds the smallest value obtained for other classes already tested. A method for channel selection is presented which arranges the original input measurements in that order which minimizes the expected number of computations. The classification algorithm was tested on data from LARS Flight Line C1 and found to reduce the sum-of-products operations by a factor of 6.7 in comparison with the conventional approach. In effect, the accuracy of a twelve-channel classification was achieved using only that CPU time required for a conventional four-channel classification.
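
    The idea can be sketched as follows: after whitening with each class's inverse covariance in canonical (eigen) form, the quadratic discriminant becomes a running sum of squared coordinates, so the evaluation of a class can stop as soon as its partial sum exceeds the best score found so far. The code below is an illustrative reconstruction under these assumptions, not the original LARS implementation.

```python
import numpy as np

def canonical_classify(x, means, whiteners):
    """Early-exit minimum-quadratic-form classifier. For each class the form
    (x - m)^T S^-1 (x - m) is accumulated as a sum of squares of whitened
    coordinates; evaluation stops once the partial sum exceeds the best score.
    whiteners[k] is any W with W^T W = S_k^-1, e.g. W = diag(eigvals**-0.5) @ Q.T
    from the eigendecomposition S_k = Q diag(eigvals) Q^T."""
    best_class, best_score = None, np.inf
    for k, (m, W) in enumerate(zip(means, whiteners)):
        z = W @ (np.asarray(x) - m)         # canonical (whitened) coordinates
        partial = 0.0
        for zi in z:                        # monotonically increasing partial sums
            partial += zi * zi
            if partial > best_score:        # cannot beat the current best: stop early
                break
        else:
            best_class, best_score = k, partial
    return best_class
```

    Ordering the measurements so that the components expected to contribute most come first, as the record describes, makes the early exit trigger sooner on average.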

  10. Spatio-Temporal Analysis of the Accuracy of Tropical Multisatellite Precipitation Analysis 3B42 Precipitation Data in Mid-High Latitudes of China

    PubMed Central

    Cai, Yancong; Jin, Changjie; Wang, Anzhi; Guan, Dexin; Wu, Jiabing; Yuan, Fenghui; Xu, Leilei

    2015-01-01

    Satellite-based precipitation data have contributed greatly to quantitative precipitation forecasting and provide a potential alternative source of precipitation data, allowing researchers to better understand patterns of precipitation over ungauged basins. However, the absence of calibration satellite data creates considerable uncertainties for the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) 3B42 product over high-latitude areas beyond the TRMM satellites' latitude band (38°N-S). This study attempts to statistically assess TMPA V7 data over the region beyond 40°N-S using data obtained from numerous weather stations in 1998–2012. Comparative analysis at three timescales (daily, monthly and annual) indicates that adoption of a monthly adjustment significantly improved the correlation at larger timescales, increasing it from 0.63 to 0.95; TMPA data always exhibit a slight overestimation that is most serious at the daily scale (the absolute bias is 103.54%). Moreover, the performance of TMPA data varies across seasons. Generally, TMPA data perform best in summer but worst in winter, which is likely to be associated with the effects of snow/ice-covered surfaces and shortcomings of the precipitation retrieval algorithms. Temporal and spatial analysis of the accuracy indices suggests that the performance of TMPA data has gradually improved and has benefited from upgrades; the data are more reliable in humid areas than in arid regions. Special attention should be paid to its application in arid areas and in winter, where the accuracy scores are poor. Also, it is clear that the calibration can significantly improve precipitation estimates: the overestimation by TMPA in the TRMM-covered area is about a third of that in the non-TRMM area for monthly and annual precipitation. The systematic evaluation of TMPA over mid-high latitudes provides a broader understanding of satellite-based precipitation estimates, and these data are

  11. Spatio-temporal analysis of the accuracy of tropical multisatellite precipitation analysis 3B42 precipitation data in mid-high latitudes of China.

    PubMed

    Cai, Yancong; Jin, Changjie; Wang, Anzhi; Guan, Dexin; Wu, Jiabing; Yuan, Fenghui; Xu, Leilei

    2015-01-01

    Satellite-based precipitation data have contributed greatly to quantitative precipitation forecasting and provide a potential alternative source of precipitation data, allowing researchers to better understand patterns of precipitation over ungauged basins. However, the absence of calibration satellite data creates considerable uncertainties for the Tropical Rainfall Measuring Mission (TRMM) Multisatellite Precipitation Analysis (TMPA) 3B42 product over high-latitude areas beyond the TRMM satellites' latitude band (38°N-S). This study attempts to statistically assess TMPA V7 data over the region beyond 40°N-S using data obtained from numerous weather stations in 1998-2012. Comparative analysis at three timescales (daily, monthly and annual) indicates that adoption of a monthly adjustment significantly improved the correlation at larger timescales, increasing it from 0.63 to 0.95; TMPA data always exhibit a slight overestimation that is most serious at the daily scale (the absolute bias is 103.54%). Moreover, the performance of TMPA data varies across seasons. Generally, TMPA data perform best in summer but worst in winter, which is likely to be associated with the effects of snow/ice-covered surfaces and shortcomings of the precipitation retrieval algorithms. Temporal and spatial analysis of the accuracy indices suggests that the performance of TMPA data has gradually improved and has benefited from upgrades; the data are more reliable in humid areas than in arid regions. Special attention should be paid to its application in arid areas and in winter, where the accuracy scores are poor. Also, it is clear that the calibration can significantly improve precipitation estimates: the overestimation by TMPA in the TRMM-covered area is about a third of that in the non-TRMM area for monthly and annual precipitation. The systematic evaluation of TMPA over mid-high latitudes provides a broader understanding of satellite-based precipitation estimates, and these data are

  12. An evaluation of the accuracy and speed of metagenome analysis tools

    PubMed Central

    Lindgreen, Stinus; Adair, Karen L.; Gardner, Paul P.

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html PMID:26778510

  13. An evaluation of the accuracy and speed of metagenome analysis tools.

    PubMed

    Lindgreen, Stinus; Adair, Karen L; Gardner, Paul P

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html. PMID:26778510

  14. Georeferencing Accuracy Analysis of a Single WORLDVIEW-3 Image Collected Over Milan

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Roncoroni, F.; Brumana, R.; Previtali, M.

    2016-06-01

    The use of rational functions has become a standard for very high-resolution satellite imagery (VHRSI). On the other hand, the overall geolocalization accuracy via direct georeferencing from on-board navigation components is much worse than the image ground sampling distance (predicted < 3.5 m CE90 for WorldView-3, whereas GSD = 0.31 m for panchromatic images at nadir). This paper presents the georeferencing accuracy results obtained from a single WorldView-3 image processed with a bias-compensated RPC camera model. Orientation results for an image collected over Milan are illustrated and discussed for both direct and indirect georeferencing strategies as well as for different bias correction parameters estimated from a set of ground control points. Results highlight that the use of a correction based on two shift parameters is optimal for the considered dataset.
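
    In its simplest form, the two-parameter correction reported as optimal is a constant shift in image space estimated from ground control points; the sketch below shows that least-squares estimate. The GCP coordinates are invented numbers, and `rpc_project` in the final comment is a hypothetical stand-in for whatever RPC projection routine is used.

```python
import numpy as np

def estimate_rpc_shift(projected_rc, measured_rc):
    """Two-parameter (row/column shift) bias compensation for an RPC model:
    the least-squares estimate is the mean image-space residual at the GCPs."""
    residuals = np.asarray(measured_rc, float) - np.asarray(projected_rc, float)
    return residuals.mean(axis=0)                      # (d_row, d_col) in pixels

# invented GCP geometry; `proj` stands for RPC-projected GCP image coordinates
proj = np.array([[101.2, 200.4], [540.8, 310.9], [899.1, 720.5], [250.6, 640.2]])
meas = proj + np.array([3.1, -1.8]) + np.random.default_rng(2).normal(0.0, 0.3, proj.shape)
shift = estimate_rpc_shift(proj, meas)
# corrected image coordinates for any ground point: rpc_project(lat, lon, h) + shift
```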

  15. Accuracy analysis for DSM and orthoimages derived from SPOT HRS stereo data using direct georeferencing

    NASA Astrophysics Data System (ADS)

    Reinartz, Peter; Müller, Rupert; Lehner, Manfred; Schroeder, Manfred

    During the HRS (High Resolution Stereo) Scientific Assessment Program, the French space agency CNES delivered data sets from the HRS camera system with high-precision ancillary data. Two test data sets from this program were evaluated: one located in Germany, the other in Spain. The first goal was to derive orthoimages and digital surface models (DSM) from the along-track stereo data by applying the rigorous model with direct georeferencing and without ground control points (GCPs). For the derivation of the DSM, the stereo processing software developed at DLR for the MOMS-2P three-line stereo camera was used. As a first step, the interior and exterior orientation of the camera, delivered as ancillary data from the positioning and attitude systems, were extracted. A dense image matching, using nearly all pixels as kernel centers, provided the parallaxes. The quality of the stereo tie points was controlled by forward and backward matching of the two stereo partners using the local least squares matching method. Forward intersection led to points in object space, which were subsequently interpolated to a DSM on a regular grid. DEM filtering methods were also applied, and evaluations were carried out differentiating between accuracies in forest and other areas. Additionally, orthoimages were generated from the images of the two stereo looking directions. The orthoimage and DSM accuracy was determined by using GCPs and available reference DEMs of superior accuracy (DEMs derived from laser data and/or classical airborne photogrammetry). As expected, the results obtained without using GCPs showed a bias on the order of 5-20 m relative to the reference data for all three coordinates. By image matching it could be shown that the two independently derived orthoimages exhibit a very constant shift behavior. In a second step, a few GCPs (3-4) were used to calculate boresight alignment angles, which were introduced into the direct georeferencing process of each image independently. This method improved the absolute

  16. Accuracy aspects of stereo side-looking radar. [analysis of its visual perception and binocular vision

    NASA Technical Reports Server (NTRS)

    Leberl, F. W.

    1979-01-01

    The geometry of the radar stereo model and factors affecting visual radar stereo perception are reviewed. Limits to the vertical exaggeration factor of stereo radar are defined. Radar stereo model accuracies are analyzed with respect to coordinate errors caused by errors of radar sensor position and of range, and with respect to errors of coordinate differences, i.e., cross-track distances and height differences.

  17. Analysis of high accuracy, quantitative proteomics data in the MaxQB database.

    PubMed

    Schaab, Christoph; Geiger, Tamar; Stoehr, Gabriele; Cox, Juergen; Mann, Matthias

    2012-03-01

    MS-based proteomics generates rapidly increasing amounts of precise and quantitative information. Analysis of individual proteomic experiments has made great strides, but the crucial ability to compare and store information across different proteome measurements still presents many challenges. For example, it has been difficult to avoid contamination of databases with low quality peptide identifications, to control for the inflation in false positive identifications when combining data sets, and to integrate quantitative data. Although, for example, the contamination with low quality identifications has been addressed by joint analysis of deposited raw data in some public repositories, we reasoned that there should be a role for a database specifically designed for high resolution and quantitative data. Here we describe a novel database termed MaxQB that stores and displays collections of large proteomics projects and allows joint analysis and comparison. We demonstrate the analysis tools of MaxQB using proteome data of 11 different human cell lines and 28 mouse tissues. The database-wide false discovery rate is controlled by adjusting the project specific cutoff scores for the combined data sets. The 11 cell line proteomes together identify proteins expressed from more than half of all human genes. For each protein of interest, expression levels estimated by label-free quantification can be visualized across the cell lines. Similarly, the expression rank order and estimated amount of each protein within each proteome are plotted. We used MaxQB to calculate the signal reproducibility of the detected peptides for the same proteins across different proteomes. Spearman rank correlation between peptide intensity and detection probability of identified proteins was greater than 0.8 for 64% of the proteome, whereas a minority of proteins have negative correlation. This information can be used to pinpoint false protein identifications, independently of peptide database

  18. Measurement and accuracy analysis of refractive index using a specular reflectivity close to the total internal reflection

    NASA Astrophysics Data System (ADS)

    Li, Hui; Lu, Zukang; Xie, Shusen; Lin, Lei

    1998-08-01

    A new method to measure the refractive index is presented, together with an analysis of its accuracy. Its characteristic feature is that the direction of the incident light is not perpendicular to the interface but close to the critical angle of total internal reflection. Because the specular reflectivity changes sharply near the critical angle, a high measuring sensitivity is reached easily. A narrow p-polarized laser beam and a prism or a quasi-semi-cylindrical lens in contact with the sample are used in the apparatus. In order to achieve high accuracy, a photoelectronic receiver with a dual-channel divider is designed to compensate for instability in the laser output. One of the advantages of the method is its high accuracy: the uncertainty in the refractive index measurement is in the fourth decimal place at least. The exact direction of the incident laser beam depends on the accuracy of the result expected. Another outstanding advantage is its particularly straightforward experimental technique. The method is a most promising tool to study the response of the refractive index to subtle changes in different conditions.
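
    The measurement principle can be illustrated with the p-polarized Fresnel reflectance of the prism-sample interface, which rises steeply toward unity just below the critical angle; the sketch below computes that curve and its slope, which governs the sensitivity, for assumed indices (a hypothetical prism of n = 1.72 against an aqueous sample of n = 1.33), not the paper's configuration.

```python
import numpy as np

def reflectance_p(theta_i, n1, n2):
    """p-polarized Fresnel reflectance for light incident from medium n1 (prism)
    onto medium n2 (sample); the complex square root covers total internal reflection."""
    sin_t = (n1 / n2) * np.sin(theta_i)
    cos_t = np.sqrt(1.0 - sin_t.astype(complex) ** 2)
    cos_i = np.cos(theta_i)
    r_p = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return np.abs(r_p) ** 2

n_prism, n_sample = 1.72, 1.33                         # assumed indices for illustration
theta = np.radians(np.linspace(45.0, 55.0, 1000))      # critical angle ~50.7 deg here
R = reflectance_p(theta, n_prism, n_sample)
dR_dtheta = np.gradient(R, theta)                      # steepest just below the critical angle
```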

  19. Accuracy of bleeding scores for patients presenting with myocardial infarction: a meta-analysis of 9 studies and 13 759 patients

    PubMed Central

    D'Ascenzo, Fabrizio; Moretti, Claudio; Omedè, Pierluigi; Montefusco, Antonio; Bach, Richard G.; Alexander, Karen P.; Mehran, Roxana; Ariza-Solé, Albert; Zoccai, Giuseppe Biondi; Gaita, Fiorenzo

    2015-01-01

    Introduction Due to its negative impact on prognosis, a clear assessment of bleeding risk for patients presenting with acute coronary syndrome (ACS) remains crucial. Different risk scores have been proposed and compared, although with inconsistent results. Aim We performed a meta-analysis to evaluate the accuracy of different bleeding risk scores for ACS patients. Material and methods All studies externally validating risk scores for bleeding for patients presenting with ACS were included in the present review. Accuracy of risk scores for external validation cohorts to predict major bleeding in patients with ACS was the primary end point. Sensitivity analysis was performed according to clinical presentation (ST segment elevation myocardial infarction (STEMI) and non-ST segment elevation myocardial infarction (NSTEMI)). Results Nine studies and 13 759 patients were included. CRUSADE, ACUITY, ACTION and GRACE were the scores externally validated. The rate of in-hospital major bleeding was 7.80% (5.5–9.2), 2.05% (1.5–3.0) being related to access and 2.70% (1.7–4.0) needing transfusions. When evaluating all ACS patients, ACTION, CRUSADE and ACUITY performed similarly (AUC 0.75: 0.72–0.79; 0.71: 0.64–0.80 and 0.71: 0.63–0.77 respectively) when compared to GRACE (0.66; 0.64–0.67, all confidence intervals 95%). When appraising only STEMI patients, all the scores performed similarly, while CRUSADE was the only one externally validated for NSTEMI. For ACTION and ACUITY, accuracy increased for radial access patients, while no differences were found for CRUSADE. Conclusions ACTION, CRUSADE and ACUITY perform similarly to predict risk of bleeding in ACS patients. The CRUSADE score is the only one externally validated for NSTEMI, while accuracy of the scores increased with radial access. PMID:26677357

  20. Analysis of the dose calculation accuracy for IMRT in lung: a 2D approach.

    PubMed

    Dvorak, Pavel; Stock, Markus; Kroupa, Bernhard; Bogner, Joachim; Georg, Dietmar

    2007-01-01

    The purpose of this study was to compare the dosimetric accuracy of IMRT plans for targets in lung with the accuracy of standard uniform-intensity conformal radiotherapy for different dose calculation algorithms. Tests were performed utilizing a special phantom manufactured from cork and polystyrene in order to quantify the uncertainty of two commercial TPSs for IMRT in the lung. Ionization and film measurements were performed at various measuring points/planes. Additionally, single-beam and uniform-intensity multiple-beam tests were performed, in order to investigate deviations due to other characteristics of IMRT. Helax-TMS V6.1(A) was tested for 6, 10 and 25 MV and BrainSCAN 5.2 for 6 MV photon beams, respectively. Pencil beam (PB) with simple inhomogeneity correction and 'collapsed cone' (CC) algorithms were applied for dose calculations. However, the latter was not incorporated during optimization; hence only post-optimization recalculation was tested. Two-dimensional dose distributions were evaluated applying the gamma index concept. Conformal plans showed the same accuracy as IMRT plans. Ionization chamber measurements detected deviations of up to 5% when a PB algorithm was used for IMRT dose calculations. Significant improvement (deviations approximately 2%) was observed when IMRT plans were recalculated with the CC algorithm, especially for the highest nominal energy. All gamma evaluations confirmed substantial improvement with the CC algorithm in 2D. While PB dose distributions showed most discrepancies in low (<50%) and high (>90%) dose regions, the CC dose distributions deviated mainly in the high dose gradient (20-80%) region. The advantages of IMRT (conformity, intra-target dose control) should be counterbalanced with possible calculation inaccuracies for targets in the lung. As long as no superior dose calculation algorithm is involved in the iterative optimization process, it should be used with great care. When only PB algorithm with simple
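
    The 2D comparisons rely on the gamma index, γ = min over evaluated points of sqrt(Δr²/DTA² + ΔD²/ΔD_tol²), with a point passing when γ ≤ 1; a brute-force sketch for small 2D dose grids is given below, where the 3%/3 mm criteria and the global normalization are common defaults rather than necessarily the study's settings.

```python
import numpy as np

def gamma_index_2d(dose_ref, dose_eval, spacing, dd=0.03, dta=3.0):
    """Brute-force 2D gamma map (global dose criterion dd, distance-to-agreement
    dta in the same unit as `spacing`). O(N^2): intended for small grids only."""
    ny, nx = dose_ref.shape
    yy, xx = np.meshgrid(np.arange(ny) * spacing, np.arange(nx) * spacing, indexing="ij")
    pts = np.column_stack([yy.ravel(), xx.ravel()])
    d_eval = dose_eval.ravel()
    dd_abs = dd * dose_ref.max()                      # global normalization
    gamma = np.empty(dose_ref.size)
    for i, (p, d_ref) in enumerate(zip(pts, dose_ref.ravel())):
        dist_term = ((pts - p) ** 2).sum(axis=1) / dta**2
        dose_term = (d_eval - d_ref) ** 2 / dd_abs**2
        gamma[i] = np.sqrt((dist_term + dose_term).min())
    return gamma.reshape(dose_ref.shape)

# pass rate of a calculated plane against film (hypothetical arrays `film`, `tps`):
# (gamma_index_2d(film, tps, spacing=1.0) <= 1).mean()
```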

  1. The Accuracy of Thyroid Nodule Ultrasound to Predict Thyroid Cancer: Systematic Review and Meta-Analysis

    PubMed Central

    Gionfriddo, Michael R.; Al Nofal, Alaa; Boehmer, Kasey R.; Leppin, Aaron L.; Reading, Carl; Callstrom, Matthew; Elraiyah, Tarig A.; Prokop, Larry J.; Stan, Marius N.; Murad, M. Hassan; Morris, John C.; Montori, Victor M.

    2014-01-01

    Context: Significant uncertainty remains surrounding the diagnostic accuracy of sonographic features used to predict the malignant potential of thyroid nodules. Objective: The objective of the study was to summarize the available literature related to the accuracy of thyroid nodule ultrasound (US) in the prediction of thyroid cancer. Methods: We searched multiple databases and reference lists for cohort studies that enrolled adults with thyroid nodules with reported diagnostic measures of sonography. A total of 14 relevant US features were analyzed. Results: We included 31 studies between 1985 and 2012 (number of nodules studied 18 288; average size 15 mm). The frequency of thyroid cancer was 20%. The most common type of cancer was papillary thyroid cancer (84%). The US nodule feature with the highest diagnostic odds ratio for malignancy was the nodule being taller than wide [11.14 (95% confidence interval 6.6–18.9)]. Conversely, the US nodule feature with the highest diagnostic odds ratio for benign nodules was spongiform appearance [12 (95% confidence interval 0.61–234.3)]. Heterogeneity across studies was substantial. Estimates of accuracy depended on the experience of the physician interpreting the US, the type of cancer and nodule (indeterminate), and the type of reference standard. In a threshold model, spongiform appearance and cystic nodules were the only two features that, if present, could have avoided the use of fine-needle aspiration biopsy. Conclusions: Low- to moderate-quality evidence suggests that individual ultrasound features are not accurate predictors of thyroid cancer. Two features, cystic content and spongiform appearance, however, might predict benign nodules, but this has limited applicability to clinical practice due to their infrequent occurrence. PMID:24276450

  2. Accuracy Analysis of Anisotropic Yield Functions based on the Root-Mean Square Error

    NASA Astrophysics Data System (ADS)

    Huh, Hoon; Lou, Yanshan; Bae, Gihyun; Lee, Changsoo

    2010-06-01

    This paper evaluates the accuracy of popular anisotropic yield functions based on the root-mean-square error (RMSE) of the yield stresses and the R-values. The yield functions include the Hill48, Yld89, Yld91, Yld96, Yld2000-2d, BBC2000 and Yld2000-18p yield criteria. Two kinds of steel and five kinds of aluminum alloy are selected for the accuracy evaluation. The anisotropic coefficients in the yield functions are computed from the experimental data. After the error functions are constructed, the downhill simplex method is utilized for the parameter evaluation of each yield function except the Hill48 and Yld89 yield functions. The yield stresses and the R-values at every 15° from the rolling direction (RD), as well as the yield stress and R-value under equibiaxial tension, are predicted from each yield function. The predicted yield stresses and R-values are then compared with the experimental data. The root-mean-square errors (RMSE) are computed to quantitatively evaluate each yield function. The RMSEs are calculated for the yield stresses and the R-values separately because the yield stress differences are much smaller than the differences in the R-values. The RMSEs of the different yield functions are compared for each material. The Hill48 and Yld89 yield functions are the worst choices for describing the yield stress anisotropy, while the Yld91 yield function is the poorest choice for modeling the R-value directionality. The Yld2000-2d and BBC2000 yield functions have the same accuracy in modeling both the yield stress anisotropy and the R-value anisotropy. The best choice for accurately describing the yield stress and R-value directionalities of sheet metals is the Yld2000-18p yield function.
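
    The scoring metric itself is straightforward; the sketch below computes the RMSE over directional yield stresses using invented, normalized values standing in for the experimental and predicted data. The R-value RMSE is obtained the same way on its own arrays.

```python
import numpy as np

def rmse(predicted, measured):
    """Root-mean-square error used to score a yield function against experiments."""
    predicted, measured = np.asarray(predicted, float), np.asarray(measured, float)
    return np.sqrt(np.mean((predicted - measured) ** 2))

# invented normalized yield stresses at 0, 15, ..., 90 degrees from RD plus equibiaxial
sigma_exp  = [1.000, 0.992, 0.981, 0.975, 0.970, 0.973, 0.980, 1.010]
sigma_pred = [1.000, 0.990, 0.978, 0.970, 0.968, 0.975, 0.985, 1.005]
rmse_stress = rmse(sigma_pred, sigma_exp)
# R-value RMSEs are kept separate because the two error scales differ too much
# to be mixed into a single figure of merit
```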

  3. Methods in Use for Sensitivity Analysis, Uncertainty Evaluation, and Target Accuracy Assessment

    SciTech Connect

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-10-01

    Sensitivity coefficients can be used for different objectives like uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluations of the representativity of an experiment with respect to a reference design configuration. In this paper the theory, based on the adjoint approach, that is implemented in the ERANOS fast reactor code system is presented along with some unique tools and features related to specific types of problems as is the case for nuclide transmutation, reactivity loss during the cycle, decay heat, neutron source associated to fuel fabrication, and experiment representativity.

  4. Accuracy assessment of the ERP prediction method based on analysis of 100-year ERP series

    NASA Astrophysics Data System (ADS)

    Malkin, Z.; Tissen, V. M.

    2012-12-01

    A new method has been developed at the Siberian Research Institute of Metrology (SNIIM) for highly accurate prediction of UT1 and polar motion (PM). In this study, a detailed comparison was made of real-time UT1 predictions made in 2006-2011 and PM predictions made in 2009-2011 using the SNIIM method with simultaneous predictions computed at the International Earth Rotation and Reference Systems Service (IERS), USNO. The obtained results show that the proposed method provides better accuracy at different prediction lengths.

  5. Automation, Operation, and Data Analysis in the Cryogenic, High Accuracy, Refraction Measuring System (CHARMS)

    NASA Technical Reports Server (NTRS)

    Frey, Bradley J.; Leviton, Douglas B.

    2005-01-01

    The Cryogenic High Accuracy Refraction Measuring System (CHARMS) at NASA's Goddard Space Flight Center has been enhanced in a number of ways in the last year to allow the system to accurately collect refracted beam deviation readings automatically over a range of temperatures from 15 K to well beyond room temperature with high sampling density in both wavelength and temperature. The engineering details which make this possible are presented. The methods by which the most accurate angular measurements are made and the corresponding data reduction methods used to reduce thousands of observed angles to a handful of refractive index values are also discussed.

  6. Automation, Operation, and Data Analysis in the Cryogenic, High Accuracy, Refraction Measuring System (CHARMS)

    NASA Technical Reports Server (NTRS)

    Frey, Bradley; Leviton, Douglas

    2005-01-01

    The Cryogenic High Accuracy Refraction Measuring System (CHARMS) at NASA's Goddard Space Flight Center has been enhanced in a number of ways in the last year to allow the system to accurately collect refracted beam deviation readings automatically over a range of temperatures from 15 K to well beyond room temperature with high sampling density in both wavelength and temperature. The engineering details which make this possible are presented. The methods by which the most accurate angular measurements are made and the corresponding data reduction methods used to reduce thousands of observed angles to a handful of refractive index values are also discussed.

  7. Analysis Article: Accuracy of the DIDGET Glucose Meter in Children and Young Adults with Diabetes

    PubMed Central

    Kim, Sarah

    2011-01-01

    Diabetes is one of the most common chronic diseases among American children. Although studies show that intensive management, including frequent glucose testing, improves diabetes control, this is difficult to accomplish. Bayer's DIDGET® glucose meter system pairs with a popular handheld video game system and couples good blood glucose testing habits with video-game-based rewards. In this issue, Deeb and colleagues performed a study demonstrating the accuracy of the DIDGET meter, a critical asset to this novel product designed to alleviate some of the challenges of managing pediatric diabetes. PMID:22027311

  8. PIV measurements and data accuracy analysis of flow in complex terrain

    NASA Astrophysics Data System (ADS)

    Yao, Rentai; Hao, Hongwei; Qiao, Qingdang

    2000-10-01

    In this paper velocity fields and flow visualization in complex terrain in an environmental wind tunnel have been measured using PIV. In addition, it would be useful to appraise the PIV data by comparing the PIV results with those obtained from the well- established point measurement methods, such as constant temperature anemometry (CTA) and Dantec FlowMaster, in order to verify the accuracy of PIV measurements. The results indicate that PIV is a powerful tool for velocity measurements in the environmental wind tunnel.

  9. Accuracy of matrix-assisted laser desorption ionization-time of flight mass spectrometry for identification of clinical pathogenic fungi: a meta-analysis.

    PubMed

    Ling, Huazhi; Yuan, Zhijie; Shen, Jilu; Wang, Zhongxin; Xu, Yuanhong

    2014-07-01

    Fungal infections in the clinic have become increasingly serious. In many cases, the identification of clinically relevant fungi remains time-consuming and may also be unreliable. Matrix-assisted laser desorption ionization-time of flight mass spectroscopy (MALDI-TOF MS) is a newly developed diagnostic tool that is increasingly being employed to rapidly and accurately identify clinical pathogenic microorganisms. The present meta-analysis aimed to systematically evaluate the accuracy of MALDI-TOF MS for the identification of clinical pathogenic fungi. After a rigorous selection process, 33 articles, involving 38 trials and a total of 9,977 fungal isolates, were included in the meta-analysis. The random-effects pooled identification accuracy of MALDI-TOF MS increased from 0.955 (95% confidence interval [CI], 0.939 to 0.969) at the species level to 0.977 (95% CI, 0.955 to 0.993) at the genus level (P < 0.001; χ(2) = 15.452). Subgroup analyses were performed at the species level for several categories, including strain, source of strain, system, system database, and modified outcomes, to calculate the accuracy and to investigate heterogeneity. These analyses revealed significant differences between the overall meta-analysis and some of the subanalyses. In parallel, significant differences in heterogeneity among different systems and among different methods for calculating the identification ratios were found by multivariate metaregression, but none of the factors, except for the moderator of outcome, was significantly associated with heterogeneity by univariate metaregression. In summary, the MALDI-TOF MS method is highly accurate for the identification of clinically pathogenic fungi; future studies should analyze the comprehensive capability of this technology for clinical diagnostic microbiology. PMID:24829234

  10. Accuracy of Matrix-Assisted Laser Desorption Ionization–Time of Flight Mass Spectrometry for Identification of Clinical Pathogenic Fungi: a Meta-Analysis

    PubMed Central

    Ling, Huazhi; Yuan, Zhijie; Shen, Jilu; Wang, Zhongxin

    2014-01-01

    Fungal infections in the clinic have become increasingly serious. In many cases, the identification of clinically relevant fungi remains time-consuming and may also be unreliable. Matrix-assisted laser desorption ionization–time of flight mass spectroscopy (MALDI-TOF MS) is a newly developed diagnostic tool that is increasingly being employed to rapidly and accurately identify clinical pathogenic microorganisms. The present meta-analysis aimed to systematically evaluate the accuracy of MALDI-TOF MS for the identification of clinical pathogenic fungi. After a rigorous selection process, 33 articles, involving 38 trials and a total of 9,977 fungal isolates, were included in the meta-analysis. The random-effects pooled identification accuracy of MALDI-TOF MS increased from 0.955 (95% confidence interval [CI], 0.939 to 0.969) at the species level to 0.977 (95% CI, 0.955 to 0.993) at the genus level (P < 0.001; χ2 = 15.452). Subgroup analyses were performed at the species level for several categories, including strain, source of strain, system, system database, and modified outcomes, to calculate the accuracy and to investigate heterogeneity. These analyses revealed significant differences between the overall meta-analysis and some of the subanalyses. In parallel, significant differences in heterogeneity among different systems and among different methods for calculating the identification ratios were found by multivariate metaregression, but none of the factors, except for the moderator of outcome, was significantly associated with heterogeneity by univariate metaregression. In summary, the MALDI-TOF MS method is highly accurate for the identification of clinically pathogenic fungi; future studies should analyze the comprehensive capability of this technology for clinical diagnostic microbiology. PMID:24829234

  11. Analysis of RDSS positioning accuracy based on RNSS wide area differential technique

    NASA Astrophysics Data System (ADS)

    Xing, Nan; Su, RanRan; Zhou, JianHua; Hu, XiaoGong; Gong, XiuQiang; Liu, Li; He, Feng; Guo, Rui; Ren, Hui; Hu, GuangMing; Zhang, Lei

    2013-10-01

    The BeiDou Navigation Satellite System (BDS) provides a Radio Navigation Service System (RNSS) as well as a Radio Determination Service System (RDSS). RDSS users obtain positioning by responding to Master Control Center (MCC) inquiries via signals transmitted through a GEO satellite transponder. The positioning result can be calculated by the MCC with an elevation constraint. The primary error sources affecting RDSS positioning accuracy are the RDSS signal transceiver delay, the atmospheric transmission delay and the GEO satellite position error. During GEO orbit maneuvers, poor orbit forecast accuracy significantly impacts RDSS services. A real-time 3-D orbital correction method based on the wide-area differential technique is proposed to correct the orbital error. Results from the observations show that the method can successfully improve positioning precision during orbital maneuvers, independently of the RDSS reference station. This improvement can reach a maximum of 50%. Accurate calibration of the RDSS signal transceiver delay and of the digital elevation map may play a critical role in highly precise RDSS positioning services.

  12. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part I. General Guidance and Tips

    PubMed Central

    Kim, Kyung Won; Lee, Juneyoung; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    In the field of diagnostic test accuracy (DTA), the use of systematic review and meta-analyses is steadily increasing. By means of objective evaluation of all available primary studies, these two processes generate an evidence-based systematic summary regarding a specific research topic. The methodology for systematic review and meta-analysis in DTA studies differs from that in therapeutic/interventional studies, and its content is still evolving. Here we review the overall process from a practical standpoint, which may serve as a reference for those who implement these methods. PMID:26576106

  13. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part I. General Guidance and Tips.

    PubMed

    Kim, Kyung Won; Lee, Juneyoung; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    In the field of diagnostic test accuracy (DTA), the use of systematic review and meta-analyses is steadily increasing. By means of objective evaluation of all available primary studies, these two processes generate an evidence-based systematic summary regarding a specific research topic. The methodology for systematic review and meta-analysis in DTA studies differs from that in therapeutic/interventional studies, and its content is still evolving. Here we review the overall process from a practical standpoint, which may serve as a reference for those who implement these methods. PMID:26576106

  14. Diagnostic accuracy of computed tomography for chronic thromboembolic pulmonary hypertension: a systematic review and meta-analysis.

    PubMed

    Dong, Chengjun; Zhou, Min; Liu, Dingxi; Long, Xi; Guo, Ting; Kong, Xiangquan

    2015-01-01

    This study aimed to determine the diagnostic accuracy of computed tomography imaging for the diagnosis of chronic thromboembolic pulmonary hypertension (CTEPH). Additionally, the effect of test and study characteristics was explored. Studies published between 1990 and 2015 identified by PubMed, OVID search and citation tracking were examined. Of the 613 citations, 11 articles (n=712) met the inclusion criteria. The patient-based analysis demonstrated a pooled sensitivity of 76% (95% confidence interval [CI]: 69% to 82%) and a pooled specificity of 96% (95%CI: 93% to 98%). This resulted in a pooled diagnostic odds ratio (DOR) of 191 (95%CI: 75 to 486). The vessel-based analyses were divided into 3 levels: total arteries, main + lobar arteries and segmental arteries. The pooled sensitivities were 88% (95%CI: 87% to 90%), 95% (95%CI: 92% to 97%) and 88% (95%CI: 87% to 90%), respectively, with pooled specificities of 90% (95%CI: 88% to 91%), 96% (95%CI: 94% to 97%) and 89% (95%CI: 87% to 91%). This resulted in pooled diagnostic odds ratios of 76 (95%CI: 23 to 254), 751 (95%CI: 57 to 9905) and 189 (95%CI: 21 to 1072), respectively. In conclusion, CT is a favorable method to rule in CTEPH and to rule out pulmonary endarterectomy (PEA) patients for proximal branches. Furthermore, dual-energy and 320-slice CT can increase the sensitivity for subsegmental arteries, which are promising imaging techniques for the balloon pulmonary angioplasty (BPA) approach. In the near future, CT could position itself as the key for screening consideration and for surgical and interventional operability. PMID:25923810
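
    For readers less familiar with the summary statistics, the sketch below computes sensitivity, specificity and the diagnostic odds ratio from a single hypothetical 2x2 table; the pooled values quoted in the abstract additionally combine such per-study estimates with a random-effects model, which is not reproduced here.

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity of a single 2x2 diagnostic accuracy table."""
    return tp / (tp + fn), tn / (tn + fp)

def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN): odds of a positive test in disease vs. no disease."""
    return (tp * tn) / (fp * fn)

# one hypothetical study (counts invented for illustration)
tp, fp, fn, tn = 76, 4, 24, 96
sens, spec = sens_spec(tp, fp, fn, tn)            # 0.76, 0.96
dor = diagnostic_odds_ratio(tp, fp, fn, tn)       # 76.0
```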

  15. A preliminary analysis of human factors affecting the recognition accuracy of a discrete word recognizer for C3 systems

    NASA Astrophysics Data System (ADS)

    Yellen, H. W.

    1983-03-01

    Literature pertaining to Voice Recognition abounds with information relevant to the assessment of transitory speech recognition devices. In the past, engineering requirements have dictated the path this technology followed, but other factors do exist that influence recognition accuracy. This thesis explores the impact of Human Factors on the successful recognition of speech, principally addressing the differences or variability among users. A Threshold Technology T-600 with a 100-utterance vocabulary was used to test 44 subjects. A statistical analysis was conducted on 5 generic categories of Human Factors: Occupational, Operational, Psychological, Physiological and Personal. How the equipment is trained and the experience level of the speaker were found to be key characteristics influencing recognition accuracy. To a lesser extent, computer experience, time of week, accent, vital capacity and rate of air flow, speaker cooperativeness and anxiety were found to affect overall error rates.

  16. An analysis of the accuracy and cost-effectiveness of a cropland inventory utilizing remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Jensen, J. R.; Tinney, L. R.; Estes, J. E.

    1975-01-01

    Cropland inventories utilizing high altitude and Landsat imagery were conducted in Kern County, California. In terms of overall mean relative and absolute inventory accuracy, a Landsat multidate analysis yielded the best results, i.e., 98% accuracy. The 1:125,000 CIR high altitude inventory is a serious alternative which can be very accurate (97% or more) if imagery is available for a specific study area. The operational remote sensing cropland inventories documented in this study are considered cost-effective: compared to conventional survey costs of $62-66 per 10,000 acres, the Landsat and high-altitude inventories required only 3-5% of this amount, i.e., $1.97-2.98.

  17. Relative accuracy evaluation.

    PubMed

    Zhang, Yan; Wang, Hongzhi; Yang, Zhongsheng; Li, Jianzhong

    2014-01-01

    The quality of data plays an important role in business analysis and decision making, and data accuracy is an important aspect of data quality. Thus one necessary task for data quality management is to evaluate the accuracy of the data. To address the situation in which the accuracy of the whole data set is low while that of a useful part may be high, it is also necessary to evaluate the accuracy of the query results, called relative accuracy. However, as far as we know, neither a measure nor effective methods for accuracy evaluation have been proposed. Motivated by this, we propose a systematic method for relative accuracy evaluation. We design a relative accuracy evaluation framework for relational databases based on a new metric that measures accuracy using statistics. We apply the methods to evaluate the precision and recall of basic queries, which reflect the result's relative accuracy. We also propose methods to handle data updates and to improve accuracy evaluation using functional dependencies. Extensive experimental results show the effectiveness and efficiency of our proposed framework and algorithms. PMID:25133752
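    The record above frames relative accuracy in terms of the precision and recall of query results. As a hedged illustration only (not the authors' framework), the sketch below compares the rows a query returns against a verified reference set; the function and data names are invented.

    ```python
    # Illustrative sketch, not the paper's algorithm: estimate the relative
    # accuracy of a query result by comparing returned rows against a verified
    # reference set of correct rows for the same query.

    def relative_accuracy(returned_rows, correct_rows):
        """Return (precision, recall) for a query result."""
        returned = set(returned_rows)
        correct = set(correct_rows)
        true_positives = len(returned & correct)
        precision = true_positives / len(returned) if returned else 0.0
        recall = true_positives / len(correct) if correct else 0.0
        return precision, recall

    # Example: the query returns 4 rows, 3 of which match the verified reference.
    returned = [("alice", 34), ("bob", 29), ("carol", 41), ("dave", 99)]
    correct = [("alice", 34), ("bob", 29), ("carol", 41), ("erin", 27)]
    print(relative_accuracy(returned, correct))  # (0.75, 0.75)
    ```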

  18. Relative Accuracy Evaluation

    PubMed Central

    Zhang, Yan; Wang, Hongzhi; Yang, Zhongsheng; Li, Jianzhong

    2014-01-01

    The quality of data plays an important role in business analysis and decision making, and data accuracy is an important aspect of data quality. Thus one necessary task for data quality management is to evaluate the accuracy of the data. To address the situation in which the accuracy of the whole data set is low while that of a useful part may be high, it is also necessary to evaluate the accuracy of the query results, called relative accuracy. However, as far as we know, neither a measure nor effective methods for accuracy evaluation have been proposed. Motivated by this, we propose a systematic method for relative accuracy evaluation. We design a relative accuracy evaluation framework for relational databases based on a new metric that measures accuracy using statistics. We apply the methods to evaluate the precision and recall of basic queries, which reflect the result's relative accuracy. We also propose methods to handle data updates and to improve accuracy evaluation using functional dependencies. Extensive experimental results show the effectiveness and efficiency of our proposed framework and algorithms. PMID:25133752

  19. [Situational low self-esteem in pregnant women: an analysis of accuracy].

    PubMed

    Cavalcante, Joyce Carolle Bezerra; de Sousa, Vanessa Emille Carvalho; Lopes, Marcos Venícios de Oliveira

    2012-01-01

    To investigate the accuracy of the defining characteristics of Situational low self-esteem, we developed a cross-sectional study with 52 pregnant women assisted in a family centre. The NANDA-I taxonomy was used as well as Rosenberg's scale. The diagnosis was present in 32.7% of the sample and all characteristics presented statistical significance, except "Reports verbally situational challenge to its own value". The characteristics "Indecisive behavior" and "Helplessness expressions" had a sensitivity of 82.35%. On the other hand, the characteristics "Expression of feelings of worthlessness" and "Reports verbally situational challenge to its own value" were the most specific, with a specificity of 94.29%. These results can contribute to nursing practice because the identification of accurate characteristics is essential for secure inference. PMID:23559177

  20. Estimated results analysis and application of the precise point positioning based high-accuracy ionosphere delay

    NASA Astrophysics Data System (ADS)

    Wang, Shi-tai; Peng, Jun-huan

    2015-12-01

    The characterization of the ionosphere delay estimated with precise point positioning is analyzed in this paper. The estimation, interpolation and application of the ionosphere delay are studied based on the processing of 24-h data from 5 observation stations. The results show that the estimated ionosphere delay is affected by the hardware delay bias from the receiver, so that there is a difference between the estimated and interpolated results. The results also show that the RMSs (root mean squares) are larger, while the STDs (standard deviations) are better than 0.11 m. When the satellite difference is used, the hardware delay bias can be canceled, and the interpolated satellite-differenced ionosphere delay is better than 0.11 m. Although there is a difference between the estimated and interpolated ionosphere delay results, it does not affect their application in single-frequency positioning, and the positioning accuracy can reach the centimeter level.

  1. Instability analysis of pointing accuracy and power imbalance of spherical hohlraum

    NASA Astrophysics Data System (ADS)

    Duan, Hao; Wu, Changshu; Pei, Wenbing; Zou, Shiyang

    2016-05-01

    An analytic model is developed to describe the statistical behavior of the flux asymmetry on the capsule shell under random fluctuations of laser spot position and laser energy. Based on our previous work [Duan et al., Phys. Plasmas 22, 092704 (2015)] and a diagram technique, the expectation, variance, and probability density function of the flux asymmetry arising from laser pointing accuracy and laser power imbalance are given for spherical hohlraums with 4, 6, and 8 laser entrance holes (LEHs). For spherical hohlraums with different numbers of LEHs, it is found that the random part of the flux asymmetry is proportional to the ratio between the laser energy and the square root of the total number of spots, E_Laser/√N_t, and to the angle of incidence θ_0, which indicates that choosing a small θ_0 and a large N_t can reduce the random flux asymmetry. In order to achieve a cumulative probability beyond 90% that each l-order flux asymmetry meets the corresponding requirements [Gu et al., Phys. Plasmas 21, 012704 (2014)] for a 1000 μm capsule and a 4000 μm hohlraum, the power imbalance, i.e., the ratio between the standard deviation and the expectation of the laser spot power, ΔF/F_spot, must not exceed 8.1%, 9.1%, and 8.5% for 4, 6, and 8 LEH spherical hohlraums, the corresponding pointing accuracy r_H·Δθ must not exceed 79 μm, 102 μm, and 96 μm along the ê_θ direction, and r_H·sin2θ_0·Δϕ must not exceed 77 μm, 99 μm, and 94 μm along the ê_ϕ direction, respectively.

  2. Accuracy assessment of satellite altimetry over central East Antarctica by kinematic GNSS and crossover analysis

    NASA Astrophysics Data System (ADS)

    Schröder, Ludwig; Richter, Andreas; Fedorov, Denis; Knöfel, Christoph; Ewert, Heiko; Dietrich, Reinhard; Matveev, Aleksey Yu.; Scheinert, Mirko; Lukin, Valery

    2014-05-01

    Satellite altimetry is a unique technique to observe the contribution of the Antarctic ice sheet to global sea-level change. To fulfill the high quality requirements for its application, the respective products need to be validated against independent data such as ground-based measurements. Kinematic GNSS provides a powerful method to acquire precise height information along the track of a vehicle. Within a collaboration of TU Dresden and Russian partners during the Russian Antarctic Expeditions in the seasons from 2001 to 2013, we recorded several such profiles in the region of the subglacial Lake Vostok, East Antarctica. After 2006 these datasets also include observations along seven continental traverses, each about 1600 km long, between the Antarctic coast and the Russian research station Vostok (78° 28' S, 106° 50' E). After discussing some special issues concerning the processing of the kinematic GNSS profiles under the particular conditions of the interior of the Antarctic ice sheet, we will show their application to the validation of NASA's laser altimeter satellite mission ICESat and of ESA's ice mission CryoSat-2. Analysing the height differences at crossover points, we gain clear insights into the height regime at the subglacial Lake Vostok. Thus, these profiles as well as the remarkably flat lake surface itself can be used to investigate the accuracy and possible error influences of these missions. We show how the transmit-pulse reference selection correction (Gaussian vs. centroid, G-C) released in January 2013 helped to further improve the release R633 ICESat data and discuss the height offsets and other effects of the CryoSat-2 radar data. In conclusion we show that only a combination of laser and radar altimetry can provide both high precision and good spatial coverage. An independent validation with ground-based observations is crucial for a thorough accuracy assessment.

  3. A combined antral and corpus rapid urease testing protocol can increase diagnostic accuracy despite a low prevalence of Helicobacter pylori infection in patients undergoing routine gastroscopy

    PubMed Central

    Holleran, Grainne; Hall, Barry; Brennan, Denise; Crotty, Paul; McNamara, Deirdre

    2015-01-01

    Background The effects of an increased risk of sampling error and the lower prevalence of Helicobacter pylori infection on the diagnostic accuracy of standard invasive tests need to be considered. Despite evidence of enhanced yield with additional biopsies, combined Rapid Urease Tests (RUTs) have not been widely adopted. We aimed to compare the diagnostic efficacy of a combined antral and corpus rapid urease test (RUT) to a single antral RUT in a low prevalence cohort. Methods Between August 2013 and April 2014, adult patients undergoing a scheduled gastroscopy were prospectively recruited. At endoscopy, biopsies were taken and processed for single and combined RUTs, histology and culture using standard techniques. Infection was defined by positive culture or detection of Helicobacter-like organisms on either antral or corpus samples. Results In all, 123 patients were recruited. H. pylori prevalence was low at 36% (n = 44). There was a significant difference in positivity between single and combined RUTs, 20% (n = 25) versus 30% (n = 37), p = 0.0094 (95% CI 0.04–0.15). The number needed to treat (NNT) for an additional diagnosis of infection using a combined versus a single RUT is 4 (95% CI 2.2–11). The only factor associated with a reduction in RUT yield was regular proton pump inhibitor (PPI) use. Overall, the sensitivity, specificity, positive and negative predictive values for any RUT test were 84%, 100%, 100% and 92%, respectively. Conclusion Our data suggest that taking routine antral and corpus biopsies in conjunction with a combined RUT optimizes H. pylori detection and overcomes sampling error in a low prevalence population. PMID:26535121
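    The NNT quoted above is the reciprocal of the absolute gain in diagnostic yield from the combined strategy. The sketch below works through that arithmetic; which denominator the authors used (all 123 patients or only the 44 infected) is an assumption here, and the choice changes the figure.

    ```python
    # Number needed to test: reciprocal of the absolute difference in diagnostic
    # yield between the combined and single RUT strategies. The denominator used
    # below (all patients vs. infected patients only) is an assumption.

    def nnt(extra_diagnoses, denominator):
        absolute_gain = extra_diagnoses / denominator
        return 1.0 / absolute_gain

    extra = 37 - 25                      # additional positives detected by the combined RUT
    print(round(nnt(extra, 123), 1))     # 10.2 if calculated over all patients
    print(round(nnt(extra, 44), 1))      # 3.7 (about 4) if calculated over infected patients only
    ```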

  4. Effectiveness of slow motion video compared to real time video in improving the accuracy and consistency of subjective gait analysis in dogs.

    PubMed

    Lane, D M; Hill, S A; Huntingford, J L; Lafuente, P; Wall, R; Jones, K A

    2015-01-01

    Objective measures of canine gait quality via force plates, pressure mats or kinematic analysis are considered superior to subjective gait assessment (SGA). Despite research demonstrating that SGA does not accurately detect subtle lameness, it remains the most commonly performed diagnostic test for detecting lameness in dogs. This is largely because the financial, temporal and spatial requirements for existing objective gait analysis equipment make this technology impractical for use in general practice. The utility of slow motion video as a potential tool to augment SGA is currently untested. To evaluate a more accessible way to overcome the limitations of SGA, a slow motion video study was undertaken. Three experienced veterinarians reviewed video footage of 30 dogs, 15 with a diagnosis of primary limb lameness based on history and physical examination, and 15 with no indication of limb lameness based on history and physical examination. Four different videos were made for each dog, demonstrating each dog walking and trotting in real time, and then again walking and trotting in 50% slow motion. For each video, the veterinary raters assessed both the degree of lameness, and which limb(s) they felt represented the source of the lameness. Spearman's rho, Cramer's V, and t-tests were performed to determine if slow motion video increased either the accuracy or consistency of raters' SGA relative to real time video. Raters demonstrated no significant increase in consistency or accuracy in their SGA of slow motion video relative to real time video. Based on these findings, slow motion video does not increase the consistency or accuracy of SGA values. Further research is required to determine if slow motion video will benefit SGA in other ways. PMID:26623383
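    The rank-correlation comparison described above can be reproduced in outline with standard tools. The sketch below uses invented lameness grades, not the study's data, purely to show the computation.

    ```python
    # Illustrative only: Spearman's rank correlation between one rater's lameness
    # grades from real-time video and from slow-motion video of the same dogs.
    from scipy.stats import spearmanr

    real_time_grades = [0, 2, 1, 3, 0, 2, 1, 4, 0, 3]    # hypothetical 0-4 lameness grades
    slow_motion_grades = [0, 2, 2, 3, 0, 1, 1, 4, 0, 3]  # hypothetical grades for the same dogs

    rho, p_value = spearmanr(real_time_grades, slow_motion_grades)
    print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
    ```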

  5. Effectiveness of slow motion video compared to real time video in improving the accuracy and consistency of subjective gait analysis in dogs

    PubMed Central

    Lane, D.M.; Hill, S.A.; Huntingford, J.L.; Lafuente, P.; Wall, R.; Jones, K.A.

    2015-01-01

    Objective measures of canine gait quality via force plates, pressure mats or kinematic analysis are considered superior to subjective gait assessment (SGA). Despite research demonstrating that SGA does not accurately detect subtle lameness, it remains the most commonly performed diagnostic test for detecting lameness in dogs. This is largely because the financial, temporal and spatial requirements for existing objective gait analysis equipment make this technology impractical for use in general practice. The utility of slow motion video as a potential tool to augment SGA is currently untested. To evaluate a more accessible way to overcome the limitations of SGA, a slow motion video study was undertaken. Three experienced veterinarians reviewed video footage of 30 dogs, 15 with a diagnosis of primary limb lameness based on history and physical examination, and 15 with no indication of limb lameness based on history and physical examination. Four different videos were made for each dog, demonstrating each dog walking and trotting in real time, and then again walking and trotting in 50% slow motion. For each video, the veterinary raters assessed both the degree of lameness, and which limb(s) they felt represented the source of the lameness. Spearman’s rho, Cramer’s V, and t-tests were performed to determine if slow motion video increased either the accuracy or consistency of raters’ SGA relative to real time video. Raters demonstrated no significant increase in consistency or accuracy in their SGA of slow motion video relative to real time video. Based on these findings, slow motion video does not increase the consistency or accuracy of SGA values. Further research is required to determine if slow motion video will benefit SGA in other ways. PMID:26623383

  6. Attentional Mechanisms in Simple Visual Detection: A Speed-Accuracy Trade-Off Analysis

    ERIC Educational Resources Information Center

    Liu, Charles C.; Wolfgang, Bradley J.; Smith, Philip L.

    2009-01-01

    Recent spatial cuing studies have shown that detection sensitivity can be increased by the allocation of attention. This increase has been attributed to one of two mechanisms: signal enhancement or uncertainty reduction. Signal enhancement is an increase in the signal-to-noise ratio at the cued location; uncertainty reduction is a reduction in the…

  7. Assessing weight perception accuracy to promote weight loss among U.S. female adolescents: A secondary analysis

    PubMed Central

    2010-01-01

    Background Overweight and obesity have become a global epidemic. The prevalence of overweight and obesity among U.S. adolescents has almost tripled in the last 30 years. Results from recent systematic reviews demonstrate that no single, particular intervention or strategy successfully assists overweight or obese adolescents in losing weight. An understanding of factors that influence healthy weight-loss behaviors among overweight and obese female adolescents promotes effective, multi-component weight-loss interventions. There is limited evidence demonstrating associations between demographic variables, body-mass index, and weight perception among female adolescents trying to lose weight. There is also a lack of previous studies examining the association of the accuracy of female adolescents' weight perception with their efforts to lose weight. This study, therefore, examined the associations of body-mass index, weight perception, and weight-perception accuracy with trying to lose weight and engaging in exercise as a weight-loss method among a representative sample of U.S. female adolescents. Methods A nonexperimental, descriptive, comparative secondary analysis was conducted using data from Wave II (1996) of the National Longitudinal Study of Adolescent Health (Add Health). Data representative of U.S. female adolescents (N = 2216) were analyzed using STATA statistical software. Descriptive statistics and survey-weighted logistic regression were performed to determine if demographic and independent (body-mass index, weight perception, and weight-perception accuracy) variables were associated with trying to lose weight and engaging in exercise as a weight-loss method. Results Age, Black or African American race, body-mass index, weight perception, and weight-perception accuracy were consistently associated with the likeliness of trying to lose weight among U.S. female adolescents. Age, body-mass index, weight perception, and weight-perception accuracy were

  8. 3He lung morphometry technique: Accuracy analysis and pulse sequence optimization

    NASA Astrophysics Data System (ADS)

    Sukstanskii, A. L.; Conradi, M. S.; Yablonskiy, D. A.

    2010-12-01

    The 3He lung morphometry technique (Yablonskiy et al., JAP, 2009), based on MRI measurements of hyperpolarized gas diffusion in lung airspaces, provides unique information on the lung microstructure at the alveolar level. 3D tomographic images of standard morphological parameters (mean airspace chord length, lung parenchyma surface-to-volume ratio, and the number of alveoli per unit lung volume) can be created from a rather short (several seconds) MRI scan. These parameters are most commonly used to characterize lung morphometry but were not previously available from in vivo studies. The 3He lung morphometry technique is based on a previously proposed model of lung acinar airways, treated as cylindrical passages of external radius R covered by alveolar sleeves of depth h, and on a theory of gas diffusion in these airways. The initial works approximated the acinar airways as very long cylinders, all with the same R and h. The present work aims at analyzing the effects of realistic acinar airway structures, incorporating airway branching, physiological airway lengths, a physiological ratio of airway ducts and sacs, and distributions of R and h. By means of Monte Carlo computer simulations, we demonstrate that our technique allows rather accurate measurements of the geometrical and morphological parameters of acinar airways. In particular, the error in determining one of the most important physiological parameters of lung parenchyma, the surface-to-volume ratio, does not exceed several percent. Second, we analyze the effect of the susceptibility-induced inhomogeneous magnetic field on the parameter estimates and demonstrate that this effect is negligible at B0 ⩽ 3 T and becomes substantial only at higher B0. Third, we theoretically derive an optimal choice of MR pulse sequence parameters, which should be used to acquire a series of diffusion-attenuated MR signals, allowing a substantial decrease in the acquisition time and improvement in accuracy of the

  9. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    PubMed

    Storey, Helen L; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275
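    The composite reference standard discussed above combines several imperfect tests into one reference classification. The sketch below shows one common combination rule (any component positive); it is a generic illustration under that assumption, not the panel proposed in the record, and the test names are placeholders.

    ```python
    # Generic "any-positive" composite reference standard: a case is classed as
    # positive when any component test is positive. Component names are placeholders.
    def composite_reference(blood_culture, pcr, serology):
        """Return True if the composite reference standard calls the case positive."""
        return any([blood_culture, pcr, serology])

    cases = [
        {"blood_culture": False, "pcr": True,  "serology": False},
        {"blood_culture": False, "pcr": False, "serology": False},
        {"blood_culture": True,  "pcr": True,  "serology": True},
    ]
    print([composite_reference(**c) for c in cases])  # [True, False, True]
    ```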

  10. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference

    PubMed Central

    Storey, Helen L.; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275

  11. ROC Analysis of the Accuracy of Noncycloplegic Retinoscopy, Retinomax Autorefractor, and SureSight Vision Screener for Preschool Vision Screening

    PubMed Central

    Maguire, Maureen; Quinn, Graham; Kulp, Marjean Taylor; Cyert, Lynn

    2011-01-01

    Purpose. To evaluate, by receiver operating characteristic (ROC) analysis, the accuracy of three refractive error instruments in detecting eye conditions among 3- to 5-year-old Head Start preschoolers, and to evaluate differences in accuracy between instruments and screeners and by age of the child. Methods. Children participating in the Vision In Preschoolers (VIP) Study (n = 4040) had screening tests administered by pediatric eye care providers (phase I) or by both nurse and lay screeners (phase II). Noncycloplegic retinoscopy (NCR), the Retinomax Autorefractor (Nikon, Tokyo, Japan), and the SureSight Vision Screener (SureSight, Alpharetta, GA) were used in phase I, and Retinomax and SureSight were used in phase II. Pediatric eye care providers performed a standardized eye examination to identify amblyopia, strabismus, significant refractive error, and reduced visual acuity. The accuracy of the screening tests was summarized by the area under the ROC curve (AUC) and compared between instruments and screeners and by age group. Results. The three screening tests had a high AUC for all categories of screening personnel. The AUC for detecting any VIP-targeted condition was 0.83 for NCR, 0.83 (phase I) to 0.88 (phase II) for Retinomax, and 0.86 (phase I) to 0.87 (phase II) for SureSight. The AUC was 0.93 to 0.95 for detecting group 1 (most severe) conditions and did not differ between instruments or screeners or by age of the child. Conclusions. NCR, Retinomax, and SureSight had similar and high accuracy in detecting vision disorders in preschoolers across all types of screeners and ages of children, consistent with previously reported results at specificity levels of 90% and 94%. PMID:22125281
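    For readers unfamiliar with the AUC summary used above, the sketch below computes it on invented screening scores and exam outcomes; all numbers are placeholders and scikit-learn is assumed to be available.

    ```python
    # Illustrative only: AUC of a continuous screening score against the
    # gold-standard examination result (invented data).
    from sklearn.metrics import roc_auc_score

    has_condition = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # hypothetical exam-confirmed status
    screen_score = [2.5, 0.5, 1.0, 3.0, 0.8, 1.0, 1.2, 0.4, 2.2, 1.1]  # hypothetical instrument readings

    print(f"AUC = {roc_auc_score(has_condition, screen_score):.2f}")  # AUC = 0.90
    ```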

  12. Accuracy and concurrent validity of a sensor-based analysis of sit-to-stand movements in older adults.

    PubMed

    Regterschot, G Ruben H; Zhang, Wei; Baldus, Heribert; Stevens, Martin; Zijlstra, Wiebren

    2016-03-01

    Body-fixed motion sensors have been applied for the assessment of sit-to-stand (STS) performance. However, the accuracy and concurrent validity of sensor-based estimations of the body's center of mass (CoM) motion during STS are unclear. Therefore, this study investigated the accuracy and concurrent validity of sensor-based measures of CoM motion during STS in older adults. Accuracy and concurrent validity were investigated by comparing the sensor-based method to a force plate method. Twenty-seven older adults (20 females, 7 males; age: 72-94 years) performed five STS movements while data were collected with force plates and motion sensors on the hip and chest. Hip maximal acceleration provided an accurate estimation of the center of mass (CoM) maximal acceleration (limits of agreement (LOA) smaller than 5% of the CoM maximal acceleration; estimated and real CoM maximal acceleration did not differ (p=0.823)). Other hip STS measures and the chest STS measures did not provide accurate estimations of CoM motion (LOA ranged from -155.6% to 333.3% of the CoM value; sensor-based measures overestimated CoM motion (range p: <0.001 to 0.01)). However, the hip sensor did not overestimate maximal jerk of the CoM (p=0.679). Moderate to very strong associations were observed between sensor-based estimations and actual CoM motion (range r=0.64-0.94, p<0.001). Hence, sensor-based estimations of CoM motion during STS are possible, but accuracy is limited. The sensor-based method cannot replace laboratory methods for a mechanical analysis of CoM motion during STS but it may be a practical alternative for the clinical assessment of STS performance in older persons. PMID:26979906
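    The limits of agreement (LOA) reported above follow the usual Bland-Altman construction: the mean difference between the two methods plus or minus 1.96 standard deviations of those differences. The sketch below uses invented accelerations, not the study's measurements.

    ```python
    # Bland-Altman style limits of agreement between a sensor estimate and a
    # force-plate reference (invented numbers).
    import numpy as np

    sensor = np.array([2.10, 1.85, 2.40, 1.95, 2.20])        # hypothetical CoM peak accelerations (m/s^2)
    force_plate = np.array([2.05, 1.90, 2.35, 2.00, 2.15])   # hypothetical reference values

    diff = sensor - force_plate
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    print(f"bias = {bias:.3f} m/s^2, 95% LOA = [{bias - half_width:.3f}, {bias + half_width:.3f}]")
    ```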

  13. High-Accuracy Analysis of Compton Scattering in Chiral EFT: Proton and Neutron Polarisabilities

    NASA Astrophysics Data System (ADS)

    Griesshammer, Harald W.; Phillips, Daniel R.; McGovern, Judith A.

    2013-10-01

    Compton scattering from protons and neutrons provides important insight into the structure of the nucleon. A new extraction of the static electric and magnetic dipole polarisabilities α_E1 and β_M1 of the proton and neutron from all published elastic data below 300 MeV in Chiral Effective Field Theory shows that, within the statistics-dominated errors, the proton and neutron polarisabilities are identical, i.e. no isospin-breaking effects of the pion cloud are seen. Particular attention is paid to the precision and accuracy of each data set, and to an estimate of residual theoretical uncertainties. ChiEFT is ideal for that purpose since it provides a model-independent estimate of higher-order corrections and encodes the correct low-energy dynamics of QCD, including, for the few-nucleon systems used to extract neutron polarisabilities, consistent nuclear currents, rescattering effects and wave functions. It therefore automatically respects the low-energy theorems for photon-nucleus scattering. The Δ(1232) as an active degree of freedom is essential to realise the full power of the world's Compton data. Its parameters are constrained in the resonance region. A brief outlook is provided on what kind of future experiments can improve the database. Supported in part by UK STFC, DOE, NSF, and the Sino-German CRC 110.

  14. The Accuracy Analysis of Five-planet Movements Recorded in China in the Han Dynasty

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2010-04-01

    The observations and investigations of the five planets are an important part of ancient calendars and also one of the methods used to evaluate their accuracy, so astronomers paid much attention to this field. In "Hanshu·Tian wen zhi" and "Xuhanshu·Tian wen zhi", there are 160 records with detailed dates and positions, which are recalculated and studied in this paper using modern astronomical methods. The results show that most of these positions are correct, accounting for 77.5% of the total records, while the remaining 36 records are incorrect, accounting for 22.5%. In addition, there are three typical or special forms of five-planet movements. The numbers of “shou”, “he”, and “fan” movements are 14, 22 and 46, accounting for 9%, 14% and 29%, respectively. A detailed study of these three typical forms of five-planet movements is carried out in this paper. We think that the 36 incorrect records arose for various reasons, but mainly from the data processing carried out by later generations.

  15. Spatial distribution of soil heavy metal pollution estimated by different interpolation methods: accuracy and uncertainty analysis.

    PubMed

    Xie, Yunfeng; Chen, Tong-bin; Lei, Mei; Yang, Jun; Guo, Qing-jun; Song, Bo; Zhou, Xiao-yong

    2011-01-01

    Mapping the spatial distribution of contaminants in soils is the basis of pollution evaluation and risk control. Interpolation methods are extensively applied in the mapping process to estimate heavy metal concentrations at unsampled sites. The performances of four interpolation methods (inverse distance weighting, local polynomial, ordinary kriging and radial basis functions) were assessed and compared using the root mean square error from cross validation. The results indicated that all interpolation methods provided a high prediction accuracy for the mean concentration of soil heavy metals. However, the classic method based on percentages of polluted samples gave a pollution area 23.54-41.92% larger than that estimated by the interpolation methods. The difference in contaminated area estimation among the four methods reached 6.14%. According to the interpolation results, the spatial uncertainty of polluted areas was mainly located in three types of region: (a) local maxima regions surrounded by low concentration (clean) sites, (b) local minima regions surrounded by highly polluted samples; and (c) the boundaries of the contaminated areas. PMID:20970158
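    The cross-validation RMSE used above to compare interpolation methods can be sketched as follows: withhold each sampled site in turn, predict it from the remaining sites, and summarize the prediction errors. The example below uses synthetic coordinates and concentrations and a plain inverse distance weighting predictor, all invented for illustration.

    ```python
    # Leave-one-out cross-validation RMSE for a simple inverse distance weighting
    # interpolator on synthetic data (not the study's soil survey).
    import numpy as np

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 100, size=(50, 2))               # hypothetical sample locations
    values = 30 + 0.2 * coords[:, 0] + rng.normal(0, 2, 50)  # hypothetical metal concentrations

    def idw_predict(target, xy, z, power=2.0):
        d = np.linalg.norm(xy - target, axis=1)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return np.sum(w * z) / np.sum(w)

    errors = []
    for i in range(len(values)):                             # withhold each site in turn
        mask = np.arange(len(values)) != i
        pred = idw_predict(coords[i], coords[mask], values[mask])
        errors.append(pred - values[i])

    rmse = np.sqrt(np.mean(np.square(errors)))
    print(f"LOO cross-validation RMSE: {rmse:.2f}")
    ```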

  16. A New High-Accuracy Analysis of Compton Scattering in Chiral EFT: Neutron Polarisabilities

    NASA Astrophysics Data System (ADS)

    Griesshammer, Harald W.; McGovern, Judith A.; Phillips, Daniel R.

    2015-04-01

    Low-energy Compton scattering tests the symmetries and interaction strengths of a target's internal degrees of freedom in the electric and magnetic fields of a real, external photon. In the single-nucleon sector, information is often compressed into the static scalar dipole polarisabilities which are experimentally not directly accessible but encode information on the pion cloud and the Δ(1232) excitation. The interaction of the photon with the charged pion-exchange also provides a conceptually clean probe of few-nucleon binding. After demonstrating the statistical consistency of the world's γd dataset including the new data from the MAX-IV collaboration described in the preceding talk, we present a new extraction of the neutron polarisabilities in Chiral Effective Field Theory: α_n = [11.55 ± 1.25(stat) ± 0.2(BSR) ± 0.8(th)] and β_n = [3.65 ∓ 1.25(stat) ± 0.2(BSR) ∓ 0.8(th)], in 10⁻⁴ fm³, with χ² = 45.2 for 44 degrees of freedom. The new data reduced the statistical uncertainties by 30%. We discuss data accuracy and consistency, the role of the Δ(1232), and an estimate of residual theoretical uncertainties. Within statistical and systematic errors, proton and neutron polarisabilities remain identical. Supported in part by UK STFC and US DOE.

  17. Analysis and improvement of detection accuracy for a wireless motion sensing system using integrated coil component

    SciTech Connect

    Hashi, S.; Ishiyama, K.; Yabukami, S.; Kanetaka, H.; Arai, K. I.

    2010-05-15

    Integration of the exciting coil and the pick-up coil array for the wireless magnetic motion sensing system has been investigated in order to remove the limitation on the system arrangement. A comparison of the integrated type with the sandwich type proposed in our previous study shows that, despite the lower signal-to-noise ratio of the integrated type, a repeatable detection accuracy of around 1 mm is obtained at a distance of 120 mm from the pick-up coil array (sandwich type: up to 140 mm). A different tendency in the detection errors was also observed; nevertheless, the cause of the errors has been clarified. The impedance change of the exciting coil due to a resonance of the LC marker perturbs the strength of the magnetic field used for marker excitation. However, the errors can be compensated to recover the actual positions and orientations of the marker using the compensation method that was already established.

  18. Comparative analysis of semantic localization accuracies between adult and pediatric DICOM CT images

    NASA Astrophysics Data System (ADS)

    Robertson, Duncan; Pathak, Sayan D.; Criminisi, Antonio; White, Steve; Haynor, David; Chen, Oliver; Siddiqui, Khan

    2012-02-01

    Existing literature describes a variety of techniques for semantic annotation of DICOM CT images, i.e. the automatic detection and localization of anatomical structures. Semantic annotation facilitates enhanced image navigation, linkage of DICOM image content and non-image clinical data, content-based image retrieval, and image registration. A key challenge for semantic annotation algorithms is inter-patient variability. However, while the algorithms described in published literature have been shown to cope adequately with the variability in test sets comprising adult CT scans, the problem presented by the even greater variability in pediatric anatomy has received very little attention. Most existing semantic annotation algorithms can only be extended to work on scans of both adult and pediatric patients by adapting parameters heuristically in light of patient size. In contrast, our approach, which uses random regression forests ('RRF'), learns an implicit model of scale variation automatically using training data. In consequence, anatomical structures can be localized accurately in both adult and pediatric CT studies without the need for parameter adaptation or additional information about patient scale. We show how the RRF algorithm is able to learn scale invariance from a combined training set containing a mixture of pediatric and adult scans. Resulting localization accuracy for both adult and pediatric data remains comparable with that obtained using RRFs trained and tested using only adult data.
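    As a toy illustration of the regression-forest idea described above (not the authors' implementation), the sketch below uses a generic random forest regressor to map simple size-dependent image features to a landmark coordinate, trained on data spanning a range of patient sizes so that no explicit scale parameter is supplied at test time; every feature and value is invented.

    ```python
    # Toy stand-in for a random regression forest localizer: the forest learns the
    # feature-to-landmark mapping, including its dependence on subject size,
    # directly from (synthetic) training data.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    scale = rng.uniform(0.6, 1.4, size=200)                              # stand-in for pediatric-to-adult size variation
    features = np.column_stack([scale * 10 + rng.normal(0, 0.5, 200),    # e.g. a torso-width surrogate
                                scale * 25 + rng.normal(0, 1.0, 200)])   # e.g. a lung-field-height surrogate
    landmark_z = scale * 40 + rng.normal(0, 1.0, 200)                    # target: structure position along z

    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(features, landmark_z)
    test_features = np.array([[7.0, 17.5], [13.5, 34.0]])                # a small and a large subject
    print(model.predict(test_features))                                  # predictions adapt to scale automatically
    ```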

  19. Analysis of accuracy of approximate, simultaneous, nonlinear confidence intervals on hydraulic heads in analytical and numerical test cases

    USGS Publications Warehouse

    Hill, M.C.

    1989-01-01

    Inaccuracies in parameter values, parameterization, stresses, and boundary conditions of analytical solutions and numerical models of groundwater flow produce errors in simulated hydraulic heads. These errors can be quantified in terms of approximate, simultaneous, nonlinear confidence intervals presented in the literature. Approximate confidence intervals can be applied in both error and sensitivity analysis and can be used prior to calibration or when calibration was accomplished by trial and error. The method is expanded for use in numerical problems, and the accuracy of the approximate intervals is evaluated using Monte Carlo runs. Four test cases are reported. -from Author

  20. Electron Microprobe Analysis of Hf in Zircon: Suggestions for Improved Accuracy of a Difficult Measurement

    NASA Astrophysics Data System (ADS)

    Fournelle, J.; Hanchar, J. M.

    2013-12-01

    It is not commonly recognized as such, but the accurate measurement of Hf in zircon is not a trivial analytical issue. This is important to assess because Hf is often used as an internal standard for trace element analyses of zircon by LA-ICPMS. The issues pertaining to accuracy revolve around: (1) whether the Hf Mα or Lα line is used; (2) what accelerating voltage is applied if Zr Lα is also measured; and (3) what standard for Hf is used. Weidenbach et al.'s (2004) study of the 91500 zircon demonstrated the spread (in accuracy) of possible EPMA values for six EPMA labs, 2 of which used Hf Mα, 3 used Hf Lα, and one used Hf Lβ, with standards ranging from HfO2, a ZrO2-HfO2 compound, and Hf metal to hafnon. Weidenbach et al. used the ID-TIMS value (0.695 wt.% Hf) as the correct value, and not one of the EPMA labs came close to it (3 were low and 3 were high). Those data suggest: (1) that there is a systematic underestimation of the 0.695 wt% Hf (ID-TIMS) value if Hf Mα is used, most likely an issue with the matrix correction, as the analytical lines and absorption edges of Zr Lα, Si Kα and Hf Mα are rather tightly packed in the electromagnetic spectrum; mass absorption coefficients are easily in error (e.g., Donovan's determination of the MAC of Hf by Si Kα of 5061 differs from the typically used Henke value of 5449 (Donovan et al., 2002)); and (2) for utilization of the Hf Lα line, the second order Zr Kα line interferes with Hf Lα if the accelerating voltage is greater than 17.99 keV. If this higher voltage is used and differential mode PHA is applied, only a portion of the interference is removed (e.g., removal of escape peaks), causing an overestimation of Hf content. Unfortunately, it is virtually impossible to apply an interference correction in this case, as it is impossible to locate an Hf-free Zr probe standard. We have examined many of the combinations used by those six EPMA labs and concluded that the optimal EPMA is done with Hf

  1. Type I Error Inflation in the Traditional By-Participant Analysis to Metamemory Accuracy: A Generalized Mixed-Effects Model Perspective

    ERIC Educational Resources Information Center

    Murayama, Kou; Sakaki, Michiko; Yan, Veronica X.; Smith, Garry M.

    2014-01-01

    In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, where metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are…
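    The gamma coefficient mentioned above summarizes resolution from the concordant and discordant judgment-outcome pairs within a participant. The sketch below computes it for one participant's invented confidence judgments and recall outcomes (ties are ignored, as usual for gamma).

    ```python
    # Goodman-Kruskal gamma for one participant's metacognitive resolution,
    # computed from concordant and discordant pairs (invented data).
    from itertools import combinations

    judgments = [60, 80, 40, 90, 50, 70]   # hypothetical confidence ratings
    recalled  = [1, 1, 0, 1, 0, 0]         # hypothetical memory outcomes (1 = recalled)

    concordant = discordant = 0
    for (j1, r1), (j2, r2) in combinations(zip(judgments, recalled), 2):
        s = (j1 - j2) * (r1 - r2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1

    gamma = (concordant - discordant) / (concordant + discordant)
    print(f"gamma = {gamma:.2f}")
    ```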

  2. Increasing the Timeliness and Accuracy of Data Entry for Exceptional Education Students Ages 3 to 21 through the Use of Hands-On Computer Inservice Training.

    ERIC Educational Resources Information Center

    Kofsky, Gale E.

    The practicum utilized four inservice training sessions to teach data entry clerks, staffing specialists, special educators, and gifted teachers to enter student data into the school system's data base management system on a more timely basis and with greater accuracy. The practicum also developed data input forms which could be used regardless of…

  3. Studies on three-temperature, three-cylinder Stirling cycle machine (accuracy of vector analysis)

    SciTech Connect

    Ohtomo, Michihiro; Isshiki, Naotsugu; Watanabe, Hiroichi

    1996-12-31

    A simple vector analysis method aimed at quickly determining the performance of complicated multi-cylinder Stirling cycle machines, such as the Vuilleumier cycle machine, was introduced by the authors in 1992. It has proven to be a very simple graphical method for the design and analysis of future complicated Stirling cycle machines. In this method, it is assumed that all volume variations and pressure changes are sinusoidal. In practice, all volume fluctuations can be assumed to be sinusoidal without large errors, but the pressure fluctuation deviates from the sinusoidal assumption because pressure is the reciprocal of the total sinusoidal volume fluctuation. The pressure value estimated by this vector analysis method is compared to the exact calculation, and the error is calculated and discussed. The error depends on the pressure ratio and the dead volume ratio, and it is less than 10% in many cases. The features and details of a newly built experimental, small, pressurized (6 MPa), three-temperature (hot, cold and middle temperature), three-cylinder Stirling cycle machine are described. This machine was basically designed by this vector analysis method. It can work as a cooler and/or a heat pump by adjusting the phase angle of the hot cylinder and depending upon whether motor power and/or heat input is supplied. Rotating vector analysis has previously been applied to free-piston and pulse tube machines (Storch, 1988), but this report gives new, convenient methods for complicated Stirling cycle systems.

  4. Oufti: an integrated software package for high-accuracy, high-throughput quantitative microscopy analysis.

    PubMed

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-02-01

    With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today's single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  5. A Proton Beam Therapy System Dedicated to Spot-Scanning Increases Accuracy with Moving Tumors by Real-Time Imaging and Gating and Reduces Equipment Size

    PubMed Central

    Shimizu, Shinichi; Miyamoto, Naoki; Matsuura, Taeko; Fujii, Yusuke; Umezawa, Masumi; Umegaki, Kikuo; Hiramoto, Kazuo; Shirato, Hiroki

    2014-01-01

    Purpose A proton beam therapy (PBT) system has been designed which is dedicated to spot-scanning and has a gating function employing fluoroscopy-based real-time imaging of internal fiducial markers near tumors. The dose distribution and treatment time of the newly designed real-time-image gated, spot-scanning proton beam therapy (RGPT) were compared with free-breathing spot-scanning proton beam therapy (FBPT) in a simulation. Materials and Methods In-house simulation tools and the treatment planning system VQA (Hitachi, Ltd., Japan) were used for estimating the dose distribution and treatment time. Simulations were performed for 48 motion parameters (combining 8 respiratory patterns and 6 initial breathing timings) on CT data from two patients, A and B, with hepatocellular carcinoma and with clinical target volumes of 14.6 cc and 63.1 cc. The respiratory patterns were derived from the actual trajectories of internal fiducial markers taken in X-ray real-time tumor-tracking radiotherapy (RTRT). Results With FBPT, 9/48 motion parameters achieved the criteria of successful delivery for patient A and 0/48 for B. With RGPT, 48/48 and 42/48 achieved the criteria. Compared with FBPT, the mean liver dose was smaller with RGPT with statistical significance (p<0.001); it decreased from 27% to 13% and from 28% to 23% of the prescribed doses for patients A and B, respectively. The relative lengthening of treatment time to administer 3 Gy (RBE) was estimated to be 1.22 (RGPT/FBPT: 138 s/113 s) and 1.72 (207 s/120 s) for patients A and B, respectively. Conclusions This simulation study demonstrated that RGPT was able to improve the dose distribution markedly for moving tumors without a very large extension of treatment time. The proton beam therapy system dedicated to spot-scanning with a gating function for real-time imaging increases accuracy with moving tumors and reduces the physical size, and subsequently the cost, of the equipment as well as of the building housing it. PMID

  6. Diagnostic accuracy of loop-mediated isothermal amplification in detection of Clostridium difficile in stool samples: a meta-analysis

    PubMed Central

    Wei, Chen; Yang-Ming, Li; Shan, Luo; Yi-Ming, Zhong

    2015-01-01

    Introduction Clostridium difficile infection (CDI) remains a diagnostic challenge for clinicians. More recently, loop-mediated isothermal amplification (LAMP) has become readily available for the diagnosis of CDI, and many studies have investigated the usefulness of LAMP for rapid and accurate diagnosis of CDI. However, the overall diagnostic accuracy of LAMP for CDI remains unclear. In this meta-analysis, our aim was to establish the overall diagnostic accuracy of LAMP in detection of Clostridium difficile (CD) in stool samples. Material and methods A search was done in PubMed, MEDLINE, EMBASE and Cochrane Library databases up to February 2014 to identify published studies that evaluated the diagnostic role of LAMP for CD. Methodological quality was assessed according to the quality assessment for studies of diagnostic accuracy (QUADAS) instrument. The sensitivities (SEN), specificities (SPE), positive likelihood ratio (PLR), negative likelihood ratio (NLR) and diagnostic odds ratio (DOR) were pooled statistically using random effects models. Statistical analysis was performed by employing Meta-Disc 1.4 software. Summary receiver operating characteristic (SROC) curves were used to summarize overall test performance. Funnel plots were used to test the potential publication bias. Result A total of 9 studies met inclusion criteria for the present meta-analysis. The pooled SEN and SPE for diagnosing CD were 0.93 (95% CI: 0.91–0.95) and 0.98 (95% CI: 0.98–0.99), respectively. The PLR was 47.72 (95% CI: 15.10–150.82), NLR was 0.07 (95% CI: 0.04–0.14) and DOR was 745.19 (95% CI: 229.30−2421.72). The area under the ROC was 0.98. Meta-regression indicated that the total number of samples was a source of heterogeneity for LAMP in detection of CD. The funnel plots suggested no publication bias. Conclusions The LAMP meets the minimum desirable characteristics of a diagnostic test of SEN, SPE and other measures of accuracy in the diagnosis of CD, and it is suitable
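    The summary measures reported above are linked by simple identities: PLR = SEN/(1-SPE), NLR = (1-SEN)/SPE, and DOR = PLR/NLR. The sketch below plugs in the pooled sensitivity and specificity; because pooled PLR, NLR and DOR in a meta-analysis are estimated study by study before combining, the published figures need not match these back-of-the-envelope values exactly.

    ```python
    # Back-of-the-envelope check of the relationships between summary measures of
    # diagnostic accuracy for a single sensitivity/specificity pair.
    def likelihood_ratios(sensitivity, specificity):
        plr = sensitivity / (1 - specificity)    # positive likelihood ratio
        nlr = (1 - sensitivity) / specificity    # negative likelihood ratio
        dor = plr / nlr                          # diagnostic odds ratio
        return plr, nlr, dor

    plr, nlr, dor = likelihood_ratios(0.93, 0.98)
    print(f"PLR = {plr:.1f}, NLR = {nlr:.2f}, DOR = {dor:.0f}")  # PLR = 46.5, NLR = 0.07, DOR = 651
    ```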

  7. Diagnostic Accuracy of Real-Time Shear Wave Elastography for Staging of Liver Fibrosis: A Meta-Analysis.

    PubMed

    Li, Changtian; Zhang, Changsheng; Li, Junlai; Huo, Huiping; Song, Danfei

    2016-01-01

    BACKGROUND The present meta-analysis, based on previous studies, was aimed to evaluate the test accuracy of real-time shear wave elastography (SWE) for the staging of liver fibrosis. MATERIAL AND METHODS A systematic search on MEDLINE, PubMed, Embase, and Google Scholar databases was conducted, and data on SWE tests and liver fibrosis staging were collected. For each cut-off stage of fibrosis (F≥2, F≥3, and F≥4), pooled results of sensitivity, specificity, and area under summary receiver operating characteristic (SROC) curve were analyzed. The study heterogeneity was evaluated by χ2 and I2 tests. I2>50% or P≤0.05 indicates there was heterogeneity, and then a random-effects model was applied. Otherwise, the fixed-effects model was used. The publication bias was evaluated using Deeks funnel plots asymmetry test and Fagan plot analysis was performed. RESULTS Finally, 934 patients from 8 published studies were included in the analysis. The pooled sensitivity and specificity of SWE for F≥2 were 85.0% (95% CI, 82-88%) and 81% (95% CI, 71-88%), respectively. The area under the SROC curve with 95% CI was presented as 0.88 (95% CI, 85-91%). The pooled sensitivity and specificity of SWE for F≥3 were 90.0% (95% CI, 83.0-95.0%) and 81.0% (95% CI, 75.0-86.0%), respectively, corresponding to an area of SROC of 0.94 (95% CI, 92-96%). The pooled sensitivity and specificity of SWE for F≥4 were 87.0% (95% CI, 80.0-92.0%) and 88.0% (95% CI, 80.0-93.0%), respectively, corresponding to an area of SROC of 0.92 (95% CI, 89-94%). CONCLUSIONS The overall accuracy of SWE is high and clinically useful for the staging of liver fibrosis. Compared to the results of meta-analyses on other tests, such as RTE, TE, and ARFI, the performance of SWE is nearly identical in accuracy for the evaluation of cirrhosis. For the evaluation of significant liver fibrosis (F≥2), the overall accuracy of SWE seems to be similar to ARFI, but more accurate than RTE and TE. PMID:27102449

  8. Diagnostic Accuracy of Real-Time Shear Wave Elastography for Staging of Liver Fibrosis: A Meta-Analysis

    PubMed Central

    Li, Changtian; Zhang, Changsheng; Li, Junlai; Huo, Huiping; Song, Danfei

    2016-01-01

    Background The present meta-analysis, based on previous studies, was aimed to evaluate the test accuracy of real-time shear wave elastography (SWE) for the staging of liver fibrosis. Material/Methods A systematic search on MEDLINE, PubMed, Embase, and Google Scholar databases was conducted, and data on SWE tests and liver fibrosis staging were collected. For each cut-off stage of fibrosis (F≥2, F≥3, and F≥4), pooled results of sensitivity, specificity, and area under summary receiver operating characteristic (SROC) curve were analyzed. The study heterogeneity was evaluated by χ2 and I2 tests. I2>50% or P≤0.05 indicates there was heterogeneity, and then a random-effects model was applied. Otherwise, the fixed-effects model was used. The publication bias was evaluated using Deeks funnel plots asymmetry test and Fagan plot analysis was performed. Results Finally, 934 patients from 8 published studies were included in the analysis. The pooled sensitivity and specificity of SWE for F≥2 were 85.0% (95% CI, 82–88%) and 81% (95% CI, 71–88%), respectively. The area under the SROC curve with 95% CI was presented as 0.88 (95% CI, 85–91%). The pooled sensitivity and specificity of SWE for F≥3 were 90.0% (95% CI, 83.0–95.0%) and 81.0% (95% CI, 75.0–86.0%), respectively, corresponding to an area of SROC of 0.94 (95% CI, 92–96%). The pooled sensitivity and specificity of SWE for F≥4 were 87.0% (95% CI, 80.0–92.0%) and 88.0% (95% CI, 80.0–93.0%), respectively, corresponding to an area of SROC of 0.92 (95% CI, 89–94%). Conclusions The overall accuracy of SWE is high and clinically useful for the staging of liver fibrosis. Compared to the results of meta-analyses on other tests, such as RTE, TE, and ARFI, the performance of SWE is nearly identical in accuracy for the evaluation of cirrhosis. For the evaluation of significant liver fibrosis (F≥2), the overall accuracy of SWE seems to be similar to ARFI, but more accurate than RTE and TE. PMID

  9. Cooperative investigation of precision and accuracy: In chemical analysis of silicate rocks

    USGS Publications Warehouse

    Schlecht, W.G.

    1951-01-01

    This is the preliminary report of the first extensive program ever organized to study the analysis of igneous rocks, a study sponsored by the United States Geological Survey, the Massachusetts Institute of Technology, and the Geophysical Laboratory of the Carnegie Institution of Washington. Large samples of two typical igneous rocks, a granite and a diabase, were carefully prepared and divided. Small samples (about 70 grams) of each were sent to 25 rock-analysis laboratories throughout the world; analyses of one or both samples were reported by 34 analysts in these laboratories. The results, which showed rather large discrepancies, are presented in histograms. The great discordance in results reflects the present unsatisfactory state of rock analysis. It is hoped that the ultimate establishment of standard samples and procedures will contribute to the improvement of quality of analyses. The two rock samples have also been thoroughly studied spectrographically and petrographically. Detailed reports of all the studies will be published.

  10. Accuracy of Teachers' Judgments of Students' Academic Achievement: A Meta-Analysis

    ERIC Educational Resources Information Center

    Sudkamp, Anna; Kaiser, Johanna; Moller, Jens

    2012-01-01

    This meta-analysis summarizes empirical results on the correspondence between teachers' judgments of students' academic achievement and students' actual academic achievement. The article further investigates theoretically and methodologically relevant moderators of the correlation between the two measures. Overall, 75 studies reporting…

  11. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    ERIC Educational Resources Information Center

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  12. Accuracy and Precision for EchoMRI-Infants™ Body Composition Analysis in Piglets

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Body Composition Analysis is used to evaluate infant growth patterns, efficacy of nutritional and medical interventions, progression of chronic disease, and recovery from malnutrition. EchoMRI-Infants is a new Quantitative Magnetic Resonance (QMR) method to measure Total Body Fat, Lean Tissue Mass, ...

  13. Factors Related to Sight-Reading Accuracy: A Meta-Analysis

    ERIC Educational Resources Information Center

    Mishra, Jennifer

    2014-01-01

    The purpose of this meta-analysis was to determine the extent of the overall relationship between previously tested variables and sight-reading. An exhaustive survey of the available research literature was conducted resulting in 92 research studies that reported correlations between sight-reading and another variable. Variables ("n" =…

  14. An investigation of the accuracy of FEM analysis of a graphite epoxy box beam

    NASA Astrophysics Data System (ADS)

    Baldwin, James Daniel; McWhorter, John C., III

    A carbon fiber-epoxy wing box was constructed in order to study the behavior of a stressed-skin wing structure with anisotropic sandwich skins. Two wing skins, two spars, and three ribs made up the structure. The spars were designed to act primarily as shear webs so that the bending load in the structure would be reacted mainly by tension and compression in the skins. Linear, buckling, and nonlinear NASTRAN finite element solution sequences were implemented in the analysis of the structure. The structure was loaded as a cantilever beam and strain and deflection measurements were recorded. The finite element nonlinear analysis accurately modeled stiffness, but results for strains at a point were not as accurate.

  15. Gravity Probe B Data Analysis. Status and Potential for Improved Accuracy of Scientific Results

    NASA Astrophysics Data System (ADS)

    Everitt, C. W. F.; Adams, M.; Bencze, W.; Buchman, S.; Clarke, B.; Conklin, J. W.; Debra, D. B.; Dolphin, M.; Heifetz, M.; Hipkins, D.; Holmes, T.; Keiser, G. M.; Kolodziejczak, J.; Li, J.; Lipa, J.; Lockhart, J. M.; Mester, J. C.; Muhlfelder, B.; Ohshima, Y.; Parkinson, B. W.; Salomon, M.; Silbergleit, A.; Solomonik, V.; Stahl, K.; Taber, M.; Turneaure, J. P.; Wang, S.; Worden, P. W.

    2009-12-01

    This is the first of five connected papers detailing progress on the Gravity Probe B (GP-B) Relativity Mission. GP-B, launched 20 April 2004, is a landmark physics experiment in space to test two fundamental predictions of Einstein’s general relativity theory, the geodetic and frame-dragging effects, by means of cryogenic gyroscopes in Earth orbit. Data collection began 28 August 2004 and science operations were completed 29 September 2005. The data analysis has proven deeper than expected as a result of two mutually reinforcing complications in gyroscope performance: (1) a changing polhode path affecting the calibration of the gyroscope scale factor Cg against the aberration of starlight and (2) two larger than expected manifestations of a Newtonian gyro torque due to patch potentials on the rotor and housing. In earlier papers, we reported two methods, ‘geometric’ and ‘algebraic’, for identifying and removing the first Newtonian effect (‘misalignment torque’), and also a preliminary method of treating the second (‘roll-polhode resonance torque’). Central to the progress in both torque modeling and Cg determination has been an extended effort on “Trapped Flux Mapping” commenced in November 2006. A turning point came in August 2008 when it became possible to include a detailed history of the resonance torques into the computation. The East-West (frame-dragging) effect is now plainly visible in the processed data. The current statistical uncertainty from an analysis of 155 days of data is 5.4 marc-s/yr (~14% of the predicted effect), though it must be emphasized that this is a preliminary result requiring rigorous investigation of systematics by methods discussed in the accompanying paper by Muhlfelder et al. A covariance analysis incorporating models of the patch effect torques indicates that a 3-5% determination of frame-dragging is possible with more complete, computationally intensive data analysis.

  16. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis.

    PubMed

    Deliano, Matthias; Tabelow, Karsten; König, Reinhard; Polzehl, Jörg

    2016-01-01

    Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes, and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
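
    The conventional moving-window estimator that this abstract critiques can be sketched as follows; the responses are simulated, and the authors' proposed trial-by-trial estimator is not reproduced here:

```python
# Sketch of the conventional moving-window learning-curve estimate discussed
# above: proportion of correct responses in a sliding trial window, with a
# Wilson confidence interval per window. The tacit assumption criticized in
# the abstract is that performance is constant inside each window.
import numpy as np
from statsmodels.stats.proportion import proportion_confint

rng = np.random.default_rng(0)
n_trials = 200
# simulated learning: probability of a correct response rises across trials
p_true = 1 / (1 + np.exp(-(np.arange(n_trials) - 80) / 20))
responses = rng.random(n_trials) < p_true      # True = correct response

window = 20
curve, lo, hi = [], [], []
for t in range(n_trials - window + 1):
    k = int(responses[t:t + window].sum())
    curve.append(k / window)
    ci_lo, ci_hi = proportion_confint(k, window, alpha=0.05, method="wilson")
    lo.append(ci_lo)
    hi.append(ci_hi)

print(f"proportion correct in last window: {curve[-1]:.2f} "
      f"(95% CI {lo[-1]:.2f}-{hi[-1]:.2f})")
```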

  17. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis

    PubMed Central

    Tabelow, Karsten; König, Reinhard; Polzehl, Jörg

    2016-01-01

    Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes, and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809

  18. A Method Improving the Accuracy of Fluorescence Recovery after Photobleaching Analysis

    PubMed Central

    Jönsson, Peter; Jonsson, Magnus P.; Tegenfeldt, Jonas O.; Höök, Fredrik

    2008-01-01

    Fluorescence recovery after photobleaching has been an established technique of quantifying the mobility of molecular species in cells and cell membranes for more than 30 years. However, under nonideal experimental conditions, the current methods of analysis still suffer from occasional problems; for example, when the signal/noise ratio is low, when there are temporal fluctuations in the illumination, or when there is bleaching during the recovery process. We here present a method of analysis that overcomes these problems, yielding accurate results even under nonideal experimental conditions. The method is based on circular averaging of each image, followed by spatial frequency analysis of the averaged radial data, and requires no prior knowledge of the shape of the bleached area. The method was validated using both simulated and experimental fluorescence recovery after photobleaching data, illustrating that the diffusion coefficient of a single diffusing component can be determined to within ∼1%, even for small signal levels (100 photon counts), and that at typical signal levels (5000 photon counts) a system with two diffusion coefficients can be analyzed with <10% error. PMID:18567628
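
    The circular-averaging step described above can be roughly illustrated as follows (synthetic image, bleach centre assumed known); the subsequent spatial-frequency analysis of the radial profile used in the paper is not reproduced:

```python
# Minimal sketch of the circular-averaging step described above: each frame
# is reduced to a radial intensity profile around the bleach centre by
# binning pixels according to their distance from the centre.
import numpy as np

def radial_average(image, center, n_bins=100):
    """Return bin centres and mean intensity as a function of radius."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - center[0], yy - center[1])
    bins = np.linspace(0, r.max(), n_bins + 1)
    which = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    sums = np.bincount(which, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    profile = sums / np.maximum(counts, 1)
    centres = 0.5 * (bins[:-1] + bins[1:])
    return centres, profile

# synthetic post-bleach frame: Gaussian bleach spot on a uniform background
y, x = np.indices((256, 256))
frame = 1.0 - 0.6 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 20 ** 2))
radii, profile = radial_average(frame, center=(128, 128))
print(profile[:5])  # intensity near the bleach centre is reduced
```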

  19. Quotation accuracy in medical journal articles—a systematic review and meta-analysis

    PubMed Central

    Jergas, Hannah

    2015-01-01

    Background. Quotations and references are an indispensable element of scientific communication. They should support what authors claim or provide important background information for readers. Studies indicate, however, that quotations not serving their purpose—quotation errors—may be prevalent. Methods. We carried out a systematic review, meta-analysis and meta-regression of quotation errors, taking account of differences between studies in error ascertainment. Results. Out of 559 studies screened we included 28 in the main analysis, and estimated major, minor, and total quotation error rates of 11.9%, 95% CI [8.4, 16.6], 11.5% [8.3, 15.7], and 25.4% [19.5, 32.4]. While heterogeneity was substantial, even the lowest estimate of total quotation errors was considerable (6.7%). Indirect references accounted for less than one sixth of all quotation problems. The findings remained robust in a number of sensitivity and subgroup analyses (including risk of bias analysis) and in meta-regression. There was no indication of publication bias. Conclusions. Readers of medical journal articles should be aware of the fact that quotation errors are common. Measures against quotation errors include spot checks by editors and reviewers, correct placement of citations in the text, and declarations by authors that they have checked cited material. Future research should elucidate if and to what degree quotation errors are detrimental to scientific progress. PMID:26528420

  20. Accuracy of an approximate static structural analysis technique based on stiffness matrix eigenmodes

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Hajela, P.

    1979-01-01

    Use of the stiffness matrix eigenmodes, instead of the vibration eigenmodes, as generalized coordinates is proposed for condensation of static load deflection equations in finite element stiffness method. The modes are selected by strain energy criteria and the resulting fast, approximate analysis technique is evaluated by applications to idealized built-up wings and a fuselage segment. The best results obtained are a two-order of magnitude reduction of the number of degrees of freedom in a high aspect ratio wing associated with less than one percent error in prediction of the largest displacement.
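
    The condensation idea described above can be illustrated with a generic reduced-basis sketch; the stiffness matrix below is a random positive-definite stand-in rather than a real wing model, and mode selection by strain energy is approximated simply by keeping the lowest-eigenvalue (most flexible) modes:

```python
# Generic sketch of static condensation onto stiffness-matrix eigenmodes:
# approximate the solution of K u = f in a reduced basis of eigenvectors of K.
import numpy as np

rng = np.random.default_rng(1)
n, n_modes = 200, 20

A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)          # SPD stand-in stiffness matrix
f = rng.standard_normal(n)           # load vector

eigvals, eigvecs = np.linalg.eigh(K) # eigenvalues in ascending order
Phi = eigvecs[:, :n_modes]           # retain the most flexible modes

# reduced system: (Phi^T K Phi) q = Phi^T f, then expand back to full size
q = np.linalg.solve(Phi.T @ K @ Phi, Phi.T @ f)
u_approx = Phi @ q
u_exact = np.linalg.solve(K, f)

err = np.linalg.norm(u_exact - u_approx) / np.linalg.norm(u_exact)
print(f"relative error with {n_modes} of {n} modes: {err:.3e}")
```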

  1. Analysis of inter- and intrafraction accuracy of a commercial thermoplastic mask system used for image-guided particle radiation therapy.

    PubMed

    Amelio, Dante; Winter, Marcus; Habermehl, Daniel; Jäkel, Oliver; Debus, Jurgen; Combs, Stephanie E

    2013-07-01

    The present paper reports and discusses the results concerning both the inter- and intrafraction accuracy achievable combining the immobilization system employed in patients with head-and-neck, brain and skull base tumors with image guidance at our particle therapy center. Moreover, we investigated the influence of intrafraction time on positioning displacements. A total of 41 patients treated between January and July 2011 represented the study population. All the patients were immobilized with a tailored commercial thermoplastic head mask with standard head-neck rest (HeadSTEP®, IT-V). Patient treatment position was verified by two orthogonal kilovoltage images acquired through a ceiling imaging robot (Siemens, Erlangen, Germany). The analysis of the applied daily corrections during the first treatment week before and after treatment delivery allowed the evaluation of the interfraction and intrafraction reproducibility of the thermoplastic mask, respectively. Concerning interfraction reproducibility, translational and rotational systematic errors (Σs) were ≤ 2.2 mm and 0.9°, respectively; translational and rotational random errors (σs) were ≤ 1.6 mm and 0.6°, respectively. Regarding the intrafraction accuracy, translational and rotational Σs were ≤ 0.4 mm and 0.4°, respectively; translational and rotational σs were ≤ 0.5 mm and 0.3°, respectively. Concerning the correlation between time and intrafraction displacements, the Pearson coefficient was 0.5 for treatment fractions with time between position checks less than or equal to the median value, and 0.2 for those with time between position controls longer than the median figure. These results suggest that intrafractional patient motion is smaller than interfractional patient motion. Moreover, we can state that application of different imaging verification protocols translates into a relevant difference in accuracy for the same immobilization device. The magnitude of intrafraction displacements correlates with the

  2. Analysis of inter- and intrafraction accuracy of a commercial thermoplastic mask system used for image-guided particle radiation therapy

    PubMed Central

    Amelio, Dante; Winter, Marcus; Habermehl, Daniel; Jäkel, Oliver; Debus, Jurgen; Combs, Stephanie E.

    2013-01-01

    The present paper reports and discusses the results concerning both the inter- and intrafraction accuracy achievable combining the immobilization system employed in patients with head-and-neck, brain and skull base tumors with image guidance at our particle therapy center. Moreover, we investigated the influence of intrafraction time on positioning displacements. A total of 41 patients treated between January and July 2011 represented the study population. All the patients were immobilized with a tailored commercial thermoplastic head mask with standard head-neck rest (HeadSTEP®, IT-V). Patient treatment position was verified by two orthogonal kilovoltage images acquired through a ceiling imaging robot (Siemens, Erlangen, Germany). The analysis of the applied daily corrections during the first treatment week before and after treatment delivery allowed the evaluation of the interfraction and intrafraction reproducibility of the thermoplastic mask, respectively. Concerning interfraction reproducibility, translational and rotational systematic errors (Σs) were ≤2.2 mm and 0.9°, respectively; translational and rotational random errors (σs) were ≤1.6 mm and 0.6°, respectively. Regarding the intrafraction accuracy, translational and rotational Σs were ≤0.4 mm and 0.4°, respectively; translational and rotational σs were ≤ 0.5 mm and 0.3°, respectively. Concerning the correlation between time and intrafraction displacements, the Pearson coefficient was 0.5 for treatment fractions with time between position checks less than or equal to the median value, and 0.2 for those with time between position controls longer than the median figure. These results suggest that intrafractional patient motion is smaller than interfractional patient motion. Moreover, we can state that application of different imaging verification protocols translates into a relevant difference in accuracy for the same immobilization device. The magnitude of intrafraction displacements correlates with the time for

  3. Accuracy of Lung Ultrasonography versus Chest Radiography for the Diagnosis of Adult Community-Acquired Pneumonia: Review of the Literature and Meta-Analysis.

    PubMed

    Ye, Xiong; Xiao, Hui; Chen, Bo; Zhang, SuiYang

    2015-01-01

    Lung ultrasonography (LUS) is being increasingly utilized in emergency and critical settings. We performed a systematic review of the current literature to compare the accuracy of LUS and chest radiography (CR) for the diagnosis of adult community-acquired pneumonia (CAP). We searched PubMed and EMBASE for studies dealing with both LUS and CR for the diagnosis of adult CAP, and conducted a meta-analysis to evaluate the diagnostic accuracy of LUS in comparison with CR. The reference standard against which the index tests were compared was the hospital discharge diagnosis or the result of chest computed tomography as a "gold standard". We calculated pooled sensitivity and specificity using the Mantel-Haenszel method and the pooled diagnostic odds ratio using the DerSimonian-Laird method. Five articles met our inclusion criteria and were included in the final analysis. Using the hospital discharge diagnosis as reference, LUS had a pooled sensitivity of 0.95 (0.93-0.97) and a specificity of 0.90 (0.86 to 0.94), whereas CR had a pooled sensitivity of 0.77 (0.73 to 0.80) and a specificity of 0.91 (0.87 to 0.94). When LUS and CR were compared against computed tomography in a total of 138 patients, the Z statistic of the two summary receiver operating characteristic curves was 3.093 (P = 0.002), and the areas under the curve for LUS and CR were 0.901 and 0.590, respectively. Our study indicates that LUS can help clinicians diagnose adult CAP, and its accuracy was better than that of CR when chest computed tomography was used as the gold standard. PMID:26107512

  4. Kinematic Accuracy Analysis of Lead Screw W Insertion Mechanism with Flexibility

    NASA Astrophysics Data System (ADS)

    He, Hu; Zhang, Lei; Kong, Jiayuan

    According to the actual requirements of w insertion, a variable lead screw w insertion mechanism was designed, its motion characteristics were analyzed, and a kinematics simulation was carried out in MATLAB. Mechanism precision was analyzed with an analytical method, and the error coefficient curve of each component of the mechanism was obtained. Dynamics simulations of the rigid mechanism and of the mechanism with flexibility at different speeds were conducted in ADAMS; furthermore, the real-time elastic deformation of the flexible connecting rod was obtained. When the influence of the elastic connecting rod is taken into account, the output motion error and the elastic deformation of the components increase with the speed of the loom.

  5. Use of model calibration to achieve high accuracy in analysis of computer networks

    DOEpatents

    Frogner, Bjorn; Guarro, Sergio; Scharf, Guy

    2004-05-11

    A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.

  6. High-Accuracy Analysis of Surges on a Slanting Conductor and a Cylindrical Conductor by an FDTD Method

    NASA Astrophysics Data System (ADS)

    Tanabe, Nobuhiro; Baba, Yoshihiro; Nagaoka, Naoto; Ametani, Akihiro

    The finite-difference time-domain (FDTD) method has become popular in analyzing surge phenomena as well as transient electromagnetic fields because of its high flexibility and straightforwardness. One of the representative limitations of the FDTD method in the Cartesian coordinate system is the use of the staircase approximation to deal with curved surfaces and slanting thin wires, which are tilted with respect to the coordinate axes. In analyzing a conductor system including curved surfaces, the accuracy may be maintained if the conductor system and its surrounding space are divided into very small cells. This, however, requires long computation times and large amounts of memory. The staircase approximation of a slanting wire results in an artificially slowed propagation speed and a lowered resonance frequency. This flaw is inherent in the staircase approximation. In the present paper, surges on a slanting wire and on a cylindrical conductor have been analyzed by the FDTD method with non-rectangular cells, the shapes of which are suitable to fit the curved surface and the slanting conductor. An absorbing boundary condition for the FDTD method using non-rectangular cells has been discussed and the numerical stability has been tested. Also, the accuracy has been investigated in comparison with an analysis based on the method of moments.

  7. Quantifying Vegetation Change in Semiarid Environments: Precision and Accuracy of Spectral Mixture Analysis and the Normalized Difference Vegetation Index

    NASA Technical Reports Server (NTRS)

    Elmore, Andrew J.; Mustard, John F.; Manning, Sara J.; Elome, Andrew J.

    2000-01-01

    Because in situ techniques for determining vegetation abundance in semiarid regions are labor intensive, they usually are not feasible for regional analyses. Remotely sensed data provide the large spatial scale necessary, but their precision and accuracy in determining vegetation abundance and its change through time have not been quantitatively determined. In this paper, the precision and accuracy of two techniques, Spectral Mixture Analysis (SMA) and Normalized Difference Vegetation Index (NDVI) applied to Landsat TM data, are assessed quantitatively using high-precision in situ data. In Owens Valley, California, we have 6 years of continuous field data (1991-1996) for 33 sites acquired concurrently with six cloudless Landsat TM images. The multitemporal remotely sensed data were coregistered to within 1 pixel, radiometrically intercalibrated using temporally invariant surface features and geolocated to within 30 m. These procedures facilitated the accurate location of field-monitoring sites within the remotely sensed data. Formal uncertainties in the registration, radiometric alignment, and modeling were determined. Results show that SMA absolute percent live cover (%LC) estimates are accurate to within ±4.0%LC and estimates of change in live cover have a precision of +/-3.8%LC. Furthermore, even when applied to areas of low vegetation cover, the SMA approach correctly determined the sense of change (i.e., positive or negative) in 87% of the samples. SMA results are superior to NDVI, which, although correlated with live cover, is not a quantitative measure and showed the correct sense of change in only 67% of the samples.
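
    As a rough illustration of the two measures compared above, the following sketch computes NDVI from red and near-infrared reflectance and solves a fully constrained linear mixture model; the endmember spectra and pixel values are illustrative only, not taken from the study:

```python
# Sketch of the two measures compared above. NDVI is computed directly from
# red and near-infrared reflectance; spectral mixture analysis (SMA) models
# each pixel spectrum as a non-negative combination of endmember spectra
# (here: green vegetation, soil, shade) whose fractions sum to one.
import numpy as np
from scipy.optimize import nnls

def ndvi(red, nir):
    return (nir - red) / (nir + red)

# columns = endmembers (vegetation, soil, shade); rows = 6 TM-like bands
E = np.array([[0.04, 0.12, 0.01],
              [0.06, 0.15, 0.01],
              [0.04, 0.20, 0.01],
              [0.45, 0.28, 0.02],
              [0.22, 0.35, 0.01],
              [0.10, 0.30, 0.01]])
pixel = 0.30 * E[:, 0] + 0.60 * E[:, 1] + 0.10 * E[:, 2]

# sum-to-one constraint added as an extra, heavily weighted equation
w = 100.0
E_aug = np.vstack([E, w * np.ones(E.shape[1])])
p_aug = np.append(pixel, w)
fractions, _ = nnls(E_aug, p_aug)

print("NDVI:", round(ndvi(red=pixel[2], nir=pixel[3]), 3))
print("SMA fractions (veg, soil, shade):", np.round(fractions, 3))
```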

  8. Analysis of factors affecting the accuracy, reproducibility, and interpretation of microbial community carbon source utilization patterns

    USGS Publications Warehouse

    Haack, S.K.; Garchow, H.; Klug, M.J.; Forney, L.J.

    1995-01-01

    We determined factors that affect responses of bacterial isolates and model bacterial communities to the 95 carbon substrates in Biolog microtiter plates. For isolates and communities of three to six bacterial strains, substrate oxidation rates were typically nonlinear and were delayed by dilution of the inoculum. When inoculum density was controlled, patterns of positive and negative responses exhibited by microbial communities to each of the carbon sources were reproducible. Rates and extents of substrate oxidation by the communities were also reproducible but were not simply the sum of those exhibited by community members when tested separately. Replicates of the same model community clustered when analyzed by principal-components analysis (PCA), and model communities with different compositions were clearly separated on the first PCA axis, which accounted for >60% of the dataset variation. PCA discrimination among different model communities depended on the extent to which specific substrates were oxidized. However, the substrates interpreted by PCA to be most significant in distinguishing the communities changed with reading time, reflecting the nonlinearity of substrate oxidation rates. Although whole-community substrate utilization profiles were reproducible signatures for a given community, the extent of oxidation of specific substrates and the numbers or activities of microorganisms using those substrates in a given community were not correlated. Replicate soil samples varied significantly in the rate and extent of oxidation of seven tested substrates, suggesting microscale heterogeneity in the composition of the soil microbial community.
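
    The principal-components step used above to separate community profiles can be sketched as follows, with random stand-in data in place of Biolog plate readings:

```python
# Sketch of the PCA step described above: rows are replicate community
# profiles, columns are responses to the 95 carbon sources, and the first
# PCA axes show whether replicates of the same community cluster and
# different communities separate. The profile matrix is random stand-in data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_substrates = 95

# three model communities, four replicate plates each, with distinct means
profiles, labels = [], []
for community in range(3):
    mean_response = rng.random(n_substrates)
    for _ in range(4):
        profiles.append(mean_response + 0.05 * rng.standard_normal(n_substrates))
        labels.append(community)
X = np.array(profiles)
labels = np.array(labels)

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("variance explained by PC1:", round(pca.explained_variance_ratio_[0], 2))
for c in range(3):
    print("community", c, "PC1 scores:", np.round(scores[labels == c, 0], 2))
```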

  9. Mutational analysis of S12 protein and implications for the accuracy of decoding by the ribosome

    PubMed Central

    Sharma, Divya; Cukras, Anthony R.; Rogers, Elizabeth J.; Southworth, Daniel R.; Green, Rachel

    2007-01-01

    The fidelity of aminoacyl-tRNA selection by the ribosome depends on a conformational switch in the decoding center of the small ribosomal subunit induced by cognate but not by near-cognate aminoacyl tRNA. The aminoglycosides paromomycin and streptomycin bind to the decoding center and induce related structural rearrangements that explain their observed effects on miscoding. Structural and biochemical studies have identified ribosomal protein S12 (as well as specific nucleotides in 16S rRNA) as a critical molecular contributor in distinguishing between cognate and near-cognate tRNA species as well as in promoting more global rearrangements in the small subunit referred to as “closure”. Here we use a mutational approach to define contributions made by two highly conserved loops in S12 to the process of tRNA selection. Most S12 variant ribosomes tested display increased levels of fidelity (a “restrictive” phenotype). Interestingly, several variants, K42A and R53A, were substantially resistant to the miscoding effects of paromomycin. Further characterization of the compromised paromomycin response identified a probable second, fidelity modulating binding site for paromomycin in the 16S rRNA that facilitates closure of the small subunit and compensates for defects associated with the S12 mutations. PMID:17967466

  10. Multilateral analysis of increasing collective dose and new ALARA programme.

    PubMed

    Oumi, Tadashi; Morii, Yasuki; Imai, Toshirou

    2011-07-01

    JAPC (The Japan Atomic Power Company) is the only electric power company that operates different types of nuclear reactors in Japan; it operates two BWRs (boiling water reactors), one pressurised water reactor and one gas cooled reactor. For over 45 y, JAPC has been conducting various activities aimed at reducing the radiation dose received by workers. Recently, the collective dose resulting from periodic maintenance has increased at each plant because of the replacement of large equipment and the unexpected extension of the outage period. In particular, the collective dose at Tokai-2 is one of the highest among Japanese BWR plants(1), owing to the replacement and strengthening of equipment to meet earthquake-proof requirements. In this study, the authors performed a multilateral analysis of the unacceptably large collective dose and devised a new ALARA programme that includes a 3D dose prediction map and the development of machines to assist workers. PMID:21652597

  11. In-line calibration of Raman systems for analysis of gas mixtures of hydrogen isotopologues with sub-percent accuracy.

    PubMed

    Schlösser, Magnus; Seitz, Hendrik; Rupp, Simone; Herwig, Philipp; Alecu, Catalin Gabriel; Sturm, Michael; Bornschein, Beate

    2013-03-01

    Highly accurate, in-line, and real-time composition measurements of gases are mandatory in many processing applications. The quantitative analysis of mixtures of hydrogen isotopologues (H2, D2, T2, HD, HT, and DT) is of high importance in such fields as DT fusion, neutrino mass measurements using tritium β-decay or photonuclear experiments where HD targets are used. Raman spectroscopy is a favorable method for these tasks. In this publication we present a method for the in-line calibration of Raman systems for the nonradioactive hydrogen isotopologues. It is based on precise volumetric gas mixing of the homonuclear species H2/D2 and a controlled catalytic production of the heteronuclear species HD. Systematic effects like spurious exchange reactions with wall materials and others are considered with care during the procedure. A detailed discussion of statistical and systematic uncertainties is presented which finally yields a calibration accuracy of better than 0.4%. PMID:23320553

  12. The effect of inhomogeneities on the distance to the last scattering surface and the accuracy of the CMB analysis

    SciTech Connect

    Bolejko, Krzysztof

    2011-02-01

    The standard analysis of the CMB data assumes that the distance to the last scattering surface can be calculated using the distance-redshift relation as in the Friedmann model. However, in the inhomogeneous universe, even if ⟨δρ⟩ = 0, the distance relation is not the same as in the unperturbed universe. This can have serious consequences, as a change of distance affects the mapping of CMB temperature fluctuations into the angular power spectrum C_l. In addition, if the change of distance is relatively uniform, no new temperature fluctuations are generated. It is therefore a different effect from the lensing or ISW effects, which introduce additional CMB anisotropies. This paper shows that the accuracy of the CMB analysis can be impaired by the accuracy of calculation of the distance within the cosmological models. Since this effect has not been fully explored before, to test how the inhomogeneities affect the distance-redshift relation, several methods are examined: the Dyer-Roeder relation, lensing approximation, and non-linear Swiss-Cheese model. In all cases, the distance to the last scattering surface is different than when homogeneity is assumed. The difference can be as low as 1% and as high as 80%. A typical change of the distance is around 20–30%. Since the distance to the last scattering surface is set by the position of the CMB peaks, in order to have a good fit, the distance needs to be adjusted. After correcting the distance, the cosmological parameters change. Therefore, an improperly estimated distance to the last scattering surface can be a major source of systematics. This paper shows that if inhomogeneities are taken into account when calculating the distance, then models with positive spatial curvature and with Ω_Λ ∼ 0.8−0.9 are preferred.

  13. Accuracy of pleth variability index to predict fluid responsiveness in mechanically ventilated patients: a systematic review and meta-analysis.

    PubMed

    Chu, Haitao; Wang, Yong; Sun, Yanfei; Wang, Gang

    2016-06-01

    To systematically evaluate the accuracy of the pleth variability index to predict fluid responsiveness in mechanically ventilated patients. A literature search of PUBMED, OVID, CBM, CNKI, and Wanfang Data was performed for clinical studies evaluating the accuracy of the pleth variability index in predicting fluid responsiveness (last update 5 April 2015). Related journals were also searched manually. Two reviewers independently assessed trial quality according to the modified QUADAS items. Heterogeneity analysis and meta-analysis were conducted using Meta-DiSc 1.4 software. A subgroup analysis in the operating room (OR) and in the intensive care unit (ICU) was also performed. Differences between subgroups were analyzed using the interaction test. A total of 18 studies involving 665 subjects were included. The pooled area under the receiver operating characteristic curve (AUC) to predict fluid responsiveness in mechanically ventilated patients was 0.88 [95 % confidence interval (CI) 0.84-0.91]. The pooled sensitivity and specificity were 0.73 (95 % CI 0.68-0.78) and 0.82 (95 % CI 0.77-0.86), respectively. No heterogeneity was found either within or between studies, and there was no significant heterogeneity within each subgroup. No statistical differences were found between the OR and ICU subgroups in the AUC [0.89 (95 % CI 0.85-0.92) versus 0.90 (95 % CI 0.82-0.94); P = 0.97], or in the specificity [0.84 (95 % CI 0.75-0.86) vs. 0.84 (95 % CI 0.75-0.91); P = 1.00]. Sensitivity was higher in the OR subgroup than in the ICU subgroup [0.84 (95 % CI 0.78-0.88) vs. 0.56 (95 % CI 0.47-0.64); P = 0.00004]. The pleth variability index has a reasonable ability to predict fluid responsiveness. PMID:26242233

  14. Earthworms increase plant production: a meta-analysis.

    PubMed

    van Groenigen, Jan Willem; Lubbers, Ingrid M; Vos, Hannah M J; Brown, George G; De Deyn, Gerlinde B; van Groenigen, Kees Jan

    2014-01-01

    To meet the challenge of feeding a growing world population with minimal environmental impact, we need comprehensive and quantitative knowledge of ecological factors affecting crop production. Earthworms are among the most important soil-dwelling invertebrates. Their activity affects both biotic and abiotic soil properties, in turn affecting plant growth. Yet, studies on the effect of earthworm presence on crop yields have not been quantitatively synthesized. Here we show, using meta-analysis, that on average earthworm presence in agroecosystems leads to a 25% increase in crop yield and a 23% increase in aboveground biomass. The magnitude of these effects depends on the presence of crop residue, earthworm density, and the type and rate of fertilization. The positive effects of earthworms become larger when more residue is returned to the soil, but disappear when soil nitrogen availability is high. This suggests that earthworms stimulate plant growth predominantly through releasing nitrogen locked away in residue and soil organic matter. Our results therefore imply that earthworms are of crucial importance to decrease the yield gap of farmers who cannot (or will not) use nitrogen fertilizer. PMID:25219785

  15. Increasing microscopy resolution with photobleaching and intensity cumulant analysis.

    PubMed

    Brutkowski, Wojtek; Dziob, Daniel; Bernas, Tytus

    2015-11-01

    Super-resolution fluorescence microscopy and its applications to the analysis of biological structures form a rapidly evolving field. A number of approaches aimed at overcoming the fundamental limit imposed by diffraction have been proposed in recent years. Here we present a modification of super-resolution optical fluctuation imaging (SOFI), a technique based on spatio-temporal evaluation of the optical signal from independently fluctuating emitters. Instead of rapid, reversible photoswitching, photobleaching is used to produce irreversible transitions between emitting and nonemitting states of the fluorochrome molecules. Simulated images are used to demonstrate that, in the absence of noise, the proposed SOFI modification increases the efficiency of transfer of high spatial frequencies in a fluorescence microscope. Correspondingly, a decrease of the point spread function (PSF) width is obtained. Moreover, the modified SOFI algorithm is capable of resolving point emitters in the presence of simulated noise. Using real biological images, we demonstrate that an increase of resolution is obtained in 2D optical sections through densely packed chromatin in cell nuclei and the lamin layer at the nuclear envelope. PMID:26278779
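
    A plain second-order SOFI-type computation (not the photobleaching-based modification proposed in the paper) can be sketched as follows; for each pixel, the zero-lag second cumulant of the intensity trace is simply its temporal variance:

```python
# Minimal sketch of a second-order SOFI-style calculation on a synthetic
# stack of blinking emitters: the pixel-wise temporal variance suppresses
# stationary background and sharpens the effective PSF. The photobleaching-
# based modification described in the paper is not shown here.
import numpy as np

rng = np.random.default_rng(3)
n_frames, size = 500, 64

yy, xx = np.indices((size, size))
def psf(cx, cy, sigma=2.0):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

emitters = [psf(30, 32), psf(35, 32)]          # two emitters 5 px apart
stack = np.zeros((n_frames, size, size))
for t in range(n_frames):
    for em in emitters:
        if rng.random() < 0.5:                 # independent on/off blinking
            stack[t] += em
    stack[t] += 0.01 * rng.standard_normal((size, size))  # camera noise

mean_image = stack.mean(axis=0)
sofi2 = ((stack - mean_image) ** 2).mean(axis=0)   # second cumulant per pixel

print("mean-image peak:", round(mean_image.max(), 3),
      "| SOFI-2 peak:", round(sofi2.max(), 3))
```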

  16. Diagnostic Accuracy of Ultrasonography and Radiography in Detection of Pulmonary Contusion; a Systematic Review and Meta-Analysis

    PubMed Central

    Hosseini, Mostafa; Ghelichkhani, Parisa; Baikpour, Masoud; Tafakhori, Abbas; Asady, Hadi; Haji Ghanbari, Mohammad Javad; Yousefifard, Mahmoud; Safari, Saeed

    2015-01-01

    Introduction: Ultrasonography is currently being used as one of the diagnostic modalities in various medical emergencies for screening of trauma patients. The diagnostic value of this modality in detection of traumatic chest injuries has been evaluated by several studies but its diagnostic accuracy in diagnosis of pulmonary contusion is a matter of discussion. Therefore, the present study aimed to determine the diagnostic accuracy of ultrasonography and radiography in detection of pulmonary contusion through a systematic review and meta-analysis. Methods: An extended systematic search was performed by two reviewers in databases of Medline, EMBASE, ISI Web of Knowledge, Scopus, Cochrane Library, and ProQuest. They extracted the data and assessed the quality of the studies. After summarization of data into true positive, false positive, true negative, and false negative meta-analysis was carried out via a mixed-effects binary regression model. Further subgroup analysis was performed due to a significant heterogeneity between the studies. Results: 12 studies were included in this meta-analysis (1681 chest trauma patients, 76% male). Pooled sensitivity of ultrasonography in detection of pulmonary contusion was 0.92 (95% CI: 0.81-0.96; I2= 95.81, p<0.001) and its pooled specificity was calculated to be 0.89 (95% CI: 0.85-0.93; I2 = 67.29, p<0.001) while these figures for chest radiography were 0.44 (95% CI: 0.32-0.58; I2= 87.52, p<0.001) and 0.98 (95% CI: 0.88-1.0; I2= 95.22, p<0.001), respectively. Subgroup analysis showed that the sources of heterogeneity between the studies were sampling method, operator, frequency of the transducer, and sample size. Conclusion: Ultrasonography was found to be a better screening tool in detection of pulmonary contusion. Moreover, an ultrasonography performed by a radiologist / intensivist with 1-5MHz probe has a higher diagnostic value in identifying pulmonary contusions. PMID:26495401

  17. Linear Discriminant Analysis Achieves High Classification Accuracy for the BOLD fMRI Response to Naturalistic Movie Stimuli.

    PubMed

    Mandelkow, Hendrik; de Zwart, Jacco A; Duyn, Jeff H

    2016-01-01

    Naturalistic stimuli like movies evoke complex perceptual processes, which are of great interest in the study of human cognition by functional MRI (fMRI). However, conventional fMRI analysis based on statistical parametric mapping (SPM) and the general linear model (GLM) is hampered by a lack of accurate parametric models of the BOLD response to complex stimuli. In this situation, statistical machine-learning methods, a.k.a. multivariate pattern analysis (MVPA), have received growing attention for their ability to generate stimulus response models in a data-driven fashion. However, machine-learning methods typically require large amounts of training data as well as computational resources. In the past, this has largely limited their application to fMRI experiments involving small sets of stimulus categories and small regions of interest in the brain. By contrast, the present study compares several classification algorithms known as Nearest Neighbor (NN), Gaussian Naïve Bayes (GNB), and (regularized) Linear Discriminant Analysis (LDA) in terms of their classification accuracy in discriminating the global fMRI response patterns evoked by a large number of naturalistic visual stimuli presented as a movie. Results show that LDA regularized by principal component analysis (PCA) achieved high classification accuracies, above 90% on average for single fMRI volumes acquired 2 s apart during a 300 s movie (chance level 0.7% = 2 s/300 s). The largest source of classification errors was autocorrelations in the BOLD signal compounded by the similarity of consecutive stimuli. All classifiers performed best when given input features from a large region of interest comprising around 25% of the voxels that responded significantly to the visual stimulus. Consistent with this, the most informative principal components represented widespread distributions of co-activated brain regions that were similar between subjects and may represent functional networks. In light of these
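
    The PCA-regularized LDA approach named above can be sketched with scikit-learn on synthetic stand-in data; the dimensions and class counts are illustrative, not those of the study:

```python
# Sketch of PCA-regularized LDA classification on synthetic data standing in
# for single fMRI volumes (samples) labelled by movie time point (classes).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_classes, n_per_class, n_voxels = 20, 8, 2000

# each class (stimulus time point) has its own mean activation pattern
X = np.vstack([
    rng.standard_normal(n_voxels)
    + 0.5 * rng.standard_normal((n_per_class, n_voxels))
    for _ in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_per_class)

clf = make_pipeline(PCA(n_components=50), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=4)
print("cross-validated accuracy:", round(scores.mean(), 2))
```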

  18. Linear Discriminant Analysis Achieves High Classification Accuracy for the BOLD fMRI Response to Naturalistic Movie Stimuli

    PubMed Central

    Mandelkow, Hendrik; de Zwart, Jacco A.; Duyn, Jeff H.

    2016-01-01

    Naturalistic stimuli like movies evoke complex perceptual processes, which are of great interest in the study of human cognition by functional MRI (fMRI). However, conventional fMRI analysis based on statistical parametric mapping (SPM) and the general linear model (GLM) is hampered by a lack of accurate parametric models of the BOLD response to complex stimuli. In this situation, statistical machine-learning methods, a.k.a. multivariate pattern analysis (MVPA), have received growing attention for their ability to generate stimulus response models in a data-driven fashion. However, machine-learning methods typically require large amounts of training data as well as computational resources. In the past, this has largely limited their application to fMRI experiments involving small sets of stimulus categories and small regions of interest in the brain. By contrast, the present study compares several classification algorithms known as Nearest Neighbor (NN), Gaussian Naïve Bayes (GNB), and (regularized) Linear Discriminant Analysis (LDA) in terms of their classification accuracy in discriminating the global fMRI response patterns evoked by a large number of naturalistic visual stimuli presented as a movie. Results show that LDA regularized by principal component analysis (PCA) achieved high classification accuracies, above 90% on average for single fMRI volumes acquired 2 s apart during a 300 s movie (chance level 0.7% = 2 s/300 s). The largest source of classification errors was autocorrelations in the BOLD signal compounded by the similarity of consecutive stimuli. All classifiers performed best when given input features from a large region of interest comprising around 25% of the voxels that responded significantly to the visual stimulus. Consistent with this, the most informative principal components represented widespread distributions of co-activated brain regions that were similar between subjects and may represent functional networks. In light of these

  19. Analysis of the accuracy of the inverse problem solution for a differential heterodyne microscope as applied to rectangular plasmonic waveguides

    NASA Astrophysics Data System (ADS)

    Akhmedzhanov, I. M.; Baranov, D. V.; Zolotov, E. M.

    2016-06-01

    The existence, uniqueness, and stability of the inverse problem solution for a scanning differential heterodyne microscope as applied to rectangular plasmonic waveguides have been analyzed. The consideration is based on an algorithm using a trial-and-error method that we proposed previously to characterize plasmonic waveguides with a triangular profile. The error of the inverse problem (IP) solution is calculated as a function of the initial data, with allowance for their errors. Instability domains are found for the IP solution, where the solution error sharply increases. It is shown that the instability domains can be eliminated and the accuracy of the IP solution can be significantly improved in the entire range of initial data by taking initial data in the form of two phase responses of the microscope at different wavelengths.

  20. Contribution of Sample Processing to Variability and Accuracy of the Results of Pesticide Residue Analysis in Plant Commodities.

    PubMed

    Ambrus, Árpád; Buczkó, Judit; Hamow, Kamirán Á; Juhász, Viktor; Solymosné Majzik, Etelka; Szemánné Dobrik, Henriett; Szitás, Róbert

    2016-08-10

    Significant reduction of concentration of some pesticide residues and substantial increase of the uncertainty of the results derived from the homogenization of sample materials have been reported in scientific papers long ago. Nevertheless, performance of methods is frequently evaluated on the basis of only recovery tests, which exclude sample processing. We studied the effect of sample processing on accuracy and uncertainty of the measured residue values with lettuce, tomato, and maize grain samples applying mixtures of selected pesticides. The results indicate that the method is simple and robust and applicable in any pesticide residue laboratory. The analytes remaining in the final extract are influenced by their physical-chemical properties, the nature of the sample material, the temperature of comminution of sample, and the mass of test portion extracted. Consequently, validation protocols should include testing the effect of sample processing, and the performance of the complete method should be regularly checked within internal quality control. PMID:26755282

  1. Analysis of the lattice Boltzmann Bhatnagar-Gross-Krook no-slip boundary condition: Ways to improve accuracy and stability

    NASA Astrophysics Data System (ADS)

    Verschaeve, Joris C. G.

    2009-09-01

    An analytical and numerical analysis of the no-slip boundary condition at walls at rest for the lattice Boltzmann Bhatnagar-Gross-Krook method is performed. The main result of this analysis is an alternative formulation for the no-slip boundary condition at walls at rest. Numerical experiments assess the accuracy and stability of this formulation for Poiseuille and Womersley flows, flow over a backward facing step, and unsteady flow around a square cylinder. This no-slip boundary condition is compared analytically and numerically to the boundary conditions of Inamuro [Phys. Fluids 7, 2928 (1995)] and Zou and He [Phys. Fluids 9, 1591 (1997)] and it is found that all three make use of the same mechanism for the off-diagonal element of the stress tensor. Mass conservation, however, is only assured by the present one. In addition, our analysis points out which mechanism lies behind the instabilities also observed by Lätt [Phys. Rev. E 77, 056703 (2008)] for this kind of boundary conditions. We present a way to remove these instabilities, allowing one to reach relaxation frequencies considerably closer to 2.
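
    For orientation, a minimal lattice Boltzmann BGK sketch of the conventional no-slip treatment, standard halfway bounce-back for a body-force-driven Poiseuille channel, is given below. It is not the alternative formulation derived in the paper, nor the Inamuro or Zou-He conditions it is compared with, and the forcing term is the simplest first-order one:

```python
# Minimal D2Q9 lattice Boltzmann BGK sketch with standard halfway bounce-back
# walls (periodic in x, walls half a cell below row 0 and above row ny-1).
# This is the conventional baseline no-slip treatment, not the paper's method.
import numpy as np

c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])          # D2Q9 velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)                    # lattice weights
opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])                 # opposite directions

nx, ny = 32, 33
tau = 0.8                      # BGK relaxation time
nu = (tau - 0.5) / 3           # kinematic viscosity
g = 1e-6                       # body force per unit mass along +x

f = np.ones((9, ny, nx)) * w[:, None, None]   # start at rest, rho = 1

def equilibrium(rho, ux, uy):
    cu = 3 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    return w[:, None, None] * rho * (1 + cu + 0.5 * cu**2 - 1.5 * (ux**2 + uy**2))

for step in range(10000):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau                 # BGK collision
    f += 3 * w[:, None, None] * c[:, 0, None, None] * g * rho  # simple forcing
    f_post = f.copy()
    for i in range(9):                                         # streaming
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=1), c[i, 1], axis=0)
    for i in (2, 5, 6):        # bottom wall: populations entering from below
        f[i, 0, :] = f_post[opp[i], 0, :]
    for i in (4, 7, 8):        # top wall: populations entering from above
        f[i, -1, :] = f_post[opp[i], -1, :]

ux = (f * c[:, 0, None, None]).sum(axis=0) / f.sum(axis=0)
print("simulated u_max:", ux[:, 0].max(),
      "| parabolic estimate g*H^2/(8*nu):", g * ny**2 / (8 * nu))
```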

  2. Generating and using patient-specific whole-body models for organ dose estimates in CT with increased accuracy: feasibility and validation.

    PubMed

    Kalender, Willi A; Saltybaeva, Natalia; Kolditz, Daniel; Hupfer, Martin; Beister, Marcel; Schmidt, Bernhard

    2014-12-01

    The estimation of patient dose using Monte Carlo (MC) simulations based on the available patient CT images is limited to the length of the scan. Software tools for dose estimation based on standard computational phantoms overcome this problem; however, they are limited with respect to taking individual patient anatomy into account. The purpose of this study was to generate whole-body patient models in order to take scattered radiation and over-scanning effects into account. Thorax examinations were performed on three physical anthropomorphic phantoms at tube voltages of 80 kV and 120 kV; absorbed dose was measured using thermoluminescence dosimeters (TLD). Whole-body voxel models were built as a combination of the acquired CT images appended by data taken from widely used anthropomorphic voxel phantoms. MC simulations were performed both for the CT image volumes alone and for the whole-body models. Measured and calculated dose distributions were compared for each TLD chip position; additionally, organ doses were determined. MC simulations based only on CT data underestimated dose by 8%-15% on average depending on patient size with highest underestimation values of 37% for the adult phantom at the caudal border of the image volume. The use of whole-body models substantially reduced these errors; measured and simulated results consistently agreed to better than 10%. This study demonstrates that combined whole-body models can provide three-dimensional dose distributions with improved accuracy. Using the presented concept should be of high interest for research studies which demand high accuracy, e.g. for dose optimization efforts. PMID:25288527

  3. Analysis of Influence of Terrain Relief Roughness on dem Accuracy Generated from LIDAR in the Czech Republic Territory

    NASA Astrophysics Data System (ADS)

    Hubacek, M.; Kovarik, V.; Kratochvil, V.

    2016-06-01

    Digital elevation models are today a common part of geographic information systems and derived applications. The way they are created varies, depending on the extent of the area, the required accuracy, delivery time, financial resources, and the technologies available. The first model covering the whole territory of the Czech Republic was created as early as the early 1980s. Currently, the 5th DEM generation is being finished. Data collection for this model was realized using airborne laser scanning, which allowed the creation of a new-generation DEM with a precision of up to a decimetre. A model of such precision expands the possibilities of employing the DEM and also offers new opportunities for the use of elevation data, especially in the domain of modelling phenomena dependent on highly accurate data. Examples include precise modelling of hydrological phenomena, studying micro-relief objects, modelling vehicle movement, detecting and describing historical changes of a landscape, designing constructions, etc. Due to the nature of the technology used for collecting data and generating the DEM, it is assumed that the resulting model achieves lower accuracy in areas covered by vegetation and in built-up areas. Therefore the verification of model accuracy was carried out in five selected areas in Moravia. A network of check points was established using a total station in each area. To determine the reference heights of the check points, known geodetic points whose heights were defined using levelling were used. Up to several thousand points were surveyed in each area. Individual points were selected according to different configurations of relief, different surface types, and different vegetation coverage. Sets of deviations were obtained by comparing the DEM 5G heights with the reference heights, followed by verification of the tested elevation model. Results of the analysis showed that the model generally reaches higher precision than the declared one in
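
    The check-point comparison described above reduces to simple deviation statistics; a minimal sketch with placeholder heights:

```python
# Simple sketch of the accuracy check described above: height deviations at
# surveyed check points (DEM height minus reference height) summarized by
# mean error, standard deviation, and RMSE. The heights below are placeholders.
import numpy as np

reference_h = np.array([312.41, 298.77, 305.12, 310.03, 301.58])  # levelled heights (m)
dem_h       = np.array([312.47, 298.70, 305.21, 310.00, 301.49])  # DEM 5G heights (m)

dz = dem_h - reference_h
mean_err = dz.mean()
std_err = dz.std(ddof=1)
rmse = np.sqrt(np.mean(dz ** 2))
print(f"mean error {mean_err:+.3f} m, std {std_err:.3f} m, RMSE {rmse:.3f} m")
```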

  4. Diagnostic accuracy of rapid immunoassays for heparin-induced thrombocytopenia. A systematic review and meta-analysis.

    PubMed

    Sun, Lova; Gimotty, Phyllis A; Lakshmanan, Suvasini; Cuker, Adam

    2016-05-01

    The platelet factor 4/heparin ELISA has limited specificity for heparin-induced thrombocytopenia (HIT) and frequently does not provide same-day results. Rapid immunoassays (RIs) have been developed which provide results in 30 minutes or less. We conducted a systematic review and meta-analysis to evaluate the diagnostic accuracy of RIs for HIT. We searched the literature for studies in which samples from patients with suspected HIT were tested using a RI and a functional assay against which the performance of the RI could be measured. We performed sensitivity analyses of studies that directly compared different RIs with each other and with ELISAs. Estimates of sensitivity and specificity for each RI were calculated. Twenty-three articles, collectively involving six different RIs, met eligibility criteria. All RIs exhibited high sensitivity (0.96 to 1.00); there was wider variability in specificity (0.68 to 0.94). Specificity of the IgG-specific chemiluminescent assay (IgG-CA) was greater than the polyspecific chemiluminescent assay [0.94 (95 %CI 0.89-0.99) vs 0.82 (0.77-0.87)]. The particle gel immunoassay demonstrated greater specificity than the polyspecific ELISA [0.96 (0.95-0.97) vs 0.91 (0.89-0.92)]. The IgG-CA and lateral flow immunoassay [0.94 (0.91-0.97)] exhibited greater specificity than the IgG-specific ELISA [0.86 (0.82-0.90)]. Given their high sensitivity and rapid turnaround time, RIs are a reliable means of excluding HIT at the point-of-care in patients with low or intermediate clinical probability. Additionally, some RIs have greater specificity than HIT ELISAs. In summary, IgG-specific RIs appear to have improved diagnostic accuracy compared with ELISAs in patients with suspected HIT and may reduce misdiagnosis and overtreatment. PMID:26763074

  5. At risk or not at risk? A meta-analysis of the prognostic accuracy of psychometric interviews for psychosis prediction

    PubMed Central

    Fusar-Poli, Paolo; Cappucciati, Marco; Rutigliano, Grazia; Schultze-Lutter, Frauke; Bonoldi, Ilaria; Borgwardt, Stefan; Riecher-Rössler, Anita; Addington, Jean; Perkins, Diana; Woods, Scott W; McGlashan, Thomas H; Lee, Jimmy; Klosterkötter, Joachim; Yung, Alison R; McGuire, Philip

    2015-01-01

    An accurate detection of individuals at clinical high risk (CHR) for psychosis is a prerequisite for effective preventive interventions. Several psychometric interviews are available, but their prognostic accuracy is unknown. We conducted a prognostic accuracy meta-analysis of psychometric interviews used to examine referrals to high risk services. The index test was an established CHR psychometric instrument used to identify subjects with and without CHR (CHR+ and CHR−). The reference index was psychosis onset over time in both CHR+ and CHR− subjects. Data were analyzed with MIDAS (STATA13). Area under the curve (AUC), summary receiver operating characteristic curves, quality assessment, likelihood ratios, Fagan’s nomogram and probability modified plots were computed. Eleven independent studies were included, with a total of 2,519 help-seeking, predominantly adult subjects (CHR+: N=1,359; CHR−: N=1,160) referred to high risk services. The mean follow-up duration was 38 months. The AUC was excellent (0.90; 95% CI: 0.87-0.93), and comparable to other tests in preventive medicine, suggesting clinical utility in subjects referred to high risk services. Meta-regression analyses revealed an effect for exposure to antipsychotics and no effects for type of instrument, age, gender, follow-up time, sample size, quality assessment, or proportion of CHR+ subjects in the total sample. Fagan’s nomogram indicated a low positive predictive value (5.74%) in the general non-help-seeking population. Despite the clear need to further improve prediction of psychosis, these findings support the use of psychometric prognostic interviews for CHR as clinical tools for an indicated prevention in subjects seeking help at high risk services worldwide. PMID:26407788

  6. Diagnostic Accuracy of Serum CA19-9 in Patients with Cholangiocarcinoma: A Systematic Review and Meta-Analysis

    PubMed Central

    Liang, Bin; Zhong, Liansheng; He, Qun; Wang, Shaocheng; Pan, Zhongcheng; Wang, Tianjiao; Zhao, Yujie

    2015-01-01

    Background Cholangiocarcinoma (CCA) is a relatively rare cancer worldwide; however, its incidence is extremely high in Asia. Numerous studies reported that serum carbohydrate antigen 19-9 (CA19-9) plays a role in the diagnosis of CCA patients. However, published data are inconclusive. The aim of this meta-analysis was to provide a systematic review of the diagnostic performance of CA19-9 for CCA. Material/Methods We searched public databases, including PubMed, Web of Science, Embase, Chinese National Knowledge Infrastructure (CNKI), and WANFANG, for articles evaluating the diagnostic accuracy of serum CA19-9 in predicting CCA. The diagnostic sensitivity (SEN), specificity (SPE), positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and summary receiver operating characteristic curve (SROC) were pooled by Meta-DiSc 1.4 software. Results A total of 31 articles met the inclusion criteria, including 1,264 patients and 2,039 controls. The pooled SEN, SPE, PLR, NLR, and DOR were 0.72 (95% CI: 0.70–0.75), 0.84 (95% CI: 0.82–0.85), 4.93 (95% CI, 3.67–6.64), 0.35 (95% CI, 0.30–0.41), and 15.10 (95% CI, 10.70–21.32), respectively. The area under the SROC curve was 0.8300. Subgroup analyses based on control type, geographical location, and sample size revealed that the diagnostic accuracy of CA19-9 tends to be the same across control types, but sensitivity was low in European patients and in the small-sample-size group. Conclusions Serum CA19-9 is a useful non-invasive biomarker for CCA detection and may become a clinically useful tool to identify high-risk patients. PMID:26576628

  7. Increase of the fuel cell system efficiency - Modular testing, analysis and development environment

    NASA Astrophysics Data System (ADS)

    König, P.; Ivers-Tiffée, E.

    The main issues in preparing fuel cell systems for the future market are system reliability and efficiency. Apart from successful field test trials, stationary, automotive and portable fuel cell systems are generally still at the development stage. One task is to increase component and system efficiencies by simplifying the system construction or eliminating parasitic components. With newly established, effective, standardised system and component tests, linked with a flexible modelling and simulation environment, the development process and the determination of system efficiencies, as well as of otherwise inaccessible system values, can be accelerated. In this work a modular, model-aided system analysis and development environment is presented, which has been evaluated and validated at the IWE. The tool, a combination of standardised testing, modelling and simulation, has been applied to different types of fuel cell systems, demonstrating its flexibility, modularity and accuracy. In the presented case the tool was used for system analysis and for studies on increasing the efficiency of a complex prototype stationary PEMFC system.

  8. Are the Conventional Commercial Yeast Identification Methods Still Helpful in the Era of New Clinical Microbiology Diagnostics? A Meta-Analysis of Their Accuracy

    PubMed Central

    Efremov, Ljupcho; Leoncini, Emanuele; Amore, Rosarita; Posteraro, Patrizia; Ricciardi, Walter

    2015-01-01

    Accurate identification of pathogenic species is important for early appropriate patient management, but growing diversity of infectious species/strains makes the identification of clinical yeasts increasingly difficult. Among conventional methods that are commercially available, the API ID32C, AuxaColor, and Vitek 2 systems are currently the most used systems in routine clinical microbiology. We performed a systematic review and meta-analysis to estimate and to compare the accuracy of the three systems, in order to assess whether they are still of value for the species-level identification of medically relevant yeasts. After adopting rigorous selection criteria, we included 26 published studies involving Candida and non-Candida yeasts that were tested with the API ID32C (674 isolates), AuxaColor (1,740 isolates), and Vitek 2 (2,853 isolates) systems. The random-effects pooled identification ratios at the species level were 0.89 (95% confidence interval [CI], 0.80 to 0.95) for the API ID32C system, 0.89 (95% CI, 0.83 to 0.93) for the AuxaColor system, and 0.93 (95% CI, 0.89 to 0.96) for the Vitek 2 system (P for heterogeneity, 0.255). Overall, the accuracy of studies using phenotypic analysis-based comparison methods was comparable to that of studies using molecular analysis-based comparison methods. Subanalysis of studies conducted on Candida yeasts showed that the Vitek 2 system was significantly more accurate (pooled ratio, 0.94 [95% CI, 0.85 to 0.99]) than the API ID32C system (pooled ratio, 0.84 [95% CI, 0.61 to 0.99]) and the AuxaColor system (pooled ratio, 0.76 [95% CI, 0.67 to 0.84]) with respect to uncommon species (P for heterogeneity, <0.05). Subanalysis of studies conducted on non-Candida yeasts (i.e., Cryptococcus, Rhodotorula, Saccharomyces, and Trichosporon) revealed pooled identification accuracies of ≥98% for the Vitek 2, API ID32C (excluding Cryptococcus), and AuxaColor (only Rhodotorula) systems, with significant low or null levels of

  9. Are the Conventional Commercial Yeast Identification Methods Still Helpful in the Era of New Clinical Microbiology Diagnostics? A Meta-Analysis of Their Accuracy.

    PubMed

    Posteraro, Brunella; Efremov, Ljupcho; Leoncini, Emanuele; Amore, Rosarita; Posteraro, Patrizia; Ricciardi, Walter; Sanguinetti, Maurizio

    2015-08-01

    Accurate identification of pathogenic species is important for early appropriate patient management, but growing diversity of infectious species/strains makes the identification of clinical yeasts increasingly difficult. Among conventional methods that are commercially available, the API ID32C, AuxaColor, and Vitek 2 systems are currently the most used systems in routine clinical microbiology. We performed a systematic review and meta-analysis to estimate and to compare the accuracy of the three systems, in order to assess whether they are still of value for the species-level identification of medically relevant yeasts. After adopting rigorous selection criteria, we included 26 published studies involving Candida and non-Candida yeasts that were tested with the API ID32C (674 isolates), AuxaColor (1,740 isolates), and Vitek 2 (2,853 isolates) systems. The random-effects pooled identification ratios at the species level were 0.89 (95% confidence interval [CI], 0.80 to 0.95) for the API ID32C system, 0.89 (95% CI, 0.83 to 0.93) for the AuxaColor system, and 0.93 (95% CI, 0.89 to 0.96) for the Vitek 2 system (P for heterogeneity, 0.255). Overall, the accuracy of studies using phenotypic analysis-based comparison methods was comparable to that of studies using molecular analysis-based comparison methods. Subanalysis of studies conducted on Candida yeasts showed that the Vitek 2 system was significantly more accurate (pooled ratio, 0.94 [95% CI, 0.85 to 0.99]) than the API ID32C system (pooled ratio, 0.84 [95% CI, 0.61 to 0.99]) and the AuxaColor system (pooled ratio, 0.76 [95% CI, 0.67 to 0.84]) with respect to uncommon species (P for heterogeneity, <0.05). Subanalysis of studies conducted on non-Candida yeasts (i.e., Cryptococcus, Rhodotorula, Saccharomyces, and Trichosporon) revealed pooled identification accuracies of ≥98% for the Vitek 2, API ID32C (excluding Cryptococcus), and AuxaColor (only Rhodotorula) systems, with significant low or null levels of

  10. Diagnostic Accuracy of 2D-Shear Wave Elastography for Liver Fibrosis Severity: A Meta-Analysis

    PubMed Central

    Jiang, Tian’an; Tian, Guo; Zhao, Qiyu; Kong, Dexing; Cheng, Chao; Zhong, Liyun; Li, Lanjuan

    2016-01-01

    Purpose To evaluate the accuracy of shear wave elastography (SWE) in the quantitative diagnosis of liver fibrosis severity. Methods The published literature was systematically retrieved from PubMed, Embase, Web of Science and Scopus up to May 13th, 2016. From the included studies we pooled the sensitivity, specificity, positive and negative predictive values, and the diagnostic odds ratio of SWE in populations with liver fibrosis. A bivariate mixed-effects regression model was used for pooling, and heterogeneity was assessed with the I2 statistic. The quality of articles was evaluated by quality assessment of diagnostic accuracy studies (QUADAS). Results Thirteen articles including 2,303 patients qualified for inclusion. The pooled sensitivity and specificity of SWE for the diagnosis of liver fibrosis are as follows: ≥F1 0.76 (p<0.001, 95% CI, 0.71–0.81, I2 = 75.33%), 0.92 (p<0.001, 95% CI, 0.80–0.97, I2 = 79.36%); ≥F2 0.84 (p = 0.35, 95% CI, 0.81–0.86, I2 = 9.55%), 0.83 (p<0.001, 95% CI, 0.77–0.88, I2 = 86.56%); ≥F3 0.89 (p = 0.56, 95% CI, 0.86–0.92, I2 = 0%), 0.86 (p<0.001, 95% CI, 0.82–0.90, I2 = 75.73%); F4 0.89 (p = 0.24, 95% CI, 0.84–0.92, I2 = 20.56%), 0.88 (p<0.001, 95% CI, 0.84–0.92, I2 = 82.75%), respectively. Sensitivity analysis showed no significant changes if any one of the studies was excluded. Publication bias was not detected in this meta-analysis. Conclusions Our study suggests that SWE is a helpful method to appraise liver fibrosis severity. Future studies that validate these findings would be appropriate. PMID:27300569

  11. Increased throughput of proteomics analysis by multiplexing high-resolution tandem mass spectra.

    PubMed

    Ledvina, A R; Savitski, M M; Zubarev, A R; Good, D M; Coon, J J; Zubarev, R A

    2011-10-15

    High-resolution and high-accuracy Fourier transform mass spectrometry (FTMS) is becoming increasingly attractive due to its specificity. However, the speed of tandem FTMS analysis severely limits the competitive advantage of this approach relative to faster low-resolution quadrupole ion trap MS/MS instruments. Here we demonstrate an entirely FTMS-based analysis method with a 2.5-3.0-fold greater throughput than a conventional FT MS/MS approach. The method consists of accumulating together the MS/MS fragment ions from multiple precursors, with subsequent high-resolution analysis of the mixture. Following acquisition, the multiplexed spectrum is deconvoluted into individual MS/MS spectra, which are then combined into a single concatenated file and submitted for peptide identification to a search engine. The method is tested both in silico, using a database of MS/MS spectra, and in situ, using a modified LTQ Orbitrap mass spectrometer. The performance of the method in the experiment was consistent with theoretical expectations. PMID:21913643

  12. Increasing Liability Premiums in Obstetrics – Analysis, Effects and Options

    PubMed Central

    Soergel, P.; Schöffski, O.; Hillemanns, P.; Hille-Betz, U.; Kundu, S.

    2015-01-01

    Whenever people act, mistakes are made. In Germany, it is thought that a total of 40 000 cases of malpractice occur per year. In recent years, costs for liability insurance have risen significantly in almost all fields of medicine. Liability in the health care sector is founded on the contractual relationship between doctor and patient. Most recently, case law developed over many years has been codified in the Patients' Rights Act. In obstetrics, the focus of liability law is on brain damage caused by hypoxia or ischemia as a result of management errors during birth. The costs per claim are made up of various components with different shares of the total damages (increased needs, in particular therapy costs and nursing fees, loss of earnings, treatment costs, compensation). In obstetrics in particular, recent focus has been on sharply increased liability payments, accompanied by higher liability premiums. This places a considerable financial burden on hospitals as well as on midwives and attending physicians. The premiums are so high, especially for midwives and attending physicians, that professional practice becomes uneconomical in some cases. In recent years, these circumstances have also been intensely debated in the public sphere and in politics, with the focus largely on the midwifery profession. In 2014, the GKV-FQWG (Statutory Health Insurance – Quality and Further Development Act) defined a subsidy towards the occupational liability premium for midwives who attend only a few deliveries. However, to date, a complete solution to the problem has not been found. Birth will never be a fully controllable risk and in rare cases will continue to end with injury to the child. The goal must be to minimise this risk through good education and continuous training, as well as constant critical analysis of one's own activities. Furthermore, it seems sensible, especially in out-of-hospital obstetrics, to look at the current

  13. Is increasing complexity of algorithms the price for higher accuracy? virtual comparison of three algorithms for tertiary level management of chronic cough in people living with HIV in a low-income country

    PubMed Central

    2012-01-01

    Background The algorithmic approach to guidelines has been introduced and promoted on a large scale since the 1970s. This study aims at comparing the performance of three algorithms for the management of chronic cough in patients with HIV infection, and at reassessing the current position of algorithmic guidelines in clinical decision making through an analysis of accuracy, harm and complexity. Methods Data were collected at the University Hospital of Kigali (CHUK) in a total of 201 HIV-positive hospitalised patients with chronic cough. We simulated management of each patient following the three algorithms. The first was locally tailored by clinicians from CHUK; the second and third were drawn from publications by Médecins sans Frontières (MSF) and the World Health Organisation (WHO). Semantic analysis techniques known as Clinical Algorithm Nosology were used to compare them in terms of complexity and similarity. For each of them, we assessed the sensitivity, delay to diagnosis and hypothetical harm of false positives and false negatives. Results The principal diagnoses were tuberculosis (21%) and pneumocystosis (19%). Sensitivity, representing the proportion of correct diagnoses made by each algorithm, was 95.7%, 88% and 70% for CHUK, MSF and WHO, respectively. Mean time to appropriate management was 1.86 days for CHUK and 3.46 days for the MSF algorithm. The CHUK algorithm was the most complex, followed by MSF and WHO. Total harm was by far the highest for the WHO algorithm, followed by MSF and CHUK. Conclusions This study confirms our hypothesis that sensitivity and patient safety (i.e. less expected harm) are proportional to the complexity of algorithms, though increased complexity may make them difficult to use in practice. PMID:22260242

  14. Increasing the Accuracy of Volume and ADC Delineation for Heterogeneous Tumor on Diffusion-Weighted MRI: Correlation with PET/CT

    SciTech Connect

    Gong, Nan-Jie; Wong, Chun-Sing; Chu, Yiu-Ching; Guo, Hua; Huang, Bingsheng; Chan, Queenie

    2013-10-01

    Purpose: To improve the accuracy of volume and apparent diffusion coefficient (ADC) measurements in diffusion-weighted magnetic resonance imaging (MRI), we proposed a method based on thresholding both the b0 images and the ADC maps. Methods and Materials: In 21 heterogeneous lesions from patients with metastatic gastrointestinal stromal tumors (GIST), gross lesions were manually contoured, and the corresponding volumes and ADCs were denoted as gross tumor volume (GTV) and gross ADC (ADC_g), respectively. Using a k-means clustering algorithm, the probable high-cellularity tumor tissues were selected based on b0 images and ADC maps. The ADC and volume of the tissues selected using the proposed method were denoted as thresholded ADC (ADC_thr) and high-cellularity tumor volume (HCTV), respectively. The metabolic tumor volume (MTV) in positron emission tomography (PET)/computed tomography (CT) was measured using 40% of the maximum standard uptake value (SUV_max) as the lower threshold, and the corresponding mean SUV (SUV_mean) was also measured. Results: HCTV had excellent concordance with MTV according to Pearson's correlation (r=0.984, P<.001) and linear regression (slope = 1.085, intercept = −4.731). In contrast, GTV overestimated the volume and differed significantly from MTV (P=.005). ADC_thr correlated significantly and strongly with SUV_mean (r=−0.807, P<.001) and SUV_max (r=−0.843, P<.001); both correlations were stronger than those of ADC_g. Conclusions: The proposed lesion-adaptive semiautomatic method can help segment high-cellularity tissues that match hypermetabolic tissues in PET/CT and enables more accurate volume and ADC delineation on diffusion-weighted MR images of GIST.
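
    A minimal sketch of the k-means thresholding idea: cluster lesion voxels on their (b0, ADC) values and keep the cluster with the lowest mean ADC as the putative high-cellularity tissue. The two-cluster setting, the z-scoring and the function name are assumptions for illustration, not the authors' exact implementation.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def high_cellularity_mask(b0, adc, lesion_mask, n_clusters=2, seed=0):
        """Cluster lesion voxels on (b0, ADC) and keep the lowest-ADC cluster."""
        voxels = np.column_stack([b0[lesion_mask], adc[lesion_mask]])
        voxels = (voxels - voxels.mean(axis=0)) / voxels.std(axis=0)   # z-score features
        labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(voxels)
        # High-cellularity tissue restricts diffusion, i.e. has the lowest mean ADC.
        adc_in_lesion = adc[lesion_mask]
        target = min(range(n_clusters), key=lambda k: adc_in_lesion[labels == k].mean())
        mask = np.zeros_like(lesion_mask, dtype=bool)
        mask[lesion_mask] = labels == target
        return mask
    ```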

  15. SU-F-BRF-14: Increasing the Accuracy of Dose Calculation On Cone-Beam Imaging Using Deformable Image Registration in the Case of Prostate Translation

    SciTech Connect

    Fillion, O; Gingras, L; Archambault, L

    2014-06-15

    Purpose: Artifacts can reduce the quality of dose re-calculations on CBCT scans during a treatment. The aim of this project is to correct the CBCT images in order to allow for more accurate dose calculations in the case of a translation of the tumor in prostate cancer. Methods: Our approach is to develop strategies based on deformable image registration algorithms using the elastix software (Klein et al., 2010) to register the treatment planning CT on a daily CBCT scan taken during treatment. Sets of images are provided by a 3D deformable phantom and comprise two CT and two CBCT scans: one of each with the reference anatomy and the others with known deformations (i.e. translations of the prostate). The reference CT is registered onto the deformed CBCT, and the deformed CT serves as the control for dose calculation accuracy. The planned treatment used for the evaluation of dose calculation is a 2-Gy fraction prescribed at the location of the reference prostate and assigned to 7 rectangular fields. Results: For a realistic 0.5-cm translation of the prostate, the relative dose discrepancy between the CBCT and the CT control scan at the prostate's centroid is 8.9 ± 0.8%, while the dose discrepancy between the registered CT and the control scan decreases to −2.4 ± 0.8%. For a 2-cm translation, clinical indices such as the V90 and the D100 are more accurate by 0.7 ± 0.3% and 8.0 ± 0.5 cGy, respectively, when using the registered CT rather than the CBCT for dose calculation. Conclusion: The results show that this strategy gives doses in agreement within a few percent with those from calculations on actual CT scans. In the future, various deformations of the phantom anatomy will allow a thorough characterization of the registration strategies needed for more complex anatomies.
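
    The study drives elastix through its parameter-file interface; as a stand-in, the sketch below shows an analogous deformable (B-spline) CT-to-CBCT registration with SimpleITK, followed by resampling of the planning CT onto the CBCT grid for dose recalculation. The file paths, mesh size, metric and iteration count are illustrative assumptions, not the authors' settings.

    ```python
    import SimpleITK as sitk

    def register_ct_to_cbct(ct_path, cbct_path):
        """Minimal B-spline registration sketch, analogous in spirit to the
        elastix-based workflow described above; parameters are illustrative."""
        fixed = sitk.ReadImage(cbct_path, sitk.sitkFloat32)   # daily CBCT
        moving = sitk.ReadImage(ct_path, sitk.sitkFloat32)    # planning CT

        tx = sitk.BSplineTransformInitializer(fixed, [8, 8, 8])
        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsLBFGSB(numberOfIterations=100)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(tx, inPlace=True)
        final_tx = reg.Execute(fixed, moving)

        # Resample the planning CT onto the CBCT grid (default value: air, -1000 HU).
        return sitk.Resample(moving, fixed, final_tx, sitk.sitkLinear, -1000.0)
    ```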

  16. Diagnostic accuracy of noninvasive markers of liver fibrosis in patients with psoriasis taking methotrexate: a systematic review and meta-analysis.

    PubMed

    Maybury, C M; Samarasekera, E; Douiri, A; Barker, J N; Smith, C H

    2014-06-01

    People with psoriasis taking methotrexate may be at increased risk of developing liver fibrosis compared with the general population. Noninvasive methods of detecting fibrosis have been widely adopted, but their clinical utility is uncertain. To evaluate the diagnostic accuracy of noninvasive methods to detect fibrosis compared with liver biopsy (reference standard) in people with psoriasis taking methotrexate. A systematic search using Ovid/Medline, Embase, Cumulative Index to Nursing and Allied Health Literature, the Cochrane Library and Clinical Trials Register was performed. Diagnostic cohort or case-control studies of adults taking or being considered for methotrexate therapy were eligible. Study quality was evaluated using the Quality Assessment tool for Diagnostic Accuracy Studies (QUADAS-2). Pooled data analysis was performed using RevMan 5.1. Bayesian meta-analysis was conducted using Markov chain Monte Carlo simulation. Seventeen studies were included. Sensitivity and specificity were 38% and 83% for standard liver function tests (LFTs), 74% and 77% for procollagen-3 N-terminal peptide (P3NP), 60% and 80% for Fibroscan(®) (Echosens, France, www.echosens.com), and 55% and 49% for ultrasound. Confidence in these results is limited owing to low-quality data; old, small studies displayed significant selection bias and significant variation in the prevalence of fibrosis. No studies were identified evaluating recently developed markers. The clinical utility of LFTs, P3NP and liver ultrasound is poor. Therefore, if these tests are used in isolation, a significant proportion of patients with liver fibrosis may remain unidentified. Larger prospective studies are required in this population to validate newer non-invasive methods. PMID:24588075

  17. Diagnostic test accuracy of D-dimer for acute aortic syndrome: systematic review and meta-analysis of 22 studies with 5000 subjects

    PubMed Central

    Watanabe, Hiroki; Horita, Nobuyuki; Shibata, Yuji; Minegishi, Shintaro; Ota, Erika; Kaneko, Takeshi

    2016-01-01

    Diagnostic test accuracy of D-dimer for acute aortic dissection (AAD) has not been evaluated by meta-analysis with the bivariate model methodology. Four databases were electronically searched. We included both case-control and cohort studies that could provide sufficient data concerning both sensitivity and specificity of D-dimer for AAD. Non-English language articles and conference abstracts were allowed. Intramural hematoma and penetrating aortic ulcer were regarded as AAD. Based on 22 eligible articles consisting of 1140 AAD subjects and 3860 non-AAD subjects, the diagnostic odds ratio was 28.5 (95% CI 17.6–46.3, I2 = 17.4%) and the area under the curve was 0.946 (95% CI 0.903–0.994). Based on 833 AAD subjects and 1994 non-AAD subjects constituting 12 studies that used the cutoff value of 500 ng/ml, the sensitivity was 0.952 (95% CI 0.901–0.978), the specificity was 0.604 (95% CI 0.485–0.712), positive likelihood ratio was 2.4 (95% CI 1.8–3.3), and negative likelihood ratio was 0.079 (95% CI 0.036–0.172). Sensitivity analysis using data from three high-quality studies almost replicated these results. In conclusion, D-dimer has very good overall accuracy. D-dimer <500 ng/ml largely decreases the possibility of AAD. D-dimer >500 ng/ml moderately increases the possibility of AAD. PMID:27230962
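
    The review pools sensitivity and specificity with a bivariate random-effects model; as a simplified sketch of the pooling idea, the code below applies univariate DerSimonian-Laird pooling to logit-transformed sensitivities. The per-study counts are invented for illustration.

    ```python
    import numpy as np

    def pooled_sensitivity(tp, fn):
        """Univariate DerSimonian-Laird pooling of logit sensitivities
        (a simplified stand-in for the bivariate model used in the review)."""
        tp, fn = np.asarray(tp, float), np.asarray(fn, float)
        sens = tp / (tp + fn)
        y = np.log(sens / (1 - sens))            # logit sensitivity per study
        v = 1.0 / tp + 1.0 / fn                  # approximate variance of the logit
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)
        tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (v + tau2)                # random-effects weights
        pooled_logit = np.sum(w_star * y) / np.sum(w_star)
        return 1.0 / (1.0 + np.exp(-pooled_logit))

    # Hypothetical per-study true-positive and false-negative counts:
    print(pooled_sensitivity(tp=[60, 45, 80], fn=[4, 2, 5]))
    ```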

  18. Finite element analysis of transonic flows in cascades: Importance of computational grids in improving accuracy and convergence

    NASA Technical Reports Server (NTRS)

    Ecer, A.; Akay, H. U.

    1981-01-01

    The finite element method is applied for the solution of transonic potential flows through a cascade of airfoils. Convergence characteristics of the solution scheme are discussed. Accuracy of the numerical solutions is investigated for various flow regions in the transonic flow configuration. The design of an efficient finite element computational grid is discussed for improving accuracy and convergence.

  19. Diagnostic accuracy of serum biomarkers for head and neck cancer: A systematic review and meta-analysis.

    PubMed

    Guerra, Eliete Neves Silva; Rêgo, Daniela Fortunato; Elias, Silvia Taveira; Coletta, Ricardo D; Mezzomo, Luis André Mendonça; Gozal, David; De Luca Canto, Graziela

    2016-05-01

    Serum biomarkers could be helpful to characterize head and neck squamous cell carcinoma (HNSCC). Thus, the purpose of this systematic review and meta-analysis was to determine the diagnostic capability of serum biomarkers in the assessment of HNSCC patients. Studies were gathered by searching LILACS, PubMed, Science Direct, Scopus and Web of Science up to April 10th, 2015. Studies that focused on serum biomarkers in the diagnosis of HNSCC compared with controls were considered. Sixty-five studies were identified, and the sample size included 9098 subjects. Combined biomarkers demonstrated better accuracy than those tested individually. Overall, 12.8% of single and 34.3% of combined biomarkers discriminated patients with HNSCC from controls. The combined biomarkers with the best diagnostic capability included epidermal growth factor receptor (EGFR)+Cyclin D1 and squamous cell cancer-associated antigen (SCCA)+EGFR+Cyclin D1. Beta2-microglobulin may also be a promising single biomarker for future studies. Serum biomarkers can be potentially useful in the diagnosis of HNSCC. However, further research is required to validate these biomarkers. PMID:26971993

  20. Use of multivariate analysis to improve the accuracy of radionuclide angiography with stress in detecting coronary artery disease in men

    SciTech Connect

    Greenberg, P.S.; Bible, M.; Ellestad, M.H.; Berge, R.; Johnson, K.; Hayes, M.

    1983-01-01

    A multivariate analysis (MVA) system was derived retrospectively from a population of 76 males with coronary artery disease and 18 control subjects. Posterior probabilities were then derived from such a system prospectively in a new male population of 11 subjects with normal coronary arteries and hemodynamics and 63 patients with coronary artery disease. The sensitivity was 84% compared to 71% for the change in ejection fraction (ΔEF) ≥5 criterion (p<0.01), the specificity was 91% compared to 73% for the ΔEF ≥5 criterion (p>0.05), and the correct classification rate was 85% compared to 72% for the ΔEF ≥5 criterion (p<0.01). The significant variables were: change in EF with exercise, percent maximal heart rate, change in end-diastolic volume (ΔEDV) with exercise, change in R wave, and exercise duration. Application of the multivariate approach to radionuclide imaging with stress, including both exercise and nuclear parameters, significantly improved the diagnostic accuracy of the test and allowed for a probability statement concerning the likelihood of disease.
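
    The 1983 system computed posterior disease probabilities from a retrospectively derived multivariate discriminant. A present-day analogue is logistic regression on the same exercise and nuclear variables; the sketch below uses entirely synthetic data and is not the authors' model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data: columns follow the variables named in the abstract
    # (change in EF, % maximal heart rate, change in EDV, R-wave change, exercise
    # duration); all values and labels are synthetic, for illustration only.
    X = np.array([[ 8, 95, -5,  0.2, 10],
                  [-2, 80, 15, -0.5,  5],
                  [ 6, 90, -2,  0.1,  9],
                  [-4, 75, 20, -0.8,  4],
                  [ 5, 88,  0,  0.0,  8],
                  [-1, 70, 10, -0.3,  6]])
    y = np.array([0, 1, 0, 1, 0, 1])   # 1 = coronary artery disease

    model = LogisticRegression().fit(X, y)
    # Posterior probability of disease for a new (hypothetical) patient:
    print(model.predict_proba([[0, 82, 8, -0.2, 7]])[0, 1])
    ```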

  1. A comparative analysis of the accuracy of implant master casts fabricated from two different transfer impression techniques

    PubMed Central

    Patil, Rupali; Kadam, Pankaj; Oswal, Chetan; Patil, Seema; Jajoo, Shweta; Gachake, Arati

    2016-01-01

    Aim: This study evaluated and compared open- and closed-tray impression techniques, with stock and custom trays, in terms of their dimensional accuracy in reproducing implant positions on working casts. Materials and Methods: A master model was designed to simulate a clinical situation. Impressions were made using four techniques: (1) stock open tray (SOT); (2) stock closed tray (SCT); (3) custom open tray (COT); and (4) custom closed tray (CCT). Reference points on the hexagonal silhouette of the implant on the master model and on the analogs of the obtained master casts were compared after using the four impression techniques. Measurements were made using an optical microscope capable of recording at 50× magnification. The means and standard deviations of all the groups and subgroups were calculated and statistically analyzed using analysis of variance (ANOVA) and Tukey's test. Results: The open tray impressions showed significantly less variation from the master model, and all the techniques studied were comparable. Conclusion: All the techniques studied showed some distortion. COT showed the most accurate results of all the techniques. PMID:27114954
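
    The statistics named above are a one-way ANOVA followed by Tukey's post hoc test across the four tray groups. A sketch with invented deviation values (not the study's measurements) could look like this:

    ```python
    import numpy as np
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical positional deviations (mm) per impression technique, illustrative only.
    sot = [0.09, 0.11, 0.10, 0.08, 0.12]
    sct = [0.15, 0.14, 0.16, 0.13, 0.17]
    cot = [0.05, 0.06, 0.04, 0.07, 0.05]
    cct = [0.12, 0.13, 0.11, 0.14, 0.12]

    print(stats.f_oneway(sot, sct, cot, cct))      # one-way ANOVA across the four groups

    values = np.concatenate([sot, sct, cot, cct])
    groups = ["SOT"] * 5 + ["SCT"] * 5 + ["COT"] * 5 + ["CCT"] * 5
    print(pairwise_tukeyhsd(values, groups))       # Tukey's HSD pairwise comparisons
    ```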

  2. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5nm, it becomes crucial to include also systematic error contributions which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to metrology inaccuracy ~10nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: Imaging overlay and DBO (1st order diffraction based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of measurement quality metric, results in optimal overlay accuracy.

  3. GALA: group analysis leads to accuracy, a novel approach for solving the inverse problem in exploratory analysis of group MEG recordings.

    PubMed

    Kozunov, Vladimir V; Ossadtchi, Alexei

    2015-01-01

    Although MEG/EEG signals are highly variable between subjects, they allow characterizing systematic changes of cortical activity in both space and time. Traditionally a two-step procedure is used. The first step is a transition from sensor to source space by means of solving an ill-posed inverse problem for each subject individually. The second is mapping of cortical regions consistently active across subjects. In practice the first step often leads to a set of active cortical regions whose location and timecourses display a great amount of interindividual variability, hindering the subsequent group analysis. We propose Group Analysis Leads to Accuracy (GALA), a solution that combines the two steps into one. GALA takes advantage of individual variations of cortical geometry and sensor locations. It exploits the ensuing variability in the electromagnetic forward model as a source of additional information. We assume that for different subjects functionally identical cortical regions are located in close proximity and partially overlap, and that their timecourses are correlated. This relaxed similarity constraint on the inverse solution can be expressed within a probabilistic framework, allowing for an iterative algorithm solving the inverse problem jointly for all subjects. A systematic simulation study showed that GALA, as compared with the standard min-norm approach, improves accuracy of true activity recovery, when accuracy is assessed both in terms of spatial proximity of the estimated and true activations and correct specification of the spatial extent of the activated regions. This improvement, obtained without using any noise normalization techniques for either solution, was preserved for a wide range of between-subject variations in both spatial and temporal features of regional activation. The corresponding activation timecourses exhibit significantly higher similarity across subjects. Similar results were obtained for a real MEG dataset of face-specific evoked responses

  4. GALA: group analysis leads to accuracy, a novel approach for solving the inverse problem in exploratory analysis of group MEG recordings

    PubMed Central

    Kozunov, Vladimir V.; Ossadtchi, Alexei

    2015-01-01

    Although MEG/EEG signals are highly variable between subjects, they allow characterizing systematic changes of cortical activity in both space and time. Traditionally a two-step procedure is used. The first step is a transition from sensor to source space by means of solving an ill-posed inverse problem for each subject individually. The second is mapping of cortical regions consistently active across subjects. In practice the first step often leads to a set of active cortical regions whose location and timecourses display a great amount of interindividual variability, hindering the subsequent group analysis. We propose Group Analysis Leads to Accuracy (GALA), a solution that combines the two steps into one. GALA takes advantage of individual variations of cortical geometry and sensor locations. It exploits the ensuing variability in the electromagnetic forward model as a source of additional information. We assume that for different subjects functionally identical cortical regions are located in close proximity and partially overlap, and that their timecourses are correlated. This relaxed similarity constraint on the inverse solution can be expressed within a probabilistic framework, allowing for an iterative algorithm solving the inverse problem jointly for all subjects. A systematic simulation study showed that GALA, as compared with the standard min-norm approach, improves accuracy of true activity recovery, when accuracy is assessed both in terms of spatial proximity of the estimated and true activations and correct specification of the spatial extent of the activated regions. This improvement, obtained without using any noise normalization techniques for either solution, was preserved for a wide range of between-subject variations in both spatial and temporal features of regional activation. The corresponding activation timecourses exhibit significantly higher similarity across subjects. Similar results were obtained for a real MEG dataset of face-specific evoked responses

  5. Accuracy of methods for detecting an irregular pulse and suspected atrial fibrillation: A systematic review and meta-analysis

    PubMed Central

    Coleman, Tim; Lewis, Sarah; Heneghan, Carl; Jones, Matthew

    2015-01-01

    Background Pulse palpation has been recommended as the first step of screening to detect atrial fibrillation. We aimed to determine and compare the accuracy of different methods for detecting pulse irregularities caused by atrial fibrillation. Methods We systematically searched MEDLINE, EMBASE, CINAHL and LILACS until 16 March 2015. Two reviewers identified eligible studies, extracted data and appraised quality using the QUADAS-2 instrument. Meta-analysis, using the bivariate hierarchical random effects method, determined average operating points for sensitivities, specificities, positive and negative likelihood ratios (PLR, NLR); we constructed summary receiver operating characteristic plots. Results Twenty-one studies investigated 39 interventions (n = 15,129 pulse assessments) for detecting atrial fibrillation. Compared to 12-lead electrocardiography (ECG) diagnosed atrial fibrillation, blood pressure monitors (BPMs; seven interventions) and non-12-lead ECGs (20 interventions) had the greatest accuracy for detecting pulse irregularities attributable to atrial fibrillation (BPM: sensitivity 0.98 (95% confidence interval (CI) 0.92–1.00), specificity 0.92 (95% CI 0.88–0.95), PLR 12.1 (95% CI 8.2–17.8) and NLR 0.02 (95% CI 0.00–0.09); non-12-lead ECG: sensitivity 0.91 (95% CI 0.86–0.94), specificity 0.95 (95% CI 0.92–0.97), PLR 20.1 (95% CI 12–33.7), NLR 0.09 (95% CI 0.06–0.14)). There were similar findings for smartphone applications (six interventions) although these studies were small in size. The sensitivity and specificity of pulse palpation (six interventions) were 0.92 (95% CI 0.85–0.96) and 0.82 (95% CI 0.76–0.88), respectively (PLR 5.2 (95% CI 3.8–7.2), NLR 0.1 (95% CI 0.05–0.18)). Conclusions BPMs and non-12-lead ECG were most accurate for detecting pulse irregularities caused by atrial fibrillation; other technologies may therefore be pragmatic alternatives to pulse palpation for the first step of atrial fibrillation screening

  6. The Accuracy of Conformation of a Generic Surface Mesh for the Analysis of Facial Soft Tissue Changes

    PubMed Central

    Cheung, Man Yan; Almukhtar, Anas; Keeling, Andrew; Hsung, Tai-Chiu; Ju, Xiangyang; McDonald, James; Ayoub, Ashraf; Khambay, Balvinder Singh

    2016-01-01

    Purpose Three-dimensional analysis of the face is required for the assessment of complex changes following surgery or pathological conditions and to monitor facial growth. The most suitable method may be “dense surface correspondence”. Materials and Methods This method utilizes a generic facial mesh and a “conformation process” to establish anatomical correspondences between two facial images. The aim of this study was to validate the use of conformed meshes to measure simulated maxillary and mandibular surgical movements. The “simulation” was performed by deforming the actual soft tissues of the participant during image acquisition. The study was conducted on 20 volunteers and used 77 facial landmarks pre-marked over six anatomical regions: left cheek, right cheek, left upper lip, philtrum, right upper lip and chin region. Each volunteer was imaged at rest and after performing 5 different simulated surgical procedures using 3D stereophotogrammetry. The simulated surgical movement was determined by measuring the Euclidean distances and the mean absolute x, y and z distances of the landmarks making up the six regions following digitization. A generic mesh was then conformed to each of the aligned six facial 3D images. The same six regions were selected on the aligned conformed simulated meshes, and the surgical movement was determined from the Euclidean distances and the mean absolute x, y and z distances of the mesh points making up the six regions. Results In all cases the mean Euclidean distance between the simulated movement and the conformed region was less than 0.7mm. For the x, y and z directions the majority of differences in the mean absolute distances were less than 1.0mm, except in the x-direction for the left and right cheek regions, which was above 2.0mm. Conclusions The conformation process has an acceptable level of accuracy and is a valid method of measuring facial change between two images i.e. pre- and
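
    The error metrics used here, mean Euclidean distance and mean absolute x, y and z differences between corresponding points, reduce to a few lines of array arithmetic. The sketch below assumes the landmark or mesh-point coordinates are already aligned in a common frame; the toy values are illustrative.

    ```python
    import numpy as np

    def regional_error(landmarks_a, landmarks_b):
        """Mean Euclidean distance and mean absolute x/y/z differences between
        two (N, 3) arrays of corresponding landmarks or mesh points."""
        a, b = np.asarray(landmarks_a, float), np.asarray(landmarks_b, float)
        euclidean = np.linalg.norm(a - b, axis=1).mean()
        abs_xyz = np.abs(a - b).mean(axis=0)
        return euclidean, abs_xyz

    # Toy example with three corresponding points (coordinates in mm, illustrative):
    a = [[0, 0, 0], [10, 0, 0], [0, 10, 0]]
    b = [[0.3, 0.1, -0.2], [10.4, 0.2, 0.1], [0.1, 10.5, -0.3]]
    print(regional_error(a, b))
    ```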

  7. Estimating subsurface water volumes and transit times in Hokkaido river catchments, Japan, using high-accuracy tritium analysis

    NASA Astrophysics Data System (ADS)

    Gusyev, Maksym; Yamazaki, Yusuke; Morgenstern, Uwe; Stewart, Mike; Kashiwaya, Kazuhisa; Hirai, Yasuyuki; Kuribayashi, Daisuke; Sawano, Hisaya

    2015-04-01

    The goal of this study is to estimate subsurface water transit times and volumes in headwater catchments of Hokkaido, Japan, using the New Zealand high-accuracy tritium analysis technique. Transit time provides insights into the subsurface water storage and therefore provides a robust and quick approach to quantifying the subsurface groundwater volume. Our method is based on tritium measurements in river water. Tritium is a component of meteoric water, decays with a half-life of 12.32 years, and is inert in the subsurface after the water enters the groundwater system. Therefore, tritium is ideally suited for characterization of the catchment's responses and can provide information on mean water transit times up to 200 years. Only in recent years has it become possible to use tritium for dating of stream and river water, due to the fading impact of the bomb-tritium from thermo-nuclear weapons testing, and due to improved measurement accuracy for the extremely low natural tritium concentrations. Transit time of the water discharge is one of the most crucial parameters for understanding the response of catchments and estimating subsurface water volume. While many tritium transit time studies have been conducted in New Zealand, only a limited number of tritium studies have been conducted in Japan. In addition, the meteorological, orographic and geological conditions of Hokkaido Island are similar to those in parts of New Zealand, allowing for comparison between these regions. In 2014, three field trips were conducted in Hokkaido in June, July and October to sample river water at river gauging stations operated by the Ministry of Land, Infrastructure, Transport and Tourism (MLIT). These stations have altitudes between 36 m and 860 m MSL and drainage areas between 45 and 377 km2. Each sampled point is located upstream of MLIT dams, with hourly measurements of precipitation and river water levels enabling us to distinguish between the snow melt and baseflow contributions
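
    The dating principle rests on radioactive decay of tritium with a 12.32-year half-life. The sketch below shows only the simplest piston-flow reading of that decay law; in practice transit times are inferred by fitting lumped-parameter models to the historical tritium input, and the concentrations used here are invented.

    ```python
    import math

    HALF_LIFE_YEARS = 12.32                       # tritium half-life
    DECAY_CONST = math.log(2) / HALF_LIFE_YEARS   # decay constant, per year

    def piston_flow_age(c_river, c_recharge):
        """Apparent transit time (years) if decay were the only process:
        c_river = c_recharge * exp(-lambda * t). A deliberate simplification;
        real studies fit lumped-parameter models to the tritium input history."""
        return math.log(c_recharge / c_river) / DECAY_CONST

    # Illustrative tritium concentrations in tritium units (TU), not measured values:
    print(piston_flow_age(c_river=1.2, c_recharge=2.0))   # roughly 9 years
    ```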

  8. X-ray Microscopy as an Approach to Increasing Accuracy and Efficiency of Serial Block-face Imaging for Correlated Light and Electron Microscopy of Biological Specimens

    PubMed Central

    Bushong, Eric A.; Johnson, Donald D.; Kim, Keun-Young; Terada, Masako; Hatori, Megumi; Peltier, Steven T.; Panda, Satchidananda; Merkle, Arno; Ellisman, Mark H.

    2015-01-01

    The recently developed three-dimensional electron microscopic (EM) method of serial block-face scanning electron microscopy (SBEM) has rapidly established itself as a powerful imaging approach. Volume EM imaging with this scanning electron microscopy (SEM) method requires intense staining of biological specimens with heavy metals to allow sufficient back-scatter electron signal and also to render specimens sufficiently conductive to control charging artifacts. These more extreme heavy metal staining protocols render specimens light opaque and make it much more difficult to track and identify regions of interest (ROIs) for the SBEM imaging process than for a typical thin section transmission electron microscopy correlative light and electron microscopy study. We present a strategy employing X-ray microscopy (XRM) both for tracking ROIs and for increasing the efficiency of the workflow used for typical projects undertaken with SBEM. XRM was found to reveal an impressive level of detail in tissue heavily stained for SBEM imaging, allowing for the identification of tissue landmarks that can be subsequently used to guide data collection in the SEM. Furthermore, specific labeling of individual cells using diaminobenzidine is detectable in XRM volumes. We demonstrate that tungsten carbide particles or upconverting nanophosphor particles can be used as fiducial markers to further increase the precision and efficiency of SBEM imaging. PMID:25392009

  9. X-ray microscopy as an approach to increasing accuracy and efficiency of serial block-face imaging for correlated light and electron microscopy of biological specimens.

    PubMed

    Bushong, Eric A; Johnson, Donald D; Kim, Keun-Young; Terada, Masako; Hatori, Megumi; Peltier, Steven T; Panda, Satchidananda; Merkle, Arno; Ellisman, Mark H

    2015-02-01

    The recently developed three-dimensional electron microscopic (EM) method of serial block-face scanning electron microscopy (SBEM) has rapidly established itself as a powerful imaging approach. Volume EM imaging with this scanning electron microscopy (SEM) method requires intense staining of biological specimens with heavy metals to allow sufficient back-scatter electron signal and also to render specimens sufficiently conductive to control charging artifacts. These more extreme heavy metal staining protocols render specimens light opaque and make it much more difficult to track and identify regions of interest (ROIs) for the SBEM imaging process than for a typical thin section transmission electron microscopy correlative light and electron microscopy study. We present a strategy employing X-ray microscopy (XRM) both for tracking ROIs and for increasing the efficiency of the workflow used for typical projects undertaken with SBEM. XRM was found to reveal an impressive level of detail in tissue heavily stained for SBEM imaging, allowing for the identification of tissue landmarks that can be subsequently used to guide data collection in the SEM. Furthermore, specific labeling of individual cells using diaminobenzidine is detectable in XRM volumes. We demonstrate that tungsten carbide particles or upconverting nanophosphor particles can be used as fiducial markers to further increase the precision and efficiency of SBEM imaging. PMID:25392009

  10. Development of response models for the Earth Radiation Budget Experiment (ERBE) sensors. Part 3: ERBE scanner measurement accuracy analysis due to reduced housekeeping data

    NASA Technical Reports Server (NTRS)

    Choi, Sang H.; Chrisman, Dan A., Jr.; Halyo, Nesim

    1987-01-01

    The accuracy of scanner measurements was evaluated when the sampling frequency of sensor housekeeping (HK) data was reduced from once every scan to once every eight scans. The resulting increase in uncertainty was greatest for sources with rapid or extreme temperature changes. This analysis focused on the mirror attenuator mosaic (MAM) baffle and plate and the scanner radiometer baffle because of their relatively large temperature changes during solar calibrations. Since only solar simulator data were available, the solar temperatures of these components were approximated, as were the radiative and thermal gradients in the MAM baffle due to reflected sunlight. Of the two cases considered for the MAM plate and baffle temperatures, one uses temperatures obtained from the ground calibration; the other uses temperatures computed from the MAM baffle model. This analysis shows that the heat input variations due largely to the solar radiance and irradiance during a scan cycle are small. It also demonstrates that reasonable intervals longer than the current HK data acquisition interval should not significantly affect the estimation of a radiation field in the sensor field-of-view.

  11. A Comparative Analysis of Diagnostic Accuracy of Focused Assessment With Sonography for Trauma Performed by Emergency Medicine and Radiology Residents

    PubMed Central

    Zamani, Majid; Masoumi, Babak; Esmailian, Mehrdad; Habibi, Amin; Khazaei, Mehdi; Mohammadi Esfahani, Mohammad

    2015-01-01

    Background: Focused assessment with sonography in trauma (FAST) is a method for prompt detection of abdominal free fluid in patients with abdominal trauma. Objectives: This study was conducted to compare the diagnostic accuracy of FAST performed by emergency medicine residents (EMRs) and radiology residents (RRs) in detecting peritoneal free fluid. Patients and Methods: Patients triaged in the emergency department with blunt abdominal trauma, high energy trauma, and multiple traumas underwent a FAST examination by EMRs and RRs with the same techniques to obtain the standard views. Ultrasound findings for free fluid in the peritoneal cavity for each patient (positive/negative) were compared with the results of computed tomography, operative exploration, or observation as the final outcome. Results: A total of 138 patients were included in the final analysis. Good diagnostic agreement was noted between the results of FAST scans performed by EMRs and RRs (κ = 0.701, P < 0.001), also between the results of EMR-performed FAST and the final outcome (κ = 0.830, P < 0.001), and finally between the results of RR-performed FAST and the final outcome (κ = 0.795, P < 0.001). No significant differences were noted between EMR- and RR-performed FASTs regarding sensitivity (84.6% vs 84.6%), specificity (98.4% vs 97.6%), positive predictive value (84.6% vs 84.6%), and negative predictive value (98.4% vs 98.4%). Conclusions: Trained EMRs, like their fellow RRs, have the ability to perform FAST scans with high diagnostic value in patients with blunt abdominal trauma. PMID:26756009
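
    Agreement between paired readers and against the final outcome is quantified here with Cohen's kappa plus the usual indices derived from a 2x2 table. A sketch with invented paired readings (not the study's data):

    ```python
    from sklearn.metrics import cohen_kappa_score, confusion_matrix

    # Hypothetical paired FAST readings (1 = free fluid present), illustrative only.
    emr_fast = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0]
    final    = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 0]

    print("kappa:", cohen_kappa_score(emr_fast, final))

    tn, fp, fn, tp = confusion_matrix(final, emr_fast).ravel()
    print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
    ```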

  12. A meta-analysis of the diagnostic accuracy of dengue virus-specific IgA antibody-based tests for detection of dengue infection.

    PubMed

    Alagarasu, K; Walimbe, A M; Jadhav, S M; Deoshatwar, A R

    2016-03-01

    Immunoglobulin A (IgA)-based tests have been evaluated in different studies for their utility in diagnosing dengue infections. In most of the studies, the results were inconclusive because of a small sample size. Hence, a meta-analysis involving nine studies with 2096 samples was performed to assess the diagnostic accuracy of IgA-based tests in diagnosing dengue infections. The analysis was conducted using Meta-Disc software. The results revealed that IgA-based tests had an overall sensitivity, specificity, diagnostic odds ratio, and positive and negative likelihood ratios of 73.9%, 95.2%, 66.7, 22.0 and 0.25, respectively. Significant heterogeneity was observed between the studies. The type of test, infection status and day of sample collection influenced the diagnostic accuracy. The IgA-based diagnostic tests showed greater accuracy when the samples were collected 4 days after onset of symptoms and for secondary infections. The results suggest that IgA-based tests have a moderate level of accuracy and are diagnostic of the disease. However, negative results cannot be used alone for dengue diagnosis. More prospective studies comparing the diagnostic accuracy of combinations of antigen-based tests with either IgA or IgM are needed and might be useful for suggesting the best strategy for dengue diagnosis. PMID:26289218

  13. Effectiveness of Preanalytic Practices on Contamination and Diagnostic Accuracy of Urine Cultures: a Laboratory Medicine Best Practices Systematic Review and Meta-analysis

    PubMed Central

    Franek, Jacob; Leibach, Elizabeth K.; Weissfeld, Alice S.; Kraft, Colleen S.; Sautter, Robert L.; Baselski, Vickie; Rodahl, Debra; Peterson, Edward J.; Cornish, Nancy E.

    2015-01-01

    SUMMARY Background. Urinary tract infection (UTI) in the United States is the most common bacterial infection, and urine cultures often make up the largest portion of workload for a hospital-based microbiology laboratory. Appropriately managing the factors affecting the preanalytic phase of urine culture contributes significantly to the generation of meaningful culture results that ultimately affect patient diagnosis and management. Urine culture contamination can be reduced with proper techniques for urine collection, preservation, storage, and transport, the major factors affecting the preanalytic phase of urine culture. Objectives. The purposes of this review were to identify and evaluate preanalytic practices associated with urine specimens and to assess their impact on the accuracy of urine culture microbiology. Specific practices included collection methods for men, women, and children; preservation of urine samples in boric acid solutions; and the effect of refrigeration on stored urine. Practice efficacy and effectiveness were measured by two parameters: reduction of urine culture contamination and increased accuracy of patient diagnosis. The CDC Laboratory Medicine Best Practices (LMBP) initiative's systematic review method for assessment of quality improvement (QI) practices was employed. Results were then translated into evidence-based practice guidelines. Search strategy. A search of three electronic bibliographic databases (PubMed, SCOPUS, and CINAHL), as well as hand searching of bibliographies from relevant information sources, for English-language articles published between 1965 and 2014 was conducted. Selection criteria. The search contained the following medical subject headings and key text words: urinary tract infections, UTI, urine/analysis, urine/microbiology, urinalysis, specimen handling, preservation, biological, preservation, boric acid, boric acid/borate, refrigeration, storage, time factors, transportation, transport time, time delay

  14. Response Time Analysis in Cognitive Tasks with Increasing Difficulty

    ERIC Educational Resources Information Center

    Dodonov, Yury S.; Dodonova, Yulia A.

    2012-01-01

    In the present study, speeded tasks with differing assumed difficulties of the trials are regarded as a special class of simple cognitive tasks. Exploratory latent growth modeling with data-driven shape of a growth curve and nonlinear structured latent curve modeling with predetermined monotonically increasing functions were used to analyze…

  15. Pricing for scarcity? An efficiency analysis of increasing block tariffs

    NASA Astrophysics Data System (ADS)

    Monteiro, Henrique; Roseta-Palma, Catarina

    2011-06-01

    Water pricing schedules often contain significant nonlinearities, such as the increasing block tariff (IBT) structure that is abundantly applied for residential users. The IBT is frequently supported as a good tool for achieving the goals of equity, water conservation, and revenue neutrality but seldom has been grounded on efficiency justifications. In particular, existing literature on water pricing establishes that although efficient schedules will depend on demand and supply characteristics, IBT cannot usually be recommended. In this paper, we consider whether the explicit inclusion of scarcity considerations can strengthen the appeal of IBT. Results show that when both demand and costs react to climate factors, increasing marginal prices may come about as a response to a combination of water scarcity and customer heterogeneity. We derive testable conditions and then illustrate their application through an estimation of Portuguese residential water demand. We show that the recommended tariff schedule hinges crucially on the choice of functional form for demand.

  16. Interoceptive accuracy and panic.

    PubMed

    Zoellner, L A; Craske, M G

    1999-12-01

    Psychophysiological models of panic hypothesize that panickers focus attention on and become anxious about the physical sensations associated with panic. Attention to internal somatic cues has been labeled interoception. The present study examined the roles of physiological arousal and subjective anxiety in interoceptive accuracy. Infrequent panickers and nonanxious participants completed an initial baseline assessment of overall interoceptive accuracy. Next, participants ingested caffeine, about which they received either safety or no safety information. Using a mental heartbeat tracking paradigm, participants' counts of their heartbeats during specific time intervals were scored against polygraph measures. Infrequent panickers were more accurate in the perception of their heartbeats than nonanxious participants. Changes in physiological arousal were not associated with increased accuracy on the heartbeat perception task. However, higher levels of self-reported anxiety were associated with superior performance. PMID:10596462
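
    Mental heartbeat-tracking studies commonly score interoceptive accuracy as the mean of 1 - |recorded - counted| / recorded across counting intervals; whether this exact index was used here is not stated, so the sketch below is illustrative only.

    ```python
    def heartbeat_perception_score(recorded, counted):
        """Mean of 1 - |recorded - counted| / recorded across counting intervals;
        1.0 means perfect interoceptive accuracy (a commonly used index, assumed here)."""
        return sum(1 - abs(r - c) / r for r, c in zip(recorded, counted)) / len(recorded)

    # Illustrative interval data (polygraph-recorded vs. silently counted beats):
    print(heartbeat_perception_score(recorded=[35, 45, 60], counted=[31, 44, 52]))
    ```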

  17. Performance-Based Cognitive Screening Instruments: An Extended Analysis of the Time versus Accuracy Trade-off

    PubMed Central

    Larner, Andrew J.

    2015-01-01

    Early and accurate diagnosis of dementia is key to appropriate treatment and management. Clinical assessment, including the use of cognitive screening instruments, remains integral to the diagnostic process. Many cognitive screening instruments have been described, varying in length and hence administration time, but it is not known whether longer tests offer greater diagnostic accuracy than shorter tests. Data from several pragmatic diagnostic test accuracy studies examining various cognitive screening instruments in a secondary care setting were analysed to correlate measures of test diagnostic accuracy and test duration, building on the findings of a preliminary study. High correlations which were statistically significant were found between one measure of diagnostic accuracy, area under the receiver operating characteristic curve, and surrogate measures of test duration, namely total test score and total number of test items/questions. Longer cognitive screening instruments may offer greater accuracy for the diagnosis of dementia, an observation which has possible implications for the optimal organisation of dedicated cognitive disorders clinics. PMID:26854168

  18. Spontaneous Subarachnoid Hemorrhage: A Systematic Review and Meta-Analysis Describing the Diagnostic Accuracy of History, Physical Exam, Imaging, and Lumbar Puncture with an Exploration of Test Thresholds

    PubMed Central

    Carpenter, Christopher R.; Hussain, Adnan M.; Ward, Michael J.; Zipfel, Gregory J.; Fowler, Susan; Pines, Jesse M.; Sivilotti, Marco L.A.

    2016-01-01

    Background Spontaneous subarachnoid hemorrhage (SAH) is a rare but serious etiology of headache. The diagnosis of SAH is especially challenging in alert, neurologically intact patients, as missed or delayed diagnosis can be catastrophic. Objectives To perform a diagnostic accuracy systematic review and meta-analysis of history, physical examination, cerebrospinal fluid (CSF) tests, computed tomography (CT), and clinical decision rules for spontaneous SAH. A secondary objective was to delineate probability of disease thresholds for imaging and lumbar puncture (LP). Methods PUBMED, EMBASE, SCOPUS, and research meeting abstracts were searched up to June 2015 for studies of emergency department (ED) patients with acute headache clinically concerning for spontaneous SAH. QUADAS-2 was used to assess study quality and, when appropriate, meta-analysis was conducted using random effects models. Outcomes were sensitivity, specificity, positive (LR+) and negative (LR−) likelihood ratios. To identify test and treatment thresholds, we employed the Pauker-Kassirer method with Bernstein test-indication curves using the summary estimates of diagnostic accuracy. Results A total of 5,022 publications were identified, of which 122 underwent full-text review; 22 studies were included (average SAH prevalence 7.5%). Diagnostic studies differed in assessment of history and physical exam findings, CT technology, analytical techniques used to identify xanthochromia, and criterion standards for SAH. Study quality by QUADAS-2 was variable; however, most had a relatively low risk of bias. A history of neck pain (LR+ 4.1 [95% CI 2.2-7.6]) and neck stiffness on physical exam (LR+ 6.6 [4.0-11.0]) were the individual findings most strongly associated with SAH. Combinations of findings may rule out SAH, yet promising clinical decision rules await external validation. Non-contrast cranial CT within 6 hours of headache onset accurately ruled in (LR+ 230 [6-8700]) and ruled out SAH (LR− 0

  19. Increasing the performance of tritium analysis by electrolytic enrichment.

    PubMed

    Groning, M; Auer, R; Brummer, D; Jaklitsch, M; Sambandam, C; Tanweer, A; Tatzber, H

    2009-06-01

    Several improvements are described for the existing tritium enrichment system at the Isotope Hydrology Laboratory of the International Atomic Energy Agency for processing natural water samples. The improvements include a simple method for pretreatment of electrolytic cells to ensure a high tritium separation factor, an improved design of the exhaust system for explosive gases, and a vacuum distillation line for faster initial preparation of water samples for electrolytic enrichment and for tritium analysis. Achievements included the reduction of the variation of individual enrichment parameters across all cells to less than 1% and a 50% improvement in the stability of the background mean. This resulted in an improved detection limit of less than 0.4 TU (at 2s), important for future application of tritium measurements at low concentration levels, and in measurement precisions of ±0.2 TU and ±0.15 TU for liquid scintillation counting and for gas proportional counting, respectively. PMID:20183225

  20. The influence of multivariate analysis methods and target grain size on the accuracy of remote quantitative chemical analysis of rocks using laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Anderson, Ryan B.; Morris, Richard V.; Clegg, Samuel M.; Bell, James F.; Wiens, Roger C.; Humphries, Seth D.; Mertzman, Stanley A.; Graff, Trevor G.; McInroy, Rhonda

    2011-10-01

    by the ratio of laser beam diameter (˜490 μm) to grain size, with coarse-grained rocks often resulting in lower accuracy and precision than analyses of fine-grained rocks and powders. The number of analysis spots that were normally required to produce a chemical analysis within one standard deviation of the true bulk composition ranged from ˜10 for fine-grained rocks to >20 for some coarse-grained rocks.

  1. Stability and accuracy analysis of some fully-discrete algorithms for the one-dimensional second-order wave equation

    NASA Technical Reports Server (NTRS)

    Hughes, T. J. R.; Tezduyar, T. E.

    1984-01-01

    The present investigation is concerned with some basic results for a predictor-multicorrector algorithm applied to the one-dimensional wave equation, giving particular attention to so-called 2-pass explicit schemes in which both lumped and coupled mass matrices are employed. In an assessment of the accuracy and stability properties of the algorithms, use is made of the one-dimensional, second-order wave equation. The maximum stable time step of the lumped right-hand-side mass, 2-pass explicit algorithm is twice that of the 1-pass explicit algorithm. Improved accuracy is obtained by employing a higher-order, or consistent, right-hand-side mass.

  2. Diagnostic accuracy of refractometer and Brix refractometer to assess failure of passive transfer in calves: protocol for a systematic review and meta-analysis.

    PubMed

    Buczinski, S; Fecteau, G; Chigerwe, M; Vandeweerd, J M

    2016-06-01

    Calves are highly dependent on colostrum (and antibody) intake because they are born agammaglobulinemic. The transfer of passive immunity in calves can be assessed directly by measuring immunoglobulin G (IgG) or by refractometry or Brix refractometry; the latter are easier to perform routinely in the field. This paper presents a protocol for a systematic review and meta-analysis to assess the diagnostic accuracy of refractometry or Brix refractometry versus IgG measurement as the reference standard test. With this review protocol we aim to report refractometer and Brix refractometer accuracy in terms of sensitivity and specificity, and to quantify the impact of study characteristics on test accuracy. PMID:27427188

  3. Diagnostic accuracy of ultrasonography, MRI and MR arthrography in the characterisation of rotator cuff disorders: a systematic review and meta-analysis

    PubMed Central

    Roy, Jean-Sébastien; Braën, Caroline; Leblond, Jean; Desmeules, François; Dionne, Clermont E; MacDermid, Joy C; Bureau, Nathalie J; Frémont, Pierre

    2015-01-01

    Background Different diagnostic imaging modalities, such as ultrasonography (US), MRI, MR arthrography (MRA) are commonly used for the characterisation of rotator cuff (RC) disorders. Since the most recent systematic reviews on medical imaging, multiple diagnostic studies have been published, most using more advanced technological characteristics. The first objective was to perform a meta-analysis on the diagnostic accuracy of medical imaging for characterisation of RC disorders. Since US is used at the point of care in environments such as sports medicine, a secondary analysis assessed accuracy by radiologists and non-radiologists. Methods A systematic search in three databases was conducted. Two raters performed data extraction and evaluation of risk of bias independently, and agreement was achieved by consensus. Hierarchical summary receiver-operating characteristic package was used to calculate pooled estimates of included diagnostic studies. Results Diagnostic accuracy of US, MRI and MRA in the characterisation of full-thickness RC tears was high with overall estimates of sensitivity and specificity over 0.90. As for partial RC tears and tendinopathy, overall estimates of specificity were also high (>0.90), while sensitivity was lower (0.67–0.83). Diagnostic accuracy of US was similar whether a trained radiologist, sonographer or orthopaedist performed it. Conclusions Our results show the diagnostic accuracy of US, MRI and MRA in the characterisation of full-thickness RC tears. Since full thickness tear constitutes a key consideration for surgical repair, this is an important characteristic when selecting an imaging modality for RC disorder. When considering accuracy, cost, and safety, US is the best option. PMID:25677796

  4. Development of Automated Image Analysis Tools for Verification of Radiotherapy Field Accuracy with AN Electronic Portal Imaging Device.

    NASA Astrophysics Data System (ADS)

    Dong, Lei

    1995-01-01

    The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field shaping devices with the patient must be repeated daily up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), patients' portal images can be visualized daily in real-time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were
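
    The cross-correlation step described above can be illustrated generically: the location of the peak of the cross-correlation between a reference image and a displaced image gives the translation between them. The sketch below is a standard FFT-based implementation of that idea on synthetic data, not the author's code; sub-pixel refinement and rotation handling are omitted.

      import numpy as np

      def estimate_shift(reference, image):
          """Integer (row, col) translation of `image` relative to `reference`,
          taken from the peak of their FFT-based circular cross-correlation."""
          xcorr = np.fft.ifft2(np.conj(np.fft.fft2(reference)) * np.fft.fft2(image)).real
          peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
          # Peaks beyond the half-size correspond to negative shifts.
          return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

      # Synthetic check: shift a random "anatomy" pattern by (3, -5) pixels.
      rng = np.random.default_rng(0)
      reference = rng.random((128, 128))
      shifted = np.roll(reference, shift=(3, -5), axis=(0, 1))
      print(estimate_shift(reference, shifted))  # expected (3, -5)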

  5. Differential uncertainty analysis for evaluating the accuracy of S-parameter retrieval methods for electromagnetic properties of metamaterial slabs.

    PubMed

    Hasar, Ugur Cem; Barroso, Joaquim J; Sabah, Cumali; Kaya, Yunus; Ertugrul, Mehmet

    2012-12-17

    We apply a complete uncertainty analysis, not studied in the literature, to investigate the dependences of retrieved electromagnetic properties of two MM slabs (the first one with only split-ring resonators (SRRs) and the second with SRRs and a continuous wire) with single-band and dual-band resonating properties on the measured/simulated scattering parameters, the slab length, and the operating frequency. Such an analysis is necessary for the selection of a suitable retrieval method together with the correct examination of exotic properties of MM slabs especially in their resonance regions. For this analysis, a differential uncertainty model is developed to monitor minute changes in the dependent variables (electromagnetic properties of MM slabs) as functions of the independent variables (scattering (S-) parameters, the slab length, and the operating frequency). Two complementary approaches (the analytical approach and the dispersion model approach) each with different strengths are utilized to retrieve the electromagnetic properties of various MM slabs, which are needed for the application of the uncertainty analysis. We note the following important results from our investigation. First, uncertainties in the retrieved electromagnetic properties of the analyzed MM slabs drastically increase when values of electromagnetic properties shrink to zero or near resonance regions where S-parameters exhibit rapid changes. Second, any low or medium loss inside the MM slabs due to an imperfect dielectric substrate or a finite conductivity of metals can decrease these uncertainties near resonance regions because these losses hinder abrupt changes in S-parameters. Finally, we note that precise knowledge of the slab length and the operating frequency in particular is a prerequisite for accurate analysis of exotic electromagnetic properties of MM slabs (especially multiband MM slabs) near resonance regions. PMID:23263141
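
    The differential uncertainty model described here is, in essence, first-order propagation of the uncertainties in the independent variables (S-parameters, slab length, operating frequency) into the dependent quantity. The sketch below shows that machinery with SymPy for a purely illustrative placeholder function; it is not the retrieval formula used by the authors, and the numerical values and uncertainties are invented for the example.

      import sympy as sp

      # Independent variables: a transmission S-parameter magnitude, slab length, frequency.
      s21, L, f = sp.symbols('s21 L f', positive=True)

      # Placeholder dependent quantity (NOT the paper's retrieval formula), used only
      # to demonstrate first-order (differential) uncertainty propagation.
      g = sp.log(1 / s21) / (L * f)

      variables = [s21, L, f]
      uncerts = list(sp.symbols('u_s21 u_L u_f', positive=True))
      # u_g^2 = sum_i (dg/dx_i)^2 * u_{x_i}^2
      u_g = sp.sqrt(sum((sp.diff(g, x) * u) ** 2 for x, u in zip(variables, uncerts)))

      u_g_num = sp.lambdify(variables + uncerts, u_g, 'math')
      # Example evaluation with invented values and invented uncertainties.
      print(u_g_num(0.5, 0.01, 10e9, 0.01, 1e-4, 1e6))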

  6. Systematic review of discharge coding accuracy

    PubMed Central

    Burns, E.M.; Rigby, E.; Mamidanna, R.; Bottle, A.; Aylin, P.; Ziprin, P.; Faiz, O.D.

    2012-01-01

    Introduction Routinely collected data sets are increasingly used for research, financial reimbursement and health service planning. High quality data are necessary for reliable analysis. This study aims to assess the published accuracy of routinely collected data sets in Great Britain. Methods Systematic searches of the EMBASE, PUBMED, OVID and Cochrane databases were performed from 1989 to present using defined search terms. Included studies were those that compared routinely collected data sets with case or operative note review and those that compared routinely collected data with clinical registries. Results Thirty-two studies were included. Twenty-five studies compared routinely collected data with case or operation notes. Seven studies compared routinely collected data with clinical registries. The overall median accuracy (routinely collected data sets versus case notes) was 83.2% (IQR: 67.3–92.1%). The median diagnostic accuracy was 80.3% (IQR: 63.3–94.1%) with a median procedure accuracy of 84.2% (IQR: 68.7–88.7%). There was considerable variation in accuracy rates between studies (50.5–97.8%). Since the 2002 introduction of Payment by Results, accuracy has improved in some respects, for example primary diagnoses accuracy has improved from 73.8% (IQR: 59.3–92.1%) to 96.0% (IQR: 89.3–96.3), P= 0.020. Conclusion Accuracy rates are improving. Current levels of reported accuracy suggest that routinely collected data are sufficiently robust to support their use for research and managerial decision-making. PMID:21795302

  7. Visuospatial ability, accuracy of size estimation, and bulimic disturbance in a noneating-disordered college sample: a neuropsychological analysis.

    PubMed

    Thompson, J K; Spana, R E

    1991-08-01

    The relationship between visuospatial ability and the accuracy of size estimation was assessed in 69 normal college females. In general, correlations indicated small associations between visuospatial defects and size overestimation and little relationship between visuospatial ability and level of bulimic disturbance. Implications for research on the size overestimation of body image are addressed. PMID:1945715

  8. How Can We Evaluate the Accuracy of Small Stream Maps? -Focusing on Sampling Method and Statistical Analysis -

    NASA Astrophysics Data System (ADS)

    Park, J.

    2010-12-01

    The Washington State Department of Natural Resources’ (DNR) Forest Practices Habitat Conservation Plan (FPHCP) requires establishment of riparian management zones (RMZs) or equipment limitation zones (ELZs). In order to establish RMZs and ELZs, the DNR is required to update GIS-based stream maps showing the locations of type Ns (Non-fish seasonal) streams as well as type S (Shorelines of the state), type F (Fish habitat), and type Np (Non-fish perennial) streams. While there are few disputes over the positional accuracy of large streams, the representation of small streams such as Ns and small type S or F streams (less than 10’ width) is considered to need further improvement in positional accuracy. Numerous remotely sensed stream-mapping methods have been developed in the last several decades that use an array of remote sensing data such as aerial photography, satellite optical imagery, and Digital Elevation Model (DEM) topographic data. While the positional accuracy of the final stream map products has been considered essential to determine the map quality, the estimation or comparison of the positional accuracy of small stream map products has not been well studied, and is rarely attempted by remotely sensed stream map developers. Assessments of the positional accuracy of stream maps are rarely performed properly because it is not easy to acquire field reference data, especially for small streams under the canopy located in remote forest areas. More importantly, as of this writing, we are not aware of any prominent method to estimate or compare the positional accuracy of stream maps. Since general positional accuracy assessment methods for remotely sensed map products are designed for at least two-dimensional features, they are not suitable for linear features such as streams. Due to the difficulties inherent in stream features, estimation methods for stream maps' accuracy have not dealt with the positional accuracy itself but the hydrological

  9. A systematic review and meta-analysis of diagnostic accuracy of serum 1,3-β-D-glucan for invasive fungal infection: Focus on cutoff levels.

    PubMed

    He, Song; Hang, Ju-Ping; Zhang, Ling; Wang, Fang; Zhang, De-Chun; Gong, Fang-Hong

    2015-08-01

    To assess the diagnostic accuracy of 1,3-β-D-glucan (BDG) assay for diagnosing invasive fungal infections (IFI), we searched the Medline and Embase databases, and studies reporting the performance of BDG assays for the diagnosis of IFI were identified. Our analysis was mainly focused on the cutoff level. Meta-analysis was performed using conventional meta-analytical pooling and bivariate analysis. Our meta-analysis covered 28 individual studies, in which 896 out of 4214 patients were identified as IFI positive. The pooled sensitivity, specificity, diagnostic odds ratio, and area under the summary receiver operating characteristic (AUC-SROC) curve were 0.78 [95% confidence interval (CI), 0.75-0.81], 0.81 (95% CI, 0.80-0.83), 21.88 (95% CI, 12.62-37.93), and 0.8855, respectively. Subgroup analyses indicated that in cohort studies, the cutoff value of BDG at 80 pg/mL had the best diagnostic accuracy, whereas in case-control studies the cutoff value of 20 pg/mL had the best diagnostic accuracy; moreover, the AUC-SROC in cohort studies was lower than that in case-control studies. The cutoff value of 60 pg/mL has the best diagnostic accuracy with the European Organization for Research and Treatment of Cancer/Mycoses Study Group criteria as a reference standard. The 60 pg/mL cutoff value has the best diagnostic accuracy with the Fungitell assay compared to the BDG detection assay. The cutoff value of 20 pg/mL has the best diagnostic accuracy with the Fungitec G-test assay, and the cutoff value of 11 pg/mL has the best diagnostic accuracy with the Wako assay. Serum BDG detection is highly accurate for diagnosing IFIs. As such, 60 pg/mL of BDG level can be used as the best cutoff value to distinguish patients with IFIs from patients without IFI (mainly due to Candida and Aspergillus). PMID:25081986

  10. Landsat classification accuracy assessment procedures

    USGS Publications Warehouse

    Mead, R. R.; Szajgin, John

    1982-01-01

    A working conference was held in Sioux Falls, South Dakota, 12-14 November, 1980 dealing with Landsat classification Accuracy Assessment Procedures. Thirteen formal presentations were made on three general topics: (1) sampling procedures, (2) statistical analysis techniques, and (3) examples of projects which included accuracy assessment and the associated costs, logistical problems, and value of the accuracy data to the remote sensing specialist and the resource manager. Nearly twenty conference attendees participated in two discussion sessions addressing various issues associated with accuracy assessment. This paper presents an account of the accomplishments of the conference.

  11. The Effect of Study Design Biases on the Diagnostic Accuracy of Magnetic Resonance Imaging to Detect Silicone Breast Implant Ruptures: A Meta-Analysis

    PubMed Central

    Song, Jae W.; Kim, Hyungjin Myra; Bellfi, Lillian T.; Chung, Kevin C.

    2010-01-01

    Background All silicone breast implant recipients are recommended by the US Food and Drug Administration to undergo serial screening to detect implant rupture with magnetic resonance imaging (MRI). We performed a systematic review of the literature to assess the quality of diagnostic accuracy studies utilizing MRI or ultrasound to detect silicone breast implant rupture and conducted a meta-analysis to examine the effect of study design biases on the estimation of MRI diagnostic accuracy measures. Method Studies investigating the diagnostic accuracy of MRI and ultrasound in evaluating ruptured silicone breast implants were identified using MEDLINE, EMBASE, ISI Web of Science, and Cochrane library databases. Two reviewers independently screened potential studies for inclusion and extracted data. Study design biases were assessed using the QUADAS tool and the STARD checklist. Meta-analyses estimated the influence of biases on diagnostic odds ratios. Results Among 1175 identified articles, 21 met the inclusion criteria. Most studies using MRI (n = 10 of 16) and ultrasound (n = 10 of 13) examined symptomatic subjects. Meta-analyses revealed that MRI studies evaluating symptomatic subjects had 14-fold higher diagnostic accuracy estimates compared to studies using an asymptomatic sample (RDOR 13.8; 95% CI 1.83–104.6) and 2-fold higher diagnostic accuracy estimates compared to studies using a screening sample (RDOR 1.89; 95% CI 0.05–75.7). Conclusion Many of the published studies utilizing MRI or ultrasound to detect silicone breast implant rupture are flawed with methodological biases. These methodological shortcomings may result in overestimated MRI diagnostic accuracy measures and should be interpreted with caution when applying the data to a screening population. PMID:21364405

  12. Sensitivity and accuracy analysis of CT image in PRISM autocontouring using confusion matrix and ROC/AUC curve methods

    NASA Astrophysics Data System (ADS)

    Yunisa, Regina; Haryanto, Freddy

    2015-09-01

    The research was conducted to evaluate and analyze the results of CT image autocontouring in the Prism TPS using confusion matrix and ROC methods. The study began by processing thoracic CT images in grayscale format with the Prism TPS software. Autocontouring was performed in the spinal cord and right lung regions with appropriate window parameter settings. The average sensitivity, specificity and accuracy for 23 slices of the spinal cord were 0.93, 0.99 and 0.99; for two slices of the right lung they were 0.99, 0.99 and 0.99. These values are classified as `Excellent'.
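
    For a voxel-wise comparison of an autocontour against a reference contour, sensitivity, specificity, and accuracy follow directly from the confusion matrix. A minimal sketch of that calculation on synthetic binary masks (not the study's CT data) is given below.

      import numpy as np

      def confusion_metrics(auto_mask, ref_mask):
          """Sensitivity, specificity and accuracy of a binary autocontour
          compared voxel by voxel with a binary reference contour."""
          auto, ref = auto_mask.astype(bool), ref_mask.astype(bool)
          tp = np.sum(auto & ref)
          tn = np.sum(~auto & ~ref)
          fp = np.sum(auto & ~ref)
          fn = np.sum(~auto & ref)
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          accuracy = (tp + tn) / (tp + tn + fp + fn)
          return sensitivity, specificity, accuracy

      # Synthetic example: a reference disk versus a slightly shifted autocontour.
      yy, xx = np.mgrid[0:128, 0:128]
      reference = (yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2
      autocontour = (yy - 66) ** 2 + (xx - 64) ** 2 < 20 ** 2
      print(confusion_metrics(autocontour, reference))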

  13. Analysis of the accuracy of a neural algorithm for defect depth estimation using PCA processing from active thermography data

    NASA Astrophysics Data System (ADS)

    Dudzik, S.

    2013-01-01

    In the paper, a neural algorithm that uses active thermography for defect depth estimation is presented. Simulations of the algorithm were performed for three datasets representing different phases of the heat transfer process developing in the test sample. The influence of the emissivity error of the test sample surface on the accuracy of defect depth estimation is discussed. The investigations were performed for a test sample made of a material with low thermal diffusivity.

  14. Accuracy of the ABC/2 score for intracerebral hemorrhage: Systematic review and analysis of MISTIE, CLEAR-IVH, CLEAR III

    PubMed Central

    Webb, Alastair JS; Ullman, Natalie L; Morgan, Tim C; Muschelli, John; Kornbluth, Joshua; Awad, Issam A; Mayo, Stephen; Rosenblum, Michael; Ziai, Wendy; Zuccarrello, Mario; Aldrich, Francois; John, Sayona; Harnof, Sagi; Lopez, George; Broaddus, William C; Wijman, Christine; Vespa, Paul; Bullock, Ross; Haines, Stephen J; Cruz-Flores, Salvador; Tuhrim, Stan; Hill, Michael D; Narayan, Raj; Hanley, Daniel F

    2015-01-01

    Background and Purpose The ABC/2 score estimates intracerebral hemorrhage (ICH) volume, yet validations have been limited by small samples and inappropriate outcome measures. We determined accuracy of the ABC/2 score calculated at a specialized Reading Center (RC-ABC) or local site (site-ABC) versus the reference-standard CT-based planimetry (CTP). Methods In MISTIE-II, CLEAR-IVH and CLEAR-III trials, ICH volume was prospectively calculated by CTP, RC-ABC and site-ABC. Agreement between CTP and ABC/2 was defined as an absolute difference up to 5ml and relative difference within 20%. Determinants of ABC/2 accuracy were assessed by logistic regression. Results In 4369 scans from 507 patients, CTP was more strongly correlated with RC-ABC (r2=0.93) than site-ABC (r2=0.87). Although RC-ABC overestimated CTP-based volume on average (RC-ABC=15.2cm3, CTP=12.7cm3), agreement was reasonable when categorised into mild, moderate and severe ICH (kappa 0.75, p<0.001). This was consistent with overestimation of ICH volume in 6/8 previous studies. Agreement with CTP was greater for RC-ABC (84% within 5ml; 48% of scans within 20%) than for site-ABC (81% within 5ml; 41% within 20%). RC-ABC had moderate accuracy for detecting ≥ 5ml change in CTP volume between consecutive scans (sensitivity 0.76, specificity 0.86) and was more accurate with smaller ICH, thalamic haemorrhage and homogeneous clots. Conclusions ABC/2 scores at local or central sites are sufficiently accurate to categorise ICH volume and assess eligibility for the CLEAR III and MISTIE III studies, and moderately accurate for change in ICH volume. However, accuracy decreases with large, irregular or lobar clots. Clinical Trial Registration MISTIE-II NCT00224770; CLEAR-III NCT00784134; www.clinicaltrials.gov PMID:26243227
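
    The ABC/2 score itself is an ellipsoid approximation: the product of the largest haemorrhage diameter (A), the diameter perpendicular to it (B), and the vertical extent (C), divided by two. The sketch below implements that formula and the two agreement criteria used above (absolute difference up to 5 ml, relative difference within 20% of planimetry); the example measurements are invented.

      def abc2_volume(a_cm, b_cm, c_cm):
          """Ellipsoid (ABC/2) estimate of haemorrhage volume in cm^3."""
          return a_cm * b_cm * c_cm / 2.0

      def agreement_with_planimetry(abc2_cm3, planimetry_cm3,
                                    abs_tol_ml=5.0, rel_tol=0.20):
          """The two agreement criteria used above: absolute difference up to
          5 ml, and relative difference within 20% of the planimetric volume."""
          diff = abs(abc2_cm3 - planimetry_cm3)
          return diff <= abs_tol_ml, diff <= rel_tol * planimetry_cm3

      # Invented diameters (cm) and an invented planimetric reference volume (cm^3).
      estimate = abc2_volume(4.1, 3.0, 2.5)                 # ~15.4 cm^3
      print(estimate, agreement_with_planimetry(estimate, 12.7))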

  15. Algorithms for the reconstruction of the singular wave front of laser radiation: analysis and improvement of accuracy

    SciTech Connect

    Aksenov, V P; Kanev, F Yu; Izmailov, I V; Starikov, F A

    2008-07-31

    The possibility of reconstructing a singular wave front of laser beams by the local tilts of the wave front measured with a Hartmann sensor is considered. The accuracy of the reconstruction algorithm described by Fried is estimated and its modification is proposed, which allows one to improve the reliability of the phase reconstruction. Based on the Fried algorithm and its modification, a combined algorithm is constructed whose advantages are demonstrated in numerical experiments. (control of laser radiation parameters)

  16. Comparison of the Accuracy of Two Conventional Phenotypic Methods and Two MALDI-TOF MS Systems with That of DNA Sequencing Analysis for Correctly Identifying Clinically Encountered Yeasts

    PubMed Central

    Chao, Qiao-Ting; Lee, Tai-Fen; Teng, Shih-Hua; Peng, Li-Yun; Chen, Ping-Hung; Teng, Lee-Jene; Hsueh, Po-Ren

    2014-01-01

    We assessed the accuracy of species-level identification of two commercially available matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) systems (Bruker Biotyper and Vitek MS) and two conventional phenotypic methods (Phoenix 100 YBC and Vitek 2 Yeast ID) with that of rDNA gene sequencing analysis among 200 clinical isolates of commonly encountered yeasts. The correct identification rates of the 200 yeast isolates to species or complex (Candida parapsilosis complex, C. guilliermondii complex and C. rugosa complex) levels by the Bruker Biotyper, Vitek MS (using in vitro devices [IVD] database), Phoenix 100 YBC and Vitek 2 Yeast ID (Sabouraud's dextrose agar) systems were 92.5%, 79.5%, 89%, and 74%, respectively. An additional 72 isolates of C. parapsilosis complex and 18 from the above 200 isolates (30 in each of C. parapsilosis, C. metapsilosis, and C. orthopsilosis) were also evaluated separately. The Bruker Biotyper system could accurately identify all C. parapsilosis complex isolates to species level. Using the Vitek 2 MS (IVD) system, all C. parapsilosis but none of the C. metapsilosis or C. orthopsilosis isolates could be accurately identified. Among the 89 yeasts misidentified by the Vitek 2 MS (IVD) system, 39 (43.8%), including 27 C. orthopsilosis isolates, could be correctly identified using the research-use-only Vitek MS Plus SARAMIS database. This resulted in an increase in the rate of correct identification of all yeast isolates (87.5%) by Vitek 2 MS. The two species in C. guilliermondii complex (C. guilliermondii and C. fermentati) isolates were correctly identified by cluster analysis of spectra generated by the Bruker Biotyper system. Based on the results obtained in the current study, MALDI-TOF MS systems present a promising alternative for the routine identification of yeast species, including clinically commonly and rarely encountered yeast species and several species belonging to C. parapsilosis complex, C. guilliermondii complex

  17. UV-visible microscope spectrophotometric polarization and dichroism with increased discrimination power in forensic analysis

    NASA Astrophysics Data System (ADS)

    Purcell, Dale Kevin

    merit investigated included: 1) wavelength accuracy, 2) wavelength precision, 3) wavelength resolution stability, 4) photometric accuracy, 5) photometric precision, 6) photometric linearity, 7) photometric noise, and 8) short-term baseline stability. In addition, intrinsic instrument polarization effects were investigated to determine the impact of these properties on spectral interpretation and data quality. Finally, a set of recommendations were developed which describe instrument performance characteristics for microscope and spectrometer features and functions, and specific instrument parameters that must be controlled in order to acquire high quality data from an ultraviolet-visible forensic microscope spectrophotometer system for increased discrimination power.

  18. Accuracy analysis of a multi-view stereo approach for phenotyping of tomato plants at the organ level.

    PubMed

    Rose, Johann Christian; Paulus, Stefan; Kuhlmann, Heiner

    2015-01-01

    Accessing a plant's 3D geometry has become of significant importance for phenotyping during the last few years. Close-up laser scanning is an established method to acquire 3D plant shapes in real time with high detail, but it is stationary and has high investment costs. 3D reconstruction from images using structure from motion (SfM) and multi-view stereo (MVS) is a flexible cost-effective method, but requires post-processing procedures. The aim of this study is to evaluate the potential measuring accuracy of an SfM- and MVS-based photogrammetric method for the task of organ-level plant phenotyping. For this, reference data are provided by a high-accuracy close-up laser scanner. Using both methods, point clouds of several tomato plants were reconstructed on six consecutive days. The parameters leaf area, main stem height and convex hull of the complete plant were extracted from the 3D point clouds and compared to the reference data regarding accuracy and correlation. These parameters were chosen regarding the demands of current phenotyping scenarios. The study shows that the photogrammetric approach is highly suitable for the presented monitoring scenario, yielding high correlations to the reference measurements. This cost-effective 3D reconstruction method represents an alternative to an expensive laser scanner in the studied scenarios, with potential for automated procedures. PMID:25919368

  19. Accuracy Analysis of a Multi-View Stereo Approach for Phenotyping of Tomato Plants at the Organ Level

    PubMed Central

    Rose, Johann Christian; Paulus, Stefan; Kuhlmann, Heiner

    2015-01-01

    Accessing a plant's 3D geometry has become of significant importance for phenotyping during the last few years. Close-up laser scanning is an established method to acquire 3D plant shapes in real time with high detail, but it is stationary and has high investment costs. 3D reconstruction from images using structure from motion (SfM) and multi-view stereo (MVS) is a flexible cost-effective method, but requires post-processing procedures. The aim of this study is to evaluate the potential measuring accuracy of an SfM- and MVS-based photogrammetric method for the task of organ-level plant phenotyping. For this, reference data are provided by a high-accuracy close-up laser scanner. Using both methods, point clouds of several tomato plants were reconstructed on six consecutive days. The parameters leaf area, main stem height and convex hull of the complete plant were extracted from the 3D point clouds and compared to the reference data regarding accuracy and correlation. These parameters were chosen regarding the demands of current phenotyping scenarios. The study shows that the photogrammetric approach is highly suitable for the presented monitoring scenario, yielding high correlations to the reference measurements. This cost-effective 3D reconstruction method represents an alternative to an expensive laser scanner in the studied scenarios, with potential for automated procedures. PMID:25919368
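
    One of the extracted parameters, the convex hull of the complete plant, can be computed directly from a 3D point cloud. The sketch below does this with SciPy on a synthetic cloud and compares two reconstructions by hull volume; it only illustrates the parameter and is not the processing pipeline used in the study.

      import numpy as np
      from scipy.spatial import ConvexHull

      def hull_volume(points):
          """Volume of the convex hull of an (N, 3) point cloud."""
          return ConvexHull(points).volume

      rng = np.random.default_rng(1)
      laser_cloud = rng.random((2000, 3))        # stand-in for the laser-scan cloud
      photo_cloud = laser_cloud + rng.normal(scale=0.002, size=laser_cloud.shape)

      v_ref = hull_volume(laser_cloud)
      v_sfm = hull_volume(photo_cloud)
      print(v_ref, v_sfm, abs(v_sfm - v_ref) / v_ref)   # relative deviation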

  20. Comparing the accuracy of video-oculography and the scleral search coil system in human eye movement analysis.

    PubMed

    Imai, Takao; Sekine, Kazunori; Hattori, Kousuke; Takeda, Noriaki; Koizuka, Izumi; Nakamae, Koji; Miura, Katsuyoshi; Fujioka, Hiromu; Kubo, Takeshi

    2005-03-01

    The measurement of eye movements in three dimensions is an important tool to investigate the human vestibular and oculomotor system. The primary methods for three-dimensional eye movement measurement are the scleral search coil system (SSCS) and video-oculography (VOG). In the present study, we compare the accuracy of VOG with that of SSCS using an artificial eye. We then analyzed the Y (pitch) and Z (yaw) component of human eye movements during saccades, smooth pursuit and optokinetic nystagmus, and the X (roll) component of human eye movement during the torsional vestibulo-ocular reflex induced by rotation in normal subjects, using simultaneous VOG and SSCS measures. The coefficients of the linear relationship between the angle of a simulated eyeball and the angle measured by both VOG and SSCS were close to unity with y-intercepts close to zero for torsional (X), vertical (Y) and horizontal (Z) movements, indicating that the in vitro accuracy of VOG was similar to that of SSCS. The average difference between VOG and SSCS was 0.56, 0.78 and 0.18 degrees for the X, Y and Z components of human eye movements, respectively. Both the in vitro and in vivo comparisons demonstrate that VOG has accuracy comparable to SSCS, and is a reliable method for measurement of three-dimensional (3D) human eye movements. PMID:15882818

  1. Thiopurine S-methyltransferase testing for averting drug toxicity: a meta-analysis of diagnostic test accuracy.

    PubMed

    Zur, R M; Roy, L M; Ito, S; Beyene, J; Carew, C; Ungar, W J

    2016-08-01

    Thiopurine S-methyltransferase (TPMT) deficiency increases the risk of serious adverse events in persons receiving thiopurines. The objective was to synthesize reported sensitivity and specificity of TPMT phenotyping and genotyping using a latent class hierarchical summary receiver operating characteristic meta-analysis. In 27 studies, pooled sensitivity and specificity of phenotyping for deficient individuals were 75.9% (95% credible interval (CrI), 58.3-87.0%) and 98.9% (96.3-100%), respectively. For genotype tests evaluating TPMT*2 and TPMT*3, sensitivity and specificity were 90.4% (79.1-99.4%) and 100.0% (99.9-100%), respectively. For individuals with deficient or intermediate activity, phenotype sensitivity and specificity were 91.3% (86.4-95.5%) and 92.6% (86.5-96.6%), respectively. For genotype tests evaluating TPMT*2 and TPMT*3, sensitivity and specificity were 88.9% (81.6-97.5%) and 99.2% (98.4-99.9%), respectively. Genotyping has higher sensitivity as long as TPMT*2 and TPMT*3 are tested. Both approaches display high specificity. Latent class meta-analysis is a useful method for synthesizing diagnostic test performance data for clinical practice guidelines. The Pharmacogenomics Journal advance online publication, 24 May 2016; doi:10.1038/tpj.2016.37. PMID:27217052

  2. Diagnostic Accuracy of Transcranial Sonography of the Substantia Nigra in Parkinson’s disease: A Systematic Review and Meta-analysis

    PubMed Central

    Li, Dun-Hui; He, Ya-Chao; Liu, Jun; Chen, Sheng-Di

    2016-01-01

    A large number of articles have reported substantia nigra hyperechogenicity in Parkinson’s disease (PD) and have assessed the diagnostic accuracy of transcranial sonography (TCS); however, the conclusions are discrepant. Consequently, this systematic review and meta-analysis aims to consolidate the available observational studies and provide a comprehensive evaluation of the clinical utility of TCS in PD. In total, 31 studies containing 4,386 participants from 13 countries were included. A random effects model was utilized to pool the effect sizes. Meta-regression and sensitivity analysis were performed to explore potential heterogeneity. Overall diagnostic accuracy of TCS in differentiating PD from normal controls was quite high, with a pooled sensitivity of 0.83 (95% CI: 0.81–0.85) and a pooled specificity of 0.87 (95% CI: 0.85–0.88). The positive likelihood ratio, the negative likelihood ratio and the diagnostic odds ratio were calculated as 6.94 (95% CI: 5.09–9.48), 0.19 (95% CI: 0.16–0.23), and 42.89 (95% CI: 30.03–61.25), respectively. Our systematic review of the literature and meta-analysis suggest that TCS has high diagnostic accuracy in the diagnosis of PD when compared to healthy controls. PMID:26878893
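
    The likelihood ratios and diagnostic odds ratio reported above are simple functions of sensitivity and specificity: LR+ = Se/(1−Sp), LR− = (1−Se)/Sp, and DOR = LR+/LR−. The sketch below applies these definitions to the pooled point estimates; because the review pools each quantity with a random-effects model, its reported values differ slightly from this naive calculation.

      def diagnostic_ratios(sensitivity, specificity):
          """Positive/negative likelihood ratios and diagnostic odds ratio."""
          lr_pos = sensitivity / (1.0 - specificity)
          lr_neg = (1.0 - sensitivity) / specificity
          return lr_pos, lr_neg, lr_pos / lr_neg

      # Pooled point estimates from the meta-analysis above.
      lr_pos, lr_neg, dor = diagnostic_ratios(0.83, 0.87)
      print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 1))  # ~6.38, ~0.2, ~32.7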

  3. Borderline Ovarian Tumors and Diagnostic Dilemma of Intraoperative Diagnosis: Could Preoperative He4 Assay and ROMA Score Assessment Increase the Frozen Section Accuracy? A Multicenter Case-Control Study

    PubMed Central

    Gizzo, Salvatore; Berretta, Roberto; Di Gangi, Stefania; Guido, Maria; Zanni, Giuliano Carlo; Franceschetti, Ilaria; Quaranta, Michela; Plebani, Mario; Nardelli, Giovanni Battista; Patrelli, Tito Silvio

    2014-01-01

    The aim of our study was to assess the value of a preoperative He4-serum-assay and ROMA-score assessment in improving the accuracy of frozen section histology in the diagnosis of borderline ovarian tumors (BOT). 113 women presenting with a unilateral ovarian mass diagnosed as serous/mucinous BOT at frozen-section-histology (FS) and/or confirmed on final pathology were recruited. Pathologists were informed of the results of preoperative clinical/instrumental assessment of all patients. For Group_A patients, additional information regarding He4, CA125, and ROMA score was available (in Group_B only CA125 was known). The comparison between Group A and Group B in terms of FS accuracy, demonstrated a consensual diagnosis in 62.8% versus 58.6% (P: n.s.), underdiagnosis in 25.6% versus 41.4% (P < 0.05), and overdiagnosis in 11.6% versus 0% (P < 0.01). Low FS diagnostic accuracy was associated with menopausal status (OR: 2.13), laparoscopic approach (OR: 2.18), mucinous histotype (OR: 2.23), low grading (OR: 1.30), and FIGO stage I (OR: 2.53). Ultrasound detection of papillae (OR: 0.29), septa (OR: 0.39), atypical vascularization (OR: 0.34), serum He4 assay (OR: 0.39), and ROMA score assessment (OR: 0.44) decreased the probability of underdiagnosis. A combined preoperative assessment through serum markers and ultrasonographic features may potentially reduce the risk of underdiagnosis of BOTs on FS while likely increasing the concomitant incidence of false-positive events. PMID:25431767

  4. Video image analysis in the Australian meat industry - precision and accuracy of predicting lean meat yield in lamb carcasses.

    PubMed

    Hopkins, D L; Safari, E; Thompson, J M; Smith, C R

    2004-06-01

    A wide selection of lamb types of mixed sex (ewes and wethers) were slaughtered at a commercial abattoir and during this process images of 360 carcasses were obtained online using the VIAScan® system developed by Meat and Livestock Australia. Soft tissue depth at the GR site (thickness of tissue over the 12th rib 110 mm from the midline) was measured by an abattoir employee using the AUS-MEAT sheep probe (PGR). Another measure of this thickness was taken in the chiller using a GR knife (NGR). Each carcass was subsequently broken down to a range of trimmed boneless retail cuts and the lean meat yield determined. The current industry model for predicting meat yield uses hot carcass weight (HCW) and tissue depth at the GR site. A low level of accuracy and precision was found when HCW and PGR were used to predict lean meat yield (R(2)=0.19, r.s.d.=2.80%), which could be improved markedly when PGR was replaced by NGR (R(2)=0.41, r.s.d.=2.39%). If the GR measures were replaced by 8 VIAScan® measures then greater prediction accuracy could be achieved (R(2)=0.52, r.s.d.=2.17%). A similar result was achieved when the model was based on principal components (PCs) computed from the 8 VIAScan® measures (R(2)=0.52, r.s.d.=2.17%). The use of PCs also improved the stability of the model compared to a regression model based on HCW and NGR. The transportability of the models was tested by randomly dividing the data set and comparing coefficients and the level of accuracy and precision. Those models based on PCs were superior to those based on regression. It is demonstrated that with the appropriate modeling the VIAScan® system offers a workable method for predicting lean meat yield automatically. PMID:22061323
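
    The principal-component models described above replace the eight correlated VIAScan® measures with a small number of orthogonal components before regression. The sketch below illustrates that idea with NumPy on invented data standing in for the carcass measurements; it is not the published model.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 360                                    # carcasses, as in the study
      X = rng.normal(size=(n, 8))                # stand-in for the 8 VIAScan measures
      X[:, 1:] += 0.7 * X[:, [0]]                # make the measures correlated
      y = 55 + X @ rng.normal(size=8) + rng.normal(scale=2.0, size=n)  # yield, %

      # Principal components via SVD of the centred predictors.
      Xc = X - X.mean(axis=0)
      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      k = 3                                      # number of components retained
      scores = Xc @ Vt[:k].T

      # Ordinary least squares on the component scores.
      design = np.column_stack([np.ones(n), scores])
      coef, *_ = np.linalg.lstsq(design, y, rcond=None)
      residuals = y - design @ coef
      r_squared = 1 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)
      rsd = np.sqrt(np.sum(residuals ** 2) / (n - k - 1))
      print("R^2:", round(r_squared, 3), "r.s.d.:", round(rsd, 3))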

  5. Detection accuracy of root fractures in cone-beam computed tomography images: a systematic review and meta-analysis.

    PubMed

    Ma, R H; Ge, Z P; Li, G

    2016-07-01

    The aim of this review was to evaluate whether CBCT is reliable for the detection of root fractures in teeth without root fillings, and whether the voxel size has an impact on diagnostic accuracy. The studies published in PubMed, Web of Science, ScienceDirect, Cochrane Library, Embase, Scopus, CNKI and Wanfang up to May 2014 were the data source. Studies on nonroot filled teeth with the i-CAT (n = 8) and 3D Accuitomo CBCT (n = 5) units were eventually selected. In the studies on i-CAT, the pooled sensitivity was 0.83 and the pooled specificity was 0.91; in the 3D Accuitomo studies, the pooled sensitivity was 0.95 and pooled specificity was 0.96. The i-CAT group comprised 5 voxel size subgroups and the 3D Accuitomo group contained 2 subgroups. For the i-CAT group, there was a significant difference amongst the five subgroups (0.125, 0.2, 0.25, 0.3 and 0.4 mm; P = 0.000). Pairwise comparison revealed that 0.125 mm voxel subgroup was significantly different from those of 0.2, 0.25 and 0.3 mm voxel subgroups, but not from the 0.4 mm voxel subgroup. There were no significant differences amongst any other two subgroups (by α' = 0.005). No significant difference was found between 0.08 mm and 0.125 mm voxel subgroups (P = 0.320) for the 3D Accuitomo group. The present review confirms the detection accuracy of root fractures in CBCT images, but does not support the concept that voxel size may play a role in improving the detection accuracy of root fractures in nonroot filled teeth. PMID:26102215

  6. LP-search and its use in analysis of the accuracy of control systems with acoustical models

    NASA Technical Reports Server (NTRS)

    Sergeyev, V. I.; Sobol, I. M.; Statnikov, R. B.; Statnikov, I. N.

    1973-01-01

    The LP-search is proposed as an analog of the Monte Carlo method for finding values in nonlinear statistical systems. It is concluded that, to attain the required accuracy in solving the control problem for a statistical system, the LP-search requires a considerably smaller number of tests than the Monte Carlo method, and that the LP-search allows multiple repetition of tests under identical conditions together with observability of the output variables of the system.
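
    LP-search is built on LPτ (Sobol) low-discrepancy sequences, which cover a parameter space more evenly than pseudo-random points and therefore tend to need fewer trials for a given accuracy. As a rough illustration of that property only (not of the control-system application in the paper), the sketch below compares a Sobol quasi-random estimate of a simple integral with a plain Monte Carlo estimate using the same number of points.

      import numpy as np
      from scipy.stats import qmc

      def f(x):
          """Test integrand on the unit square; its exact mean is 0.25."""
          return x[:, 0] * x[:, 1]

      rng = np.random.default_rng(0)
      mc_points = rng.random((1024, 2))                                         # pseudo-random trials
      sobol_points = qmc.Sobol(d=2, scramble=True, seed=0).random_base2(m=10)   # 2**10 quasi-random points

      print("Monte Carlo error:   ", abs(f(mc_points).mean() - 0.25))
      print("Sobol (LP_tau) error:", abs(f(sobol_points).mean() - 0.25))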

  7. Diagnostic Accuracy of In-House PCR for Pulmonary Tuberculosis in Smear-Positive Patients: Meta-Analysis and Metaregression▿ †

    PubMed Central

    Greco, S.; Rulli, M.; Girardi, E.; Piersimoni, C.; Saltini, C.

    2009-01-01

    In-house PCR (hPCR) could speed differential diagnosis between tuberculosis (TB) and nontuberculous mycobacterial disease in patients with positive smears and pulmonary infiltrates, but its reported accuracy fluctuates across studies. We conducted a systematic review and meta-analysis of hPCR sensitivity and specificity for smear-positive TB diagnosis, using culture as the reference standard. After searching English language studies in MEDLINE and EMBASE, we estimated cumulative accuracy by means of summary receiver operating characteristic analysis. The possible influence of hPCR procedures and study methodological features on accuracy was explored by univariate metaregression, followed by multivariate adjustment of items selected as significant. Thirty-five articles (1991 to 2006) met the inclusion criteria. The pooled estimates of the diagnostic odds ratio, sensitivity, and specificity (random-effect model) were, respectively, 60 (confidence interval [CI], 29 to 123), 0.96 (CI, 0.95 to 0.97), and 0.81 (CI, 0.78 to 0.84), but significant variations (mainly in specificity) limit their clinical applicability. The quality of the reference test, the detection method, and real-time PCR use explained some of the observed heterogeneity. Probably due to the limited study power of our meta-analysis and to the wide differences in both laboratory techniques and methodological quality, only real-time PCR also displayed a positive impact on accuracy in the multivariate model. Currently, hPCR can be confidently used to exclude TB in smear-positive patients, but its low specificity could lead to erroneous initiation of therapy, isolation, and contact investigation. As the inclusion of samples from treated patients could have artificially reduced specificity, future studies should report mycobacterial-culture results for each TB and non-TB sample analyzed. PMID:19144797

  8. Accuracy of gap analysis habitat models in predicting physical features for wildlife-habitat associations in the southwest U.S.

    USGS Publications Warehouse

    Boykin, K.G.; Thompson, B.C.; Propeck-Gray, S.

    2010-01-01

    Despite widespread and long-standing efforts to model wildlife-habitat associations using remotely sensed and other spatially explicit data, there are relatively few evaluations of the performance of variables included in predictive models relative to actual features on the landscape. As part of the National Gap Analysis Program, we specifically examined physical site features at randomly selected sample locations in the Southwestern U.S. to assess degree of concordance with predicted features used in modeling vertebrate habitat distribution. Our analysis considered hypotheses about relative accuracy with respect to 30 vertebrate species selected to represent the spectrum of habitat generalist to specialist and categorization of site by relative degree of conservation emphasis accorded to the site. Overall comparison of 19 variables observed at 382 sample sites indicated ≥60% concordance for 12 variables. Directly measured or observed variables (slope, soil composition, rock outcrop) generally displayed high concordance, while variables that required judgments regarding descriptive categories (aspect, ecological system, landform) were less concordant. There were no differences detected in concordance among taxa groups, degree of specialization or generalization of selected taxa, or land conservation categorization of sample sites with respect to all sites. We found no support for the hypothesis that accuracy of habitat models is inversely related to degree of taxa specialization when model features for a habitat specialist could be more difficult to represent spatially. Likewise, we did not find support for the hypothesis that physical features will be predicted with higher accuracy on lands with greater dedication to biodiversity conservation than on other lands because of relative differences regarding available information. Accuracy generally was similar (>60%) to that observed for land cover mapping at the ecological system level. These patterns demonstrate

  9. The zero-multipole summation method for estimating electrostatic interactions in molecular dynamics: analysis of the accuracy and application to liquid systems.

    PubMed

    Fukuda, Ikuo; Kamiya, Narutoshi; Nakamura, Haruki

    2014-05-21

    In the preceding paper [I. Fukuda, J. Chem. Phys. 139, 174107 (2013)], the zero-multipole (ZM) summation method was proposed for efficiently evaluating the electrostatic Coulombic interactions of a classical point charge system. The summation takes a simple pairwise form, but prevents the electrically non-neutral multipole states that may artificially be generated by a simple cutoff truncation, which often causes large energetic noises and significant artifacts. The purpose of this paper is to judge the ability of the ZM method by investigating the accuracy, parameter dependencies, and stability in applications to liquid systems. To conduct this, first, the energy-functional error was divided into three terms and each term was analyzed by a theoretical error-bound estimation. This estimation gave us a clear basis of the discussions on the numerical investigations. It also gave a new viewpoint between the excess energy error and the damping effect by the damping parameter. Second, with the aid of these analyses, the ZM method was evaluated based on molecular dynamics (MD) simulations of two fundamental liquid systems, a molten sodium-chlorine ion system and a pure water molecule system. In the ion system, the energy accuracy, compared with the Ewald summation, was better for a larger value of multipole moment l currently induced until l ≲ 3 on average. This accuracy improvement with increasing l is due to the enhancement of the excess-energy accuracy. However, this improvement is wholly effective in the total accuracy if the theoretical moment l is smaller than or equal to a system intrinsic moment L. The simulation results thus indicate L ∼ 3 in this system, and we observed less accuracy in l = 4. We demonstrated the origins of parameter dependencies appearing in the crossing behavior and the oscillations of the energy error curves. With raising the moment l we observed, smaller values of the damping parameter provided more accurate results and smoother

  10. The zero-multipole summation method for estimating electrostatic interactions in molecular dynamics: Analysis of the accuracy and application to liquid systems

    SciTech Connect

    Fukuda, Ikuo; Kamiya, Narutoshi; Nakamura, Haruki

    2014-05-21

    In the preceding paper [I. Fukuda, J. Chem. Phys. 139, 174107 (2013)], the zero-multipole (ZM) summation method was proposed for efficiently evaluating the electrostatic Coulombic interactions of a classical point charge system. The summation takes a simple pairwise form, but prevents the electrically non-neutral multipole states that may artificially be generated by a simple cutoff truncation, which often causes large energetic noises and significant artifacts. The purpose of this paper is to judge the ability of the ZM method by investigating the accuracy, parameter dependencies, and stability in applications to liquid systems. To conduct this, first, the energy-functional error was divided into three terms and each term was analyzed by a theoretical error-bound estimation. This estimation gave us a clear basis of the discussions on the numerical investigations. It also gave a new viewpoint between the excess energy error and the damping effect by the damping parameter. Second, with the aid of these analyses, the ZM method was evaluated based on molecular dynamics (MD) simulations of two fundamental liquid systems, a molten sodium-chlorine ion system and a pure water molecule system. In the ion system, the energy accuracy, compared with the Ewald summation, was better for a larger value of multipole moment l currently induced until l ≲ 3 on average. This accuracy improvement with increasing l is due to the enhancement of the excess-energy accuracy. However, this improvement is wholly effective in the total accuracy if the theoretical moment l is smaller than or equal to a system intrinsic moment L. The simulation results thus indicate L ∼ 3 in this system, and we observed less accuracy in l = 4. We demonstrated the origins of parameter dependencies appearing in the crossing behavior and the oscillations of the energy error curves. With raising the moment l we observed, smaller values of the damping parameter provided more accurate results and smoother

  11. Experimental analysis of multi-attribute decision-making based on Atanassov intuitionistic fuzzy sets: a discussion of anchor dependency and accuracy functions

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Yu

    2012-06-01

    This article presents a useful method for relating anchor dependency and accuracy functions to multiple attribute decision-making (MADM) problems in the context of Atanassov intuitionistic fuzzy sets (A-IFSs). Considering anchored judgement with displaced ideals and solution precision with minimal hesitation, several auxiliary optimisation models have been proposed to obtain the optimal weights of the attributes and to acquire the corresponding TOPSIS (the technique for order preference by similarity to the ideal solution) index for alternative rankings. Aside from the TOPSIS index, as a decision-maker's personal characteristics and own perception of self may also influence the direction in the axiom of choice, the evaluation of alternatives is conducted based on distances of each alternative from the positive and negative ideal alternatives, respectively. This article originates from Li's [Li, D.-F. (2005), 'Multiattribute Decision Making Models and Methods Using Intuitionistic Fuzzy Sets', Journal of Computer and System Sciences, 70, 73-85] work, which is a seminal study of intuitionistic fuzzy decision analysis using deduced auxiliary programming models, and deems it a benchmark method for comparative studies on anchor dependency and accuracy functions. The feasibility and effectiveness of the proposed methods are illustrated by a numerical example. Finally, a comparative analysis is illustrated with computational experiments on averaging accuracy functions, TOPSIS indices, separation measures from positive and negative ideal alternatives, consistency rates of ranking orders, contradiction rates of the top alternative and average Spearman correlation coefficients.
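
    For reference, the crisp TOPSIS index that the intuitionistic fuzzy models above generalise ranks each alternative by its relative closeness to the positive ideal solution. The sketch below implements that standard crisp form on an invented decision matrix; it is not the A-IFS formulation developed in the article.

      import numpy as np

      def topsis(decision_matrix, weights, benefit=None):
          """Crisp TOPSIS: relative closeness of each alternative to the ideal."""
          X = np.asarray(decision_matrix, dtype=float)
          w = np.asarray(weights, dtype=float)
          if benefit is None:
              benefit = np.ones(X.shape[1], dtype=bool)   # all criteria are benefits
          V = w * X / np.linalg.norm(X, axis=0)           # vector-normalised, weighted
          pos_ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
          neg_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
          d_pos = np.linalg.norm(V - pos_ideal, axis=1)
          d_neg = np.linalg.norm(V - neg_ideal, axis=1)
          return d_neg / (d_pos + d_neg)

      # Invented example: three alternatives, three criteria (the last is a cost).
      closeness = topsis([[7, 9, 9], [8, 7, 8], [9, 6, 7]],
                         weights=[0.5, 0.3, 0.2],
                         benefit=[True, True, False])
      print(closeness, closeness.argmax())   # highest closeness = preferred alternative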

  12. Diagnostic accuracy of conventional or age adjusted D-dimer cut-off values in older patients with suspected venous thromboembolism: systematic review and meta-analysis

    PubMed Central

    Geersing, G J; Koek, H L; Zuithoff, Nicolaas P A; Janssen, Kristel J M; Douma, Renée A; van Delden, Johannes J M; Moons, Karel G M; Reitsma, Johannes B

    2013-01-01

    Objective To review the diagnostic accuracy of D-dimer testing in older patients (>50 years) with suspected venous thromboembolism, using conventional or age adjusted D-dimer cut-off values. Design Systematic review and bivariate random effects meta-analysis. Data sources We searched Medline and Embase for studies published before 21 June 2012 and we contacted the authors of primary studies. Study selection Primary studies that enrolled older patients with suspected venous thromboembolism in whom D-dimer testing, using both conventional (500 µg/L) and age adjusted (age×10 µg/L) cut-off values, and reference testing were performed. For patients with a non-high clinical probability, 2×2 tables were reconstructed and stratified by age category and applied D-dimer cut-off level. Results 13 cohorts including 12 497 patients with a non-high clinical probability were included in the meta-analysis. The specificity of the conventional cut-off value decreased with increasing age, from 57.6% (95% confidence interval 51.4% to 63.6%) in patients aged 51-60 years to 39.4% (33.5% to 45.6%) in those aged 61-70, 24.5% (20.0% to 29.7%) in those aged 71-80, and 14.7% (11.3% to 18.6%) in those aged >80. Age adjusted cut-off values revealed higher specificities over all age categories: 62.3% (56.2% to 68.0%), 49.5% (43.2% to 55.8%), 44.2% (38.0% to 50.5%), and 35.2% (29.4% to 41.5%), respectively. Sensitivities of the age adjusted cut-off remained above 97% in all age categories. Conclusions The application of age adjusted cut-off values for D-dimer tests substantially increases specificity without modifying sensitivity, thereby improving the clinical utility of D-dimer testing in patients aged 50 or more with a non-high clinical probability. PMID:23645857
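
    The age-adjusted rule evaluated here is simple to state: in patients older than 50, the D-dimer cut-off becomes age × 10 µg/L instead of the fixed conventional 500 µg/L. A small sketch under that reading of the rule:

      def d_dimer_cutoff(age_years, age_adjusted=True):
          """D-dimer cut-off in ug/L: the conventional 500, or age x 10 for
          patients older than 50 when the age-adjusted rule is applied."""
          if age_adjusted and age_years > 50:
              return age_years * 10.0
          return 500.0

      def d_dimer_negative(d_dimer_ug_per_l, age_years, age_adjusted=True):
          """True if the result falls below the applicable cut-off (i.e. venous
          thromboembolism is considered excluded in a non-high probability patient)."""
          return d_dimer_ug_per_l < d_dimer_cutoff(age_years, age_adjusted)

      # An 80-year-old with a D-dimer of 650 ug/L is positive by the conventional
      # cut-off (500 ug/L) but negative by the age-adjusted cut-off (800 ug/L).
      print(d_dimer_negative(650, 80, age_adjusted=False))  # False
      print(d_dimer_negative(650, 80, age_adjusted=True))   # True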

  13. Design and analysis of ALE schemes with provable second-order time-accuracy for inviscid and viscous flow simulations

    NASA Astrophysics Data System (ADS)

    Geuzaine, Philippe; Grandmont, Céline; Farhat, Charbel

    2003-10-01

    We consider the solution of inviscid as well as viscous unsteady flow problems with moving boundaries by the arbitrary Lagrangian-Eulerian (ALE) method. We present two computational approaches for achieving formal second-order time-accuracy on moving grids. The first approach is based on flux time-averaging, and the second one on mesh configuration time-averaging. In both cases, we prove that formally second-order time-accurate ALE schemes can be designed. We illustrate our theoretical findings and highlight their impact on practice with the solution of inviscid as well as viscous, unsteady, nonlinear flow problems associated with the AGARD Wing 445.6 and a complete F-16 configuration.

  14. An analysis of approach navigation accuracy and guidance requirements for the grand tour mission to the outer planets

    NASA Technical Reports Server (NTRS)

    Jones, D. W.

    1971-01-01

    The navigation and guidance process for the Jupiter, Saturn and Uranus planetary encounter phases of the 1977 Grand Tour interior mission was simulated. Reference approach navigation accuracies were defined and the relative information content of the various observation types were evaluated. Reference encounter guidance requirements were defined, sensitivities to assumed simulation model parameters were determined and the adequacy of the linear estimation theory was assessed. A linear sequential estimator was used to provide an estimate of the augmented state vector, consisting of the six state variables of position and velocity plus the three components of a planet position bias. The guidance process was simulated using a nonspherical model of the execution errors. Computation algorithms which simulate the navigation and guidance process were derived from theory and implemented into two research-oriented computer programs, written in FORTRAN.

  15. Diagnostic accuracy of cytokeratin-19 fragment (CYFRA 21-1) for bladder cancer: a systematic review and meta-analysis.

    PubMed

    Huang, Yuan-Lan; Chen, Jie; Yan, Wei; Zang, Ding; Qin, Qin; Deng, An-Mei

    2015-05-01

    Previous studies have evaluated the accuracy of serum and urinary measurements of cytokeratin-19 fragment (CYFRA 21-1) for the diagnosis of bladder cancer; however, the results have been inconsistent. The aim of this study was to evaluate the overall accuracy of CYFRA 21-1 for the diagnosis of bladder cancer. We performed a search for English-language publications reporting on the detection of CYFRA 21-1 levels for the diagnosis of bladder cancer through November 2, 2014, using public medical databases, including EMBASE, Web of Science, and Medline. The quality of the studies was assessed by the revised QUADAS tool. The performance characteristics were pooled and analyzed using a bivariate model. Publication bias was explored with the Deeks test. Sixteen studies, with a total of 1,262 bladder-cancer patients and 1,233 non-bladder-cancer patients, were included in the study. The pooled sensitivities for serum and urine CYFRA 21-1 were 0.42 (95 % confidence interval (CI), 0.33-0.51) and 0.82 (95 % CI, 0.70-0.90), respectively. The corresponding specificities were 0.94 (95 % CI, 0.90-0.96) and 0.80 (95 % CI, 0.73-0.86), respectively. The areas under the summary receiver-operating-characteristic curves for serum and urine CYFRA 21-1 were 0.88 (95 % CI, 0.85-0.91) and 0.87 (95 % CI, 0.84-0.90), respectively. The major design deficiencies of the included studies were participant-selection bias, potential review bias, and verification bias. Therefore, we concluded that both serum and urine CYFRA 21-1 served as efficient indexes for bladder-cancer diagnosis. Additional, well-designed studies should be performed to rigorously evaluate the diagnostic value of CYFRA 21-1 for bladder cancer. PMID:25854170

  16. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  17. A systematic review and meta-analysis of the diagnostic accuracy of point-of-care tests for the detection of hyperketonemia in dairy cows.

    PubMed

    Tatone, Elise H; Gordon, Jessica L; Hubbs, Jessie; LeBlanc, Stephen J; DeVries, Trevor J; Duffield, Todd F

    2016-08-01

    Several rapid tests for use on farm have been validated for the detection of hyperketonemia (HK) in dairy cattle; however, the reported sensitivity and specificity of each method vary, and no single study has compared them all. Meta-analysis of diagnostic test accuracy is becoming more common in the human medical literature, but there are few veterinary examples. The objective of this work was to perform a systematic review and meta-analysis to determine the point-of-care testing method with the highest combined sensitivity and specificity, the optimal threshold for each method, and to identify gaps in the literature. A comprehensive literature search resulted in 5196 references. After removing duplicates and performing relevance screening, 23 studies were included for the qualitative synthesis and 18 for the meta-analysis. The three index tests evaluated in the meta-analysis were: the Precision Xtra(®) handheld device measuring beta-hydroxybutyrate (BHB) concentration in whole blood, and Ketostix(®) and KetoTest(®) semi-quantitative strips measuring the concentration of acetoacetate in urine and BHB in milk, respectively. The diagnostic accuracy of the 3 index tests relative to the reference standard measurement of BHB in serum or whole blood between 1.0-1.4 mmol/L was compared using the hierarchical summary receiver operator characteristic (HSROC) method. Subgroup analysis was conducted for each index test to examine the accuracy at different thresholds. The impact of the reference standard threshold, the reference standard method, the prevalence of HK in the population, the primary study source and risk of bias of the primary study was explored using meta-regression. The Precision Xtra(®) device had the highest summary sensitivity in whole blood BHB at 1.2 mmol/L, 94.8% (CI95%: 92.6-97.0), and specificity, 97.5% (CI95%: 96.9-98.1). The threshold employed (1.2-1.4 mmol/L) did not impact the diagnostic accuracy of the test. The Ketostix(®) and KetoTest(®) strips had

  18. Added value of cost-utility analysis in simple diagnostic studies of accuracy: 18F-fluoromethylcholine PET/CT in prostate cancer staging

    PubMed Central

    Gerke, Oke; Poulsen, Mads H; Høilund-Carlsen, Poul Flemming

    2015-01-01

    Diagnostic studies of accuracy targeting sensitivity and specificity are commonly done in a paired design in which all modalities are applied in each patient, whereas cost-effectiveness and cost-utility analyses are usually assessed either directly alongside larger randomized controlled trials (RCTs) or indirectly by means of stochastic modeling based on them. However, the conduct of RCTs is hampered in an environment such as ours, in which technology is rapidly evolving. As such, there is a relatively limited number of RCTs. Therefore, we investigated to what extent paired diagnostic studies of accuracy can also be used to shed light on economic implications when considering a new diagnostic test. We propose a simple decision tree model-based cost-utility analysis of a diagnostic test when compared to the current standard procedure and exemplify this approach with published data from lymph node staging of prostate cancer. Average procedure costs were taken from the Danish Diagnosis Related Groups Tariff in 2013, and life expectancy was estimated for an ideal 60-year-old patient based on prostate cancer stage and prostatectomy or radiation and chemotherapy. Quality-adjusted life-years (QALYs) were deduced from the literature, and an incremental cost-effectiveness ratio (ICER) was used to compare lymph node dissection with respective histopathological examination (reference standard) and 18F-fluoromethylcholine positron emission tomography/computed tomography (FCH-PET/CT). Lower bounds of sensitivity and specificity of FCH-PET/CT were established at which the replacement of the reference standard by FCH-PET/CT comes with a trade-off between worse effectiveness and lower costs. Compared to the reference standard in a diagnostic accuracy study, any imperfections in accuracy of a diagnostic test imply that replacing the reference standard generates a loss in effectiveness and utility. We conclude that diagnostic studies of accuracy can be put to a more extensive use
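
    The ICER at the heart of such a decision tree comparison reduces to a ratio of cost and effectiveness differences; the Python sketch below uses placeholder costs and QALYs rather than the Danish tariff values and life expectancies used in the paper.

      # Schematic ICER for comparing a new diagnostic strategy with the reference
      # standard. All costs and QALYs below are placeholders, not study data.

      def icer(cost_new, qaly_new, cost_ref, qaly_ref):
          """Incremental cost-effectiveness ratio: extra cost per QALY gained (or forgone)."""
          return (cost_new - cost_ref) / (qaly_new - qaly_ref)

      # Hypothetical example: the new test is cheaper but slightly less effective,
      # so the ratio expresses the savings per QALY forgone.
      print(icer(cost_new=2500.0, qaly_new=9.85, cost_ref=4000.0, qaly_ref=9.90))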

  19. The accuracy of pain drawing in identifying psychological distress in low back pain—systematic review and meta-analysis of diagnostic studies

    PubMed Central

    Bertozzi, Lucia; Rosso, Anna; Romeo, Antonio; Villafañe, Jorge Hugo; Guccione, Andrew A.; Pillastrini, Paolo; Vanti, Carla

    2015-01-01

    The aim of this systematic review and meta-analysis was to estimate the accuracy of qualitative pain drawings (PDs) in identifying psychological distress in subacute and chronic low back pain (LBP) patients. [Subjects and Methods] Data were obtained from searches of PubMed, EBSCO, Scopus, PsycINFO and ISI Web of Science from their inception to July 2014. Quality assessments of bias and applicability were conducted using the Quality of Diagnostic Accuracy Studies-2 (QUADAS-2). [Results] The summary estimates were: sensitivity=0.45 (95% CI 0.34, 0.61), specificity=0.66 (95% CI 0.53, 0.82), positive likelihood ratio=1.23 (95% CI 0.93, 1.62), negative likelihood ratio=0.84 (95% CI 0.70, 1.01), and diagnostic odds ratio=1.46 (95% CI 0.79, 2.68). The area under the curve was 78% (CI, 57 to 99%). [Conclusion] The results of this systematic review do not show broad and unqualified support for the accuracy of PDs in detecting psychological distress in subacute and chronic LBP. PMID:26644701

  20. Diagnostic Accuracy of Point-of-Care Tests for Hepatitis C Virus Infection: A Systematic Review and Meta-Analysis

    PubMed Central

    Khuroo, Mehnaaz Sultan; Khuroo, Naira Sultan; Khuroo, Mohammad Sultan

    2015-01-01

    Background Point-of-care tests provide a plausible diagnostic strategy for hepatitis C infection in economically impoverished areas. However, their utility depends upon the overall performance of individual tests. Methods A literature search was conducted using the metasearch engine Mettā, a query interface for retrieving articles from five leading medical databases. Studies were included if they employed point-of-care tests to detect antibodies of hepatitis C virus and compared the results with reference tests. Two reviewers performed a quality assessment of the studies and extracted data for estimating test accuracy. Findings Thirty studies that had evaluated 30 tests fulfilled the inclusion criteria. The overall pooled sensitivity, specificity, positive likelihood-ratio, negative likelihood-ratio and diagnostic odds ratio for all tests were 97.4% (95% CI: 95.9–98.4), 99.5% (99.2–99.7), 80.17 (55.35–116.14), 0.03 (0.02–0.04), and 3032.85 (1595.86–5763.78), respectively. This suggested a high pooled accuracy for all studies. We found substantial heterogeneity between studies, but none of the subgroups investigated could account for the heterogeneity. Genotype diversity of HCV had no or minimal influence on test performance. Of the seven tests evaluated in the meta-regression model, OraQuick had the highest test sensitivity and specificity and showed better performance than a third generation enzyme immunoassay in seroconversion panels. The next highest test sensitivities and specificities were from TriDot and SDBioline, followed by Genedia and Chembio. The Spot and Multiplo tests produced poor test sensitivities but high test specificities. Nine of the remaining 23 tests produced poor test sensitivities and specificities and/or showed poor performances in seroconversion panels, while 14 tests had high test performances with diagnostic odds ratios ranging from 590.70 to 28822.20. Conclusions Performances varied widely among individual point-of-care tests

  1. Curriculum-based measurement of oral reading (R-CBM): a diagnostic test accuracy meta-analysis of evidence supporting use in universal screening.

    PubMed

    Kilgus, Stephen P; Methe, Scott A; Maggin, Daniel M; Tomasula, Jessica L

    2014-08-01

    A great deal of research over the past decade has examined the appropriateness of curriculum-based measurement of oral reading (R-CBM) in universal screening. Multiple researchers have meta-analyzed available correlational evidence, yielding support for the interpretation of R-CBM as an indicator of general reading proficiency. In contrast, researchers have yet to synthesize diagnostic accuracy evidence, which pertains to the defensibility of the use of R-CBM for screening purposes. The overall purpose of this research was to therefore conduct the first meta-analysis of R-CBM diagnostic accuracy research. A systematic search of the literature resulted in the identification of 34 studies, including 20 peer-reviewed articles, 7 dissertations, and 7 technical reports. Bivariate hierarchical linear models yielded generalized estimates of diagnostic accuracy statistics, which predominantly exceeded standards for acceptable universal screener performance. For instance, when predicting criterion outcomes within a school year (≤9 months), R-CBM sensitivity ranged between .80 and .83 and specificity ranged between .71 and .73. Multiple moderators of R-CBM diagnostic accuracy were identified, including the (a) R-CBM cut score used to define risk, (b) lag in time between R-CBM and criterion test administration, and (c) percentile rank corresponding to the criterion test cut score through which students were identified as either truly at risk or not at risk. Follow-up analyses revealed substantial variability of extracted cut scores within grade and time of year (i.e., fall, winter, and spring). This result called into question the inflexible application of a single cut score across contexts and suggested the potential necessity of local cut scores. Implications for practices, directions for future research, and limitations are discussed. PMID:25107410

  2. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy

    PubMed Central

    Jekova, Irena; Bortolan, Giovanni

    2015-01-01

    Traditional means for identity validation (PIN codes, passwords), and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (rI) and II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory with healthy people, and there was not a significant decrease in nonhealthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954

  3. Personal Verification/Identification via Analysis of the Peripheral ECG Leads: Influence of the Personal Health Status on the Accuracy.

    PubMed

    Jekova, Irena; Bortolan, Giovanni

    2015-01-01

    Traditional means for identity validation (PIN codes, passwords), and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads I (r I) and II (r II), the first principal ECG component calculated from them (r PCA), and linear and nonlinear combinations of r I, r II, and r PCA. For the verification task, the one-to-one scenario is applied and threshold values for r I, r II, and r PCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory with healthy people, and there was not a significant decrease in nonhealthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954

  4. Analysis of the effect of cone-beam geometry and test object configuration on the measurement accuracy of a computed tomography scanner used for dimensional measurement

    NASA Astrophysics Data System (ADS)

    Kumar, Jagadeesha; Attridge, Alex; Wood, P. K. C.; Williams, Mark A.

    2011-03-01

    Industrial x-ray computed tomography (CT) scanners are used for non-contact dimensional measurement of small, fragile components and difficult-to-access internal features of castings and mouldings. However, the accuracy and repeatability of measurements are influenced by factors such as cone-beam system geometry, test object configuration, x-ray power, material and size of test object, detector characteristics and data analysis methods. An attempt is made in this work to understand the measurement errors of a CT scanner over the complete scan volume, taking into account only the errors in system geometry and the object configuration within the scanner. A cone-beam simulation model is developed with the radiographic image projection and reconstruction steps. Known errors in the geometrical parameters were introduced into the model to understand the effect of the geometry of the cone-beam CT system on measurement accuracy for different positions, orientations and sizes of the test object. Simulation analysis shows that the geometrical parameters have a significant influence on the dimensional measurement at specific configurations of the test object. Finally, the importance of system alignment and estimation of correct parameters for accurate CT measurements is outlined based on the analysis.

  5. Expected accuracy in a measurement of the CKM angle alpha using a Dalitz plot analysis of B0 ---> rho pi decays in the BTeV project

    SciTech Connect

    Shestermanov, K.E.; Vasiliev, A.N; Butler, J.; Derevschikov, A.A.; Kasper, P.; Kiselev, V.V.; Kravtsov, V.I.; Kubota, Y.; Kutschke, R.; Matulenko, Y.A.; Minaev, N.G.; /Serpukhov, IHEP /Fermilab /Minnesota U. /Syracuse U. /INFN, Milan

    2005-12-01

    A precise measurement of the angle {alpha} in the CKM triangle is very important for a complete test of the Standard Model. A theoretically clean method to extract {alpha} is provided by B{sup 0} {yields} {rho}{pi} decays. Monte Carlo simulations to obtain the BTeV reconstruction efficiency and to estimate the signal to background ratio for these decays were performed. Finally, the time-dependent Dalitz plot analysis, using the isospin amplitude formalism for tree and penguin contributions, was carried out. It was shown that in one year of data taking BTeV could achieve an accuracy on {alpha} better than 5{sup o}.

  6. Evaluating LANDSAT wildland classification accuracies

    NASA Technical Reports Server (NTRS)

    Toll, D. L.

    1980-01-01

    Procedures to evaluate the accuracy of LANDSAT derived wildland cover classifications are described. The evaluation procedures include: (1) implementing a stratified random sample for obtaining unbiased verification data; (2) performing area by area comparisons between verification and LANDSAT data for both heterogeneous and homogeneous fields; (3) providing overall and individual classification accuracies with confidence limits; (4) displaying results within contingency tables for analysis of confusion between classes; and (5) quantifying the amount of information (bits/square kilometer) conveyed in the LANDSAT classification.
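
    Steps (3) and (4) amount to standard contingency-table arithmetic; the Python sketch below shows overall and per-class accuracies with an approximate (normal-approximation) confidence limit for a small hypothetical confusion matrix, not LANDSAT data.

      # Accuracy assessment from a classification contingency table. The 3x3 matrix is
      # hypothetical; rows = verification classes, columns = LANDSAT-derived classes.
      import math

      confusion = [
          [120,  10,   5],
          [ 15, 200,  20],
          [  5,  25, 100],
      ]

      total = sum(sum(row) for row in confusion)
      correct = sum(confusion[i][i] for i in range(len(confusion)))
      overall = correct / total

      # Approximate 95% confidence limits for the overall accuracy.
      half_width = 1.96 * math.sqrt(overall * (1 - overall) / total)
      per_class = [confusion[i][i] / sum(confusion[i]) for i in range(len(confusion))]

      print(round(overall, 3), (round(overall - half_width, 3), round(overall + half_width, 3)))
      print([round(a, 3) for a in per_class])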

  7. Clinical Accuracy of the Respiratory Tumor Tracking System of the CyberKnife: Assessment by Analysis of Log Files

    SciTech Connect

    Hoogeman, Mischa Prevost, Jean-Briac; Nuyttens, Joost; Poell, Johan; Levendag, Peter; Heijmen, Ben

    2009-05-01

    Purpose: To quantify the clinical accuracy of the respiratory motion tracking system of the CyberKnife treatment device. Methods and Materials: Data in log files of 44 lung cancer patients treated with tumor tracking were analyzed. Errors in the correlation model, which relates the internal target motion with the external breathing motion, were quantified. The correlation model error was compared with the geometric error obtained when no respiratory tracking was used. Errors in the prediction method were calculated by subtracting the predicted position from the actual measured position after 192.5 ms (the time lag to prediction in our current system). The prediction error was also measured for a time lag of 115 ms and a new prediction method. Results: The mean correlation model errors were less than 0.3 mm. Standard deviations describing intrafraction variations around the whole-fraction mean error were 0.2 to 1.9 mm for cranio-caudal, 0.1 to 1.9 mm for left-right, and 0.2 to 2.5 mm for anterior-posterior directions. Without the use of respiratory tracking, these variations would have been 0.2 to 8.1 mm, 0.2 to 5.5 mm, and 0.2 to 4.4 mm. The overall mean prediction error was small (0.0 {+-} 0.0 mm) for all directions. The intrafraction standard deviation ranged from 0.0 to 2.9 mm for a time delay of 192.5 ms but was halved by using the new prediction method. Conclusions: Analyses of the log files of real clinical cases have shown that the geometric error caused by respiratory motion is substantially reduced by the application of respiratory motion tracking.
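
    The prediction-error computation described here amounts to shifting the logged trace by the system latency and differencing; the Python sketch below uses synthetic signals (the sampling interval, lag, and motion trace are made up, and the naive predictor simply reuses the current position).

      # Prediction error from a tracking log: the position predicted at time t for
      # time t + lag is compared with the position actually measured at t + lag.
      # All signals below are synthetic; a naive predictor reuses the current position.
      import numpy as np

      dt = 0.0385                                   # synthetic sampling interval (s)
      lag = 5                                       # ~192.5 ms expressed in samples
      t = np.arange(0, 60, dt)
      actual = 4.0 * np.sin(2 * np.pi * t / 4.0)    # synthetic breathing motion (mm)

      predicted_ahead = actual[:-lag]               # naive prediction for t + lag
      error = predicted_ahead - actual[lag:]        # predicted minus later measurement
      print(round(error.mean(), 3), round(error.std(), 3))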

  8. Noninvasive identification of left main and triple vessel coronary artery disease: improved accuracy using quantitative analysis of regional myocardial stress distribution and washout of thallium-201

    SciTech Connect

    Maddahi, J.; Abdulla, A.; Garcia, E.V.; Swan, H.J.; Berman, D.S.

    1986-01-01

    The capabilities of visual and quantitative analysis of stress redistribution thallium-201 scintigrams, exercise electrocardiography and exercise blood pressure response were compared for correct identification of extensive coronary disease, defined as left main or triple vessel coronary artery disease, or both (50% or more luminal diameter coronary narrowing), in 105 consecutive patients with suspected coronary artery disease. Extensive disease was present in 56 patients and the remaining 49 had either less extensive coronary artery disease (n = 34) or normal coronary arteriograms (n = 15). Although exercise blood pressure response, exercise electrocardiography and visual thallium-201 analysis were highly specific (98, 88 and 96%, respectively), they were insensitive for identification of patients with extensive disease (14, 45 and 16%, respectively). Quantitative thallium-201 analysis significantly improved the sensitivity of visual thallium-201 analysis for identification of patients with extensive disease (from 16 to 63%, p less than 0.001) without a significant loss of specificity (96 versus 86%, p = NS). Eighteen (64%) of the 28 patients who were misclassified by visual analysis as having less extensive disease were correctly classified as having extensive disease by virtue of quantitative analysis of regional myocardial thallium-201 washout. When the results of quantitative thallium-201 analysis were combined with those of blood pressure and electrocardiographic response to exercise, the sensitivity and specificity for identification of patients with extensive disease was 86 and 76%, respectively, and the highest overall accuracy (0.82) was obtained.

  9. Diagnostic accuracy of stroke volume variation measured with uncalibrated arterial waveform analysis for the prediction of fluid responsiveness in patients with impaired left ventricular function: a prospective, observational study.

    PubMed

    Montenij, L J; Sonneveld, J P C; Nierich, A P; Buhre, W F; de Waal, E E C

    2016-08-01

    Uncalibrated arterial waveform analysis enables dynamic preload assessment in a minimally invasive fashion. Evidence about the validity of the technique in patients with impaired left ventricular function is scarce, while adequate cardiac preload assessment would be of great value in these patients. The aim of this study was to investigate the diagnostic accuracy of stroke volume variation (SVV) measured with the FloTrac/Vigileo™ system in patients with impaired left ventricular function. In this prospective, observational study, 22 patients with a left ventricular ejection fraction of 40 % or less undergoing elective coronary artery bypass grafting were included. Patients were considered fluid responsive if cardiac output increased by 15 % or more after volume loading (7 ml kg(-1) ideal body weight). The following variables were calculated: area under the receiver operating characteristics (ROC) curve, ideal cut-off value for SVV, sensitivity, specificity, positive and negative predictive values, and overall accuracy. In addition, SVV cut-off points to obtain 90 % true positive and 90 % true negative predictions were determined. ROC analysis revealed an area under the curve of 0.70 [0.47; 0.92]. The ideal SVV cut-off value was 10 %, with a corresponding sensitivity and specificity of 56 and 69 % respectively. Overall accuracy was 64 %, positive and negative predictive values were 69 and 56 % respectively. SVV values to obtain more than 90 % true positive and negative predictions were 16 and 6 % respectively. The ability of uncalibrated arterial waveform analysis SVV to predict fluid responsiveness in patients with impaired left ventricular function was low. PMID:26227160
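
    Predictive values of this kind follow from sensitivity, specificity, and the proportion of fluid responders; the Python sketch below uses the reported sensitivity and specificity together with an assumed responder proportion of roughly 55%, purely for illustration rather than as a reanalysis of the study data.

      # Predictive values from sensitivity, specificity and responder prevalence.
      # Sensitivity/specificity follow the abstract; the 0.55 prevalence is assumed.

      def predictive_values(sens, spec, prevalence):
          ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
          npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
          return ppv, npv

      ppv, npv = predictive_values(sens=0.56, spec=0.69, prevalence=0.55)
      print(round(ppv, 2), round(npv, 2))   # roughly 0.69 and 0.56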

  10. Diagnostic accuracy of serum squamous cell carcinoma antigen and squamous cell carcinoma antigen-immunoglobulin M for hepatocellular carcinoma: A meta-analysis

    PubMed Central

    ZHANG, JIAN; SHAO, CHUXIAO; ZHOU, QINGYUN; ZHU, YIMIN; ZHU, JINDE; TU, CHAOYONG

    2015-01-01

    A number of individual studies have evaluated the diagnostic efficiency of serum squamous cell carcinoma antigen (SCCA) and SCCA-immunoglobulin (IgM) for diagnosing hepatocellular carcinoma (HCC), but the results have been conflicting. The aim of this study was to determine the diagnostic accuracy of serum SCCA and SCCA-IgM for HCC. A systematic review of related studies was conducted and relevant data on the accuracy of serum SCCA and SCCA-IgM in the diagnosis of HCC were pooled using random-effects models. Summary receiver operating characteristic curve (SROC) analysis was used to summarize the overall test performance. A total of 12 studies were included in our meta-analysis. The summary estimates for serum SCCA and SCCA-IgM for HCC diagnosis in the included studies were as follows: Sensitivity = 0.59 (95% CI: 0.56–0.62) vs. 0.60 (95% CI: 0.56–0.63); specificity = 0.76 (95% CI: 0.73–0.79) vs. 0.70 (95% CI: 0.67–0.73); diagnostic odds ratio (DOR) = 6.68 (95% CI: 3.71–12.03) vs. 7.32 (95% CI: 3.31–16.15); and area under the SROC curve = 0.7826 vs. 0.7955. Therefore, SCCA and SCCA-IgM exhibited moderate diagnostic accuracy for HCC. Due to the design limitations, the results of published studies should be interpreted with caution. In addition, well-designed studies including larger sample sizes should be conducted to rigorously evaluate the diagnostic value of SCCA and SCCA-IgM. PMID:26623071

  11. Bayesian meta-analysis of test accuracy in the absence of a perfect reference test applied to bone scintigraphy for the diagnosis of complex regional pain syndrome.

    PubMed

    Held, Ulrike; Brunner, Florian; Steurer, Johann; Wertli, Maria M

    2015-11-01

    There is conflicting evidence about the accuracy of bone scintigraphy (BS) for the diagnosis of complex regional pain syndrome 1 (CRPS 1). In a meta-analysis of diagnostic studies, the evaluation of test accuracy is impeded by the use of different imperfect reference tests. The aim of our study is to summarize sensitivity and specificity of BS for CRPS 1 and to identify factors to explain heterogeneity. We use a hierarchical Bayesian approach to model test accuracy and threshold, and we present different models accounting for the imperfect nature of the reference tests, and assuming conditional dependence between BS and the reference test results. Further, we include disease duration as explanatory variable in the model. The models are compared using summary ROC curves and the deviance information criterion (DIC). Our results show that those models which account for different imperfect reference tests with conditional dependence and inclusion of the covariate are the ones with the smallest DIC. The sensitivity of BS was 0.87 (95% credible interval 0.73-0.97) and the overall specificity was 0.87 (0.73-0.95) in the model with the smallest DIC, in which missing values of the covariate are imputed within the Bayesian framework. The estimated effect of duration of symptoms on the threshold parameter was 0.17 (-0.25 to 0.57). We demonstrate that the Bayesian models presented in this paper are useful to address typical problems occurring in meta-analysis of diagnostic studies, including conditional dependence between index test and reference test, as well as missing values in the study-specific covariates. PMID:26479506

  12. ACCURACY AND TRACE ORGANIC ANALYSES

    EPA Science Inventory

    Accuracy in trace organic analysis presents a formidable problem to the residue chemist. He is confronted with the analysis of a large number and variety of compounds present in a multiplicity of substrates at levels as low as parts-per-trillion. At these levels, collection, isol...

  13. The accuracy of automatic tracking

    NASA Technical Reports Server (NTRS)

    Kastrov, V. V.

    1974-01-01

    It has generally been assumed that tracking accuracy varies with the rate of change of the measurement-conversion curve. The problem considered here is that internal noise increases along with the signals processed by the tracking device, so that tracking accuracy drops. The main prerequisite for a solution is to account for the dependence of the output signal of the tracking-device sensor not only on the measured parameter but also on the signal itself.

  14. The accuracy of endometrial sampling in women with postmenopausal bleeding: a systematic review and meta-analysis.

    PubMed

    van Hanegem, Nehalennia; Prins, Marileen M C; Bongers, Marlies Y; Opmeer, Brent C; Sahota, Daljit Singh; Mol, Ben Willem J; Timmermans, Anne

    2016-02-01

    Postmenopausal bleeding (PMB) can be the first sign of endometrial cancer. In case of a thickened endometrium, endometrial sampling is often used in these women. In this systematic review, we studied the accuracy of endometrial sampling for the diagnoses of endometrial cancer, atypical hyperplasia and endometrial disease (endometrial pathology, including benign polyps). We systematically searched the literature for studies comparing the results of endometrial sampling in women with postmenopausal bleeding with two different reference standards: blind dilatation and curettage (D&C) and hysteroscopy with histology. We assessed the quality of the detected studies by the QUADAS-2 tool. For each included study, we calculated the fraction of women in whom endometrial sampling failed. Furthermore, we extracted numbers of cases of endometrial cancer, atypical hyperplasia and endometrial disease that were identified or missed by endometrial sampling. We detected 12 studies reporting on 1029 women with postmenopausal bleeding: five studies with dilatation and curettage (D&C) and seven studies with hysteroscopy as a reference test. The weighted sensitivity of endometrial sampling with D&C as a reference for the diagnosis of endometrial cancer was 100% (range 100-100%) and 92% (71-100) for the diagnosis of atypical hyperplasia. Only one study reported sensitivity for endometrial disease, which was 76%. When hysteroscopy was used as a reference, weighted sensitivities of endometrial sampling were 90% (range 50-100), 82% (range 56-94) and 39% (21-69) for the diagnosis of endometrial cancer, atypical hyperplasia and endometrial disease, respectively. For all diagnoses studied and both reference tests used, specificity was 98-100%. The weighted failure rate of endometrial sampling was 11% (range 1-53%), while insufficient samples were found in 31% (range 7-76%). In these women with insufficient or failed samples, an endometrial (pre) cancer was found in 7% (range 0-18%). In women with

  15. Assessing the Classification Accuracy of Early Numeracy Curriculum-Based Measures Using Receiver Operating Characteristic Curve Analysis

    ERIC Educational Resources Information Center

    Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.

    2016-01-01

    Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…

  16. The accuracy of the anti-mitochondrial antibody and the M2 subtype test for diagnosis of primary biliary cirrhosis: a meta-analysis.

    PubMed

    Hu, Shiling; Zhao, Fengrong; Wang, Qingsong; Chen, Wei-Xian

    2014-11-01

    The aim of this study was to evaluate the diagnostic value of anti-mitochondrial antibodies (AMAs) and/or the M2 subtype (AMA-M2) in patients with primary biliary cirrhosis (PBC). AMA/AMA-M2 data were obtained by searching electronic databases. Studies showing AMA/AMA-M2 results in patients with PBC and control groups with other liver diseases or healthy livers were included. The quality of the involved studies was assessed using the QUADAS tool. The pooled sensitivity and specificity were calculated, and stratified analysis was performed according to possible heterogeneity sources. The pooled AMA (all methods) sensitivity and specificity were 84.5% (95% confidence interval (CI) 83.3%-85.6%) and 97.8% (95% CI 97.6%-98.0%), respectively. The positive and negative likelihood ratios were 25.201 (95% CI 17.583-36.118) and 0.162 (95% CI 0.131-0.199), respectively. The current evidence suggests that AMA and AMA-M2 show favorable accuracy for the diagnosis of PBC with high specificity and sensitivity. AMA is a better and more comprehensive marker than AMA-M2. The accuracy established in this meta-analysis is based on clinical studies using patient cohorts from different ethnicities. PMID:24501161

  17. Diagnostic accuracy of posterior pole asymmetry analysis parameters of spectralis optical coherence tomography in detecting early unilateral glaucoma

    PubMed Central

    Dave, Paaraj; Shah, Juhi

    2015-01-01

    Purpose: To report the diagnostic ability of posterior pole asymmetry analysis (PPAA) parameters of spectralis optical coherence tomography (OCT) in detecting early unilateral glaucoma. Methods: A prospective, cross-sectional study which included 80 eyes of 80 normal subjects and 76 eyes of 76 patients with unilateral early primary open-angle glaucoma by Hodapp-Anderson-Parrish classification. All subjects were of age more than 18 years, best-corrected visual acuity 20/40 or better, and a refractive error within ± 5 diopter (D) sphere and ± 3 D cylinder. Control subjects had a normal ocular examination, intraocular pressure (IOP) <22 mmHg, no past history of high IOP, no family history of glaucoma, normal optic disc morphology, and visual field in both eyes. One eye of the control subject was randomly included. All eyes underwent OCT for retinal nerve fiber layer (RNFL) analysis and PPAA. The number of continuous black squares was noted in the asymmetry analysis (right-left + hemisphere asymmetry). The area under curve (AUC) was calculated for all OCT parameters. Results: The best value for AUC for RNFL analysis was 0.858 for the inferotemporal quadrant thickness. This was similar to the best value for AUC for PPAA which was 0.833 for the inferior macular thickness parameter (P = 0.5). The AUC for the right-left and the hemisphere asymmetry part of PPAA was 0.427 and 0.499, respectively. Conclusion: The macular thickness PPAA parameters were equally good as the RNFL parameters. However, the asymmetry analysis parameters performed poorly and need further refinement before its use in early unilateral glaucoma diagnosis. PMID:26669335

  18. The accuracy of auditors' and layered voice Analysis (LVA) operators' judgments of truth and deception during police questioning.

    PubMed

    Horvath, Frank; McCloughan, Jamie; Weatherman, Dan; Slowik, Stanley

    2013-03-01

    The purpose of this study was to determine if auditors could identify truthful and deceptive persons in a sample (n = 74) of audio recordings used to assess the effectiveness of layered voice analysis (LVA). The LVA employs an automated algorithm to detect deception, but it was not effective here. There were 31 truthful and 43 deceptive persons in the sample and two LVA operators averaged 48% correct decisions on truth-tellers and 25% on deceivers. Subsequent to the LVA analysis the recordings were audited by three interviewers, each independently rendering a decision of truthful or deceptive and indicating their confidence. Auditors' judgments averaged 68% correct decisions on truth-tellers and 71% on deceivers. Auditors' detection rates, generally, exceeded chance and there was significantly (p < 0.05) greater confidence on correct than incorrect judgments of deceivers but not on truth-tellers. These results suggest that the success reported for LVA analysis may be due to operator's judgment. PMID:23406506

  19. Development of methodology for the optimization of classification accuracy of Landsat TM/ETM+ imagery for supporting fast flood hydrological analysis

    NASA Astrophysics Data System (ADS)

    Alexakis, D. D.; Hadjimitsis, D. G.; Agapiou, A.; Retalis, A.; Themistocleous, K.; Michaelides, S.; Pashiardis, S.

    2012-04-01

    One of the important tools for detection and quantification of land-cover changes across catchment areas is the classification of multispectral satellite imagery. Land cover changes, may be used to describe dynamics of urban settlements and vegetation patterns as an important indicator of urban ecological environments. Several techniques have been reported to improve classification results in terms of land use discrimination and accuracy of resulting classes. The aim of this study is to improve classification results of multispectral satellite imagery for supporting flood risk assessment analysis in a catchment area in Cyprus (Yialias river). This paper describes the results obtained by integrating remote sensing techniques such as classification analysis and contemporary statistical analysis (maximum entropy) for detecting urbanization activities in a catchment area in Cyprus. The final results were incorporated in an integrated flood risk management model. This study aims to test different material samples in the Yialias region in order to examine: a) their spectral behavior under different precipitation rates and b) to introduce an alternative methodology to optimize the classification results derived from single satellite imagery with the combined use of satellite, spectroradiometric and precipitation data. At the end, different classification algorithms and statistical analysis are used to verify and optimize the final results such as object based classification and maximum entropy. The main aim of the study is the verification of the hypothesis that the multispectral classification accuracy is improved if the land surface humidity is high. This hypothesis was tested against Landsat derived reflectance values and validated with in-situ reflectance observations with the use of high spectral resolution spectroradiometers. This study aspires to highlight the potential of medium resolution satellite images such as those of Landsat TM/ETM+ for Land Use / Land cover

  20. A step-by-step guide to the systematic review and meta-analysis of diagnostic and prognostic test accuracy evaluations.

    PubMed

    Liu, Z; Yao, Z; Li, C; Liu, X; Chen, H; Gao, C

    2013-06-11

    In evidence-based medicine (EBM), systematic reviews and meta-analyses have been widely applied in biological and medical research. Moreover, the most popular application of meta-analyses in this field may be to examine diagnostic (sensitivity and specificity) and prognostic (hazard ratio (HR) and its variance, standard error (SE) or confidence interval (CI)) test accuracy. However, conducting such analyses requires not only a great deal of time but also an advanced professional knowledge of mathematics, statistics and computer science. Regarding the practical application of meta-analyses for diagnostic and prognostic markers, the majority of users are clinicians and biologists, most of whom are not skilled at mathematics and computer science in particular. Hence, it is necessary for these users to have a simplified version of a protocol to help them to quickly conduct meta-analyses of the accuracy of diagnostic and prognostic tests. The aim of this paper is to enable individuals who have never performed a meta-analysis to do so from scratch. The paper does not attempt to serve as a comprehensive theoretical guide but instead describes one rigorous way of conducting a meta-analysis for diagnostic and prognostic markers. Investigators who follow the outlined methods should be able to understand the basic ideas behind the steps taken, the meaning of the meta-analysis results obtained for diagnostic and prognostic markers and the scope of questions that can be answered with Systematic Reviews and Meta-Analyses (SRMA). The presented protocols have been successfully tested by clinicians without meta-analysis experience. PMID:23695015
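
    Before any pooling, each primary study contributes either a 2×2 table (diagnostic accuracy) or a hazard ratio with its standard error (prognostic accuracy); the Python sketch below shows only this per-study extraction step, with hypothetical numbers, and is not one of the pooling models described in the protocol.

      # Per-study extraction preceding the pooling step: sensitivity and specificity
      # from 2x2 counts, and a log HR with its SE from a reported HR and 95% CI.
      # All numbers are hypothetical.
      import math

      studies = [
          {"tp": 40, "fp": 12, "fn": 10, "tn": 88},
          {"tp": 25, "fp":  8, "fn":  5, "tn": 62},
      ]
      for s in studies:
          sens = s["tp"] / (s["tp"] + s["fn"])
          spec = s["tn"] / (s["tn"] + s["fp"])
          print(round(sens, 3), round(spec, 3))

      hr, lo, hi = 1.8, 1.2, 2.7                       # reported HR and 95% CI
      log_hr = math.log(hr)
      se_log_hr = (math.log(hi) - math.log(lo)) / (2 * 1.96)
      print(round(log_hr, 3), round(se_log_hr, 3))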

  1. Diagnostic test accuracy of anti-glycopeptidolipid-core IgA antibodies for Mycobacterium avium complex pulmonary disease: systematic review and meta-analysis

    PubMed Central

    Shibata, Yuji; Horita, Nobuyuki; Yamamoto, Masaki; Tsukahara, Toshinori; Nagakura, Hideyuki; Tashiro, Ken; Watanabe, Hiroki; Nagai, Kenjiro; Nakashima, Kentaro; Ushio, Ryota; Ikeda, Misako; Narita, Atsuya; Kanai, Akinori; Sato, Takashi; Kaneko, Takeshi

    2016-01-01

    Currently, an anti-glycopeptidolipid (GPL)-core IgA antibody assay kit for diagnosing Mycobacterium avium complex (MAC) is commercially available. We conducted this systematic review and meta-analysis to reveal the precise diagnostic accuracy of anti-GPL-core IgA antibodies for MAC pulmonary disease (MAC-PD). We systematically searched reports that could provide data for both sensitivity and specificity of the anti-GPL-core IgA antibody for clinically diagnosed MAC-PD. Diagnostic test accuracy was estimated using the bivariate model. Of the 257 articles that we had found through the primary search, we finally included 16 reports consisting of 1098 reference positive subjects and 2270 reference negative subjects. The diagnostic odds ratio was 24.8 (95% CI 11.6–52.8, I2 = 5.5%) and the area under the hierarchical summary receiver operating characteristic curve was 0.873 (95% CI 0.837–0.913). With a cutoff value of 0.7 U/mL, the summary estimates of sensitivity and specificity were 0.696 (95% CI 0.621–0.761) and 0.906 (95% CI 0.836–0.951), respectively. The positive and negative likelihood ratios were 7.4 (95% CI 4.1–13.8) and 0.34 (95% CI 0.26–0.43), respectively. The demanding clinical diagnostic criteria may be a cause of false positives of the index test. The index test had good overall diagnostic accuracy and was useful for ruling in MAC-PD at this cutoff value. PMID:27373718

  2. Toward subchemical accuracy in computational thermochemistry: focal point analysis of the heat of formation of NCO and [H,N,C,O] isomers.

    PubMed

    Schuurman, Michael S; Muir, Steven R; Allen, Wesley D; Schaefer, Henry F

    2004-06-22

    In continuing pursuit of thermochemical accuracy to the level of 0.1 kcal mol(-1), the heats of formation of NCO, HNCO, HOCN, HCNO, and HONC have been rigorously determined using state-of-the-art ab initio electronic structure theory, including conventional coupled cluster methods [coupled cluster singles and doubles (CCSD), CCSD with perturbative triples (CCSD(T)), and full coupled cluster through triple excitations (CCSDT)] with large basis sets, conjoined in some cases with explicitly correlated MP2-R12/A computations. Limits of valence and all-electron correlation energies were extrapolated via focal point analysis using correlation consistent basis sets of the form cc-pVXZ (X=2-6) and cc-pCVXZ (X=2-5), respectively. In order to reach subchemical accuracy targets, core correlation, spin-orbit coupling, special relativity, the diagonal Born-Oppenheimer correction, and anharmonicity in zero-point vibrational energies were accounted for. Various coupled cluster schemes for partially including connected quadruple excitations were also explored, although none of these approaches gave reliable improvements over CCSDT theory. Based on numerous, independent thermochemical paths, each designed to balance residual ab initio errors, our final proposals are ΔH°f,0(NCO) = +30.5, ΔH°f,0(HNCO) = -27.6, ΔH°f,0(HOCN) = -3.1, ΔH°f,0(HCNO) = +40.9, and ΔH°f,0(HONC) = +56.3 kcal mol(-1). The internal consistency and convergence behavior of the data suggest accuracies of +/-0.2 kcal mol(-1) in these predictions, except perhaps in the HCNO case. However, the possibility of somewhat larger systematic errors cannot be excluded, and the need for CCSDTQ [full coupled cluster through quadruple excitations] computations to eliminate remaining uncertainties is apparent. PMID:15268193
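
    The basis-set limits in a focal point analysis are obtained by extrapolating correlation energies against the cardinal number X of the cc-pVXZ series; the Python sketch below shows the widely used two-point X^-3 form as a generic illustration (the placeholder energies, and the choice of this particular extrapolant, are assumptions rather than details taken from the paper).

      # Two-point extrapolation of correlation energies, E(X) = E_CBS + A * X**(-3),
      # a form commonly used with cc-pVXZ basis sets. Energies are placeholders (hartree).

      def extrapolate_cbs(e_x, e_y, x, y):
          """Estimated complete-basis-set correlation energy from two cardinal numbers."""
          return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

      print(extrapolate_cbs(e_x=-0.5123, e_y=-0.5168, x=5, y=6))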

  3. Diagnostic test accuracy of anti-glycopeptidolipid-core IgA antibodies for Mycobacterium avium complex pulmonary disease: systematic review and meta-analysis.

    PubMed

    Shibata, Yuji; Horita, Nobuyuki; Yamamoto, Masaki; Tsukahara, Toshinori; Nagakura, Hideyuki; Tashiro, Ken; Watanabe, Hiroki; Nagai, Kenjiro; Nakashima, Kentaro; Ushio, Ryota; Ikeda, Misako; Narita, Atsuya; Kanai, Akinori; Sato, Takashi; Kaneko, Takeshi

    2016-01-01

    Currently, an anti-glycopeptidolipid (GPL)-core IgA antibody assay kit for diagnosing Mycobacterium avium complex (MAC) is commercially available. We conducted this systematic review and meta-analysis to reveal the precise diagnostic accuracy of anti-GPL-core IgA antibodies for MAC pulmonary disease (MAC-PD). We systematically searched reports that could provide data for both sensitivity and specificity of the anti-GPL-core IgA antibody for clinically diagnosed MAC-PD. Diagnostic test accuracy was estimated using the bivariate model. Of the 257 articles that we had found through the primary search, we finally included 16 reports consisting of 1098 reference positive subjects and 2270 reference negative subjects. The diagnostic odds ratio was 24.8 (95% CI 11.6-52.8, I(2) = 5.5%) and the area under the hierarchical summary receiver operating characteristic curve was 0.873 (95% CI 0.837-0.913). With a cutoff value of 0.7 U/mL, the summary estimates of sensitivity and specificity were 0.696 (95% CI 0.621-0.761) and 0.906 (95% CI 0.836-0.951), respectively. The positive and negative likelihood ratios were 7.4 (95% CI 4.1-13.8) and 0.34 (95% CI 0.26-0.43), respectively. The demanding clinical diagnostic criteria may be a cause of false positives of the index test. The index test had good overall diagnostic accuracy and was useful for ruling in MAC-PD at this cutoff value. PMID:27373718

  4. Shadow Analysis Technique for Extraction of Building Height using High Resolution Satellite Single Image and Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Raju, P. L. N.; Chaudhary, H.; Jha, A. K.

    2014-11-01

    High resolution satellite data with metadata information are used to extract building heights from shadows. The proposed approach is divided into two phases: 1) rooftop and shadow extraction and 2) height estimation. First, the rooftop and shadow regions were extracted by manual or automatic methods using Example-Based and Rule-Based approaches. After feature extraction, the next step is to estimate the height of the building from the rooftop in association with its shadow, using the Ratio Method and the relation between sun and satellite geometry. The performance analysis shows a total mean height error of 0.67 m for the Ratio Method, 1.51 m for the Example-Based approach and 0.96 m for the Rule-Based approach. The analysis concluded that the Ratio Method, i.e. the manual method, is the most accurate for height estimation but is time consuming, so the automatic Rule-Based approach is preferable to the Example-Based approach, which requires more knowledge and more training samples and slows the processing rate of the method.
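
    At its core, shadow-based height estimation rests on the trigonometric relation between shadow length and sun elevation; the Python sketch below shows this relation for a near-nadir acquisition and omits the sun-satellite azimuth and viewing-geometry corrections that the full method accounts for.

      # Simplified building-height estimate from shadow length and sun elevation.
      # Valid for a near-nadir image; the full method also corrects for the
      # sun-satellite azimuth and viewing geometry given in the image metadata.
      import math

      def building_height(shadow_length_m, sun_elevation_deg):
          return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

      print(round(building_height(shadow_length_m=12.4, sun_elevation_deg=55.0), 2))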

  5. Evaluation of Accuracy for 2D Elastic-Plastic Analysis by Embedded Force Doublet Model Combined with Automated Delaunay Tessellation

    NASA Astrophysics Data System (ADS)

    Ino, Takuichiro; Hasib, M. D. Abdul; Takase, Toru; Saimoto, Akihide

    2015-03-01

    An embedded force doublet (EFD) model is proposed to express the presence of permanent strain in the body force method (BFM). BFM is a boundary-type method for elastic stress analysis based on the principle of superposition. In the EFD model, the permanent strain is replaced by distributed force doublets. In an actual elastic-plastic analysis, the plastic region, whose shape is not clear a priori, has to be discretized into elements in which the magnitude of the embedded force doublets is unknown and must be determined numerically. In general, determining the magnitude of the EFD is considerably difficult owing to the nonlinear nature of the yield criterion and the plastic constitutive relations. In this study, by introducing an automated Delaunay tessellation scheme for discretizing the prospective plastic region, an appreciable reduction in input data was realized. In addition, in order to improve the computational efficiency, the influence coefficients used for determining the magnitude of the EFD are stored in a database. The effectiveness of these two inventions was examined by computing the elastic-plastic problem of an infinite medium with a circular hole subjected to uniform internal pressure. The numerical solution was compared with Nadai’s closed-form solution, and good agreement was found.

  6. Correlative and multivariate analysis of increased radon concentration in underground laboratory.

    PubMed

    Maletić, Dimitrije M; Udovičić, Vladimir I; Banjanac, Radomir M; Joković, Dejan R; Dragić, Aleksandar L; Veselinović, Nikola B; Filipović, Jelena

    2014-11-01

    The results of an analysis, using correlative and multivariate methods developed for data analysis in high-energy physics and implemented in the Toolkit for Multivariate Analysis software package, of the relation between variations in increased radon concentration and climate variables in a shallow underground laboratory are presented. Multivariate regression analysis identified a number of multivariate methods that can give a good evaluation of increased radon concentrations based on climate variables. The use of these multivariate regression methods will enable the investigation of the relation of specific climate variables with increased radon concentrations, resulting in a 'mapped' underlying functional behaviour of radon concentrations depending on a wide spectrum of climate variables. PMID:25080439

  7. The precision and accuracy of iterative and non-iterative methods of photopeak integration in activation analysis, with particular reference to the analysis of multiplets

    USGS Publications Warehouse

    Baedecker, P.A.

    1977-01-01

    The relative precisions obtainable using two digital methods and three iterative least squares fitting procedures of photopeak integration have been compared empirically using 12 replicate counts of a test sample with 14 photopeaks of varying intensity. The accuracy by which the various iterative fitting methods could analyse synthetic doublets has also been evaluated, and compared with a simple non-iterative approach. © 1977 Akadémiai Kiadó.

  8. JASMINE -- Japan Astrometry Satellite Mission for INfrared Exploration: Data Analysis and Accuracy Assessment with a Kalman Filter

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Shimokawa, T.; Shinomoto, S.; Yano, T.; Gouda, N.

    2009-09-01

    For the purpose of determining the celestial coordinates of stellar positions, consecutive observational images are laid overlapping each other with clues of stars belonging to multiple plates. In the analysis, one has to estimate not only the coordinates of individual plates, but also the possible expansion and distortion of the frame. This problem reduces to a least-squares fit that can in principle be solved by a huge matrix inversion, which is, however, impracticable. Here, we propose using Kalman filtering to perform the least-squares fit and implement a practical iterative algorithm. We also estimate errors associated with this iterative method and suggest a design of overlapping plates to minimize the error.
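
    The essential idea, replacing a single enormous batch least-squares inversion with sequential measurement updates, can be illustrated with the standard Kalman measurement-update equations; the two-parameter linear model, noise levels, and variable names in the Python sketch below are generic illustrations, not the mission's actual plate-overlap model.

      # Sequential least squares via Kalman measurement updates: each observation
      # refines the parameter estimate without inverting one large normal-equation
      # matrix. The two-parameter linear model is a generic illustration.
      import numpy as np

      x = np.zeros(2)                  # parameter estimate (e.g. plate offset and scale)
      P = np.eye(2) * 1e6              # large initial covariance = weak prior
      R = 0.01                         # measurement noise variance

      rng = np.random.default_rng(0)
      true_params = np.array([1.5, -0.3])
      for _ in range(200):
          h = np.array([1.0, rng.uniform(-1, 1)])      # design row for this observation
          z = h @ true_params + rng.normal(0, R**0.5)  # simulated measurement
          s = h @ P @ h + R                            # innovation variance
          k = (P @ h) / s                              # Kalman gain
          x = x + k * (z - h @ x)                      # state update
          P = P - np.outer(k, h @ P)                   # covariance update

      print(np.round(x, 3))                            # close to true_params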

  9. Accuracy of plasma sTREM-1 for sepsis diagnosis in systemic inflammatory patients: a systematic review and meta-analysis

    PubMed Central

    2012-01-01

    Introduction Early diagnosis of sepsis is vital to the clinical course and outcome of septic patients. Recently, soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) appears to be a potential marker of infection. The objective of this systematic review and meta-analysis was to evaluate the accuracy of plasma sTREM-1 for sepsis diagnosis in systemic inflammatory patients. Methods A systematic literature search of PubMed, Embase and Cochrane Central Register of Controlled Trials was performed using specific search terms (up to 15 October 2012). Studies were included if they assessed the accuracy of plasma sTREM-1 for sepsis diagnosis in adult patients with systemic inflammatory response syndrome (SIRS) and provided sufficient information to construct a 2 X 2 contingency table. Results Eleven studies with a total of 1,795 patients were included. The pooled sensitivity and specificity was 79% (95% confidence interval (CI), 65 to 89) and 80% (95% CI, 69 to 88), respectively. The positive likelihood ratio, negative likelihood ratio and diagnostic odds ratio were 4.0 (95% CI, 2.4 to 6.9), 0.26 (95% CI, 0.14 to 0.48), and 16 (95% CI, 5 to 46), respectively. The area under the curve of the summary receiver operator characteristic was 0.87 (95% CI, 0.84 to 0.89). Meta-regression analysis suggested that patient sample size and assay method were the main sources of heterogeneity. Publication bias was suggested by an asymmetrical funnel plot (P = 0.02). Conclusions The present meta-analysis showed that plasma sTREM-1 had a moderate diagnostic performance in differentiating sepsis from SIRS. Accordingly, plasma sTREM-1 as a single marker was not sufficient for sepsis diagnosis in systemic inflammatory patients. PMID:23194114
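
    The likelihood ratios and the diagnostic odds ratio quoted here follow directly from the pooled sensitivity and specificity; the Python sketch below reproduces the point estimates only (the confidence intervals require the covariance from the pooled model, which is not shown).

      # Likelihood ratios and diagnostic odds ratio from pooled sensitivity/specificity.
      # Point estimates only; the CIs in the abstract need the pooled model's covariance.
      sens, spec = 0.79, 0.80
      lr_pos = sens / (1 - spec)      # ~4.0
      lr_neg = (1 - sens) / spec      # ~0.26
      dor = lr_pos / lr_neg           # ~15, close to the pooled estimate of 16
      print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 1))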

  10. Accuracy of ultrasound for the diagnosis of cervical lymph node metastasis in esophageal cancer: a systematic review and meta-analysis

    PubMed Central

    Leng, Xue-Feng; Zhu, Yi; Wang, Ge-Ping; Jin, Jian; Zhang, Yu-Hong

    2016-01-01

    Background Esophageal cancer is considered a serious malignancy with respect to its prognosis and mortality rate. Cervical lymph node status is one of the keys to determining prognosis and treatment methods. However, published data vary regarding the accuracy of ultrasound in the diagnosis of cervical lymph node metastasis. We performed a meta-analysis to assess the efficacy of ultrasound for detecting cervical lymph node metastasis in patients with esophageal cancer. Methods The PubMed/MEDLINE, EMBASE, Web of Science, and Cochrane Library databases were searched to identify studies related to cervical lymph node metastasis, and 22 studies comprising 3,513 patients met our inclusion criteria. We used a bivariate meta-analysis following a random effects model to summarize the data. We also explored reasons for statistical heterogeneity using meta-regression, subgroup, and sensitivity analyses. Publication bias was assessed with a Deeks funnel plot. Results The area under the receiver operating characteristic curve was 0.97 [95% confidence interval (CI): 0.95–0.98], and the pooled diagnostic odds ratio was 121.00 (95% CI: 47.57–307.79). With cut-off values of 5 mm and >5 mm for cervical lymph node size, the sensitivities and specificities (95% confidence interval) for ultrasound detection of cervical lymph node metastasis were 84% (67–93%) and 93% (90–95%); and 94% (76–98%) and 98% (89–100%), respectively. Conclusions We show for the first time the diagnostic accuracy of ultrasound for predicting cervical lymph node-positive metastasis in esophageal cancer. Our analysis shows that ultrasonography may be an effective and reliable approach to detect cervical lymph node metastasis in esophageal cancer. However, to accommodate heterogeneity, high-quality studies are needed to further verify the efficacy of ultrasound detection. PMID:27621871

  11. Speed/accuracy tradeoff in force perception.

    PubMed

    Rank, Markus; Di Luca, Massimiliano

    2015-06-01

    There is a well-known tradeoff between speed and accuracy in judgments made under uncertainty. Diffusion models have been proposed to capture the increase in response time for more uncertain decisions and the change in performance due to a prioritization of speed or accuracy in the responses. Experimental paradigms have been confined to the visual modality, and model analyses have mostly used quantile-probability (QP) plots (response probability as a function of quantized RTs). Here, we extend diffusion modeling to haptics and test a novel type of analysis for judging model fitting. Participants classified force stimuli applied to the hand as "high" or "low." Data in QP plots indicate that the diffusion model captures well the overall pattern of responses in conditions where either speed or accuracy has been prioritized. To further the analysis, we compute just noticeable difference (JND) values separately for responses delivered with different RTs; we define these plots as JND-quantile plots. The pattern of results shows that slower responses lead to better force discrimination up to a plateau that is unaffected by prioritization instructions. Instead, the diffusion model predicts two well-separated plateaus depending on the condition. We propose that analyzing the relation between JNDs and response time should be considered in the evaluation of the diffusion model beyond the haptic modality, thus including vision. PMID:25867512

  12. A fast Monte Carlo EM algorithm for estimation in latent class model analysis with an application to assess diagnostic accuracy for cervical neoplasia in women with AGC

    PubMed Central

    Kang, Le; Carter, Randy; Darcy, Kathleen; Kauderer, James; Liao, Shu-Yuan

    2013-01-01

    In this article we use a latent class model (LCM) with prevalence modeled as a function of covariates to assess diagnostic test accuracy in situations where the true disease status is not observed, but observations on three or more conditionally independent diagnostic tests are available. A fast Monte Carlo EM (MCEM) algorithm with binary (disease) diagnostic data is implemented to estimate parameters of interest; namely, sensitivity, specificity, and prevalence of the disease as a function of covariates. To obtain standard errors for confidence interval construction of estimated parameters, the missing information principle is applied to adjust information matrix estimates. We compare the adjusted information matrix based standard error estimates with the bootstrap standard error estimates, both obtained using the fast MCEM algorithm, through an extensive Monte Carlo study. Simulation demonstrates that the adjusted information matrix approach estimates the standard error similarly to the bootstrap methods under certain scenarios. The bootstrap percentile intervals have satisfactory coverage probabilities. We then apply the LCM analysis to a real data set of 122 subjects from a Gynecologic Oncology Group (GOG) study of significant cervical lesion (S-CL) diagnosis in women with atypical glandular cells of undetermined significance (AGC) to compare the diagnostic accuracy of a histology-based evaluation, a CA-IX biomarker-based test and a human papillomavirus (HPV) DNA test. PMID:24163493
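
    The record above relies on a Monte Carlo EM algorithm with covariate-dependent prevalence; that machinery is not reproduced here. As a much simpler, hypothetical sketch of the underlying idea, the following plain EM fits a two-class latent class model with conditionally independent binary tests (no covariates, no Monte Carlo step); the function name and the simulated data are illustrative only.

```python
import numpy as np

def lcm_em(T, n_iter=500, tol=1e-8, seed=0):
    """Plain EM for a two-class latent class model with conditionally
    independent binary tests (rows of T are subjects, columns are tests).
    Returns estimated prevalence, sensitivities and specificities.
    Classes are identified only up to label swapping; the heuristic at the
    end treats the rarer class as 'diseased'."""
    rng = np.random.default_rng(seed)
    n, J = T.shape
    prev = 0.5
    se = rng.uniform(0.6, 0.9, J)   # P(test positive | diseased)
    sp = rng.uniform(0.6, 0.9, J)   # P(test negative | non-diseased)
    for _ in range(n_iter):
        # E-step: posterior probability of disease for each subject
        like1 = prev * np.prod(se ** T * (1 - se) ** (1 - T), axis=1)
        like0 = (1 - prev) * np.prod((1 - sp) ** T * sp ** (1 - T), axis=1)
        w = like1 / (like1 + like0)
        # M-step: closed-form updates of prevalence, sensitivity, specificity
        new_prev = w.mean()
        se = (w[:, None] * T).sum(axis=0) / w.sum()
        sp = ((1 - w)[:, None] * (1 - T)).sum(axis=0) / (1 - w).sum()
        converged = abs(new_prev - prev) < tol
        prev = new_prev
        if converged:
            break
    if prev > 0.5:                   # resolve label swapping
        prev, se, sp = 1 - prev, 1 - sp, 1 - se
    return prev, se, sp

# Tiny simulated example with three conditionally independent tests.
rng = np.random.default_rng(1)
diseased = rng.random(2000) < 0.3
true_se = np.array([0.90, 0.80, 0.85])
true_sp = np.array([0.95, 0.90, 0.92])
T = np.where(diseased[:, None],
             rng.random((2000, 3)) < true_se,
             rng.random((2000, 3)) > true_sp).astype(float)
print(lcm_em(T))
```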

  13. Accuracy of the interferon-gamma release assay for the diagnosis of tuberculous pleurisy: an updated meta-analysis

    PubMed Central

    Zhu, Jing; Feng, Mei; Wan, Chun

    2015-01-01

    Background and Objectives. The best method for diagnosing tuberculous pleurisy (TP) remains controversial. Since a growing number of publications focus on the interferon-gamma release assay (IGRA), we meta-analyzed the available evidence on the overall diagnostic performance of IGRA applied to pleural fluid and peripheral blood. Materials and Methods. PubMed and Embase were searched for relevant English papers up to October 31, 2014. Statistical analyses were performed using Stata and Meta-DiSc. Pooled sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), positive predictive value (PPV), negative predictive value (NPV) and diagnostic odds ratio (DOR) were calculated. Summary receiver operating characteristic curves and area under the curve (AUC) were used to summarize the overall diagnostic performance. Results. Fifteen publications met our inclusion criteria and were included in the meta-analysis. The following pooled estimates for diagnostic parameters of pleural IGRA were obtained: sensitivity, 0.82 (95% CI [0.79–0.85]); specificity, 0.87 (95% CI [0.84–0.90]); PLR, 4.94 (95% CI [2.60–9.39]); NLR, 0.22 (95% CI [0.13–0.38]); PPV, 0.91 (95% CI [0.85–0.96]); NPV, 0.79 (95% CI [0.71–0.85]); DOR, 28.37 (95% CI [10.53–76.40]); and AUC, 0.91. The corresponding estimates for blood IGRA were as follows: sensitivity, 0.80 (95% CI [0.76–0.83]); specificity, 0.70 (95% CI [0.65–0.75]); PLR, 2.48 (95% CI [1.95–3.17]); NLR, 0.30 (95% CI [0.24–0.37]); PPV, 0.79 (95% CI [0.60–0.87]); NPV, 0.75 (95% CI [0.62–0.83]); DOR, 9.96 (95% CI [6.02–16.48]); and AUC, 0.89. Conclusions. This meta-analysis suggested that pleural IGRA has potential for serving as a complementary method for diagnosing TP; however, its cost, long turnaround time, and sub-optimal performance make it unsuitable as a stand-alone diagnostic tool. Better tests for the diagnosis of TP are required. PMID:26038718

  14. Comparative analysis of the positional accuracy of CCD measurements of small bodies in the solar system: software CoLiTec and Astrometrica

    NASA Astrophysics Data System (ADS)

    Savanevych, V. E.; Briukhovetskyi, A. B.; Ivashchenko, Yu. N.; Vavilova, I. B.; Bezkrovniy, M. M.; Dikov, E. N.; Vlasenko, V. P.; Sokovikova, N. S.; Movsesian, Ia. S.; Dikhtyar, N. Yu.; Elenin, L. V.; Pohorelov, A. V.; Khlamov, S. V.

    2015-11-01

    The CoLiTec software for the automated search for small celestial objects of the solar system on a series of CCD frames has been developed within the Ukrainian virtual observatory project. Four comets and more than a thousand asteroids were discovered using the software. It was also used to send approximately 700,000 positional CCD measurements to the Minor Planet Center. In this paper, accuracy factors of positional CCD measurements using the CoLiTec software are analyzed according to data from the Minor Planet Center. The comparative analysis of these factors according to the results of the processing of the same frames using CoLiTec and Astrometrica software is also conducted. In the case of low signal-to-noise ratios, the standard deviation of positional CCD measurements using the Astrometrica software is 30-50% greater than that of the CoLiTec software.

  15. Microscale receiver operating characteristic analysis of micrometastasis recognition using activatable fluorescent probes indicates leukocyte imaging as a critical factor to enhance accuracy

    NASA Astrophysics Data System (ADS)

    Spring, Bryan Q.; Palanisami, Akilan; Hasan, Tayyaba

    2014-06-01

    Molecular-targeted probes are emerging with applications for optical biopsy of cancer. An underexplored potential clinical use of these probes is to monitor residual cancer micrometastases that escape cytoreductive surgery and chemotherapy. Here, we show that leukocytes, or white blood cells, residing in nontumor tissues (as well as those infiltrating micrometastatic lesions) take up cancer cell-targeted, activatable immunoconjugates nonspecifically, which limits the accuracy and resolution of micrometastasis recognition using these probes. Receiver operating characteristic analysis of freshly excised tissues from a mouse model of peritoneal carcinomatosis suggests that dual-color imaging, adding an immunostain for leukocytes, offers promise for enabling accurate recognition of single cancer cells. Our results indicate that leukocyte identification improves micrometastasis recognition sensitivity and specificity from 92 to 93% (for multicellular metastases >20 to 30 μm in size) to 98 to 99.9% for resolving metastases as small as a single cell.
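
    Receiver operating characteristic analysis as used in this record reduces, for a single threshold-free summary, to the probability that a randomly chosen positive region scores higher than a randomly chosen negative one. A minimal, self-contained sketch of that rank-based AUC computation is shown below; the fluorescence scores and labels are invented for illustration and are not the study's data.

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based ROC AUC (equivalent to the Mann-Whitney U statistic):
    the probability that a randomly chosen positive scores higher than a
    randomly chosen negative, counting ties as one half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()   # O(n_pos * n_neg) pairs
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical probe-fluorescence scores for tissue regions labelled as
# micrometastasis (True) or leukocyte/background (False).
scores = [0.90, 0.80, 0.75, 0.60, 0.55, 0.40, 0.30, 0.20]
labels = [True, True, False, True, False, False, False, False]
print(f"AUC = {roc_auc(scores, labels):.2f}")   # 0.93 for these made-up data
```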

  16. Accuracy analysis on C/A code and P(Y) code pseudo-range of GPS dual frequency receiver and application in point positioning

    NASA Astrophysics Data System (ADS)

    Peng, Xiuying; Fan, Shijie; Guo, Jiming

    2008-10-01

    When Anti-Spoofing (A-S) is active, civilian users have some difficulty in using the P(Y) code for precise navigation and positioning. The Z-tracking technique is one of the effective methods for acquiring the P(Y) code. In this paper, the accuracy of pseudoranges from the C/A code and P(Y) code of a dual-frequency GPS receiver is discussed. The principle of measuring the encrypted P(Y) code is described first; then a large data set from IGS tracking stations is used for analysis and verification with the help of precise point positioning software developed by the authors. In particular, P(Y) code pseudoranges from civilian GPS receivers allow the effect of ionospheric delay to be eliminated or reduced, improving positioning precision. Point positioning experiments demonstrating this are presented at the end.
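
    The ionospheric benefit of dual-frequency P(Y) pseudoranges mentioned above comes from the standard first-order ionosphere-free combination, since the ionospheric delay scales with 1/f². A small illustrative sketch follows; the pseudorange values are made up, not IGS data.

```python
# Standard first-order ionosphere-free combination of dual-frequency
# pseudoranges; the pseudorange values below are made up for illustration.

F_L1 = 1575.42e6  # GPS L1 carrier frequency [Hz]
F_L2 = 1227.60e6  # GPS L2 carrier frequency [Hz]

def ionosphere_free(p1, p2, f1=F_L1, f2=F_L2):
    """Combine two pseudoranges [m] so that the first-order ionospheric
    delay, which scales as 1/f**2, cancels."""
    return (f1 ** 2 * p1 - f2 ** 2 * p2) / (f1 ** 2 - f2 ** 2)

p1 = 22_000_000.00 + 5.20   # L1 pseudorange with a simulated 5.20 m iono delay
p2 = 22_000_000.00 + 8.56   # L2 delay is larger by (f1/f2)**2, about 1.65
print(f"{ionosphere_free(p1, p2):,.2f} m")   # ~22,000,000.01 m: delay removed
```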

  17. SU-E-T-248: Near Real-Time Analysis of Radiation Delivery and Imaging Accuracy to Ensure Patient Safety

    SciTech Connect

    Wijesooriya, K; Seitter, K; Desai, V; Read, P; Larner, J

    2014-06-01

    Purpose: To develop and optimize an effective software method for comparing planned to delivered control point machine parameters for all VARIAN TrueBeam treatments so as to permit (1) assessment of a large patient pool throughout their treatment course to quantify treatment-technique-specific systematic and random uncertainty of observables, (2) quantification of the site-specific daily imaging shifts required for target alignment, and (3) definition of tolerance levels for mechanical and imaging parameters based on the statistical analysis of the data gathered and the dosimetric impact of variations. Methods: Treatment and imaging log files were directly compared to plan parameters for Eclipse and Pinnacle planned treatments via 3D, IMRT, control point, RapidArc, and electrons. Each control point from all beams/arcs (7984) for all fractions (1940) of all patients treated over six months was analyzed. At each control point, the gantry angle, collimator angle, couch angle, jaw positions, MLC positions, and MU were compared. Additionally, per-treatment isocenter shifts were calculated. Results were analyzed as a whole, in treatment-type subsets (IMRT, 3D, RapidArc), and in treatment-site subsets (brain, chest/mediastinum, esophagus, H and N, lung, pelvis, prostate). Results: Daily imaging isocenter shifts from initial external tattoo alignment were dependent on the treatment site, with < 0.5 cm translational shifts for H and N, brain, and lung SBRT, while pelvis and esophagus shifts were ∼1 cm. Mechanical delivery parameters were within tolerance levels for all sub-beams. The largest variations were for RapidArc plans: gantry angle 0.11±0.12, collimator angle 0.00±0.00, jaw positions 0.48±0.26, MLC leaf positions 0.66±0.08, MU 0.14±0.34. Conclusion: Per-control point validation reveals deviations between planned and delivered parameters. If used in a near real-time error checking system, patient safety can be improved by equipping the treatment delivery system with additional forcing

  18. Non-invasive prenatal diagnostic test accuracy for fetal sex using cell-free DNA: a review and meta-analysis

    PubMed Central

    2012-01-01

    Background Cell-free fetal DNA (cffDNA) can be detected in maternal blood during pregnancy, opening the possibility of early non-invasive prenatal diagnosis for a variety of genetic conditions. Since 1997, many studies have examined the accuracy of prenatal fetal sex determination using cffDNA, particularly for pregnancies at risk of an X-linked condition. Here we report a review and meta-analysis of the published literature to evaluate the use of cffDNA for prenatal determination (diagnosis) of fetal sex. We applied a sensitive search of multiple bibliographic databases including PubMed (MEDLINE), EMBASE, the Cochrane library and Web of Science. Results Ninety studies, incorporating 9,965 pregnancies and 10,587 fetal sex results met our inclusion criteria. Overall mean sensitivity was 96.6% (95% credible interval 95.2% to 97.7%) and mean specificity was 98.9% (95% CI = 98.1% to 99.4%). These results vary very little with trimester or week of testing, indicating that the performance of the test is reliably high. Conclusions Based on this review and meta-analysis we conclude that fetal sex can be determined with a high level of accuracy by analyzing cffDNA. Using cffDNA in prenatal diagnosis to replace or complement existing invasive methods can remove or reduce the risk of miscarriage. Future work should concentrate on the economic and ethical considerations of implementing an early non-invasive test for fetal sex. PMID:22937795

  19. Effect of heart rate on the diagnostic accuracy of 256-slice computed tomography angiography in the detection of coronary artery stenosis: ROC curve analysis

    PubMed Central

    WANG, GANG; WU, YIFEN; ZHANG, ZHENTAO; ZHENG, XIAOLIN; ZHANG, YULAN; LIANG, MANQIU; YUAN, HUANCHU; SHEN, HAIPING; LI, DEWEI

    2016-01-01

    The aim of the present study was to investigate the effect of heart rate (HR) on the diagnostic accuracy of 256-slice computed tomography angiography (CTA) in the detection of coronary artery stenosis. Coronary imaging was performed using a Philips 256-slice spiral CT, and receiver operating characteristic (ROC) curve analysis was conducted to evaluate the diagnostic value of 256-slice CTA in coronary artery stenosis. The HR of the research subjects in the study was within a certain range (39–107 bpm). One hundred patients suspected of coronary heart disease underwent 256-slice CTA examination. The cases were divided into three groups: Low HR (HR <75 bpm), moderate HR (75≤ HR <90 bpm) and high HR (HR ≥90 bpm). For the three groups, two observers independently assessed the image quality for all coronary segments on a four-point ordinal scale. An image quality of grades 1–3 was considered diagnostic, while grade 4 was non-diagnostic. A total of 97.76% of the images were diagnostic in the low-HR group, 96.86% in the moderate-HR group and 95.80% in the high-HR group. According to the ROC curve analysis, the specificity of CTA in diagnosing coronary artery stenosis was 98.40, 96.00 and 97.60% in the low-, moderate- and high-HR groups, respectively. In conclusion, 256-slice coronary CTA can be used to clearly show the main segments of the coronary artery and to effectively diagnose coronary artery stenosis. Within the range of HRs investigated, HR was found to have no significant effect on the diagnostic accuracy of 256-slice coronary CTA for coronary artery stenosis. PMID:27168831

  20. Mineral content of vertebral trabecular bone: accuracy of dual energy quantitative computed tomography evaluated against neutron activation analysis and flame atomic absorption spectrometry.

    PubMed

    Louis, O; Van den Winkel, P; Covens, P; Schoutens, A; Osteaux, M

    1994-01-01

    The goal of this study was to evaluate the accuracy of preprocessing dual energy quantitative computed tomography (QCT) for assessment of trabecular bone mineral content (BMC) in lumbar vertebrae. The BMC of 49 lumbar vertebrae taken from 16 cadavers was measured using dual energy QCT with advanced software and hardware capabilities, including an automated definition of the trabecular region of interest (ROI). The midvertebral part of each vertebral body was embedded in a polyester resin and, subsequently, an experimental ROI was cut out using a scanjet image transmission procedure and a computer-assisted milling machine in order to mimic the ROI defined on QCT. After low temperature ashing, the experimental ROIs reduced to a bone powder were submitted to either nondestructive neutron activation analysis (n = 49) or to flame atomic absorption spectrometry (n = 45). BMC obtained with neutron activation analysis was closely related (r = 0.896) to that derived from atomic absorption spectrometry, taken as the gold standard, with, however, a slight overestimation. BMC values measured by QCT were highly correlated with those assessed using the two reference methods, all correlation coefficients being > 0.841. The standard errors of the estimate ranged from 47.4 to 58.9 mg calcium hydroxyapatite in the regressions of BMC obtained with the reference methods against BMC assessed by single energy QCT, and from 47.1 to 51.9 in the regressions involving dual energy QCT. We conclude that the trabecular BMC of lumbar vertebrae can be accurately measured by QCT and that the superiority in accuracy of dual energy is moderate, which is possibly a characteristic of the preprocessing method. PMID:8024849

  1. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    PubMed

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-04-01

    Background and purpose - We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Material and methods - Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Results - Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SDSE): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SDSE: 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. Interpretation - CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice. PMID:26634843

  2. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA)

    PubMed Central

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    Background and purpose — We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Material and methods — Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Results — Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SDSE): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SDSE: 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. Interpretation — CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice. PMID:26634843

  3. The effect of increased consumer demand on fees for aesthetic surgery: an economic analysis.

    PubMed

    Krieger, L M; Shaw, W W

    1999-12-01

    Economic theory dictates that changes in consumer demand have predictable effects on prices. Demographics represents an important component of demand for aesthetic surgery. Between the years of 1997 and 2010, the U.S. population is projected to increase by 12 percent. The population increase will be skewed such that those groups undergoing the most aesthetic surgery will see the largest increase. Accounting for the age-specific frequencies of aesthetic surgery and the population increase yields an estimate that the overall market for aesthetic surgery will increase by 19 percent. Barring unforeseen changes in general economic conditions or consumer tastes, demand should increase by an analogous amount. An economic demonstration shows the effects of increasing demand for aesthetic surgery on its fees. Between the years of 1992 and 1997, there was an increase in demand for breast augmentation as fears of associated autoimmune disorders subsided. Similarly, there was increased male acceptance of aesthetic surgery. The numbers of breast augmentations, procedures to treat male pattern baldness, and plastic surgeons, as well as the fees for these procedures, were tracked. During the study period, the supply of surgeons and consumer demand increased for both of these procedures. Volume of breast augmentation increased by 275 percent, whereas real fees remained stable. Volume of treatment for male pattern baldness increased by 107 percent, and the fees increased by 29 percent. Ordinarily, an increase in supply leads to a decrease in prices. This did not occur during the study period. Economic analysis demonstrates that the increased supply of surgeons performing breast augmentation was offset by increased consumer demand for the procedure. For this reason, fees were not lowered. Similarly, increased demand for treatment of male pattern baldness more than offset the increased supply of surgeons performing it. The result was higher fees. Emphasis should be placed on using these economic

  4. Emotional state and its impact on voice authentication accuracy

    NASA Astrophysics Data System (ADS)

    Voznak, Miroslav; Partila, Pavol; Penhaker, Marek; Peterek, Tomas; Tomala, Karel; Rezac, Filip; Safarik, Jakub

    2013-05-01

    The paper deals with increasing the accuracy of voice authentication methods. The developed algorithm first extracts segmental parameters, such as Zero Crossing Rate, the Fundamental Frequency and Mel-frequency cepstral coefficients, from the voice signal. Based on these parameters, the neural network classifier detects the speaker's emotional state. These parameters shape the distribution of neurons in Kohonen maps, forming clusters of neurons on the map that characterize a particular emotional state. Using regression analysis, we can calculate the function of the parameters of individual emotional states. This relationship increases voice authentication accuracy and prevents unjust rejection.

  5. Improving Speaking Accuracy through Awareness

    ERIC Educational Resources Information Center

    Dormer, Jan Edwards

    2013-01-01

    Increased English learner accuracy can be achieved by leading students through six stages of awareness. The first three awareness stages build up students' motivation to improve, and the second three provide learners with crucial input for change. The final result is "sustained language awareness," resulting in ongoing…

  6. Accuracy of non-invasive prenatal testing using cell-free DNA for detection of Down, Edwards and Patau syndromes: a systematic review and meta-analysis

    PubMed Central

    Taylor-Phillips, Sian; Freeman, Karoline; Geppert, Julia; Agbebiyi, Adeola; Uthman, Olalekan A; Madan, Jason; Clarke, Angus; Quenby, Siobhan; Clarke, Aileen

    2016-01-01

    Objective To measure test accuracy of non-invasive prenatal testing (NIPT) for Down, Edwards and Patau syndromes using cell-free fetal DNA and identify factors affecting accuracy. Design Systematic review and meta-analysis of published studies. Data sources PubMed, Ovid Medline, Ovid Embase and the Cochrane Library published from 1997 to 9 February 2015, followed by weekly autoalerts until 1 April 2015. Eligibility criteria for selecting studies English language journal articles describing case–control studies with ≥15 trisomy cases or cohort studies with ≥50 pregnant women who had been given NIPT and a reference standard. Results Of the 2,012 publications retrieved, 41, 37 and 30 studies were included in the review for Down, Edwards and Patau syndromes, respectively. Quality appraisal identified high risk of bias in the included studies, and funnel plots showed evidence of publication bias. Pooled sensitivity was 99.3% (95% CI 98.9% to 99.6%) for Down, 97.4% (95.8% to 98.4%) for Edwards, and 97.4% (86.1% to 99.6%) for Patau syndrome. The pooled specificity was 99.9% (99.9% to 100%) for all three trisomies. In 100 000 pregnancies in the general obstetric population we would expect 417, 89 and 40 cases of Down, Edwards and Patau syndromes to be detected by NIPT, with 94, 154 and 42 false positive results. Sensitivity was lower in twin than singleton pregnancies, reduced by 9% for Down, 28% for Edwards and 22% for Patau syndrome. Pooled sensitivity was also lower in the first trimester of pregnancy, in studies in the general obstetric population, and in cohort studies with consecutive enrolment. Conclusions NIPT using cell-free fetal DNA has very high sensitivity and specificity for Down syndrome, with slightly lower sensitivity for Edwards and Patau syndrome. However, it is not 100% accurate and should not be used as a final diagnosis for positive cases. Trial registration number CRD42014014947. PMID:26781507
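
    The expected counts quoted above (detected cases and false positives per 100,000 pregnancies) follow from simple arithmetic on prevalence, sensitivity and specificity. The sketch below uses rounded pooled estimates and an assumed Down syndrome prevalence of about 420 per 100,000, so it lands close to, but not exactly on, the abstract's figures.

```python
def expected_screen_results(n, prevalence, sensitivity, specificity):
    """Expected detected cases, false positives and PPV when screening
    n pregnancies with a test of the given accuracy."""
    cases = n * prevalence
    detected = cases * sensitivity
    false_pos = (n - cases) * (1 - specificity)
    ppv = detected / (detected + false_pos)
    return detected, false_pos, ppv

# Assumed inputs: roughly 420 Down syndrome cases per 100,000 pregnancies,
# pooled sensitivity 99.3% and specificity 99.9% (both rounded).
d, fp, ppv = expected_screen_results(100_000, 420 / 100_000, 0.993, 0.999)
print(f"detected ~ {d:.0f}, false positives ~ {fp:.0f}, PPV ~ {ppv:.2f}")
```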

  7. Association between increase in fixed penalties and road safety outcomes: A meta-analysis.

    PubMed

    Elvik, Rune

    2016-07-01

    Studies that have evaluated the association between increases in traffic fine amounts (fixed penalties) and changes in compliance with road traffic law or the number of accidents are synthesised by means of meta-analysis. The studies were few and different in many respects. Nine studies were included in the meta-analysis of changes in compliance. Four studies were included in the meta-analysis of changes in accidents. Increasing traffic fines was found to be associated with small changes in the rate of violations. The changes were non-linear. For increases up to about 100%, violations were reduced. For larger increases, no reduction in violations was found. A small reduction in fatal accidents was associated with increased fixed penalties, varying between studies from less than 1% to 12%. The main pattern of changes in violations was similar in the fixed-effects and random-effects models of meta-analysis, meta-regression and when simple (non-weighted) mean values were computed. The main findings are thus robust, although most of the primary studies did not control very well for potentially confounding factors. Summary estimates of changes in violations or accidents should be treated as provisional and do not necessarily reflect causal relationships. PMID:27085146

  8. Accuracy of Bolton analysis measured in laser scanned digital models compared with plaster models (gold standard) and cone-beam computer tomography images

    PubMed Central

    Kim, Jooseong

    2016-01-01

    Objective The aim of this study was to compare the accuracy of Bolton analysis obtained from digital models scanned with the Ortho Insight three-dimensional (3D) laser scanner system to those obtained from cone-beam computed tomography (CBCT) images and traditional plaster models. Methods CBCT scans and plaster models were obtained from 50 patients. Plaster models were scanned using the Ortho Insight 3D laser scanner; Bolton ratios were calculated with its software. CBCT scans were imported and analyzed using AVIZO software. Plaster models were measured with a digital caliper. Data were analyzed with descriptive statistics and the intraclass correlation coefficient (ICC). Results Anterior and overall Bolton ratios obtained by the three different modalities exhibited excellent agreement (> 0.970). The mean differences between the scanned digital models and physical models and between the CBCT images and scanned digital models for overall Bolton ratios were 0.41 ± 0.305% and 0.45 ± 0.456%, respectively; for anterior Bolton ratios, 0.59 ± 0.520% and 1.01 ± 0.780%, respectively. ICC results showed that intraexaminer error reliability was generally excellent (> 0.858 for all three diagnostic modalities), with < 1.45% discrepancy in the Bolton analysis. Conclusions Laser scanned digital models are highly accurate compared to physical models and CBCT scans for assessing the spatial relationships of dental arches for orthodontic diagnosis. PMID:26877978
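
    For readers unfamiliar with the Bolton analysis referred to above: it compares summed mesiodistal tooth widths between the arches. A minimal sketch of the ratio calculation is given below; the tooth-width values are hypothetical and the function is not part of any of the cited software.

```python
def bolton_ratios(mandibular_widths, maxillary_widths):
    """Overall and anterior Bolton ratios from mesiodistal tooth widths in mm.
    Lists hold 12 teeth per arch (first molar to first molar) with the six
    anterior teeth (canine to canine) listed first."""
    overall = 100 * sum(mandibular_widths) / sum(maxillary_widths)
    anterior = 100 * sum(mandibular_widths[:6]) / sum(maxillary_widths[:6])
    return overall, anterior

# Hypothetical measurements (mm); not taken from the study's models.
mand = [5.3, 5.5, 6.9, 6.8, 7.0, 7.1, 7.2, 7.3, 10.5, 10.6, 11.0, 11.1]
maxi = [8.5, 8.6, 6.9, 7.0, 7.8, 7.9, 7.0, 7.1, 10.2, 10.3, 10.6, 10.7]
overall, anterior = bolton_ratios(mand, maxi)
print(f"overall = {overall:.1f}%  anterior = {anterior:.1f}%")
# Bolton's reference means are about 91.3% (overall) and 77.2% (anterior).
```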

  9. Newspaper Content Analysis in Evaluation of a Community-Based Participatory Project to Increase Physical Activity

    ERIC Educational Resources Information Center

    Granner, Michelle L.; Sharpe, Patricia A.; Burroughs, Ericka L.; Fields, Regina; Hallenbeck, Joyce

    2010-01-01

    This study conducted a newspaper content analysis as part of an evaluation of a community-based participatory research project focused on increasing physical activity through policy and environmental changes, which included activities related to media advocacy and media-based community education. Daily papers (May 2003 to December 2005) from both…

  10. Decreasing Sports Activity with Increasing Age? Findings from a 20-Year Longitudinal and Cohort Sequence Analysis

    ERIC Educational Resources Information Center

    Breuer, Christoph; Wicker, Pamela

    2009-01-01

    According to cross-sectional studies in sport science literature, decreasing sports activity with increasing age is generally assumed. In this paper, the validity of this assumption is checked by applying more effective methods of analysis, such as longitudinal and cohort sequence analyses. With the help of 20 years' worth of data records from the…

  11. Interventions to Increase Attendance at Psychotherapy: A Meta-Analysis of Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Oldham, Mary; Kellett, Stephen; Miles, Eleanor; Sheeran, Paschal

    2012-01-01

    Objective: Rates of nonattendance for psychotherapy hinder the effective delivery of evidence-based treatments. Although many strategies have been developed to increase attendance, the effectiveness of these strategies has not been quantified. Our aim in the present study was to undertake a meta-analysis of rigorously controlled studies to…

  12. The Use of Gap Analysis to Increase Student Completion Rates at Travelor Adult School

    ERIC Educational Resources Information Center

    Gil, Blanca Estela

    2013-01-01

    This project applied the gap analysis problem-solving framework (Clark & Estes, 2008) in order to help develop strategies to increase completion rates at Travelor Adult School. The purpose of the study was to identify whether the knowledge, motivation and organization barriers were contributing to the identified gap. A mixed methods approach…

  13. Does Service-Learning Increase Student Learning?: A Meta-Analysis

    ERIC Educational Resources Information Center

    Warren, Jami L.

    2012-01-01

    Research studies reflect mixed results on whether or not service-learning increases student learning outcomes. The current study seeks to reconcile these findings by extending a meta-analysis conducted by Novak, Markey, and Allen (2007) in which these authors examined service-learning and student learning outcomes. In the current study, 11…

  14. A Systematic Review and Meta-Analysis of Indicated Interventions to Increase School Attendance

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Tyson-McCrea, Katherine; Pigott, Therese; Kelly, Michael

    2011-01-01

    The main objective of this systematic review and meta-analysis was to examine the effects of intervention programs on school attendance behaviors of elementary and secondary school students to inform policy and practice. The specific questions guiding this study were: (1) Do indicated programs with a goal of increasing student attendance affect…

  15. An in-depth evaluation of accuracy and precision in Hg isotopic analysis via pneumatic nebulization and cold vapor generation multi-collector ICP-mass spectrometry.

    PubMed

    Rua-Ibarz, Ana; Bolea-Fernandez, Eduardo; Vanhaecke, Frank

    2016-01-01

    Mercury (Hg) isotopic analysis via multi-collector inductively coupled plasma (ICP)-mass spectrometry (MC-ICP-MS) can provide relevant biogeochemical information by revealing sources, pathways, and sinks of this highly toxic metal. In this work, the capabilities and limitations of two different sample introduction systems, based on pneumatic nebulization (PN) and cold vapor generation (CVG), respectively, were evaluated in the context of Hg isotopic analysis via MC-ICP-MS. The effect of (i) instrument settings and acquisition parameters, (ii) the concentrations of the analyte element (Hg) and the internal standard (Tl), used for mass discrimination correction purposes, and (iii) different mass bias correction approaches on the accuracy and precision of Hg isotope ratio results was evaluated. The extent and stability of mass bias were assessed in a long-term study (18 months, n = 250), demonstrating a precision ≤0.006% relative standard deviation (RSD). CVG-MC-ICP-MS showed an approximately 20-fold enhancement in Hg signal intensity compared with PN-MC-ICP-MS. For CVG-MC-ICP-MS, the mass bias induced by instrumental mass discrimination was accurately corrected for by using either external correction in a sample-standard bracketing approach (SSB) or double correction, consisting of the use of Tl as internal standard in a revised version of the Russell law (Baxter approach), followed by SSB. Concomitant matrix elements did not affect CVG-ICP-MS results. Neither with PN nor with CVG was any evidence of mass-independent discrimination effects in the instrument observed within the experimental precision obtained. CVG-MC-ICP-MS was finally used for Hg isotopic analysis of reference materials (RMs) of relevant environmental origin. The isotopic composition of Hg in RMs of marine biological origin testified to mass-independent fractionation affecting the odd-numbered Hg isotopes. While older RMs were used for validation purposes, novel Hg isotopic data are provided for the
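
    The double correction mentioned above combines the exponential (Russell) law, with Tl as internal standard, and sample-standard bracketing. The sketch below shows only the exponential-law step under simplified assumptions; the measured ratios are invented, and only one Hg ratio (202Hg/198Hg) is treated.

```python
import math

# Exponential-law ("Russell") mass bias correction with Tl as internal
# standard; only this single step of the double correction is sketched,
# and the measured ratios below are invented.

M_205TL, M_203TL = 204.9744, 202.9723   # atomic masses [u]
M_202HG, M_198HG = 201.9706, 197.9668
R_TL_CERT = 2.38714                     # certified 205Tl/203Tl (e.g. NIST SRM 997)

def beta_from_tl(r_tl_measured, r_tl_cert=R_TL_CERT):
    """Mass bias factor from the measured vs certified Tl isotope ratio."""
    return math.log(r_tl_cert / r_tl_measured) / math.log(M_205TL / M_203TL)

def correct_hg_ratio(r_hg_measured, beta):
    """Apply the exponential law to a measured 202Hg/198Hg ratio."""
    return r_hg_measured * (M_202HG / M_198HG) ** beta

beta = beta_from_tl(r_tl_measured=2.42)          # hypothetical session value
print(f"{correct_hg_ratio(3.05, beta):.4f}")     # corrected 202Hg/198Hg
```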

  16. Thermodynamics of protein-ligand interactions as a reference for computational analysis: how to assess accuracy, reliability and relevance of experimental data.

    PubMed

    Krimmer, Stefan G; Klebe, Gerhard

    2015-09-01

    For a conscientious interpretation of thermodynamic parameters (Gibbs free energy, enthalpy and entropy) obtained by isothermal titration calorimetry (ITC), it is necessary to first evaluate the experimental setup and conditions at which the data were measured. The data quality must be assessed and the precision and accuracy of the measured parameters must be estimated. This information provides the basis for deciding at which level discussion of the data is appropriate, and it allows insight into the significance of comparisons with other data. The aim of this article is to provide the reader with a basic understanding of the ITC technique and the experimental practices commonly applied, in order to foster an appreciation for how much measured thermodynamic parameters can deviate from ideal, error-free values. Particular attention is paid to the shape of the recorded isotherm (c-value), the influence of the applied buffer used for the reaction (protonation reactions, pH), the chosen experimental settings (temperature), impurities of protein and ligand, sources of systematic errors (solution concentration, solution activity, and device calibration) and to the applied analysis software. Furthermore, we comment on enthalpy-entropy compensation, heat capacities and van't Hoff enthalpies. PMID:26376645

  17. Characterizing accuracy of total hemoglobin recovery using contrast-detail analysis in 3D image-guided near infrared spectroscopy with the boundary element method

    PubMed Central

    Ghadyani, Hamid R.; Srinivasan, Subhadra; Pogue, Brian W.; Paulsen, Keith D.

    2010-01-01

    The quantification of total hemoglobin concentration (HbT) obtained from multi-modality image-guided near infrared spectroscopy (IG-NIRS) was characterized using the boundary element method (BEM) for 3D image reconstruction. Multi-modality IG-NIRS systems use a priori information to guide the reconstruction process. While this has been shown to improve resolution, the effect on quantitative accuracy is unclear. Here, through systematic contrast-detail analysis, the fidelity of IG-NIRS in quantifying HbT was examined using 3D simulations. These simulations show that HbT could be recovered for medium-sized (20 mm in 100 mm total diameter) spherical inclusions with an average error of 15%, for the physiologically relevant situation of 2:1 or higher contrast between background and inclusion. Using partial 3D volume meshes to reduce the ill-posed nature of the image reconstruction, inclusions as small as 14 mm could be accurately quantified with less than 15% error, for contrasts of 1.5 or higher. This suggests that 3D IG-NIRS provides quantitatively accurate results for sizes seen early in the treatment cycle of patients undergoing neoadjuvant chemotherapy when the tumors are larger than 30 mm. PMID:20720975

  18. GEOSPATIAL DATA ACCURACY ASSESSMENT

    EPA Science Inventory

    The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue is related directly to the dramatic escalation in the developmen...

  19. An analysis of increasing the size of the strategic petroleum reserve to one billion barrels

    SciTech Connect

    Not Available

    1990-01-01

    The Department of Energy's Office of Energy Emergency Policy and Evaluation requested that the Energy Information Administration complete an analysis of the proposed expansion in the Strategic Petroleum Reserve (SPR) from its currently planned size of 750 million barrels to 1000 million barrels. Because the SPR contains only 580 million barrels at this point in time, the benefits and costs of increasing the SPR from 600 to 750 million barrels were also estimated. This report documents the assumptions, methodology, and results of the analysis. 17 figs., 15 tabs.

  20. Diagnostic accuracy of PLA2R autoantibodies and glomerular staining for the differentiation of idiopathic and secondary membranous nephropathy: an updated meta-analysis

    PubMed Central

    Dai, Huanzi; Zhang, Huhai; He, Yani

    2015-01-01

    The diagnostic performance of M-type phospholipase A2 receptor (PLA2R) autoantibodies and PLA2R glomerular staining in discriminating between idiopathic membranous nephropathy (iMN) and secondary membranous nephropathy (sMN) has not been fully evaluated. We conducted an updated meta-analysis to investigate the accuracy and clinical value of the serological anti-PLA2R test and histological PLA2R staining for differentiating iMN from sMN. A total of 19 studies involving 1160 patients were included in this meta-analysis. The overall sensitivity, specificity, diagnostic odds ratio (DOR) and area under the receiver operating characteristic curve (AUROC) of serum anti-PLA2R were 0.68 (95% CI, 0.61–0.74), 0.97 (95% CI, 0.85–1.00), 73.75 (95% CI, 12.56–432.96) and 0.82 (95% CI, 0.78–0.85), respectively, with substantial heterogeneity (I² = 86.42%). Subgroup analyses revealed that the study design, publication type, study origin and assay method might account for the heterogeneity. Additionally, the overall sensitivity, specificity, DOR and AUROC of glomerular PLA2R staining were 0.78 (95% CI, 0.72–0.83), 0.91 (95% CI, 0.75–0.97), 34.70 (95% CI, 9.93–121.30) and 0.84 (95% CI, 0.81–0.87), respectively, without heterogeneity (I² = 0%). Serological anti-PLA2R testing has diagnostic value, but it must be interpreted in context with patient clinical characteristics, and histological PLA2R staining in seronegative patients is recommended. PMID:25740009

  1. Guidelines for Dual Energy X-Ray Absorptiometry Analysis of Trabecular Bone-Rich Regions in Mice: Improved Precision, Accuracy, and Sensitivity for Assessing Longitudinal Bone Changes.

    PubMed

    Shi, Jiayu; Lee, Soonchul; Uyeda, Michael; Tanjaya, Justine; Kim, Jong Kil; Pan, Hsin Chuan; Reese, Patricia; Stodieck, Louis; Lin, Andy; Ting, Kang; Kwak, Jin Hee; Soo, Chia

    2016-05-01

    Trabecular bone is frequently studied in osteoporosis research because changes in trabecular bone are the most common cause of osteoporotic fractures. Dual energy X-ray absorptiometry (DXA) analysis specific to trabecular bone-rich regions is crucial to longitudinal osteoporosis research. The purpose of this study is to define a novel method for accurately analyzing trabecular bone-rich regions in mice via DXA. This method will be utilized to analyze scans obtained from the International Space Station in an upcoming study of microgravity-induced bone loss. Thirty 12-week-old BALB/c mice were studied. The novel method was developed by preanalyzing trabecular bone-rich sites in the distal femur, proximal tibia, and lumbar vertebrae via high-resolution X-ray imaging followed by DXA and micro-computed tomography (micro-CT) analyses. The key DXA steps described by the novel method were (1) proper mouse positioning, (2) region of interest (ROI) sizing, and (3) ROI positioning. The precision of the new method was assessed by reliability tests and a 14-week longitudinal study. The bone mineral content (BMC) data from DXA were then compared to the BMC data from micro-CT to assess accuracy. Bone mineral density (BMD) intra-class correlation coefficients of the new method ranged from 0.743 to 0.945, and Levene's test showed significantly lower variance in the data generated by the new method; both findings verified its consistency. With the new method, a Bland-Altman plot displayed good agreement between DXA BMC and micro-CT BMC for all sites, and they were strongly correlated at the distal femur and proximal tibia (r=0.846, p<0.01; r=0.879, p<0.01, respectively). The results suggest that the novel method for site-specific analysis of trabecular bone-rich regions in mice via DXA yields more precise, accurate, and repeatable BMD measurements than the conventional method. PMID:26956416
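
    The agreement assessment described above (Bland-Altman plot of DXA versus micro-CT BMC) boils down to the mean difference and its 95% limits of agreement. A small sketch with made-up paired BMC values follows; it is not the authors' analysis pipeline.

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between two
    measurement methods applied to the same specimens."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired BMC values (mg) for the same bone sites, one from DXA
# and one from micro-CT; the numbers are made up for illustration.
dxa      = [21.0, 24.5, 19.8, 30.2, 27.1, 22.4, 25.9, 28.3]
micro_ct = [20.6, 24.9, 19.1, 29.8, 27.8, 22.0, 25.2, 28.9]
bias, (lo, hi) = bland_altman(dxa, micro_ct)
print(f"bias = {bias:+.2f} mg, 95% limits of agreement = ({lo:.2f}, {hi:.2f})")
```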

  2. Analysis and design of numerical schemes for gas dynamics 1: Artificial diffusion, upwind biasing, limiters and their effect on accuracy and multigrid convergence

    NASA Technical Reports Server (NTRS)

    Jameson, Antony

    1994-01-01

    The theory of non-oscillatory scalar schemes is developed in this paper in terms of the local extremum diminishing (LED) principle that maxima should not increase and minima should not decrease. This principle can be used for multi-dimensional problems on both structured and unstructured meshes, while it is equivalent to the total variation diminishing (TVD) principle for one-dimensional problems. A new formulation of symmetric limited positive (SLIP) schemes is presented, which can be generalized to produce schemes with arbitrary high order of accuracy in regions where the solution contains no extrema, and which can also be implemented on multi-dimensional unstructured meshes. Systems of equations lead to waves traveling with distinct speeds and possibly in opposite directions. Alternative treatments using characteristic splitting and scalar diffusive fluxes are examined, together with modification of the scalar diffusion through the addition of pressure differences to the momentum equations to produce full upwinding in supersonic flow. This convective upwind and split pressure (CUSP) scheme exhibits very rapid convergence in multigrid calculations of transonic flow, and provides excellent shock resolution at very high Mach numbers.
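
    The SLIP and CUSP constructions themselves are not reproduced here, but the LED/TVD idea the abstract builds on can be illustrated with a generic minmod-limited upwind scheme for 1-D scalar advection: limited slopes prevent the creation of new maxima or minima. The sketch below is a textbook-style illustration under that simplification, not the paper's scheme.

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: zero at extrema, otherwise the smaller-magnitude slope."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def advect(u, a, dx, dt, steps):
    """Limited upwind (MUSCL-type) scheme for u_t + a*u_x = 0 with a > 0 and
    periodic boundaries. The minmod slopes keep the update local extremum
    diminishing: no new maxima or minima are created."""
    nu = a * dt / dx                       # CFL number, keep <= 1
    for _ in range(steps):
        du_minus = u - np.roll(u, 1)       # backward differences
        du_plus = np.roll(u, -1) - u       # forward differences
        slope = minmod(du_minus, du_plus)
        u_face = u + 0.5 * (1.0 - nu) * slope   # value at each right cell face
        flux = a * u_face
        u = u - (dt / dx) * (flux - np.roll(flux, 1))
    return u

# Advect a square pulse once around a periodic unit domain.
n, a = 200, 1.0
x = np.linspace(0.0, 1.0, n, endpoint=False)
u0 = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)
dx = 1.0 / n
dt = 0.5 * dx / a
u = advect(u0, a, dx, dt, steps=int(round(1.0 / (a * dt))))
print("min/max after one period:", u.min(), u.max())   # no over/undershoot
```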

  3. Using Language Sample Analysis in Clinical Practice: Measures of Grammatical Accuracy for Identifying Language Impairment in Preschool and School-Aged Children.

    PubMed

    Eisenberg, Sarita; Guo, Ling-Yu

    2016-05-01

    This article reviews the existing literature on the diagnostic accuracy of two grammatical accuracy measures for differentiating children with and without language impairment (LI) at preschool and early school age based on language samples. The first measure, the finite verb morphology composite (FVMC), is a narrow grammatical measure that computes children's overall accuracy of four verb tense morphemes. The second measure, percent grammatical utterances (PGU), is a broader grammatical measure that computes children's accuracy in producing grammatical utterances. The extant studies show that FVMC demonstrates acceptable (i.e., 80 to 89% accurate) to good (i.e., 90% accurate or higher) diagnostic accuracy for children between 4;0 (years;months) and 6;11 in conversational or narrative samples. In contrast, PGU yields acceptable to good diagnostic accuracy for children between 3;0 and 8;11 regardless of sample types. Given the diagnostic accuracy shown in the literature, we suggest that FVMC and PGU can be used as one piece of evidence for identifying children with LI in assessment when appropriate. However, FVMC or PGU should not be used as therapy goals directly. Instead, when children are low in FVMC or PGU, we suggest that follow-up analyses should be conducted to determine the verb tense morphemes or grammatical structures that children have difficulty with. PMID:27111270

  4. Finite Element Analysis Generates an Increasing Interest in Dental Research: A Bibliometric Study

    PubMed Central

    Diarra, Abdoulaziz; Mushegyan, Vagan; Naveau, Adrien

    2016-01-01

    Purpose: The purpose was to provide a longitudinal overview of published studies that use finite element analysis in dental research, by using the SCI-expanded database of Web of Science® (Thomson Reuters). Material and Methods: Eighty publications from 1999-2000 and 473 from 2009-2010 were retrieved. This literature grew faster than the overall dental literature. The number of publishing countries doubled. The main journals were American or English, and dealt with implantology. For the top 10 journals publishing dental finite element papers, the mean impact factor increased by 75% during the decade. Results: Finite elements generate an increasing interest from dental authors and publishers worldwide. PMID:27006722

  5. The diagnostic accuracy of the natriuretic peptides in heart failure: systematic review and diagnostic meta-analysis in the acute care setting

    PubMed Central

    Roberts, Emmert; Dworzynski, Katharina; Al-Mohammad, Abdallah; Cowie, Martin R; McMurray, John J V; Mant, Jonathan

    2015-01-01

    Objectives To determine and compare the diagnostic accuracy of serum natriuretic peptide levels (B type natriuretic peptide, N terminal probrain natriuretic peptide (NTproBNP), and mid-regional proatrial natriuretic peptide (MRproANP)) in people presenting with acute heart failure to acute care settings using thresholds recommended in the 2012 European Society of Cardiology guidelines for heart failure. Design Systematic review and diagnostic meta-analysis. Data sources Medline, Embase, Cochrane central register of controlled trials, Cochrane database of systematic reviews, database of abstracts of reviews of effects, NHS economic evaluation database, and Health Technology Assessment up to 28 January 2014, using combinations of subject headings and terms relating to heart failure and natriuretic peptides. Eligibility criteria for selecting studies Eligible studies evaluated one or more natriuretic peptides (B type natriuretic peptide, NTproBNP, or MRproANP) in the diagnosis of acute heart failure against an acceptable reference standard in consecutive or randomly selected adults in an acute care setting. Studies were excluded if they did not present sufficient data to extract or calculate true positives, false positives, false negatives, and true negatives, or report age independent natriuretic peptide thresholds. Studies not available in English were also excluded. Results 37 unique study cohorts described in 42 study reports were included, with a total of 48 test evaluations reporting 15 263 test results. At the lower recommended thresholds of 100 ng/L for B type natriuretic peptide and 300 ng/L for NTproBNP, the natriuretic peptides have sensitivities of 0.95 (95% confidence interval 0.93 to 0.96) and 0.99 (0.97 to 1.00) and negative predictive values of 0.94 (0.90 to 0.96) and 0.98 (0.89 to 1.0), respectively, for a diagnosis of acute heart failure. At the lower recommended threshold of 120 pmol/L, MRproANP has a sensitivity ranging from 0.95 (range 0

  6. Does ovarian stimulation for IVF increase gynaecological cancer risk? A systematic review and meta-analysis.

    PubMed

    Zhao, Jing; Li, Yanping; Zhang, Qiong; Wang, Yonggang

    2015-07-01

    The aim of this study was to evaluate whether ovarian stimulation for IVF increases the risk of gynaecological cancer, including ovarian, endometrial, cervical and breast cancers, as an independent risk factor. A systematic review and meta-analysis was conducted. Clinical trials that examined the association between ovarian stimulation for IVF and gynaecologic cancers were included. The outcomes of interest were incidence rate of gynaecologic cancers. Twelve cohort studies with 178,396 women exposed to IVF were included; 10 studies were used to analyse ovarian (167,640 women) and breast (151,702 women) cancers, and six studies were identified in the analysis of endometrial (116,672 women) and cervical cancer (114,799 women). Among these studies, 175 ovarian, 48 endometrial, 502 cervical and 866 cases of breast cancer were reported. The meta-analysis found no significant association between ovarian stimulation for IVF and increased ovarian, endometrial, cervical and breast cancer risk (odds ratio [OR] 1.06, 95% confidence interval [CI] 0.85 to 1.32; OR 0.97, 95% CI 0.58 to 1.63; OR 0.43, 95% CI 0.30 to 0.60; OR 0.69, 95% CI 0.63 to 0.76, respectively). Ovarian stimulation for IVF, therefore, does not increase the gynaecologic cancer risk, whether hormone-dependent endometrial and breast cancer or non-hormone-dependent ovarian and cervical cancer. PMID:26003452

  7. SACRIFICING THE ECOLOGICAL RESOLUTION OF VEGETATION MAPS AT THE ALTAR OF THEMATIC ACCURACY: ASSESSED MAP ACCURACIES FOR HIERARCHICAL VEGETATION CLASSIFICATIONS IN THE EASTERN GREAT BASIN OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    EPA Science Inventory

    The Southwest Regional Gap Analysis Project (SW ReGAP) improves upon previous GAP projects conducted in Arizona, Colorado, Nevada, New Mexico, and Utah to provide a consistent, seamless vegetation map for this large and ecologically diverse geographic region. Nevada's compone...

  8. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and in the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combinations of endoscopic and laparoscopic instruments/optics. Our hierarchical task analysis identified three different hybrid methods for performing cholecystectomy, with significant variability among them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. The

  9. Does Global Warming Increase Establishment Rates of Invasive Alien Species? A Centurial Time Series Analysis

    PubMed Central

    Huang, Dingcheng; Haack, Robert A.; Zhang, Runzhi

    2011-01-01

    Background The establishment rate of invasive alien insect species has been increasing worldwide during the past century. This trend has been widely attributed to increased rates of international trade and associated species introductions, but rarely linked to environmental change. To better understand and manage the bioinvasion process, it is crucial to understand the relationship between global warming and the establishment rate of invasive alien species, especially for poikilothermic invaders such as insects. Methodology/Principal Findings We present data that demonstrate a significant positive relationship between the change in average annual surface air temperature and the establishment rate of invasive alien insects in mainland China during 1900–2005. This relationship was modeled by regression analysis, and indicated that a 1°C increase in average annual surface temperature in mainland China was associated with an increase in the establishment rate of invasive alien insects of about 0.5 species per year. The relationship between rising surface air temperature and increasing establishment rate remained significant even after accounting for increases in international trade during the period 1950–2005. Moreover, similar relationships were detected using additional data from the United Kingdom and the contiguous United States. Conclusions/Significance These findings suggest that the perceived increase in establishments of invasive alien insects can be explained only in part by an increase in introduction rate or propagule pressure. Besides increasing propagule pressure, global warming is another driver that could favor worldwide bioinvasions. Our study highlights the need to consider global warming when designing strategies and policies to deal with bioinvasions. PMID:21931837

  10. Global genome splicing analysis reveals an increased number of alternatively spliced genes with aging.

    PubMed

    Rodríguez, Sofía A; Grochová, Diana; McKenna, Tomás; Borate, Bhavesh; Trivedi, Niraj S; Erdos, Michael R; Eriksson, Maria

    2016-04-01

    Alternative splicing (AS) is a key regulatory mechanism for the development of different tissues; however, little is known about changes to alternative splicing during aging. Splicing events may become more frequent and widespread genome-wide as tissues age and the stringency of the splicing machinery decreases. Using skin, skeletal muscle, bone, thymus, and white adipose tissue from wild-type C57BL6/J male mice (4 and 18 months old), we examined the effect of age on splicing by AS analysis of differential exon usage across the genome. The results identified a considerable number of AS genes in skeletal muscle, thymus, bone, and white adipose tissue between the different age groups (ranging from 27 to 246 AS genes, corresponding to 0.3-3.2% of the total number of genes analyzed). For skin, skeletal muscle, and bone, we included a later age group (28 months old) and found that the number of alternatively spliced genes increased with age in all three tissues (P < 0.01). Analysis of alternatively spliced genes across all tissues by gene ontology and pathway analysis identified 158 genes involved in RNA processing. Additional analysis of AS was performed in a mouse model of the premature aging disease Hutchinson-Gilford progeria syndrome. The results show that expression of the mutant protein, progerin, is associated with impaired developmental splicing. As progerin accumulates, the number of genes with AS increases compared with wild-type skin. Our results indicate the existence of a mechanism for increased AS during aging in several tissues, emphasizing that AS plays a more important role in the aging process than previously known. PMID:26685868
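
    In very simplified form, differential exon usage can be tested by asking whether the share of reads falling on a given exon, relative to the rest of its gene, differs between age groups. The Python sketch below applies a 2x2 Fisher's exact test to hypothetical counts; real exon-level analyses model all exons, replicates, and expression levels jointly, so this only illustrates the idea.

    # Per-exon differential-usage test on hypothetical pooled read counts (illustrative only).
    from scipy.stats import fisher_exact

    # Rows: candidate exon vs. remaining exons of the same gene.
    # Columns: young (4 months) vs. old (28 months). Counts are invented.
    table = [[120, 45],
             [880, 955]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
    if p_value < 0.01:
        print("Exon usage differs between age groups: candidate alternative splicing event.")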

  11. Anemia increases the mortality risk in patients with stroke: A meta-analysis of cohort studies

    PubMed Central

    Li, Zhanzhan; Zhou, Tao; Li, Yanyan; Chen, Peng; Chen, Lizhang

    2016-01-01

    The impact of anemia on the outcome of patients with stroke remains inconsistent. We performed a meta-analysis of cohort studies to assess the mortality risk in stroke patients with and without anemia. Systematic searches were conducted in the PubMed, China National Knowledge Infrastructure, Web of Science and Wanfang databases to identify relevant studies from inception to November 2015. The estimated odds ratio with a 95% confidence interval was pooled. Subgroup analyses and sensitivity analyses were also conducted. We used Begg’s funnel plot and Egger’s test to detect potential publication bias. Thirteen cohort studies with a total of 19,239 patients with stroke were included in this meta-analysis. The heterogeneity among studies was slight (I² = 59.0%, P = 0.031). The results from a random-effects model suggest that anemia is associated with an increased mortality risk in patients with stroke (adjusted odds ratio = 1.39, 95% confidence interval: 1.22–1.58, P < 0.001). The subgroup analyses are consistent with the overall results. This meta-analysis of 13 cohort studies finds that anemia increases the mortality risk in patients with stroke. Future studies should include longer follow-up to confirm this finding and explore its possible mechanism. PMID:27211606
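
    The pooling reported above, a random-effects model with I² as the heterogeneity measure, can be sketched with the DerSimonian-Laird estimator. The Python example below pools three invented studies supplied as odds ratios with 95% confidence intervals; it does not use or reproduce the 13 cohorts analyzed in the paper.

    # DerSimonian-Laird random-effects pooling of odds ratios (invented studies, illustrative only).
    import numpy as np

    # Each study: (odds ratio, lower 95% CI, upper 95% CI) -- hypothetical values.
    studies = [(1.3, 1.0, 1.7), (1.6, 1.2, 2.1), (1.2, 0.9, 1.6)]

    y = np.log([s[0] for s in studies])                           # log odds ratios
    se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
    w = 1 / se**2                                                 # fixed-effect weights

    # Cochran's Q and the I^2 heterogeneity statistic.
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    df = len(studies) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    # Between-study variance (tau^2) and the random-effects pooled estimate.
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (se**2 + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    pooled_se = 1 / np.sqrt(np.sum(w_star))
    lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
    print(f"I^2 = {I2:.1f}%, pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")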
