Science.gov

Sample records for accuracy studies tool

  1. Accuracy study of the main screening tools for temporomandibular disorder in children and adolescents.

    PubMed

    de Santis, Tatiana Oliveira; Motta, Lara Jansiski; Biasotto-Gonzalez, Daniela Aparecida; Mesquita-Ferrari, Raquel Agnelli; Fernandes, Kristianne Porta Santos; de Godoy, Camila Haddad Leal; Alfaya, Thays Almeida; Bussadori, Sandra Kalil

    2014-01-01

The aims of the present study were to assess the degree of sensitivity and specificity of the screening questionnaire recommended by the American Academy of Orofacial Pain (AAOP) and the patient-history index proposed by Helkimo (modified by Fonseca) and correlate the findings with a clinical exam. All participants answered the questionnaires and were submitted to a clinical exam by a dentist who had undergone calibration training. Both the AAOP questionnaire and the Helkimo index achieved low degrees of sensitivity for the detection of temporomandibular disorder (TMD), but exhibited a high degree of specificity. With regard to concordance, the AAOP questionnaire and Helkimo index both achieved low levels of agreement with the clinical exam. The instruments available in the literature for the assessment of TMD that were examined herein exhibit low sensitivity and high specificity when administered to children and adolescents, likely stemming from difficulties in comprehension due to the age group studied and the language used in the self-explanatory questions.
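The sensitivity and specificity reported above come from cross-classifying questionnaire outcomes against the clinical exam. A minimal sketch of that computation, using hypothetical counts rather than the study's data:

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity and specificity from a 2x2 screening-vs-exam table."""
    sensitivity = tp / (tp + fn)  # share of exam-confirmed cases the screen flags
    specificity = tn / (tn + fp)  # share of non-cases the screen correctly clears
    return sensitivity, specificity

# Hypothetical counts: 30 exam-confirmed TMD cases, 70 non-cases
se, sp = sens_spec(tp=12, fn=18, fp=5, tn=65)
print(f"sensitivity={se:.2f}, specificity={sp:.2f}")  # low Se, high Sp pattern
```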

  2. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy.

    PubMed

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy.
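The precision effect described above can be illustrated with a conjugate beta-binomial sketch (a toy example of my own, not the paper's tree-mortality models): an informative prior narrows the posterior without necessarily shifting its centre toward or away from the truth.

```python
import math

def beta_posterior(a, b, deaths, n):
    """Beta(a, b) prior + binomial mortality data -> Beta posterior parameters."""
    return a + deaths, b + (n - deaths)

def beta_sd(a, b):
    """Standard deviation of a Beta(a, b) distribution."""
    return math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

deaths, n = 4, 50  # hypothetical census: 4 of 50 tagged stems died

vague = beta_posterior(1, 1, deaths, n)      # flat Beta(1, 1) prior
informed = beta_posterior(2, 23, deaths, n)  # prior centred near 8% mortality
print(beta_sd(*vague), beta_sd(*informed))   # informative prior -> smaller sd
```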

  3. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification

    PubMed Central

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes against the categories of an unquestionable reference (or gold standard). However, a reference of suboptimal reliability is sometimes employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, be known. In this paper, a procedure is proposed for determining whether it is appropriate to maintain the assumption that the reference is a gold standard or to treat it as an imperfect reference. This procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the availability of the results of several independent studies, carried out with similar designs, that provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data. PMID:24106484

  4. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification.

    PubMed

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes against the categories of an unquestionable reference (or gold standard). However, a reference of suboptimal reliability is sometimes employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, be known. In this paper, a procedure is proposed for determining whether it is appropriate to maintain the assumption that the reference is a gold standard or to treat it as an imperfect reference. This procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the availability of the results of several independent studies, carried out with similar designs, that provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data.
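The core issue, that accuracy estimates depend on whether the reference itself errs, can be sketched generatively. Under the usual conditional-independence assumption, the expected 2 × 2 cell probabilities follow from prevalence and the error rates of both instruments (a toy illustration of the bias, not the paper's multinomial tree models):

```python
def expected_cells(prev, se_test, sp_test, se_ref, sp_ref):
    """Expected 2x2 cell probabilities when both the test and the reference
    may err, assuming conditional independence given true status."""
    pos, neg = prev, 1.0 - prev
    pp = pos * se_test * se_ref + neg * (1 - sp_test) * (1 - sp_ref)
    pm = pos * se_test * (1 - se_ref) + neg * (1 - sp_test) * sp_ref
    mp = pos * (1 - se_test) * se_ref + neg * sp_test * (1 - sp_ref)
    mm = pos * (1 - se_test) * (1 - se_ref) + neg * sp_test * sp_ref
    return pp, pm, mp, mm  # (test+,ref+), (test+,ref-), (test-,ref+), (test-,ref-)

def apparent_sensitivity(cells):
    """Sensitivity as it would be computed treating the reference as truth."""
    pp, _, mp, _ = cells
    return pp / (pp + mp)

gold = apparent_sensitivity(expected_cells(0.2, 0.8, 0.9, 1.0, 1.0))
imperfect = apparent_sensitivity(expected_cells(0.2, 0.8, 0.9, 0.9, 0.95))
print(gold, imperfect)  # 0.8 with a gold standard; biased with an imperfect one
```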

  5. A Meta-Analysis of the Accuracy of Prostate Cancer Studies Which Use Magnetic Resonance Spectroscopy as a Diagnostic Tool

    PubMed Central

    Guo, You-min; Liu, Min; Qiang, Yong-qian; Guo, Xiao-juan; Zhang, Yi-li; Duan, Xiao-Yi; Zhang, Qiu-Juan; Liang, Weifeng

    2008-01-01

Objective We aimed to do a meta-analysis of the existing literature to assess the accuracy of prostate cancer studies which use magnetic resonance spectroscopy (MRS) as a diagnostic tool. Materials and Methods Prospective, independent, blinded studies were selected from the Cochrane library, Pubmed, and other network databases. The criteria for inclusion and exclusion in this study referenced the criteria for diagnostic research published by the Cochrane center. The statistical analysis was performed using Meta-Test version 6.0. Using the homogeneity test, a statistical effect model was chosen to calculate different pooled weighted values of sensitivity, specificity, and the corresponding 95% confidence intervals (95% CI). The summary receiver operating characteristic (SROC) curve method was used to assess the results. Results We chose two cut-off values (0.75 and 0.86) as the diagnostic criteria for discriminating between benign and malignant. For the first diagnostic criterion, the pooled weighted sensitivity, specificity, and corresponding 95% CI (expressed as area under curve [AUC]) were 0.82 (0.73, 0.89), 0.68 (0.58, 0.76), and 83.4% (74.97, 91.83). For the second criterion, the pooled weighted sensitivity, specificity, and corresponding 95% CI were 0.64 (0.55, 0.72), 0.86 (0.79, 0.91) and 82.7% (68.73, 96.68). Conclusion As a new method in the diagnosis of prostate cancer, MRS offers better applied value than other common modalities. Ultimately, large-scale randomized controlled trials are necessary to assess its clinical value. PMID:18838853
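The pooled weighted sensitivity and specificity values above are characteristic of inverse-variance pooling on the logit scale. A fixed-effect sketch with made-up per-study values (the authors used Meta-Test 6.0; this is not its implementation, and a homogeneity test would guide the choice of model):

```python
import math

def pool_logit(props, ns):
    """Inverse-variance pooled proportion on the logit scale (fixed-effect)."""
    logits, weights = [], []
    for p, n in zip(props, ns):
        logit = math.log(p / (1 - p))
        var = 1.0 / (n * p * (1 - p))  # delta-method variance of the logit
        logits.append(logit)
        weights.append(1.0 / var)
    pooled = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    return 1.0 / (1.0 + math.exp(-pooled))  # back-transform to a proportion

# Hypothetical per-study sensitivities and sample sizes
print(round(pool_logit([0.78, 0.85, 0.81], [40, 60, 55]), 3))
```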

  6. EOS mapping accuracy study

    NASA Technical Reports Server (NTRS)

    Forrest, R. B.; Eppes, T. A.; Ouellette, R. J.

    1973-01-01

    Studies were performed to evaluate various image positioning methods for possible use in the earth observatory satellite (EOS) program and other earth resource imaging satellite programs. The primary goal is the generation of geometrically corrected and registered images, positioned with respect to the earth's surface. The EOS sensors which were considered were the thematic mapper, the return beam vidicon camera, and the high resolution pointable imager. The image positioning methods evaluated consisted of various combinations of satellite data and ground control points. It was concluded that EOS attitude control system design must be considered as a part of the image positioning problem for EOS, along with image sensor design and ground image processing system design. Study results show that, with suitable efficiency for ground control point selection and matching activities during data processing, extensive reliance should be placed on use of ground control points for positioning the images obtained from EOS and similar programs.

  7. Wind Prediction Accuracy for Air Traffic Management Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Cole, Rod; Green, Steve; Jardin, Matt; Schwartz, Barry; Benjamin, Stan

    2000-01-01

The performance of Air Traffic Management and flight deck decision support tools depends in large part on the accuracy of the supporting 4D trajectory predictions. This is particularly relevant to conflict prediction and active advisories for the resolution of conflicts and conformance with traffic-flow management flow-rate constraints (e.g., arrival metering / required time of arrival). Flight test results have indicated that wind prediction errors may represent the largest source of trajectory prediction error. The tests also discovered that relatively large errors (e.g., greater than 20 knots), existing in pockets of space and time critical to ATM DST performance (one or more sectors, greater than 20 minutes), are inadequately represented by the classic RMS aggregate prediction-accuracy studies of the past. To facilitate the identification and reduction of DST-critical wind-prediction errors, NASA has led a collaborative research and development activity with MIT Lincoln Laboratories and the Forecast Systems Lab of the National Oceanographic and Atmospheric Administration (NOAA). This activity, begun in 1996, has focused on the development of key metrics for ATM DST performance, assessment of wind-prediction skill for state-of-the-art systems, and development/validation of system enhancements to improve skill. A 13-month study was conducted for the Denver Center airspace in 1997. Two complementary wind-prediction systems were analyzed and compared to the forecast performance of the then-standard 60 km Rapid Update Cycle - version 1 (RUC-1). One system, developed by NOAA, was the prototype 40-km RUC-2 that became operational at NCEP in 1999. RUC-2 introduced a faster cycle (1 hr vs. 3 hr) and improved mesoscale physics. The second system, Augmented Winds (AW), is a prototype en route wind application developed by MITLL based on the Integrated Terminal Wind System (ITWS). AW is run at a local facility (Center) level, and updates RUC predictions based on an

  8. Comparison of Dimensional Accuracies Using Two Elastomeric Impression Materials in Casting Three-dimensional Tool Marks.

    PubMed

    Wang, Zhen

    2016-05-01

The purpose of this study was to evaluate two types of impression materials which were frequently used for casting three-dimensional tool marks in China, namely (i) dental impression material and (ii) special elastomeric impression material for tool mark casting. The two different elastomeric impression materials were compared under equal conditions. The parameters measured were dimensional accuracy, the number of air bubbles, the ease of use, and the sharpness and quality of the individual characteristics present on casts. The results showed that dental impression material had the advantage over special elastomeric impression material in casting tool marks at crime scenes, as it combined ease of use, dimensional accuracy, sharpness, and high quality.

  9. Effect of Flexural Rigidity of Tool on Machining Accuracy during Microgrooving by Ultrasonic Vibration Cutting Method

    NASA Astrophysics Data System (ADS)

    Furusawa, Toshiaki

    2010-12-01

It is necessary to form fine holes and grooves by machining in the manufacture of equipment in the medical or information field, and the establishment of such a machining technology is required. In micromachining, the use of the ultrasonic vibration cutting method is promising and is examined here. In this study, I experimentally form microgrooves in stainless steel SUS304 by the ultrasonic vibration cutting method and examine the effects of the shape and material of the tool on the machining accuracy. The results clarify the following. Evaluation of the straightness of the finished surface revealed that there is an optimal tool rake angle, related to the increase in cutting resistance caused by increases in work hardening and the cutting area. Straightness is improved by using a tool with low flexural rigidity. In particular, Young's modulus affects the cutting accuracy more significantly than the shape of the tool.

  10. Data Accuracy in Citation Studies.

    ERIC Educational Resources Information Center

    Boyce, Bert R.; Banning, Carolyn Sue

    1979-01-01

    Four hundred eighty-seven citations of the 1976 issues of the Journal of the American Society for Information Science and the Personnel and Guidance Journal were checked for accuracy: total error was 13.6 percent and 10.7 percent, respectively. Error categories included incorrect author name, article/book title, journal title; wrong entry; and…

  11. Machine tool accuracy characterization workshops. Final report, May 5, 1992--November 5, 1993

    SciTech Connect

    1995-01-06

The ability to assess the accuracy of machine tools is required by both tool builders and users. Builders must have this ability in order to predict the accuracy capability of a machine tool for different part geometries, to provide verifiable accuracy information for sales purposes, and to locate error sources for maintenance, troubleshooting, and design enhancement. Users require the same ability in order to make intelligent choices in selecting or procuring machine tools, to predict component manufacturing accuracy, and to perform maintenance and troubleshooting. In both instances, the ability to fully evaluate the accuracy capabilities of a machine tool and the source of its limitations is essential for using the tool to its maximum accuracy and productivity potential. This project was designed to transfer expertise in modern machine tool accuracy testing methods from LLNL to US industry, and to educate users on the use and application of emerging standards for machine tool performance testing.

  12. The Accuracy of IOS Device-based uHear as a Screening Tool for Hearing Loss: A Preliminary Study From the Middle East

    PubMed Central

    Al-Abri, Rashid; Al-Balushi, Mustafa; Kolethekkat, Arif; Bhargava, Deepa; Al-Alwi, Amna; Al-Bahlani, Hana; Al-Garadi, Manal

    2016-01-01

    Objectives To determine and explore the potential use of uHear as a screening test for determining hearing disability by evaluating its accuracy in a clinical setting and a soundproof booth when compared to the gold standard conventional audiometry.   Methods Seventy Sultan Qaboos University students above the age of 17 years who had normal hearing were recruited for the study. They underwent a hearing test using conventional audiometry in a soundproof room, a self-administered uHear evaluation in a side room resembling a clinic setting, and a self-administered uHear test in a soundproof booth. The mean pure tone average (PTA) of thresholds at 500, 1000, 2000 and 4000 Hz for all the three test modalities was calculated, compared, and analyzed statistically.   Results There were 36 male and 34 female students in the study. The PTA with conventional audiometry ranged from 1 to 21 dB across left and right ears. The PTA using uHear in the side room for the same participants was 25 dB in the right ear and 28 dB in the left ear (3–54 dB across all ears). The PTA for uHear in the soundproof booth was 18 dB and 17 dB (1–43 dB) in the right and left ears, respectively. Twenty-three percent of participants were reported to have a mild hearing impairment (PTA > 25 dB) using the soundproof uHear test, and this number was 64% for the same test in the side room. For the same group, only 3% of participants were reported to have a moderate hearing impairment (PTA > 40 dB) using the uHear test in a soundproof booth, and 13% in the side room.   Conclusion uHear in any setting lacks specificity in the range of normal hearing and is highly unreliable in giving the exact hearing threshold in clinical settings. However, there is a potential for the use of uHear if it is used to rule out moderate hearing loss, even in a clinical setting, as exemplified by our study. This method needs standardization through further research. PMID:27168926
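The PTA figures above are four-frequency averages. A minimal sketch of the computation and the impairment cut-offs used in the study (thresholds are hypothetical):

```python
def pure_tone_average(thresholds_db):
    """Mean hearing threshold at 500, 1000, 2000 and 4000 Hz, in dB HL."""
    assert len(thresholds_db) == 4
    return sum(thresholds_db) / 4

def classify(pta):
    """Screening categories from the study: >25 dB mild, >40 dB moderate."""
    if pta > 40:
        return "moderate impairment"
    if pta > 25:
        return "mild impairment"
    return "normal"

pta = pure_tone_average([20, 25, 30, 35])  # hypothetical thresholds
print(pta, classify(pta))  # 27.5 -> mild impairment
```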

  13. Application of a Monte Carlo accuracy assessment tool to TDRS and GPS

    NASA Technical Reports Server (NTRS)

    Pavloff, Michael S.

    1994-01-01

In support of a NASA study on the application of radio interferometry to satellite orbit determination, MITRE developed a simulation tool for assessing interferometric tracking accuracy. Initially, the tool was applied to the problem of determining optimal interferometric station siting for orbit determination of the Tracking and Data Relay Satellite (TDRS). Subsequently, the Orbit Determination Accuracy Estimator (ODAE) was expanded to model the general batch maximum likelihood orbit determination algorithms of the Goddard Trajectory Determination System (GTDS) with measurement types including not only group and phase delay from radio interferometry, but also range, range rate, angular measurements, and satellite-to-satellite measurements. The user of ODAE specifies the statistical properties of error sources, including inherent observable imprecision, atmospheric delays, station location uncertainty, and measurement biases. Upon Monte Carlo simulation of the orbit determination process, ODAE calculates the statistical properties of the error in the satellite state vector and any other parameters for which a solution was obtained in the orbit determination. This paper presents results from ODAE application to two different problems: (1) determination of optimal geometry for interferometric tracking of TDRS, and (2) expected orbit determination accuracy for Global Positioning System (GPS) tracking of low-earth orbit (LEO) satellites. Conclusions about optimal ground station locations for TDRS orbit determination by radio interferometry are presented, and the feasibility of GPS-based tracking for IRIDIUM, a LEO mobile satellite communications (MOBILSATCOM) system, is demonstrated.

  14. Quantifying the prediction accuracy of a 1-D SVAT model at a range of ecosystems in the USA and Australia: evidence towards its use as a tool to study Earth's system interactions

    NASA Astrophysics Data System (ADS)

    Petropoulos, G. P.; North, M. R.; Ireland, G.; Srivastava, P. K.; Rendall, D. V.

    2015-10-01

This paper describes the validation of the SimSphere SVAT (Soil-Vegetation-Atmosphere Transfer) model conducted at a range of US and Australian ecosystem types. Specific focus was given to examining the models' ability in predicting shortwave incoming solar radiation (Rg), net radiation (Rnet), latent heat (LE), sensible heat (H), air temperature at 1.3 m (Tair 1.3 m) and air temperature at 50 m (Tair 50 m). Model predictions were compared against corresponding in situ measurements acquired for a total of 72 selected days of the year 2011 obtained from eight sites belonging to the AmeriFlux (USA) and OzFlux (Australia) monitoring networks. Selected sites were representative of a variety of environmental, biome and climatic conditions, to allow for the inclusion of contrasting conditions in the model evaluation. Overall, results showed a good agreement between the model predictions and the in situ measurements, particularly so for the Rg, Rnet, Tair 1.3 m and Tair 50 m parameters. The simulated Rg parameter exhibited a root mean square deviation (RMSD) within 25 % of the observed fluxes for 58 of the 72 selected days, whereas an RMSD within ~ 24 % of the observed fluxes was reported for the Rnet parameter for all days of study (RMSD = 58.69 W m-2). A systematic underestimation of Rg and Rnet (mean bias error (MBE) = -19.48 and -16.46 W m-2) was also found. Simulations for the Tair 1.3 m and Tair 50 m showed good agreement with the in situ observations, exhibiting RMSDs of 3.23 and 3.77 °C (within ~ 15 and ~ 18 % of the observed) for all days of analysis, respectively. Comparable, yet slightly less satisfactory simulation accuracies were exhibited for the H and LE parameters (RMSDs = 38.47 and 55.06 W m-2, ~ 34 and ~ 28 % of the observed). Highest simulation accuracies were obtained for the open woodland savannah and mulga woodland sites for most of the compared parameters. The Nash-Sutcliffe efficiency index for all parameters ranges from 0.720 to 0.998, suggesting
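The agreement statistics quoted above (RMSD, MBE, Nash-Sutcliffe efficiency) can all be computed from paired predicted/observed series. A sketch with hypothetical numbers, not the study's data:

```python
import math

def rmsd(pred, obs):
    """Root mean square deviation between predictions and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def mbe(pred, obs):
    """Mean bias error; negative values indicate systematic underestimation."""
    return sum(p - o for p, o in zip(pred, obs)) / len(obs)

def nash_sutcliffe(pred, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for p, o in zip(pred, obs))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

obs = [420.0, 515.0, 610.0, 480.0]   # hypothetical Rnet observations, W m-2
pred = [400.0, 500.0, 590.0, 470.0]  # model slightly underestimating
print(rmsd(pred, obs), mbe(pred, obs), nash_sutcliffe(pred, obs))
```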

  15. Investigation into the accuracy of a proposed laser diode based multilateration machine tool calibration system

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Longstaff, A. P.; Myers, A.

    2005-01-01

Geometric and thermal calibration of CNC machine tools is required in modern machine shops, with volumetric accuracy assessment becoming the standard machine tool qualification in many industries. Laser interferometry is a popular method of measuring the errors, but this, and other alternatives, tend to be expensive, time consuming or both. This paper investigates the feasibility of using a laser diode based system that capitalises on the low cost nature of the diode to provide multiple laser sources for fast error measurement using multilateration. Laser diode module technology enables improved wavelength stability and spectral linewidth, which are important factors for laser interferometry. With more than three laser sources, the set-up process can be greatly simplified while providing flexibility in the location of the laser sources, improving the accuracy of the system.
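Multilateration recovers a point from range measurements to several known source positions. A 2-D sketch of the linearized solve (illustrative geometry of my own, not the proposed system's algorithm, which works in 3-D with four or more sources):

```python
import math

def trilaterate_2d(stations, dists):
    """Position from ranges to three known stations (2-D sketch).
    Subtracting the first range equation from the others linearizes the
    system, leaving a 2x2 linear solve (here via Cramer's rule)."""
    (x1, y1), d1 = stations[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(stations[1:], dists[1:]):
        rows.append((2 * (xi - x1), 2 * (yi - y1)))
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    (a11, a12), (a21, a22) = rows
    det = a11 * a22 - a12 * a21
    x = (rhs[0] * a22 - a12 * rhs[1]) / det
    y = (a11 * rhs[1] - rhs[0] * a21) / det
    return x, y

stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # hypothetical source positions
target = (3.0, 4.0)
dists = [math.dist(s, target) for s in stations]   # noise-free ranges
print(trilaterate_2d(stations, dists))  # recovers (3.0, 4.0)
```

With more than the minimum number of sources, the same linearized rows would be stacked and solved by least squares, which is what makes the redundant-source set-up attractive.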

  16. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    SciTech Connect

    Debono, Josephine C; Poulos, Ann E

    2015-03-15

The aim of this study was first to evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers, and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies, and a quality evaluation tool was constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and of Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating the diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies, such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  17. An evaluation of the accuracy and speed of metagenome analysis tools

    PubMed Central

    Lindgreen, Stinus; Adair, Karen L.; Gardner, Paul P.

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html PMID:26778510

  18. Technical Highlight: NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools

    SciTech Connect

    Ridouane, E.H.

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes.

  19. Expansion/De-expansion Tool to Quantify the Accuracy of Prostate Contours

    SciTech Connect

    Chung, Eugene; Stenmark, Matthew H.; Evans, Cheryl; Narayana, Vrinda; McLaughlin, Patrick W.

    2012-05-01

    Purpose: Accurate delineation of the prostate gland on computed tomography (CT) remains a persistent challenge and continues to introduce geometric uncertainty into the planning and delivery of external beam radiotherapy. We, therefore, developed an expansion/de-expansion tool to quantify the contour errors and determine the location of the deviations. Methods and Materials: A planning CT scan and magnetic resonance imaging scan were prospectively acquired for 10 patients with prostate cancer. The prostate glands were contoured by 3 independent observers using the CT data sets with instructions to contour the prostate without underestimation but to minimize overestimation. The standard prostate for each patient was defined using magnetic resonance imaging and CT on multiple planes. After registration of the CT and magnetic resonance imaging data sets, the CT-defined prostates were scored for accuracy. The contours were defined as ideal if they were within a 2.5-mm expansion of the standard without underestimation, acceptable if they were within a 5.0-mm expansion and a 2.5-mm de-expansion, and unacceptable if they extended >5.0 mm or underestimated the prostate by >2.5 mm. Results: A total of 636 CT slices were individually analyzed, with the vast majority scored as ideal or acceptable. However, none of the 30 prostate contour sets had all the contours scored as ideal or acceptable. For all 3 observers, the unacceptable contours were more likely from underestimation than overestimation of the prostate. The errors were more common at the base and apex than the mid-gland. Conclusions: The expansion/de-expansion tool allows for directed feedback on the location of contour deviations, as well as the determination of over- or underestimation of the prostate. This metric might help improve the accuracy of prostate contours.
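The scoring rule above maps the signed margin between a drawn contour and the standard prostate surface to a category. A minimal sketch (positive margins denote overestimation, negative denote underestimation, in mm):

```python
def score_margin(margin_mm):
    """Score one contour location against the standard surface:
    ideal: within a 2.5 mm expansion, with no underestimation;
    acceptable: within a 5.0 mm expansion and a 2.5 mm de-expansion;
    unacceptable: beyond either bound."""
    if 0.0 <= margin_mm <= 2.5:
        return "ideal"
    if -2.5 <= margin_mm <= 5.0:
        return "acceptable"
    return "unacceptable"

print([score_margin(m) for m in (1.0, -1.5, 4.0, 6.2, -3.0)])
# ['ideal', 'acceptable', 'acceptable', 'unacceptable', 'unacceptable']
```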

  20. Dynamics of Complexity and Accuracy: A Longitudinal Case Study of Advanced Untutored Development

    ERIC Educational Resources Information Center

    Polat, Brittany; Kim, Youjin

    2014-01-01

    This longitudinal case study follows a dynamic systems approach to investigate an under-studied research area in second language acquisition, the development of complexity and accuracy for an advanced untutored learner of English. Using the analytical tools of dynamic systems theory (Verspoor et al. 2011) within the framework of complexity,…

  1. Air traffic control surveillance accuracy and update rate study

    NASA Technical Reports Server (NTRS)

    Craigie, J. H.; Morrison, D. D.; Zipper, I.

    1973-01-01

    The results of an air traffic control surveillance accuracy and update rate study are presented. The objective of the study was to establish quantitative relationships between the surveillance accuracies, update rates, and the communication load associated with the tactical control of aircraft for conflict resolution. The relationships are established for typical types of aircraft, phases of flight, and types of airspace. Specific cases are analyzed to determine the surveillance accuracies and update rates required to prevent two aircraft from approaching each other too closely.
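In the simplest straight-line case, conflict prediction of the kind studied here reduces to the closest point of approach between two aircraft. A minimal kinematic sketch (positions in nmi, speeds in knots; my own example, not the study's analysis):

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Time (hours) and distance (nmi) of closest approach for two aircraft
    flying constant-velocity, straight-line tracks."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    wx, wy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    w2 = wx * wx + wy * wy
    t = 0.0 if w2 == 0 else max(0.0, -(rx * wx + ry * wy) / w2)
    dx, dy = rx + wx * t, ry + wy * t
    return t, math.hypot(dx, dy)

# Head-on conflict: 20 nmi apart, closing at 480 + 360 = 840 knots
t, d = closest_approach((0.0, 0.0), (480.0, 0.0), (20.0, 0.0), (-360.0, 0.0))
print(t, d)  # separation reaches zero after ~86 seconds
```

Surveillance accuracy and update rate enter by perturbing the position and velocity estimates fed into such a predictor, which is what drives the required tactical-control communication load.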

  2. NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes. Researchers at the National Renewable Energy Laboratory (NREL) have developed models for evaluating the thermal performance of walls in existing homes that will improve the accuracy of building energy simulation tools when predicting potential energy savings of existing homes. Uninsulated walls are typical in older homes where the wall cavities were not insulated during construction or where the insulating material has settled. Accurate calculation of heat transfer through building enclosures will help determine the benefit of energy efficiency upgrades in order to reduce energy consumption in older American homes. NREL performed detailed computational fluid dynamics (CFD) analysis to quantify the energy loss/gain through the walls and to visualize different airflow regimes within the uninsulated cavities. The effects of ambient outdoor temperature, radiative properties of building materials, and insulation level were investigated. The study showed that multi-dimensional airflows occur in walls with uninsulated cavities and that the thermal resistance is a function of the outdoor temperature - an effect not accounted for in existing building energy simulation tools. The study quantified the difference between CFD prediction and the approach currently used in building energy simulation tools over a wide range of conditions. For example, researchers found that CFD predicted lower heating loads and slightly higher cooling loads. Implementation of CFD results into building energy simulation tools such as DOE2 and EnergyPlus will likely reduce the predicted heating load of homes. Researchers also determined that a small air gap in a partially insulated cavity can lead to a significant reduction in thermal resistance. For instance, a 4-in. tall air gap

  3. A study of laseruler accuracy and precision (1986-1987)

    SciTech Connect

    Ramachandran, R.S.; Armstrong, K.P.

    1989-06-22

A study was conducted to investigate Laseruler accuracy and precision. Tests were performed on 0.050 in., 0.100 in., and 0.120 in. gauge block standards. Results showed an accuracy of 3.7 μin. for the 0.120 in. standard, with higher accuracies for the two thinner blocks. The Laseruler precision was 4.83 μin. for the 0.120 in. standard, 3.83 μin. for the 0.100 in. standard, and 4.2 μin. for the 0.050 in. standard.
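Accuracy and precision here are distinct statistics of repeated gauge-block readings: accuracy is the offset of the mean from the certified length, precision the spread of the readings. A sketch with hypothetical readings, not the study's raw data:

```python
import statistics

def accuracy_precision(readings_in, reference_in):
    """Accuracy as |mean - reference| and precision as the sample standard
    deviation of repeated readings, both reported in micro-inches."""
    accuracy = abs(statistics.mean(readings_in) - reference_in) * 1e6
    precision = statistics.stdev(readings_in) * 1e6
    return accuracy, precision

# Hypothetical repeated readings of a 0.120 in. gauge block, in inches
readings = [0.1200037, 0.1200081, 0.1199995, 0.1200042, 0.1199988]
acc, prec = accuracy_precision(readings, 0.120)
print(acc, prec)  # a few micro-inches each, as in the study's range
```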

  4. On the accuracy of Hipparcos using binary stars as a calibration tool

    SciTech Connect

    Docobo, J. A.; Andrade, M.

    2015-02-01

Stellar binary systems, specifically those that present the most accurate available orbital elements, are a reliable tool to test the accuracy of astrometric observations. We selected all 35 binaries with these characteristics. Our objective is to provide standard uncertainties for the positions and parallaxes measured by Hipparcos relative to this trustworthy set, as well as to check supposed correlations between several parameters (measurement residuals, positions, magnitudes, and parallaxes). In addition, using the high-confidence subset of visual–spectroscopic binaries, we implemented a validation test of the Hipparcos trigonometric parallaxes of binary systems that allowed the evaluation of their reliability. Standard and non-standard statistical analysis techniques were applied in order to achieve well-founded conclusions. In particular, errors-in-variables models such as the total least-squares method were used to validate Hipparcos parallaxes by comparison with those obtained directly from the orbital elements. Previously, we executed Thompson's τ technique in order to detect suspected outliers in the data. Furthermore, several statistical hypothesis tests were carried out to verify if our results were statistically significant. A statistically significant trend indicating larger Hipparcos angular separations with respect to the reference values in 5.2 ± 1.4 mas was found at the 10⁻⁸ significance level. Uncertainties in the polar coordinates θ and ρ of 1°.8 and 6.3 mas, respectively, were estimated for the Hipparcos observations of binary systems. We also verified that the parallaxes of binary systems measured in this mission are absolutely compatible with the set of orbital parallaxes obtained from the most accurate orbits at least at the 95% confidence level. This methodology allows us to better estimate the accuracy of Hipparcos observations of binary systems. Indeed, further application to the data collected by Gaia should yield a

  5. On the Hipparcos Accuracy Using Binary Stars as a Calibration Tool

    NASA Astrophysics Data System (ADS)

    Docobo, J. A.; Andrade, M.

    2015-02-01

    Stellar binary systems, specifically those with the most accurate available orbital elements, are a reliable tool for testing the accuracy of astrometric observations. We selected all 35 binaries with these characteristics. Our objective is to provide standard uncertainties for the positions and parallaxes measured by Hipparcos relative to this trustworthy set, as well as to check supposed correlations between several parameters (measurement residuals, positions, magnitudes, and parallaxes). In addition, using the high-confidence subset of visual-spectroscopic binaries, we implemented a validation test of the Hipparcos trigonometric parallaxes of binary systems that allowed the evaluation of their reliability. Standard and non-standard statistical analysis techniques were applied in order to achieve well-founded conclusions. In particular, errors-in-variables models such as the total least-squares method were used to validate Hipparcos parallaxes by comparison with those obtained directly from the orbital elements. Beforehand, we applied Thompson's τ technique in order to detect suspected outliers in the data. Furthermore, several statistical hypothesis tests were carried out to verify that our results were statistically significant. A statistically significant trend indicating larger Hipparcos angular separations with respect to the reference values, of 5.2 ± 1.4 mas, was found at the 10⁻⁸ significance level. Uncertainties in the polar coordinates θ and ρ of 1.°8 and 6.3 mas, respectively, were estimated for the Hipparcos observations of binary systems. We also verified that the parallaxes of binary systems measured in this mission are fully compatible with the set of orbital parallaxes obtained from the most accurate orbits, at least at the 95% confidence level. This methodology allows us to better estimate the accuracy of Hipparcos observations of binary systems. Indeed, further application to the data collected by Gaia should yield a standard
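    The outlier-screening step the abstract describes (Thompson's τ) can be sketched as follows; the function name and sample data are illustrative only, and the real analysis is run on the Hipparcos residuals rather than this toy sample:

    ```python
    import numpy as np
    from scipy import stats

    def thompson_tau_outliers(x, alpha=0.05):
        """Iteratively flag outliers with the modified Thompson tau technique.

        At each pass the point farthest from the current sample mean is
        rejected if its absolute deviation exceeds tau * s, where tau is
        built from Student's t critical value for the current sample size.
        """
        x = np.asarray(x, dtype=float)
        keep = np.ones(x.size, dtype=bool)
        flagged = []
        while keep.sum() > 2:
            xi = x[keep]
            n = xi.size
            mean, s = xi.mean(), xi.std(ddof=1)
            t = stats.t.ppf(1 - alpha / 2, n - 2)
            tau = t * (n - 1) / (np.sqrt(n) * np.sqrt(n - 2 + t**2))
            dev = np.abs(x - mean)
            dev[~keep] = -np.inf          # only kept points are candidates
            i = int(np.argmax(dev))
            if dev[i] > tau * s:
                keep[i] = False
                flagged.append(i)
            else:
                break
        return flagged
    ```

    For example, `thompson_tau_outliers([10.1, 9.9, 10.0, 10.2, 9.8, 25.0])` flags the last value and leaves the tight cluster untouched.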

  6. Assessing the quality of studies on the diagnostic accuracy of tumor markers

    PubMed Central

    Goebell, Peter J.; Kamat, Ashish M.; Sylvester, Richard J.; Black, Peter; Droller, Michael; Godoy, Guilherme; Hudson, M’Liss A.; Junker, Kerstin; Kassouf, Wassim; Knowles, Margaret A.; Schulz, Wolfgang A.; Seiler, Roland; Schmitz-Dräger, Bernd J.

    2015-01-01

    Objectives With rapidly increasing numbers of publications, assessments of study quality, reporting quality, and classification of studies according to their level of evidence or developmental stage have become key issues in weighing the relevance of new information reported. Diagnostic marker studies are often criticized for yielding highly discrepant and even controversial results. Much of this discrepancy has been attributed to differences in study quality. So far, numerous tools for measuring study quality have been developed, but few of them have been used for systematic reviews and meta-analysis. This is owing to the fact that most tools are complicated and time consuming, suffer from poor reproducibility, and do not permit quantitative scoring. Methods The International Bladder Cancer Network (IBCN) has adopted this problem and has systematically identified the more commonly used tools developed since 2000. Results In this review, those tools addressing study quality (Quality Assessment of Studies of Diagnostic Accuracy and Newcastle-Ottawa Scale), reporting quality (Standards for Reporting of Diagnostic Accuracy), and developmental stage (IBCN phases) of studies on diagnostic markers in bladder cancer are introduced and critically analyzed. Based upon this, the IBCN has launched an initiative to assess and validate existing tools with emphasis on diagnostic bladder cancer studies. Conclusions The development of simple and reproducible tools for quality assessment of diagnostic marker studies permitting quantitative scoring is suggested. PMID:25159014

  7. Cutting tool study: 21-6-9 stainless steel

    SciTech Connect

    McManigle, A.P.

    1992-07-29

    The Rocky Flats Plant conducted a study to test cermet cutting tools by performing machinability studies on War Reserve product under controlled conditions. The purpose of these studies was to determine the most satisfactory tools that optimize tool life, minimize costs, improve reliability and chip control, and increase productivity by performing the operations to specified accuracies. This study tested three manufacturers' cermet cutting tools and a carbide tool used previously by the Rocky Flats Plant for machining spherical-shaped 21-6-9 stainless steel forgings (Figure 1). The 80-degree diamond inserts were tested by experimenting with various chip-breaker geometries, cutting speeds, feedrates, and cermet grades on the outside contour roughing operation. The cermets tested were manufactured by Kennametal, Valenite, and NTK. The carbide tool ordinarily used for this operation is manufactured by Carboloy. Evaluation of the tools was conducted by investigating the number of passes per part and parts per insert, tool wear, cutting time, tool life, surface finish, and stem taper. Benefits to be gained from this study were: improved part quality, better chip control, increased tool life and utilization, and greater fabrication productivity. This was to be accomplished by performing the operation to specified accuracies within the scope of the tools tested.

  8. The accuracy of a patient or parent-administered bleeding assessment tool administered in a paediatric haematology clinic.

    PubMed

    Lang, A T; Sturm, M S; Koch, T; Walsh, M; Grooms, L P; O'Brien, S H

    2014-11-01

    Classifying and describing bleeding symptoms is essential in the diagnosis and management of patients with mild bleeding disorders (MBDs). There has been increased interest in the use of bleeding assessment tools (BATs) to more objectively quantify the presence and severity of bleeding symptoms. To date, the administration of BATs has been performed almost exclusively by clinicians; the accuracy of a parent-proxy BAT has not been studied. Our objective was to determine the accuracy of a parent-administered BAT by measuring the level of agreement between parent and clinician responses to the Condensed MCMDM-1VWD Bleeding Questionnaire. Our cross-sectional study included children 0-21 years presenting to a haematology clinic for initial evaluation of a suspected MBD or follow-up evaluation of a previously diagnosed MBD. The parent/caregiver completed a modified version of the BAT; the clinician separately completed the BAT through interview. The mean parent-report bleeding score (BS) was 6.09 (range: -2 to 25); the mean clinician report BS was 4.54 (range: -1 to 17). The mean percentage of agreement across all bleeding symptoms was 78% (mean κ = 0.40; Gwet's AC1 = 0.74). Eighty percent of the population had an abnormal BS (defined as ≥2) when rated by parents and 76% had an abnormal score when rated by clinicians (86% agreement, κ = 0.59, Gwet's AC1 = 0.79). While parents tended to over-report bleeding as compared to clinicians, overall, BSs were similar between groups. These results lend support for further study of a modified proxy-report BAT as a clinical and research tool.
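    The agreement statistics quoted above (percent agreement, Cohen's κ, Gwet's AC1) can be computed for two binary raters as in this minimal sketch; the function and rating vectors are invented for illustration, not taken from the study:

    ```python
    def agreement_stats(r1, r2):
        """Percent agreement, Cohen's kappa, and Gwet's AC1 for two raters
        giving binary (0/1) ratings, e.g. symptom absent/present."""
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
        p1, p2 = sum(r1) / n, sum(r2) / n              # marginal positive rates
        # Cohen's kappa: chance agreement from the raters' marginals
        pe_k = p1 * p2 + (1 - p1) * (1 - p2)
        kappa = (po - pe_k) / (1 - pe_k)
        # Gwet's AC1: chance agreement from the pooled prevalence
        pi = (p1 + p2) / 2
        pe_g = 2 * pi * (1 - pi)
        ac1 = (po - pe_g) / (1 - pe_g)
        return po, kappa, ac1
    ```

    When nearly every symptom is rated present by both raters, κ collapses toward zero while AC1 stays high, which is one reason studies such as this one report both (here κ = 0.40 vs. AC1 = 0.74).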

  9. Embodied Rules in Tool Use: A Tool-Switching Study

    ERIC Educational Resources Information Center

    Beisert, Miriam; Massen, Cristina; Prinz, Wolfgang

    2010-01-01

    In tool use, a transformation rule defines the relation between an operating movement and its distal effect. This rule is determined by the tool structure and requires no explicit definition. The present study investigates how humans represent and apply compatible and incompatible transformation rules in tool use. In Experiment 1, participants had…

  10. Does diagnosis affect the predictive accuracy of risk assessment tools for juvenile offenders: Conduct Disorder and Attention Deficit Hyperactivity Disorder.

    PubMed

    Khanna, Dinesh; Shaw, Jenny; Dolan, Mairead; Lennox, Charlotte

    2014-10-01

    Studies have suggested an increased risk of criminality in juveniles if they suffer from co-morbid Attention Deficit Hyperactivity Disorder (ADHD) along with Conduct Disorder. The Structured Assessment of Violence Risk in Youth (SAVRY), the Psychopathy Checklist Youth Version (PCL:YV), and Youth Level of Service/Case Management Inventory (YLS/CMI) have been shown to be good predictors of violent and non-violent re-offending. The aim was to compare the accuracy of these tools to predict violent and non-violent re-offending in young people with co-morbid ADHD and Conduct Disorder and Conduct Disorder only. The sample included 109 White-British adolescent males in secure settings. Results revealed no significant differences between the groups for re-offending. SAVRY factors had better predictive values than PCL:YV or YLS/CMI. Tools generally had better predictive values for the Conduct Disorder only group than the co-morbid group. Possible reasons for these findings have been discussed along with limitations of the study. PMID:25173178

  11. Structure alignment of membrane proteins: Accuracy of available tools and a consensus strategy.

    PubMed

    Stamm, Marcus; Forrest, Lucy R

    2015-09-01

    Protein structure alignment methods are used for the detection of evolutionary and functionally related positions in proteins. A wide array of different methods are available, but the choice of the best method is often not apparent to the user. Several studies have assessed the alignment accuracy and consistency of structure alignment methods, but none of these explicitly considered membrane proteins, which are important targets for drug development and have distinct structural features. Here, we compared 13 widely used pairwise structural alignment methods on a test set of homologous membrane protein structures (called HOMEP3). Each pair of structures was aligned and the corresponding sequence alignment was used to construct homology models. The model accuracy compared to the known structures was assessed using scoring functions not incorporated in the tested structural alignment methods. The analysis shows that fragment-based approaches such as FR-TM-align are the most useful for aligning structures of membrane proteins. Moreover, fragment-based approaches are more suitable for comparison of protein structures that have undergone large conformational changes. Nevertheless, no method was clearly superior to all other methods. Additionally, all methods lack a measure to rate the reliability of a position within a structure alignment. To solve both of these problems, we propose a consensus-type approach, combining alignments from four different methods, namely FR-TM-align, DaliLite, MATT, and FATCAT. Agreement between the methods is used to assign confidence values to each position of the alignment. Overall, we conclude that there remains scope for the improvement of structural alignment methods for membrane proteins.

  12. Study of accuracy of precipitation measurements using simulation method

    NASA Astrophysics Data System (ADS)

    Nagy, Zoltán; Lajos, Tamás; Morvai, Krisztián

    2013-04-01

    Precipitation is one of the most important meteorological parameters describing the state of the climate, and accurate measurement of precipitation is essential for deriving correct trends. The problem is that precipitation measurements are affected by systematic errors leading to an underestimation of actual precipitation; these errors vary with precipitation type and gauge type. It is well known that wind speed is the most important environmental factor contributing to the underestimation of actual precipitation, especially for solid precipitation. To study and correct the errors of precipitation measurements there are two basic possibilities: to use the results and conclusions of international precipitation measurement intercomparisons, or to build standard reference gauges (DFIR, pit gauge) and carry out one's own investigation. In 1999, at the Hungarian Meteorological Service (HMS), we attempted our own investigation and built standard reference gauges, but in the case of snow (use of the DFIR) the cost-benefit ratio was very poor: several winters passed without a significant amount of snow, while the condition of the DFIR continuously deteriorated. Because of this problem, a new approach was needed, namely modelling carried out by the Budapest University of Technology and Economics, Department of Fluid Mechanics, using the FLUENT 6.2 model. The ANSYS Fluent package is a featured fluid-dynamics solution for modelling flow and other related physical phenomena. It provides the tools needed to describe atmospheric processes and to design and optimize new equipment. The CFD package includes solvers that accurately simulate the behaviour of a broad range of flows, from single-phase to multi-phase. The questions we wanted to answer are as follows: How do the different types of gauges deform the airflow around themselves? Can a quantitative estimate of the wind-induced error be given? How does the use

  13. The science of and advanced technology for cost-effective manufacture of high precision engineering products. Volume 4. Thermal effects on the accuracy of numerically controlled machine tool

    NASA Astrophysics Data System (ADS)

    Venugopal, R.; Barash, M. M.; Liu, C. R.

    1985-10-01

    Thermal effects on the accuracy of numerically controlled machine tools are especially important in the context of unmanned manufacture or under conditions of precision metal cutting. Removal of the operator from the direct control of the metal cutting process has created problems in terms of maintaining accuracy. The objective of this research is to study thermal effects on the accuracy of numerically controlled machine tools. The initial part of the research report is concerned with the analysis of a hypothetical machine, whose thermal characteristics are studied. Numerical methods for evaluating the errors exhibited by the slides of the machine are proposed, and the possibility of predicting thermally induced errors by the use of regression equations is investigated. A method for computing the workspace error is also presented. The final part is concerned with the actual measurement of errors on a modern CNC machining center, with thermal influences on the errors as the main objective of the experimental work. Thermal influences on the errors of machine tools are predictable. Techniques for determining thermal effects on machine tools at the design stage are also presented. Keywords: Error models and prediction; Metrology; Automation.
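    The regression-based prediction of thermally induced errors mentioned above can be illustrated with a first-order model fitted to calibration data; the temperature rises and error values below are invented for the sketch, and a real model might include several temperature sensors and higher-order terms:

    ```python
    import numpy as np

    # Hypothetical calibration data: spindle temperature rise (°C) vs.
    # measured axis positioning error (micrometres).
    delta_T = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
    error_um = np.array([0.1, 2.2, 3.9, 6.1, 8.0, 10.2])

    # Fit a first-order regression model: error ≈ a + b * delta_T.
    # np.polyfit returns the highest-degree coefficient first.
    b, a = np.polyfit(delta_T, error_um, 1)

    def predicted_error(dT):
        """Predict thermally induced positioning error for a temperature rise."""
        return a + b * dT
    ```

    Once fitted, the model can be evaluated at temperatures not in the calibration set, which is how thermally induced errors could be compensated in software before cutting begins.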

  14. A Study on the Effect of Input Parameters on Springback Prediction Accuracy

    NASA Astrophysics Data System (ADS)

    Han, Y. S.; Yang, W. H.; Choi, K. Y.; Kim, B. H.

    2011-08-01

    In this study, the input parameters affecting springback prediction accuracy in Pamstamp2G simulations of a member part were investigated using Taguchi's method as a six-sigma tool, on the basis of experiments, in order to obtain much more accurate springback predictions. The combination of input parameters giving the highest springback prediction accuracy for the member part was then applied to a fender part. Cracks and wrinkles in the drawing and flanging operations must be eliminated before springback can be predicted with high accuracy. Springback compensation on the basis of the simulation was carried out. It is concluded that 95% dimensional accuracy of the springback prediction is achieved in comparison with the tryout panel.

  15. Accuracy study of the IDO scheme by Fourier analysis

    NASA Astrophysics Data System (ADS)

    Imai, Yohsuke; Aoki, Takayuki

    2006-09-01

    The numerical accuracy of the Interpolated Differential Operator (IDO) scheme is studied with Fourier analysis for the solutions of Partial Differential Equations (PDEs): advection, diffusion, and Poisson equations. The IDO scheme solves the governing equations not only for the physical variable but also for its first-order spatial derivative; spatial discretizations are based on Hermite interpolation functions constructed from both. In the Fourier analysis of the IDO scheme, the Fourier coefficients of the physical variable and the first-order derivative are coupled by the equations derived from the governing equations. The analysis shows the IDO scheme resolves all wavenumbers with higher accuracy than the fourth-order Finite Difference (FD) and Compact Difference (CD) schemes for the advection equation. In particular, for high wavenumbers, the accuracy is superior to that of the sixth-order Combined Compact Difference (CCD) scheme. The diffusion and Poisson equations are also solved more accurately than with the FD and CD schemes. These results show that the IDO scheme guarantees highly resolved solutions for all the terms of the fluid flow equations.
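    The Fourier analysis of the IDO scheme itself is involved, since the coefficients of the variable and its derivative are coupled; as a simpler illustration of the underlying modified-wavenumber technique, here it is applied to standard second- and fourth-order central differences (an assumption of this sketch, not the IDO discretization):

    ```python
    import numpy as np

    def modified_wavenumber(kh, order):
        """Modified wavenumber k'h of central finite differences on a uniform grid.

        Exact spectral differentiation would give k'h = kh; the deviation at
        high kh measures how poorly a scheme resolves short wavelengths.
        """
        if order == 2:
            return np.sin(kh)
        if order == 4:
            return (8 * np.sin(kh) - np.sin(2 * kh)) / 6
        raise ValueError("only 2nd and 4th order implemented")

    kh = np.linspace(0.01, np.pi, 200)
    err2 = np.abs(modified_wavenumber(kh, 2) - kh)   # 2nd-order resolution error
    err4 = np.abs(modified_wavenumber(kh, 4) - kh)   # 4th-order resolution error
    ```

    Plotting err2 and err4 against kh reproduces the familiar picture: both schemes are accurate at low wavenumbers and degrade toward the grid cutoff kh = π, with the fourth-order stencil closer to the exact line throughout.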

  16. Bias due to composite reference standards in diagnostic accuracy studies.

    PubMed

    Schiller, Ian; van Smeden, Maarten; Hadgu, Alula; Libman, Michael; Reitsma, Johannes B; Dendukuri, Nandini

    2016-04-30

    Composite reference standards (CRSs) have been advocated in diagnostic accuracy studies in the absence of a perfect reference standard. The rationale is that combining results of multiple imperfect tests leads to a more accurate reference than any one test in isolation. Focusing on a CRS that classifies subjects as disease positive if at least one component test is positive, we derive algebraic expressions for sensitivity and specificity of this CRS, sensitivity and specificity of a new (index) test compared with this CRS, as well as the CRS-based prevalence. We use as a motivating example the problem of evaluating a new test for Chlamydia trachomatis, an asymptomatic disease for which no gold-standard test exists. As the number of component tests increases, sensitivity of this CRS increases at the expense of specificity, unless all tests have perfect specificity. Therefore, such a CRS can lead to significantly biased accuracy estimates of the index test. The bias depends on disease prevalence and accuracy of the CRS. Further, conditional dependence between the CRS and index test can lead to over-estimation of index test accuracy. This commonly-used CRS combines results from multiple imperfect tests in a way that ignores information and therefore is not guaranteed to improve over a single imperfect reference unless each component test has perfect specificity, and the CRS is conditionally independent of the index test. When these conditions are not met, as in the case of C. trachomatis testing, more realistic statistical models should be researched instead of relying on such CRSs.
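    Under the additional assumption that the component tests are conditionally independent given disease status (which, as the abstract notes, often fails in practice), the any-positive CRS algebra reduces to two lines; the function name and test characteristics here are invented for illustration:

    ```python
    from math import prod

    def crs_any_positive(sens, spec):
        """Sensitivity and specificity of a composite reference standard that
        calls a subject positive if at least one component test is positive,
        assuming conditionally independent component tests."""
        # A truly diseased subject is missed only if every component misses it.
        se = 1 - prod(1 - s for s in sens)
        # A truly healthy subject is called negative only if every component is.
        sp = prod(spec)
        return se, sp
    ```

    With three tests of 80% sensitivity and 95% specificity each, the composite reaches 99.2% sensitivity while specificity falls to about 85.7%, exactly the trade-off described above.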

  17. A New 3D Tool for Assessing the Accuracy of Bimaxillary Surgery: The OrthoGnathicAnalyser

    PubMed Central

    Xi, Tong; Schreurs, Ruud; de Koning, Martien; Bergé, Stefaan; Maal, Thomas

    2016-01-01

    Aim The purpose of this study was to present and validate an innovative semi-automatic approach to quantify the accuracy of the surgical outcome in relation to 3D virtual orthognathic planning among patients who underwent bimaxillary surgery. Material and Method For the validation of this new semi-automatic approach, CBCT scans of ten patients who underwent bimaxillary surgery were acquired pre-operatively. Individualized 3D virtual operation plans were made for all patients prior to surgery. During surgery, the maxillary and mandibular segments were positioned as planned by using 3D milled interocclusal wafers. Consequently, post-operative CBCT scan were acquired. The 3D rendered pre- and postoperative virtual head models were aligned by voxel-based registration upon the anterior cranial base. To calculate the discrepancies between the 3D planning and the actual surgical outcome, the 3D planned maxillary and mandibular segments were segmented and superimposed upon the postoperative maxillary and mandibular segments. The translation matrices obtained from this registration process were translated into translational and rotational discrepancies between the 3D planning and the surgical outcome, by using the newly developed tool, the OrthoGnathicAnalyser. To evaluate the reproducibility of this method, the process was performed by two independent observers multiple times. Results Low intra-observer and inter-observer variations in measurement error (mean error < 0.25 mm) and high intraclass correlation coefficients (> 0.97) were found, supportive of the observer independent character of the OrthoGnathicAnalyser. The pitch of the maxilla and mandible showed the highest discrepancy between the 3D planning and the postoperative results, 2.72° and 2.75° respectively. 
Conclusion This novel method provides a reproducible tool for the evaluation of bimaxillary surgery, making it possible to compare larger patient groups in an objective and time-efficient manner in order to

  18. Numerical Stability and Accuracy of Temporally Coupled Multi-Physics Modules in Wind-Turbine CAE Tools

    SciTech Connect

    Gasmi, A.; Sprague, M. A.; Jonkman, J. M.; Jones, W. B.

    2013-02-01

    In this paper we examine the stability and accuracy of numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. We employ two simple examples as test systems, while algorithm descriptions are kept general. Coupled-system governing equations are framed in monolithic and partitioned representations as differential-algebraic equations. Explicit and implicit loose partition coupling is examined. In explicit coupling, partitions are advanced in time from known information. In implicit coupling, there is dependence on other-partition data at the next time step; coupling is accomplished through a predictor-corrector (PC) approach. Numerical time integration of coupled ordinary-differential equations (ODEs) is accomplished with one of three, fourth-order fixed-time-increment methods: Runge-Kutta (RK), Adams-Bashforth (AB), and Adams-Bashforth-Moulton (ABM). Through numerical experiments it is shown that explicit coupling can be dramatically less stable and less accurate than simulations performed with the monolithic system. However, PC implicit coupling restored stability and fourth-order accuracy for ABM; only second-order accuracy was achieved with RK integration. For systems without constraints, explicit time integration with AB and explicit loose coupling exhibited desired accuracy and stability.
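    The contrast between explicit loose coupling and predictor-corrector (PC) coupling can be seen on a toy partitioned system; this sketch deliberately uses first- and second-order integrators rather than the paper's fourth-order RK/AB/ABM schemes:

    ```python
    import math

    # Toy coupled system: an oscillator split into two "partitions",
    #   partition 1: x' = -y,   partition 2: y' = x,
    # with exact solution x = cos(t), y = sin(t) for x(0) = 1, y(0) = 0.

    def explicit_loose(x, y, h, steps):
        """Explicit coupling: each partition advances from the other's
        old (lagged) data only."""
        for _ in range(steps):
            x, y = x + h * (-y), y + h * x   # both right-hand sides use old values
        return x, y

    def predictor_corrector(x, y, h, steps):
        """PC coupling: predict with old data, then correct with a
        trapezoidal rule that sees the other partition's predicted value."""
        for _ in range(steps):
            xp = x + h * (-y)                 # predictor, partition 1
            yp = y + h * x                    # predictor, partition 2
            x, y = x + 0.5 * h * (-y - yp), y + 0.5 * h * (x + xp)
        return x, y
    ```

    Over one time unit with h = 0.01 the PC variant's error is far smaller than the lagged explicit variant's, mirroring the paper's finding that a predictor-corrector pass restores the accuracy lost to loose coupling.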

  19. High-accuracy mass spectrometry for fundamental studies.

    PubMed

    Kluge, H-Jürgen

    2010-01-01

    Mass spectrometry for fundamental studies in metrology and atomic, nuclear and particle physics requires extreme sensitivity and efficiency as well as ultimate resolving power and accuracy. An overview will be given on the global status of high-accuracy mass spectrometry for fundamental physics and metrology. Three quite different examples of modern mass spectrometric experiments in physics are presented: (i) the retardation spectrometer KATRIN at the Forschungszentrum Karlsruhe, employing electrostatic filtering in combination with magnetic-adiabatic collimation (the biggest mass spectrometer for determining the smallest mass, i.e. the mass of the electron anti-neutrino), (ii) the Experimental Cooler-Storage Ring at GSI (a mass spectrometer of medium size, relative to other accelerators, for determining medium-heavy masses) and (iii) the Penning trap facility, SHIPTRAP, at GSI (the smallest mass spectrometer for determining the heaviest masses, those of super-heavy elements). Finally, a short view into the future will address the GSI project HITRAP for fundamental studies with highly-charged ions.

  20. Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.

    2016-09-01

    A cadastral map is parcel-based information specifically designed to define the limits of boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatial technology, especially Geographical Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. The cadastral modernization will result in the new cadastral database no longer being based on single and static parcel paper maps, but on a global digital map. Despite the strict process of the cadastral modernization, this reform has raised unexpected queries that remain essential to be addressed. The main focus of this study is to review the issues that have been generated by this transition. The transformed cadastral database should be additionally treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. The results of this review will serve as a foundation for investigating systematic and effective methods for Positional Accuracy Improvement (PAI) in cadastral database modernization.

  2. Accuracy of the Emotion Thermometers (ET) screening tool in patients undergoing surgery for upper gastrointestinal malignancies.

    PubMed

    Schubart, Jane R; Mitchell, Alex J; Dietrich, Laura; Gusani, Niraj J

    2015-01-01

    Distress is common in patients with gastrointestinal cancers. Most conventional scales are too long for routine clinic use. We tested the Emotion Thermometers (ET) tool, a brief visual-analogue scale comprising four emotional-upset thermometers: distress, anxiety, depression, and anger. Sixty-nine surgical patients were recruited from an academic hospital clinic in 2012; 64 had complete data for the Beck Depression Inventory and the ET. The sample size was modest due to the specialist nature of the sample. We examined sensitivity, specificity, and the area under the receiver operating characteristic curve. A dimensional multi-domain approach to screening for emotional disorders is preferable to using the distress thermometer alone and can be achieved with little extra time burden to clinicians. The ET is a diagnostic tool primarily designed for screening to identify cancer patients who would benefit from enhanced psychosocial care.
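    The screening metrics used above (sensitivity, specificity, area under the ROC curve) can be computed from raw scores and case labels as in this generic sketch; the functions and data are illustrative, not the study's:

    ```python
    def roc_auc(scores, labels):
        """Area under the ROC curve via the rank (Mann-Whitney) identity:
        the probability that a randomly chosen case scores higher than a
        randomly chosen control, counting ties as one half."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def sens_spec(scores, labels, threshold):
        """Sensitivity and specificity of 'score >= threshold' as a positive call."""
        pairs = list(zip(scores, labels))
        tp = sum(s >= threshold and y == 1 for s, y in pairs)
        fn = sum(s < threshold and y == 1 for s, y in pairs)
        tn = sum(s < threshold and y == 0 for s, y in pairs)
        fp = sum(s >= threshold and y == 0 for s, y in pairs)
        return tp / (tp + fn), tn / (tn + fp)
    ```

    Because roc_auc uses the rank identity, no explicit ROC curve needs to be constructed; sweeping the threshold in sens_spec traces out the curve point by point.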

  3. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis.

    PubMed

    Litjens, Geert; Sánchez, Clara I; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce 'deep learning' as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30-40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that 'deep learning' holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  6. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management.

    PubMed

    Cristescu, Romane H; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-01-01

    Accurate data on presence/absence and spatial distribution for fauna species are key to their conservation. Collecting such data, however, can be time-consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, or cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties, however, and often suffer a high false-negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently outperformed human-only teams. Off-leash, the dog's detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams.

  7. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management

    PubMed Central

    Cristescu, Romane H.; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-01-01

    Accurate data on presence/absence and spatial distribution for fauna species are key to their conservation. Collecting such data, however, can be time-consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, or cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties, however, and often suffer a high false-negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently outperformed human-only teams. Off-leash, the dog's detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams. PMID:25666691

  8. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    PubMed Central

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  10. Real-time diagnosis of H. pylori infection during endoscopy: Accuracy of an innovative tool (EndoFaster)

    PubMed Central

    Costamagna, Guido; Zullo, Angelo; Bizzotto, Alessandra; Hassan, Cesare; Riccioni, Maria Elena; Marmo, Clelia; Strangio, Giuseppe; Di Rienzo, Teresa Antonella; Cammarota, Giovanni; Gasbarrini, Antonio; Repici, Alessandro

    2015-01-01

    Background: EndoFaster is a novel device able to perform real-time ammonium measurement in gastric juice, allowing H. pylori diagnosis during endoscopy. This large study aimed to validate the accuracy of EndoFaster for real-time H. pylori detection. Methods: Consecutive patients who underwent upper endoscopy in two centres were prospectively enrolled. During endoscopy, 4 ml of gastric juice were aspirated for automatic analysis by EndoFaster within 90 seconds, and H. pylori was considered present (>62 ppm/ml) or absent (≤62 ppm/ml). Accuracy was measured using histology as the gold standard, and 13C-urea breath test (UBT) in discordant cases. Accuracy, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated. Results: Overall, 189 patients were enrolled, but in seven (3.4%) the amount of aspirated gastric juice was insufficient to perform the test. The accuracy, sensitivity, specificity, PPV, and NPV were 87.4%, 90.3%, 85.5%, 80.2%, and 93.1%, respectively, and 92.6%, 97.1%, 89.7%, 85.9%, and 98.0%, respectively, when H. pylori status was reclassified according to the UBT result in discordant cases. Conclusions: This study found a high accuracy/feasibility of EndoFaster for real-time H. pylori diagnosis. Use of EndoFaster may allow selecting those patients in whom routine gastric biopsies could be avoided. PMID:27403299

  11. Nonparametric meta-analysis for diagnostic accuracy studies.

    PubMed

    Zapf, Antonia; Hoyer, Annika; Kramer, Katharina; Kuss, Oliver

    2015-12-20

    Summarizing the information of many studies in a meta-analysis is becoming increasingly important, also in the field of diagnostic studies. The special challenge in meta-analysis of diagnostic accuracy studies is that sensitivity and specificity are, in general, co-primary endpoints. Across studies both endpoints are correlated, and this correlation has to be considered in the analysis. The standard approach for such a meta-analysis is the bivariate logistic random-effects model. An alternative approach is to use marginal beta-binomial distributions for the true positives and the true negatives, linked by copula distributions. In this article, we propose a new, nonparametric approach of analysis, which has greater flexibility with respect to the correlation structure, and always converges. In a simulation study, it becomes apparent that the empirical coverage of all three approaches is in general below the nominal level. Regarding bias, empirical coverage, and mean squared error, the nonparametric model is often superior to the standard model and comparable with the copula model. The three approaches are also applied to two example meta-analyses. PMID:26174020
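The co-primary endpoints that any of the three models summarize are the per-study sensitivity/specificity pairs computed from each study's 2x2 table. A minimal sketch of that input computation with hypothetical counts (this is only the study-level step, not the bivariate, copula, or nonparametric model itself):

```python
def sens_spec(tp, fn, tn, fp):
    """Study-level sensitivity and specificity from 2x2 counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical (TP, FN, TN, FP) counts for three studies
studies = [
    (45, 5, 80, 20),
    (30, 10, 50, 10),
    (60, 15, 90, 30),
]

for i, counts in enumerate(studies, 1):
    se, sp = sens_spec(*counts)
    print(f"study {i}: sensitivity={se:.2f}, specificity={sp:.2f}")
```

The correlation the abstract mentions arises because these pairs vary together across studies (e.g. via threshold differences), which is why the two endpoints cannot be pooled independently.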

  12. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

    Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost-prohibitive (it would require collecting independent, well-defined test points), but a quantitative analysis of relative positional error is feasible.

  13. Reporting standards for studies of diagnostic test accuracy in dementia

    PubMed Central

    Noel-Storr, Anna H.; McCleery, Jenny M.; Richard, Edo; Ritchie, Craig W.; Flicker, Leon; Cullum, Sarah J.; Davis, Daniel; Quinn, Terence J.; Hyde, Chris; Rutjes, Anne W.S.; Smailagic, Nadja; Marcus, Sue; Black, Sandra; Blennow, Kaj; Brayne, Carol; Fiorivanti, Mario; Johnson, Julene K.; Köpke, Sascha; Schneider, Lon S.; Simmons, Andrew; Mattsson, Niklas; Zetterberg, Henrik; Bossuyt, Patrick M.M.; Wilcock, Gordon

    2014-01-01

    Objective: To provide guidance on standards for reporting studies of diagnostic test accuracy for dementia disorders. Methods: An international consensus process on reporting standards in dementia and cognitive impairment (STARDdem) was established, focusing on studies presenting data from which sensitivity and specificity were reported or could be derived. A working group led the initiative through 4 rounds of consensus work, using a modified Delphi process and culminating in a face-to-face consensus meeting in October 2012. The aim of this process was to agree on how best to supplement the generic standards of the STARD statement to enhance their utility and encourage their use in dementia research. Results: More than 200 comments were received during the wider consultation rounds. The areas at most risk of inadequate reporting were identified and a set of dementia-specific recommendations to supplement the STARD guidance were developed, including better reporting of patient selection, the reference standard used, avoidance of circularity, and reporting of test-retest reliability. Conclusion: STARDdem is an implementation of the STARD statement in which the original checklist is elaborated and supplemented with guidance pertinent to studies of cognitive disorders. Its adoption is expected to increase transparency, enable more effective evaluation of diagnostic tests in Alzheimer disease and dementia, contribute to greater adherence to methodologic standards, and advance the development of Alzheimer biomarkers. PMID:24944261

  14. When does haste make waste? Speed-accuracy tradeoff, skill level, and the tools of the trade.

    PubMed

    Beilock, Sian L; Bertenthal, Bennett I; Hoerger, Michael; Carr, Thomas H

    2008-12-01

    Novice and skilled golfers took a series of golf putts with a standard putter (Exp. 1) or a distorted "funny putter" (consisting of an s-shaped and arbitrarily weighted putter shaft; Exp. 2) under instructions to either (a) take as much time as needed to be accurate or (b) putt as fast as possible while still being accurate. Planning and movement time were measured for each putt. In both experiments, novices produced the typical speed-accuracy trade-off: going slower, in terms of both the planning and movement components of execution, improved performance. In contrast, skilled golfers benefited from reduced performance time when using the standard putter in Exp. 1; specifically, taking less time to plan improved performance. In Exp. 2, skilled golfers improved by going slower when using the funny putter, but only when it was unfamiliar. Thus, skilled performance benefits from speed instructions when wielding highly familiar tools (i.e., the standard putter), is harmed by them when using new tools (i.e., the funny putter), and benefits again from speed instructions as the new tool becomes familiar. Planning time absorbs these changes. PMID:19102617

  15. A hyperspectral imager for high radiometric accuracy Earth climate studies

    NASA Astrophysics Data System (ADS)

    Espejo, Joey; Drake, Ginger; Heuerman, Karl; Kopp, Greg; Lieber, Alex; Smith, Paul; Vermeer, Bill

    2011-10-01

    We demonstrate a visible and near-infrared prototype pushbroom hyperspectral imager for Earth climate studies that is capable of using direct solar viewing for on-orbit cross-calibration and degradation tracking. Direct calibration to solar spectral irradiances allows the Earth-viewing instrument to achieve the required climate-driven absolute radiometric accuracies of <0.2% (1σ). A solar calibration requires viewing scenes having radiances ~10^5 times higher than typical Earth scenes. To facilitate this calibration, the instrument features an attenuation system that uses an optimized combination of different precision aperture sizes, neutral-density filters, and variable integration timing for Earth and solar viewing. The optical system consists of a three-mirror anastigmat telescope and an Offner spectrometer. The as-built system has a 12.2° cross-track field of view with 3 arcmin spatial resolution and covers a 350-1050 nm spectral range with 10 nm resolution. A polarization-compensated configuration using the Offner in an out-of-plane alignment is demonstrated as a viable approach to minimizing polarization sensitivity. The mechanical design takes advantage of relaxed tolerances in the optical design by using rigid, non-adjustable diamond-turned tabs for optical mount locating surfaces. We show that this approach achieves the required optical performance. A prototype spaceflight unit is also demonstrated to prove the applicability of these solar cross-calibration methods to on-orbit environments. This unit is evaluated for optical performance prior to and after GEVS shake, thermal vacuum, and lifecycle tests.
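The attenuation system described above must bridge the roughly 10^5 radiance ratio between solar and Earth scenes. A rough budget can be sketched by multiplying the three attenuation factors; all values below are illustrative assumptions, not the instrument's actual configuration:

```python
def total_attenuation(aperture_ratio, nd_od, integration_ratio):
    """Combined signal attenuation for the solar view relative to the Earth view.

    aperture_ratio:    solar/Earth aperture diameter ratio (signal scales with area)
    nd_od:             neutral-density filter optical density (transmission = 10**-OD)
    integration_ratio: solar/Earth integration time ratio
    """
    return (aperture_ratio ** 2) * (10.0 ** -nd_od) * integration_ratio

# Illustrative values: a 10x smaller aperture, an OD-2 filter, and a 10x
# shorter integration time together give ~1e-5, matching the ~10^5 ratio.
att = total_attenuation(aperture_ratio=0.1, nd_od=2.0, integration_ratio=0.1)
print(att)
```

In practice the instrument would mix and match these knobs per spectral band; the point of the sketch is only that their product, not any single element, sets the solar/Earth dynamic range.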

  16. Template for Systems Engineering Tools Trade Study

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle D.

    2005-01-01

    A discussion of Systems Engineering tools brings out numerous preferences and reactions regarding tools of choice as well as the functions those tools are to perform. A recent study of Systems Engineering tools for a new Program illustrated the need for a generic template that new Programs or Projects can use to determine the toolset appropriate for their work. This paper will provide the guidelines new initiatives can follow and tailor to their specific needs, to enable them to make their choice of tools in an efficient and informed manner. Clearly, those who perform purely technical functions will need different tools than those who perform purely systems engineering functions. And everyone has tools they are comfortable with; that degree of comfort is frequently the deciding factor in tool choice rather than an objective study of all criteria and weighting factors. This paper strives to produce a comprehensive list of selection criteria with suggestions for weighting factors based on a number of assumptions regarding the given Program or Project. In addition, any given Program will begin with assumptions for its toolset based on Program size, tool cost, user base, and technical needs. In providing a template for tool selection, this paper will guide the reader through assumptions based on Program need; decision criteria; potential weighting factors; the need for a compilation of available tools; the importance of tool demonstrations; and finally a down-selection of tools. While specific vendors cannot be mentioned in this work, it is expected that this template could serve other Programs in the formulation phase by relieving the trade study process of some of its subjectivity.
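The down-selection step described here amounts to a weighted scoring matrix: criteria weights reflect Program assumptions, each candidate tool is scored per criterion, and the weighted totals rank the toolset. A minimal sketch with hypothetical criteria, weights, and scores (not the paper's actual template):

```python
# Hypothetical criteria weights summing to 1.0
weights = {"cost": 0.25, "user_base": 0.20, "technical_fit": 0.40, "training": 0.15}

# Hypothetical 1-5 raw scores per criterion, per candidate tool
scores = {
    "Tool A": {"cost": 4, "user_base": 3, "technical_fit": 5, "training": 2},
    "Tool B": {"cost": 5, "user_base": 4, "technical_fit": 3, "training": 4},
}

def weighted_total(tool_scores, weights):
    """Sum of weight * score over all criteria."""
    return sum(weights[c] * tool_scores[c] for c in weights)

ranked = sorted(scores, key=lambda t: weighted_total(scores[t], weights), reverse=True)
for tool in ranked:
    print(tool, round(weighted_total(scores[tool], weights), 2))
```

Note how the weighting drives the outcome: Tool B scores higher on most criteria, yet Tool A wins here because technical fit carries the largest weight, which is exactly the subjectivity the template aims to make explicit.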

  17. Accuracy of Nurse-Performed Lung Ultrasound in Patients With Acute Dyspnea: A Prospective Observational Study.

    PubMed

    Mumoli, Nicola; Vitale, Josè; Giorgi-Pierfranceschi, Matteo; Cresci, Alessandra; Cei, Marco; Basile, Valentina; Brondi, Barbara; Russo, Elisa; Giuntini, Lucia; Masi, Lorenzo; Cocciolo, Massimo; Dentali, Francesco

    2016-03-01

    In clinical practice, lung ultrasound (LUS) is becoming an easy and reliable noninvasive tool for the evaluation of dyspnea. The aim of this study was to assess the accuracy of nurse-performed LUS, in particular in the diagnosis of acute cardiogenic pulmonary congestion. We prospectively evaluated all consecutive patients admitted for dyspnea to our Medicine Department between April and July 2014. At admission, serum brain natriuretic peptide (BNP) levels were measured and LUS was performed by trained nurses blinded to clinical and laboratory data. The accuracy of nurse-performed LUS, alone and combined with BNP, for the diagnosis of acute cardiogenic dyspnea was calculated. Two hundred twenty-six patients (41.6% men, mean age 78.7 ± 12.7 years) were included in the study. Nurse-performed LUS alone had a sensitivity of 95.3% (95% CI: 92.6-98.1%), a specificity of 88.2% (95% CI: 84.0-92.4%), a positive predictive value of 87.9% (95% CI: 83.7-92.2%), and a negative predictive value of 95.5% (95% CI: 92.7-98.2%). The combination of nurse-performed LUS with the BNP level (cut-off 400 pg/mL) resulted in a higher sensitivity (98.9%, 95% CI: 97.4-100%), negative predictive value (98.8%, 95% CI: 97.2-100%), and corresponding negative likelihood ratio (0.01, 95% CI: 0.0-0.07). Nurse-performed LUS had good accuracy in the diagnosis of acute cardiogenic dyspnea. Use of this technique in combination with BNP seems useful for ruling out cardiogenic dyspnea. Other studies are warranted to confirm our preliminary findings and to establish the role of this tool in other settings. PMID:26945396
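The four accuracy measures reported here all derive from a 2x2 table of test result versus final diagnosis. A minimal sketch, using illustrative counts chosen to be consistent with the reported percentages (the study's actual table is not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),   # positives correctly detected
        "specificity": tn / (tn + fp),   # negatives correctly excluded
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative counts (total 226, matching the cohort size)
m = diagnostic_metrics(tp=102, fp=14, fn=5, tn=105)
for name, value in m.items():
    print(f"{name}: {value:.1%}")
```

With these counts the function reproduces the abstract's point estimates (95.3%, 88.2%, 87.9%, 95.5%), though the study's true cell counts may differ.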

  18. Accuracy of optical dental digitizers: an in vitro study.

    PubMed

    Vandeweghe, Stefan; Vervack, Valentin; Vanhove, Christian; Dierens, Melissa; Jimbo, Ryo; De Bruyn, Hugo

    2015-01-01

    The aim of this study was to evaluate the accuracy, in terms of trueness and precision, of optical dental scanners. An experimental acrylic resin cast was created and digitized using a microcomputed tomography (microCT) scanner, which served as the reference model. Five polyether impressions were made of the acrylic resin cast to create five stone casts. Each dental digitizer (Imetric, Lava ST, Smart Optics, KaVo Everest) made five scans of the acrylic resin cast and one scan of every stone cast. The scans were superimposed and compared using metrology software. Deviations were calculated between the datasets obtained from the dental digitizers and the microCT scanner (= trueness) and between datasets from the same dental digitizer (= precision). With the exception of the Smart Optics scanner, there were no significant differences in trueness for the acrylic resin cast. For the stone casts, however, the Lava ST performed better than Imetric, which did better than the KaVo scanner. The Smart Optics scanner demonstrated the highest deviation. All digitizers demonstrated a significantly higher trueness for the acrylic resin cast compared to the stone casts, except the Lava ST. The Lava ST was significantly more precise compared to the other scanners. Imetric and Smart Optics also demonstrated a higher level of precision compared to the KaVo scanner. All digitizers demonstrated some degree of error. Stone cast copies are less accurate because of difficulties with scanning the rougher surface or dimensional deformations caused during the production process. For complex, large-span reconstructions, a highly accurate scanner should be selected. PMID:25734714

  19. Accuracy Study of a 2-Component Point Doppler Velocimeter (PDV)

    NASA Technical Reports Server (NTRS)

    Kuhlman, John; Naylor, Steve; James, Kelly; Ramanath, Senthil

    1997-01-01

    A two-component Point Doppler Velocimeter (PDV), which has recently been developed, is described, and a series of velocity measurements obtained to quantify the accuracy of the PDV system is summarized. This PDV system uses molecular iodine vapor cells as frequency-discriminating filters to determine the Doppler shift of laser light scattered off seed particles in a flow. The majority of results obtained to date are for the mean velocity of a rotating wheel, although preliminary data are described for fully developed turbulent pipe flow. Accuracy of the present wheel velocity data is approximately +/- 1% of full scale, while linearity of a single channel is on the order of +/- 0.5% (i.e., +/- 0.6 m/sec and +/- 0.3 m/sec, out of 57 m/sec, respectively). The observed linearity of these results is on the order of the accuracy to which the speed of the rotating wheel has been set for individual data readings. The absolute accuracy of the rotating wheel data is shown to be consistent with the level of repeatability of the cell calibrations. The preliminary turbulent pipe flow data show consistent turbulence intensity values, and mean axial velocity profiles generally agree with pitot probe data. However, there is at present an offset error in the radial velocity which is on the order of 5-10% of the mean axial velocity.
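The measurement principle behind PDV is the Doppler shift of light scattered from seed particles: the shift is proportional to the velocity component along the difference of the observation and illumination unit vectors, and the iodine cell converts that frequency shift to an intensity change. A hedged sketch of the underlying relation; the wavelength and geometry below are assumptions for illustration, not the instrument's documented setup:

```python
import math

def doppler_shift(v, o_hat, i_hat, lam):
    """Frequency shift (Hz): df = v . (o_hat - i_hat) / lam,
    for velocity v (m/s), observation/illumination unit vectors, wavelength lam (m)."""
    return sum(vi * (o - i) for vi, o, i in zip(v, o_hat, i_hat)) / lam

lam = 514.5e-9                     # assumed green laser wavelength, m
i_hat = (0.0, 0.0, 1.0)            # illumination direction
o_hat = (math.sin(math.radians(30)), 0.0, math.cos(math.radians(30)))  # 30 deg off-axis

v = (57.0, 0.0, 0.0)               # rim speed from the abstract, m/s
print(f"{doppler_shift(v, o_hat, i_hat, lam) / 1e6:.1f} MHz")
```

Each (o_hat, i_hat) pair senses one velocity component, which is why two detection channels are needed for a two-component measurement.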

  20. High-accuracy diagnostic tool for electron cloud observation in the LHC based on synchronous phase measurements

    NASA Astrophysics Data System (ADS)

    Esteban Müller, J. F.; Baudrenghien, P.; Mastoridis, T.; Shaposhnikova, E.; Valuch, D.

    2015-11-01

    Electron cloud effects, which include heat load in the cryogenic system, pressure rise, and beam instabilities, are among the main intensity limitations for LHC operation with 25 ns spaced bunches. A new observation tool was proposed and developed to monitor the e-cloud activity; it was already used successfully during LHC Run 1 (2010-2012) and is being used intensively in operation during the start of LHC Run 2 (2015-2018). It is based on the fact that the power loss of each bunch due to the e-cloud can be estimated from bunch-by-bunch measurements of the synchronous phase. The measurements were made using the existing beam phase module of the low-level RF control system. To achieve the very high accuracy required, corrections for reflections in the cables and for systematic errors need to be applied, followed by post-processing of the measurements. Results clearly show the e-cloud buildup along the bunch trains and its time evolution during each LHC fill as well as from fill to fill. Measurements during the 2012 LHC scrubbing run reveal a progressive reduction in the e-cloud activity and therefore a decrease in the secondary electron yield. The total beam power loss can be computed as a sum of the contributions from all bunches and compared with the heat load deposited in the cryogenic system.
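The power-loss estimate underlying this diagnostic can be sketched with a simplified small-angle relation: a bunch losing extra energy to the e-cloud settles at a slightly shifted synchronous phase, and the extra power it draws from the RF is roughly e * N_b * f_rev * V_rf * cos(phi_s0) * dphi. This formula and the round numbers below are illustrative assumptions, not the paper's exact post-processed expression:

```python
import math

E_CHARGE = 1.602e-19    # elementary charge, C

def ecloud_power_loss(n_b, f_rev, v_rf, phi_s0, dphi):
    """Extra power loss (W) per bunch for a small synchronous-phase shift dphi (rad)."""
    return E_CHARGE * n_b * f_rev * v_rf * math.cos(phi_s0) * dphi

p = ecloud_power_loss(
    n_b=1.2e11,          # protons per bunch (typical order)
    f_rev=11245.0,       # LHC revolution frequency, Hz
    v_rf=12e6,           # total RF voltage, V (typical order)
    phi_s0=0.0,          # reference synchronous phase taken near zero
    dphi=1e-3,           # ~0.06 degree measured phase shift
)
print(f"{p:.2f} W per bunch")
```

The sketch shows why sub-millidegree phase accuracy matters: a milliradian-level shift corresponds to only a few watts per bunch, which is the scale of signal the cable-reflection and systematic-error corrections must preserve.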

  1. Study on machining mechanism of nanotwinned CBN cutting tool

    NASA Astrophysics Data System (ADS)

    Chen, Junyun; Jin, Tianye; Wang, Jinhu; Zhao, Qingliang; Lu, Ling

    2014-08-01

    The recently developed nanotwinned cubic boron nitride (nt-CBN), with an isotropic nano-sized microstructure, possesses extremely high hardness (~100 GPa Hv), very large fracture toughness (>12 MPa·m^(1/2)), and excellent high-temperature stability. nt-CBN is therefore a promising tool material for realizing ultra-precision cutting of hardened steel, which is widely used in mold inserts for optical and opto-electrical mass products. In view of how difficult it is to machine, its machining mechanism is studied in this paper. Three feasible methods, mechanical lapping, laser machining, and ion beam sputtering, are applied to process nt-CBN. The results indicate that among the three methods, mechanical lapping not only achieves the highest machining accuracy, because material is removed entirely in ductile mode, but also has a satisfactorily high material removal rate. Mechanical lapping is thus appropriate for finish machining of nt-CBN cutting tools. Laser machining can only be used for contour machining or rough machining of the cutting tool because of its worse machined surface quality. With regard to ion beam sputtering, the material removal rate is too low in spite of its high machining accuracy. Additionally, no phase transition was found in any machining process of nt-CBN.

  2. Pose estimation with a Kinect for ergonomic studies: evaluation of the accuracy using a virtual mannequin.

    PubMed

    Plantard, Pierre; Auvinet, Edouard; Le Pierres, Anne-Sophie; Multon, Franck

    2015-01-01

    Analyzing human poses with a Kinect is a promising method to evaluate potential risks of musculoskeletal disorders at workstations. In ecological situations, complex 3D poses and constraints imposed by the environment make it difficult to obtain reliable kinematic information. Thus, being able to predict the potential accuracy of the measurement for such complex 3D poses and sensor placements is challenging in classical experimental setups. To tackle this problem, we propose a new evaluation method based on a virtual mannequin. In this study, we apply this method to the evaluation of joint positions (shoulder, elbow, and wrist), joint angles (shoulder and elbow), and the corresponding RULA (a popular ergonomics assessment grid) upper-limb score for a large set of poses and sensor placements. Thanks to this evaluation method, more than 500,000 configurations have been automatically tested, which would be almost impossible to evaluate with classical protocols. The results show that the kinematic information obtained by the Kinect software is generally accurate enough to fill in ergonomic assessment grids. However, inaccuracy strongly increases for some specific poses and sensor positions. Using this evaluation method enabled us to report configurations that could lead to these high inaccuracies. As supplementary material, we provide a software tool to help designers evaluate the expected accuracy of this sensor for a set of upper-limb configurations. Results obtained with the virtual mannequin are in accordance with those obtained from a real subject for a limited set of poses and sensor placements. PMID:25599426
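One of the quantities this evaluation covers, a joint angle recovered from three tracked joint positions, reduces to the angle between two 3D segment vectors. A minimal sketch with hypothetical Kinect-style coordinates (meters), not data from the study:

```python
import math

def angle_at(b, a, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c."""
    u = tuple(ai - bi for ai, bi in zip(a, b))
    v = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Hypothetical tracked joint positions: upper arm hanging, forearm forward
shoulder = (0.0, 1.4, 2.0)
elbow    = (0.0, 1.1, 2.0)
wrist    = (0.25, 1.1, 2.0)

print(round(angle_at(elbow, shoulder, wrist), 1))  # 90.0 for this pose
```

Angles like this one feed directly into RULA-style scoring bands, which is why position errors at individual joints propagate into the ergonomic assessment the paper quantifies.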

  3. Pose Estimation with a Kinect for Ergonomic Studies: Evaluation of the Accuracy Using a Virtual Mannequin

    PubMed Central

    Plantard, Pierre; Auvinet, Edouard; Le Pierres, Anne-Sophie; Multon, Franck

    2015-01-01

    Analyzing human poses with a Kinect is a promising method to evaluate potential risks of musculoskeletal disorders at workstations. In ecological situations, complex 3D poses and constraints imposed by the environment make it difficult to obtain reliable kinematic information. Thus, being able to predict the potential accuracy of the measurement for such complex 3D poses and sensor placements is challenging in classical experimental setups. To tackle this problem, we propose a new evaluation method based on a virtual mannequin. In this study, we apply this method to the evaluation of joint positions (shoulder, elbow, and wrist), joint angles (shoulder and elbow), and the corresponding RULA (a popular ergonomics assessment grid) upper-limb score for a large set of poses and sensor placements. Thanks to this evaluation method, more than 500,000 configurations have been automatically tested, which would be almost impossible to evaluate with classical protocols. The results show that the kinematic information obtained by the Kinect software is generally accurate enough to fill in ergonomic assessment grids. However, inaccuracy strongly increases for some specific poses and sensor positions. Using this evaluation method enabled us to report configurations that could lead to these high inaccuracies. As supplementary material, we provide a software tool to help designers evaluate the expected accuracy of this sensor for a set of upper-limb configurations. Results obtained with the virtual mannequin are in accordance with those obtained from a real subject for a limited set of poses and sensor placements. PMID:25599426

  4. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    ERIC Educational Resources Information Center

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  5. High accuracy NMR chemical shift corrected for bulk magnetization as a tool for structural elucidation of dilutable microemulsions. Part 1 - Proof of concept.

    PubMed

    Hoffman, Roy E; Darmon, Eliezer; Aserin, Abraham; Garti, Nissim

    2016-02-01

    In microemulsions, changes in droplet size and shape and possible transformations occur under various conditions. They are difficult to characterize by most analytical tools because of their nano-sized structure and dynamic nature. Several methods are usually combined to obtain reliable information, guiding the scientist in understanding their physical behavior. We felt that there is a need for a technique that complements those in use today in order to provide more information on microemulsion behavior, mainly as a function of dilution with water. The improvement of NMR chemical shift measurements independent of bulk-magnetization effects makes it possible to study the very weak intermolecular chemical shift effects. In the present study, we used high-resolution magic-angle-spinning NMR to measure the chemical shift very accurately, free of bulk-magnetization effects. The chemical shift of microemulsion components is measured as a function of the water content in order to validate the method in an interesting and promising U-type dilutable microemulsion, which had been previously studied by a variety of techniques. Phase-transition points of the microemulsion (O/W, bicontinuous, W/O) and changes in droplet shape were successfully detected using high-accuracy chemical shift measurements. We analyzed the results and found them to be compatible with the previous studies, paving the way for high-accuracy chemical shifts to be used for the study of other microemulsion systems. We detected two transition points along the water dilution line of the concentrate (reverse micelles), corresponding to the transition from swollen W/O nano-droplets to bicontinuous to O/W droplets, along with the changes in the droplets' sizes and shapes. The method seems to be in excellent agreement with other previously studied techniques and shows the advantage of this easy and valid approach.

  6. STARD 2015: An Updated List of Essential Items for Reporting Diagnostic Accuracy Studies.

    PubMed

    Bossuyt, Patrick M; Reitsma, Johannes B; Bruns, David E; Gatsonis, Constantine A; Glasziou, Paul P; Irwig, Les; Lijmer, Jeroen G; Moher, David; Rennie, Drummond; de Vet, Henrica C W; Kressel, Herbert Y; Rifai, Nader; Golub, Robert M; Altman, Douglas G; Hooft, Lotty; Korevaar, Daniël A; Cohen, Jérémie F

    2015-12-01

    Incomplete reporting has been identified as a major source of avoidable waste in biomedical research. Essential information is often not provided in study reports, impeding the identification, critical appraisal, and replication of studies. To improve the quality of reporting of diagnostic accuracy studies, the Standards for Reporting of Diagnostic Accuracy Studies (STARD) statement was developed. Here we present STARD 2015, an updated list of 30 essential items that should be included in every report of a diagnostic accuracy study. This update incorporates recent evidence about sources of bias and variability in diagnostic accuracy and is intended to facilitate the use of STARD. As such, STARD 2015 may help to improve completeness and transparency in reporting of diagnostic accuracy studies.
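The measures a STARD-compliant report must let readers reconstruct are simple functions of the 2x2 table of index-test results against the reference standard. A minimal sketch with made-up counts (not data from any study listed here):

```python
# Core diagnostic accuracy measures from a 2x2 table (illustrative values only).
def diagnostic_accuracy(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV for a 2x2 table."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical study: 80 diseased and 120 healthy participants.
sens, spec, ppv, npv = diagnostic_accuracy(tp=60, fp=12, fn=20, tn=108)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.3f} npv={npv:.3f}")
```

Reporting all four cells, as STARD requires, lets readers recompute any of these measures themselves.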

  9. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for earthquake research within Google Maps. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL), together with the new tools, using a database of earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data on Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments, multiple horizontal and vertical straight lines, and multiple circles on the map, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. It also offers regional segmentation (NxN), which allows the study of earthquake clustering and of cluster shifts between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility supports statistical plots, such as cumulative earthquake magnitude plots and magnitude histograms, and the calculation of the 'b' value. What is novel about the platform is its additional deterministic tools. Using the newly developed horizontal-line, vertical-line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow the calculation of statistics as well as deterministic precursors. We plan to present many new results based on our newly developed platform.
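One statistic such a platform computes is the Gutenberg-Richter 'b' value. A minimal sketch using Aki's maximum-likelihood estimator on a synthetic catalogue (the platform's actual implementation may differ):

```python
import math

def b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value estimate for events with M >= m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic catalogue of magnitudes above a completeness threshold Mc = 2.0.
catalog = [2.1, 2.3, 2.0, 2.8, 3.4, 2.2, 2.6, 2.0, 4.1, 2.4]
print(f"b = {b_value(catalog, 2.0):.2f}")
```

Real applications also need a careful choice of the completeness magnitude, which strongly affects the estimate.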

  10. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. Because analytical solutions are generally very difficult to obtain, dedicated software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. TOMLAB and DYNOPT are typical examples of such tools that can be applied effectively to dynamic programming problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory, given a description of the process and a cost to be minimized, subject to equality and inequality constraints, using the method of orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the dynamic model being optimized may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT for dynamic optimization problems by means of case studies of selected physical laboratory models used in education.

  11. Effects of Varying Feedback Accuracy on Task Acquisition: A Computerized Translational Study

    ERIC Educational Resources Information Center

    Hirst, Jason M.; DiGennaro Reed, Florence D.; Reed, Derek D.

    2013-01-01

    Research has shown that the accuracy of instructions influences responding immediately and under later conditions. The purpose of the present study was to extend this literature and use a translational approach to assess the short- and long-term effects of feedback accuracy on the acquisition of a task. Three levels of inaccurate feedback were…

  12. Dynamic Development of Complexity and Accuracy: A Case Study in Second Language Academic Writing

    ERIC Educational Resources Information Center

    Rosmawati

    2014-01-01

    This paper reports on the development of complexity and accuracy in English as a Second Language (ESL) academic writing. Although research into complexity and accuracy development in second language (L2) writing has been well established, few studies have assumed the multidimensionality of these two constructs (Norris & Ortega, 2009) or…

  13. Accuracy of clinical diagnosis in parkinsonism--a prospective study.

    PubMed

    Rajput, A H; Rozdilsky, B; Rajput, A

    1991-08-01

    Clinical diagnosis of Parkinson's syndrome (PS) is reasonably easy in most cases, but the distinction between different variants of PS may be difficult early in the disease. A correct diagnosis is important not only for counselling and management of patients but also for conducting pharmacological and epidemiological studies. There is very little critical literature on the pathological verification of the clinical diagnosis of PS. We report our 22 years' experience to address that issue. Between 1968 and 1990, 65 PS patients came to autopsy. Complete data are available in 59 (M-50, F-19) cases. The initial diagnosis, made by a qualified neurologist, was idiopathic Parkinson's disease (IPD) in 43 cases, of which 28 (65%) had Lewy body pathology. After a mean duration of 12 years, the final diagnosis was IPD in 41 cases and was confirmed in 31 (76%). IPD could not be clinically distinguished from cases with severe substantia nigra neuronal loss without inclusions, nor from those with neurofibrillary tangle inclusions and neuronal loss at the anatomical sites typically involved in IPD. All progressive supranuclear palsy, olivopontocerebellar atrophy and Creutzfeldt-Jakob disease cases, and the majority of the multiple system atrophy cases, were diagnosed correctly during life. A correct clinical diagnosis of most non-IPD variants of PS was possible within 5 years of onset (range: 2 months to 18 years). We recommend that studies aimed at including only IPD cases restrict enrollment to patients who have had PS motor manifestations for five years or longer. PMID:1913360

  14. The predictive accuracy of PREDICT: a personalized decision-making tool for Southeast Asian women with breast cancer.

    PubMed

    Wong, Hoong-Seam; Subramaniam, Shridevi; Alias, Zarifah; Taib, Nur Aishah; Ho, Gwo-Fuang; Ng, Char-Hong; Yip, Cheng-Har; Verkooijen, Helena M; Hartman, Mikael; Bhoo-Pathy, Nirmala

    2015-02-01

    Web-based prognostication tools may provide a simple and economically feasible option to aid prognostication and selection of chemotherapy in early breast cancers. We validated PREDICT, a free online breast cancer prognostication and treatment benefit tool, in a resource-limited setting. All 1480 patients who underwent complete surgical treatment for stages I to III breast cancer from 1998 to 2006 were identified from the prospective breast cancer registry of University Malaya Medical Centre, Kuala Lumpur, Malaysia. Calibration was evaluated by comparing the model-predicted overall survival (OS) with patients' actual OS. Model discrimination was tested using receiver-operating characteristic (ROC) analysis. Median age at diagnosis was 50 years. The median tumor size at presentation was 3 cm and 54% of patients had lymph node-negative disease. About 55% of women had estrogen receptor-positive breast cancer. Overall, the model-predicted 5 and 10-year OS was 86.3% and 77.5%, respectively, whereas the observed 5 and 10-year OS was 87.6% (difference: -1.3%) and 74.2% (difference: 3.3%), respectively; P values for goodness-of-fit test were 0.18 and 0.12, respectively. The program was accurate in most subgroups of patients, but significantly overestimated survival in patients aged <40 years, and in those receiving neoadjuvant chemotherapy. PREDICT performed well in terms of discrimination; areas under ROC curve were 0.78 (95% confidence interval [CI]: 0.74-0.81) and 0.73 (95% CI: 0.68-0.78) for 5 and 10-year OS, respectively. Based on its accurate performance in this study, PREDICT may be clinically useful in prognosticating women with breast cancer and personalizing breast cancer treatment in resource-limited settings. PMID:25715267
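Discrimination of the kind reported here can be summarized by the c-statistic (area under the ROC curve): the probability that a randomly chosen patient who died received a higher predicted risk than one who survived. A small illustration with invented risks and outcomes, not PREDICT data:

```python
def auc(labels, scores):
    """Concordance (c-statistic): probability that a random event case gets a
    higher risk score than a random non-event case; ties count half.
    Equivalent to the area under the ROC curve."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted 10-year mortality risks and observed deaths (1 = died).
deaths = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
risk   = [0.8, 0.6, 0.4, 0.3, 0.7, 0.5, 0.2, 0.4, 0.1, 0.3]
print(f"AUC = {auc(deaths, risk):.2f}")
```

Values of 0.73-0.78, as reported for PREDICT, indicate moderately good discrimination on this scale (0.5 = chance, 1.0 = perfect).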

  15. Twin Studies: A Unique Epidemiological Tool

    PubMed Central

    Sahu, Monalisha; Prasuna, Josyula G

    2016-01-01

    Twin studies are a special type of epidemiological study designed to measure the contribution of genetics, as opposed to the environment, to a given trait. Despite the fact that classical twin studies are still guided by assumptions made back in the 1920s, and that an inherent limitation lies in the study design itself, the results suggested by earlier twin studies have often been confirmed later by molecular genetic studies. Use of twin registries and various innovative yet complex software packages such as SAS and its extensions (e.g., SAS PROC GENMOD and SAS PROC PHREG) has increased the potential of this epidemiological tool to contribute significantly to the field of genetics and other life sciences. PMID:27385869

  16. Evaluation of accuracy of cone beam computed tomography for measurement of periodontal defects: A clinical study

    PubMed Central

    Banodkar, Akshaya Bhupesh; Gaikwad, Rajesh Prabhakar; Gunjikar, Tanay Udayrao; Lobo, Tanya Arthur

    2015-01-01

    Aims: The aim of the present study was to evaluate the accuracy of cone beam computed tomography (CBCT) measurements of alveolar bone defects caused by periodontal disease, by comparing them with actual surgical measurements, the gold standard. Materials and Methods: One hundred periodontal bone defects in fifteen patients suffering from periodontitis and scheduled for flap surgery were included in the study. On the day of surgery, prior to anesthesia, a CBCT scan of the quadrant to be operated on was taken. After reflection of the flap, clinical measurements of the periodontal defects were made using a reamer and a digital vernier caliper. The measurements taken during surgery were then compared with the CBCT measurements and subjected to statistical analysis using Pearson's correlation test. Results: Overall, there was a very high correlation of 0.988 between the surgical and CBCT measurements. By defect type, the correlation was higher for horizontal defects than for vertical defects. Conclusions: CBCT is highly accurate in the measurement of periodontal defects and proves to be a very useful tool in periodontal diagnosis and treatment assessment. PMID:26229268
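The agreement statistic used here is Pearson's correlation coefficient between the two sets of paired measurements. A short sketch with hypothetical defect depths (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical defect depths (mm): direct surgical vs CBCT measurement.
surgical = [4.1, 5.3, 3.2, 6.8, 2.9, 5.0]
cbct     = [4.0, 5.5, 3.1, 6.6, 3.0, 5.2]
print(f"r = {pearson_r(surgical, cbct):.3f}")
```

Note that a high r shows the two methods track each other, but a Bland-Altman analysis would be needed to quantify any systematic offset between them.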

  17. Study on the accuracy factors of large-scale photogrammetry system

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Dong, Mingli; Lu, Naiguang

    2011-05-01

    In a photogrammetry system, the Base-Distance Ratio, the Image Scale, and the Image Standard Error, which together determine the network strength of the system, are the main accuracy factors. In this paper, the normal and convergent network configurations of photogrammetry are studied, and the Network Strength, which represents the strength and accuracy of the camera station network, is expressed in terms of the accuracy factors mentioned above. To verify this expression, a large-scale 3D reference field is designed and used to test the effects of these accuracy factors. The experimental results show that the relationship between accuracy and the factors is consistent with the expression. These conclusions will guide photogrammetric work in reducing system errors.
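The dependence of object-space precision on these factors is often summarized by a textbook rule of thumb, sigma_c = q * S * sigma_img / sqrt(k). This is an assumed illustration of how the factors combine, not necessarily the expression derived in the paper:

```python
import math

# Rule-of-thumb precision estimate for a photogrammetric network
# (a common textbook relation; an assumption here, not the authors' formula):
#   sigma_c ~= q * S * sigma_img / sqrt(k)
# q         : design factor for network geometry (convergent nets ~0.5-1.0)
# scale     : image scale number S (object distance / principal distance)
# sigma_img : image measurement precision, in micrometres
# k         : mean number of images (rays) per object point
def object_precision_um(q, scale, sigma_img_um, k):
    return q * scale * sigma_img_um / math.sqrt(k)

# Hypothetical convergent network: 1:100 scale, 0.5 um image precision, 8 rays.
print(f"sigma_c ~ {object_precision_um(0.7, 100, 0.5, 8):.1f} um")
```

The relation makes the trade-offs visible: halving the image standard error or quadrupling the number of rays each doubles the expected object-space precision.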

  19. Podcasts as tools in introductory environmental studies.

    PubMed

    Vatovec, Christine; Balser, Teri

    2009-01-01

    Technological tools have increasingly become a part of the college classroom, often appealing to teachers because of their potential to increase student engagement with course materials. Podcasts in particular have gained popularity as tools to better inform students by providing access to lectures outside of the classroom. In this paper, we argue that educators should expand course materials to include prepublished podcasts to engage students with both course topics and a broader skill set for evaluating readily available media. We present a pre- and postassignment survey evaluation assessing student preferences for using podcasts and the ability of a podcast assignment to support learning objectives in an introductory environmental studies course. Overall, students reported that the podcasts were useful tools for learning, easy to use, and increased their understanding of course topics. However, students also provided insightful comments on visual versus aural learning styles, leading us to recommend assigning video podcasts or providing text-based transcripts along with audio podcasts. A qualitative analysis of survey data provides evidence that the podcast assignment supported the course learning objective for students to demonstrate critical evaluation of media messages. Finally, we provide recommendations for selecting published podcasts and designing podcast assignments.

  20. Analysis of Accuracy in Pointing with Redundant Hand-held Tools: A Geometric Approach to the Uncontrolled Manifold Method

    PubMed Central

    Campolo, Domenico; Widjaja, Ferdinan; Xu, Hong; Ang, Wei Tech; Burdet, Etienne

    2013-01-01

    This work introduces a coordinate-independent method to analyse movement variability of tasks performed with hand-held tools, such as a pen or a surgical scalpel. We extend the classical uncontrolled manifold (UCM) approach by exploiting the geometry of rigid body motions, used to describe tool configurations. In particular, we analyse variability during a static pointing task with a hand-held tool, where subjects are asked to keep the tool tip in steady contact with another object. In this case the tool is redundant with respect to the task, as subjects control position/orientation of the tool, i.e. 6 degrees-of-freedom (dof), to maintain the tool tip position (3dof) steady. To test the new method, subjects performed a pointing task with and without arm support. The additional dof introduced in the unsupported condition, injecting more variability into the system, represented a resource to minimise variability in the task space via coordinated motion. The results show that all of the seven subjects channeled more variability along directions not directly affecting the task (UCM), consistent with previous literature but now shown in a coordinate-independent way. Variability in the unsupported condition was only slightly larger at the endpoint but much larger in the UCM. PMID:23592956
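The core of a UCM analysis is splitting configuration-space variance into a component that changes the task variable and a component lying in the Jacobian's null space. A toy two-joint sketch (an assumed linear system, not the paper's rigid-body formulation):

```python
import math

# Toy UCM decomposition: two joint angles q1, q2 control one task variable
# x = q1 + q2, so the Jacobian is J = [1, 1] and its null space [1, -1]
# is the uncontrolled manifold (UCM).
def ucm_variances(samples):
    n = len(samples)
    m1 = sum(q1 for q1, _ in samples) / n
    m2 = sum(q2 for _, q2 in samples) / n
    t = (1 / math.sqrt(2), 1 / math.sqrt(2))    # unit task-relevant direction
    u = (1 / math.sqrt(2), -1 / math.sqrt(2))   # unit UCM (null-space) direction

    def var_along(d):
        return sum(((q1 - m1) * d[0] + (q2 - m2) * d[1]) ** 2
                   for q1, q2 in samples) / n

    return var_along(t), var_along(u)

# Trials that keep q1 + q2 nearly constant while trading q1 against q2.
trials = [(0.5, 0.52), (0.6, 0.41), (0.45, 0.57), (0.55, 0.46), (0.4, 0.62)]
v_task, v_ucm = ucm_variances(trials)
print(f"task-space var = {v_task:.5f}, UCM var = {v_ucm:.5f}")
```

Much larger variance along the UCM than along the task-relevant direction is the signature of coordinated control described in the abstract; the paper's contribution is doing this decomposition in a coordinate-independent way for rigid-body tool configurations.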

  2. Regression Modeling and Meta-Analysis of Diagnostic Accuracy of SNP-Based Pathogenicity Detection Tools for UGT1A1 Gene Mutation

    PubMed Central

    Rahim, Fakher; Galehdari, Hamid; Mohammadi-asl, Javad; Saki, Najmaldin

    2013-01-01

    Aims. This review summarized all available evidence on the accuracy of SNP-based pathogenicity detection tools and introduced a regression model based on functional scores, mutation score, and genomic variation degree. Materials and Methods. A comprehensive search was performed to find all mutations related to Crigler-Najjar syndrome. The pathogenicity prediction was done using SNP-based pathogenicity detection tools including SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs related to missense mutations in the UGT1A1 gene were reviewed. Results. Comparing the diagnostic ORs, our model showed high detection potential (diagnostic OR: 16.71, 95% CI: 3.38–82.69). The highest MCC and ACC belonged to our suggested model (46.8% and 73.3%), followed by SIFT (34.19% and 62.71%). The AUC analysis showed a significantly better overall performance of our suggested model compared with the selected SNP-based pathogenicity detection tools (P = 0.046). Conclusion. Our suggested model is comparable to the well-established SNP-based pathogenicity detection tools and can appropriately reflect the role of a disease-associated SNP in both local and global structures. Although the accuracy of our suggested model is not especially high, the functional impact of the pathogenic mutations is highlighted at the protein level, which improves the understanding of the molecular basis of mutation pathogenesis. PMID:23997956
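The reported figures (MCC, ACC, diagnostic OR) all derive from a confusion matrix of predicted versus known pathogenicity. A sketch with hypothetical counts for 59 variants (not the paper's actual tallies):

```python
import math

def mcc_acc_dor(tp, fp, fn, tn):
    """Matthews correlation coefficient, accuracy and diagnostic odds ratio."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom
    dor = (tp * tn) / (fp * fn)   # odds of a positive test in cases vs controls
    return mcc, acc, dor

# Hypothetical calls on 59 variants (pathogenic vs benign).
mcc, acc, dor = mcc_acc_dor(tp=30, fp=9, fn=7, tn=13)
print(f"MCC={mcc:.3f} ACC={acc:.3f} DOR={dor:.1f}")
```

MCC is often preferred over raw accuracy here because pathogenic and benign variants are rarely balanced in such datasets.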

  3. A Study on the Tool Sharing Policy in FMS with Tool Center

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoming; Fujii, Susumu; Kaihara, Toshiya

    In this paper we examine the issue of tool management in a Flexible Manufacturing System (FMS). A tool center storing all the tools for jobs is introduced, and all tools can be shared by the machining centers through tool exchange between the machining centers and the tool center. Two tool-change methods, “Augmented Magazine Method (AMM)” and “Magazine Carrier Method (MCM)”, are proposed and evaluated using the tool- and job-allocation methods proposed in this study. A job in this study is defined as a sequence of processing operations and is thus processed by a single machining center. We propose a two-stage method to integrate tool planning with the allocation of tools and jobs. The second-stage method for allocating tools and jobs is based on an auction method incorporating a dispatching rule. For tool planning with AMM in the first stage, we propose a greedy search method to decide the total number of tools. By simulation, we investigate the characteristics and effectiveness of the proposed two-stage method and the tool-change methods, together with the effectiveness of several dispatching rules.

  4. Short-Term Forecasting of Loads and Wind Power for Latvian Power System: Accuracy and Capacity of the Developed Tools

    NASA Astrophysics Data System (ADS)

    Radziukynas, V.; Klementavičius, A.

    2016-04-01

    The paper analyses the performance of the recently developed short-term forecasting suite for the Latvian power system. The system load and wind power are forecasted using ANN and ARIMA models, respectively, and the forecasting accuracy is evaluated in terms of errors, mean absolute errors and mean absolute percentage errors. The influence of additional input variables on load forecasting errors is also investigated. The interplay of hourly load and wind power forecasting errors is evaluated for the Latvian power system with historical loads (the year 2011) and planned wind power capacities (the year 2023).
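The error metrics named here (MAE and MAPE) can be computed as follows; the load values are invented for illustration, not taken from the Latvian system:

```python
def mae_mape(actual, forecast):
    """Mean absolute error and mean absolute percentage error."""
    errors = [abs(a - f) for a, f in zip(actual, forecast)]
    mae = sum(errors) / len(errors)
    mape = 100 * sum(e / abs(a) for e, a in zip(errors, actual)) / len(errors)
    return mae, mape

# Hypothetical hourly system load (MW): observed vs ANN forecast.
observed  = [820, 790, 805, 870, 930, 910]
predicted = [800, 810, 800, 850, 960, 900]
mae, mape = mae_mape(observed, predicted)
print(f"MAE = {mae:.1f} MW, MAPE = {mape:.2f} %")
```

MAPE is scale-free, which makes it convenient for comparing load and wind power forecasts, though it becomes unstable when actual values approach zero (as wind output often does).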

  5. Fluorescence microscopy: A tool to study autophagy

    NASA Astrophysics Data System (ADS)

    Rai, Shashank; Manjithaya, Ravi

    2015-08-01

    Autophagy is a cellular recycling process through which a cell degrades old and damaged cellular components, such as organelles and proteins, and reuses the degradation products to provide energy and building blocks. Dysfunctional autophagy is reported in several pathological situations. Hence, autophagy plays an important role in both cellular homeostasis and diseased conditions. Autophagy can be studied using various techniques, including fluorescence-based microscopy. With advancements in fluorescence microscopy technology, several novel aspects of autophagy have been discovered, making it an essential tool for autophagy research. Moreover, the ability to tag subcellular targets with fluorescent proteins has enabled us to follow autophagy processes in real time under the fluorescence microscope. In this article, we demonstrate different aspects of autophagy in two model organisms, i.e. yeast and mammalian cells, with the help of fluorescence microscopy.

  6. Do knowledge, knowledge sources and reasoning skills affect the accuracy of nursing diagnoses? a randomised study

    PubMed Central

    2012-01-01

    Background This paper reports a study about the effect of knowledge sources, such as handbooks, an assessment format and a predefined record structure for diagnostic documentation, as well as the influence of knowledge, disposition toward critical thinking and reasoning skills, on the accuracy of nursing diagnoses. Knowledge sources can support nurses in deriving diagnoses. A nurse's disposition toward critical thinking and reasoning skills is also thought to influence the accuracy of his or her nursing diagnoses. Method A randomised factorial design was used in 2008–2009 to determine the effect of knowledge sources. We used the following instruments to assess the influence of ready knowledge, disposition, and reasoning skills on the accuracy of diagnoses: (1) a knowledge inventory, (2) the California Critical Thinking Disposition Inventory, and (3) the Health Science Reasoning Test. Nurses (n = 249) were randomly assigned to one of four factorial groups, and were instructed to derive diagnoses based on an assessment interview with a simulated patient/actor. Results The use of a predefined record structure resulted in a significantly higher accuracy of nursing diagnoses. A regression analysis reveals that almost half of the variance in the accuracy of diagnoses is explained by the use of a predefined record structure, a nurse's age and the reasoning skills of 'deduction' and 'analysis'. Conclusions Improving nurses' dispositions toward critical thinking and reasoning skills, and the use of a predefined record structure, improves accuracy of nursing diagnoses. PMID:22852577

  7. Non-parametric Evaluation of Biomarker Accuracy under Nested Case-control Studies

    PubMed Central

    Cai, Tianxi; Zheng, Yingye

    2012-01-01

    Summary To evaluate the clinical utility of new risk markers, a crucial step is to measure their predictive accuracy with prospective studies. However, it is often infeasible to obtain marker values for all study participants. The nested case-control (NCC) design is a useful cost-effective strategy for such settings. Under the NCC design, markers are only ascertained for cases and a fraction of controls sampled randomly from the risk sets. The outcome dependent sampling generates a complex data structure and therefore a challenge for analysis. Existing methods for analyzing NCC studies focus primarily on association measures. Here, we propose a class of non-parametric estimators for commonly used accuracy measures. We derived asymptotic expansions for accuracy estimators based on both finite population and Bernoulli sampling and established asymptotic equivalence between the two. Simulation results suggest that the proposed procedures perform well in finite samples. The new procedures were illustrated with data from the Framingham Offspring study. PMID:22844169

  8. An observational study of the accuracy and completeness of an anesthesia information management system: recommendations for documentation system changes.

    PubMed

    Wilbanks, Bryan A; Moss, Jacqueline A; Berner, Eta S

    2013-08-01

    Anesthesia information management systems must often be tailored to fit the environment in which they are implemented. Extensive customization necessitates that systems be analyzed for both accuracy and completeness of documentation design to ensure that the final record is a true representation of practice. The purpose of this study was to determine the accuracy of a recently installed system in the capture of key perianesthesia data. This study used an observational design and was conducted using a convenience sample of nurse anesthetists. Observational data on the nurse anesthetists' delivery of anesthesia care were collected using a touch-screen tablet computer running a customized observational data-collection tool built on an Access database. A questionnaire was also administered to these nurse anesthetists to assess perceived accuracy, completeness, and satisfaction with the electronic documentation system. The major sources of data not documented in the system were anesthesiologist presence (20%) and placement of intravenous lines (20%). The major sources of inaccuracies in documentation were gas flow rates (45%), medication administration times (30%), and documentation of neuromuscular function testing (20%); all of these inaccuracies were related to the use of charting templates that were not altered to reflect the actual interventions performed. PMID:23851709

  10. Study the effect of gray component replacement level on reflectance spectra and color reproduction accuracy

    NASA Astrophysics Data System (ADS)

    Spiridonov, I.; Shopova, M.; Boeva, R.

    2013-03-01

    The aim of this study is to investigate the effect of gray component replacement (GCR) levels on the reflectance spectra of different ink overprints and on color reproduction accuracy. GCR is the most commonly implemented method in practice for generating the achromatic composition. The experiments in this study were performed under real production conditions with a special test form generated by specialized software. Measuring the reflection spectra of the printed colors gives a complete picture of the effect of different GCR levels on color reproduction accuracy. For better data analysis and modeling of the processes, we calculated (converted) the CIEL*a*b* color coordinates from the reflection spectra data. The assessment of color accuracy at different GCR amounts was made by calculating the color difference ΔE*ab. In addition, for the specific printing conditions we created ICC profiles with different GCR amounts and compared the resulting color gamuts. For the first time, a methodology is implemented for examining and estimating the effect of GCR levels on color reproduction accuracy by studying a large number of colors across the entire visible spectrum. Implementing the results of this experiment in practice will lead to improved gray balance and better color accuracy. Another important effect of this research is a reduction in the financial costs of print production through decreased ink consumption, an indirect reduction of emissions during ink manufacture, and easier deinking during paper recycling.
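
    The ΔE*ab assessment described above can be sketched numerically. The snippet below computes the classic CIE76 form of ΔE*ab from two sets of CIEL*a*b* coordinates; the patch values are invented for illustration, and the study may have used a different ΔE formula, so treat this as a minimal sketch.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two CIEL*a*b* triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical L*a*b* readings of the same patch printed at 0% and 80% GCR
patch_0gcr = (52.3, 41.0, 18.5)
patch_80gcr = (51.1, 39.2, 17.0)
print(round(delta_e_cie76(patch_0gcr, patch_80gcr), 2))  # → 2.63
```

    A ΔE*ab below roughly 2 to 3 is commonly taken as barely perceptible, which is why the study uses it as the yardstick for color reproduction accuracy across GCR levels.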

  11. Experimental studies of high-accuracy RFID localization with channel impairments

    NASA Astrophysics Data System (ADS)

    Pauls, Eric; Zhang, Yimin D.

    2015-05-01

    Radio frequency identification (RFID) systems present a highly cost-effective and easy-to-implement solution to close-range localization. One of the important applications of a passive RFID system is to determine the reader position through multilateration based on the estimated distances between the reader and multiple distributed reference tags obtained from, e.g., the received signal strength indicator (RSSI) readings. In practice, the achievable accuracy of passive RFID reader localization suffers from many factors, such as the distorted RSSI reading due to channel impairments in terms of the susceptibility to reader antenna patterns and multipath propagation. Previous studies have shown that the accuracy of passive RFID localization can be significantly improved by properly modeling and compensating for such channel impairments. The objective of this paper is to report experimental study results that validate the effectiveness of such approaches for high-accuracy RFID localization. We also examine a number of practical issues arising in the underlying problem that limit the accuracy of reader-tag distance measurements and, therefore, the estimated reader localization. These issues include the variations in tag radiation characteristics for similar tags, effects of tag orientations, and reader RSS quantization and measurement errors. As such, this paper reveals valuable insights into the issues and solutions involved in achieving high-accuracy passive RFID localization.
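
    The multilateration step the abstract refers to can be illustrated with a short least-squares sketch. The reference-tag coordinates and reader position below are hypothetical, and real RSSI-derived distances would carry the channel-impairment errors the study analyzes; this only shows the geometric estimation itself.

```python
def multilaterate(tags, dists):
    """Least-squares 2-D reader position from reference-tag coordinates
    and reader-tag distance estimates (e.g. derived from RSSI)."""
    (x0, y0), d0 = tags[0], dists[0]
    # Linearize by subtracting the first range equation: A @ [x, y] = b
    A, b = [], []
    for (xi, yi), di in zip(tags[1:], dists[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2-unknown least-squares problem via the normal equations
    sxx = sum(a[0] * a[0] for a in A)
    sxy = sum(a[0] * a[1] for a in A)
    syy = sum(a[1] * a[1] for a in A)
    sxb = sum(a[0] * bi for a, bi in zip(A, b))
    syb = sum(a[1] * bi for a, bi in zip(A, b))
    det = sxx * syy - sxy * sxy
    return ((syy * sxb - sxy * syb) / det, (sxx * syb - sxy * sxb) / det)

# Three hypothetical reference tags and exact distances to a reader at (1, 2)
tags = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
reader = (1.0, 2.0)
dists = [((reader[0] - x) ** 2 + (reader[1] - y) ** 2) ** 0.5 for x, y in tags]
print(multilaterate(tags, dists))  # → close to (1.0, 2.0)
```

    With noisy RSSI-derived distances the same solver still applies; the estimate simply degrades with the distance errors, which is why the channel-impairment modeling discussed above matters.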

  12. Visual DMDX: A web-based authoring tool for DMDX, a Windows display program with millisecond accuracy.

    PubMed

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2015-09-01

    DMDX is a software package for the experimental control and timing of stimulus display for Microsoft Windows systems. DMDX is reliable, flexible, millisecond accurate, and can be downloaded free of charge; it has therefore become very popular among experimental researchers. However, setting up a DMDX-based experiment is burdensome because of its command-based interface. Further, DMDX relies on RTF files in which parts of the stimuli, design, and procedure of an experiment are defined in a complicated (DMASTR-compatible) syntax. Other experiment software, such as E-Prime, PsychoPy, and WEXTOR, became successful as a result of integrated visual authoring tools. Such an intuitive interface was lacking for DMDX. We therefore created and present here Visual DMDX (http://visualdmdx.com/), an HTML5-based web interface for setting up experiments and exporting them to the DMDX item file format in RTF. Visual DMDX offers most of the features available from the rich DMDX/DMASTR syntax, and it is a useful tool to support researchers who are new to DMDX. Both old and modern versions of DMDX syntax are supported. Further, with Visual DMDX, we go beyond DMDX by adding export to JSON (a versatile web format), easy backup, and a preview option for experiments. In two examples, one experiment each on lexical decision making and affective priming, we explain in a step-by-step fashion how to create experiments using Visual DMDX. We release Visual DMDX under an open-source license to foster collaboration in its continuous improvement. PMID:24912762

  13. Immunogenetics as a tool in anthropological studies

    PubMed Central

    Sanchez-Mazas, Alicia; Fernandez-Viña, Marcelo; Middleton, Derek; Hollenbach, Jill A; Buhler, Stéphane; Di, Da; Rajalingam, Raja; Dugoujon, Jean-Michel; Mack, Steven J; Thorsby, Erik

    2011-01-01

    The genes coding for the main molecules involved in the human immune system – immunoglobulins, human leucocyte antigen (HLA) molecules and killer-cell immunoglobulin-like receptors (KIR) – exhibit a very high level of polymorphism that reveals remarkable frequency variation in human populations. ‘Genetic marker’ (GM) allotypes located in the constant domains of IgG antibodies have been studied for over 40 years through serological typing, leading to the identification of a variety of GM haplotypes whose frequencies vary sharply from one geographic region to another. An impressive diversity of HLA alleles, which results in amino acid substitutions located in the antigen-binding region of HLA molecules, also varies greatly among populations. The KIR differ between individuals according to both gene content and allelic variation, and also display considerable population diversity. Whereas the molecular evolution of these polymorphisms has most likely been subject to natural selection, principally driven by host–pathogen interactions, their patterns of genetic variation worldwide show significant signals of human geographic expansion, demographic history and cultural diversification. As current developments in population genetic analysis and computer simulation improve our ability to discriminate among different – either stochastic or deterministic – forces acting on the genetic evolution of human populations, the study of these systems shows great promise for investigating both the peopling history of modern humans in the time since their common origin and human adaptation to past environmental (e.g. pathogenic) changes. Therefore, in addition to mitochondrial DNA, Y-chromosome, microsatellites, single nucleotide polymorphisms and other markers, immunogenetic polymorphisms represent essential and complementary tools for anthropological studies. PMID:21480890

  14. Microinjection--a tool to study gravitropism

    NASA Technical Reports Server (NTRS)

    Scherp, P.; Hasenstein, K. H.

    2003-01-01

    Despite extensive studies on plant gravitropism, this phenomenon is still poorly understood. The separation of gravity sensing, signal transduction and response is a common concept, but the mechanism of gravisensing in particular remains unclear. This paper focuses on microinjection as a powerful tool for investigating gravisensing in plants. We describe the microinjection of magnetic beads into rhizoids of the green alga Chara and the subsequent manipulation of the gravisensing system. After injection, an external magnet can control the movement of the magnetic beads. We demonstrate successful injection of magnetic beads into rhizoids and describe a multitude of experiments that can be carried out to investigate gravitropism in Chara rhizoids. In addition to examining mechanical properties, bead microinjection is also useful for probing the function of the cytoskeleton by coating beads with drugs that interfere with the cytoskeleton. The injection of fluorescently labeled beads or probes may reveal the involvement of the cytoskeleton during gravistimulation and response in living cells. ©2003 COSPAR. Published by Elsevier Ltd. All rights reserved.

  16. A study of neural network parameters for improvement in classification accuracy

    NASA Astrophysics Data System (ADS)

    Pathak, Avijit; Tiwari, K. C.

    2016-05-01

    Hyperspectral data, due to their large number of spectral bands, facilitate discrimination between large numbers of classes; however, the advantage afforded by hyperspectral data often tends to get lost in the limitations of conventional classifier techniques. Artificial Neural Networks (ANN) have been shown in several studies to outperform conventional classifiers; however, there are several issues regarding the selection of parameters for achieving the best possible classification accuracy. The objectives of this study were accordingly formulated to include an investigation of the effect of various neural network parameters on the accuracy of hyperspectral image classification. The AVIRIS hyperspectral Indian Pines test site 3 dataset, acquired in 220 bands on June 12, 1992, has been used in the study. The maximal feature extraction technique of principal component analysis (PCA) was then used to reduce the dataset to 10 bands, preserving 99.96% of the variance. The data contain 16 major classes, of which 4 were considered for ANN-based classification. The parameters selected for the study were the number of hidden layers, hidden nodes, training sample size, learning rate and learning momentum. The backpropagation method of learning was adopted. The overall accuracy of the trained network was assessed using a test sample size of 300 pixels. Although the study identifies certain distinct ranges within which higher classification accuracies can be expected, no definite relationship could be identified between the various ANN parameters under study.
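
    The PCA reduction step (220 bands down to 10 components while keeping 99.96% of the variance) amounts to choosing the smallest number of leading eigenvalues whose cumulative share of total variance reaches the target. A minimal sketch, with a made-up eigenvalue spectrum:

```python
def n_components_for(variances, target=0.9996):
    """Smallest number of leading principal components whose cumulative
    share of the total variance reaches `target` (e.g. 99.96%)."""
    total = sum(variances)
    cum = 0.0
    for k, v in enumerate(sorted(variances, reverse=True), start=1):
        cum += v
        if cum / total >= target:
            return k
    return len(variances)

# Hypothetical eigenvalue spectrum: a few dominant bands, a long noise tail
eigs = [500.0, 120.0, 30.0] + [0.001] * 217
print(n_components_for(eigs))  # → 3
```

    In the study the same criterion lands on 10 components for the real AVIRIS covariance spectrum; the ANN is then trained on that reduced representation.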

  17. Effects of implant angulation, material selection, and impression technique on impression accuracy: a preliminary laboratory study.

    PubMed

    Rutkunas, Vygandas; Sveikata, Kestutis; Savickas, Raimondas

    2012-01-01

    The aim of this preliminary laboratory study was to evaluate the effects of 5- and 25-degree implant angulations in simulated clinical casts on an impression's accuracy when using different impression materials and tray selections. A convenience sample of each implant angulation group was selected for both open and closed trays in combination with one polyether and two polyvinyl siloxane impression materials. The influence of material and technique appeared to be significant for both 5- and 25-degree angulations (P < .05), and increased angulation tended to decrease impression accuracy. The open-tray technique was more accurate with highly nonaxially oriented implants for the small sample size investigated.

  18. Early-Onset Neonatal Sepsis: Still Room for Improvement in Procalcitonin Diagnostic Accuracy Studies.

    PubMed

    Chiesa, Claudio; Pacifico, Lucia; Osborn, John F; Bonci, Enea; Hofer, Nora; Resch, Bernhard

    2015-07-01

    To perform a systematic review assessing the accuracy and completeness of diagnostic studies of procalcitonin (PCT) for early-onset neonatal sepsis (EONS) using the Standards for Reporting of Diagnostic Accuracy (STARD) initiative. EONS, diagnosed during the first 3 days of life, remains a common and serious problem. Increased PCT is a potentially useful diagnostic marker of EONS, but reports in the literature are contradictory. There are several possible explanations for the divergent results, including the quality of studies reporting the clinical usefulness of PCT in ruling in or ruling out EONS. We systematically reviewed PubMed, Scopus, and the Cochrane Library databases up to October 1, 2014. Studies were eligible for inclusion in our review if they provided measures of PCT accuracy for diagnosing EONS. A data extraction form based on the STARD checklist and adapted for neonates with EONS was used to appraise the quality of the reporting of included studies. We found 18 articles (1998-2014) fulfilling our eligibility criteria, which were included in the final analysis. Overall, the results of our analysis showed that the quality of studies reporting the diagnostic accuracy of PCT for EONS was suboptimal, leaving ample room for improvement. Information on key elements of design, analysis, and interpretation of test accuracy was frequently missing. Authors should be aware of the STARD criteria before starting a study in this field. We welcome stricter adherence to this guideline. Well-reported studies with appropriate designs will provide more reliable information to guide decisions on the use and interpretation of PCT test results in the management of neonates with EONS.

  19. The Accuracy of a Simple, Low-Cost GPS Data Logger/Receiver to Study Outdoor Human Walking in View of Health and Clinical Studies

    PubMed Central

    Noury-Desvaux, Bénédicte; Abraham, Pierre; Mahé, Guillaume; Sauvaget, Thomas; Leftheriotis, Georges; Le Faucheur, Alexis

    2011-01-01

    Introduction Accurate and objective measurements of physical activity and lower-extremity function are important in health and disease monitoring, particularly given the current epidemic of chronic diseases and their related functional impairment. Purpose The aim of the present study was to determine the accuracy of a handy (lightweight, small, only one stop/start button) and low-cost (∼$75 with its external antenna) Global Positioning System (GPS) data logger/receiver (the DG100) as a tool to study outdoor human walking in view of health and clinical research studies. Methods Healthy subjects performed two experiments that consisted of different prescribed outdoor walking protocols. Experiment 1. We studied the accuracy of the DG100 for detecting bouts of walking and resting. Experiment 2. We studied the accuracy of the DG100 for estimating distances and speeds of walking. Results Experiment 1. The performance in the detection of bouts, expressed as the percentage of walking and resting bouts that were correctly detected, was 92.4% [95% Confidence Interval: 90.6–94.3]. Experiment 2. The coefficients of variation [95% Confidence Interval] for the accuracy of estimating the distances and speeds of walking were low: 3.1% [2.9–3.3] and 2.8% [2.6–3.1], respectively. Conclusion The DG100 produces acceptable accuracy both in detecting bouts of walking and resting and in estimating distances and speeds of walking during the detected walking bouts. However, before we can confirm that the DG100 can be used to study walking with respect to health and clinical studies, the inter- and intra-DG100 variability should be studied. Trial Registration ClinicalTrials.gov NCT00485147 PMID:21931593
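
    The accuracy metric reported in the Results, a coefficient of variation in percent, can be computed as below. The sample distances are invented, and the authors' exact CV definition (e.g. relative to the mean estimate rather than to the prescribed distance) is an assumption here.

```python
import math

def cv_percent(estimates):
    """Coefficient of variation (%) of repeated distance or speed
    estimates: sample standard deviation over the mean, times 100."""
    n = len(estimates)
    mean = sum(estimates) / n
    sd = math.sqrt(sum((e - mean) ** 2 for e in estimates) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical GPS distance estimates (m) of a prescribed 100 m walking bout
print(round(cv_percent([97.8, 101.2, 99.5, 102.1, 98.9]), 1))  # → 1.7
```

    A CV around 3%, as reported for the DG100, means repeated estimates of the same walking distance typically scatter within a few percent of their mean.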

  20. Who Should Mark What? A Study of Factors Affecting Marking Accuracy in a Biology Examination

    ERIC Educational Resources Information Center

    Suto, Irenka; Nadas, Rita; Bell, John

    2011-01-01

    Accurate marking is crucial to the reliability and validity of public examinations, in England and internationally. Factors contributing to accuracy have been conceptualised as affecting either marking task demands or markers' personal expertise. The aim of this empirical study was to develop this conceptualisation through investigating the…

  1. Breaking the Code of Silence: A Study of Teachers' Nonverbal Decoding Accuracy of Foreign Language Anxiety

    ERIC Educational Resources Information Center

    Gregersen, Tammy

    2007-01-01

    This study examined teachers' accuracy in decoding nonverbal behaviour indicative of foreign language anxiety. Teachers and teacher trainees twice observed a videotape without sound of seven beginning French foreign language students as they participated in an oral exam; four of these students were defined as anxious language learners by the…

  2. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  3. Galaxy tools to study genome diversity

    PubMed Central

    2013-01-01

    Background Intra-species genetic variation can be used to investigate population structure, selection, and gene flow in non-model vertebrates; and due to the plummeting costs for genome sequencing, it is now possible for small labs to obtain full-genome variation data from their species of interest. However, those labs may not have easy access to, and familiarity with, computational tools to analyze those data. Results We have created a suite of tools for the Galaxy web server aimed at handling nucleotide and amino-acid polymorphisms discovered by full-genome sequencing of several individuals of the same species, or using a SNP genotyping microarray. In addition to providing user-friendly tools, a main goal is to make published analyses reproducible. While most of the examples discussed in this paper deal with nuclear-genome diversity in non-human vertebrates, we also illustrate the application of the tools to fungal genomes, human biomedical data, and mitochondrial sequences. Conclusions This project illustrates that a small group can design, implement, test, document, and distribute a Galaxy tool collection to meet the needs of a particular community of biologists. PMID:24377391

  4. Wound assessment tools and nurses' needs: an evaluation study.

    PubMed

    Greatrex-White, Sheila; Moxey, Helen

    2015-06-01

    The purpose of this study was to ascertain how well different wound assessment tools meet the needs of nurses in carrying out general wound assessment and whether current tools are fit for purpose. The methodology employed was evaluation research. In order to conduct the evaluation, a literature review was undertaken to identify the criteria of an optimal wound assessment tool which would meet nurses' needs. Several freely available wound assessment tools were selected based on predetermined inclusion and exclusion criteria and an audit tool was developed to evaluate the selected tools based on how well they met the criteria of the optimal wound assessment tool. The results provide a measure of how well the selected wound assessment tools meet the criteria of the optimal wound assessment tool. No tool was identified which fulfilled all the criteria, but two (the Applied Wound Management tool and the National Wound Assessment Form) met the most criteria of the optimal tool and were therefore considered to best meet nurses' needs in wound assessment. The study provides a mechanism for the appraisal of wound assessment tools using a set of optimal criteria which could aid practitioners in their search for the best wound assessment tool.

  5. Meta-analysis diagnostic accuracy of SNP-based pathogenicity detection tools: a case of UGT1A1 gene mutations

    PubMed Central

    Galehdari, Hamid; Saki, Najmaldin; Mohammadi-asl, Javad; Rahim, Fakher

    2013-01-01

    Crigler-Najjar syndrome (CNS) type I and type II are usually inherited as autosomal recessive conditions that result from mutations in the UGT1A1 gene. The main objective of the present review is to summarize all available evidence on the accuracy of SNP-based pathogenicity detection tools, compared to published clinical results, in predicting which nsSNPs lead to disease, using a prediction performance method. A comprehensive search was performed to find all mutations related to CNS. Database searches included dbSNP, SNPdbe, HGMD, Swissvar, Ensembl, and OMIM. All mutations related to CNS were extracted. Pathogenicity prediction was done using the SNP-based pathogenicity detection tools SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs corresponding to missense mutations in the UGT1A1 gene were reviewed. Comparing diagnostic odds ratios (OR), PolyPhen2 and Mutpred had the highest value, 4.983 (95% CI: 1.24 – 20.02) for both, followed by SIFT (diagnostic OR: 3.25, 95% CI: 1.07 – 9.83). The highest MCC among the SNP-based pathogenicity detection tools belonged to SIFT (34.19%), followed by Provean, PolyPhen2, and Mutpred (29.99%, 29.89%, and 29.89%, respectively). Likewise, the highest accuracy (ACC) belonged to SIFT (62.71%), followed by PolyPhen2 and Mutpred (61.02% for both). Our results suggest that some of the well-established SNP-based pathogenicity detection tools can appropriately reflect the role of a disease-associated SNP in both local and global structures. PMID:23875061
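
    The headline metrics of this review (diagnostic odds ratio, MCC and ACC) all derive from a 2×2 confusion table of tool predictions against clinical findings. A minimal sketch with invented counts, not the review's actual data:

```python
import math

def diagnostic_metrics(tp, fp, fn, tn):
    """Diagnostic odds ratio, Matthews correlation coefficient and
    accuracy from a 2x2 table of predicted vs clinically confirmed calls."""
    dor = (tp * tn) / (fp * fn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom
    acc = (tp + tn) / (tp + fp + fn + tn)
    return dor, mcc, acc

# Hypothetical tool-vs-clinical counts for a set of reviewed nsSNPs
print(diagnostic_metrics(tp=25, fp=10, fn=12, tn=12))
```

    A diagnostic OR of 1 means the tool is uninformative; values well above 1, as reported for PolyPhen2, Mutpred and SIFT, indicate discriminating power.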

  6. Comparative study of application accuracy of two frameless neuronavigation systems: experimental error assessment quantifying registration methods and clinically influencing factors.

    PubMed

    Paraskevopoulos, Dimitrios; Unterberg, Andreas; Metzner, Roland; Dreyhaupt, Jens; Eggers, Georg; Wirtz, Christian Rainer

    2010-04-01

    This study aimed at comparing the accuracy of two commercial neuronavigation systems. Error assessment and quantification of clinical factors and surface registration, often resulting in decreased accuracy, were intended. Active (Stryker Navigation) and passive (VectorVision Sky, BrainLAB) neuronavigation systems were tested with an anthropomorphic phantom with a deformable layer, simulating skin and soft tissue. True coordinates measured by computer numerical control were compared with coordinates on image data and during navigation, to calculate software and system accuracy respectively. Comparison of image and navigation coordinates was used to evaluate navigation accuracy. Both systems achieved an overall accuracy of <1.5 mm. Stryker achieved better software accuracy, whereas BrainLAB better system and navigation accuracy. Factors with conspicuous influence (P<0.01) were imaging, instrument replacement, sterile cover drape and geometry of instruments. Precision data indicated by the systems did not reflect measured accuracy in general. Surface matching resulted in no improvement of accuracy, confirming former studies. Laser registration showed no differences compared to conventional pointers. Differences between the two systems were limited. Surface registration may improve inaccurate point-based registrations but does not in general affect overall accuracy. Accuracy feedback by the systems does not always match with true target accuracy and requires critical evaluation from the surgeon.
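
    Navigation accuracy in a study like this boils down to the distance between ground-truth target coordinates (here measured by computer numerical control) and the coordinates reported during navigation. A minimal sketch with fabricated fiducial points:

```python
def mean_target_error(true_pts, nav_pts):
    """Mean Euclidean distance between matched ground-truth and
    navigated 3-D points (the per-target localization error, e.g. in mm)."""
    dists = [
        sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
        for p, q in zip(true_pts, nav_pts)
    ]
    return sum(dists) / len(dists)

# Fabricated CNC-measured targets vs. navigated positions (mm)
truth = [(10.0, 0.0, 5.0), (0.0, 20.0, 5.0)]
navigated = [(10.3, 0.4, 5.0), (0.0, 19.0, 5.0)]
print(mean_target_error(truth, navigated))  # → 0.75
```

    Both systems in the study kept this figure under 1.5 mm overall; the point of the comparison is how imaging, instrument replacement, draping and registration method shift it.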

  7. Studies of the accuracy of time integration methods for reaction-diffusion equations

    NASA Astrophysics Data System (ADS)

    Ropp, David L.; Shadid, John N.; Ober, Curtis C.

    2004-03-01

    In this study we present numerical experiments of time integration methods applied to systems of reaction-diffusion equations. Our main interest is in evaluating the relative accuracy and asymptotic order of accuracy of the methods on problems which exhibit an approximate balance between the competing component time scales. Nearly balanced systems can produce a significant coupling of the physical mechanisms and introduce a slow dynamical time scale of interest. These problems provide a challenging test for this evaluation and tend to reveal subtle differences between the various methods. The methods we consider include first- and second-order semi-implicit, fully implicit, and operator-splitting techniques. The test problems include a prototype propagating nonlinear reaction-diffusion wave, a non-equilibrium radiation-diffusion system, a Brusselator chemical dynamics system and a blow-up example. In this evaluation we demonstrate a "split personality" for the operator-splitting methods that we consider. While operator-splitting methods often obtain very good accuracy, they can also manifest a serious degradation in accuracy due to stability problems.

  8. Accuracy of bite mark analysis from food substances: A comparative study

    PubMed Central

    Daniel, M. Jonathan; Pazhani, Ambiga

    2015-01-01

    Aims and Objectives: The aims and objectives of the study were to compare the accuracy of bite mark analysis from three different food substances-apple, cheese and chocolate using two techniques-the manual docking procedure and computer assisted overlay generation technique and to compare the accuracy of the two techniques for bite mark analysis on food substances. Materials and Methods: The individuals who participated in the study were made to bite on three food substances-apple, cheese, and chocolate. Dentate individuals were included in the study. Edentulous individuals and individuals having a missing anterior tooth were excluded from the study. The dental casts of the individual were applied to the positive cast of the bitten food substance to determine docking or matching. Then, computer generated overlays were compared with bite mark pattern on the foodstuff. Results: The results were tabulated and the comparison of bite mark analysis on the three different food substances was analyzed by Kruskall-Wallis ANOVA test and the comparison of the two techniques was analyzed by Spearman's Rho correlation coefficient. Conclusion: On comparing the bite marks analysis from the three food substances-apple, cheese and chocolate, the accuracy was found to be greater for chocolate and cheese than apple. PMID:26816463

  9. Does the Reporting Quality of Diagnostic Test Accuracy Studies, as Defined by STARD 2015, Affect Citation?

    PubMed Central

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang

    2016-01-01

    Objective To determine the rate with which diagnostic test accuracy studies that are published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. Materials and Methods All eligible diagnostic test accuracy studies that were published in the Korean Journal of Radiology in 2011–2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that related directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) after publication until March 2016 and the article exposure time (time in months between publication and March 2016) were extracted. Results Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5–25) and 11.4 (7–15), respectively. The mean citation number was 4 (0–21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). Conclusion The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate. PMID:27587959
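
    The exposure-time adjustment reported in the Results can be reproduced with a simple partial correlation: correlate the residuals of STARD score and citation count after each has been regressed on exposure time. The per-article numbers below are made up, and this particular procedure is an assumption about the authors' analysis.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def residuals(y, z):
    """Residuals of y after an ordinary least-squares fit on z."""
    n = len(y)
    mz, my = sum(z) / n, sum(y) / n
    beta = (sum((a - mz) * (b - my) for a, b in zip(z, y))
            / sum((a - mz) ** 2 for a in z))
    return [b - (my + beta * (a - mz)) for a, b in zip(z, y)]

def partial_corr(x, y, z):
    """Correlation of x (STARD score) and y (citations), controlling
    for z (exposure time in months)."""
    return pearson(residuals(x, z), residuals(y, z))

# Made-up per-article values: STARD score, citation count, exposure time
stard = [20, 22, 18, 25, 19]
cites = [4, 6, 2, 9, 3]
months = [12, 30, 8, 40, 15]
print(partial_corr(stard, cites, months))
```

    A coefficient near zero after partialling out exposure time, as the study found, means better STARD adherence did not translate into more citations.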

  10. Comparative study of dimensional accuracy of different impression techniques using addition silicone impression material.

    PubMed

    Penaflor, C F; Semacio, R C; De Las Alas, L T; Uy, H G

    1998-01-01

    This study compared the dimensional accuracy of the single, double with spacer, double with cut-out and double mix impression techniques using addition silicone impression material. A typhodont containing an Ivorine teeth model with six (6) full-crown tooth preparations was used as the positive control. Two stone replication models for each impression technique were made as test materials. Accuracy of the techniques was assessed by measuring four dimensions on the stone dies poured from the impressions of the Ivorine teeth model. Results indicated that most of the measurements for the height, width and diameter slightly decreased, and a few increased, compared with the Ivorine teeth model. The double with cut-out and double mix techniques showed the least difference from the master model compared with the two former impression techniques. PMID:10202524

  11. Spectroscopic Accuracy in Quantum Chemistry: a Benchmark Study on Na_3

    NASA Astrophysics Data System (ADS)

    Hauser, Andreas W.; Pototschnig, Johann V.; Ernst, Wolfgang E.

    2015-06-01

    Modern techniques of quantum chemistry allow the prediction of molecular properties to good accuracy, provided the systems are small and their electronic structure is not too complex. For most users of common program packages, `chemical' accuracy on the order of a few kJ/mol for relative energies between different geometries is sufficient. The demands of molecular spectroscopists are typically much more stringent, and often include a detailed topographical survey of multi-dimensional potential energy surfaces with an accuracy in the range of wavenumbers. In a benchmark study of current predictive capabilities we pick the somewhat sophisticated, but conceptually simple and well-studied case of the Na_3 ground state, and present a thorough investigation of the interplay between Jahn-Teller, spin-orbit, rovibrational and hyperfine interactions based only on ab initio calculations. The necessary parameters for the effective Hamiltonian are derived from the potential energy surface of the 1²E′ ground state and from spin density evaluations at selected geometries, without any fitting adjustments to experimental data. We compare our results to highly resolved microwave spectra [L. H. Coudert, W. E. Ernst and O. Golonzka, J. Chem. Phys. 117, 7102-7116 (2002)].

  12. High accuracy differential pressure measurements using fluid-filled catheters - A feasibility study in compliant tubes.

    PubMed

    Rotman, Oren Moshe; Weiss, Dar; Zaretsky, Uri; Shitzer, Avraham; Einav, Shmuel

    2015-09-18

    High accuracy differential pressure measurements are required in various biomedical and medical applications, such as in fluid-dynamic test systems, or in the cath-lab. Differential pressure measurements using fluid-filled catheters are relatively inexpensive, yet may be subjected to common mode pressure errors (CMP), which can significantly reduce the measurement accuracy. Recently, a novel correction method for high accuracy differential pressure measurements was presented, and was shown to effectively remove CMP distortions from measurements acquired in rigid tubes. The purpose of the present study was to test the feasibility of this correction method inside compliant tubes, which effectively simulate arteries. Two tubes with varying compliance were tested under dynamic flow and pressure conditions to cover the physiological range of radial distensibility in coronary arteries. A third, compliant model, with a 70% stenosis severity was additionally tested. Differential pressure measurements were acquired over a 3 cm tube length using a fluid-filled double-lumen catheter, and were corrected using the proposed CMP correction method. Validation of the corrected differential pressure signals was performed by comparison to differential pressure recordings taken via a direct connection to the compliant tubes, and by comparison to predicted differential pressure readings of matching fluid-structure interaction (FSI) computational simulations. The results show excellent agreement between the experimentally acquired and computationally determined differential pressure signals. This validates the application of the CMP correction method in compliant tubes of the physiological range for up to intermediate size stenosis severity of 70%.

  14. Simulation approach for the evaluation of tracking accuracy in radiotherapy: a preliminary study.

    PubMed

    Tanaka, Rie; Ichikawa, Katsuhiro; Mori, Shinichiro; Sanada, Sigeru

    2013-01-01

    Real-time tumor tracking in external radiotherapy can be achieved by diagnostic (kV) X-ray imaging with a dynamic flat-panel detector (FPD). It is important to keep the patient dose as low as possible while maintaining tracking accuracy. A simulation approach would be helpful to optimize the imaging conditions. This study was performed to develop a computer simulation platform based on a noise property of the imaging system for the evaluation of tracking accuracy at any noise level. Flat-field images were obtained using a direct-type dynamic FPD, and noise power spectrum (NPS) analysis was performed. The relationship between incident quantum number and pixel value was addressed, and a conversion function was created. The pixel values were converted into a map of quantum number using the conversion function, and the map was then input into the random number generator to simulate image noise. Simulation images were provided at different noise levels by changing the incident quantum numbers. Subsequently, an implanted marker was tracked automatically and the maximum tracking errors were calculated at different noise levels. The results indicated that the maximum tracking error increased with decreasing incident quantum number in flat-field images with an implanted marker. In addition, the range of errors increased with decreasing incident quantum number. The present method could be used to determine the relationship between image noise and tracking accuracy. The results indicated that the simulation approach would aid in determining exposure dose conditions according to the necessary tracking accuracy. PMID:22843379
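The simulation pipeline described (convert pixel values to incident quantum numbers, feed them to a random number generator to inject counting noise at a reduced dose, then convert back) can be sketched as follows. The single linear `gain` factor stands in for the paper's measured conversion function, and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_low_dose(image, gain, dose_fraction, rng=rng):
    """Simulate a lower-dose acquisition of a flat-panel detector image.

    image:         original pixel values (linear detector response assumed)
    gain:          assumed pixel value per incident quantum (a single linear
                   factor here, standing in for a measured conversion function)
    dose_fraction: e.g. 0.25 to emulate a 50 mA image from a 200 mA scan
    """
    quanta = image / gain                        # pixel values -> quantum numbers
    noisy = rng.poisson(quanta * dose_fraction)  # Poisson counting noise at reduced dose
    return noisy * gain / dose_fraction          # back to the pixel-value scale

# Hypothetical flat-field image at full tube current
flat = np.full((64, 64), 4000.0)
low = simulate_low_dose(flat, gain=0.5, dose_fraction=0.25)  # quarter-dose equivalent
print(low.mean(), low.std())  # mean preserved, noise grows as dose drops
```

Tracking error can then be measured by running the same marker-tracking routine on images simulated at each dose level.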

  15. Sex differences in accuracy and precision when judging time to arrival: data from two Internet studies.

    PubMed

    Sanders, Geoff; Sinclair, Kamila

    2011-12-01

    We report two Internet studies that investigated sex differences in the accuracy and precision of judging time to arrival. We used accuracy to mean the ability to match the actual time to arrival, and precision to mean the consistency with which each participant made their judgments. Our task was presented as a computer game in which a toy UFO moved obliquely towards the participant through a virtual three-dimensional space en route to a docking station. The UFO disappeared before docking, and participants pressed their space bar at the precise moment they thought the UFO would have docked. Study 1 showed it was possible to conduct quantitative studies of spatiotemporal judgments in virtual reality via the Internet, and confirmed reports that men are more accurate because women underestimate, but found no difference in precision measured as intra-participant variation. Study 2 repeated Study 1 with five additional presentations of one condition to provide a better measure of precision. Again, men were more accurate than women, but there were no sex differences in precision. However, within the coincidence-anticipation timing (CAT) literature, a majority of those studies that report sex differences found that males are both more accurate and more precise than females. Noting that many CAT studies report no sex differences, we discuss appropriate interpretations of such null findings. While acknowledging that CAT performance may be influenced by experience, we suggest that the sex difference may have originated among our ancestors with the evolutionary selection of men for hunting and women for gathering.

  16. A Comparison of Parameter Study Creation and Job Submission Tools

    NASA Technical Reports Server (NTRS)

    DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We consider the differences between the available general-purpose parameter study and job submission tools. These tools necessarily share many features, but frequently with differences in the way they are designed and implemented. For this class of features, we will only briefly outline the essential differences. However, we will focus on the unique features that distinguish the ILab parameter study and job submission tool from other packages, and that make the ILab tool easier and more suitable for use in our research and engineering environment.

  17. Results of a remote multiplexer/digitizer unit accuracy and environmental study

    NASA Technical Reports Server (NTRS)

    Wilner, D. O.

    1977-01-01

    A remote multiplexer/digitizer unit (RMDU), a part of the airborne integrated flight test data system, was subjected to an accuracy study. The study was designed to show the effects of temperature, altitude, and vibration on the RMDU. The RMDU was subjected to tests at temperatures from -54 C (-65 F) to 71 C (160 F), and the resulting data are presented here, along with a complete analysis of the effects. The methods and means used for obtaining correctable data and correcting the data are also discussed.

  18. Effect of x-ray tube current on the accuracy of cerebral perfusion parameters obtained by CT perfusion studies

    NASA Astrophysics Data System (ADS)

    Murase, Kenya; Nanjo, Takafumi; Satoshi, Ii; Miyazaki, Shohei; Hirata, Masaaki; Sugawara, Yoshifumi; Kudo, Masayuki; Sasaki, Kousuke; Mochizuki, Teruhito

    2005-11-01

    The purpose of this study was to investigate the effect of x-ray tube current on the accuracy of cerebral perfusion parameters obtained by CT perfusion studies using multi-detector row CT (MDCT). Following the standard CT perfusion study protocol, continuous (cine) scans (1 s/rotation × 60 s) consisting of four 5 mm thick contiguous slices were performed using an MDCT scanner with a tube voltage of 80 kVp and a tube current of 200 mA. We generated the simulated images with tube currents of 50 mA, 100 mA and 150 mA by adding the corresponding noise to the raw scan data of the original image acquired above using a noise simulation tool. From the original and simulated images, we generated the functional images of cerebral blood flow (CBF), cerebral blood volume (CBV) and mean transit time (MTT) in seven patients with cerebrovascular disease, and compared the correlation coefficients (CCs) between the perfusion parameter values obtained from the original and simulated images. The coefficients of variation (CVs) in the white matter were also compared. The CC values deteriorated with decreasing tube current. There was a significant difference between 50 mA and 100 mA for all perfusion parameters. The CV values increased with decreasing tube current. There were significant differences between 50 mA and 100 mA and between 100 mA and 150 mA for CBF. For CBV and MTT, there was also a significant difference between 150 mA and 200 mA. This study will be useful for understanding the effect of x-ray tube current on the accuracy of cerebral perfusion parameters obtained by CT perfusion studies using MDCT, and for selecting the tube current.

  19. Dimensional Accuracy of Hydrophilic and Hydrophobic VPS Impression Materials Using Different Impression Techniques - An Invitro Study

    PubMed Central

    Pilla, Ajai; Pathipaka, Suman

    2016-01-01

    Introduction The dimensional stability of the impression material can influence the accuracy of the final restoration. Vinyl polysiloxane (VPS) impression materials are the most frequently used impression materials in fixed prosthodontics. Because VPS is hydrophobic when poured with gypsum products, manufacturers have added intrinsic surfactants and marketed the result as hydrophilic VPS; these hydrophilic VPS materials show increased wettability with gypsum slurries. VPS is available in viscosities ranging from very low to very high for use with different impression techniques. Aim To compare the dimensional accuracy of hydrophilic VPS and hydrophobic VPS using monophase, one-step and two-step putty wash impression techniques. Materials and Methods To test the dimensional accuracy of the impression materials, a stainless steel die was fabricated as prescribed by ADA specification no. 19 for elastomeric impression materials. A total of 60 impressions were made. The materials were divided into two groups, Group 1 hydrophilic VPS (Aquasil) and Group 2 hydrophobic VPS (Variotime), and further divided into three subgroups A, B and C for the monophase, one-step and two-step putty wash techniques, with 10 samples in each subgroup. The dimensional accuracy of the impressions was evaluated after 24 hours using a vertical profile projector with a lens magnification range of 20X-125X under illumination. The study was analysed through one-way ANOVA, post-hoc Tukey HSD test and unpaired t-test for mean comparison between groups. Results The three impression techniques (monophase, one-step and two-step putty wash) produced significant differences in dimensional accuracy between the hydrophilic VPS and hydrophobic VPS impression materials. One-way ANOVA showed that the mean dimensional change and SD for hydrophilic VPS varied between 0.16% and 0.56%; these low values suggest that hydrophilic VPS was satisfactory with all three impression techniques. 
However, mean

  20. Study on Increasing the Accuracy of Classification Based on Ant Colony algorithm

    NASA Astrophysics Data System (ADS)

    Yu, M.; Chen, D.-W.; Dai, C.-Y.; Li, Z.-L.

    2013-05-01

    The application of GIS advances the ability to analyse remote sensing imagery, and the classification and extraction of remote sensing images is a primary information source for GIS in LUCC applications. Improving classification accuracy is an important topic in remote sensing research; adding features and developing new classification methods are two ways to achieve it. The ant colony algorithm, an agent-based method from the nature-inspired computation field, exhibits a form of swarm intelligence, and its application to remote sensing image classification is relatively new. Studying the applicability of the ant colony algorithm with richer feature sets, and exploring its advantages and performance, is therefore of considerable significance. The study area is the outskirts of Fuzhou, Fujian Province, which has complicated land use. A multi-source database integrating spectral information (TM1-5, TM7, NDVI, NDBI), topographic characteristics (DEM, slope, aspect) and textural information (mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, correlation) was built. Classification rules based on the different characteristics were discovered from the samples through the ant colony algorithm, and a classification test was performed based on these rules. For comparison, accuracies were also checked against the traditional maximum likelihood method, the C4.5 algorithm and rough-set classification. The study showed that the accuracy of classification based on the ant colony algorithm is higher than that of the other methods. In addition, near-term land use and cover changes in Fuzhou were studied and mapped using remote sensing technology based on the ant colony algorithm.

  1. The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology

    PubMed Central

    Perera, C; Chakrabarti, R; Islam, F M A; Crowston, J

    2015-01-01

    Purpose Smartphone-based Snellen visual acuity charts have become popular; however, their accuracy has not been established. This study aimed to evaluate the equivalence of a smartphone-based visual acuity chart and a standard 6-m Snellen visual acuity (6SVA) chart. Methods First, a review of available Snellen chart applications on the iPhone was performed to determine the most accurate application based on optotype size. Subsequently, a prospective comparative study was performed by measuring conventional 6SVA and then iPhone visual acuity using the ‘Snellen' application on an Apple iPhone 4. Results Eleven applications were identified, with the accuracy of optotype size ranging from 4.4–39.9%. Eighty-eight patients from general medical and surgical wards in a tertiary hospital took part in the second part of the study. The mean difference in logMAR visual acuity between the two charts was 0.02 logMAR (95% limits of agreement −0.332, 0.372 logMAR). The largest mean difference in logMAR acuity was noted in the subgroup of patients with 6SVA worse than 6/18 (n=5), who had a mean difference of two Snellen visual acuity lines between the charts (0.276 logMAR). Conclusion At the time of the study we did not identify a Snellen visual acuity app that could predict a patient's standard Snellen visual acuity to within one line. There was considerable variability in the optotype accuracy of the apps. Further validation is required for assessment of acuity in patients with severe vision impairment. PMID:25931170
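The reported bias of 0.02 logMAR with 95% limits of agreement of (−0.332, 0.372) is the standard Bland-Altman form: mean paired difference ± 1.96 standard deviations of the differences. A minimal sketch with invented paired readings (not the study's data):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman bias and 95% limits of agreement between two
    measurement methods (e.g. 6-m Snellen chart vs smartphone logMAR)."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()                      # mean paired difference
    half_width = 1.96 * d.std(ddof=1)    # 1.96 * sample SD of differences
    return bias, bias - half_width, bias + half_width

# Hypothetical paired logMAR readings
chart = [0.0, 0.1, 0.2, 0.3, 0.5, 0.18]
phone = [0.1, 0.1, 0.18, 0.4, 0.4, 0.2]
bias, lo, hi = limits_of_agreement(chart, phone)
print(bias, lo, hi)
```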

  2. Updating Risk Prediction Tools: A Case Study in Prostate Cancer

    PubMed Central

    Ankerst, Donna P.; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J.; Feng, Ziding; Sanda, Martin G.; Partin, Alan W.; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M.

    2013-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, the risk algorithms need to be updated to include them. Typically the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third, independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in a study external to the one used to develop the original tool. The procedure is illustrated by updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [−2]proPSA measured in an external case-control study performed in Texas, U.S. Recent state-of-the-art methods for validating risk prediction tools and evaluating the improvement of updated over original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. PMID:22095849
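In its simplest form, the Bayes-rule update described here converts the original calculator's risk to odds, multiplies by the likelihood ratio of the new markers estimated from the external study, and converts back. The sketch below assumes the new markers are conditionally independent of the original risk factors given disease status; the function and numbers are illustrative, not the paper's implementation:

```python
def update_risk(prior_risk, lr_new_markers):
    """Update a risk prediction with new-marker evidence via Bayes rule.

    prior_risk:      risk from the original calculator (e.g. the PCPT
                     Risk Calculator output), as a probability in (0, 1)
    lr_new_markers:  likelihood ratio of the new markers (case density /
                     control density), estimated from the external study
    """
    prior_odds = prior_risk / (1.0 - prior_risk)
    post_odds = prior_odds * lr_new_markers  # Bayes rule on the odds scale
    return post_odds / (1.0 + post_odds)

print(update_risk(0.20, 3.0))  # an LR of 3 raises a 20% prior risk to ~43%
```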

  3. Technology Tools in the Social Studies Curriculum.

    ERIC Educational Resources Information Center

    Braun, Joseph A., Jr.; Fernlund, Phyllis; White, Charles S.

    This book is designed to help educators understand different types of technological resources for teaching social studies, plan and evaluate these resources in the curriculum, and examine potential future trends. The topics include an overview of social studies education and resources for exploring technologies, and guidance on how technology and…

  4. User Studies: Developing Learning Strategy Tool Software for Children.

    ERIC Educational Resources Information Center

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  5. [Comparative study on hyperspectral inversion accuracy of soil salt content and electrical conductivity].

    PubMed

    Peng, Jie; Wang, Jia-Qiang; Xiang, Hong-Ying; Teng, Hong-Fen; Liu, Wei-Yang; Chi, Chun-Ming; Niu, Jian-Long; Guo, Yan; Shi, Zhou

    2014-02-01

    The objective of the present article is to ascertain the mechanism of hyperspectral remote sensing monitoring of soil salinization, which is of great importance for improving the accuracy of hyperspectral remote sensing monitoring. Paddy soils in the Wensu, Hetian and Baicheng counties of southern Xinjiang were selected. Hyperspectral data of the soils were obtained. Soil salt content (S(t)) and the electrical conductivity of 1:5 soil-to-water extracts (EC(1:5)) were determined. Relationships between S(t) and EC(1:5) were studied. Correlations between hyperspectral indices and S(t) and EC(1:5) were analyzed. The inversion accuracy of S(t) using the hyperspectral technique was compared with that of EC(1:5). Results showed that significant (p<0.01) relationships were found between S(t) and EC(1:5) for soils in Wensu and Hetian counties, with correlation coefficients of 0.86 and 0.45, respectively; there was no significant relationship between S(t) and EC(1:5) for soils in Baicheng county. Therefore, the correlations between S(t) and EC(1:5) varied with the studied sites. S(t) and EC(1:5) were significantly related to spectral reflectance, first derivative reflectance and continuum-removed reflectance, respectively; but the correlation coefficients between S(t) and the spectral indices were higher than those between EC(1:5) and the spectral indices, which was obvious in some bands sensitive to soil salinization such as 660, 35, 1229, 1414, 1721, 1738, 1772 and 2309 nm. Prediction equations for S(t) and EC(1:5) were established using multivariate linear regression, principal component regression and partial least-squares regression methods, respectively. The coefficients of determination, determination coefficients of prediction, and relative analytical errors of these equations were analyzed. The coefficients of determination and relative analytical errors of the equations between S(t) and the spectral indices were higher than those of the equations between EC(1:5) and the spectral indices. Therefore, the

  6. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed the error budget for each measurement technique to be consolidated. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. 
We report both on updated

  7. Diagnostic Accuracy of Frozen Section of Central Nervous System Lesions: A 10-Year Study

    PubMed Central

    KHODDAMI, Maliheh; AKBARZADEH, Ali; MORDAI, Afshin; BIDARI - ZEREHPOUSH, Farahnaz; ALIPOUR, Hamid; SAMADZADEH, Sara; ALIPOUR, Bijan

    2015-01-01

    Objective A definitive diagnosis of central nervous system (CNS) lesions is not known prior to histopathological examination; intraoperative evaluation of the lesion helps the surgeon determine the method and endpoint of surgery. In this study, the diagnostic accuracy and pitfalls of using frozen section (FS) of CNS lesions are determined. Materials & Methods In this retrospective study, we analyzed the FS and permanent-section diagnoses of all CNS lesions by reviewing reports from 3 general hospitals between March 2001 and March 2011. Results 273 cases were reviewed, from patients aged 3 to 77 years. 166 (60.4%) had complete concordance between the FS and permanent-section diagnoses, 83 (30.2%) had partial concordance, and 24 cases (9.5%) were discordant. Considering the concordant and partially concordant cases, the accuracy rate was 99.5%, sensitivity was 91.4%, specificity was 99.7%, and the positive and negative predictive values were 88.4% and 99.8%, respectively. Conclusion Our results show the high sensitivity and specificity of FS diagnosis in the evaluation of CNS lesions. A kappa agreement score of 0.88 shows high concordance between FS results and the permanent section. The pathologist's misinterpretation, small biopsy samples (not representative of the entire tumor), suboptimal slides, and inadequate information about tumor location and radiologic findings appear to be the major causes of the discrepancies identified in our study. PMID:25767535
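All of the reported metrics (accuracy, sensitivity, specificity, predictive values, Cohen's kappa) follow from the 2×2 table of frozen-section vs permanent-section results. A sketch with hypothetical counts, not the paper's raw table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy metrics from a 2x2 table of index-test vs
    reference-standard results (here, frozen vs permanent section)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)        # true positives among all diseased
    spec = tn / (tn + fp)        # true negatives among all non-diseased
    ppv = tp / (tp + fp)         # positive predictive value
    npv = tn / (tn + fn)         # negative predictive value
    acc = (tp + tn) / n
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_o = acc
    p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_o - p_e) / (1 - p_e)
    return sens, spec, ppv, npv, acc, kappa

# Hypothetical counts for illustration only
print(diagnostic_metrics(tp=85, fp=2, fn=8, tn=178))
```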

  8. [An experimental study on human bitemarks digital analysis and its accuracy].

    PubMed

    Wu, Yan; Chen, Xinmin; Sun, Dahong

    2005-10-01

    This experiment was designed to study a method of digital analysis of human bitemarks and its accuracy. Human bitemarks were made on dog skin by a human dentition. The related parameters of the bitemarks and of the suspects' dentitions were digitally recorded and managed. A digital picture of each bitemark was obtained, and the suspect's dental study model, bite in wax and bitemark on pig skin were scanned. Overlays were prepared with Adobe Photoshop 5.5, the parameters were measured with AutoCAD R14, and the matches were compared. The results show that digital analysis is a more accurate approach to human bitemark identification. All three methods of collecting evidence (dental study model, bite in wax and bitemark on pig skin) can be used as aids in forensic science, with the dental study model being the most accurate of the three. PMID:16294721

  10. Review article: Diagnostic accuracy of risk stratification tools for patients with chest pain in the rural emergency department: A systematic review.

    PubMed

    Roche, Tina; Jennings, Natasha; Clifford, Stuart; O'connell, Jane; Lutze, Matthew; Gosden, Edward; Hadden, N Fionna; Gardner, Glenn

    2016-10-01

    Risk stratification tools for patients presenting to rural EDs with undifferentiated chest pain enable early definitive treatment of high-risk patients. This systematic review compares the most commonly used risk stratification tools for predicting the risk of a major adverse cardiac event (MACE) in patients presenting to rural EDs with chest pain. A comprehensive search of MEDLINE and Embase for studies published between January 2011 and January 2015 was undertaken. Study quality was assessed using the QUADAS-2 criteria and the PRISMA guidelines. Eleven studies using eight risk stratification tools met the inclusion criteria. The percentage of MACE among patients stratified as suitable for discharge, and the percentage of patients whose scores would have recommended admission but who did not experience a MACE, were used as comparators. Based on a survey of emergency physicians that found a 1% MACE rate acceptable in discharged patients, the EDACS-ADP was considered the best performer: it had one of the lowest rates of MACE in those discharged (3/1148, 0.3%) and discharged one of the highest percentages of patients (44.5%). Only the GRACE tool discharged more patients (69%; all patients with scores <100), but it had a MACE rate of 0.3% in discharged patients. The HFA/CSANZ guidelines achieved zero cases of MACE but discharged only 1.3% of patients. The EDACS-ADP can potentially increase diagnostic efficiency for patients presenting to the ED with chest pain. Further assessment of the tool in a rural context is recommended.
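The review's comparison criterion (what fraction of patients a tool sends home, and whether the MACE rate among them stays within the 1% that surveyed physicians deemed acceptable) reduces to two ratios. A sketch; the total patient count below is back-calculated from the reported 44.5% discharge rate and is an assumption, not a figure stated in the review:

```python
def discharge_performance(n_total, n_discharged, mace_in_discharged,
                          acceptable_mace=0.01):
    """Summarize a chest-pain risk tool: the fraction of patients it
    discharges and whether its missed-MACE rate meets the threshold."""
    discharged_pct = n_discharged / n_total
    mace_rate = mace_in_discharged / n_discharged
    return discharged_pct, mace_rate, mace_rate <= acceptable_mace

# EDACS-ADP figures from the review: 3 MACE among 1148 discharged;
# n_total of 2580 is inferred from the reported 44.5% (hypothetical)
pct, rate, ok = discharge_performance(n_total=2580, n_discharged=1148,
                                      mace_in_discharged=3)
print(round(pct, 3), round(rate, 4), ok)
```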

  11. Tools for Teaching Climate Change Studies

    SciTech Connect

    Maestas, A.M.; Jones, L.A.

    2005-03-18

    The Atmospheric Radiation Measurement Climate Research Facility (ACRF) develops public outreach materials and educational resources for schools. Studies show that science education in rural and indigenous communities improves when educators integrate regional knowledge of climate and environmental issues into school curriculum and public outreach materials. In order to promote understanding of ACRF climate change studies, ACRF Education and Outreach has developed interactive kiosks about climate change for host communities close to the research sites. A kiosk for the North Slope of Alaska (NSA) community was installed at the Iñupiat Heritage Center in 2003, and a kiosk for the Tropical Western Pacific locales will be installed in 2005. The kiosks feature interviews with local community elders, regional agency officials, and Atmospheric Radiation Measurement (ARM) Program scientists, which highlight both research and local observations of some aspects of environmental and climatic change in the Arctic and Pacific. The kiosks offer viewers a unique opportunity to learn about the environmental concerns and knowledge of respected community elders, and to also understand state-of-the-art climate research. An archive of interviews from the communities will also be distributed with supplemental lessons and activities to encourage teachers and students to compare and contrast climate change studies and oral history observations from two distinct locations. The U.S. Department of Energy's ACRF supports education and outreach efforts for communities and schools located near its sites. ACRF Education and Outreach has developed interactive kiosks at the request of the communities to provide an opportunity for the public to learn about climate change from both scientific and indigenous perspectives. Kiosks include interviews with ARM scientists and provide users with basic information about climate change studies as well as interviews with elders and community leaders

  12. A PILOT STUDY OF THE ACCURACY OF CO2 SENSORS IN COMMERCIAL BUILDINGS

    SciTech Connect

    Fisk, William; Fisk, William J.; Faulkner, David; Sullivan, Douglas P.

    2007-09-01

    Carbon dioxide (CO2) sensors are often deployed in commercial buildings to obtain CO2 data that are used to automatically modulate rates of outdoor air supply. The goal is to keep ventilation rates at or above design requirements and to save energy by avoiding ventilation rates exceeding design requirements. However, there have been many anecdotal reports of poor CO2 sensor performance in actual commercial building applications. This study evaluated the accuracy of 44 CO2 sensors located in nine commercial buildings to determine if CO2 sensor performance, in practice, is generally acceptable or problematic. CO2 measurement errors varied widely and were sometimes hundreds of parts per million. Despite its small size, this study provides a strong indication that the accuracy of CO2 sensors, as they are applied and maintained in commercial buildings, is frequently less than needed to measure typical values of maximum one-hour-average indoor-outdoor CO2 concentration differences with less than a 20% error. Thus, we conclude that there is a need for more accurate CO2 sensors and/or better sensor maintenance or calibration procedures.
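    The acceptability criterion described above (sensor error under 20% of the typical maximum one-hour-average indoor-outdoor CO2 difference) can be sketched as follows; the ppm values are illustrative, not measurements from the study.

    ```python
    def sensor_acceptable(error_ppm, max_hourly_diff_ppm, tolerance=0.20):
        """True if the sensor error stays within the stated fraction of the
        maximum one-hour-average indoor-outdoor CO2 difference."""
        return error_ppm <= tolerance * max_hourly_diff_ppm

    # A 500 ppm indoor-outdoor difference tolerates at most 100 ppm of error,
    # so errors of "hundreds of ppm" fail the criterion.
    print(sensor_acceptable(75, 500))   # within tolerance
    print(sensor_acceptable(300, 500))  # typical of the worst sensors observed
    ```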

  13. Sex differences in accuracy and precision when judging time to arrival: data from two Internet studies.

    PubMed

    Sanders, Geoff; Sinclair, Kamila

    2011-12-01

    We report two Internet studies that investigated sex differences in the accuracy and precision of judging time to arrival. We used accuracy to mean the ability to match the actual time to arrival and precision to mean the consistency with which each participant made their judgments. Our task was presented as a computer game in which a toy UFO moved obliquely towards the participant through a virtual three-dimensional space en route to a docking station. The UFO disappeared before docking and participants pressed their space bar at the precise moment they thought the UFO would have docked. Study 1 showed it was possible to conduct quantitative studies of spatiotemporal judgments in virtual reality via the Internet and confirmed reports that men are more accurate because women underestimate, but found no difference in precision measured as intra-participant variation. Study 2 repeated Study 1 with five additional presentations of one condition to provide a better measure of precision. Again, men were more accurate than women but there were no sex differences in precision. However, within the coincidence-anticipation timing (CAT) literature, of those studies that report sex differences, a majority found that males are both more accurate and more precise than females. Noting that many CAT studies report no sex differences, we discuss appropriate interpretations of such null findings. While acknowledging that CAT performance may be influenced by experience we suggest that the sex difference may have originated among our ancestors with the evolutionary selection of men for hunting and women for gathering. PMID:21125324
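    The accuracy/precision distinction defined above can be made concrete with a small sketch; the judgment data are invented, only the definitions follow the abstract: accuracy is the mean signed error against the true docking time, precision is the within-participant spread.

    ```python
    import statistics

    true_arrival = 3.0  # seconds until the hidden UFO would have docked
    judgments = [2.6, 2.8, 2.7, 2.9, 2.5]  # one participant's responses (s)

    errors = [j - true_arrival for j in judgments]
    accuracy = statistics.mean(errors)        # negative => underestimation
    precision = statistics.stdev(judgments)   # smaller => more consistent

    print(f"mean signed error: {accuracy:+.2f} s, "
          f"within-participant SD: {precision:.2f} s")
    ```

    A participant can be inaccurate yet precise (consistently early, as here), which is exactly the pattern the studies separate.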

  14. The effect of morphology on spelling and reading accuracy: a study on Italian children

    PubMed Central

    Angelelli, Paola; Marinelli, Chiara Valeria; Burani, Cristina

    2014-01-01

    In opaque orthographies knowledge of morphological information helps in achieving reading and spelling accuracy. In transparent orthographies with regular print-to-sound correspondences, such as Italian, the mappings of orthography onto phonology and phonology onto orthography are in principle sufficient to read and spell most words. The present study aimed to investigate the role of morphology in the reading and spelling accuracy of Italian children as a function of school experience to determine whether morphological facilitation was present in children learning a transparent orthography. The reading and spelling performances of 15 third-grade and 15 fifth-grade typically developing children were analyzed. Children read aloud and spelled both low-frequency words and pseudowords. Low-frequency words were manipulated for the presence of morphological structure (morphemic words vs. non-derived words). Morphemic words could also vary for the frequency (high vs. low) of roots and suffixes. Pseudo-words were made up of either a real root and a real derivational suffix in a combination that does not exist in the Italian language or had no morphological constituents. Results showed that, in Italian, morphological information is a useful resource for both reading and spelling. Typically developing children benefitted from the presence of morphological structure when they read and spelled pseudowords; however, in processing low-frequency words, morphology facilitated reading but not spelling. These findings are discussed in terms of morpho-lexical access and successful cooperation between lexical and sublexical processes in reading and spelling. PMID:25477855

  15. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant from 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and negative likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 out of 77 studies (38%). All authors who provided data at all did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and provide data compared to authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten out of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests and enabled the calculation of an estimate for a third blood test for which previously the data had been insufficient to do so. We did not identify a clear pattern for the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients
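    The quantities recalculated in the review follow directly from a completed 2 × 2 table; a minimal sketch with hypothetical cell counts (not data from any included study) is:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, and positive/negative likelihood ratios
        from the four cells of a diagnostic 2x2 table."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        lr_pos = sens / (1 - spec)   # positive likelihood ratio
        lr_neg = (1 - sens) / spec   # negative likelihood ratio
        return sens, spec, lr_pos, lr_neg

    # Hypothetical counts for one blood test vs. biopsy-confirmed fibrosis
    sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=45, fp=10, fn=15, tn=80)
    print(f"sens={sens:.2f} spec={spec:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
    ```

    This is also why discordant or incomplete reporting matters: without all four cells, none of these quantities can be verified.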

  16. Longitudinal Study: Efficacy of Online Technology Tools for Instructional Use

    NASA Technical Reports Server (NTRS)

    Uenking, Michael D.

    2011-01-01

    Studies show that the student population (secondary and post secondary) is becoming increasingly more technologically savvy. Use of the internet, computers, MP3 players, and other technologies along with online gaming has increased tremendously amongst this population such that it is creating an apparent paradigm shift in the learning modalities of these students. Instructors and facilitators of learning can no longer rely solely on traditional lecture-based lesson formats. In order to achieve student academic success and satisfaction and to increase student retention, instructors must embrace various technology tools that are available and employ them in their lessons. A longitudinal study (January 2009-June 2010) has been performed that encompasses the use of several technology tools in an instructional setting. The study provides further evidence that students not only like the tools that are being used, but prefer that these tools be used to help supplement and enhance instruction.

  17. Accuracy in Rietveld quantitative phase analysis: a comparative study of strictly monochromatic Mo and Cu radiations

    PubMed Central

    León-Reina, L.; García-Maté, M.; Álvarez-Pinazo, G.; Santacruz, I.; Vallcorba, O.; De la Torre, A. G.; Aranda, M. A. G.

    2016-01-01

    This study reports 78 Rietveld quantitative phase analyses using Cu Kα1, Mo Kα1 and synchrotron radiations. Synchrotron powder diffraction has been used to validate the most challenging analyses. From the results for three series with increasing contents of an analyte (an inorganic crystalline phase, an organic crystalline phase and a glass), it is inferred that Rietveld analyses from high-energy Mo Kα1 radiation have slightly better accuracies than those obtained from Cu Kα1 radiation. This behaviour has been established from the results of the calibration graphics obtained through the spiking method and also from Kullback–Leibler distance statistic studies. This outcome is explained, in spite of the lower diffraction power for Mo radiation when compared to Cu radiation, as arising because of the larger volume tested with Mo and also because higher energy allows one to record patterns with fewer systematic errors. The limit of detection (LoD) and limit of quantification (LoQ) have also been established for the studied series. For similar recording times, the LoDs in Cu patterns, ∼0.2 wt%, are slightly lower than those derived from Mo patterns, ∼0.3 wt%. The LoQ for a well crystallized inorganic phase using laboratory powder diffraction was established to be close to 0.10 wt% in stable fits with good precision. However, the accuracy of these analyses was poor with relative errors near to 100%. Only contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%. PMID:27275132

  18. Compensation of kinematic geometric parameters error and comparative study of accuracy testing for robot

    NASA Astrophysics Data System (ADS)

    Du, Liang; Shi, Guangming; Guan, Weibin; Zhong, Yuansheng; Li, Jin

    2014-12-01

    Geometric error is the dominant error source in industrial robots, playing a more significant role than other error factors. A compensation model for kinematic geometric error is proposed in this article. Many methods can be used to test robot accuracy, which raises the question of how to determine which method is better. In this article, two methods for robot accuracy testing are compared: a Laser Tracker System (LTS) and a Three Coordinate Measuring instrument (TCM) were used to test robot accuracy according to the relevant standard. Based on the compensation results, the better method, which can appreciably improve robot accuracy, is identified.

  19. Speed and accuracy of facial expression classification in avoidant personality disorder: a preliminary study.

    PubMed

    Rosenthal, M Zachary; Kim, Kwanguk; Herr, Nathaniel R; Smoski, Moria J; Cheavens, Jennifer S; Lynch, Thomas R; Kosson, David S

    2011-10-01

    The aim of this preliminary study was to examine whether individuals with avoidant personality disorder (APD) could be characterized by deficits in the classification of dynamically presented facial emotional expressions. Using a community sample of adults with APD (n = 17) and non-APD controls (n = 16), speed and accuracy of facial emotional expression recognition was investigated in a task that morphs facial expressions from neutral to prototypical expressions (Multi-Morph Facial Affect Recognition Task; Blair, Colledge, Murray, & Mitchell, 2001). Results indicated that individuals with APD were significantly more likely than controls to make errors when classifying fully expressed fear. However, no differences were found between groups in the speed to correctly classify facial emotional expressions. The findings are some of the first to investigate facial emotional processing in a sample of individuals with APD and point to an underlying deficit in processing social cues that may be involved in the maintenance of APD. PMID:22448805

  20. EM-navigated catheter placement for gynecologic brachytherapy: an accuracy study

    NASA Astrophysics Data System (ADS)

    Mehrtash, Alireza; Damato, Antonio; Pernelle, Guillaume; Barber, Lauren; Farhat, Nabgha; Viswanathan, Akila; Cormack, Robert; Kapur, Tina

    2014-03-01

    Gynecologic malignancies, including cervical, endometrial, ovarian, vaginal and vulvar cancers, cause significant mortality in women worldwide. The standard care for many primary and recurrent gynecologic cancers consists of chemoradiation followed by brachytherapy. In high dose rate (HDR) brachytherapy, intracavitary applicators and/or interstitial needles are placed directly inside the cancerous tissue so as to provide catheters to deliver high doses of radiation. Although technology for the navigation of catheters and needles is well developed for procedures such as prostate biopsy, brain biopsy, and cardiac ablation, it is notably lacking for gynecologic HDR brachytherapy. Using a benchtop study that closely mimics the clinical interstitial gynecologic brachytherapy procedure, we developed a method for evaluating the accuracy of image-guided catheter placement. Future bedside translation of this technology offers the potential benefit of maximizing tumor coverage during catheter placement while avoiding damage to the adjacent organs, for example bladder, rectum and bowel. In the study, two independent experiments were performed on a phantom model to evaluate the targeting accuracy of an electromagnetic (EM) tracking system. The procedure was carried out using a laptop computer (2.1GHz Intel Core i7 computer, 8GB RAM, Windows 7 64-bit), an EM Aurora tracking system with a 1.3mm diameter 6 DOF sensor, and 6F (2 mm) brachytherapy catheters inserted through a Syed-Neblett applicator. The 3D Slicer and PLUS open source software were used to develop the system. The mean of the targeting error was less than 2.9mm, which is comparable to the targeting errors in commercial clinical navigation systems.

  1. EM-Navigated Catheter Placement for Gynecologic Brachytherapy: An Accuracy Study

    PubMed Central

    Mehrtash, Alireza; Damato, Antonio; Pernelle, Guillaume; Barber, Lauren; Farhat, Nabgha; Viswanathan, Akila; Cormack, Robert; Kapur, Tina

    2014-01-01

    Gynecologic malignancies, including cervical, endometrial, ovarian, vaginal and vulvar cancers, cause significant mortality in women worldwide. The standard care for many primary and recurrent gynecologic cancers consists of chemoradiation followed by brachytherapy. In high dose rate (HDR) brachytherapy, intracavitary applicators and/or interstitial needles are placed directly inside the cancerous tissue so as to provide catheters to deliver high doses of radiation. Although technology for the navigation of catheters and needles is well developed for procedures such as prostate biopsy, brain biopsy, and cardiac ablation, it is notably lacking for gynecologic HDR brachytherapy. Using a benchtop study that closely mimics the clinical interstitial gynecologic brachytherapy procedure, we developed a method for evaluating the accuracy of image-guided catheter placement. Future bedside translation of this technology offers the potential benefit of maximizing tumor coverage during catheter placement while avoiding damage to the adjacent organs, for example bladder, rectum and bowel. In the study, two independent experiments were performed on a phantom model to evaluate the targeting accuracy of an electromagnetic (EM) tracking system. The procedure was carried out using a laptop computer (2.1GHz Intel Core i7 computer, 8GB RAM, Windows 7 64-bit), an EM Aurora tracking system with a 1.3mm diameter 6 DOF sensor, and 6F (2 mm) brachytherapy catheters inserted through a Syed-Neblett applicator. The 3D Slicer and PLUS open source software were used to develop the system. The mean of the targeting error was less than 2.9mm, which is comparable to the targeting errors in commercial clinical navigation systems. PMID:25076828
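    A hedged sketch of the targeting-error computation summarised above: the Euclidean distance between each EM-tracked catheter tip and its ground-truth target, averaged over insertions. The coordinates below are invented, not the phantom data.

    ```python
    import math

    def targeting_error(tracked, truth):
        """3-D Euclidean distance between tracked and true positions (mm)."""
        return math.dist(tracked, truth)

    # Invented (tracked, ground-truth) tip positions in mm
    pairs = [
        ((10.2, 5.1, 30.4), (11.0, 5.0, 29.0)),
        ((22.5, 8.3, 41.1), (22.0, 9.1, 40.0)),
    ]
    errors = [targeting_error(a, b) for a, b in pairs]
    print(f"mean targeting error: {sum(errors) / len(errors):.2f} mm")
    ```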

  2. Physical Activity Level Improves the Predictive Accuracy of Cardiovascular Disease Risk Score: The ATTICA Study (2002–2012)

    PubMed Central

    Georgousopoulou, Ekavi N.; Panagiotakos, Demosthenes B.; Bougatsas, Dimitrios; Chatzigeorgiou, Michael; Kavouras, Stavros A.; Chrysohoou, Christina; Skoumas, Ioannis; Tousoulis, Dimitrios; Stefanadis, Christodoulos; Pitsavos, Christos

    2016-01-01

    Background: Although physical activity (PA) has long been associated with cardiovascular disease (CVD), assessment of PA status has never been used as a part of CVD risk prediction tools. The aim of the present work was to examine whether the inclusion of PA status in a CVD risk model improves its predictive accuracy. Methods: Data from the 10-year follow-up (2002–2012) of the n = 2020 participants (aged 18–89 years) of the ATTICA prospective study were used to test the research hypothesis. The HellenicSCORE (that incorporates age, sex, smoking, total cholesterol, and systolic blood pressure levels) was calculated to estimate the baseline 10-year CVD risk; assessment of PA status was based on the International Physical Activity Questionnaire. The estimated CVD risk was tested against the observed 10-year incidence (i.e., development of acute coronary syndromes, stroke, or other CVD according to the World Health Organization [WHO]-International Classification of Diseases [ICD]-10 criteria). Changes in the predictive ability of the nested CVD risk model that contained the HellenicSCORE plus PA assessment were evaluated using Harrell's C and net reclassification index. Results: Both HellenicSCORE and PA status were predictors of future CVD events (P < 0.05). However, the estimated classification bias of the model that included only the HellenicSCORE was significantly reduced when PA assessment was included (Harrell's C = 0.012, P = 0.032); this reduction remained significant even when adjusted for diabetes mellitus and dietary habits (P < 0.05). Conclusions: CVD risk scores seem to be more accurate by incorporating individuals’ PA status; thus, they may be more effective tools in primary prevention by efficiently identifying CVD candidates. PMID:27076890
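    The net reclassification index used above has a simple closed form; this sketch uses invented reclassification counts, not the ATTICA data.

    ```python
    def net_reclassification_index(up_ev, down_ev, n_ev, up_non, down_non, n_non):
        """NRI = [P(up|event) - P(down|event)] + [P(down|nonevent) - P(up|nonevent)].
        Positive values mean the added predictor (here, PA status) tends to
        move events up and nonevents down in risk category."""
        return (up_ev - down_ev) / n_ev + (down_non - up_non) / n_non

    # Invented example: 30 of 200 events reclassified upward, 10 downward;
    # 150 of 1800 nonevents reclassified downward, 90 upward.
    nri = net_reclassification_index(30, 10, 200, 90, 150, 1800)
    print(f"NRI = {nri:.3f}")
    ```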

  3. Studying Doctoral Education: Using Activity Theory to Shape Methodological Tools

    ERIC Educational Resources Information Center

    Beauchamp, Catherine; Jazvac-Martek, Marian; McAlpine, Lynn

    2009-01-01

    The study reported here, one part of a larger study on doctoral education, describes a pilot study that used Activity Theory to shape a methodological tool for better understanding the tensions inherent in the doctoral experience. As doctoral students may function within a range of activity systems, we designed data collection protocols based on…

  4. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    PubMed

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV.
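    Tailoring post-test probabilities to a new population's prevalence, as the authors propose, is an application of Bayes' theorem; a minimal sketch with illustrative numbers (not the paper's clinical examples):

    ```python
    def post_test_probabilities(sens, spec, prevalence):
        """PPV and NPV implied by a test's sensitivity/specificity at a
        given disease prevalence (Bayes' theorem)."""
        ppv = (sens * prevalence) / (sens * prevalence + (1 - spec) * (1 - prevalence))
        npv = (spec * (1 - prevalence)) / ((1 - sens) * prevalence + spec * (1 - prevalence))
        return ppv, npv

    # The same test yields very different PPVs at different prevalences,
    # which is why post-test probabilities need tailoring to the new population.
    for prev in (0.05, 0.20, 0.50):
        ppv, npv = post_test_probabilities(sens=0.85, spec=0.90, prevalence=prev)
        print(f"prevalence={prev:.0%}: PPV={ppv:.2f} NPV={npv:.2f}")
    ```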

  5. Study on Improvement of Accuracy in Inertial Photogrammetry by Combining Images with Inertial Measurement Unit

    NASA Astrophysics Data System (ADS)

    Kawasaki, Hideaki; Anzai, Shojiro; Koizumi, Toshio

    2016-06-01

    Inertial photogrammetry is defined as photogrammetry that involves using a camera on which an inertial measurement unit (IMU) is mounted. In inertial photogrammetry, the position and inclination of a shooting camera are calculated using the IMU. An IMU is characterized by error growth caused by time accumulation because acceleration is integrated with respect to time. This study examines the procedure to estimate the position of the camera accurately while shooting using the IMU and the structure from motion (SfM) technology, which is applied in many fields, such as computer vision. When neither the coordinates of the position of the camera nor those of feature points are known, SfM provides a similar positional relationship between the position of the camera and feature points. Therefore, the actual length of positional coordinates is not determined. If the actual length of the position of the camera is unknown, the camera acceleration is obtained by calculating the second order differential of the position of the camera, with respect to the shooting time. The authors had determined the actual length by assigning the position of IMU to the SfM-calculated position. Hence, accuracy decreased because of the error growth, which was the characteristic feature of IMU. In order to solve this problem, a new calculation method was proposed. Using this method, the difference between the IMU-calculated acceleration and the camera-calculated acceleration can be obtained using the method of least squares, and the magnification required for calculating the actual dimension from the position of the camera can be obtained. The actual length can be calculated by multiplying all the SfM point groups by the obtained magnification factor. This calculation method suppresses the error growth, which is due to the time accumulation in IMU, and improves the accuracy of inertial photogrammetry.
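    The scale-recovery idea described above can be sketched in one dimension; this illustrates the least-squares step only and is not the authors' implementation. The unscaled SfM positions are differentiated twice with a central difference, and the single scale factor that best matches the IMU acceleration is then solved for.

    ```python
    def second_diff(positions, dt):
        """Central second difference: acceleration from sampled positions."""
        return [(positions[i - 1] - 2 * positions[i] + positions[i + 1]) / dt**2
                for i in range(1, len(positions) - 1)]

    def scale_factor(a_sfm, a_imu):
        """Least-squares s minimizing sum((s * a_sfm - a_imu)^2)."""
        num = sum(x * y for x, y in zip(a_sfm, a_imu))
        den = sum(x * x for x in a_sfm)
        return num / den

    # 1-D toy example under free fall: the SfM trajectory is the true one
    # divided by 2, so the recovered metric scale should be 2.0.
    dt = 0.1
    p_unscaled = [0.5 * 9.81 * (i * dt) ** 2 / 2.0 for i in range(6)]
    a_sfm = second_diff(p_unscaled, dt)
    a_imu = [9.81] * len(a_sfm)
    print(f"estimated scale: {scale_factor(a_sfm, a_imu):.2f}")
    ```

    Multiplying every SfM point by this factor converts the similarity reconstruction to metric units, which is the step the proposed method makes robust to IMU error growth.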

  6. Model accuracy impact through rescaled observations in hydrological data assimilation studies

    NASA Astrophysics Data System (ADS)

    Tugrul Yilmaz, M.; Crow, Wade T.; Ryu, Dongryeol

    2015-04-01

    Relative magnitudes of signal and noise in soil moisture datasets (e.g. satellite-, model-, station-based) feature significant variability. Optimality of the analysis when assimilating observations into the model depends on the degree to which the differences between the signal variances of model and observations are minimized. Rescaling techniques that aim to reduce such differences generally focus only on matching certain statistics of the model and the observations, while the impact of their relative accuracy on the optimality of the analysis remains unexplored. In this study, the impacts of the relative accuracies of the seasonality and anomaly components of modeled and observation-based soil moisture time series on the optimality of the assimilation analysis are investigated. Experiments using well-controlled synthetic and real datasets are performed, rescaling observations to the model with varying aggressiveness: i) rescaling the entire observation time series as one piece or each month separately; ii) rescaling observation seasonality and anomaly components separately; iii) inserting model seasonality directly into observations while only the anomaly components are rescaled. A simple Antecedent Precipitation Index (API) model is selected in both synthetic and real dataset experiments. Observations are assimilated into the API model using a Kalman filter. Real dataset experiments use the Land Parameter Retrieval Model (LPRM) product based on the Advanced Microwave Scanning Radiometer on the Aqua platform (AMSR-E) observations over four USDA-ARS watersheds, while ground-based observations collected over these watersheds are used for validation. Results show that it is favorable to rescale observations more aggressively to a model when the model is more accurate (higher signal to noise ratio than the observations), while rescaling the observations strongly to the model degrades the analysis if the observations are more skillful.
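    A common implementation of the basic statistical-matching step discussed above is to map the observations onto the model's mean and variance; the sketch below uses synthetic numbers, and the study's actual procedure additionally treats seasonality and anomaly components separately.

    ```python
    import statistics

    def rescale(obs, model):
        """Linearly rescale obs to match the model's mean and standard deviation."""
        mu_o, sd_o = statistics.mean(obs), statistics.stdev(obs)
        mu_m, sd_m = statistics.mean(model), statistics.stdev(model)
        return [(x - mu_o) * sd_m / sd_o + mu_m for x in obs]

    obs = [0.10, 0.30, 0.20, 0.40, 0.25]    # e.g. satellite soil moisture
    model = [0.18, 0.26, 0.22, 0.30, 0.24]  # e.g. API-model soil moisture

    rescaled = rescale(obs, model)
    print([round(v, 3) for v in rescaled])
    ```

    After rescaling, the observations share the model's first two moments, which is exactly what makes the choice of how aggressively to rescale an accuracy trade-off rather than a purely statistical one.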

  7. Screw Placement Accuracy and Outcomes Following O-Arm-Navigated Atlantoaxial Fusion: A Feasibility Study.

    PubMed

    Smith, Jacob D; Jack, Megan M; Harn, Nicholas R; Bertsch, Judson R; Arnold, Paul M

    2016-06-01

    Study Design Case series of seven patients. Objective C2 stabilization can be challenging due to the complex anatomy of the upper cervical vertebrae. We describe seven cases of C1-C2 fusion using intraoperative navigation to aid in the screw placement at the atlantoaxial (C1-C2) junction. Methods Between 2011 and 2014, seven patients underwent posterior atlantoaxial fusion using intraoperative frameless stereotactic O-arm Surgical Imaging and StealthStation Surgical Navigation System (Medtronic, Inc., Minneapolis, Minnesota, United States). Outcome measures included screw accuracy, neurologic status, radiation dosing, and surgical complications. Results Four patients had fusion at C1-C2 only, and in the remaining three, fixation extended down to C3 due to anatomical considerations for screw placement recognized on intraoperative imaging. Out of 30 screws placed, all demonstrated minimal divergence from desired placement in either C1 lateral mass, C2 pedicle, or C3 lateral mass. No neurovascular compromise was seen following the use of intraoperative guided screw placement. The average radiation dosing due to intraoperative imaging was 39.0 mGy. All patients were followed for a minimum of 12 months. All patients went on to solid fusion. Conclusion C1-C2 fusion using computed tomography-guided navigation is a safe and effective way to treat atlantoaxial instability. Intraoperative neuronavigation allows for high accuracy of screw placement, limits complications by sparing injury to the critical structures in the upper cervical spine, and can help surgeons make intraoperative decisions regarding complex pathology. PMID:27190736

  8. Accuracy evaluation of the optical surface monitoring system on EDGE linear accelerator in a phantom study.

    PubMed

    Mancosu, Pietro; Fogliata, Antonella; Stravato, Antonella; Tomatis, Stefano; Cozzi, Luca; Scorsetti, Marta

    2016-01-01

    Frameless stereotactic radiosurgery (SRS) requires dedicated systems to monitor the patient position during the treatment to avoid target underdosage due to involuntary shift. The optical surface monitoring system (OSMS) is here evaluated in a phantom-based study. The new EDGE linear accelerator from Varian (Varian, Palo Alto, CA) integrates, for cranial lesions, the common cone beam computed tomography (CBCT) and kV-MV portal images with the optical surface monitoring system (OSMS), a device able to detect the patient's face movements in real time along all 6 couch axes (vertical, longitudinal, lateral, rotation along the vertical axis, pitch, and roll). We have evaluated the OSMS imaging capability in checking the phantom's position and monitoring its motion. With this aim, a home-made cranial phantom was developed to evaluate the OSMS accuracy in 4 different experiments: (1) comparison with CBCT in isocenter location, (2) capability to recognize predefined shifts up to 2° or 3cm, (3) evaluation at different couch angles, (4) ability to properly reconstruct the surface when the linac gantry visually blocks one of the cameras. The OSMS system was shown, with a phantom, to be accurate for positioning with respect to the CBCT imaging system, with differences of 0.6 ± 0.3mm for linear vector displacement and a maximum rotational inaccuracy of 0.3°. OSMS presented an accuracy of 0.3mm for displacements up to 1cm and 1°, and 0.5mm for larger displacements. Different couch angles (45° and 90°) induced a mean vector uncertainty < 0.4mm. Coverage of 1 camera produced an uncertainty < 0.5mm. Translations and rotations of a phantom can thus be accurately detected with the optical surface monitoring system.

  9. Determining immunisation status of children from history: a diagnostic accuracy study

    PubMed Central

    Nohavicka, Laura; Ashdown, Helen F; Kelly, Dominic F

    2013-01-01

    Objectives Children presenting unplanned to healthcare services are routinely asked about previous immunisations as part of their assessment. We aimed to assess the accuracy of screening children for immunisation status by history. Design Diagnostic accuracy study. We compared information from patient history by a retrospective review of notes and used a central database of child immunisation records as the reference standard. Setting Paediatric emergency department in a tertiary hospital in Oxford, UK. Participants Consecutive children aged 6 months to 6 years presenting over a 2-month period. Outcome measures Proportion of children with documented immunisation history; sensitivity and specificity of detecting overdue immunisations by history compared to central records. Results 1166 notes were surveyed. 76.3% children were asked about immunisations. The proportion of children who were fully immunised on central records was 93.1%. History had a sensitivity of 41.3% (95% CI 27% to 56.8%) and a specificity of 98.7% (95% CI 97.5% to 99.4%) for detecting those who were overdue. Negative predictive value was 95.8% (95% CI 93.9% to 97.2%). Only around a third of children with overdue immunisations are detected by the current screening methods, and approximately 1 in 20 children stated as being up to date are in fact overdue. Conclusions History had poor sensitivity for identifying overdue immunisation. Strategies to improve detection of children overdue with immunisation should focus on alternative strategies for alerting clinicians, such as linkage of community and hospital electronic records. PMID:23633421
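    Sensitivity figures like the 41.3% above are binomial proportions with confidence intervals; a hedged sketch using the Wilson score interval follows. The counts are hypothetical, chosen only to reproduce the headline proportion, and the study's own CI method may differ.

    ```python
    import math

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    detected, overdue = 19, 46  # hypothetical: 19 of 46 overdue children detected
    lo, hi = wilson_ci(detected, overdue)
    print(f"sensitivity {detected / overdue:.1%} (95% CI {lo:.1%} to {hi:.1%})")
    ```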

  10. Dual-energy CT for the diagnosis of gout: an accuracy and diagnostic yield study

    PubMed Central

    Bongartz, Tim; Glazebrook, Katrina N; Kavros, Steven J; Murthy, Naveen S; Merry, Stephen P; Franz, Walter B; Michet, Clement J; Veetil, Barath M Akkara; Davis, John M; Mason, Thomas G; Warrington, Kenneth J; Ytterberg, Steven R; Matteson, Eric L; Crowson, Cynthia S; Leng, Shuai; McCollough, Cynthia H

    2015-01-01

    Objectives To assess the accuracy of dual-energy CT (DECT) for diagnosing gout, and to explore whether it can have any impact on clinical decision making beyond the established diagnostic approach using polarising microscopy of synovial fluid (diagnostic yield). Methods Diagnostic single-centre study of 40 patients with active gout, and 41 individuals with other types of joint disease. Sensitivity and specificity of DECT for diagnosing gout was calculated against a combined reference standard (polarising and electron microscopy of synovial fluid). To explore the diagnostic yield of DECT scanning, a third cohort was assembled consisting of patients with inflammatory arthritis and risk factors for gout who had negative synovial fluid polarising microscopy results. Among these patients, the proportion of subjects with DECT findings indicating a diagnosis of gout was assessed. Results The sensitivity and specificity of DECT for diagnosing gout was 0.90 (95% CI 0.76 to 0.97) and 0.83 (95% CI 0.68 to 0.93), respectively. All false negative patients were observed among patients with acute, recent-onset gout. All false positive patients had advanced knee osteoarthritis. DECT in the diagnostic yield cohort revealed evidence of uric acid deposition in 14 out of 30 patients (46.7%). Conclusions DECT provides good diagnostic accuracy for detection of monosodium urate (MSU) deposits in patients with gout. However, sensitivity is lower in patients with recent-onset disease. DECT has a significant impact on clinical decision making when gout is suspected, but polarising microscopy of synovial fluid fails to demonstrate the presence of MSU crystals. PMID:24671771

  11. Cocontraction of Pairs of Muscles around Joints May Improve an Accuracy of a Reaching Movement: a Numerical Simulation Study

    NASA Astrophysics Data System (ADS)

    Ueyama, Yuki; Miyashita, Eizo

    2011-06-01

    Each joint is spanned by paired muscle groups: agonist and antagonist muscles. Simultaneous activation of agonist and antagonist muscles around a joint, called cocontraction, is thought to increase joint stiffness in order to decelerate hand speed and improve movement accuracy. However, it has not been clear how cocontraction and joint stiffness vary during movements. In this study, muscle activation and joint stiffness in reaching movements were studied under several end-point accuracy requirements using a 2-joint, 6-muscle model and an approximately optimal control. The numerical simulation revealed time-varying cocontraction and joint stiffness, indicating that the strength of cocontraction and the joint stiffness increased synchronously as the required accuracy level increased. We conclude that cocontraction may increase joint stiffness to meet higher movement-accuracy requirements.

  12. Effects of sampling and mineral separation on accuracy of detrital zircon studies

    NASA Astrophysics Data System (ADS)

    Sláma, Jiří; Košler, Jan

    2012-05-01

    We investigated some of the sampling and mineral separation biases that affect the accuracy of detrital zircon provenance studies. The study was carried out on a natural catchment in the Scottish Highlands that represents a simple two-component source system, and on samples of synthetic sediment prepared for this study to test the effects of heavy mineral separation on the resulting zircon age spectra. The results suggest that zircon fertility of the source rocks and the physical properties of zircon are the most important factors affecting the distribution of zircon age populations in the stream sediments. Sample preparation and the selection of zircons for analysis may result in a preferential loss of information from small zircon grains. Together with the preference for larger crystals during handpicking, this can produce a several-fold difference from the real age distribution in the sediment sample. These factors appear to be more important for the reproducibility of zircon age spectra than the number of zircon grains analyzed per sample.

  13. Clinical relevance of studies on the accuracy of visual inspection for detecting caries lesions: a systematic review.

    PubMed

    Gimenez, Thais; Piovesan, Chaiana; Braga, Mariana M; Raggio, Daniela P; Deery, Chris; Ricketts, David N; Ekstrand, Kim R; Mendes, Fausto Medeiros

    2015-01-01

    Although visual inspection is the most commonly used method for caries detection, and consequently the most investigated, studies have not been concerned with the clinical relevance of this procedure. Therefore, we conducted a systematic review in order to perform a critical evaluation considering the clinical relevance and methodological quality of studies on the accuracy of visual inspection for assessing caries lesions. Two independent reviewers searched several databases through July 2013 to identify articles published in English. Other sources were checked to identify unpublished literature. The eligible studies were those which (1) assessed the accuracy of the visual method for detecting caries lesions on occlusal, approximal or smooth surfaces, in primary or permanent teeth, (2) used a reference standard, and (3) reported data about sample size and accuracy of the methods. Aspects related to clinical relevance and the methodological quality of the studies were evaluated. 96 of the 5,578 articles initially identified met the inclusion criteria. In general, most studies failed to consider some clinically relevant aspects: only 1 included study validated the activity status of lesions, no study considered their prognosis, 79 studies did not consider a clinically relevant outcome, and only 1 evaluated a patient-centred outcome. Concerning methodological quality, the majority of the studies presented a high risk of bias in sample selection. In conclusion, studies on the accuracy of the visual method for caries detection should consider clinically relevant outcomes besides accuracy; moreover, they should be conducted with higher methodological quality, mainly regarding sample selection.

  14. What level of accuracy is achievable for preclinical dose painting studies on a clinical irradiation platform?

    PubMed

    Trani, Daniela; Reniers, Brigitte; Persoon, Lucas; Podesta, Mark; Nalbantov, Georgi; Leijenaar, Ralph T H; Granzier, Marlies; Yaromina, Ala; Dubois, Ludwig; Verhaegen, Frank; Lambin, Philippe

    2015-05-01

    Advancements made over the past decades in both molecular imaging and radiotherapy planning and delivery have enabled studies that explore the efficacy of heterogeneous radiation treatment ("dose painting") of solid cancers based on biological information provided by different imaging modalities. In addition to clinical trials, preclinical studies may help contribute to identifying promising dose painting strategies. The goal of this study was twofold: to develop a reproducible positioning and set-up verification protocol for a rat tumor model to be imaged and treated on a clinical platform, and to assess the dosimetric accuracy of dose planning and delivery for both uniform and positron emission tomography-computed tomography (PET-CT) based heterogeneous dose distributions. We employed a syngeneic rat rhabdomyosarcoma model, which was irradiated by volumetric modulated arc therapy (VMAT) with uniform or heterogeneous 6 MV photon dose distributions. The mean dose to the gross tumor volume (GTV) as a whole was kept at 12 Gy for all treatment arms. For the nonuniform plans, the dose was redistributed to treat the 30% of the GTV representing the biological target volume (BTV) with a dose 40% higher than the rest of the GTV (GTV - BTV): ~15 Gy was delivered to the BTV versus ~10.7 Gy to the GTV - BTV. Cone beam computed tomography (CBCT) images acquired for each rat prior to irradiation were used to correctly reposition the tumor and calculate the delivered 3D dose. Film quality assurance was performed using a water-equivalent rat phantom. A comparison between CT or CBCT doses and film measurements resulted in passing rates >98% with a gamma criterion of 3%/2 mm using 2D dose images. Moreover, between the CT and CBCT calculated doses for both uniform and heterogeneous plans, we observed maximum differences of <2% for the mean dose to the tumor and the mean dose to the biological target volumes. In conclusion, we have developed a robust method for dose painting

  15. Combining growth curves when a longitudinal study switches measurement tools

    PubMed Central

    Oleson, Jacob J.; Cavanaugh, Joseph E.; Tomblin, J. Bruce; Walker, Elizabeth; Dunn, Camille

    2014-01-01

    When longitudinal studies are performed to investigate the growth of traits in children, the measurement tool being used to quantify the trait may need to change as the subjects age throughout the study. Changing the measurement tool at some point in the longitudinal study makes the analysis of that growth challenging which, in turn, makes it difficult to determine what other factors influence the growth rate. We developed a Bayesian hierarchical modeling framework that relates the growth curves per individual for each of the different measurement tools and allows for covariates to influence the shapes of the curves by borrowing strength across curves. The method is motivated by and demonstrated by speech perception outcome measurements of children who were implanted with cochlear implants. Researchers are interested in assessing the impact of age at implantation, and comparing the growth rates of children who are implanted under the age of two versus those implanted between the ages of two and four. PMID:24821002

  16. Accuracy of computer-aided template-guided oral implant placement: a prospective clinical study

    PubMed Central

    2014-01-01

    Purpose The aim of the present study was to evaluate the in vivo accuracy of flapless, computer-aided implant placement by comparing the three-dimensional (3D) position of planned and placed implants through an analysis of linear and angular deviations. Methods Implant position was virtually planned using 3D planning software based on the functional and aesthetic requirements of the final restorations. Computer-aided design/computer-assisted manufacture technology was used to transfer the virtual plan to the surgical environment. The 3D position of the planned and placed implants, in terms of the linear deviations of the implant head and apex and the angular deviations of the implant axis, was compared by overlapping the pre- and postoperative computed tomography scans using dedicated software. Results The comparison of 14 implants showed a mean linear deviation of the implant head of 0.56 mm (standard deviation [SD], 0.23), a mean linear deviation of the implant apex of 0.64 mm (SD, 0.29), and a mean angular deviation of the long axis of 2.42° (SD, 1.02). Conclusions In the present study, computer-aided flapless implant surgery seemed to provide several advantages to the clinicians as compared to the standard procedure; however, linear and angular deviations are to be expected. Therefore, accurate presurgical planning taking into account anatomical limitations and prosthetic demands is mandatory to ensure a predictable treatment, without incurring possible intra- and postoperative complications. Graphical Abstract PMID:25177520

  17. Camino® intracranial pressure monitor: prospective study of accuracy and complications

    PubMed Central

    Martinez-Manas, R.; Santamarta, D.; de Campos, J. M; Ferrer, E.

    2000-01-01

    OBJECTIVES—The fibreoptic device is a type of intracranial pressure monitor which seems to offer certain advantages over conventional monitoring systems. This study was undertaken to analyse the accuracy, drift characteristics, and complications of the Camino® fibreoptic device.
METHODS—One hundred and eight Camino® intracranial pressure (ICP) devices, in their three modalities, were implanted during 1997. The most frequent indication for monitoring was severe head injury due to road traffic accidents.
RESULTS—Sixty eight probe tips were cultured; 13.2% of the cases had a positive culture without clinical signs of infection, and 2.9% had a positive culture with clinical signs of ventriculitis. The most common isolated pathogen was Staphylococcus epidermidis. All patients were under cephalosporin prophylaxis during monitoring. Haemorrhage rate in patients without coagulation disorders was 2.1% and 15.3% in patients with coagulation abnormalities. Drift characteristics were studied in 56 cases; there was no drifting from the values expected according to the manufacturer's specifications in 34 probes. There was no relation between direction of the drift and duration of placement, nor between drift and time.
CONCLUSIONS—Although the complication and drift rates were similar to those reported elsewhere, there was no correlation between the direction of the drift and long term monitoring despite the fact that some published papers refer to overestimation of values with time with this type of device.

 PMID:10864608

  18. Micropillar substrates: a tool for studying cell mechanobiology.

    PubMed

    Gupta, Mukund; Kocgozlu, Leyla; Sarangi, Bibhu Ranjan; Margadant, Felix; Ashraf, Mohammed; Ladoux, Benoit

    2015-01-01

    Increasing evidence has shown that mechanical cues from the environment play an important role in cell biology. Mechanotransduction, the study of how cells sense and respond to these mechanical cues, is an active field of research. However, it is still not clear how cells sense and respond to mechanical cues. Thus, new tools are being rapidly developed to quantitatively study cell mechanobiology. In particular, force measurement tools such as micropillar substrates have provided new insights into the underlying mechanisms of mechanosensing. In this chapter, we provide a detailed protocol for the fabrication, characterization, functionalization, and use of micropillar substrates.

  19. Accuracy of positioning and irradiation targeting for multiple targets in intracranial image-guided radiation therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Tominaga, Hirofumi; Araki, Fujio; Shimohigashi, Yoshinobu; Ishihara, Terunobu; Kawasaki, Keiichi; Kanetake, Nagisa; Sakata, Junichi; Iwashita, Yuki

    2014-12-01

    This study investigated the accuracy of positioning and irradiation targeting for multiple off-isocenter targets in intracranial image-guided radiation therapy (IGRT). A phantom with nine circular targets was created to evaluate both accuracies. First, the central point of the isocenter target was positioned with a combination of an ExacTrac x-ray (ETX) and a 6D couch. The positioning accuracy was determined from the deviations of coordinates of the central point in each target obtained from the kV-cone beam computed tomography (kV-CBCT) for IGRT and the planning CT. Similarly, the irradiation targeting accuracy was evaluated from the deviations of the coordinates between the central point of each target and the central point of each multi-leaf collimator (MLC) field for multiple targets. Secondly, the 6D couch was intentionally rotated together with both roll and pitch angles of 0.5° and 1° at the isocenter and similarly the deviations were evaluated. The positioning accuracy for all targets was less than 1 mm after 6D positioning corrections. The irradiation targeting accuracy was up to 1.3 mm in the anteroposterior (AP) direction for a target 87 mm away from isocenter. For the 6D couch rotations with both roll and pitch angles of 0.5° and 1°, the positioning accuracy was up to 1.0 mm and 2.3 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The irradiation targeting accuracy was up to 2.1 mm and 2.6 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The off-isocenter irradiation targeting accuracy became worse than the positioning accuracy. Both off-isocenter accuracies worsened in proportion to rotation angles and the distance from the isocenter to the targets. It is necessary to examine the set-up margin for off-isocenter multiple targets at each institution because irradiation targeting accuracy is peculiar to the linac machine.
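    The positioning and targeting accuracies quoted above are magnitudes of 3D deviation vectors between planned and measured target coordinates. A small sketch of that computation, with hypothetical coordinates (numpy assumed):

```python
import numpy as np

def vector_deviation(planned_mm, measured_mm):
    """Displacement vector and its magnitude (mm) between a planned
    target position and the position measured on kV-CBCT."""
    delta = np.asarray(measured_mm, float) - np.asarray(planned_mm, float)
    return delta, float(np.linalg.norm(delta))

# Hypothetical off-isocenter target 87 mm from the isocenter (mm)
delta, magnitude = vector_deviation([0.0, 87.0, 0.0], [0.4, 87.6, -0.3])
```

    The same per-axis deltas also give the directional (e.g. anteroposterior) deviations the abstract reports separately from the vector magnitude.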

  20. Study of on-machine error identification and compensation methods for micro machine tools

    NASA Astrophysics Data System (ADS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-08-01

    Micro machining plays an important role in the manufacturing of miniature products made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by re-installing the workpiece, the measurement and compensation should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very small, the measurement method should be non-contact. By integrating an image reconstruction method, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, this study developed a vision-based error measurement and compensation method that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual and theoretical cutting points were calculated and used to correct the NC program. With the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results
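    The trajectory auto-correction step described above amounts to mirroring each measured contour error back into the commanded path. A simplified sketch of that idea with hypothetical cutting points; the study's full pipeline additionally covers edge detection, pixel calibration, and contour matching:

```python
import numpy as np

def compensate_toolpath(theoretical_pts, actual_pts):
    """Software-based error compensation: subtract the measured error
    (actual - theoretical) from each theoretical cutting point, so the
    corrected NC program steers the tool back onto the intended contour."""
    theoretical = np.asarray(theoretical_pts, float)
    error = np.asarray(actual_pts, float) - theoretical
    return theoretical - error

# Hypothetical 2D cutting points (mm): machined contour drifted +0.01 mm in x
corrected = compensate_toolpath([[0.0, 0.0], [1.0, 0.0]],
                                [[0.01, 0.0], [1.01, 0.0]])
```

    The corrected points would then be written back into the NC program in place of the original theoretical coordinates.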

  1. An experimental study of the accuracy in measurement of modulation transfer function using an edge method

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Hoon; Kim, Ye-seul; Park, Hye-Suk; Lee, Young-Jin; Kim, Hee-Joung

    2015-03-01

    Image evaluation is necessary in digital radiography (DR), which is widely used in medical imaging. Among image evaluation parameters, the modulation transfer function (MTF) is an important factor in the field of medical imaging and is necessary to obtain the detective quantum efficiency (DQE), which represents the overall performance of the detector signal-to-noise ratio. However, accurate measurement of the MTF is still not easy because of geometric effects, electronic noise, quantum noise, and truncation error. Therefore, in order to improve the accuracy of the MTF, four experimental methods were tested in this study: changing the tube current, applying a smoothing method to the edge spread function (ESF), adjusting the line spread function (LSF) range, and changing the tube angle. Our results showed that fluctuation of the MTF was decreased by high tube current and by smoothing. However, the tube current should not exceed detector saturation, and smoothing the ESF distorts both the ESF and the MTF. In addition, decreasing the LSF range diminished the fluctuation but also the number of sampling points in the MTF, and a high tube angle degraded the MTF. Based on these results, excessively low tube current and the smoothing method should be avoided. An optimal LSF range balancing reduced fluctuation against the number of sampling points in the MTF is necessary, and a precise tube angle is essential to obtain an accurate MTF. In conclusion, our results demonstrated how an accurate MTF can be acquired.
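    The edge-method chain the abstract refers to (ESF → LSF → MTF) can be sketched as follows. This is a bare-bones illustration with a synthetic, noiseless edge standing in for a measured ESF, not the authors' measurement code:

```python
import numpy as np

def mtf_from_esf(esf, pixel_pitch_mm):
    """Edge method: differentiate the edge spread function (ESF) to get the
    line spread function (LSF), then take the normalised FFT magnitude."""
    lsf = np.gradient(np.asarray(esf, float))            # LSF = d(ESF)/dx
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                        # unity at zero frequency
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)  # cycles/mm
    return freqs, mtf

# Synthetic noiseless edge profile standing in for a measured ESF
x = np.linspace(-5.0, 5.0, 256)
esf = 0.5 * (1.0 + np.tanh(x))
freqs, mtf = mtf_from_esf(esf, pixel_pitch_mm=0.1)
```

    The smoothing, LSF-range, and tube-angle choices discussed in the abstract all act on the `esf` and `lsf` arrays before the FFT step, which is why they change the fluctuation and sampling of the resulting MTF.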

  2. Surgical accuracy under virtual reality-enhanced ultrasound guidance: an in vitro epicardial dynamic study.

    PubMed

    Linte, Cristian A; Wiles, Andrew D; Moore, John; Wedlake, Chris; Peters, Terry M

    2008-01-01

    In the context of our ongoing objective to reduce morbidity associated with cardiac interventions, minimizing invasiveness has inevitably led to more limited visual access to the target tissues. To ameliorate these challenges, we provide the surgeons with a complex visualization environment that integrates interventional ultrasound imaging augmented with pre-operative anatomical models and virtual surgical instruments within a virtual reality environment. In this paper we present an in vitro study on a cardiac phantom aimed at assessing the feasibility and targeting accuracy of our surgical system in comparison to traditional ultrasound imaging for intra-operative surgical guidance. The 'therapy delivery' was modeled in the context of a blinded procedure, mimicking a closed-chest intervention. Four users navigated a tracked pointer to a target, under guidance provided by either US imaging or virtual reality-enhanced ultrasound. A 2.8 mm RMS targeting error was achieved using our novel surgical system, which is adequate from both a clinical and an engineering perspective, given the inherent procedure requirements and limitations of the system. PMID:19162594

  3. Accuracy, Effectiveness and Improvement of Vibration-Based Maintenance in Paper Mills: Case Studies

    NASA Astrophysics Data System (ADS)

    AL-NAJJAR, B.

    2000-01-01

    Many current vibration-based maintenance (VBM) policies for rolling element bearings do not exploit as much of the bearings' useful lives as possible. Evidence and indications that the bearings' mean effective lives can be prolonged through more accurate diagnosis and prognosis are confirmed in cases of faulty bearing installation, faulty machinery design, harsh environmental conditions, and when a bearing is replaced as soon as its vibration level exceeds the normal. Analysis of data from roller bearings at two paper mills suggests that longer bearing lives can be safely achieved by increasing the accuracy of the vibration data. This paper relates bearing failure modes to the observed vibration spectra and their development patterns over the bearings' lives. A systematic approach, which describes the objectives and performance of studies in two Swedish paper mills, is presented. Explanations of the mechanisms behind some frequent modes of early failure and ways to avoid them are suggested. It is shown theoretically, and partly confirmed by the analysis of (unfortunately incomplete) data from two paper mills over many years, that accurate prediction of remaining bearing life requires: (a) enough vibration measurements, (b) numerate records of operating conditions, (c) better discrimination between frequencies in the spectrum and (d) correlation of (b) and (c). This is because life prediction depends on precise knowledge of primary, harmonic and side-band frequency amplitudes and their development over time. Further, the available data, which are collected from relevant plant activities, can be utilized to perform cyclic improvements in diagnosis, prognosis, experience and economy.

  4. Functional Knowledge Transfer for High-accuracy Prediction of Under-studied Biological Processes

    PubMed Central

    Rowland, Jessica; Guan, Yuanfang; Bongo, Lars A.; Burdine, Rebecca D.; Troyanskaya, Olga G.

    2013-01-01

    A key challenge in genetics is identifying the functional roles of genes in pathways. Numerous functional genomics techniques (e.g. machine learning) that predict protein function have been developed to address this question. These methods generally build from existing annotations of genes to pathways and thus are often unable to identify additional genes participating in processes that are not already well studied. Many of these processes are well studied in some organism, but not necessarily in an investigator's organism of interest. Sequence-based search methods (e.g. BLAST) have been used to transfer such annotation information between organisms. We demonstrate that functional genomics can complement traditional sequence similarity to improve the transfer of gene annotations between organisms. Our method transfers annotations only when functionally appropriate as determined by genomic data and can be used with any prediction algorithm to combine transferred gene function knowledge with organism-specific high-throughput data to enable accurate function prediction. We show that diverse state-of-the-art machine learning algorithms leveraging functional knowledge transfer (FKT) dramatically improve their accuracy in predicting gene-pathway membership, particularly for processes with little experimental knowledge in an organism. We also show that our method compares favorably to annotation transfer by sequence similarity. Next, we deploy FKT with a state-of-the-art SVM classifier to predict novel genes for 11,000 biological processes across six diverse organisms and expand the coverage of accurate function predictions to processes that are often ignored because of a dearth of annotated genes in an organism. Finally, we perform in vivo experimental investigation in Danio rerio and confirm the regulatory role of our top predicted novel gene, wnt5b, in leftward cell migration during heart development. FKT is immediately applicable to many bioinformatics techniques and will

  5. Accuracy of the TeacherInsight Online Perceiver Tool in Determining the Effectiveness of High Rated and Low Rated Math and Science New Hire Teachers Following One Year and Three Years of Single School District Employment

    NASA Astrophysics Data System (ADS)

    Reina Romo, Celia

    The purpose of this study is to explore the accuracy of the TeacherInsight online perceiver tool (Gallup University, 2007) in determining the effectiveness of high rated (n = 14) and low rated (n = 36) math and science new hire teachers on summative appraisal ratings, completed graduate coursework, and retention status following one year and three years of single school district employment. Using the TeacherInsight tool to recognize qualities of an effective teacher, the study compared other factors that contribute to teacher effectiveness as they pertain to teacher retention rate, summative appraisal rating scores, and completed graduate coursework. The findings of this study show significant growth for teachers with a low TeacherInsight rating after three years of employment. While there was no significant difference for teachers with a high TeacherInsight rating in the performance domains of Planning and Preparation, and Instruction, these teachers did show significant growth in Domain II, Classroom Environment, and Domain IV, Professional Responsibilities. Only teachers with a low TeacherInsight rating showed a statistically significant difference in their participation in graduate coursework after three years with the district. Both groups of teachers maintained a consistent rate of retention in math and science that was higher than the research school district's overall average. There was no statistical difference between teachers with a high or a low TeacherInsight rating when compared on performance ratings after the first and third year of teaching. The study results support the advantages of professional development activities for teachers in the beginning years of employment.

  6. SMS as a Learning Tool: An Experimental Study

    ERIC Educational Resources Information Center

    Plana, Mar Gutiérrez-Colon; Torrano, Pere Gallardo; Grova, M. Elisa

    2012-01-01

    The aim of this experimental study was to find out the potential of using mobile phones in teaching English as a foreign language (EFL), specifically the use of Short Message Service (SMS) as a support tool in the EFL class. The research questions formulated for this project are the following: (1) Is using SMS messages via a mobile phone an…

  7. Popular Music as a Learning Tool in the Social Studies.

    ERIC Educational Resources Information Center

    Litevich, John A., Jr.

    This teaching guide reflects the belief that popular music is an effective tool for teachers to use in presenting social studies lessons to students. Titles of songs representative of popular music from 1955 to 1982 are listed by subject matter and suggest a possible lesson to be used in teaching that particular issue. Subject areas listed…

  8. Softdesk energy: A case study in early design tool integration

    SciTech Connect

    Gowri, K.; Chassin, D.P.; Friedrich, M.

    1998-04-01

    Softdesk Energy is a design tool that integrates building energy analysis capability into a highly automated production drafting environment (AutoCAD and Softdesk AutoArchitect). This tool gives users of computer-aided design/drafting (CAD) software the opportunity to evaluate the energy impact of design decisions much earlier in the design process than previously possible with energy analysis software. The authors review the technical challenges of integrating analytic methods into design tools, the opportunities such integrated tools create for building designers, and a usage scenario from the perspective of a current user of Softdesk Energy. A comparison between the simplified calculations in Softdesk Energy and detailed simulations using DOE-2 energy analysis is made to evaluate the applicability of the Softdesk Energy approach. As a unique example of integrating design analysis and drafting, Softdesk Energy provides an opportunity to study the strengths and weaknesses of integrated design tools and gives some insight into the future direction of CAD software towards meeting the needs of diverse design disciplines.

  9. Hepatic perfusion in a tumor model using DCE-CT: an accuracy and precision study

    NASA Astrophysics Data System (ADS)

    Stewart, Errol E.; Chen, Xiaogang; Hadway, Jennifer; Lee, Ting-Yim

    2008-08-01

    In the current study we investigate the accuracy and precision of hepatic perfusion measurements based on the Johnson and Wilson model with the adiabatic approximation. VX2 carcinoma cells were implanted into the livers of New Zealand white rabbits. Simultaneous dynamic contrast-enhanced computed tomography (DCE-CT) and radiolabeled microsphere studies were performed under steady-state normo-, hyper- and hypo-capnia. The hepatic arterial blood flows (HABF) obtained using the two techniques were compared with ANOVA. Precision was assessed by the coefficient of variation (CV). Under normo-capnia the microsphere HABF were 51.9 ± 4.2, 40.7 ± 4.9 and 99.7 ± 6.0 ml min^-1 (100 g)^-1, while the DCE-CT HABF were 50.0 ± 5.7, 37.1 ± 4.5 and 99.8 ± 6.8 ml min^-1 (100 g)^-1 in normal tissue, tumor core and rim, respectively. There were no significant differences between HABF measurements obtained with the two techniques (P > 0.05). Furthermore, a strong correlation was observed between HABF values from the two techniques: slope of 0.92 ± 0.05, intercept of 4.62 ± 2.69 ml min^-1 (100 g)^-1 and R^2 = 0.81 ± 0.05 (P < 0.05). The Bland-Altman plot comparing DCE-CT and microsphere HABF measurements gives a mean difference of -0.13 ml min^-1 (100 g)^-1, which is not significantly different from zero. DCE-CT HABF is precise, with CV of 5.7, 24.9 and 1.4% in the normal tissue, tumor core and rim, respectively. Non-invasive measurement of HABF with DCE-CT is accurate and precise. DCE-CT can be an important extension of CT to assess hepatic function besides morphology in liver diseases.
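    The agreement statistics quoted above (Bland-Altman mean difference with limits of agreement, and the coefficient of variation) are straightforward to compute. A sketch with purely illustrative paired HABF values, not the study's data:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    paired measurement techniques, e.g. DCE-CT vs microsphere HABF."""
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

def coefficient_of_variation(values):
    """CV (%) of repeated perfusion measurements: 100 * sd / mean."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Illustrative paired HABF values in ml min^-1 (100 g)^-1
dce_ct      = [48.0, 52.5, 95.0]
microsphere = [50.0, 51.0, 98.0]
bias, lower, upper = bland_altman(dce_ct, microsphere)
```

    A bias close to zero with narrow limits of agreement is what supports the "not significantly different from zero" conclusion in the abstract.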

  10. DNA barcoding and minibarcoding as a powerful tool for feather mite studies.

    PubMed

    Doña, Jorge; Diaz-Real, Javier; Mironov, Sergey; Bazaga, Pilar; Serrano, David; Jovani, Roger

    2015-09-01

    Feather mites (Astigmata: Analgoidea and Pterolichoidea) are among the most abundant and commonly occurring bird ectosymbionts. Basic questions on the ecology and evolution of feather mites remain unanswered because feather mite species identification is often only possible for adult males, and it is laborious even for specialized taxonomists, thus precluding large-scale identifications. Here, we tested DNA barcoding as a useful molecular tool to identify feather mites from passerine birds. Three hundred and sixty-one specimens of 72 species of feather mites from 68 species of European passerine birds from Russia and Spain were barcoded. The accuracy of barcoding and minibarcoding was tested. Moreover, threshold choice (a controversial issue in barcoding studies) was also explored in a new way, by calculating through simulations the effect of sampling effort (in species number and species composition) on threshold calculations. We found one 200-bp minibarcode region that showed the same accuracy as the full-length barcode (602 bp) and was surrounded by conserved regions potentially useful for group-specific degenerate primers. Species identification accuracy was perfect (100%) but decreased when singletons or species of the Proctophyllodes pinnatus group were included. In fact, barcoding confirmed previous taxonomic issues within the P. pinnatus group. Following an integrative taxonomy approach, we compared our barcode study with previous taxonomic knowledge on feather mites, discovering three new putative cryptic species and validating three previous morphologically different (but still undescribed) new species.

  11. Application accuracy study of a semipermanent fiducial system for frameless stereotaxis.

    PubMed

    Vinas, F C; Zamorano, L; Buciuc, R; Li, Q H; Shamsa, F; Jiang, Z; Diaz, F G

    1997-01-01

    The accuracy of a semipermanent fiducial marker system developed at Wayne State University in collaboration with Fisher-Leibinger (Freiburg, Germany) was compared with reference to a standard stereotactic frame (Zamorano-Dujovny Localizing Unit; Fisher-Leibinger). For each patient in our study, 10 semipermanent markers were placed on the skull through a small incision and a pilot hole drilled for the marker; five markers were used for registration, and five were used for comparison. Gadolinium-enhanced magnetic resonance imaging was performed, and, upon registration using both ring and fiducial markers, 184 random points were collected by infrared digitization. All three-dimensional measurements (x, y, z) were converted into distance values correlating each value to the origin by the formula d_ij = sqrt(x_ij² + y_ij² + z_ij²). The mean difference of fiducial coordinates vs. absolute image coordinates was 1.72 ± 0.42 mm (P = .0001), implying no significant difference. The mean difference in d_ij of the stereotactic ring coordinates vs. the absolute image coordinates was 3.35 ± 0.59 mm (P = .00011). The mean difference in the fiducial markers vs. the stereotactic ring coordinates was 2.95 ± 0.45 mm (P = .0001). All tests were declared significant at alpha = .016. The combination of interactive guidance with semipermanent fiducial markers allows for accurate localization of intracranial targets (as accurate or even more accurate than the stereotactic frame). Semipermanent fiducial markers facilitate the procedure logistically, allow for staged procedures (i.e., at the skull base or in epilepsy), and provide access for combined supra- and infratentorial approaches. We believe that the semipermanent fiducial marker system might represent an important development leading toward widespread use of interactive image guidance in conventional neurosurgery. PMID:9484586
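The coordinate-to-distance conversion in the abstract is a plain Euclidean norm from the origin; a minimal sketch (the coordinates below are hypothetical):

```python
import math

def distance_from_origin(x, y, z):
    """Convert a 3-D coordinate (x, y, z) into its distance from the
    origin, as done for each digitized point in the abstract."""
    return math.sqrt(x**2 + y**2 + z**2)

# Hypothetical image-space coordinates in mm.
points = [(10.0, 4.0, 3.0), (-7.5, 12.0, 0.5)]
for p in points:
    print(f"{p} -> d = {distance_from_origin(*p):.2f} mm")
```

Reducing each 3-D point to a scalar distance lets the paired coordinate sets be compared with a single mean-difference statistic, as reported in the study.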

  12. Alaska Case Study: Scientists Venturing Into Field with Journalists Improves Accuracy

    NASA Astrophysics Data System (ADS)

    Ekwurzel, B.; Detjen, J.; Hayes, R.; Nurnberger, L.; Pavangadkar, A.; Poulson, D.

    2008-12-01

    Issues such as climate change, stem cell research, and public health vaccination can be fraught with public misunderstanding, myths, and deliberate distortions of the fundamental science. Journalists are adept at creating print, radio, and video content that is both compelling and informative to the public. Yet most scientists have little time or training to devote to developing media content for the public, and spend little time with journalists who cover science stories. We conducted a case study to examine whether the time and funding invested in exposing journalists to scientists in the field over several days would improve the accuracy of media stories about complex scientific topics. Twelve journalists were selected from the 70 who applied for a four-day environmental journalism fellowship in Alaska. The final group achieved the goal of a broad spectrum of media outlets (small regional to large national organizations), media (print, radio, online), and experience (early career to senior producers). Reporters met with a diverse group of scientists. The lessons learned and successful techniques will be presented. Initial results demonstrate that stories were highly accurate and rich with audio or visual content for lay audiences. The journalists have also maintained contact with the scientists, asking for leads on emerging stories and seeking new experts who can assist in their reporting. Science-based institutions should devote more funding to foster direct journalist-scientist interactions in the lab and field. These positive goals can be achieved: (1) more accurate dissemination of science information to the public; (2) a broader portion of the scientific community will become a resource to journalists instead of the same eloquent few; (3) scientists will appreciate the skill and pressures of those who survive media downsizing and provide media-savvy content; and (4) the public may incorporate science evidence

  13. Accuracy of migrant landbird habitat maps produced from LANDSAT TM data: Two case studies in southern Belize

    USGS Publications Warehouse

    Spruce, J.P.; Sader, S.; Robbins, C.S.; Dowell, B.A.; Wilson, Marcia H.; Sader, Steven A.

    1995-01-01

    The study investigated the utility of Landsat TM data for producing georeferenced habitat maps for two study areas (Toledo and Stann Creek). Locational and non-site-specific map accuracy was evaluated by stratified random sampling and statistical analysis of satellite classification results (SCR) versus air photo interpretation results (PIR) for the overall classification and individual classes. The effect of classification scheme specificity on map accuracy was also assessed. Decision criteria were developed for the minimum acceptable level of map performance (i.e., classification accuracy and scheme specificity). A satellite map was deemed acceptable if it had a useful degree of classification specificity, plus either an adequate overall locational agreement (≥ 70%) and/or non-site-specific agreement (chi-square goodness-of-fit test results indicating insufficient evidence to reject the null hypothesis that the overall classification distributions for the SCR and PIR are equal). For the most detailed revised classification, overall locational accuracy ranges from 52% (5 classes) for Toledo to 63% (9 classes) for Stann Creek. For the least detailed revised classification, overall locational accuracy ranges from 91% (2 classes) for Toledo to 86% (5 classes) for Stann Creek. Considering both locational and non-site-specific accuracy results, the most detailed yet sufficiently accurate classification for both sites includes low/medium/tall broadleaf forest, broadleaf forest scrub, and herb-dominated openings. For these classifications, the overall locational accuracy is 72% for Toledo (4 classes) and 75% for Stann Creek (7 classes). This level of classification detail is suitable for aiding many analyses of migrant landbird habitat use.
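The non-site-specific agreement criterion described above is a chi-square goodness-of-fit test on class distributions; a minimal sketch with hypothetical class counts, not the study's data:

```python
# Chi-square goodness-of-fit comparing the satellite-map (SCR) class
# distribution against the photo-interpreted (PIR) reference distribution.
scr_counts = [120, 45, 30, 25]   # observed class counts (satellite map)
pir_counts = [110, 50, 35, 25]   # reference class counts (air photos)

total_scr = sum(scr_counts)
total_pir = sum(pir_counts)
# Expected counts under the null hypothesis: the SCR map follows the
# PIR class proportions.
expected = [total_scr * c / total_pir for c in pir_counts]

chi2 = sum((o - e) ** 2 / e for o, e in zip(scr_counts, expected))
critical = 7.815   # chi-square critical value, df = 3, alpha = 0.05
print(f"chi2 = {chi2:.3f}; reject null: {chi2 > critical}")
```

With these counts the statistic stays below the critical value, i.e. insufficient evidence to reject the null, which is exactly the "acceptable" outcome the decision criteria look for.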

  14. Accuracy of Person-Fit Statistics: A Monte Carlo Study of the Influence of Aberrance Rates

    ERIC Educational Resources Information Center

    St-Onge, Christina; Valois, Pierre; Abdous, Belkacem; Germain, Stephane

    2011-01-01

    Using a Monte Carlo experimental design, this research examined the relationship between answer patterns' aberrance rates and person-fit statistics (PFS) accuracy. It was observed that as the aberrance rate increased, the detection rates of PFS also increased until, in some situations, a peak was reached and then the detection rates of PFS…

  15. Accuracy of Range Restriction Correction with Multiple Imputation in Small and Moderate Samples: A Simulation Study

    ERIC Educational Resources Information Center

    Pfaffel, Andreas; Spiel, Christiane

    2016-01-01

    Approaches to correcting correlation coefficients for range restriction have been developed under the framework of large sample theory. The accuracy of missing data techniques for correcting correlation coefficients for range restriction has thus far only been investigated with relatively large samples. However, researchers and evaluators are…

  16. Applying Signal-Detection Theory to the Study of Observer Accuracy and Bias in Behavioral Assessment

    ERIC Educational Resources Information Center

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Bellaci, Emily; Miller, Jonathan; Karp, Hilary; Mahmood, Angela; Strobel, Maggie; Mullen, Shelley; Keyl, Alice; Toupard, Alexis

    2010-01-01

    We evaluated the feasibility and utility of a laboratory model for examining observer accuracy within the framework of signal-detection theory (SDT). Sixty-one individuals collected data on aggression while viewing videotaped segments of simulated teacher-child interactions. The purpose of Experiment 1 was to determine if brief feedback and…

  17. Accuracy of stated energy contents of restaurant foods in a multi-site study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Context National recommendations for prevention and treatment of obesity emphasize reducing energy intake. Foods purchased in restaurants provide approximately 35% of daily energy intake, but the accuracy of information on the energy contents of these foods is unknown. Objective To examine the a...

  18. A study of the accuracy of wing calculations based on different schemes

    NASA Astrophysics Data System (ADS)

    Putilin, S. I.; Savchenko, V. T.

    Solutions obtained for the same flow problem using different vortex and computation point distribution laws are presented. The problem considered here is the plane problem for a plate near a circular contour, which is relevant to the design of wings with optimal load distribution and low aspect ratio wings in a bounded fluid. Ways to improve the accuracy of the results are discussed.

  19. Accuracy of navigation-assisted acetabular component positioning studied by computed tomography measurements: methods and results.

    PubMed

    Ybinger, Thomas; Kumpan, W; Hoffart, H E; Muschalik, B; Bullmann, W; Zweymüller, K

    2007-09-01

    The postoperative position of the acetabular component is key for the outcome of total hip arthroplasty. Various aids have been developed to support the surgeon during implant placement. In a prospective study involving 4 centers, the computer-recorded cup alignment of 37 hip systems at the end of navigation-assisted surgery was compared with the cup angles measured on postoperative computerized tomograms. This comparison showed an average difference of 3.5 degrees (SD, 4.4 degrees) for inclination and 6.5 degrees (SD, 7.3 degrees) for anteversion angles. The differences in inclination correlated with the thickness of the soft tissue overlying the anterior superior iliac spine (r = 0.44; P = .007), whereas the differences in anteversion showed a correlation with the thickness of the soft tissue overlying the pubic tubercles (r = 0.52; P = .001). In centers experienced in the use of navigational tools, deviations were smaller than in units with little experience in their use. PMID:17826270

  20. Tools of the trade: studying molecular networks in plants.

    PubMed

    Proost, Sebastian; Mutwil, Marek

    2016-04-01

    Driven by recent technological improvements, genes can now be studied in a larger biological context. Genes and their protein products rarely operate as single entities, and large-scale mapping of protein-protein interactions can unveil the molecular complexes that form in the cell to carry out various functions. Expression analysis under multiple conditions, supplemented with protein-DNA binding data, can highlight when genes are active and how they are regulated. Representing these data as networks and finding strongly connected sub-graphs has proven to be a powerful tool for predicting the function of unknown genes. As such networks gradually become available for various plant species, it becomes possible to study how networks evolve. This review summarizes currently available network data and related tools for plants. Furthermore, we aim to provide an outlook of future analyses that can be done in plants based on work done in other fields.
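The idea of mining connected sub-graphs for function prediction, where an unannotated gene inherits candidate functions from its network neighbourhood, can be sketched on a toy undirected co-expression network (the gene names are hypothetical):

```python
from collections import deque

# Toy undirected co-expression network; an edge links two co-expressed genes.
edges = [("geneA", "geneB"), ("geneB", "geneC"), ("geneD", "geneE")]

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def component(start):
    """All genes reachable from `start` (breadth-first search). An
    unannotated member of a tight component is a candidate for the
    functions of its annotated neighbours."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(component("geneA")))
```

Real network analyses weight edges by co-expression strength and look for densely connected modules rather than bare connectivity, but the traversal above is the common core.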

  1. Influence of Blood on the Accuracy of Raypex 5 and Root ZX Electronic Foramen Locators: An In Vivo Study.

    PubMed

    Saatchi, Masoud; Aminozarbian, Mohammad Ghasem; Noormohammadi, Hamid; Baghaei, Badri

    2016-01-01

    The aim of this study was to evaluate in vivo the accuracy of the Raypex 5 and Root ZX electronic foramen locators (EFLs) in the presence of blood in the root canal space. Forty single-canal teeth scheduled for extraction were selected. Access cavity was prepared and coronal enlargement was carried out. Approximately two drops of blood were collected by finger prick and injected into the root canal space. The electronic working length (EWL) of each tooth by each device was established twice before (NB group) and after (WB group) injecting blood into the root canal. The tooth was extracted and the actual working length (AWL) was determined. Data were analyzed using McNemar's test. The accuracy rates of Raypex 5 and Root ZX within 0.5 mm in the NB group were 88.9% and 91.5%, with 83.3% and 86.2% in the WB group, respectively. There were no significant differences between the accuracy of each EFL in the two groups (p>0.05). Considering the NB and WB groups, there were no statistically significant differences in the accuracy of the EFLs (p>0.05). The presence of blood in the root canal space did not influence the accuracy of the EFLs.
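McNemar's test, used above for the paired no-blood/with-blood outcomes per tooth, depends only on the discordant pairs; a minimal sketch with hypothetical counts, not the study's data:

```python
# Paired accurate/inaccurate outcomes for one EFL measured twice per
# tooth (no-blood vs. with-blood). Only discordant pairs enter the test:
b = 3   # accurate without blood, inaccurate with blood (hypothetical)
c = 1   # inaccurate without blood, accurate with blood (hypothetical)

# McNemar's chi-square statistic with continuity correction, df = 1.
chi2 = (abs(b - c) - 1) ** 2 / (b + c)
critical = 3.841   # chi-square critical value, df = 1, alpha = 0.05
print(f"chi2 = {chi2:.3f}; significant: {chi2 > critical}")
```

With so few discordant pairs an exact binomial version of the test would normally be preferred; the chi-square form is shown because it is the textbook formulation.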

  3. Comparative Accuracy Evaluation of Fine-Scale Global and Local Digital Surface Models: The Tshwane Case Study I

    NASA Astrophysics Data System (ADS)

    Breytenbach, A.

    2016-10-01

    Conducted in the City of Tshwane, South Africa, this study set out to test the accuracy of DSMs derived locally from different remotely sensed data. VHR digital mapping camera stereo-pairs, tri-stereo imagery collected by a Pléiades satellite and data acquired by the TanDEM-X InSAR satellite configuration were fundamental in the construction of seamless DSM products at different postings, namely 2 m, 4 m and 12 m. The three DSMs were sampled against independent control points originating from validated airborne LiDAR data. The reference surfaces were derived from the same dense point cloud at grid resolutions corresponding to those of the samples. The absolute and relative positional accuracies were computed using well-known DEM error metrics and accuracy statistics. Overall vertical accuracies were also assessed and compared across seven slope classes and nine primary land cover classes. Although all three DSMs displayed significantly more vertical errors where solid waterbodies, dense natural and/or alien woody vegetation and, to a lesser degree, urban residential areas with significant canopy cover were encountered, all three surpassed their expected positional accuracies overall.

  4. Polarization as a tool for studying particle properties

    SciTech Connect

    Grosse-Wiesmann, P.

    1988-05-01

    The use of polarized beams in e⁺e⁻ collisions at the Z⁰ pole provides a powerful tool for the separation of the charge and spin of the produced fermions. Such a separation is essential for many investigations of particle properties. It is shown that this technique can be used to substantially improve studies of CP violation in neutral B mesons and the charged structure of τ decays.

  5. Design and Preliminary Accuracy Studies of an MRI-Guided Transrectal Prostate Intervention System

    PubMed Central

    Krieger, Axel; Csoma, Csaba; Iordachita, Iulian I.; Guion, Peter; Singh, Anurag K.; Fichtinger, Gabor; Whitcomb, Louis L.

    2012-01-01

    This paper reports a novel system for magnetic resonance imaging (MRI) guided transrectal prostate interventions, such as needle biopsy, fiducial marker placement, and therapy delivery. The system utilizes a hybrid tracking method, comprising passive fiducial tracking for initial registration and subsequent incremental motion measurement along the degrees of freedom using fiber-optic encoders and mechanical scales. Targeting accuracy of the system is evaluated in prostate phantom experiments. Achieved targeting accuracy and procedure times were found to compare favorably with existing systems using passive and active tracking methods. Moreover, the portable design of the system, using only standard MRI image sequences and minimal custom scanner interfacing, allows the system to be easily used on different MRI scanners. PMID:18044553

  6. Recognition Accuracy Using 3D Endoscopic Images for Superficial Gastrointestinal Cancer: A Crossover Study

    PubMed Central

    Kaise, Mitsuru; Kikuchi, Daisuke; Iizuka, Toshiro; Fukuma, Yumiko; Kuribayashi, Yasutaka; Tanaka, Masami; Toba, Takahito; Furuhata, Tsukasa; Yamashita, Satoshi; Matsui, Akira; Mitani, Toshifumi; Hoteya, Shu

    2016-01-01

    Aim. To determine whether 3D endoscopic images improved recognition accuracy for superficial gastrointestinal cancer compared with 2D images. Methods. We created an image catalog using 2D and 3D images of 20 specimens resected by endoscopic submucosal dissection. The twelve participants were allocated into two groups. Group 1 evaluated only 2D images at first, group 2 evaluated 3D images, and, after an interval of 2 weeks, group 1 next evaluated 3D and group 2 evaluated 2D images. The evaluation items were as follows: (1) diagnostic accuracy of the tumor extent and (2) confidence levels in assessing (a) tumor extent, (b) morphology, (c) microsurface structure, and (d) comprehensive recognition. Results. The use of 3D images resulted in an improvement in diagnostic accuracy in both group 1 (2D: 76.9%, 3D: 78.6%) and group 2 (2D: 79.9%, 3D: 83.6%), with no statistically significant difference. The confidence levels were higher for all items ((a) to (d)) when 3D images were used. With respect to experience, the degree of the improvement showed the following trend: novices > trainees > experts. Conclusions. By conversion into 3D images, there was a significant improvement in the diagnostic confidence level for superficial tumors, and the improvement was greater in individuals with lower endoscopic expertise. PMID:27597863

  10. Additional studies of forest classification accuracy as influenced by multispectral scanner spatial resolution

    NASA Technical Reports Server (NTRS)

    Sadowski, F. E.; Sarno, J. E.

    1976-01-01

    First, an analysis of forest feature signatures was used to help explain the large variation in classification accuracy that can occur among individual forest features for any one case of spatial resolution and the inconsistent changes in classification accuracy that were demonstrated among features as spatial resolution was degraded. Second, the classification rejection threshold was varied in an effort to reduce the large proportion of unclassified resolution elements that previously appeared in the processing of coarse resolution data when a constant rejection threshold was used for all cases of spatial resolution. For the signature analysis, two-channel ellipse plots showing the feature signature distributions for several cases of spatial resolution indicated that the capability of signatures to correctly identify their respective features is dependent on the amount of statistical overlap among signatures. Reductions in signature variance that occur in data of degraded spatial resolution may not necessarily decrease the amount of statistical overlap among signatures having large variance and small mean separations. Features classified by such signatures may thus continue to have similar amounts of misclassified elements in coarser resolution data, and thus, not necessarily improve in classification accuracy.
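The signature-overlap argument can be illustrated numerically: for two equal-variance Gaussian signatures with the decision boundary at the midpoint, the misclassification rate depends on the mean separation measured in standard deviations, so classes with small mean separation remain heavily overlapped even as variance shrinks. A sketch with hypothetical values:

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bayes_error(mean_sep, sigma):
    """Misclassification rate for two equal-variance Gaussian class
    signatures separated by `mean_sep`, boundary at the midpoint."""
    return normal_cdf(-mean_sep / (2.0 * sigma))

# Hypothetical signatures with small mean separation relative to their
# spread: halving the standard deviation (as coarser resolution reduces
# within-class variance) barely reduces the overlap error.
for sigma in (2.0, 1.0):
    print(f"sigma={sigma}: overlap error = {bayes_error(0.5, sigma):.3f}")
```

This mirrors the abstract's point: when signature means are close, variance reduction alone does not guarantee better classification accuracy.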

  11. Classification Accuracy of MMPI-2 Validity Scales in the Detection of Pain-Related Malingering: A Known-Groups Study

    ERIC Educational Resources Information Center

    Bianchini, Kevin J.; Etherton, Joseph L.; Greve, Kevin W.; Heinly, Matthew T.; Meyers, John E.

    2008-01-01

    The purpose of this study was to determine the accuracy of "Minnesota Multiphasic Personality Inventory" 2nd edition (MMPI-2; Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989) validity indicators in the detection of malingering in clinical patients with chronic pain using a hybrid clinical-known groups/simulator design. The sample consisted…

  12. Improving accuracy and usability of growth charts: case study in Rwanda

    PubMed Central

    Brown, Suzana; McSharry, Patrick

    2016-01-01

    Objectives We evaluate and compare manually collected paper records against electronic records for monitoring the weights of children under the age of 5. Setting Data were collected by 24 community health workers (CHWs) in 2 Rwandan communities, 1 urban and 1 rural. Participants The same CHWs collected paper and electronic records. Paper data contain weight and age for 320 boys and 380 girls. Electronic data contain weight and age for 922 girls and 886 boys. Electronic data were collected over 9 months; most of the data is cross-sectional, with about 330 children with time-series data. Both data sets are compared with the international standard provided by the WHO growth chart. Primary and secondary outcome measures The plan was to collect 2000 individual records for the electronic data set—we finally collected 1878 records. Paper data were collected by the same CHWs, but most data were fragmented and hard to read. We transcribed data only from children for whom we were able to obtain the date of birth, to determine the exact age at the time of measurement. Results Mean absolute error (MAE) and mean absolute percentage error (MAPE) provide a way to quantify the magnitude of the error in using a given model. Comparing a model, log(weight)=a+b log(age), shows that electronic records provide considerable improvements over paper records, with 40% reduction in both performance metrics. Electronic data improve performance over the WHO model by 10% in MAPE and 7% in MAE. Results are statistically significant using the Kolmogorov-Smirnov test at p<0.01. Conclusions This study demonstrates that using modern electronic tools for health data collection is allowing better tracking of health indicators. We have demonstrated that electronic records facilitate development of a country-specific model that is more accurate than the international standard provided by the WHO growth chart. PMID:26817635
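The model comparison described above, fitting log(weight) = a + b·log(age) by least squares and scoring it with MAE and MAPE, can be sketched as follows (the age/weight pairs are hypothetical, not the Rwandan data):

```python
import math

# Hypothetical (age in months, weight in kg) pairs.
data = [(6, 7.5), (12, 9.8), (24, 12.1), (36, 14.0), (48, 15.9)]

# Ordinary least squares in log-log space: log(weight) = a + b*log(age).
xs = [math.log(age) for age, _ in data]
ys = [math.log(w) for _, w in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# Score predictions back on the weight scale.
pred = [math.exp(a + b * math.log(age)) for age, _ in data]
actual = [w for _, w in data]
mae  = sum(abs(p, ) if False else abs(p - w) for p, w in zip(pred, actual)) / n
mape = 100.0 * sum(abs(p - w) / w for p, w in zip(pred, actual)) / n
print(f"a={a:.3f} b={b:.3f} MAE={mae:.3f} kg MAPE={mape:.2f}%")
```

Comparing these error metrics between models fitted to electronic and to paper records is the 40% improvement computation the abstract reports.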

  13. Pre-training assessment tool (JPAT)--a pilot study.

    PubMed

    Chow, J; Bennett, L

    2001-01-01

    A tool for assessing the suitability of candidates for home dialysis (Jo-Pre-training Assessment Tool--JPAT) was developed. JPAT acts as a screening instrument to identify suitable candidates for the home dialysis programme, and therefore increases a patient's chance of learning to manage the programme. JPAT is in the form of an interview questionnaire consisting of 38 assessment items in six domains: physical stability, nutritional status, communication, ability to maintain self-care, psychological suitability and social support. A pilot study was conducted (n = 20, 1996-1997) using a descriptive study design, with subjects randomly selected from an existing dialysis programme. Pearson correlation and 2-tailed tests were employed to explore the relationship between the assessment outcome (i.e. the initial JPAT scores) and the follow up data (i.e. data collected within the seven days following the initial JPAT assessment). Many of the variables attained statistical significance (p < 0.05). The inter-rater reliability was calculated at an average Kappa value of 0.909. Overall, results suggest that JPAT is sufficiently reliable to be used as a tool for assessing patients who suffer from ESRD. PMID:12603073
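The inter-rater agreement reported above (average kappa 0.909) is Cohen's kappa, observed agreement corrected for chance agreement; a minimal sketch with hypothetical ratings, not the JPAT pilot data:

```python
# Two raters scoring the same candidates suitable (1) or unsuitable (0).
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]

n = len(rater1)
# Proportion of candidates on whom the raters agree.
observed = sum(r1 == r2 for r1, r2 in zip(rater1, rater2)) / n

# Chance agreement from each rater's marginal "suitable" frequency.
p1_yes = sum(rater1) / n
p2_yes = sum(rater2) / n
expected = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)

kappa = (observed - expected) / (1 - expected)
print(f"observed={observed:.2f} expected={expected:.2f} kappa={kappa:.3f}")
```

Kappa near 1 indicates agreement well beyond chance; the 0.909 reported for JPAT would conventionally be read as almost perfect inter-rater reliability.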

  15. Experimental Tools to Study Molecular Recognition within the Nanoparticle Corona

    PubMed Central

    Landry, Markita P.; Kruss, Sebastian; Nelson, Justin T.; Bisker, Gili; Iverson, Nicole M.; Reuel, Nigel F.; Strano, Michael S.

    2014-01-01

    Advancements in optical nanosensor development have enabled the design of sensors using synthetic molecular recognition elements through a recently developed method called Corona Phase Molecular Recognition (CoPhMoRe). The synthetic sensors resulting from these design principles are highly selective for specific analytes, and demonstrate remarkable stability for use under a variety of conditions. An essential element of nanosensor development hinges on the ability to understand the interface between nanoparticles and the associated corona phase surrounding the nanosensor, an environment outside of the range of traditional characterization tools, such as NMR. This review discusses the need for new strategies and instrumentation to study the nanoparticle corona, operating in both in vitro and in vivo environments. Approaches to instrumentation must have the capacity to concurrently monitor nanosensor operation and the molecular changes in the corona phase. A detailed overview of new tools for the understanding of CoPhMoRe mechanisms is provided for future applications. PMID:25184487

  16. Tools to Study SUMO Conjugation in Caenorhabditis elegans.

    PubMed

    Pelisch, Federico; Hay, Ronald T

    2016-01-01

    The cell biology of sumoylation has mostly been studied using transformed cultured cells and yeast. In recent years, genetic analysis has demonstrated important roles for sumoylation in the biology of C. elegans. Here, we expand the existing set of tools making it possible to address the role of sumoylation in the nematode C. elegans using a combination of genetics, imaging, and biochemistry. Most importantly, the dynamics of SUMO conjugation and deconjugation can be followed very precisely both in space and time within living worms. Additionally, the biochemistry of SUMO conjugation and deconjugation can be addressed using recombinant purified components of the C. elegans sumoylation machinery, including E3 ligases and SUMO proteases. These tools and reagents will be useful to gain insights into the biological role of SUMO in the context of a multicellular organism. PMID:27631810

  17. Accuracy of magnetic resonance imaging for measuring maturing cartilage: A phantom study

    PubMed Central

    McKinney, Jennifer R; Sussman, Marshall S; Moineddin, Rahim; Amirabadi, Afsaneh; Rayner, Tammy; Doria, Andrea S

    2016-01-01

    OBJECTIVES: To evaluate the accuracy of magnetic resonance imaging measurements of cartilage tissue-mimicking phantoms and to determine a combination of magnetic resonance imaging parameters to optimize accuracy while minimizing scan time. METHOD: Edge dimensions from 4 rectangular agar phantoms ranging from 10.5 to 14.5 mm in length and 1.25 to 5.5 mm in width were independently measured by two readers using a steel ruler. Coronal T1 spin echo (T1 SE), fast spoiled gradient-recalled echo (FSPGR) and multiplanar gradient-recalled echo (GRE MPGR) sequences were used to obtain phantom images on a 1.5-T scanner. RESULTS: Inter- and intra-reader reliability were high for both direct measurements and for magnetic resonance imaging measurements of phantoms. Statistically significant differences were noted between the mean direct measurements and the mean magnetic resonance imaging measurements for phantom 1 when using a GRE MPGR sequence (512x512 pixels, 1.5-mm slice thickness, 5:49 min scan time), while borderline differences were noted for T1 SE sequences with the following parameters: 320x320 pixels, 1.5-mm slice thickness, 6:11 min scan time; 320x320 pixels, 4-mm slice thickness, 6:11 min scan time; and 512x512 pixels, 1.5-mm slice thickness, 9:48 min scan time. Borderline differences were also noted when using a FSPGR sequence with 512x512 pixels, a 1.5-mm slice thickness and a 3:36 min scan time. CONCLUSIONS: FSPGR sequences, regardless of the magnetic resonance imaging parameter combination used, provided accurate measurements. The GRE MPGR sequence using 512x512 pixels, a 1.5-mm slice thickness and a 5:49 min scan time and, to a lesser degree, all tested T1 SE sequences produced suboptimal accuracy when measuring the widest phantom. PMID:27464298

  18. A statistical study of the surface accuracy of a planar truss beam

    NASA Technical Reports Server (NTRS)

    Kenner, W. Scott; Fichter, W. B.

    1990-01-01

    Surface error statistics for single-layer and double-layer planar truss beams with random member-length errors were calculated using a Monte-Carlo technique in conjunction with finite-element analysis. Surface error was calculated in terms of the normal distance from a regression line to the surface nodes of the distorted beam. Results for both single-layer and double-layer beams indicate that a minimum root-mean-square surface error can be achieved by optimizing the depth-to-length ratio of a truss beam. The statically indeterminate double-layer beams can provide greater surface accuracy, though at the expense of significantly greater complexity.
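
    The Monte-Carlo procedure described above can be sketched in simplified form: here Gaussian noise on the surface-node offsets stands in for the finite-element response to random member-length errors (an assumption of this sketch), and surface error is the RMS normal distance to a least-squares regression line:

```python
import math
import random

def rms_surface_error(xs, ys):
    """RMS normal distance from surface nodes to their least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    denom = math.sqrt(1.0 + slope ** 2)
    return math.sqrt(sum(((y - slope * x - intercept) / denom) ** 2
                         for x, y in zip(xs, ys)) / n)

random.seed(0)
trials = 2000
xs = [0.5 * i for i in range(21)]          # 10 m beam, a node every 0.5 m
errs = []
for _ in range(trials):
    # node offsets induced by random member-length errors (sigma = 0.1 mm)
    ys = [random.gauss(0.0, 1e-4) for _ in xs]
    errs.append(rms_surface_error(xs, ys))
print(sum(errs) / trials)                  # mean RMS surface error (m)
```

In the study itself the node offsets come from finite-element analysis of the truss, so they depend on beam depth; sweeping that parameter is what reveals the optimal depth-to-length ratio.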

  19. Agricultural case studies of classification accuracy, spectral resolution, and model over-fitting.

    PubMed

    Nansen, Christian; Geremias, Leandro Delalibera; Xue, Yingen; Huang, Fangneng; Parra, Jose Roberto

    2013-11-01

    This paper describes the relationship between spectral resolution and classification accuracy in analyses of hyperspectral imaging data acquired from crop leaves. The main scope is to discuss and reduce the risk of model over-fitting. Over-fitting of a classification model occurs when too many and/or irrelevant model terms are included (i.e., a large number of spectral bands), and it may lead to low robustness/repeatability when the classification model is applied to independent validation data. We outline a simple way to quantify the level of model over-fitting by comparing the observed classification accuracies with those obtained from explanatory random data. Hyperspectral imaging data were acquired from two crop-insect pest systems: (1) potato psyllid (Bactericera cockerelli) infestations of individual bell pepper plants (Capsicum annuum) with the acquisition of hyperspectral imaging data under controlled-light conditions (data set 1), and (2) sugarcane borer (Diatraea saccharalis) infestations of individual maize plants (Zea mays) with the acquisition of hyperspectral imaging data from the same plants under two markedly different image-acquisition conditions (data sets 2a and b). For each data set, reflectance data were analyzed based on seven spectral resolutions by dividing 160 spectral bands from 405 to 907 nm into 4, 16, 32, 40, 53, 80, or 160 bands. In the two data sets, similar classification results were obtained with spectral resolutions ranging from 3.1 to 12.6 nm. Thus, the size of the initial input data could be reduced fourfold with only a negligible loss of classification accuracy. In the analysis of data set 1, several validation approaches all demonstrated consistently that insect-induced stress could be accurately detected and that therefore there was little indication of model over-fitting. In the analyses of data set 2, inconsistent validation results were obtained and the observed classification accuracy (81.06%) was only a few percentage
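
    The fourfold reduction in spectral resolution mentioned above amounts to averaging blocks of adjacent bands. A minimal sketch of such rebinning on a synthetic reflectance spectrum (the values are illustrative, not the study's data):

```python
def rebin_spectrum(reflectance, n_out):
    """Average adjacent bands; len(reflectance) must be divisible by n_out."""
    n_in = len(reflectance)
    assert n_in % n_out == 0
    width = n_in // n_out
    return [sum(reflectance[i * width:(i + 1) * width]) / width
            for i in range(n_out)]

# 160 synthetic bands (405-907 nm) rebinned fourfold to 40 bands
spectrum = [0.1 + 0.001 * i for i in range(160)]
coarse = rebin_spectrum(spectrum, 40)
print(len(coarse))  # -> 40
```

Fewer, wider bands mean fewer model terms, which is exactly the over-fitting safeguard the paper argues for.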

  20. Comparative evaluation of dimensional accuracy of different polyvinyl siloxane putty-wash impression techniques-in vitro study.

    PubMed Central

    Dugal, Ramandeep; Railkar, Bhargavi; Musani, Smita

    2013-01-01

    Background: Dimensional accuracy when making impressions is crucial to the quality of fixed prosthodontic treatment, and the impression technique is a critical factor affecting this accuracy. The purpose of this in vitro study was to compare the dimensional accuracy of the casts obtained from one-step double-mix and two-step double-mix polyvinyl siloxane putty-wash impression techniques using three different spacer thicknesses (0.5 mm, 1 mm and 1.5 mm), in order to determine the impression technique that displays the maximum linear dimensional accuracy. Materials & Methods: A mild steel model with 2 abutment preparations was fabricated, and impressions were made 15 times with each technique. All impressions were made with an addition-reaction silicone impression material (Express, 3M ESPE) and custom-made perforated metal trays. The 1-step putty/light-body impressions were made with simultaneous use of putty and light-body materials. The 2-step putty/light-body impressions were made with 0.5-mm-, 1-mm- and 1.5-mm-thick metal-prefabricated spacer caps. The accuracy of the 4 different impression techniques was assessed by measuring 7 dimensions (intra- and inter-abutment) (20-μm accuracy) on stone casts poured from the impressions of the mild steel model. The data were analyzed by one-sample t-test. Results: The stone dies obtained with all the techniques had significantly larger or smaller dimensions as compared to those of the mild steel model (P<0.05). The order from highest to lowest deviation from the mild steel model was: single-step putty/light body, 2-step putty/light body with 0.5 mm spacer thickness, 2-step putty/light body with 1.5 mm spacer thickness, and 2-step putty/light body with 1 mm spacer thickness. Significant differences among all of the groups for both absolute dimensions of the stone dies, and their standard deviations from the master model (P<0.05), were noted. Conclusions: The 2-step putty/light-body impression technique with 1mm spacer thickness was

  1. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  2. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  3. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  4. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio and number of control surfaces. A doublet-lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. However, all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool. Therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.

  5. A retrospective study to validate an intraoperative robotic classification system for assessing the accuracy of kirschner wire (K-wire) placements with postoperative computed tomography classification system for assessing the accuracy of pedicle screw placements

    PubMed Central

    Tsai, Tai-Hsin; Wu, Dong-Syuan; Su, Yu-Feng; Wu, Chieh-Hsin; Lin, Chih-Lung

    2016-01-01

    The purpose of this retrospective study was to validate an intraoperative robotic grading classification system for assessing the accuracy of Kirschner-wire (K-wire) placements against the postoperative computed tomography (CT)-based classification system for assessing the accuracy of pedicle screw placements. We conducted a retrospective review of prospectively collected data from 35 consecutive patients who underwent instrumentation with 176 robot-assisted pedicle screws at Kaohsiung Medical University Hospital from September 2014 to November 2015. During the operation, we used a robotic grading classification system to verify the intraoperative accuracy of K-wire placements. Three months after surgery, we used the common CT-based classification system to assess the postoperative accuracy of pedicle screw placements. The distributions of accuracy between the intraoperative robot-assisted and various postoperative CT-based classification systems were compared using kappa statistics of agreement. The intraoperative accuracies of K-wire placements before and after repositioning were classified as excellent (131/176, 74.4% and 133/176, 75.6%, respectively), satisfactory (36/176, 20.5% and 41/176, 23.3%, respectively), and malpositioned (9/176, 5.1% and 2/176, 1.1%, respectively); the placements were then evaluated under the postoperative CT-based classification systems. No screw placements were evaluated as unacceptable under any of these systems. Kappa statistics revealed no significant differences between the proposed system and the aforementioned classification systems (P <0.001). Our results revealed no significant differences between the intraoperative robotic grading system and various postoperative CT-based grading systems. The robotic grading classification system is a feasible method for evaluating the accuracy of K-wire placements. Using the intraoperative robot grading system to classify the accuracy of K-wire placements enables predicting the postoperative accuracy of

  6. A retrospective study to validate an intraoperative robotic classification system for assessing the accuracy of kirschner wire (K-wire) placements with postoperative computed tomography classification system for assessing the accuracy of pedicle screw placements.

    PubMed

    Tsai, Tai-Hsin; Wu, Dong-Syuan; Su, Yu-Feng; Wu, Chieh-Hsin; Lin, Chih-Lung

    2016-09-01

    The purpose of this retrospective study was to validate an intraoperative robotic grading classification system for assessing the accuracy of Kirschner-wire (K-wire) placements against the postoperative computed tomography (CT)-based classification system for assessing the accuracy of pedicle screw placements. We conducted a retrospective review of prospectively collected data from 35 consecutive patients who underwent instrumentation with 176 robot-assisted pedicle screws at Kaohsiung Medical University Hospital from September 2014 to November 2015. During the operation, we used a robotic grading classification system to verify the intraoperative accuracy of K-wire placements. Three months after surgery, we used the common CT-based classification system to assess the postoperative accuracy of pedicle screw placements. The distributions of accuracy between the intraoperative robot-assisted and various postoperative CT-based classification systems were compared using kappa statistics of agreement. The intraoperative accuracies of K-wire placements before and after repositioning were classified as excellent (131/176, 74.4% and 133/176, 75.6%, respectively), satisfactory (36/176, 20.5% and 41/176, 23.3%, respectively), and malpositioned (9/176, 5.1% and 2/176, 1.1%, respectively); the placements were then evaluated under the postoperative CT-based classification systems. No screw placements were evaluated as unacceptable under any of these systems. Kappa statistics revealed no significant differences between the proposed system and the aforementioned classification systems (P <0.001). Our results revealed no significant differences between the intraoperative robotic grading system and various postoperative CT-based grading systems. The robotic grading classification system is a feasible method for evaluating the accuracy of K-wire placements. Using the intraoperative robot grading system to classify the accuracy of K-wire placements enables predicting the postoperative accuracy of pedicle screw

  7. A study on the grinding accuracy of a copy milling machine for dental use.

    PubMed

    Seido, T; Hasegawa, K; Kawada, E; Oda, Y

    1997-08-01

    In order to assess the relationship between the profiling pressure of a copy milling machine for dental use and the accuracy of the dimensions of the objects produced, a Celay System was used to profile a metal-cylinder model, and its dimensions were compared with the workpieces. The results showed that, when a cylindrical model with a diameter of 6 mm was subjected to freehand profiling, the mean processing error of the object produced was -0.026 mm, and the profiling pressure was 4.6 gf to 131.7 gf (mean: 76.6 gf). However, the fluctuations in profiling pressure decreased by 1/5 during profiling operations when accessories that fixed the profiling pressure and the profiling loci were used. Moreover, while the processing error when the model was profiled at a profiling pressure of 76 gf or less was a mere 0.005 mm, at 110 gf it rose to 0.022 mm. Based on the above, the processing error that occurred as a result of profiling pressure appeared to affect processing accuracy. On the basis of these findings, profiling must be performed with as light a pressure as possible in order to obtain accurate products using the copy milling machine for dental use, and the results suggested the necessity of attaching pressure-control devices to the copy milling machine to make accurate restorations.

  8. The effect of a low radiation CT protocol on accuracy of CT guided implant migration measurement: A cadaver study.

    PubMed

    Boettner, Friedrich; Sculco, Peter K; Lipman, Joseph; Saboeiro, Gregory; Renner, Lisa; Faschingbauer, Martin

    2016-04-01

    The current study compared the impact of low radiation CT protocols on the accuracy, repeatability, and inter- and intra-observer variability of implant migration studies in total hip arthroplasty. Two total hip replacements were performed in two human cadavers and six tantalum beads were inserted into the femur similar to radiostereometric analysis. Six different 28 mm heads (-3 mm, 0 mm, 2.5 mm, 5.0 mm, 7.5 mm, and 10 mm) were added to simulate five reproducible translations (maximum total point migration) of the center of the head. Three CT scans with varying levels of radiation were performed for each head position. The effective dose (mSv) was 3.8 mSv for Protocol A (standard protocol), 0.7 mSv for Protocol B and 1.6 mSv for Protocol C. Implant migration was measured in a 3-D analysis software (Geomagic Studio 7). The accuracy was 0.16 mm for CT Protocol A, 0.13 mm for Protocol B and 0.14 mm for Protocol C; The repeatability was 0.22 mm for CT Protocol A, 0.18 mm for Protocol B and 0.20 mm for Protocol C; ICC for inter observer reliability was 0.89, intra observer reliability was 0.95. The difference in accuracy between standard protocol A and the two low radiation protocols (B, C) was less than 0.05 mm. The accuracy, inter- and intra-observer reliability of all three CT protocols is comparable to radiostereometric analysis. Reducing the CT radiation exposure to numbers similar to an AP Pelvis radiograph (0.7 mSv protocol B) does not affect the accuracy of implant migration measurements. PMID:26425921

  9. The effect of a low radiation CT protocol on accuracy of CT guided implant migration measurement: A cadaver study.

    PubMed

    Boettner, Friedrich; Sculco, Peter K; Lipman, Joseph; Saboeiro, Gregory; Renner, Lisa; Faschingbauer, Martin

    2016-04-01

    The current study compared the impact of low radiation CT protocols on the accuracy, repeatability, and inter- and intra-observer variability of implant migration studies in total hip arthroplasty. Two total hip replacements were performed in two human cadavers and six tantalum beads were inserted into the femur similar to radiostereometric analysis. Six different 28 mm heads (-3 mm, 0 mm, 2.5 mm, 5.0 mm, 7.5 mm, and 10 mm) were added to simulate five reproducible translations (maximum total point migration) of the center of the head. Three CT scans with varying levels of radiation were performed for each head position. The effective dose (mSv) was 3.8 mSv for Protocol A (standard protocol), 0.7 mSv for Protocol B and 1.6 mSv for Protocol C. Implant migration was measured in a 3-D analysis software (Geomagic Studio 7). The accuracy was 0.16 mm for CT Protocol A, 0.13 mm for Protocol B and 0.14 mm for Protocol C; The repeatability was 0.22 mm for CT Protocol A, 0.18 mm for Protocol B and 0.20 mm for Protocol C; ICC for inter observer reliability was 0.89, intra observer reliability was 0.95. The difference in accuracy between standard protocol A and the two low radiation protocols (B, C) was less than 0.05 mm. The accuracy, inter- and intra-observer reliability of all three CT protocols is comparable to radiostereometric analysis. Reducing the CT radiation exposure to numbers similar to an AP Pelvis radiograph (0.7 mSv protocol B) does not affect the accuracy of implant migration measurements.

  10. Accuracy Study of the Space-Time CE/SE Method for Computational Aeroacoustics Problems Involving Shock Waves

    NASA Technical Reports Server (NTRS)

    Wang, Xiao Yen; Chang, Sin-Chung; Jorgenson, Philip C. E.

    1999-01-01

    The space-time conservation element and solution element (CE/SE) method is used to study the sound-shock interaction problem. The order of accuracy of numerical schemes is investigated. The linear model problem, governed by the 1-D scalar convection equation, the sound-shock interaction problem governed by the 1-D Euler equations, and the 1-D shock-tube problem, which involves moving shock waves and contact surfaces, are solved to investigate the order of accuracy of numerical schemes. It is concluded that the accuracy of the CE/SE numerical scheme with designed 2nd-order accuracy becomes 1st order when a moving shock wave exists. However, the absolute error in the CE/SE solution downstream of the shock wave is on the same order as that obtained using a fourth-order accurate essentially nonoscillatory (ENO) scheme. No special techniques are used for either high-frequency low-amplitude waves or shock waves.
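
    The drop from 2nd- to 1st-order accuracy reported above is conventionally quantified by comparing errors on two grid refinements, p = log(e_h / e_{h/r}) / log(r). A small sketch with illustrative error values (not taken from the study):

```python
import math

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Observed order of accuracy p = log(e_h / e_{h/r}) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

# Smooth region: halving the grid spacing cuts the error ~4x -> 2nd order
print(observed_order(4.0e-3, 1.0e-3))  # -> 2.0
# Near a moving shock: the error drops only ~2x -> 1st order
print(observed_order(4.0e-3, 2.0e-3))  # -> 1.0
```

The study's conclusion is exactly this pattern: the designed 2nd-order CE/SE scheme degrades to 1st order across a moving shock, while its absolute error remains competitive with a 4th-order ENO scheme.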

  11. Toward robust deconvolution of pass-through paleomagnetic measurements: new tool to estimate magnetometer sensor response and laser interferometry of sample positioning accuracy

    NASA Astrophysics Data System (ADS)

    Oda, Hirokuni; Xuan, Chuang; Yamamoto, Yuhji

    2016-07-01

    Pass-through superconducting rock magnetometers (SRM) offer rapid and high-precision remanence measurements for continuous samples that are essential for modern paleomagnetism studies. However, continuous SRM measurements are inevitably smoothed and distorted due to the convolution effect of the SRM sensor response. Deconvolution is necessary to restore accurate magnetization from pass-through SRM data, and robust deconvolution requires a reliable estimate of the SRM sensor response as well as an understanding of the uncertainties associated with the SRM measurement system. In this paper, we use the SRM at Kochi Core Center (KCC), Japan, as an example to introduce a new tool and procedure for accurate and efficient estimation of SRM sensor response. To quantify uncertainties associated with the SRM measurement due to track positioning errors and test their effects on deconvolution, we employed laser interferometry for precise monitoring of track positions both with and without placing a u-channel sample on the SRM tray. The acquired KCC SRM sensor response shows a significant cross-term of Z-axis magnetization on the X-axis pick-up coil and full widths of ~46-54 mm at half-maximum response for the three pick-up coils, which are significantly narrower than those (~73-80 mm) for the liquid He-free SRM at Oregon State University. Laser interferometry measurements on the KCC SRM tracking system indicate positioning uncertainties of ~0.1-0.2 and ~0.5 mm for tracking with and without a u-channel sample on the tray, respectively. Positioning errors appear to have reproducible components of up to ~0.5 mm, possibly due to patterns or damage on the tray surface or the rope used for the tracking system. Deconvolution of 50,000 simulated measurement data with realistic error introduced based on the position uncertainties indicates that although the SRM tracking system has recognizable positioning uncertainties, they do not significantly debilitate the use of deconvolution to accurately restore high
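
    The convolution effect described above can be illustrated with a toy forward model: the measured signal is the true magnetization profile convolved with a normalized sensor response. Both the step profile and the triangular response below are hypothetical, not the KCC SRM's actual response:

```python
def convolve_response(profile, response):
    """Forward model: sensor output = profile (*) normalized response."""
    out = [0.0] * (len(profile) + len(response) - 1)
    for i, p in enumerate(profile):
        for j, r in enumerate(response):
            out[i + j] += p * r
    return out

# Hypothetical sharp magnetization step, 5 samples wide
true_profile = [0.0] * 20 + [1.0] * 5 + [0.0] * 20
# Hypothetical triangular sensor response, normalized to unit area
response = [1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0]
total = sum(response)
response = [r / total for r in response]
measured = convolve_response(true_profile, response)
# Smoothing lowers and broadens the peak while conserving the total moment
print(round(max(measured), 2))  # -> 0.76
```

Deconvolution is the inverse of this operation, which is why an accurate estimate of the response function, and of the sample positions at which the output is sampled, is essential.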

  12. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5nm, it becomes crucial to include also systematic error contributions which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to metrology inaccuracy ~10nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: Imaging overlay and DBO (1st order diffraction based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of measurement quality metric, results in optimal overlay accuracy.

  13. Accuracy studies with carbon clusters at the Penning trap mass spectrometer TRIGA-TRAP

    NASA Astrophysics Data System (ADS)

    Ketelaer, J.; Beyer, T.; Blaum, K.; Block, M.; Eberhardt, K.; Eibach, M.; Herfurth, F.; Smorra, C.; Nagy, Sz.

    2010-05-01

    Extensive cross-reference measurements of well-known frequency ratios using various sizes of carbon cluster ions ¹²Cₙ⁺ (10 ≤ n ≤ 23) were performed to determine the effects limiting the accuracy of mass measurements at the Penning-trap facility TRIGA-TRAP. Two major contributions to the uncertainty of a mass measurement have been identified. Fluctuations of the magnetic field cause an uncertainty in the frequency ratio, due to the required calibration by a reference ion, of u(ν_ref)/ν_ref = 6(2) × 10⁻¹¹/min × Δt. A mass-dependent systematic shift of the frequency ratio of ε_m(r)/r = −2.2(2) × 10⁻⁹ × (m − m_ref)/u has been found as well. Finally, the nuclide ¹⁹⁷Au was used as a cross-check since its mass is already known with an uncertainty of 0.6 keV.
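
    The calibration term quoted above implies a relative frequency-ratio uncertainty that grows linearly with the time between reference measurements. A tiny sketch of that scaling, using the drift rate from the abstract and a hypothetical 30-minute calibration gap:

```python
def drift_uncertainty(delta_t_min, rate_per_min=6e-11):
    """Relative frequency-ratio uncertainty accumulated from magnetic-field
    drift over delta_t_min minutes between reference-ion calibrations."""
    return rate_per_min * delta_t_min

# Hypothetical 30-minute gap between calibrations
print(f"{drift_uncertainty(30.0):.1e}")  # -> 1.8e-09
```

This is why reference measurements are interleaved as frequently as practical: shortening Δt directly shrinks this contribution to the mass-measurement uncertainty.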

  14. Study on Improving the Accuracy of Satellite Measurement in Urban Areas

    NASA Astrophysics Data System (ADS)

    Matsushita, Takashi; Tanaka, Toshiyuki

    GPS/GNSS (Global Positioning System/Global Navigation Satellite System) is a 3D positioning system that uses space satellites to measure a receiver's current position. Recently, many people use GPS for navigation in cars and cellular phones, so a positioning accuracy of several meters is required to satisfy users' needs. However, the measurement error can reach hundreds of meters in urban areas. One reason is that the receiver fails to measure the pseudorange accurately due to multipath from buildings and other structures. The other reason is that the satellite constellation is biased because of the reduced number of observable satellites. Therefore, we propose methods for reducing multipath error and mitigating the lack of visible satellites. At present, although multipath error can be reduced by choke-ring antennas and special correlators, these approaches make the antenna expensive, large, or complex. We devised methods that reduce multipath error using only measurement data. With these methods, we can reduce the size of the receiver and still use satellites whose signals contain multipath error in the measurement. We achieved an improvement from 35.3 m to 30.5 m in 2drms by this method. Overall, we achieved about a 69% improvement in 2drms and about a 5% increase in measurement rate. We conclude that we succeeded not only in improving measurement accuracy but also in increasing the measurement rate in urban areas. The results show that our proposed method is effective for urban-area measurement.
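    The 2drms figure quoted above is a standard horizontal accuracy metric: twice the root of the summed east/north position variances about the mean fix (roughly a 95% containment radius for circular Gaussian errors). A minimal computation, with toy position fixes:

```python
import math
import statistics

# 2drms: twice the distance root-mean-square of horizontal position scatter.
def two_drms(east_m, north_m):
    """east_m/north_m: position-fix offsets (metres) about the true position."""
    return 2.0 * math.sqrt(statistics.pvariance(east_m) +
                           statistics.pvariance(north_m))

# toy scatter of five position fixes
east  = [3.0, -2.0, 1.0, -1.0, -1.0]
north = [4.0, -3.0, 2.0, -2.0, -1.0]
print(f"2drms = {two_drms(east, north):.2f} m")
```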

  15. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    PubMed

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse but frequently perform better than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer.
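    The marginal building block of such a model is the beta-binomial distribution for the true-positive (or true-negative) count in one study. A minimal sketch of its pmf via log-gamma functions is below; the copula linkage between the two margins is omitted, and the parameter values are illustrative only.

```python
from math import exp, lgamma

# Beta-binomial pmf, the marginal distribution for e.g. the number of true
# positives among n diseased patients in one study.

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def betabinom_pmf(k, n, a, b):
    """P(K = k) for K ~ BetaBinomial(n, a, b)."""
    log_choose = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return exp(log_choose + log_beta(k + a, n - k + b) - log_beta(a, b))

n, a, b = 50, 8.0, 2.0                # hypothetical study size and shape parameters
total = sum(betabinom_pmf(k, n, a, b) for k in range(n + 1))
mean  = sum(k * betabinom_pmf(k, n, a, b) for k in range(n + 1))
print(f"pmf sums to {total:.6f}; mean = {mean:.2f} (theory: n*a/(a+b) = {n*a/(a+b):.2f})")
```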

  16. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    PubMed

    Storey, Helen L; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates.
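    The simplest form of a composite reference standard is an "any-positive" (OR) rule over the component tests, against which a new index test's sensitivity is computed. The component tests, combination rule, and data below are hypothetical, chosen only to show the mechanics.

```python
# Minimal composite reference standard (CRS): a subject is CRS-positive if any
# component test is positive. Component tests and toy data are hypothetical.

def crs_positive(blood_culture, pcr, serology):
    return blood_culture or pcr or serology

def sensitivity(index_results, reference_results):
    tp  = sum(1 for i, r in zip(index_results, reference_results) if i and r)
    pos = sum(reference_results)
    return tp / pos

# toy data rows: (index test, blood culture, PCR, serology)
subjects = [(1, 1, 0, 0), (1, 0, 1, 0), (0, 0, 0, 1), (1, 1, 1, 0), (0, 0, 0, 0)]
index = [s[0] for s in subjects]
crs   = [int(crs_positive(*s[1:])) for s in subjects]
print(f"sensitivity vs CRS = {sensitivity(index, crs):.2f}")
```

Real CRS proposals (including the criteria discussed in this review) are more nuanced, e.g. weighting components or requiring agreement of several tests, but the evaluation mechanics are the same.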

  17. Manual landmark identification and tracking during the medial rotation test of the shoulder: an accuracy study using three-dimensional ultrasound and motion analysis measures.

    PubMed

    Morrissey, D; Morrissey, M C; Driver, W; King, J B; Woledge, R C

    2008-12-01

    Palpation of movement is a common clinical tool for the assessment of movement in patients with musculoskeletal symptoms. The purpose of this study was to measure the accuracy of palpation of shoulder girdle translation during the medial rotation test (MRT) of the shoulder. The translation of the gleno-humeral and scapulo-thoracic joints was measured using both three-dimensional ultrasound and palpation in order to determine the accuracy of translation tracking during the MRT of the shoulder. Two movements of 11 normal subjects (mean age 24 (SD=4) years, range 19-47 years) were measured. The agreement between measures was good for scapulo-thoracic translation (r=0.83). Gleno-humeral translation was systematically underestimated (p=0.03), although a moderate correlation was found (r=0.65). These results indicate that translation of the measured joints can be tracked by palpation, and further tests of the efficacy of palpation tracking during musculoskeletal assessment may be warranted. PMID:18359266

  18. Linked color imaging application for improving the endoscopic diagnosis accuracy: a pilot study.

    PubMed

    Sun, Xiaotian; Dong, Tenghui; Bi, Yiliang; Min, Min; Shen, Wei; Xu, Yang; Liu, Yan

    2016-01-01

    Endoscopy has been widely used in diagnosing gastrointestinal mucosal lesions. However, there is still a lack of objective endoscopic criteria. Linked color imaging (LCI) is a newly developed endoscopic technique which enhances color contrast. Thus, we investigated the clinical application of LCI and further analyzed pixel brightness in the RGB color model. All the lesions were observed by white light endoscopy (WLE), LCI and blue laser imaging (BLI). Matlab software was used to calculate pixel brightness for red (R), green (G) and blue (B). In the endoscopic images of lesions, LCI had significantly higher R compared with BLI and higher G compared with WLE (all P < 0.05). R/(G + B) was significantly different among the 3 techniques and qualified as a composite LCI marker. Our correlation analysis of endoscopic diagnosis with pathology revealed that LCI was quite consistent with pathological diagnosis (P = 0.000) and that the color could predict certain kinds of lesions. The ROC curve demonstrated that at the cutoff of R/(G + B) = 0.646, the area under the curve was 0.646, and the sensitivity and specificity were 0.514 and 0.773, respectively. Taken together, LCI could improve the efficiency and accuracy of diagnosing gastrointestinal mucosal lesions and benefit target biopsy. R/(G + B) based on pixel brightness may be introduced as an objective criterion for evaluating endoscopic images. PMID:27641243
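    The R/(G+B) marker described above is a simple per-pixel ratio averaged over an image. A toy stand-in for that computation (the study used Matlab; the pixel values here are fabricated, and only the 0.646 threshold comes from the abstract):

```python
# Mean R/(G+B) over an image represented as rows of (R, G, B) pixel tuples.

def r_over_gb(image):
    vals = [r / (g + b) for row in image for (r, g, b) in row if g + b > 0]
    return sum(vals) / len(vals)

# fabricated 2x2 "image"
image = [[(180, 90, 60), (200, 100, 80)],
         [(160, 110, 70), (190, 95, 65)]]
marker = r_over_gb(image)
print(f"R/(G+B) = {marker:.3f}, above 0.646 cutoff: {marker > 0.646}")
```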

  20. A study of the accuracy of neutrally buoyant bubbles used as flow tracers in air

    NASA Technical Reports Server (NTRS)

    Kerho, Michael F.

    1993-01-01

    Research has been performed to determine the accuracy of neutrally buoyant and near-neutrally buoyant bubbles used as flow tracers in air. Theoretical, computational, and experimental results are presented to evaluate the dynamics of bubble trajectories and the factors affecting their ability to trace flow-field streamlines. The equation of motion for a single bubble was obtained and evaluated using a computational scheme to determine the factors which affect a bubble's trajectory. A two-dimensional experiment was also conducted to determine bubble trajectories experimentally in the stagnation region of a NACA 0012 airfoil at 0 deg angle of attack, using a commercially available helium bubble generation system. Physical properties of the experimental bubble trajectories were estimated using the computational scheme. These properties included the density ratio and diameter of the individual bubbles. The helium bubble system was then used to visualize and document the flow field about a 30 deg swept semispan wing with simulated glaze ice. Results were compared to Navier-Stokes calculations and surface oil flow visualization. The theoretical and computational analyses have shown that neutrally buoyant bubbles will trace even the most complex flow patterns. Experimental analysis revealed that the use of bubbles to trace flow patterns should be limited to qualitative measurements unless care is taken to ensure neutral buoyancy, owing to the difficulty of producing neutrally buoyant bubbles.
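    The qualitative mechanism — a bubble's velocity relaxes toward the local flow velocity on a response timescale set by its drag and buoyancy — can be sketched with a deliberately simplified relaxation model. This is not the paper's full equation of motion; it only illustrates why a tracer with a small response time (near-neutral buoyancy) follows the flow faithfully while a sluggish one lags.

```python
# Simplified tracer dynamics: dv/dt = (u_flow - v) / tau, integrated by forward
# Euler. tau is the (hypothetical) velocity-response time of the bubble.

def track(u_flow, tau, dt=1e-4, steps=100, v0=0.0):
    """Return bubble speed after steps*dt seconds in a uniform flow u_flow."""
    v = v0
    for _ in range(steps):
        v += dt * (u_flow - v) / tau
    return v

u = 10.0                          # local flow speed, m/s
for tau in (1e-4, 1e-2):          # responsive vs sluggish bubble
    print(f"tau={tau:g} s -> bubble speed {track(u, tau):.2f} m/s after 10 ms")
```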

  1. Accuracy of the unified approach in maternally influenced traits - illustrated by a simulation study in the honey bee (Apis mellifera)

    PubMed Central

    2013-01-01

    Background The honey bee is an economically important species. With a rapid decline of the honey bee population, it is necessary to implement an improved genetic evaluation methodology. In this study, we investigated the applicability of the unified approach and its impact on the accuracy of estimation of breeding values for maternally influenced traits on a simulated dataset for the honey bee. Due to the limitation to the number of individuals that can be genotyped in a honey bee population, the unified approach can be an efficient strategy to increase the genetic gain and to provide a more accurate estimation of breeding values. We calculated the accuracy of estimated breeding values for two evaluation approaches, the unified approach and the traditional pedigree based approach. We analyzed the effects of different heritabilities as well as genetic correlation between direct and maternal effects on the accuracy of estimation of direct, maternal and overall breeding values (sum of maternal and direct breeding values). The genetic and reproductive biology of the honey bee was accounted for by taking into consideration characteristics such as colony structure, uncertain paternity, overlapping generations and polyandry. In addition, we used a modified numerator relationship matrix and a realistic genome for the honey bee. Results For all values of heritability and correlation, the accuracy of overall estimated breeding values increased significantly with the unified approach. The increase in accuracy was always higher for the case when there was no correlation as compared to the case where a negative correlation existed between maternal and direct effects. Conclusions Our study shows that the unified approach is a useful methodology for genetic evaluation in honey bees, and can contribute immensely to the improvement of traits of apicultural interest such as resistance to Varroa or production and behavioural traits. 
In particular, the study is of great interest for
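    In animal-breeding work like the study above, the "accuracy" of estimated breeding values (EBVs) is conventionally the Pearson correlation between true and estimated breeding values. A minimal sketch on simulated data (the noise level is arbitrary; real evaluations obtain EBVs from a BLUP-type model, not by adding noise):

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(7)
true_bv = [random.gauss(0, 1) for _ in range(500)]          # simulated true BVs
est_bv  = [bv + random.gauss(0, 0.5) for bv in true_bv]     # noisier estimates
print(f"EBV accuracy = {pearson(true_bv, est_bv):.2f}")
```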

  2. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies; the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences); the mark obtained in the university entrance examination; and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose, every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded, considering their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or
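    Once addresses are expressed as UTM coordinates (projected metres), the home-to-College distance variable mentioned above is a plain Euclidean distance, provided all points share one UTM zone. The coordinates below are fabricated for illustration; real geocoding would use a GIS or a projection library.

```python
import math

# Straight-line distance between UTM points (easting, northing) in metres,
# assuming a single UTM zone for all coordinates.

def utm_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

college = (440_000.0, 4_474_000.0)   # hypothetical UTM coordinates of the College
homes   = [(442_500.0, 4_476_000.0), (431_000.0, 4_470_000.0)]
for h in homes:
    print(f"distance to College: {utm_distance(h, college) / 1000:.1f} km")
```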

  3. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service field offices' ability to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as it applies to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from LCAT outputs that could be easily incorporated into their own analysis and/or delivery systems. To date, we have identified five existing requirements for local climate: (1) local impacts of climate change; (2) local impacts of climate variability; (3) drought studies; (4) attribution of severe meteorological and hydrological events; and (5) climate studies for water resources. The methodologies for the first three requirements will be included in the first-phase LCAT implementation.
Local rate of climate change is defined as a slope of the mean trend estimated from the ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially
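    One ingredient of a trend ensemble like the one described is an ordinary least-squares trend slope over a station series. The sketch below shows only that ingredient on a synthetic series; hinge fits and Optimal Climate Normals require additional machinery not shown here.

```python
# Least-squares trend slope of an annual series (value units per year).

def trend_slope(years, values):
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1991, 2021))
temps = [10.0 + 0.03 * (y - 1991) for y in years]   # synthetic +0.03 deg/yr series
print(f"trend: {trend_slope(years, temps):.3f} deg per year")
```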

  4. Tools and methods for studying Notch signaling in Drosophila melanogaster

    PubMed Central

    Zacharioudaki, Evanthia; Bray, Sarah J.

    2014-01-01

    Notch signaling involves a highly conserved pathway that mediates communication between neighboring cells. Activation of Notch by its ligands, results in the release of the Notch intracellular domain (NICD), which enters the nucleus and regulates transcription. This pathway has been implicated in many developmental decisions and diseases (including cancers) over the past decades. The simplicity of the Notch pathway in Drosophila melanogaster, in combination with the availability of powerful genetics, make this an attractive model for studying fundamental principles of Notch regulation and function. In this article we present some of the established and emerging tools that are available to monitor and manipulate the Notch pathway in Drosophila and discuss their strengths and weaknesses. PMID:24704358

  5. Nanoparticle microinjection and Raman spectroscopy as tools for nanotoxicology studies.

    PubMed

    Candeloro, Patrizio; Tirinato, Luca; Malara, Natalia; Fregola, Annalisa; Casals, Eudald; Puntes, Victor; Perozziello, Gerardo; Gentile, Francesco; Coluccio, Maria Laura; Das, Gobind; Liberale, Carlo; De Angelis, Francesco; Di Fabrizio, Enzo

    2011-11-01

    Microinjection techniques and Raman spectroscopy have been combined to provide a new methodology to investigate the cytotoxic effects due to the interaction of nanomaterials with cells. In the present work, this novel technique has been used to investigate the effects of Ag and Fe(3)O(4) nanoparticles on HeLa cells. The nanoparticles are microinjected inside the cells, and the cells are then probed by means of Raman spectroscopy after a short incubation time, in order to highlight the first, immediate mechanisms developed by the cells to counteract the presence of the nanoparticles. The results reveal a different behaviour of the cells treated with nanoparticles in comparison with the control cells; these differences are presumed to be generated by an emerging oxidative stress due to the nanoparticles. The achieved results demonstrate the suitability of the proposed method as a new tool for nanotoxicity studies.

  6. Total Diet Studies as a Tool for Ensuring Food Safety.

    PubMed

    Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung; Yoon, Hae-Jung

    2015-09-01

    With the diversification and internationalization of the food industry and the increased focus on health from a majority of consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. Total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and analyze exposure trends to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: chemicals are analyzed in food in the form in which it will be consumed and it is cost-effective in analyzing composite samples after processing multiple ingredients together. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881

  7. Numerical Relativity as a tool for studying the Early Universe

    NASA Astrophysics Data System (ADS)

    Garrison, David

    2013-04-01

    Numerical simulations are becoming a more effective tool for conducting detailed investigations into the evolution of our universe. In this presentation, I show how the framework of numerical relativity can be used for studying cosmological models. We are working to develop a large-scale simulation of the dynamical processes in the early universe. These take into account interactions of dark matter, scalar perturbations, gravitational waves, magnetic fields and a turbulent plasma. The code described in this report is a GRMHD code based on the Cactus framework and is structured to utilize one of several different differencing methods chosen at run-time. It is being developed and tested on the Texas Learning and Computation Center's Xanadu cluster.

  9. Assessing the accuracy of the International Classification of Diseases codes to identify abusive head trauma: a feasibility study

    PubMed Central

    Berger, Rachel P; Parks, Sharyn; Fromkin, Janet; Rubin, Pamela; Pecora, Peter J

    2016-01-01

    Objective To assess the accuracy of an International Classification of Diseases (ICD) code-based operational case definition for abusive head trauma (AHT). Methods Subjects were children <5 years of age evaluated for AHT by a hospital-based Child Protection Team (CPT) at a tertiary care paediatric hospital with a completely electronic medical record (EMR) system. Subjects were designated as non-AHT traumatic brain injury (TBI) or AHT based on whether the CPT determined that the injuries were due to AHT. The sensitivity and specificity of the ICD-based definition were calculated. Results There were 223 children evaluated for AHT: 117 AHT and 106 non-AHT TBI. The sensitivity and specificity of the ICD-based operational case definition were 92% (95% CI 85.8 to 96.2) and 96% (95% CI 92.3 to 99.7), respectively. All errors in sensitivity and three of the four specificity errors were due to coder error; one specificity error was a physician error. Conclusions In a paediatric tertiary care hospital with an EMR system, the accuracy of an ICD-based case definition for AHT was high. Additional studies are needed to assess the accuracy of this definition in all types of hospitals in which children with AHT are cared for. PMID:24167034
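    Sensitivity and specificity with normal-approximation 95% confidence intervals, as reported above, can be computed directly from the confusion-matrix counts. The counts below are back-calculated approximations from the reported rates (117 AHT, 106 non-AHT TBI cases) and are illustrative, not the study's raw data.

```python
import math

# Proportion with a normal-approximation (Wald) 95% confidence interval.
def proportion_ci(k, n, z=1.96):
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

sens = proportion_ci(108, 117)   # ~92% sensitivity (approximate counts)
spec = proportion_ci(102, 106)   # ~96% specificity (approximate counts)
for name, (p, lo, hi) in [("sensitivity", sens), ("specificity", spec)]:
    print(f"{name}: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

For proportions near 1 with small n, a Wilson or exact (Clopper-Pearson) interval behaves better than this Wald approximation.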

  10. Comparison of Accuracy of Uncorrected and Corrected Sagittal Tomography in Detection of Mandibular Condyle Erosions: an Ex Vivo Study

    PubMed Central

    Naser, Asieh Zamani; Shirani, Amir Mansour; Hekmatian, Ehsan; Valiani, Ali; Ardestani, Pegah; Vali, Ava

    2010-01-01

    Background: Radiographic examination of the TMJ is indicated when there are clinical signs of pathological conditions, mainly bone changes, that may influence the diagnosis and treatment planning. The purpose of this study was to evaluate and compare the validity and diagnostic accuracy of uncorrected and corrected sagittal tomographic images in the detection of simulated mandibular condyle erosions. Methods: Simulated lesions were created in 10 dry mandibles using a dental round bur. Using uncorrected and corrected sagittal tomography techniques, mandibular condyles were imaged by a Cranex Tome X-ray unit before and after creating the lesions. The uncorrected and corrected tomography images were examined by two independent observers for the absence or presence of a lesion. The accuracy for detecting mandibular condyle lesions was expressed as sensitivity, specificity, and validity values. Differences between the two radiographic modalities were tested by Wilcoxon tests for paired data. Inter-observer agreement was determined by Cohen's kappa. Results: The sensitivity, specificity and validity were 45%, 85% and 30% for uncorrected sagittal tomographic images, respectively, and 70%, 92.5% and 60% for corrected sagittal tomographic images, respectively. There was a statistically significant difference between the accuracy of uncorrected and corrected sagittal tomography in the detection of mandibular condyle erosions (P = 0.016). The inter-observer agreement was slight for uncorrected sagittal tomography and moderate for corrected sagittal tomography. Conclusion: The accuracy of corrected sagittal tomography is significantly higher than that of uncorrected sagittal tomography. Therefore, corrected sagittal tomography seems to be a better modality for the detection of mandibular condyle erosions. PMID:22013461

  11. [Analysis on evaluation tool for literature quality in clinical study].

    PubMed

    Liu, Qing; Zhai, Wei; Tan, Ya-qin; Huang, Juan

    2014-09-01

    This paper introduces the tools used for literature quality evaluation and analyzes the common evaluation tools that are publicly and extensively used worldwide for evaluating the quality of clinical trial literature, including the Jadad scale, the Consolidated Standards of Reporting Trials (CONSORT) statement, and the Grades of Recommendations Assessment, Development and Evaluation (GRADE) system, among others. Additionally, the present development, updates, and applications of these tools are covered in the analysis.

  12. Accuracy of tablet splitting: Comparison study between hand splitting and tablet cutter

    PubMed Central

    Habib, Walid A.; Alanizi, Abdulaziz S.; Abdelhamid, Magdi M.; Alanizi, Fars K.

    2013-01-01

    Background: Tablet splitting is often used in pharmacy practice to adjust administered doses. It is also used as a method of reducing medication costs. Objective: To investigate the accuracy of tablet splitting by comparing hand splitting with a tablet cutter for a low-dose drug tablet. Methods: Salbutamol tablets (4 mg) were chosen as low-dose tablets. A randomly selected equal number of tablets was split by hand and by a tablet cutter, and the remaining tablets were kept whole. Weight variation and drug content were analysed for salbutamol in 0.1 N HCl using a validated spectrophotometric method. The percentages by which each whole tablet's or half-tablet's drug content and weight differed from sample mean values were compared with USP specification ranges. The %RSD was also calculated in order to determine whether the drugs met the USP specification for %RSD. The tablets and half-tablets were scanned using electron microscopy to show any visual differences arising from splitting. Results: 27.5% of samples differed from sample mean values by a percentage that fell outside the USP specification for weight; 15% of samples from the tablet cutter and 25% of those split by hand fell outside the specifications. All whole tablets and half-tablets met the USP specifications for drug content, but the variation of content between the two halves reached 21.3% of total content in the case of hand splitting, and only 7.13% for the tablet cutter. The %RSDs for drug content and weight met the USP specification for whole salbutamol tablets and for the half-tablets split by the tablet cutter. The halves split by hand fell outside the specification for %RSD (drug content = 6.43%, weight = 8.33%). The differences were visually clear in the electron microscope scans. Conclusion: Drug content variation in half-tablets appeared to be attributable to weight variation occurring during the splitting process. This could have serious clinical consequences for
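    The %RSD (relative standard deviation) metric used above is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch on fabricated half-tablet weights (the values are hypothetical, chosen only so that hand-split halves vary more than cutter-split halves):

```python
import statistics

# %RSD: sample standard deviation as a percentage of the mean.
def percent_rsd(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

hand_halves   = [96.1, 112.4, 88.9, 104.7, 99.3, 83.5]   # mg, fabricated
cutter_halves = [99.8, 101.2, 98.7, 100.5, 99.1, 100.9]  # mg, fabricated
print(f"hand-split %RSD:   {percent_rsd(hand_halves):.2f}")
print(f"cutter-split %RSD: {percent_rsd(cutter_halves):.2f}")
```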

  13. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.
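    A minimal stand-in for one component such a blockset models is a lumped-capacitance thermal mass: dT/dt = (Q_in − U·A·(T − T_amb)) / (m·c), stepped quasi-steadily in time. This is a generic textbook sketch, not the toolset's actual block implementation, and all parameter values are hypothetical.

```python
# Lumped-capacitance thermal mass, forward-Euler time stepping.

def step_temperature(T, dt, Q_in, UA, T_amb, mc):
    """One explicit step of dT/dt = (Q_in - UA*(T - T_amb)) / mc."""
    return T + dt * (Q_in - UA * (T - T_amb)) / mc

T, T_amb = 300.0, 300.0               # K
Q_in, UA, mc = 5_000.0, 50.0, 2.0e4   # W, W/K, J/K (hypothetical component)
for _ in range(6000):                 # 6000 s of 1 s steps
    T = step_temperature(T, 1.0, Q_in, UA, T_amb, mc)
print(f"temperature after 6000 s: {T:.1f} K (steady-state limit {T_amb + Q_in / UA:.1f} K)")
```

The time constant mc/UA (here 400 s) is the kind of coupling parameter that makes tightly integrated thermal/electrical trade studies sensitive to design margins.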

  14. Drosophila tools and assays for the study of human diseases

    PubMed Central

    Ugur, Berrak; Chen, Kuchuan; Bellen, Hugo J.

    2016-01-01

    ABSTRACT Many of the internal organ systems of Drosophila melanogaster are functionally analogous to those in vertebrates, including humans. Although humans and flies differ greatly in terms of their gross morphological and cellular features, many of the molecular mechanisms that govern development and drive cellular and physiological processes are conserved between both organisms. The morphological differences are deceiving and have led researchers to undervalue the study of invertebrate organs in unraveling pathogenic mechanisms of diseases. In this review and accompanying poster, we highlight the physiological and molecular parallels between fly and human organs that validate the use of Drosophila to study the molecular pathogenesis underlying human diseases. We discuss assays that have been developed in flies to study the function of specific genes in the central nervous system, heart, liver and kidney, and provide examples of the use of these assays to address questions related to human diseases. These assays provide us with simple yet powerful tools to study the pathogenic mechanisms associated with human disease-causing genes. PMID:26935102

  15. Databases and registers: useful tools for research, no studies.

    PubMed

    Curbelo, Rafael J; Loza, Estíbaliz; de Yébenes, Maria Jesús García; Carmona, Loreto

    2014-04-01

    There are many misunderstandings about databases. Database is a commonly misused term in reference to any set of data entered into a computer. However, true databases serve a main purpose, organising data. They do so by establishing several layers of relationships; databases are hierarchical. Databases commonly organise data over different levels and over time, where time can be measured as the time between visits, or between treatments, or adverse events, etc. In this sense, medical databases are closely related to longitudinal observational studies, as databases allow the introduction of data on the same patient over time. Basically, we could establish four types of databases in medicine, depending on their purpose: (1) administrative databases, (2) clinical databases, (3) registers, and (4) study-oriented databases. But a database is a useful tool for a large variety of studies, not a type of study itself. Different types of databases serve very different purposes, and a clear understanding of the different research designs mentioned in this paper would prevent many of the databases we launch from being just a lot of work and very little science. PMID:24509895

  16. High-accuracy EUV reflectometer

    NASA Astrophysics Data System (ADS)

    Hinze, U.; Fokoua, M.; Chichkov, B.

    2007-03-01

    Developers and users of EUV optics need precise tools for the characterization of their products. Often a measurement accuracy of 0.1% or better is desired to detect and study slow-acting aging effects or degradation by organic contaminants. To achieve a measurement accuracy of 0.1%, an EUV source is required that provides excellent long-term stability, namely power stability, spatial stability and spectral stability. Naturally, it should be free of debris. An EUV source particularly suitable for this task is an advanced electron-based EUV tube, which provides an output of up to 300 μW at 13.5 nm. Reflectometers benefit from the excellent long-term stability of this tool. We design and set up different reflectometers using EUV tubes for the precise characterization of EUV optics, such as debris samples, filters, multilayer mirrors, grazing incidence optics, collectors and masks. Reflectivity measurements from grazing incidence to near normal incidence as well as transmission studies were realized at a precision of down to 0.1%. The reflectometers are computer-controlled and allow varying and scanning all important parameters online. The concept of a sample reflectometer is discussed and results are presented. The devices can be purchased from the Laser Zentrum Hannover e.V.

  17. Bellis perennis: a useful tool for protein localization studies.

    PubMed

    Jaedicke, Katharina; Rösler, Jutta; Gans, Tanja; Hughes, Jon

    2011-10-01

    Fluorescent fusion proteins together with transient transformation techniques are commonly used to investigate intracellular protein localisation in vivo. Biolistic transfection is reliable, efficient and avoids experimental problems associated with producing and handling fragile protoplasts. Onion epidermis pavement cells are frequently used with this technique, their excellent properties for microscopy resulting from their easy removal from the underlying tissues and large size. They also have advantages over mesophyll cells for fluorescence microscopy, as they are devoid of chloroplasts whose autofluorescence can pose problems. The arrested plastid development is peculiar to epidermal cells, however, and stands in the way of studies on protein targeting to plastids. We have developed a system enabling studies of in vivo protein targeting to organelles including chloroplasts within a photosynthetically active plant cell with excellent optical properties using a transient transformation procedure. We established biolistic transfection in epidermal pavement cells of the lawn daisy (Bellis perennis L., cultivar "Galaxy red") which unusually contain a moderate number of functional chloroplasts. These cells are excellent objects for fluorescence microscopy using current reporters, combining the advantages of the ease of biolistic transfection, the excellent optical properties of a single cell layer and access to chloroplast protein targeting. We demonstrate chloroplast targeting of plastid-localised heme oxygenase, and two further proteins whose localisation was equivocal. We also demonstrate unambiguous targeting to mitochondria, peroxisomes and nuclei. We thus propose that the Bellis system represents a valuable tool for protein localisation studies in living plant cells. PMID:21626148

  18. A Study of the Training of Tool and Die Makers.

    ERIC Educational Resources Information Center

    Horowitz, Morris A.; Herrnstadt, Irwin L.

    To develop and test a methodology which would help determine the combination of education, training, and experience that is most likely to yield highly qualified workers in specific occupations, the tool and die maker trade was selected for examination in the Boston Metropolitan Area. Tool and die making was chosen because it is a clearly…

  19. Accuracy of a Computer-Aided Surgical Simulation (CASS) Protocol for Orthognathic Surgery: A Prospective Multicenter Study

    PubMed Central

    Hsu, Sam Sheng-Pin; Gateno, Jaime; Bell, R. Bryan; Hirsch, David L.; Markiewicz, Michael R.; Teichgraeber, John F.; Zhou, Xiaobo; Xia, James J.

    2012-01-01

    Purpose: The purpose of this prospective multicenter study was to assess the accuracy of a computer-aided surgical simulation (CASS) protocol for orthognathic surgery. Materials and Methods: The accuracy of the CASS protocol was assessed by comparing planned and postoperative outcomes of 65 consecutive patients enrolled from 3 centers. Computer-generated surgical splints were used for all patients. For the genioplasty, one center utilized computer-generated chin templates to reposition the chin segment only for patients with asymmetry. Standard intraoperative measurements were utilized without the chin templates for the remaining patients. The primary outcome measurements were linear and angular differences for the maxilla, mandible and chin when the planned and postoperative models were registered at the cranium. The secondary outcome measurements were: maxillary dental midline difference between the planned and postoperative positions; and linear and angular differences of the chin segment between the groups with and without the use of the template. The latter was measured when the planned and postoperative models were registered at the mandibular body. Statistical analyses were performed, and the accuracy was reported using root mean square deviation (RMSD) and Bland and Altman's method for assessing measurement agreement. Results: In the primary outcome measurements, there was no statistically significant difference among the 3 centers for the maxilla and mandible. The largest RMSD was 1.0 mm and 1.5° for the maxilla, and 1.1 mm and 1.8° for the mandible. For the chin, there was a statistically significant difference between the groups with and without the use of the chin template. The chin template group showed excellent accuracy, with a largest positional RMSD of 1.0 mm and a largest orientational RMSD of 2.2°. However, larger variances were observed in the group not using the chin template. This was significant in anteroposterior and superoinferior directions, as in
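The positional accuracy metric reported above, RMSD over paired landmark coordinates, is straightforward to compute; the coordinates below are hypothetical illustrative values, not data from the study.

```python
# Sketch: root-mean-square deviation (RMSD) between planned and
# postoperative 3-D landmark positions (coordinates are made up).
import math

def rmsd(planned, postop):
    """RMSD over paired 3-D points, each a (x, y, z) tuple in mm."""
    sq = [sum((p - q) ** 2 for p, q in zip(a, b))
          for a, b in zip(planned, postop)]
    return math.sqrt(sum(sq) / len(sq))

planned = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
postop  = [(0.5, 0.0, 0.0), (10.0, 1.0, 0.0)]
print(rmsd(planned, postop))   # → ≈0.79 mm
```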

  20. An Evaluation of the Effects of an Oven Timer on Study Behavior and Concurrent Completion and Accuracy of Assignments for a First Grade Repeater: A Case Study.

    ERIC Educational Resources Information Center

    Riegelman, Elizabeth D.; And Others

    The effects of an oven timer as an antecedent stimulus on study behavior and concurrent completion and accuracy of reading and writing assignments were investigated for an 8-year-old first grade repeater who lacked motivation. Following baseline observations during which the teacher recorded study behavior and collected assignments with no…

  1. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference

    PubMed Central

    Storey, Helen L.; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275
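One common way to build a composite reference standard is an "any positive" rule over several imperfect reference tests; the sketch below evaluates an index test against such a CRS. The any-positive rule, the three-test panel, and all results are illustrative assumptions, not the paper's proposed criteria.

```python
# Sketch: sensitivity/specificity of an index test against a composite
# reference standard (CRS) built with an "any positive" rule.
# Data are synthetic, for illustration only.

def crs_any_positive(reference_results):
    """CRS label: positive if any component reference test is positive."""
    return [any(row) for row in reference_results]

def sens_spec(index, truth):
    tp = sum(1 for i, t in zip(index, truth) if i and t)
    tn = sum(1 for i, t in zip(index, truth) if not i and not t)
    fn = sum(1 for i, t in zip(index, truth) if not i and t)
    fp = sum(1 for i, t in zip(index, truth) if i and not t)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical three-test reference panel (e.g. blood culture plus two
# other assays), one row per subject:
refs  = [(1, 0, 0), (0, 0, 0), (0, 1, 0), (1, 1, 1), (0, 0, 0), (0, 0, 1)]
index = [1, 0, 0, 1, 1, 0]
crs = crs_any_positive(refs)
sensitivity, specificity = sens_spec(index, crs)
```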

  2. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    PubMed

    Storey, Helen L; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275

  3. Oral Fluency, Accuracy, and Complexity in Formal Instruction and Study Abroad Learning Contexts

    ERIC Educational Resources Information Center

    Mora, Joan C.; Valls-Ferrer, Margalida

    2012-01-01

    This study investigates the differential effects of two learning contexts, formal instruction (FI) at home and a study abroad period (SA), on the oral production skills of advanced-level Catalan-Spanish undergraduate learners of English. Speech samples elicited through an interview at three data collection times over a 2-year period were…

  4. Effects of a Word Study Intervention on Spelling Accuracy among Low-Literate Adults

    ERIC Educational Resources Information Center

    Shaw, Donita Massengill; Berg, Margaret A.

    2008-01-01

    The purpose of this study was to evaluate the impact of an instructional spelling approach, Word Study, on the spelling ability of adults with limited literacy proficiency. Ten adults (five control and five experimental) were given the Developmental Spelling Assessment. The control participants received traditional spelling instruction, and the…

  5. Ciliobrevins as tools for studying dynein motor function

    PubMed Central

    Roossien, Douglas H.; Miller, Kyle E.; Gallo, Gianluca

    2015-01-01

    Dyneins are a small class of molecular motors that bind to microtubules and walk toward their minus ends. They are essential for the transport and distribution of organelles, signaling complexes and cytoskeletal elements. In addition, dyneins generate forces on microtubule arrays that power the beating of cilia and flagella, cell division, migration and growth cone motility. Classical approaches to the study of dynein function in axons involve the depletion of dynein, expression of mutant/truncated forms of the motor, or interference with accessory subunits. By necessity, these approaches require prolonged time periods for the expression or manipulation of cellular dynein levels. With the discovery of the ciliobrevins, a class of cell-permeable small-molecule inhibitors of dynein, it is now possible to acutely disrupt dynein both globally and locally. In this review, we briefly summarize recent work using ciliobrevins to inhibit dynein and discuss the insights ciliobrevins have provided about dynein function in various cell types, with a focus on neurons. We temper this with a discussion of the need for studies that elucidate the mechanism of action of ciliobrevins, as well as the need for experiments to further analyze their specificity for dynein. Although much remains to be learned about ciliobrevins, these small molecules are proving themselves to be valuable novel tools to assess the cellular functions of dynein. PMID:26217180

  6. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... AGENCY BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications AGENCY... Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (EPA/600/R-11/123A). EPA also... WEPP climate assessment tools. The report presents a series of short case studies designed...

  7. A Longitudinal Study of Novice-Level Changes in Fluency and Accuracy in Student Monologues

    ERIC Educational Resources Information Center

    Long, Robert W., III.

    2012-01-01

    Detailed research concerning fluency, specifically relating to pauses, mean length of runs, and fluency rates in Japanese EFL learners, is limited. Furthermore, the issue of tracking fluency gains has often been ignored, misunderstood or minimized in EFL educational research. The present study, which is based on six monologues conducted…

  8. A Longitudinal Study of Complexity, Accuracy and Fluency Variation in Second Language Development

    ERIC Educational Resources Information Center

    Ferraris, Stefania

    2012-01-01

    This chapter presents the results of a study on interlanguage variation. The production of four L2 learners of Italian, tested four times at yearly intervals while engaged in four oral tasks, is compared to that of two native speakers, and analysed with quantitative CAF measures. Thus, time, task type, nativeness, as well as group vs. individual…

  9. Biosphere 2 Center as a unique tool for environmental studies.

    PubMed

    Walter, Achim; Lambrecht, Susanne Carmen

    2004-04-01

    The Biosphere 2 Laboratory of Biosphere 2 Center, Arizona, is a unique, self-contained glasshouse fostering several mesocosms of tropical and subtropical regions on an area of 12,700 m². It was constructed around 1990 to test whether human life is possible in this completely sealed, self-sustaining artificial ecosystem. Mainly due to overly rich organic soils, the initial mission failed in a spectacular manner that raised enormous disbelief in the scientific seriousness of the project. From 1995 to 2003, the facility was operated by Columbia University under a completely new scientific management, with the aim of conducting research in the field of 'experimental climate change science'. Climatic conditions within the mesocosms can be precisely controlled. In studies with elevated CO2 and altered temperature and irrigation regimes performed in the rainforest, coral reef and agriforestry mesocosms, the facility proved to be a valuable tool for global climate change research. Upon submission of this manuscript, Columbia University was relinquishing its management of the facility, despite a contract to operate it until 2010, leaving it with an unclear destiny that might bring about anything from complete abandonment to a new flowering phase with a new purpose.

  10. ent-Steroids: novel tools for studies of signaling pathways.

    PubMed

    Covey, Douglas F

    2009-07-01

    Membrane receptors are often modulated by steroids and it is necessary to distinguish the effects of steroids at these receptors from effects occurring at nuclear receptors. Additionally, it may also be mechanistically important to distinguish between direct effects caused by binding of steroids to membrane receptors and indirect effects on membrane receptor function caused by steroid perturbation of the membrane containing the receptor. In this regard, ent-steroids, the mirror images of naturally occurring steroids, are novel tools for distinguishing between these various actions of steroids. The review provides a background for understanding the different actions that can be expected of steroids and ent-steroids in biological systems, references for the preparation of ent-steroids, a short discussion about relevant forms of stereoisomerism and the requirements that need to be fulfilled for the interaction between two molecules to be enantioselective. The review then summarizes results of biophysical, biochemical and pharmacological studies published since 1992 in which ent-steroids have been used to investigate the actions of steroids in membranes and/or receptor-mediated signaling pathways.

  11. The impact of registration accuracy on imaging validation study design: A novel statistical power calculation.

    PubMed

    Gibson, Eli; Fenster, Aaron; Ward, Aaron D

    2013-10-01

    Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions?
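The paper's derived formula is not reproduced in the abstract. As an illustration of the general idea, the sketch below uses a standard two-sample normal-approximation sample-size calculation, with registration error folded into the per-subject variance as an additional uncertainty source; this is an assumed stand-in, not the authors' model (which also accommodates signal correlations within image regions).

```python
# Sketch: subjects needed to detect a minimum difference between normal
# and pathologic image regions, with registration error inflating the
# variance (two-sample normal approximation; illustrative stand-in only).
import math
from statistics import NormalDist

def subjects_needed(delta, sigma_signal, sigma_registration,
                    alpha=0.05, power=0.8):
    """Subjects per group for a two-sided test of mean difference delta."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    # registration error adds variance on top of the imaging signal noise
    sigma2 = sigma_signal ** 2 + sigma_registration ** 2
    return math.ceil(2 * (z_a + z_b) ** 2 * sigma2 / delta ** 2)

# Larger registration error -> more subjects for the same detectable difference
n_good = subjects_needed(delta=1.0, sigma_signal=1.5, sigma_registration=0.5)
n_poor = subjects_needed(delta=1.0, sigma_signal=1.5, sigma_registration=1.5)
```

This inverts naturally: fixing the number of subjects instead bounds the maximum acceptable registration error, the first of the three design questions above.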

  12. Assessing accuracy of a probabilistic model for very large fire in the Rocky Mountains: A High Park Fire case study

    NASA Astrophysics Data System (ADS)

    Stavros, E.; Abatzoglou, J. T.; Larkin, N.; McKenzie, D.; Steel, A.

    2012-12-01

    Across the western United States, the largest wildfires account for a major proportion of the area burned and substantially affect mountain forests and their associated ecosystem services, among which is pristine air quality. These fires commandeer national attention and significant fire suppression resources. Despite efforts to understand the influence of fuel loading, climate, and weather on annual area burned, few studies have focused on understanding what abiotic factors enable and drive the very largest wildfires. We investigated the correlation between both antecedent climate and in-situ biophysical variables and very large (>20,000 ha) fires in the western United States from 1984 to 2009. We built logistic regression models, at the spatial scale of the national Geographic Area Coordination Centers (GACCs), to estimate the probability that a given day is conducive to a very large wildfire. Models vary in accuracy and in which variables are the best predictors. In a case study of the High Park Fire, which burned near Fort Collins, Colorado, in early summer 2012, we evaluate the predictive accuracy of the Rocky Mountain model.
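The modeling step above can be sketched as a logistic regression of a daily very-large-fire indicator on climate predictors. The predictors, effect sizes, and data below are synthetic illustrations; the study fits separate models per GACC region with observed climate and biophysical variables.

```python
# Sketch: logistic regression of a daily "very large fire" indicator on
# two synthetic standardized predictors (e.g. a dryness index and wind
# speed), fitted by plain batch gradient descent. All data are simulated.
import math
import random

random.seed(0)
n = 500
X = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
TRUE_W, TRUE_B = (1.5, 0.8), -2.0     # negative intercept: rare event

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# simulate which days were conducive to a very large fire
y = [random.random() < sigmoid(TRUE_W[0] * x1 + TRUE_W[1] * x2 + TRUE_B)
     for x1, x2 in X]

# maximize the log-likelihood by gradient descent
w, b = [0.0, 0.0], 0.0
for _ in range(2000):
    g0 = g1 = gb = 0.0
    for (x1, x2), yi in zip(X, y):
        err = sigmoid(w[0] * x1 + w[1] * x2 + b) - yi
        g0 += err * x1; g1 += err * x2; gb += err
    w[0] -= 0.5 * g0 / n; w[1] -= 0.5 * g1 / n; b -= 0.5 * gb / n
```

The fitted coefficients recover the signs of the simulated effects: positive weights on dryness and wind, and a strongly negative intercept reflecting how rare conducive days are.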

  13. Improved accuracy of markerless motion tracking on bone suppression images: preliminary study for image-guided radiation therapy (IGRT)

    NASA Astrophysics Data System (ADS)

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-05-01

    The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating soft-tissue images comparable to those obtained by the dual-energy subtraction technique. This study was performed to evaluate the usefulness of bone suppression image processing in image-guided radiation therapy. We demonstrated the improved accuracy of markerless motion tracking on bone suppression images. Chest fluoroscopic images of nine patients with lung nodules during respiration were obtained using a flat-panel detector system (120 kV, 0.1 mAs/pulse, 5 fps). Commercial bone suppression image processing software was applied to the fluoroscopic images to create corresponding bone suppression images. Regions of interest were manually located on lung nodules and automatic target tracking was conducted based on the template matching technique. To evaluate the accuracy of target tracking, the maximum tracking error in the resulting images was compared with that of conventional fluoroscopic images. The tracking errors were decreased by half in eight of nine cases. The average maximum tracking errors in bone suppression and conventional fluoroscopic images were 1.3 ± 1.0 and 3.3 ± 3.3 mm, respectively. The bone suppression technique was especially effective in the lower lung area where pulmonary vessels, bronchi, and ribs showed complex movements. The bone suppression technique improved tracking accuracy without special equipment or implantation of fiducial markers, and with only a small additional dose to the patient. Bone suppression fluoroscopy is a potential means of measuring respiratory displacement of the target. This paper was presented at RSNA 2013, and the work was carried out at Kanazawa University, Japan.
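The template matching step described above can be sketched as a brute-force search for the template position minimizing the sum of squared differences over a frame. This is a toy illustration on a tiny 2-D array; the study used commercial software on full fluoroscopic images.

```python
# Sketch: template-matching target tracking via a brute-force
# sum-of-squared-differences (SSD) search on a toy 2-D "frame".

def match_template(frame, template):
    """Return (row, col) of the best-matching template position."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
print(match_template(frame, template))   # → (1, 1)
```

Tracking a nodule across frames amounts to repeating this search per frame; suppressing ribs reduces spurious SSD minima caused by overlapping bone structures, which is the accuracy gain reported above.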

  14. Improved accuracy of markerless motion tracking on bone suppression images: preliminary study for image-guided radiation therapy (IGRT).

    PubMed

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-05-21

    The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating soft-tissue images comparable to those obtained by the dual-energy subtraction technique. This study was performed to evaluate the usefulness of bone suppression image processing in image-guided radiation therapy. We demonstrated the improved accuracy of markerless motion tracking on bone suppression images. Chest fluoroscopic images of nine patients with lung nodules during respiration were obtained using a flat-panel detector system (120 kV, 0.1 mAs/pulse, 5 fps). Commercial bone suppression image processing software was applied to the fluoroscopic images to create corresponding bone suppression images. Regions of interest were manually located on lung nodules and automatic target tracking was conducted based on the template matching technique. To evaluate the accuracy of target tracking, the maximum tracking error in the resulting images was compared with that of conventional fluoroscopic images. The tracking errors were decreased by half in eight of nine cases. The average maximum tracking errors in bone suppression and conventional fluoroscopic images were 1.3 ± 1.0 and 3.3 ± 3.3 mm, respectively. The bone suppression technique was especially effective in the lower lung area where pulmonary vessels, bronchi, and ribs showed complex movements. The bone suppression technique improved tracking accuracy without special equipment or implantation of fiducial markers, and with only a small additional dose to the patient. Bone suppression fluoroscopy is a potential means of measuring respiratory displacement of the target.

  15. Effect of considering the initial parameters on accuracy of experimental studies conclusions

    NASA Astrophysics Data System (ADS)

    Zagulova, D.; Nesterenko, A.; Kapilevich, L.; Popova, J.

    2015-11-01

    The paper presents evidence of the necessity of taking the initial level of physiological parameters into account when conducting biomedical research, exemplified by selected indicators of the cardiorespiratory system. The analysis is based on data obtained through repeated surveys of medical and pharmaceutical college students. A negative correlation was revealed between changes in the studied cardiorespiratory parameters on repeated measurement and their initial levels. It is assumed that this dependence of parameter changes on the baseline may be caused by the biorhythmic changes inherent in all body systems.
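A negative correlation of change with baseline also arises generically when repeated measurements carry independent noise (regression to the mean), which is one reason baseline levels must be accounted for in such designs. The synthetic simulation below reproduces the effect; all numbers are made up for illustration.

```python
# Sketch: regression to the mean produces a negative correlation between
# a baseline measurement and the subsequent change, even with a stable
# underlying physiology. Fully synthetic illustration.
import random

random.seed(1)
true_level = [random.gauss(70, 5) for _ in range(1000)]   # e.g. resting heart rate
visit1 = [t + random.gauss(0, 4) for t in true_level]     # independent measurement noise
visit2 = [t + random.gauss(0, 4) for t in true_level]
change = [v2 - v1 for v1, v2 in zip(visit1, visit2)]

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

r = pearson_r(visit1, change)   # negative: high baselines tend to fall back
```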

  16. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2010-01-01

    The poster provides an overview of techniques to improve the accuracy of Mars Global Reference Atmospheric Model (Mars-GRAM) sensitivity studies. It was discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM, when used for sensitivity studies with TES MapYear = 0 and large optical depth values such as tau = 3, is less than realistic. A preliminary fix has been made to Mars-GRAM by adding density factor values determined for tau = 0.3, 1 and 3.

  17. Improving Mars-GRAM: Increasing the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    Extensively utilized for numerous mission applications, the Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model. In a Monte-Carlo mode, Mars-GRAM's perturbation modeling capability is used to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). Mars-GRAM was found to be inexact when used during the Mars Science Laboratory (MSL) site selection process for sensitivity studies for MapYear=0 and large optical depth values such as tau=3. Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM) from the surface to 80 km altitude. Mars-GRAM with the MapYear parameter set to 0 utilizes results from a MGCM run with a fixed value of tau=3 at all locations for the entire year. Imprecise atmospheric density and pressure at all altitudes are a consequence of this use of MGCM with tau=3. Density factor values have been determined for tau=0.3, 1 and 3 as a preliminary fix to this pressure-density problem. These factors adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. These density factors are fixed values for all latitudes and Ls and are included in Mars-GRAM Release 1.3. Work currently being done to derive better multipliers, by including variations with latitude and/or Ls through direct comparison of MapYear 0 output against TES limb data, will be highlighted in the presentation. The TES limb data utilized in this process have been validated by a comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS). This comparison study was undertaken for locations on Mars of varying latitudes, Ls, and LTST. The more precise density factors will be included in Mars-GRAM 2005 Release 1.4 and thus improve the results of future sensitivity studies done for large

  18. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Astrophysics Data System (ADS)

    Justh, H. L.; Justus, C. G.; Badger, A. M.

    2009-12-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM’s perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). It has been discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3 is less than realistic. A comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS) has been undertaken for locations of varying latitudes, Ls, and LTST on Mars. The preliminary results from this study have validated the Thermal Emission Spectrometer (TES) limb data. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear=0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. Unrealistic energy absorption by uniform atmospheric dust leads to an unrealistic thermal energy balance on the polar caps. The outcome is an inaccurate cycle of condensation/sublimation of the polar caps and, as a consequence, an inaccurate cycle of total atmospheric mass and global-average surface pressure. Under an assumption of unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure would produce a corresponding percentage change in density at all altitudes. Consequently, the final result of a change in surface pressure is an imprecise atmospheric density at all altitudes. 
To solve this pressure-density problem, a density factor value was determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with MapYears 1 and 2 MGCM output.

  19. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2009-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). It has been discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3 is less than realistic. A comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS) has been undertaken for locations of varying latitudes, Ls, and LTST on Mars. The preliminary results from this study have validated the Thermal Emission Spectrometer (TES) limb data. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear=0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. Unrealistic energy absorption by uniform atmospheric dust leads to an unrealistic thermal energy balance on the polar caps. The outcome is an inaccurate cycle of condensation/sublimation of the polar caps and, as a consequence, an inaccurate cycle of total atmospheric mass and global-average surface pressure. Under an assumption of unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure would produce a corresponding percentage change in density at all altitudes. Consequently, the final result of a change in surface pressure is an imprecise atmospheric density at all altitudes.
To solve this pressure-density problem, a density factor value was determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear=0 with MapYears 1 and 2 MGCM output.

  20. A case control study to improve accuracy of an electronic fall prevention toolkit.

    PubMed

    Dykes, Patricia C; I-Ching, Evita Hou; Soukup, Jane R; Chang, Frank; Lipsitz, Stuart

    2012-01-01

    Patient falls are a serious and commonly reported adverse event in hospitals. In 2009, our team conducted the first randomized controlled trial of a health information technology-based intervention that significantly reduced falls in acute care hospitals. However, some patients on intervention units with access to the electronic toolkit fell. The purpose of this case control study was to use data mining and modeling techniques to identify the factors associated with falls in hospitalized patients when the toolkit was in place. Our ultimate aim was to apply our findings to improve the toolkit logic and to generate practice recommendations. The results of our evaluation suggest that the fall prevention toolkit logic is accurate, but strategies are needed to improve adherence to the fall prevention intervention recommendations generated by the electronic toolkit.

  1. Associations between visual perception accuracy and confidence in a dopaminergic manipulation study

    PubMed Central

    Andreou, Christina; Bozikas, Vasilis P.; Luedtke, Thies; Moritz, Steffen

    2015-01-01

    Delusions are defined as fixed erroneous beliefs that are based on misinterpretation of events or perception, and cannot be corrected by argumentation to the opposite. Cognitive theories of delusions regard this symptom as resulting from specific distorted thinking styles that lead to biased integration and interpretation of perceived stimuli (i.e., reasoning biases). In previous studies, we were able to show that one of these reasoning biases, overconfidence in errors, can be modulated by drugs that act on the dopamine system, a major neurotransmitter system implicated in the pathogenesis of delusions and other psychotic symptoms. Another processing domain suggested to involve the dopamine system and to be abnormal in psychotic disorders is sensory perception. The present study aimed to investigate whether (lower-order) sensory perception and (higher-order) overconfidence in errors are similarly affected by dopaminergic modulation in healthy subjects. Thirty-four healthy individuals were assessed upon administration of l-dopa, placebo, or haloperidol within a randomized, double-blind, cross-over design. Variables of interest were hits and false alarms in an illusory perception paradigm requiring speeded detection of pictures over a noisy background, and subjective confidence ratings for correct and incorrect responses. There was a significant linear increase of false alarm rates from haloperidol to placebo to l-dopa, whereas hit rates were not affected by dopaminergic manipulation. As hypothesized, confidence in error responses was significantly higher with l-dopa compared to placebo. Moreover, confidence in erroneous responses significantly correlated with false alarm rates. These findings suggest that overconfidence in errors and aberrant sensory processing might be both interdependent and related to dopaminergic transmission abnormalities in patients with psychosis. PMID:25932015
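The hit and false-alarm rates reported in this paradigm are the standard inputs to a signal-detection analysis. As an illustrative aside (not the authors' own analysis, and with made-up rates), the sensitivity index d′ can be computed from those two rates using only the standard library:

```python
from statistics import NormalDist

def dprime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity index: d' = z(hits) - z(false alarms).

    Rates must lie strictly between 0 and 1; in practice extreme rates
    are nudged (e.g. with a log-linear correction) before this step.
    """
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Made-up example: 84% hits, 16% false alarms gives d' close to 2.
sensitivity = dprime(0.84, 0.16)
```

A rising false-alarm rate at a fixed hit rate, as reported under l-dopa, lowers d′ while shifting the response criterion toward more liberal responding.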

  2. Updating Mars-GRAM to Increase the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hiliary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). During the Mars Science Laboratory (MSL) site selection process, it was discovered that Mars-GRAM, when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3, is less than realistic. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear set to 0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. This has resulted in an imprecise atmospheric density at all altitudes. As a preliminary fix to this pressure-density problem, density factor values were determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. Currently, these density factors are fixed values for all latitudes and Ls. Results will be presented from work being done to derive better multipliers by including variation with latitude and/or Ls by comparison of MapYear 0 output directly against TES limb data. The addition of these more precise density factors to Mars-GRAM 2005 Release 1.4 will improve the results of the sensitivity studies done for large optical depths.
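The density-factor fix described in these Mars-GRAM abstracts rests on a simple proportionality: with an unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure produces the same percentage change in density at every altitude, so one multiplicative factor corrects the whole profile. A minimal sketch of that adjustment (the profile values and pressures below are illustrative placeholders, not Mars-GRAM data):

```python
def adjust_profile(density_profile, p_surf_model, p_surf_target):
    """Scale a modeled density profile by a single surface-pressure ratio.

    density_profile: list of (altitude_km, density) pairs from the model.
    The same factor applies at all altitudes because, under an unchanged
    temperature profile and hydrostatic equilibrium, a percentage change
    in surface pressure yields the same percentage change in density
    everywhere in the column.
    """
    factor = p_surf_target / p_surf_model
    return [(z, rho * factor) for z, rho in density_profile]

# Illustrative values only (kg/m^3 vs. altitude; pressures in Pa).
profile = [(0.0, 1.6e-2), (40.0, 3.0e-4), (80.0, 2.0e-6)]
adjusted = adjust_profile(profile, p_surf_model=600.0, p_surf_target=660.0)
```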

  3. Textbook-Bundled Metacognitive Tools: A Study of LearnSmart's Efficacy in General Chemistry

    ERIC Educational Resources Information Center

    Thadani, Vandana; Bouvier-Brown, Nicole C.

    2016-01-01

    College textbook publishers increasingly bundle sophisticated technology-based study tools with their texts. These tools appear promising, but empirical work on their efficacy is needed. We examined whether LearnSmart, a study tool bundled with McGraw-Hill's textbook "Chemistry" (Chang & Goldsby, 2013), improved learning in an…

  4. Accurate Radiometry from Space: An Essential Tool for Climate Studies

    NASA Technical Reports Server (NTRS)

    Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma

    2011-01-01

    The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a primary standard and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'. Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance

  5. The accuracy of dose-rate-regulated tracking: a parametric study

    NASA Astrophysics Data System (ADS)

    Han-Oh, S.; Yi, B.; Berman, B. L.; Lerma, F.; Yu, C.

    2010-02-01

    Dose-rate-regulated tracking (DRRT) is a novel tumor-tracking technique based on a preprogrammed multileaf-collimator (MLC) sequence and dose-rate modulation. We have performed a parametric study on how limitations of the DRRT system and breathing irregularities affect the tracking error and the duty cycle of DRRT. The time delay and the allowed dose-rate increment (continuous-, discrete-increment or beam switching) were used as two parameters for the DRRT system limitation. The breathing irregularity was quantified in terms of three variables, namely, breathing period variation, variation of peak-to-peak amplitude and baseline drift. DRRT treatments were simulated using 2126 breathing cycles obtained from 24 lung-cancer patients. Tracking errors and duty cycles from all 24 patients were combined to evaluate their dependence on each parameter or variable. The tracking error and the duty cycle show a modest difference among the three dose-rate-increment cases. Time delay, breathing peak-to-peak variation and baseline drift are the main factors affecting tracking error. The duty cycle is affected mostly by the allowed dose-rate increment, peak-to-peak variation and baseline drift.
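Time delay is identified above as a main driver of tracking error. As a hedged illustration of why (a toy model, not the authors' simulation, which used 2126 recorded patient breathing cycles), the mean absolute error introduced by a fixed delay on an idealized sinusoidal breathing trace can be estimated numerically:

```python
import math

def mean_delay_error(period_s, amplitude_mm, delay_s, n=10000):
    """Mean absolute tracking error over one breathing cycle when the
    beam follows a sinusoidal trace with a fixed system time delay.

    Toy model: tumor position x(t) = A*sin(2*pi*t/T); the delivered
    beam position is x(t - delay), so the error is |x(t) - x(t - delay)|.
    """
    def x(t):
        return amplitude_mm * math.sin(2 * math.pi * t / period_s)
    ts = [i * period_s / n for i in range(n)]
    return sum(abs(x(t) - x(t - delay_s)) for t in ts) / n

# Example: 4 s breathing period, 5 mm amplitude, 0.2 s system delay.
err = mean_delay_error(4.0, 5.0, 0.2)
```

For this model the closed form is (4A/pi)*sin(pi*delay/T), so even a 0.2 s delay on a 4 s cycle already costs about 1 mm of mean error; error grows nearly linearly with delay for delays short relative to the period.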

  6. Paramedic accuracy in using a decision support algorithm when recognising adult death: a prospective cohort study

    PubMed Central

    Jones, T; Woollard, M

    2003-01-01

    Method: This prospective 16 month cohort study evaluated 188 events of recognition of adult death (ROAD) by paramedics in the period from November 1999 to February 2001. Results: Of 188 ROAD applications, errors were made in 13 cases (6.9%, 95% CI 3.7% to 11.5%). Additionally, there was one adverse clinical incident associated with a case in which ROAD was applied (0.5%, 95% CI 0.01% to 2.9%). ECG strips were unavailable for eight cases, although ambulance records indicated a rhythm of asystole for each of these. Assuming this diagnosis was correct, ROAD was used 174 times without errors (93%, 95% CI 88% to 96%). Assuming that it was not, the ROAD protocol was applied without errors in 166 cases (88.3%, 95% CI 82.8% to 92.5%). None of the errors made appeared to be attributable to poor clinical decision making, compromised treatment, or changed patient outcome. The mean on-scene time for ambulance crews using the ROAD policy was 60 minutes. Conclusion: Paramedics can accurately apply a decision support algorithm when recognising adult death. It could be argued that the attendance of a medical practitioner to confirm death is therefore an inappropriate use of such personnel and may result in unnecessarily protracted on-scene times for ambulance crews. Further research is required to confirm this, and to determine the proportion of patients suitable for recognition of adult death who are actually identified as such by paramedics. PMID:12954697
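The interval reported for 13 errors in 188 applications (6.9%, 95% CI 3.7% to 11.5%) is consistent with an exact (Clopper-Pearson) binomial interval. A stdlib-only sketch that recovers such an interval by bisection (illustrative; the abstract does not state which interval method the authors used):

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided CI for a binomial proportion, found by bisection."""
    def root(f):
        # f is monotone increasing in p on [0, 1]; find p with f(p) = 0.
        lo, hi = 0.0, 1.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if f(mid) < 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    # Lower bound: smallest p with P(X >= k | p) = alpha/2.
    lower = 0.0 if k == 0 else root(lambda p: (1 - binom_cdf(k - 1, n, p)) - alpha / 2)
    # Upper bound: largest p with P(X <= k | p) = alpha/2.
    upper = 1.0 if k == n else root(lambda p: alpha / 2 - binom_cdf(k, n, p))
    return lower, upper

low, high = clopper_pearson(13, 188)  # roughly (0.037, 0.115)
```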

  7. Acute Response in vivo of a Fiber-Optic Sensor for Continuous Glucose Monitoring from Canine Studies on Point Accuracy

    PubMed Central

    Liao, Kuo-Chih; Chang, Shih-Chieh; Chiu, Cheng-Yang; Chou, Yu-Hsiang

    2010-01-01

    The objective of this study was to evaluate the acute response of Sencil™, a fiber-optic sensor, in point accuracy for glucose monitoring in vivo on healthy dogs under anesthesia. A total of four dogs with clinically normal glycemia were implanted with one sensor each in the chest region to measure the interstitial glucose concentration during the ovariohysterectomy procedure. The data was acquired every 10 seconds after initiation, and was compared to the concentration of venous plasma glucose sampled during the surgery procedures for accuracy of agreement analysis. In the four trials with a range of 71–297 mg/dL plasma glucose, the collected 21 pairs of ISF readings from the Sencil™ and the plasma reference showed superior dispersion of residue values than the conventional system, and a linear correlation (the Pearson correlation coefficient is 0.9288 and the y-intercept is 14.22 mg/dL). The MAD (17.6 mg/dL) and RMAD (16.16%) of Sencil™ measurements were in the comparable range of the conventional system. The Clarke error grid analysis indicated that 100% of the paired points were in the clinically acceptable zone A (61.9%) and B (38.1%). PMID:22163627

  8. Acute response in vivo of a fiber-optic sensor for continuous glucose monitoring from canine studies on point accuracy.

    PubMed

    Liao, Kuo-Chih; Chang, Shih-Chieh; Chiu, Cheng-Yang; Chou, Yu-Hsiang

    2010-01-01

    The objective of this study was to evaluate the acute response of Sencil™, a fiber-optic sensor, in point accuracy for glucose monitoring in vivo on healthy dogs under anesthesia. A total of four dogs with clinically normal glycemia were implanted with one sensor each in the chest region to measure the interstitial glucose concentration during the ovariohysterectomy procedure. The data was acquired every 10 seconds after initiation, and was compared to the concentration of venous plasma glucose sampled during the surgery procedures for accuracy of agreement analysis. In the four trials with a range of 71-297 mg/dL plasma glucose, the collected 21 pairs of ISF readings from the Sencil™ and the plasma reference showed superior dispersion of residue values than the conventional system, and a linear correlation (the Pearson correlation coefficient is 0.9288 and the y-intercept is 14.22 mg/dL). The MAD (17.6 mg/dL) and RMAD (16.16%) of Sencil™ measurements were in the comparable range of the conventional system. The Clarke error grid analysis indicated that 100% of the paired points were in the clinically acceptable zone A (61.9%) and B (38.1%). PMID:22163627
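The agreement measures quoted in these two records (Pearson correlation, mean absolute difference MAD, and relative MAD against the plasma reference) are straightforward to compute from paired readings. A minimal sketch with invented sample pairs (the study's 21 actual pairs are not reproduced here):

```python
def agreement_stats(sensor, reference):
    """Pearson r, mean absolute difference (MAD, same units as input),
    and relative MAD (%) for paired sensor/reference glucose readings."""
    n = len(sensor)
    mean_s = sum(sensor) / n
    mean_r = sum(reference) / n
    cov = sum((s - mean_s) * (r - mean_r) for s, r in zip(sensor, reference))
    var_s = sum((s - mean_s) ** 2 for s in sensor)
    var_r = sum((r - mean_r) ** 2 for r in reference)
    pearson = cov / (var_s * var_r) ** 0.5
    mad = sum(abs(s - r) for s, r in zip(sensor, reference)) / n
    rmad = 100 * sum(abs(s - r) / r for s, r in zip(sensor, reference)) / n
    return pearson, mad, rmad

# Invented ISF readings vs. invented plasma values, mg/dL.
r, mad, rmad = agreement_stats([100.0, 150.0, 210.0], [95.0, 160.0, 200.0])
```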

  9. Accuracy of tumor motion compensation algorithm from a robotic respiratory tracking system: A simulation study

    SciTech Connect

    Seppenwoolde, Yvette; Berbeco, Ross I.; Nishioka, Seiko; Shirato, Hiroki; Heijmen, Ben

    2007-07-15

    The Synchrony™ Respiratory Tracking System (RTS) is a treatment option of the CyberKnife robotic treatment device to irradiate extra-cranial tumors that move due to respiration. Advantages of RTS are that patients can breathe normally and that there is no loss of linac duty cycle such as with gated therapy. Tracking is based on a measured correspondence model (linear or polynomial) between internal tumor motion and external (chest/abdominal) marker motion. The radiation beam follows the tumor movement via the continuously measured external marker motion. To establish the correspondence model at the start of treatment, the 3D internal tumor position is determined at 15 discrete time points by automatic detection of implanted gold fiducials in two orthogonal x-ray images; simultaneously, the positions of the external markers are measured. During the treatment, the relationship between internal and external marker positions is continuously accounted for and is regularly checked and updated. Here we use computer simulations based on continuously and simultaneously recorded internal and external marker positions to investigate the effectiveness of tumor tracking by the RTS. The CyberKnife does not allow continuous acquisition of x-ray images to follow the moving internal markers (typical imaging frequency is once per minute). Therefore, for the simulations, we have used data for eight lung cancer patients treated with respiratory gating. All of these patients had simultaneous and continuous recordings of both internal tumor motion and external abdominal motion. The available continuous relationship between internal and external markers for these patients allowed investigation of the consequences of the lower acquisition frequency of the RTS. With the use of the RTS, simulated treatment errors due to breathing motion were reduced largely and consistently over treatment time for all studied patients. A considerable part of the maximum reduction in treatment error

  10. Molecular characterization of ten F8 splicing mutations in RNA isolated from patient's leucocytes: assessment of in silico prediction tools accuracy.

    PubMed

    Martorell, L; Corrales, I; Ramirez, L; Parra, R; Raya, A; Barquinero, J; Vidal, F

    2015-03-01

    Although 8% of reported FVIII gene (F8) mutations responsible for haemophilia A (HA) affect mRNA processing, very few have been fully characterized at the mRNA level and/or had their biological consequences systematically predicted by in silico analysis. This study aimed to elucidate the effect of potential splice site mutations (PSSM) on F8 mRNA processing, investigate its correlation with disease severity, and assess their concordance with in silico predictions. We studied the F8 mRNA from the leucocytes of 10 HA patients with PSSM by RT-PCR and compared the experimental results with those predicted in silico. The mRNA analysis could explain all the phenotypes observed and demonstrated exon skipping in six cases (c.222G>A, c.601+1delG, c.602-11T>G, c.671-3C>G, c.6115+9C>G and c.6116-1G>A) and activation of cryptic splicing sites, both donor (c.1009+1G>A and c.1009+3A>C) and acceptor sites (c.266-3delC and c.5587-1G>A). In contrast, the in silico analysis was able to predict the score variation of most of the affected splice sites, but the precise mechanism could only be correctly determined in two of the 10 mutations analysed. In addition, we have detected aberrant F8 transcripts, even in healthy controls, so this must be taken into account as they could mask the actual contribution of some PSSM. We conclude that F8 mRNA analysis using leucocytes still constitutes an excellent approach to investigate the transcriptional effects of the PSSM in HA, whereas prediction in silico is not always reliable for diagnostic decision-making.

  11. VIGS: a tool to study fruit development in Solanum lycopersicum.

    PubMed

    Fernandez-Moreno, Josefina-Patricia; Orzaez, Diego; Granell, Antonio

    2013-01-01

    A visually traceable system for fast analysis of gene functions based on Fruit-VIGS methodology is described. In our system, the anthocyanin accumulation from purple transgenic tomato lines provides the appropriate background for fruit-specific gene silencing. The tomato Del/Ros1 background ectopically expresses the Delila (Del) and Rosea1 (Ros1) transgenes under the control of the fruit-ripening E8 promoter, specifically activating anthocyanin biosynthesis during tomato fruit ripening. Virus-Induced Gene Silencing (VIGS) of Delila and Rosea1 produces an easily identifiable color change in the silenced area. Del/Ros1 VIGS is achieved by agroinjection of an infective clone of Tobacco Rattle Virus (pTRV1 and pTRV2 binary plasmids) directly into the tomato fruit. The infective clone contains a small fragment of the Del and Ros1 coding regions (named the DR module). The co-silencing of the reporter Del/Ros1 genes and a gene of interest (GOI) in the same region enables us to identify the precise region where silencing is occurring. The function of the GOI is established by comparing silenced sectors of fruits where both the GOI and the reporter DR genes have been silenced with fruits in which only the reporter DR genes have been silenced. The Gateway vector pTRV2_DR_GW was developed to facilitate the cloning of different GOIs together with the DR genes. Our tool is particularly useful to study genes involved in metabolic processes during fruit ripening, which by themselves would not produce a visual phenotype. PMID:23386304

  12. High-resolution terrain and landcover mapping with a lightweight, semi-autonomous, remotely-piloted aircraft (RPA): a case study and accuracy assessment

    NASA Astrophysics Data System (ADS)

    Hugenholtz, C.; Whitehead, K.; Moorman, B.; Brown, O.; Hamilton, T.; Barchyn, T.; Riddell, K.; LeClair, A.

    2012-04-01

    Remotely-piloted aircraft (RPA) have evolved into a viable research tool for a range of Earth science applications. Significant technological advances driven by military and surveillance programs have steadily become mainstream and affordable. Thus, RPA technology has the potential to reinvigorate various aspects of geomorphological research, especially at the landform scale. In this presentation we will report results and experiences using a lightweight, semi-autonomous RPA for high-resolution terrain and landcover mapping. The goal was to test the accuracy of the photogrammetrically-derived terrain model and assess the overall performance of the RPA system for landform characterization. The test site comprised an area of semi-vegetated sand dunes in the Canadian Prairies. The RPA survey was conducted with a RQ-84Z AreoHawk (Hawkeye UAV Ltd) and a low-cost digital camera. During the survey the RPA acquired images semi-autonomously with the aid of proprietary mission planning software developed by Accuas Inc. A total of 44 GCPs were used in the block adjustment to create the terrain model, while an additional 400 independent GPS check points were used for accuracy assessment. The 1 m resolution terrain model developed with Trimble's INPHO photogrammetric software was compared to the independent check points, yielding a RMS error comparable to airborne LiDAR data. The resulting orthophoto mosaic had a resolution of 0.1 m, revealing a number of geomorphic features beyond the resolution of airborne and QuickBird imagery. Overall, this case study highlights the potential of RPA technology for resolving terrain and landcover attributes at the landform scale. We believe one of the most significant and emerging applications of RPA in geomorphology is their potential to quantify rates of landform erosion/deposition in an affordable and flexible manner, allowing investigators to reduce the gap between recorded and natural morphodynamics.
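The accuracy assessment described above (terrain model versus 400 independent GPS check points) boils down to a root-mean-square error over the check-point differences. A minimal sketch, with invented elevation values standing in for the real check-point data:

```python
def rms_error(model_elev, checkpoint_elev):
    """Root-mean-square error of modeled elevations against independent
    check-point elevations (same units as the inputs, e.g. metres)."""
    n = len(model_elev)
    return (sum((m - c) ** 2 for m, c in zip(model_elev, checkpoint_elev)) / n) ** 0.5

# Invented values (metres): terrain-model elevations vs. GPS check points.
rmse = rms_error([701.2, 702.9, 704.1], [701.0, 703.0, 704.5])
```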

  13. Developing a temperature sensitive tool for studying spin dissipation

    NASA Astrophysics Data System (ADS)

    Wickey, Kurtis Jon

    Measuring the thermodynamic properties of nanoscale structures is becoming increasingly important as heterostructures and devices shrink in size. For example, recent discoveries of spin thermal effects such as spin Seebeck and spin Peltier show that thermal gradients can manipulate spin systems and vice versa. However, the relevant interactions occur within a spin diffusion length of a spin active interface, making study of these spin thermal effects challenging. In addition, recent ferromagnetic resonance studies of spatially confined nanomagnets have shown unique magnon modes in arrays and lines which may give rise to unique magnon-phonon interactions. In this case, the small volume of magnetic material presents a challenge to measurement and as a result the bulk of the work is done on arrays with measurements of the magnetization of individual particles possible through various microscopies but limited access to thermal properties. As a result, tools capable of measuring the thermal properties of nanoscale structures are required to fully explore this emerging science. One approach to addressing this challenge is the use of microscale suspended platforms that maximize their sensitivity to these spin thermal interactions through thermal isolation from their surroundings. Combining this thermal decoupling with sensitive thermometry allows for the measurement of nanojoule heat accumulations, such as those resulting from the small heat flows associated with spin transport and spin relaxation. As these heat flows may manifest themselves in a variety of spin-thermal effects, the development of measurement platforms that can be tailored to optimize their sensitivity to specific thermal measurements is essential. To address these needs, I have fabricated thermally isolated platforms using a unique focused ion beam (FIB) machining that allow for flexible geometries as well as a wide choice of material systems. The thermal characteristics of these platforms were

  14. Immersion defectivity study with volume production immersion lithography tool

    NASA Astrophysics Data System (ADS)

    Nakano, Katsushi; Kato, Hiroshi; Fujiwara, Tomoharu; Shiraishi, K.; Iriuchijima, Yasuhiro; Owa, Soichi; Malik, Irfan; Woodman, Steve; Terala, Prasad; Pelissier, Christine; Zhang, Haiping

    2007-03-01

    ArF immersion lithography has become accepted as the critical layer patterning solution for lithography going forward. Volume production of 55 nm devices using immersion lithography has begun. One of the key issues for the success of volume production immersion lithography is the control of immersion defectivity. Because the defectivity is influenced by the exposure tool, track, materials, and the wafer environment, a broad range of analysis and optimization is needed to minimize defect levels. Defect tests were performed using a dedicated immersion cluster consisting of a volume production immersion exposure tool, Nikon NSR-S609B, having NA of 1.07, and a resist coater-developer, TEL LITHIUS i+. Miniaturization of feature size by immersion lithography requires higher sensitivity defect inspection. In this paper, first we demonstrate the high sensitivity defect measurement using a next generation wafer inspection system, KLA-Tencor 2800 and Surfscan SP2, on both patterned and non-patterned wafers. Long-term defect stability is very important from the viewpoint of device mass production. Secondly, we present long-term defectivity data using a topcoat-less process. For tool and process qualification, a simple monitor method is required. Simple, non-pattern immersion scanned wafer measurement has been proposed elsewhere, but the correlation between such a non-pattern defect and pattern defect must be confirmed. In this paper, using a topcoat process, the correlation between topcoat defects and pattern defects is analyzed using the defect source analysis (DSA) method. In case of accidental tool contamination, a cleaning process should be established. Liquid cleaning is suitable because it can be easily introduced through the immersion nozzle. An in-situ tool cleaning method is introduced. A broad range of optimization of tools, materials, and processes provide convincing evidence that immersion lithography is ready for volume production chip manufacturing.

  15. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  16. The dimensional accuracy of polyvinyl siloxane impression materials using two different impression techniques: An in vitro study

    PubMed Central

    Kumari, Nirmala; Nandeeshwar, D. B.

    2015-01-01

    Aim of the Study: To evaluate and compare the linear dimensional changes of the three representative polyvinyl siloxane (PVS) impression materials and to compare the accuracy of single mix with double mix impression technique. Methodology: A study mold was prepared according to revised American Dental Association specification number 19 for nonaqueous elastic dental impression materials. The three PVS impression materials selected were Elite-HD, Imprint™ II Garant, and Aquasil Ultra Heavy. The two impression techniques used were single mix and double mix impression technique. A total of 60 specimens were made and after 24 h the specimens were measured using a profile projector. Statistical Analysis: The data were analyzed using one-way analysis of variance and significant differences were separated using the Student–Newman–Keuls test. Results: When all the three study group impression materials were compared for double mix technique, a statistically significant difference was found only between Imprint™ II Garant and Elite-HD (P < 0.05). Similarly, using single mix technique, statistically significant differences were found between Elite-HD and Imprint™ II Garant (P < 0.05) and also between Aquasil Ultra Heavy and Elite-HD (P < 0.05). When the linear dimensional accuracy of all three impression materials in double mix impression technique and single mix impression technique were compared with the control group, Imprint™ II Garant showed the values nearest to those of the master die, followed by Aquasil Ultra Heavy and Elite-HD respectively. Conclusion: Among the impression materials, Imprint™ II Garant showed the least dimensional change. Among the impression techniques, double mix impression technique showed the better results. PMID:26929515

  17. Neutron Reflectivity as a Tool for Physics-Based Studies of Model Bacterial Membranes.

    PubMed

    Barker, Robert D; McKinley, Laura E; Titmuss, Simon

    2016-01-01

    The principles of neutron reflectivity and its application as a tool to provide structural information at the (sub-) molecular unit length scale from models for bacterial membranes are described. The model membranes can take the form of a monolayer for a single leaflet spread at the air/water interface, or bilayers of increasing complexity at the solid/liquid interface. Solid-supported bilayers constrain the bilayer to 2D but can be used to characterize interactions with antimicrobial peptides and benchmark high throughput lab-based techniques. Floating bilayers allow for membrane fluctuations, making the phase behaviour more representative of native membranes. Bilayers of varying levels of compositional accuracy can now be constructed, facilitating studies with aims that range from characterizing the fundamental physical interactions, through to the characterization of accurate mimetics for the inner and outer membranes of Gram-negative bacteria. Studies of the interactions of antimicrobial peptides with monolayer and bilayer models for the inner and outer membranes have revealed information about the molecular control of the outer membrane permeability, and the mode of interaction of antimicrobials with both inner and outer membranes.

  19. A Visualization Tool for Managing and Studying Online Communications

    ERIC Educational Resources Information Center

    Gibbs, William J.; Olexa, Vladimir; Bernas, Ronan S.

    2006-01-01

    Most colleges and universities have adopted course management systems (e.g., Blackboard, WebCT). Worldwide, faculty and students use them for class communications and discussions. The discussion tools provided by course management systems, while powerful, often do not offer adequate capabilities to appraise communication patterns, online behaviors,…

  20. Information Literacy and Office Tool Competencies: A Benchmark Study

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Lim, Jeen-Su

    2010-01-01

    Present information science literature recognizes the importance of information technology to achieve information literacy. The authors report the results of a benchmarking student survey regarding perceived functional skills and competencies in word-processing and presentation tools. They used analysis of variance and regression analysis to…

  1. Study of hot hardness characteristics of tool steels

    NASA Technical Reports Server (NTRS)

    Chevalier, J. L.; Dietrich, M. W.; Zaretsky, E. V.

    1972-01-01

    Hardness measurements of tool steel materials in an electric furnace at elevated temperatures and in a low-oxygen environment are discussed. The development of an equation to predict short-term hardness as a function of the initial room-temperature hardness of the steel is reported. The types of steel involved in the process are identified.

  2. Continuous glucose monitoring and trend accuracy: news about a trend compass.

    PubMed

    Signal, Matthew; Gottlieb, Rebecca; Le Compte, Aaron; Chase, J Geoffrey

    2014-09-01

    Continuous glucose monitoring (CGM) devices are being increasingly used to monitor glycemia in people with diabetes. One advantage with CGM is the ability to monitor the trend of sensor glucose (SG) over time. However, there are few metrics available for assessing the trend accuracy of CGM devices. The aim of this study was to develop an easy-to-interpret tool for assessing trend accuracy of CGM data. SG data from CGM were compared to hourly blood glucose (BG) measurements and trend accuracy was quantified using the dot product. Trend accuracy results are displayed on the Trend Compass, which depicts trend accuracy as a function of BG. A trend performance table and Trend Index (TI) metric are also proposed. The Trend Compass was tested using simulated CGM data with varying levels of error and variability, as well as real clinical CGM data. The results show that the Trend Compass is an effective tool for differentiating good trend accuracy from poor trend accuracy, independent of glycemic variability. Furthermore, the real clinical data show that the Trend Compass assesses trend accuracy independent of point bias error. Finally, the importance of assessing trend accuracy as a function of BG level is highlighted in a case example of low and falling BG data, with corresponding rising SG data. This study developed a simple-to-use tool for quantifying trend accuracy. The resulting trend accuracy is easily interpreted on the Trend Compass plot, and if required, performance table and TI metric. PMID:24876437
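
    The abstract says only that trend accuracy was "quantified using the dot product"; one plausible reading, sketched below, treats each trend over an interval as a 2-D vector (time step, glucose change) and takes the cosine of the angle between the reference and sensor vectors. The function name and normalisation here are assumptions for illustration, not the published Trend Compass computation:

```python
import math

def trend_cosine(dt, d_ref, d_sensor):
    """Cosine similarity between a reference and a sensor glucose trend.

    Each trend over an interval of dt minutes is represented as the 2-D
    vector (dt, glucose change in mg/dL). The cosine of the angle between
    the two vectors is 1.0 when the trends agree exactly and falls as the
    trend directions diverge.
    """
    ref = (dt, d_ref)
    sen = (dt, d_sensor)
    dot = ref[0] * sen[0] + ref[1] * sen[1]
    mag = math.hypot(ref[0], ref[1]) * math.hypot(sen[0], sen[1])
    return dot / mag
```

    A sensor trend that matches the reference gives 1.0; a sensor rising while the reference falls gives a lower (possibly negative) value, which is the failure mode the case example above highlights.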

  3. A Comparison of Accuracy of Matrix Impression System with Putty Reline Technique and Multiple Mix Technique: An In Vitro Study

    PubMed Central

    Kumar, M Praveen; Patil, Suneel G; Dheeraj, Bhandari; Reddy, Keshav; Goel, Dinker; Krishna, Gopi

    2015-01-01

    Background: The difficulty in obtaining an acceptable impression increases exponentially as the number of abutments increases. Accuracy of the impression material and the use of a suitable impression technique are of utmost importance in the fabrication of a fixed partial denture. This study compared the accuracy of the matrix impression system with the conventional putty reline and multiple mix techniques for individual dies by comparing the inter-abutment distance in the casts obtained from the impressions. Materials and Methods: Three groups of 10 impressions each, one per impression technique (matrix impression system, putty reline technique, and multiple mix technique), were made of a master die. Typodont teeth were embedded in a maxillary frasaco model base. The left first premolar was removed to create a three-unit fixed partial denture situation, the left canine and second premolar were prepared conservatively, and hatch marks were made on the abutment teeth. The final casts obtained from the impressions were examined under a profile projector, and the inter-abutment distance was calculated for all the casts and compared. Results: In the mesiodistal dimension, the percentage deviation from the master model in Group I was 0.1 and 0.2, in Group II 0.9 and 0.3, and in Group III 1.6 and 1.5, respectively. In the labio-palatal dimension, the percentage deviation from the master model in Group I was 0.01 and 0.4, in Group II 1.9 and 1.3, and in Group III 2.2 and 2.0, respectively. In the cervico-incisal dimension, the percentage deviation from the master model in Group I was 1.1 and 0.2, in Group II 3.9 and 1.7, and in Group III 1.9 and 3.0, respectively. In the inter-abutment dimension of the dies, the percentage deviation from the master model was 0.1 in Group I, 0.6 in Group II, and 1.0 in Group III. Conclusion: The matrix impression system showed greater accuracy of reproduction for individual dies when compared with the putty reline and multiple mix techniques.
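
    The percentage deviations reported above are plain relative errors of a cast dimension against the corresponding master-model dimension; a minimal sketch of that arithmetic (the helper name is hypothetical):

```python
def percent_deviation(measured, master):
    """Percentage deviation of a measured cast dimension from the
    master-model dimension (both in the same units, e.g. mm)."""
    return abs(measured - master) / master * 100.0
```

    For example, a 10.05 mm inter-abutment distance on a cast against a 10.00 mm master distance is a 0.5% deviation.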

  4. HEFCE's People Management Self-Assessment Tool: Ticking Boxes or Adding Value? A Case Study

    ERIC Educational Resources Information Center

    McDonald, Claire

    2009-01-01

    This article examines one specific organisational development tool in depth and uses a case study to investigate whether using the tool is more than a tick-box exercise and really can add value and help organisations to develop and improve. The People Management Self-Assessment Tool (SAT) is used to examine higher education institutions' (HEIs)…

  5. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    EPA Science Inventory

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  6. Social Networking Tools and Teacher Education Learning Communities: A Case Study

    ERIC Educational Resources Information Center

    Poulin, Michael T.

    2014-01-01

    Social networking tools have become an integral part of a pre-service teacher's educational experience. As a result, the educational value of social networking tools in teacher preparation programs must be examined. The specific problem addressed in this study is that the role of social networking tools in teacher education learning communities…

  7. Dosimetric accuracy of the cone-beam CT-based treatment planning of the Vero system: a phantom study.

    PubMed

    Yohannes, Indra; Prasetio, Heru; Kallis, Karoline; Bert, Christoph

    2016-01-01

    We report an investigation on the accuracy of dose calculation based on the cone-beam computed tomography (CBCT) images of the nonbowtie filter kV imaging system of the Vero linear accelerator. Different sets of materials and tube voltages were employed to generate the Hounsfield unit lookup tables (HLUTs) for both CBCT and fan-beam CT (FBCT) systems. The HLUTs were then implemented for the dose calculation in a treatment planning system (TPS). Dosimetric evaluation was carried out on an in-house-developed cube phantom that consists of water-equivalent slabs and inhomogeneity inserts. Two independent dosimeters positioned in the cube phantom were used in this study for point-dose and two-dimensional (2D) dose distribution measurements. The differences of HLUTs from various materials and tube voltages in both CT systems resulted in differences in dose calculation accuracy. We found that the higher the tube voltage used to obtain CT images, the better the point-dose calculation and the gamma passing rate of the 2D dose distribution agree with the values determined in the TPS. Moreover, insert materials that are not tissue-equivalent led to greater dose-calculation inaccuracy. There were negligible differences in dosimetric evaluation between the CBCT- and FBCT-based treatment planning if the HLUTs were generated using tissue-equivalent materials. In this study, the CBCT images of the Vero system from a complex inhomogeneity phantom can be applied for the TPS dose calculation if the system is calibrated using tissue-equivalent materials scanned at high tube voltage (i.e., 120 kV). PMID:27455496

  8. Precision and accuracy in the quantitative analysis of biological samples by accelerator mass spectrometry: application in microdose absolute bioavailability studies.

    PubMed

    Gao, Lan; Li, Jing; Kasserra, Claudia; Song, Qi; Arjomand, Ali; Hesk, David; Chowdhury, Swapan K

    2011-07-15

    Determination of the pharmacokinetics and absolute bioavailability of an experimental compound, SCH 900518, following an 89.7 nCi (100 μg) intravenous (iv) dose of (14)C-SCH 900518 given 2 h after oral administration of 200 mg of nonradiolabeled SCH 900518 to six healthy male subjects is described. The plasma concentration of SCH 900518 was measured using a validated LC-MS/MS system, and accelerator mass spectrometry (AMS) was used for quantitative plasma (14)C-SCH 900518 concentration determination. Calibration standards and quality controls were included in every batch of sample analysis by AMS to ensure acceptable assay quality. Plasma (14)C-SCH 900518 concentrations were derived from the regression function established from the calibration standards, rather than directly from isotopic ratios from the AMS measurement. The precision and accuracy of quality controls and calibration standards met the requirements of bioanalytical guidance (U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Veterinary Medicine. Guidance for Industry: Bioanalytical Method Validation (ucm070107), May 2001. http://www.fda.gov/downloads/Drugs/GuidanceCompilanceRegulatoryInformation/Guidances/ucm070107.pdf ). The AMS measurement had a linear response range from 0.0159 to 9.07 dpm/mL for plasma (14)C-SCH 900518 concentrations. The CV and accuracy were 3.4-8.5% and 94-108% (82-119% for the lower limit of quantitation (LLOQ)), respectively, with a correlation coefficient of 0.9998. The absolute bioavailability was calculated from the dose-normalized area under the curve of the iv and oral doses after the plasma concentrations were plotted against the sampling time post oral dose. The mean absolute bioavailability of SCH 900518 was 40.8% (range 16.8-60.6%). The typical accuracy and standard deviation in AMS quantitative analysis of drugs from human plasma samples have been reported for the first time, and the impact of these

  9. Accuracy of a Wrist-Worn Wearable Device for Monitoring Heart Rates in Hospital Inpatients: A Prospective Observational Study

    PubMed Central

    Kroll, Ryan R; Boyd, J Gordon

    2016-01-01

    Background As the sensing capabilities of wearable devices improve, there is increasing interest in their application in medical settings. Capabilities such as heart rate monitoring may be useful in hospitalized patients as a means of enhancing routine monitoring or as part of an early warning system to detect clinical deterioration. Objective To evaluate the accuracy of heart rate monitoring by a personal fitness tracker (PFT) among hospital inpatients. Methods We conducted a prospective observational study of 50 stable patients in the intensive care unit who each completed 24 hours of heart rate monitoring using a wrist-worn PFT. Accuracy of heart rate recordings was compared with gold standard measurements derived from continuous electrocardiographic (cECG) monitoring. The accuracy of heart rates measured by pulse oximetry (Spo2.R) was also measured as a positive control. Results On a per-patient basis, PFT-derived heart rate values were slightly lower than those derived from cECG monitoring (average bias of −1.14 beats per minute [bpm], with limits of agreement of 24 bpm). By comparison, Spo2.R recordings produced more accurate values (average bias of +0.15 bpm, limits of agreement of 13 bpm, P<.001 as compared with PFT). Personal fitness tracker device performance was significantly better in patients in sinus rhythm than in those who were not (average bias −0.99 bpm vs −5.02 bpm, P=.02). Conclusions Personal fitness tracker–derived heart rates were slightly lower than those derived from cECG monitoring in real-world testing and not as accurate as Spo2.R-derived heart rates. Performance was worse among patients who were not in sinus rhythm. Further clinical evaluation is indicated to see if PFTs can augment early warning systems in hospitals. Trial Registration ClinicalTrials.gov NCT02527408; https://clinicaltrials.gov/ct2/show/NCT02527408 (Archived by WebCite at  http://www.webcitation.org/6kOFez3on) PMID:27651304
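
    The bias and limits-of-agreement figures quoted above are standard Bland-Altman quantities computed from paired readings; a minimal sketch of that computation (the function name and the conventional 1.96-SD limits are assumptions, not the study's code):

```python
import statistics

def bland_altman(reference, device):
    """Mean bias and 95% limits of agreement for paired readings.

    reference, device: equal-length sequences of simultaneous
    measurements (e.g. cECG-derived vs. tracker-derived heart rates
    in bpm). A negative bias means the device reads low on average.
    """
    diffs = [d - r for r, d in zip(reference, device)]
    bias = statistics.fmean(diffs)
    spread = 1.96 * statistics.stdev(diffs)  # half-width of the limits
    return bias, (bias - spread, bias + spread)
```

    A wide limits-of-agreement interval (such as the 24 bpm reported for the tracker versus 13 bpm for pulse oximetry) indicates larger beat-to-beat disagreement even when the average bias is small.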

  10. A comparative evaluation of the marginal accuracy of crowns fabricated from four commercially available provisional materials: An in vitro study

    PubMed Central

    Amin, Bhavya Mohandas; Aras, Meena Ajay; Chitre, Vidya

    2015-01-01

    Purpose: The purpose of this in vitro study was to evaluate and compare the primary marginal accuracy of four commercially available provisional materials (Protemp 4, Luxatemp Star, Visalys Temp, and DPI tooth moulding powder and liquid) at two time intervals (10 and 30 min). Materials and Methods: A customized stainless steel master model containing two interchangeable dies was used for fabrication of provisional crowns. Forty crowns (n = 10 per material) were fabricated, and each crown was evaluated under a stereomicroscope. Vertical marginal discrepancies were noted and compared at 10 min from the start of mixing and again at 30 min. Observations and Results: Protemp 4 showed the least vertical marginal discrepancy (71.59 μm), followed by Luxatemp Star (91.93 μm) at 10 min. DPI showed a marginal discrepancy of 95.94 μm, while Visalys Temp crowns had a vertical marginal discrepancy of 106.81 μm. There was a significant difference between the marginal discrepancy values of Protemp 4 and Visalys Temp. At 30 min, there was a significant difference between the marginal discrepancy of Protemp 4 crowns (83.11 μm) and Visalys Temp crowns (128.97 μm) and between Protemp 4 and DPI (118.88 μm). No significant differences were observed between Protemp 4 and Luxatemp Star. Conclusion: The vertical marginal discrepancy of temporary crowns fabricated from the four commercially available provisional materials ranged from 71 to 106 μm immediately after fabrication (at 10 min from the start of mix) and from 83 to 128 μm at 30 min from the start of mix. The time elapsed after mixing had a significant influence on the marginal accuracy of the crowns. PMID:26097348

  11. Accuracy and Precision of Three-Dimensional Low Dose CT Compared to Standard RSA in Acetabular Cups: An Experimental Study

    PubMed Central

    Brodén, Cyrus; Olivecrona, Henrik; Maguire, Gerald Q.; Noz, Marilyn E.; Zeleznik, Michael P.; Sköldenberg, Olof

    2016-01-01

    Background and Purpose. The gold standard for detection of implant wear and migration is currently radiostereometry (RSA). The purpose of this study is to compare a three-dimensional computed tomography technique (3D CT) to standard RSA as an alternative technique for measuring migration of acetabular cups in total hip arthroplasty. Materials and Methods. With tantalum beads, we marked one cemented and one uncemented cup and mounted these on a similarly marked pelvic model. A comparison was made between 3D CT and standard RSA for measuring migration. Twelve repeated stereoradiographs and CT scans with double examinations in each position and gradual migration of the implants were made. Precision and accuracy of the 3D CT were calculated. Results. The accuracy of the 3D CT ranged between 0.07 and 0.32 mm for translations and 0.21 and 0.82° for rotation. The precision ranged between 0.01 and 0.09 mm for translations and 0.06 and 0.29° for rotations, respectively. For standard RSA, the precision ranged between 0.04 and 0.09 mm for translations and 0.08 and 0.32° for rotations, respectively. There was no significant difference in precision between 3D CT and standard RSA. The effective radiation dose of the 3D CT method, comparable to RSA, was estimated to be 0.33 mSv. Interpretation. Low dose 3D CT is a comparable method to standard RSA in an experimental setting. PMID:27478832

  12. Accuracy and stability of measuring GABA, glutamate, and glutamine by proton magnetic resonance spectroscopy: A phantom study at 4 Tesla

    NASA Astrophysics Data System (ADS)

    Henry, Michael E.; Lauriat, Tara L.; Shanahan, Meghan; Renshaw, Perry F.; Jensen, J. Eric

    2011-02-01

    Proton magnetic resonance spectroscopy has the potential to provide valuable information about alterations in gamma-aminobutyric acid (GABA), glutamate (Glu), and glutamine (Gln) in psychiatric and neurological disorders. In order to use this technique effectively, it is important to establish the accuracy and reproducibility of the methodology. In this study, phantoms with known metabolite concentrations were used to compare the accuracy of 2D J-resolved MRS, single-echo 30 ms PRESS, and GABA-edited MEGA-PRESS for measuring all three aforementioned neurochemicals simultaneously. The phantoms included metabolite concentrations above and below the physiological range and scans were performed at baseline, 1 week, and 1 month time-points. For GABA measurement, MEGA-PRESS proved optimal with a measured-to-target correlation of R2 = 0.999, with J-resolved providing R2 = 0.973 for GABA. All three methods proved effective in measuring Glu with R2 = 0.987 (30 ms PRESS), R2 = 0.996 (J-resolved) and R2 = 0.910 (MEGA-PRESS). J-resolved and MEGA-PRESS yielded good results for Gln measures with respective R2 = 0.855 (J-resolved) and R2 = 0.815 (MEGA-PRESS). The 30 ms PRESS method proved ineffective in measuring GABA and Gln. When measurement stability at in vivo concentration was assessed as a function of varying spectral quality, J-resolved proved the most stable and immune to signal-to-noise and linewidth fluctuation compared to MEGA-PRESS and 30 ms PRESS.
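
    The measured-to-target R² values above are squared Pearson correlations between the known phantom concentrations and the measured ones; a minimal sketch of that statistic, with the function name assumed:

```python
def r_squared(target, measured):
    """Squared Pearson correlation between known target concentrations
    and the concentrations measured by a given MRS method."""
    n = len(target)
    mean_t = sum(target) / n
    mean_m = sum(measured) / n
    cov = sum((t - mean_t) * (m - mean_m) for t, m in zip(target, measured))
    var_t = sum((t - mean_t) ** 2 for t in target)
    var_m = sum((m - mean_m) ** 2 for m in measured)
    return (cov * cov) / (var_t * var_m)
```

    Perfectly proportional measurements give 1.0; the further a method's readings scatter around the line of best fit, the lower the value, which is how figures like R² = 0.999 versus R² = 0.815 separate the methods above.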

  13. A new automatic blood pressure kit auscultates for accurate reading with a smartphone: A diagnostic accuracy study.

    PubMed

    Wu, Hongjun; Wang, Bingjian; Zhu, Xinpu; Chu, Guang; Zhang, Zhi

    2016-08-01

    The widely used oscillometric automated blood pressure (BP) monitor has been continuously questioned on its accuracy. A novel BP kit named Accutension, which adopts the Korotkoff auscultation method, was therefore devised. Accutension works with a miniature microphone, a pressure sensor, and a smartphone. The BP values are automatically displayed on the smartphone screen through the installed app. Data recorded in the phone can be played back and reconfirmed after measurement, and can also be uploaded and saved to the iCloud. The accuracy and consistency of this novel electronic auscultatory sphygmomanometer were preliminarily verified here. Thirty-two subjects were included and 82 qualified readings were obtained. The mean differences ± SD for systolic and diastolic BP readings between Accutension and a mercury sphygmomanometer were 0.87 ± 2.86 and -0.94 ± 2.93 mm Hg. Agreement between Accutension and the mercury sphygmomanometer was highly significant for systolic (ICC = 0.993, 95% confidence interval (CI): 0.989-0.995) and diastolic (ICC = 0.987, 95% CI: 0.979-0.991) readings. In conclusion, Accutension worked accurately based on our pilot study data. The difference was acceptable, and the ICC and Bland-Altman plots showed good agreement with manual measurements. Systolic readings of Accutension were slightly higher than those of manual measurement, while diastolic readings were slightly lower. One possible reason is that Accutension captures the first and last Korotkoff sounds more sensitively than the human ear does during manual measurement and avoids missing sounds, so it may be more accurate than the traditional mercury sphygmomanometer. By documenting and analyzing trends in BP values, Accutension helps the management of hypertension and thereby contributes to mobile health services. PMID:27512876

  15. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies.

    PubMed

    Sinaci, A Anil; Laleci Erturkmen, Gokce B; Gonul, Suat; Yuksel, Mustafa; Invernizzi, Paolo; Thakrar, Bharat; Pacaci, Anil; Cinar, H Alper; Cicekli, Nihan Kesim

    2015-01-01

    Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access to and analysis of patient data across different domains are critical; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at an upper level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases. PMID:26543873

  17. Zagreb Amblyopia Preschool Screening Study: near and distance visual acuity testing increase the diagnostic accuracy of screening for amblyopia

    PubMed Central

    Bušić, Mladen; Bjeloš, Mirjana; Petrovečki, Mladen; Kuzmanović Elabjer, Biljana; Bosnar, Damir; Ramić, Senad; Miletić, Daliborka; Andrijašević, Lidija; Kondža Krstonijević, Edita; Jakovljević, Vid; Bišćan Tvrdi, Ana; Predović, Jurica; Kokot, Antonio; Bišćan, Filip; Kovačević Ljubić, Mirna; Motušić Aras, Ranka

    2016-01-01

    Aim To present and evaluate a new screening protocol for amblyopia in preschool children. Methods The Zagreb Amblyopia Preschool Screening (ZAPS) study protocol screened for amblyopia by near and distance visual acuity (VA) testing of 15 648 children aged 48-54 months attending kindergartens in the City of Zagreb County between September 2011 and June 2014, using the Lea Symbols in lines test. If VA in either eye was >0.1 logMAR, the child was re-tested; if the child failed the re-test, he or she was referred for a comprehensive eye examination at the Eye Clinic. Results 78.04% of children passed the screening test. The estimated prevalence of amblyopia was 8.08%. Testability, sensitivity, and specificity of the ZAPS study protocol were 99.19%, 100.00%, and 96.68%, respectively. Conclusion The ZAPS study used the most discriminative VA test, with optotypes in lines, as such tests do not underestimate amblyopia. The estimated prevalence of amblyopia was considerably higher than reported elsewhere. To the best of our knowledge, the ZAPS study protocol reached the highest sensitivity and specificity reported when evaluating the diagnostic accuracy of VA tests for screening. The pass level, defined at ≤0.1 logMAR for 4-year-old children using Lea Symbols in lines, missed no amblyopia cases, indicating that both near and distance VA testing should be performed when screening for amblyopia. PMID:26935612
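
    Sensitivity and specificity figures like those above come from a standard 2x2 screening table; a minimal sketch of that arithmetic (the counts in the usage note are hypothetical, chosen only to mirror the shape of the reported figures, not the ZAPS data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 screening table.

    tp: screen-positive and truly affected   fn: screen-negative but affected
    fp: screen-positive but healthy          tn: screen-negative and healthy
    """
    sensitivity = tp / (tp + fn)  # fraction of affected children caught
    specificity = tn / (tn + fp)  # fraction of healthy children passed
    return sensitivity, specificity
```

    For example, screening_metrics(95, 40, 0, 1165) gives a sensitivity of 1.0 (no missed cases, as the protocol above reports) and a specificity of about 0.967.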

  18. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  19. Does gadolinium-based contrast material improve diagnostic accuracy of local invasion in rectal cancer MRI? A multireader study.

    PubMed

    Gollub, Marc J; Lakhman, Yulia; McGinty, Katrina; Weiser, Martin R; Sohn, Michael; Zheng, Junting; Shia, Jinru

    2015-02-01

    OBJECTIVE. The purpose of this study was to compare reader accuracy and agreement on rectal MRI with and without gadolinium administration in the detection of T4 rectal cancer. MATERIALS AND METHODS. In this study, two radiologists and one fellow independently interpreted all posttreatment MRI studies for patients with locally advanced or recurrent rectal cancer using unenhanced images alone or combined with contrast-enhanced images, with a minimum interval of 4 weeks. Readers evaluated involvement of surrounding structures on a 5-point scale and were blinded to pathology and disease stage. Sensitivity, specificity, negative predictive value, positive predictive value, and AUC were calculated and kappa statistics were used to describe interreader agreement. RESULTS. Seventy-two patients (38 men and 34 women) with a mean age of 61 years (range, 32-86 years) were evaluated. Fifteen patients had 32 organs invaded. Global AUCs without and with gadolinium administration were 0.79 and 0.77, 0.91 and 0.86, and 0.83 and 0.78 for readers 1, 2, and 3, respectively. AUCs before and after gadolinium administration were similar. Kappa values before and after gadolinium administration for pairs of readers ranged from 0.5 to 0.7. CONCLUSION. On the basis of pathology as a reference standard, the use of gadolinium during rectal MRI did not significantly improve radiologists' agreement or ability to detect T4 disease.

  20. Databases and web tools for cancer genomics study.

    PubMed

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-02-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repositories and analysis tools, and we hope this introduction will promote awareness of these resources and facilitate their use in the cancer research community. PMID:25707591

  1. Pitfalls at the root of facial assessment on photographs: a quantitative study of accuracy in positioning facial landmarks.

    PubMed

    Cummaudo, M; Guerzoni, M; Marasciuolo, L; Gibelli, D; Cigada, A; Obertovà, Z; Ratnayake, M; Poppa, P; Gabriel, P; Ritz-Timme, S; Cattaneo, C

    2013-05-01

    In recent years, facial analysis has gained considerable interest in forensic anthropology. The application of facial landmarks may bring relevant advantages to the analysis of 2D images by measuring distances and extracting quantitative indices. However, this is a complex task which depends upon the variability in positioning facial landmarks. In addition, the literature provides only general indications concerning the reliability of positioning facial landmarks on photographic material, and no study is available concerning the specific errors which may be encountered in such an operation. The aim of this study is to analyze the inter- and intra-observer error in defining facial landmarks on photographs by using software specifically developed for this purpose. Twenty-four operators were requested to define 22 facial landmarks on frontal view photographs and 11 on lateral view images; in addition, three operators repeated the procedure on the same photographs 20 times (at intervals of 24 h). In the frontal view, the landmarks with the least dispersion were the pupil, cheilion, endocanthion, and stomion (sto), and the landmarks with the highest dispersion were gonion, zygion, frontotemporale, tragion, and selion (se). In the lateral view, the landmarks with the least dispersion were se, pronasale, subnasale, and sto, whereas the landmarks with the highest dispersion were gnathion, pogonion, and tragion. Results confirm that few anatomical points can be defined with the highest accuracy and show the importance of the preliminary investigation of reliability in positioning facial landmarks. PMID:23515681
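
Landmark dispersion of the kind reported above can be quantified as the mean radial distance of repeated placements from their centroid. A minimal Python sketch with hypothetical coordinates (the study's raw placements are not given here):

```python
import math

def landmark_dispersion(points):
    """Mean radial distance of repeated landmark placements (x, y) from
    their centroid -- a simple per-landmark scatter measure."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / n

# Hypothetical pixel coordinates from repeated placements of two landmarks:
stomion = [(101.0, 200.5), (100.5, 199.8), (99.5, 200.2), (100.0, 199.5)]
gonion = [(310.0, 402.0), (305.5, 398.0), (312.0, 407.0), (303.0, 400.0)]

print(landmark_dispersion(stomion) < landmark_dispersion(gonion))  # → True
```

Comparing such per-landmark dispersions across observers is one way to rank landmark reliability, in line with the study's inter- and intra-observer analysis.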

  3. A computer simulation study comparing lesion detection accuracy with digital mammography, breast tomosynthesis, and cone-beam CT breast imaging

    SciTech Connect

    Gong Xing; Glick, Stephen J.; Liu, Bob; Vedula, Aruna A.; Thacker, Samta

    2006-04-15

    Although conventional mammography is currently the best modality to detect early breast cancer, it is limited in that the recorded image represents the superposition of a three-dimensional (3D) object onto a 2D plane. Recently, two promising approaches for 3D volumetric breast imaging have been proposed, breast tomosynthesis (BT) and CT breast imaging (CTBI). To investigate possible improvements in lesion detection accuracy with either breast tomosynthesis or CT breast imaging as compared to digital mammography (DM), a computer simulation study was conducted using simulated lesions embedded into a structured 3D breast model. The computer simulation realistically modeled x-ray transport through a breast model, as well as the signal and noise propagation through a CsI based flat-panel imager. Polyenergetic x-ray spectra of Mo/Mo 28 kVp for digital mammography, Mo/Rh 28 kVp for BT, and W/Ce 50 kVp for CTBI were modeled. For the CTBI simulation, the intensity of the x-ray spectra for each projection view was determined so as to provide a total average glandular dose of 4 mGy, which is approximately equivalent to that given in conventional two-view screening mammography. The same total dose was modeled for both the DM and BT simulations. Irregular lesions were simulated by using a stochastic growth algorithm providing lesions with an effective diameter of 5 mm. Breast tissue was simulated by generating an ensemble of backgrounds with a power law spectrum, with the composition of 50% fibroglandular and 50% adipose tissue. To evaluate lesion detection accuracy, a receiver operating characteristic (ROC) study was performed with five observers reading an ensemble of images for each case. The average area under the ROC curves (Az) was 0.76 for DM, 0.93 for BT, and 0.94 for CTBI. Results indicated that for the same dose, a 5 mm lesion embedded in a structured breast phantom was detected by the two volumetric breast imaging systems, BT and CTBI, with statistically
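
The Az values above are areas under empirical ROC curves; Az equals the probability that a lesion-present image is rated higher than a lesion-absent one, so it can be computed directly from observer ratings via the Mann-Whitney statistic. A sketch with hypothetical 5-point ratings (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """Empirical ROC area (Az) via the Mann-Whitney statistic: the
    probability a positive case outscores a negative one, ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical 5-point confidence ratings from one observer:
present = [5, 4, 4, 3, 5]   # lesion-present images
absent = [2, 3, 1, 2, 4]    # lesion-absent images
print(auc(present, absent))  # → 0.9
```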

  4. Assessing accuracy of gas-driven permeability measurements: a comparative study of diverse Hassler-cell and probe permeameter devices

    NASA Astrophysics Data System (ADS)

    Filomena, C. M.; Hornung, J.; Stollhofen, H.

    2013-08-01

    Permeability is one of the most important petrophysical parameters to describe the reservoir potential of sedimentary rocks, contributing to problems in hydrology, geothermics, or hydrocarbon reservoir analysis. Outcrop analog studies, well core measurements, or individual sample analysis take advantage of a variety of commercially available devices for permeability measurements. Very often, permeability data derived from different devices need to be merged within one study, e.g. outcrop mini-permeametry and lab-based core plug measurements. To enhance accuracy of different gas-driven permeability measurements, device-specific aberrations need to be taken into account. The application of simple one-to-one correlations may draw the wrong picture of permeability trends. For this purpose, transform equations need to be established. This study presents a detailed comparison of permeability data derived from a selection of commonly used Hassler cells and probe permeameters. As a result of individual cross-plots, typical aberrations and transform equations are elaborated which enable corrections for the specific permeameters. Permeability measurements of the commercially available ErgoTech Gas Permeameter and the TinyPerm II probe-permeameter are well-comparable over the entire range of permeability, with R2 = 0.967. Major aberrations are identified between the TinyPerm II and the mini-permeameter/Hassler-cell combination at Darmstadt University, which need to be corrected and standardized within one study. However, transforms must be applied with care, as aberrations are frequently limited to certain permeability intervals. In the presented examples, deviations typically tend to occur in the lower permeability range < 10 mD. Applying standardizations which consider these aberration intervals strongly improves the comparability of permeability datasets and facilitates the combination of measurement principles. Therefore, the utilization of such correlation tests is highly recommended.

  5. An Observational Study to Evaluate the Usability and Intent to Adopt an Artificial Intelligence–Powered Medication Reconciliation Tool

    PubMed Central

    Yuan, Michael Juntao; Poonawala, Robina

    2016-01-01

    Background Medication reconciliation (the process of creating an accurate list of all medications a patient is taking) is a widely practiced procedure to reduce medication errors. It is mandated by the Joint Commission and reimbursed by Medicare. Yet, in practice, medication reconciliation is often not effective owing to knowledge gaps in the team. A promising approach to improve medication reconciliation is to incorporate artificial intelligence (AI) decision support tools into the process to engage patients and bridge the knowledge gap. Objective The aim of this study was to improve the accuracy and efficiency of medication reconciliation by engaging the patient, the nurse, and the physician as a team via an iPad tool. With assistance from the AI agent, the patient will review his or her own medication list from the electronic medical record (EMR) and annotate changes, before reviewing together with the physician and making decisions on the shared iPad screen. Methods In this study, we developed iPad-based software tools, with AI decision support, to engage patients in “self-service” medication reconciliation and then share the annotated reconciled list with the physician. To evaluate the software tool's user interface and workflow, a small number of patients (10) in a primary care clinic were recruited and observed through the whole process during a pilot study. The patients were then surveyed about the tool's usability. Results All patients were able to complete the medication reconciliation process correctly. Every patient found at least one error or other issue with their EMR medication list. All of them reported that the tool was easy to use, and 8 of 10 patients reported that they would use the tool in the future. However, few patients interacted with the learning modules in the tool. The physician and nurses reported the tool to be easy to use, easy to integrate into the existing workflow, and potentially time-saving. Conclusions We have

  6. Assessment of the haptic robot as a new tool for the study of the neural control of reaching.

    PubMed

    Rakusa, Martin; Hribar, Ales; Koritnik, Blaz; Munih, Marko; Battaglni, Piero Paolo; Belic, Ales; Zidar, Janez

    2013-10-01

    Current experimental methods for the study of reaching in the MRI environment do not exactly mimic actual reaching, due to constraints on movement imposed by the MRI machine itself. We tested a haptic robot (HR) as such a tool. Positive results would also be promising for the combined use of fMRI and EEG to study reaching. Twenty right-handed subjects performed reaching tasks with their right hand with and without the HR. Reaction time, movement time (MT), accuracy, event-related potentials (ERPs) and event-related desynchronisation/synchronisation (ERD/ERS) were studied. Reaction times and accuracies did not differ significantly between the two tasks, while the MT was significantly longer in HR reaching (959 vs. 447 ms). We identified two positive and two negative ERP peaks across all leads in both tasks. The latencies of the P1 and N2 peaks were significantly longer in HR reaching, while there were no significant differences in the P3 and N4 latencies. ERD/ERS topographies were similar between tasks and similar to those of other reaching studies. The main difference was in the ERS rebound, which was observed only in actual reaching; the probable reason was the significantly longer MT. We found that reaching with the HR engages neural structures similar to those engaged in actual reaching. Although there are some constraints, its use may be superior to other techniques used for reaching studies in the MRI environment, where freedom of movement is limited. PMID:23474640

  7. Genetic transformation: a tool to study protein targeting in diatoms.

    PubMed

    Kroth, Peter G

    2007-01-01

    Diatoms are unicellular photoautotrophic eukaryotes that play an important role in ecology by fixing large amounts of CO2 in the oceans. Because they evolved by secondary endocytobiosis, a process in which a eukaryotic alga is taken up by another eukaryotic cell, they have a rather unusual cell biology and genetic constitution. Because the preparation of organelles is rather difficult as a result of the cytosolic structures, genetic transformation and expression of preproteins fused to green fluorescent protein (GFP) have become one of the major tools for analyzing the subcellular localization of proteins in diatoms. Several groups have meanwhile successfully developed genetic transformation protocols for diatoms. These methods are based on "biolistic" DNA delivery via a particle gun and allow the introduction and expression of foreign genes in the algae. Here a protocol for the genetic transformation of the diatom Phaeodactylum tricornutum is described, as well as the subsequent characterization of the transformants. PMID:17951693

  8. Assessing accuracy of gas-driven permeability measurements: a comparative study of diverse Hassler-cell and probe permeameter devices

    NASA Astrophysics Data System (ADS)

    Filomena, C. M.; Hornung, J.; Stollhofen, H.

    2014-01-01

    Permeability is one of the most important petrophysical parameters to describe the reservoir properties of sedimentary rocks, pertaining to problems in hydrology, geothermics, and hydrocarbon reservoir analysis. Outcrop analogue studies, well core measurements, and individual sample analysis take advantage of a variety of commercially available devices for permeability measurements. Very often, permeability data derived from different devices need to be merged within one study (e.g. outcrop minipermeametry and lab-based core plug measurements). To enhance accuracy of different gas-driven permeability measurements, device-specific aberrations need to be taken into account. The application of simple one-to-one correlations may draw the wrong picture of permeability trends. For this purpose, transform equations need to be established. This study presents a detailed comparison of permeability data derived from a selection of commonly used Hassler cells and probe permeameters. As a result of individual cross-plots, typical aberrations and transform equations are elaborated, which enable corrections for the specific permeameters. Permeability measurements of the commercially available ErgoTech gas permeameter and the TinyPerm II probe permeameter are well-comparable over the entire range of permeability, with R2 = 0.955. Aberrations are mostly identified in the permeability range < 10 mD, regarding the TinyPerm II and the minipermeameter/Hassler-cell combination at Darmstadt University, which need to be corrected and standardized. Applying standardizations which consider these aberration intervals strongly improves the comparability of permeability data sets and facilitates the combination of measurement principles. Therefore, the utilization of such correlation tests is highly recommended for all kinds of reservoir studies using integrated permeability databases.
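
A device-to-device transform of the kind elaborated here can be fitted as a least-squares regression in log-permeability space, with R² quantifying comparability. An illustrative sketch (function names and readings are hypothetical, not the study's data):

```python
import math

def fit_transform(k_device_a, k_device_b):
    """Fit log10(k_b) = a * log10(k_a) + b by least squares and report R^2:
    a simple transform between two permeameters' readings (mD)."""
    xs = [math.log10(k) for k in k_device_a]
    ys = [math.log10(k) for k in k_device_b]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

def apply_transform(k_device_a, a, b):
    """Convert a reading from device A to device B's scale."""
    return 10.0 ** (a * math.log10(k_device_a) + b)

# Hypothetical paired plug measurements (mD) on the same samples:
probe = [0.5, 2.0, 15.0, 80.0, 400.0]
hassler = [0.7, 2.3, 12.0, 95.0, 350.0]
a, b, r2 = fit_transform(probe, hassler)
```

Aberrations limited to certain intervals (e.g. < 10 mD, as noted above) would call for piecewise fits rather than a single global transform.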

  9. Chimpanzees create and modify probe tools functionally: A study with zoo-housed chimpanzees.

    PubMed

    Hopper, Lydia M; Tennie, Claudio; Ross, Stephen R; Lonsdorf, Elizabeth V

    2015-02-01

    Chimpanzees (Pan troglodytes) use tools to probe for out-of-reach food, both in the wild and in captivity. Beyond gathering appropriately-sized materials to create tools, chimpanzees also perform secondary modifications in order to create an optimized tool. In this study, we recorded the behavior of a group of zoo-housed chimpanzees when presented with opportunities to use tools to probe for liquid foods in an artificial termite mound within their enclosure. Previous research with this group of chimpanzees has shown that they are proficient at gathering materials from within their environment in order to create tools to probe for the liquid food within the artificial mound. Extending beyond this basic question, we first asked whether they only made and modified probe tools when it was appropriate to do so (i.e. when the mound was baited with food). Second, by collecting continuous data on their behavior, we also asked whether the chimpanzees first (intentionally) modified their tools prior to probing for food or whether such modifications occurred after tool use, possibly as a by-product of chewing and eating the food from the tools. Following our predictions, we found that tool modification predicted tool use; the chimpanzees began using their tools within a short delay of creating and modifying them, and the chimpanzees performed more tool modifying behaviors when food was available than when they could not gain food through the use of probe tools. We also discuss our results in terms of the chimpanzees' acquisition of the skills, and their flexibility of tool use and learning.

  10. Improved Accuracy of Continuous Glucose Monitoring Systems in Pediatric Patients with Diabetes Mellitus: Results from Two Studies

    PubMed Central

    2016-01-01

    Abstract Objective: This study was designed to evaluate accuracy, performance, and safety of the Dexcom (San Diego, CA) G4® Platinum continuous glucose monitoring (CGM) system (G4P) compared with the Dexcom G4 Platinum with Software 505 algorithm (SW505) when used as adjunctive management to blood glucose (BG) monitoring over a 7-day period in youth, 2–17 years of age, with diabetes. Research Design and Methods: Youth wore either one or two sensors placed on the abdomen or upper buttocks for 7 days, calibrating the device twice daily with a uniform BG meter. Participants had one in-clinic session on Day 1, 4, or 7, during which fingerstick BG measurements (self-monitoring of blood glucose [SMBG]) were obtained every 30 ± 5 min for comparison with CGM, and in youth 6–17 years of age, reference YSI glucose measurements were obtained from arterialized venous blood collected every 15 ± 5 min for comparison with CGM. The sensor was removed by the participant/family after 7 days. Results: In comparison of 2,922 temporally paired points of CGM with the reference YSI measurement for G4P and 2,262 paired points for SW505, the mean absolute relative difference (MARD) was 17% for G4P versus 10% for SW505 (P < 0.0001). In comparison of 16,318 temporally paired points of CGM with SMBG for G4P and 4,264 paired points for SW505, MARD was 15% for G4P versus 13% for SW505 (P < 0.0001). Similarly, error grid analyses indicated superior performance with SW505 compared with G4P in comparison of CGM with YSI and CGM with SMBG results, with greater percentages of SW505 results falling within error grid Zone A or the combined Zones A plus B. There were no serious adverse events or device-related serious adverse events for either the G4P or the SW505, and there was no sensor breakoff. Conclusions: The updated algorithm offers substantial improvements in accuracy and performance in pediatric patients with diabetes. Use of CGM with improved performance has
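
The MARD figures above are computed from temporally paired CGM and reference values; a minimal sketch with hypothetical readings:

```python
def mard(cgm_values, ref_values):
    """Mean absolute relative difference (%) between paired CGM readings
    and reference (YSI or SMBG) glucose values."""
    pairs = list(zip(cgm_values, ref_values))
    return 100.0 * sum(abs(c - r) / r for c, r in pairs) / len(pairs)

# Hypothetical paired glucose readings (mg/dL):
cgm = [110.0, 95.0, 150.0, 210.0]
ysi = [100.0, 100.0, 160.0, 200.0]
print(round(mard(cgm, ysi), 1))  # → 6.6
```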

  11. Cost-Saving Early Diagnosis of Functional Pain in Nonmalignant Pain: A Noninferiority Study of Diagnostic Accuracy

    PubMed Central

    Cámara, Rafael J. A.; Merz, Christian; von Känel, Roland; Egloff, Niklaus

    2016-01-01

    Objectives. We compared two index screening tests for the early diagnosis of functional pain: pressure pain measurement by electronic diagnostic equipment, which is accurate but too specialized for primary health care, versus peg testing, which is cost-saving and more easily manageable but of unknown sensitivity and specificity. Early distinction of functional (altered pain perception; nervous sensitization) from neuropathic or nociceptive pain improves pain management. Methods. Clinicians blinded to the index screening tests assessed the reference standard of this noninferiority diagnostic accuracy study, namely, comprehensive medical history taking with all previous findings and treatment outcomes. All consenting patients referred to a university hospital for nonmalignant musculoskeletal pain participated. The main analysis compared the receiver operating characteristic (ROC) curves of both index screening tests. Results. The area under the ROC curve for peg testing was not inferior to that of the electronic equipment: it was at least 95% as large for finger measures (two-sided p = 0.038) and at least equally as large for ear measures (two-sided p = 0.003). Conclusions. Routine diagnostic testing by peg, which is accessible to general practitioners, is at least as accurate as specialized equipment. This may shorten time-to-treatment in general practices, thereby improving prognosis and quality of life. PMID:27088013

  12. Accuracy of cut-off value by measurement of third molar index: Study of a Colombian sample.

    PubMed

    De Luca, Stefano; Aguilar, Lina; Rivera, Marcela; Palacio, Luz Andrea Velandia; Riccomi, Giulia; Bestetti, Fiorella; Cameriere, Roberto

    2016-04-01

    The aim of this cross-sectional study was to test the accuracy of the 0.08 cut-off value of the third molar index (I3M) in assessing the legal adult age of 18 years in a sample of Colombian children and young adults. Digital orthopantomographs of 288 Colombian children and young adults (163 girls and 125 boys), aged between 13 and 22 years, were analysed. The concordance correlation coefficient (ρc) and κ statistics (Cohen's kappa coefficient) showed that repeatability and reproducibility are high for both intra- and inter-observer error. κ statistics for intra- and inter-observer agreement in the decision on adult or minor were 0.913 and 0.877, respectively. Age distribution gradually decreases as I3M increases in both girls and boys. For girls, the sensitivity was 95.1% (95% CI 87.1%-95%) and specificity was 93.8% (95% CI 87.1%-98.8%). The proportion of correctly classified individuals was 95.1%. For boys, the sensitivity was 91.7% (95% CI 85.1%-96.8%) and specificity was 90.6% (95% CI 82.1%-97.8%). The proportion of correctly classified individuals was 89.7%. The cut-off value of 0.08 is highly useful for determining whether a subject is 18 years of age or older. PMID:26898677
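
Sensitivity, specificity, and the proportion correctly classified all derive from one 2x2 classification table; a minimal sketch with hypothetical counts (the paper's raw table is not reproduced here):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity, specificity, and overall accuracy from a 2x2 table;
    here a 'positive' test is I3M < 0.08, i.e. classified as adult."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts chosen only for illustration:
sens, spec, acc = sens_spec(tp=78, fn=4, tn=75, fp=5)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} accuracy={acc:.1%}")
```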

  14. Evaluation of accuracy of non-linear finite element computations for surgical simulation: study using brain phantom.

    PubMed

    Ma, J; Wittek, A; Singh, S; Joldes, G; Washio, T; Chinzei, K; Miller, K

    2010-12-01

    In this paper, the accuracy of non-linear finite element computations in application to surgical simulation was evaluated by comparing the experiment and modelling of indentation of the human brain phantom. The evaluation was realised by comparing forces acting on the indenter and the deformation of the brain phantom. The deformation of the brain phantom was measured by tracking 3D motions of X-ray opaque markers, placed within the brain phantom using a custom-built bi-plane X-ray image intensifier system. The model was implemented using the ABAQUS(TM) finite element solver. Realistic geometry obtained from magnetic resonance images and specific constitutive properties determined through compression tests were used in the model. The model accurately predicted the indentation force-displacement relations and marker displacements. Good agreement between modelling and experimental results verifies the reliability of the finite element modelling techniques used in this study and confirms the predictive power of these techniques in surgical simulation. PMID:21153973

  15. Improving the Accuracy of Whole Genome Prediction for Complex Traits Using the Results of Genome Wide Association Studies

    PubMed Central

    Zhang, Zhe; Ober, Ulrike; Erbe, Malena; Zhang, Hao; Gao, Ning; He, Jinlong; Li, Jiaqi; Simianer, Henner

    2014-01-01

    Utilizing the whole genomic variation of complex traits to predict the yet-to-be observed phenotypes or unobserved genetic values via whole genome prediction (WGP) and to infer the underlying genetic architecture via genome wide association study (GWAS) is an interesting and fast developing area in the context of human disease studies as well as in animal and plant breeding. Though thousands of significant loci for several species were detected via GWAS in the past decade, they were not used directly to improve WGP due to lack of proper models. Here, we propose a generalized way of building trait-specific genomic relationship matrices which can exploit GWAS results in WGP via a best linear unbiased prediction (BLUP) model for which we suggest the name BLUP|GA. Results from two illustrative examples show that using already existing GWAS results from public databases in BLUP|GA improved the accuracy of WGP for two out of the three model traits in a dairy cattle data set, and for nine out of the 11 traits in a rice diversity data set, compared to the reference methods GBLUP and BayesB. While BLUP|GA outperforms BayesB, its required computing time is comparable to GBLUP. Further simulation results suggest that accounting for publicly available GWAS results is potentially more useful for WGP utilizing smaller data sets and/or traits of low heritability, depending on the genetic architecture of the trait under consideration. To our knowledge, this is the first study incorporating public GWAS results formally into the standard GBLUP model and we think that the BLUP|GA approach deserves further investigations in animal breeding, plant breeding as well as human genetics. PMID:24663104
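
The heart of BLUP|GA, as described above, is a trait-specific genomic relationship matrix in which GWAS-flagged markers receive larger weights. A hedged sketch of one such construction (a VanRaden-type weighted G; the exact weighting used in the paper may differ):

```python
import numpy as np

def weighted_grm(M, weights):
    """Trait-specific genomic relationship matrix: markers flagged by GWAS
    get larger weights; equal weights recover a standard VanRaden-type G.
    M: (individuals x markers) genotype matrix coded 0/1/2."""
    m = M.shape[1]
    p = M.mean(axis=0) / 2.0               # observed allele frequencies
    Z = M - 2.0 * p                        # centred genotypes
    D = np.diag(weights / weights.sum())   # normalised marker weights
    denom = 2.0 * np.sum(p * (1.0 - p))
    return (Z @ D @ Z.T) * (m / denom)

# Toy data: 5 individuals, 20 markers; the first three markers are treated
# as hypothetical GWAS hits and up-weighted tenfold.
rng = np.random.default_rng(0)
M = rng.integers(0, 3, size=(5, 20)).astype(float)
w = np.ones(20)
w[:3] = 10.0
G_plain = weighted_grm(M, np.ones(20))   # reduces to the standard GBLUP G
G_trait = weighted_grm(M, w)             # trait-specific G for BLUP|GA
```

Either matrix can then be plugged into the usual mixed-model (GBLUP) equations; only the relationship matrix changes.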

  17. SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams

    SciTech Connect

    Cline, K; Stanley, D; Defoor, D; Stathakis, S; Gutierrez, A; Papanikolaou, N; Kirby, N

    2015-06-15

    Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that this tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on user’s input. Once the framework is built, the user can input sets of multiple-choice questions, answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices and immediate feedback is provided. Once the user is finished studying, the tool records the day’s progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors for the tool’s use. There are software packages to create similar question banks, but this study tool has no additional cost for those that already have Excel. The study tool will be made openly available.

  18. Dolphin echolocation strategies studied with the Biosonar Measurement Tool

    NASA Astrophysics Data System (ADS)

    Houser, Dorian S.; Martin, Steve W.; Phillips, Michael; Bauer, Eric; Moore, Patrick W.

    2003-10-01

    Two free-swimming dolphins (Tt722 and Tt673) were trained to carry the Biosonar Measurement Tool (BMT) during open water, proud target searches in order to explore echolocation behavior without the constraints of traditional experimental designs. The BMT recorded the angular motion, depth, and velocity of the dolphin as well as echolocation clicks and echoes returning from insonified targets. On target-present trials, mean search time was 24.6 ± 7.3 s for Tt722 and 6.5 ± 3.0 s for Tt673, the former strategy resulting in the lower false alarm rate. The majority of clicks exceeded 195 dB re: 1 μPa throughout all trials for both animals but each demonstrated preferences for particular frequency bands of echolocation. Considering all trials, only 3.6% of all clicks produced by Tt722 contained peak frequencies greater than 60 kHz whereas Tt673 produced clicks with peak frequencies above 60 kHz 20.4% of the time. Distinctive frequency bands in the distribution of clicks were notable: bands for Tt673 occurred at 38, 54, and 69 kHz with less defined higher order bands; bands for Tt722 occurred at 25, 35, and 40 kHz. Distinctive frequency bands suggest a preferential use or mechanical constraint on harmonically related click frequencies.

  19. A comparative study between evaluation methods for quality control procedures for determining the accuracy of PET/CT registration

    NASA Astrophysics Data System (ADS)

    Cha, Min Kyoung; Ko, Hyun Soo; Jung, Woo Young; Ryu, Jae Kwang; Choe, Bo-Young

    2015-08-01

    The accuracy of registration between positron emission tomography (PET) and computed tomography (CT) images is one of the important factors for reliable diagnosis in PET/CT examinations. Although quality control (QC) for checking the alignment of PET and CT images should be performed periodically, the procedures have not been fully established. The aim of this study is to determine optimal QC procedures that can be performed at the user level to ensure the accuracy of PET/CT registration. Two phantoms were used to carry out this study: the American College of Radiology (ACR)-approved PET phantom and the National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) body phantom, containing fillable spheres. All PET/CT images were acquired on a Biograph TruePoint 40 PET/CT scanner using routine protocols. To measure registration error, the spatial coordinates of the estimated centers of the target slice (spheres) were calculated independently for the PET and the CT images in two ways. We compared the images from the ACR-approved PET phantom to those from the NEMA IEC body phantom. Also, we measured the total time required from phantom preparation to image analysis. The first analysis method showed a total difference of 0.636 ± 0.11 mm for the largest hot sphere and 0.198 ± 0.09 mm for the largest cold sphere in the case of the ACR-approved PET phantom. In the NEMA IEC body phantom, the total difference was 3.720 ± 0.97 mm for the largest hot sphere and 4.800 ± 0.85 mm for the largest cold sphere. The second analysis method showed that the differences in the x location at the line profile of the lesion on PET and CT were (1.33, 1.33) mm for a bone lesion, (-1.26, -1.33) mm for an air lesion and (-1.67, -1.60) mm for a hot sphere lesion for the ACR-approved PET phantom. For the NEMA IEC body phantom, the differences in the x location at the line profile of the lesion on PET and CT were (-1.33, 4.00) mm for the air
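
    The registration error described here reduces to the distance between a sphere's center as estimated independently on the PET and CT volumes. A minimal sketch of that computation, assuming a center-of-mass estimate from voxel coordinates (the helper names and coordinates below are illustrative, not taken from the study):

```python
from math import sqrt

def center_of_mass(points):
    """Estimate a sphere's center as the mean of its voxel coordinates (mm)."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def registration_error(center_pet, center_ct):
    """Euclidean distance (mm) between the PET and CT center estimates."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(center_pet, center_ct)))

# Hypothetical center estimates (mm) for one sphere
pet = (100.3, 85.1, 42.0)
ct = (100.0, 85.0, 41.5)
err = registration_error(pet, ct)
```

    Repeating this per sphere and per axis gives the kind of per-direction differences the abstract reports.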

  20. The ADENOMA Study. Accuracy of Detection using Endocuff Vision™ Optimization of Mucosal Abnormalities: study protocol for randomized controlled trial

    PubMed Central

    Bevan, Roisin; Ngu, Wee Sing; Saunders, Brian P.; Tsiamoulos, Zacharias; Bassett, Paul; Hoare, Zoe; Rees, Colin J.

    2016-01-01

    Background: Colonoscopy is the gold standard investigation for the diagnosis of bowel pathology and colorectal cancer screening. Adenoma detection rate is a marker of high quality colonoscopy and a high adenoma detection rate is associated with a lower incidence of interval cancers. Several technological advancements have been explored to improve adenoma detection rate. A new device called Endocuff Vision™ has been shown to improve adenoma detection rate in pilot studies. Methods/Design: This is a prospective, multicenter, randomized controlled trial comparing the adenoma detection rate in patients undergoing Endocuff Vision™-assisted colonoscopy with standard colonoscopy. All patients above 18 years of age referred for screening, surveillance, or diagnostic colonoscopy who are able to consent are invited to the study. Patients with absolute contraindications to colonoscopy, large bowel obstruction or pseudo-obstruction, colon cancer or polyposis syndromes, colonic strictures, severe diverticular segments, active colitis, anticoagulant therapy, or pregnancy are excluded. Patients are randomized according to site, age, sex, and bowel cancer screening status to receive Endocuff Vision™-assisted colonoscopy or standard colonoscopy on the day of procedure. Baseline data, colonoscopy, and polyp data including histology are collected. Nurse assessment of patient comfort and patient comfort questionnaires are completed post procedure. Patients are followed up at 21 days and complete a patient experience questionnaire. This study will take place across seven NHS Hospital Trusts: one in London and six within the Northern Region Endoscopy Group. A maximum of 10 colonoscopists per site will recruit a total of 1772 patients, with a maximum of four bowel screening colonoscopists permitted per site. Discussion: This is the first trial to evaluate the adenoma detection rate of Endocuff Vision™ in all screening, surveillance, and diagnostic patient groups. This timely

  1. The efficacy of screening for common dental diseases by hygiene-therapists: a diagnostic test accuracy study.

    PubMed

    Macey, R; Glenny, A; Walsh, T; Tickle, M; Worthington, H; Ashley, J; Brocklehurst, P

    2015-03-01

    Regularly attending adult patients are increasingly asymptomatic and not in need of treatment when attending for their routine dental examinations. As oral health improves further, using the general dental practitioner to undertake the "checkup" on regular "low-risk" patients represents a substantial and potentially unnecessary cost for state-funded systems. Given recent regulatory changes in the United Kingdom, it is now theoretically possible to delegate a range of tasks to hygiene-therapists. This has the potential to release the general dental practitioner's time and increase the capacity to care. The aim of this study is to compare the diagnostic test accuracy of hygiene-therapists when screening for dental caries and periodontal disease in regularly attending asymptomatic adults who attend for their checkup. A visual screen by hygiene-therapists acted as the index test, and the general dental practitioner acted as the reference standard. Consenting asymptomatic adult patients, who were regularly attending patients at 10 practices across the Northwest of England, entered the study. Both sets of clinicians made an assessment of dental caries and periodontal disease. The primary outcomes measured were the sensitivity and specificity values for dental caries and periodontal disease. In total, 1899 patients were screened. The summary point for sensitivity of dental care professionals when screening for caries and periodontal disease was 0.81 (95% CI, 0.74 to 0.87) and 0.89 (0.86 to 0.92), respectively. The summary point for specificity of dental care professionals when screening for caries and periodontal disease was 0.87 (0.78 to 0.92) and 0.75 (0.66 to 0.82), respectively. The results suggest that hygiene-therapists could be used to screen for dental caries and periodontal disease. This has important ramifications for service design in public-funded health systems.
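
    Sensitivity and specificity follow directly from the 2×2 table of index test (hygiene-therapist) versus reference standard (general dental practitioner), and Wilson score intervals are one common way to attach 95% CIs. A sketch with hypothetical counts chosen only to mirror the reported point estimates:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 screening table."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts (not the study's raw data)
sens, spec = sens_spec(tp=81, fn=19, tn=87, fp=13)
lo, hi = wilson_ci(81, 100)
```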

  2. The effect of mandibular buccal tilting on the accuracy of posterior mandibular spiral tomographic images: An in vitro study

    PubMed Central

    Sheikhi, Mahnaz; Maleki, Vida

    2011-01-01

    Background: Accurate measurement of the height and buccolingual thickness of available bone has a significant role in dental implantology. The shadow of the ramus on the mandibular second molar region disturbs the sharpness of conventional tomographic images. The aim of this study was to evaluate the effect of transferring the shadow of the ramus from the center of the focal plane, by changing the position of the mandible, on the sharpness of the posterior mandibular region. Materials and Methods: In this experimental study, we used 10 dry human mandibles. Three metal balls were mounted on the midline and mandibular second molar regions bilaterally. Standard panoramic and tomographic images were taken. Then, the mandible was tilted buccally by 8° – compensating for the normal lingual inclination of the mandibular ridge and teeth in this region – and tomographic images were taken again. The height and thickness of bone were measured on the images and then compared with the real values measured directly on the mandibles. Also, the sharpness of the mandibular canals was compared between the two tomographic methods. Findings were analyzed with a repeated measures ANOVA (P<0.05). Results: Bone height measured on the tilted tomography images was more accurate than on standard tomography (P<0.001), but standard tomography was more accurate in estimating buccolingual thickness at the half-height point. Regarding the sharpness of the mandibular canals, we found no significant differences between the two tomographic methods. Conclusion: Buccal tilting is recommended when measuring bone height is the priority, but routine standard tomography is preferred when thickness is requested. PMID:23372586

  3. In-vitro study on the accuracy of a simple-design CT-guided stent for dental implants

    PubMed Central

    Huh, Young-June; Choi, Bo-Ram; Huh, Kyung-Hoe; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun

    2012-01-01

    Purpose An individual surgical stent fabricated from computed tomography (CT) data, called a CT-guided stent, would be useful for accurate installation of implants. The purpose of the present study was to introduce a newly developed CT-guided stent with a simple design and evaluate the accuracy of the stent placement. Materials and Methods A resin template was fabricated from a hog mandible and a specially designed plastic plate, with 4 metal balls inserted in it for radiographic recognition, was attached to the occlusal surface of the template. With the surgical stent applied, CT images were taken, and virtual implants were placed using software. The spatial positions of the virtually positioned implants were acquired and implant guiding holes were drilled into the surgical stent using a specially designed 5-axis drilling machine. The surgical stent was placed on the mandible and CT images were taken again. The discrepancy between the central axis of the drilled holes on the second CT images and the virtually installed implants on the first CT images was evaluated. Results The deviation of the entry point and angulation of the central axis in the reference plane were 0.47±0.27 mm, 0.57±0.23 mm, and 0.64±0.16°, 0.57±0.15°, respectively. However, for the two different angulations in each group, the 20° angulation showed a greater error in the deviation of the entry point than did the 10° angulation. Conclusion The CT-guided template proposed in this study was highly accurate. It could replace existing implant guide systems to reduce costs and effort. PMID:23071963
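
    The reported deviations combine an entry-point distance and an angular deviation between the virtually planned and drilled implant axes. The angular part can be computed as the angle between two 3D direction vectors; a sketch with hypothetical axes (the study does not publish its exact computation):

```python
from math import acos, degrees, sqrt

def angular_deviation(axis_a, axis_b):
    """Angle in degrees between two 3D direction vectors (implant axes)."""
    dot = sum(a * b for a, b in zip(axis_a, axis_b))
    na = sqrt(sum(a * a for a in axis_a))
    nb = sqrt(sum(b * b for b in axis_b))
    # Clamp to [-1, 1] to guard against floating-point overshoot.
    cosang = max(-1.0, min(1.0, dot / (na * nb)))
    return degrees(acos(cosang))

planned = (0.0, 0.17, 0.98)   # hypothetical virtually planned axis
drilled = (0.01, 0.18, 0.98)  # hypothetical drilled-hole axis
dev = angular_deviation(planned, drilled)
```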

  4. Use of Molecular Diagnostic Tools for the Identification of Species Responsible for Snakebite in Nepal: A Pilot Study

    PubMed Central

    Sharma, Sanjib Kumar; Kuch, Ulrich; Höde, Patrick; Bruhse, Laura; Pandey, Deb P.; Ghimire, Anup; Chappuis, François; Alirol, Emilie

    2016-01-01

    Snakebite is an important medical emergency in rural Nepal. Correct identification of the biting species is crucial for clinicians to choose appropriate treatment and anticipate complications. This is particularly important for neurotoxic envenoming which, depending on the snake species involved, may not respond to available antivenoms. Adequate species identification tools are lacking. This study used a combination of morphological and molecular approaches (PCR-aided DNA sequencing from swabs of bite sites) to determine the contribution of venomous and non-venomous species to the snakebite burden in southern Nepal. Out of 749 patients admitted with a history of snakebite to one of three study centres, the biting species could be identified in 194 (25.9%). Out of these, 87 had been bitten by a venomous snake, most commonly the Indian spectacled cobra (Naja naja; n = 42) and the common krait (Bungarus caeruleus; n = 22). When both morphological identification and PCR/sequencing results were available, a 100% agreement was noted. The probability of a positive PCR result was significantly lower among patients who had used inadequate “first aid” measures (e.g. tourniquets or local application of remedies). This study is the first to report the use of forensic genetics methods for snake species identification in a prospective clinical study. If high diagnostic accuracy is confirmed in larger cohorts, this method will be a very useful reference diagnostic tool for epidemiological investigations and clinical studies. PMID:27105074

  5. Use of Molecular Diagnostic Tools for the Identification of Species Responsible for Snakebite in Nepal: A Pilot Study.

    PubMed

    Sharma, Sanjib Kumar; Kuch, Ulrich; Höde, Patrick; Bruhse, Laura; Pandey, Deb P; Ghimire, Anup; Chappuis, François; Alirol, Emilie

    2016-04-01

    Snakebite is an important medical emergency in rural Nepal. Correct identification of the biting species is crucial for clinicians to choose appropriate treatment and anticipate complications. This is particularly important for neurotoxic envenoming which, depending on the snake species involved, may not respond to available antivenoms. Adequate species identification tools are lacking. This study used a combination of morphological and molecular approaches (PCR-aided DNA sequencing from swabs of bite sites) to determine the contribution of venomous and non-venomous species to the snakebite burden in southern Nepal. Out of 749 patients admitted with a history of snakebite to one of three study centres, the biting species could be identified in 194 (25.9%). Out of these, 87 had been bitten by a venomous snake, most commonly the Indian spectacled cobra (Naja naja; n = 42) and the common krait (Bungarus caeruleus; n = 22). When both morphological identification and PCR/sequencing results were available, a 100% agreement was noted. The probability of a positive PCR result was significantly lower among patients who had used inadequate "first aid" measures (e.g. tourniquets or local application of remedies). This study is the first to report the use of forensic genetics methods for snake species identification in a prospective clinical study. If high diagnostic accuracy is confirmed in larger cohorts, this method will be a very useful reference diagnostic tool for epidemiological investigations and clinical studies. PMID:27105074

  6. A Comparative Study on Diagnostic Accuracy of Colour Coded Digital Images, Direct Digital Images and Conventional Radiographs for Periapical Lesions – An In Vitro Study

    PubMed Central

    Mubeen; K.R., Vijayalakshmi; Bhuyan, Sanat Kumar; Panigrahi, Rajat G; Priyadarshini, Smita R; Misra, Satyaranjan; Singh, Chandravir

    2014-01-01

    Objectives: The identification and radiographic interpretation of periapical bone lesions is important for accurate diagnosis and treatment. The present study was undertaken to assess the feasibility and diagnostic accuracy of colour coded digital radiographs in terms of the presence and size of lesions and to compare the diagnostic accuracy of colour coded digital images with direct digital images and conventional radiographs for assessing periapical lesions. Materials and Methods: Sixty human dry cadaver hemimandibles were obtained and periapical lesions were created in first and second premolar teeth at the junction of cancellous and cortical bone using a micromotor handpiece and carbide burs of sizes 2, 4 and 6. After each successive use of round burs, a conventional, RVG and colour coded image was taken for each specimen. All the images were evaluated by three observers. The diagnostic accuracy for each bur and image mode was calculated statistically. Results: Our results showed good interobserver (kappa > 0.61) agreement for the different radiographic techniques and for the different bur sizes. Conventional radiography outperformed digital radiography in diagnosing periapical lesions made with the size two bur. Both were equally diagnostic for lesions made with larger bur sizes. The colour coding method was the least accurate of the techniques. Conclusion: Conventional radiography traditionally forms the backbone of the diagnosis, treatment planning and follow-up of periapical lesions. Direct digital imaging is an efficient technique in the diagnostic sense. Colour coding of digital radiography was feasible but less accurate; however, this imaging technique, like any other, needs to be studied continuously with the emphasis on safety of patients and diagnostic quality of images. PMID:25584318
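
    The interobserver agreement figure (kappa > 0.61) refers to Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A self-contained sketch with hypothetical lesion calls from two observers (not the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of items where the raters agree.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal rates, summed.
    expected = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical lesion calls (1 = lesion present, 0 = absent)
obs1 = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
obs2 = [1, 1, 0, 1, 0, 0, 0, 0, 1, 1]
kappa = cohens_kappa(obs1, obs2)
```

    Values above roughly 0.61 are conventionally read as "substantial" agreement, which is the benchmark the abstract invokes.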

  7. [True color accuracy in digital forensic photography].

    PubMed

    Ramsthaler, Frank; Birngruber, Christoph G; Kröll, Ann-Katrin; Kettner, Mattias; Verhoff, Marcel A

    2016-01-01

    Forensic photographs must not only be unaltered and authentic, capture context-relevant images, and meet certain minimum requirements for image sharpness and information density; color accuracy also plays an important role, for instance, in the assessment of injuries or taphonomic stages, or in the identification and evaluation of traces from photos. The perception of color not only varies subjectively from person to person, but as a discrete property of an image, color in digital photos is also to a considerable extent influenced by technical factors such as lighting, acquisition settings, camera, and output medium (print, monitor). For these reasons, consistent color accuracy has so far been limited in digital photography. Because images usually contain a wealth of color information, especially for complex or composite colors or shades of color, and the wavelength-dependent sensitivity to factors such as light and shadow may vary between cameras, the usefulness of issuing general recommendations for camera capture settings is limited. Our results indicate that true image colors can best and most realistically be captured with the SpyderCheckr technical calibration tool for digital cameras tested in this study. Apart from aspects such as the simplicity and quickness of the calibration procedure, a further advantage of the tool is that the results are independent of the camera used and can also be used for the color management of output devices such as monitors and printers. The SpyderCheckr color-code patches allow true colors to be captured more realistically than with a manual white balance tool or an automatic flash. We therefore recommend that the use of a color management tool should be considered for the acquisition of all images that demand high true color accuracy (in particular in the setting of injury documentation). PMID:27386623

  8. [True color accuracy in digital forensic photography].

    PubMed

    Ramsthaler, Frank; Birngruber, Christoph G; Kröll, Ann-Katrin; Kettner, Mattias; Verhoff, Marcel A

    2016-01-01

    Forensic photographs must not only be unaltered and authentic, capture context-relevant images, and meet certain minimum requirements for image sharpness and information density; color accuracy also plays an important role, for instance, in the assessment of injuries or taphonomic stages, or in the identification and evaluation of traces from photos. The perception of color not only varies subjectively from person to person, but as a discrete property of an image, color in digital photos is also to a considerable extent influenced by technical factors such as lighting, acquisition settings, camera, and output medium (print, monitor). For these reasons, consistent color accuracy has so far been limited in digital photography. Because images usually contain a wealth of color information, especially for complex or composite colors or shades of color, and the wavelength-dependent sensitivity to factors such as light and shadow may vary between cameras, the usefulness of issuing general recommendations for camera capture settings is limited. Our results indicate that true image colors can best and most realistically be captured with the SpyderCheckr technical calibration tool for digital cameras tested in this study. Apart from aspects such as the simplicity and quickness of the calibration procedure, a further advantage of the tool is that the results are independent of the camera used and can also be used for the color management of output devices such as monitors and printers. The SpyderCheckr color-code patches allow true colors to be captured more realistically than with a manual white balance tool or an automatic flash. We therefore recommend that the use of a color management tool should be considered for the acquisition of all images that demand high true color accuracy (in particular in the setting of injury documentation).

  9. Accuracy in optical overlay metrology

    NASA Astrophysics Data System (ADS)

    Bringoltz, Barak; Marciano, Tal; Yaziv, Tal; DeLeeuw, Yaron; Klein, Dana; Feler, Yoel; Adam, Ido; Gurevich, Evgeni; Sella, Noga; Lindenfeld, Ze'ev; Leviant, Tom; Saltoun, Lilach; Ashwal, Eltsafon; Alumot, Dror; Lamhot, Yuval; Gao, Xindong; Manka, James; Chen, Bryan; Wagner, Mark

    2016-03-01

    In this paper we discuss the mechanism by which process variations determine the overlay accuracy of optical metrology. We start by focusing on scatterometry, and showing that the underlying physics of this mechanism involves interference effects between cavity modes that travel between the upper and lower gratings in the scatterometry target. A direct result is the behavior of accuracy as a function of wavelength, and the existence of relatively well-defined spectral regimes in which the overlay accuracy and process robustness degrade ("resonant regimes"). These resonances are separated by wavelength regions in which the overlay accuracy is better and independent of wavelength (we term these "flat regions"). The combination of flat and resonant regions forms a spectral signature which is unique to each overlay alignment and carries certain universal features with respect to different types of process variations. We term this signature the "landscape", and discuss its universality. Next, we show how to characterize overlay performance with a finite set of metrics that are available on the fly, and that are derived from the angular behavior of the signal and the way it flags resonances. These metrics are used to guarantee the selection of accurate recipes and targets for the metrology tool, and for process control with the overlay tool. We end with comments on the similarity of imaging overlay to scatterometry overlay, and on the way that pupil overlay scatterometry and field overlay scatterometry differ from an accuracy perspective.

  10. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
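
    The core idea, linking each component to a discrete-time Markov chain and deriving whole-tool reliability, can be sketched as an absorbing-chain computation: the probability that control flow reaches a "success" state rather than a "failure" state. The chain below is hypothetical, not taken from the case study:

```python
def markov_reliability(transitions, start, success, failure, steps=10_000):
    """Probability that a discrete-time Markov chain started at `start`
    is eventually absorbed in `success` rather than `failure`.
    `transitions[state]` maps next-state -> transition probability."""
    dist = {start: 1.0}
    absorbed = {success: 0.0, failure: 0.0}
    for _ in range(steps):
        nxt = {}
        for state, mass in dist.items():
            for target, p in transitions[state].items():
                if target in absorbed:
                    absorbed[target] += mass * p
                else:
                    nxt[target] = nxt.get(target, 0.0) + mass * p
        dist = nxt
        if not dist:  # all probability mass has been absorbed
            break
    return absorbed[success]

# Hypothetical three-component tool: each module either hands control
# to the next one or fails; the last hands control to the success state.
chain = {
    "acquire": {"parse": 0.99, "fail": 0.01},
    "parse":   {"report": 0.98, "fail": 0.02},
    "report":  {"ok": 0.995, "fail": 0.005},
}
reliability = markov_reliability(chain, "acquire", "ok", "fail")
```

    For this purely sequential chain the result reduces to the product of the per-component success probabilities; the Markov formulation also handles branches and retry loops, which simple products do not.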

  11. Diagnostic Accuracy Study of Intraoperative and Perioperative Serum Intact PTH Level for Successful Parathyroidectomy in 501 Secondary Hyperparathyroidism Patients

    PubMed Central

    Zhang, Lina; Xing, Changying; Shen, Chong; Zeng, Ming; Yang, Guang; Mao, Huijuan; Zhang, Bo; Yu, Xiangbao; Cui, Yiyao; Sun, Bin; Ouyang, Chun; Ge, Yifei; Jiang, Yao; Yin, Caixia; Zha, Xiaoming; Wang, Ningning

    2016-01-01

    Parathyroidectomy (PTX) is an effective treatment for severe secondary hyperparathyroidism (SHPT); however, persistent SHPT may occur because of supernumerary and ectopic parathyroids. Here a diagnostic accuracy study of intraoperative and perioperative serum intact parathyroid hormone (iPTH) was performed to predict successful surgery in 501 patients who received total PTX + autotransplantation without thymectomy. Serum iPTH values before incision (io-iPTH0), 10 and 20 min after removal of the last parathyroid (io-iPTH10, io-iPTH20), and on the first and fourth days after PTX (D1-iPTH, D4-iPTH) were recorded. Patients whose serum iPTH was >50 pg/mL in the first postoperative week were followed up within six months. PTX was considered successful if iPTH was <300 pg/mL; otherwise, the case was regarded as persistent SHPT. Overall, 86.4% of patients underwent successful PTX, 9.8% had persistent SHPT, and 3.8% were undetermined. Intraoperative serum iPTH demonstrated no significant differences between the two subgroups with or without chronic hepatitis. Receiver operating characteristic (ROC) curves showed that an io-iPTH decrease of >88.9% at 20 min (io-iPTH20%) predicted successful PTX (area under the curve [AUC] 0.909, sensitivity 78.6%, specificity 88.5%), thereby avoiding unnecessary exploration and reducing operative complications. D4-iPTH >147.4 pg/mL predicted persistent SHPT (AUC 0.998, sensitivity 100%, specificity 99.5%), so that medical intervention or reoperation can be started in a timely manner. PMID:27231027

  12. Diagnostic Accuracy Study of Intraoperative and Perioperative Serum Intact PTH Level for Successful Parathyroidectomy in 501 Secondary Hyperparathyroidism Patients.

    PubMed

    Zhang, Lina; Xing, Changying; Shen, Chong; Zeng, Ming; Yang, Guang; Mao, Huijuan; Zhang, Bo; Yu, Xiangbao; Cui, Yiyao; Sun, Bin; Ouyang, Chun; Ge, Yifei; Jiang, Yao; Yin, Caixia; Zha, Xiaoming; Wang, Ningning

    2016-01-01

    Parathyroidectomy (PTX) is an effective treatment for severe secondary hyperparathyroidism (SHPT); however, persistent SHPT may occur because of supernumerary and ectopic parathyroids. Here a diagnostic accuracy study of intraoperative and perioperative serum intact parathyroid hormone (iPTH) was performed to predict successful surgery in 501 patients who received total PTX + autotransplantation without thymectomy. Serum iPTH values before incision (io-iPTH0), 10 and 20 min after removal of the last parathyroid (io-iPTH10, io-iPTH20), and on the first and fourth days after PTX (D1-iPTH, D4-iPTH) were recorded. Patients whose serum iPTH was >50 pg/mL in the first postoperative week were followed up within six months. PTX was considered successful if iPTH was <300 pg/mL; otherwise, the case was regarded as persistent SHPT. Overall, 86.4% of patients underwent successful PTX, 9.8% had persistent SHPT, and 3.8% were undetermined. Intraoperative serum iPTH demonstrated no significant differences between the two subgroups with or without chronic hepatitis. Receiver operating characteristic (ROC) curves showed that an io-iPTH decrease of >88.9% at 20 min (io-iPTH20%) predicted successful PTX (area under the curve [AUC] 0.909, sensitivity 78.6%, specificity 88.5%), thereby avoiding unnecessary exploration and reducing operative complications. D4-iPTH >147.4 pg/mL predicted persistent SHPT (AUC 0.998, sensitivity 100%, specificity 99.5%), so that medical intervention or reoperation can be started in a timely manner. PMID:27231027
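
    The reported AUCs come from ROC analysis; the AUC equals the probability that a randomly chosen success case shows a larger iPTH decrease than a randomly chosen persistent case (the Mann-Whitney interpretation). A sketch with hypothetical percentage drops, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a random positive outscores a random
    negative (Mann-Whitney U / (n_pos * n_neg)); ties count as 0.5."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical percentage drops in iPTH 20 min after resection
drop_success    = [95.1, 92.3, 90.8, 89.5, 93.7]   # successful PTX
drop_persistent = [70.2, 91.0, 60.3, 88.0]         # persistent SHPT
auc = roc_auc(drop_success, drop_persistent)
```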

  13. A high accuracy femto-/picosecond laser damage test facility dedicated to the study of optical thin films

    NASA Astrophysics Data System (ADS)

    Mangote, B.; Gallais, L.; Zerrad, M.; Lemarchand, F.; Gao, L. H.; Commandré, M.; Lequime, M.

    2012-01-01

    A laser damage test facility delivering pulses from 100 fs to 3 ps and designed to operate at 1030 nm is presented. The details of its implementation and performance are given. The originality of this system lies in its online damage detection system based on Nomarski microscopy and in a non-conventional energy detection method based on a cooled CCD, which offers the possibility of obtaining the laser-induced damage threshold (LIDT) with high accuracy. Applications of this instrument to the study of thin films under laser irradiation are presented. In particular, the deterministic behavior of sub-picosecond damage is investigated for fused silica and oxide films. It is demonstrated that the transition from 0 to 1 damage probability is very sharp and the LIDT is perfectly deterministic at a few hundred femtoseconds. Because the damage process in dielectric materials results from electronic processes, specific information such as the material bandgap is needed for the interpretation of results and the application of scaling laws. A review of the different approaches for estimating the absorption gap of optical dielectric coatings is conducted, and the results given by the different methods are compared and discussed. The LIDT and gap of several oxide materials are then measured with the presented instrument: Al2O3, Nb2O5, HfO2, SiO2, Ta2O5, and ZrO2. The obtained relation between the LIDT and gap at 1030 nm confirms the linear evolution of the threshold with the bandgap that exists at 800 nm, and our work expands the number of tested materials.

  14. Accuracy of Intraocular Lens Power Formulas Involving 148 Eyes with Long Axial Lengths: A Retrospective Chart-Review Study

    PubMed Central

    Chen, Chong; Xu, Xian; Miao, Yuyu; Zheng, Gaoxin; Sun, Yong; Xu, Xun

    2015-01-01

    Purpose. This study aims to compare the accuracy of intraocular lens power calculation formulas in eyes with long axial lengths from Chinese patients subjected to cataract surgery. Methods. A total of 148 eyes with an axial length of >26 mm from 148 patients who underwent phacoemulsification with intraocular lens implantation were included. The Haigis, Hoffer Q, Holladay 1, and SRK/T formulas were used to calculate the refractive power of the intraocular lenses and the postoperative estimated power. Results. Overall, the Haigis formula achieved the lowest median absolute error, 1.025 D (P < 0.01 for Haigis versus each of the other formulas), followed by the SRK/T formula (1.040 D). All formulas were least accurate for eyes with an axial length of >33 mm, and median absolute errors were significantly higher for those eyes than for eyes with axial lengths of 26.01–30.00 mm. Absolute error was correlated with axial length for the SRK/T (r = 0.212, P = 0.010) and Hoffer Q (r = 0.223, P = 0.007) formulas. For axial lengths > 33 mm, eyes exhibited a postoperative hyperopic refractive error. Conclusions. The Haigis and SRK/T formulas may be more suitable for calculating intraocular lens power for eyes with axial lengths ranging from 26 to 33 mm. For axial lengths over 33 mm, the Haigis formula may be more accurate. PMID:26793392
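
    The four formulas compared in the study are lengthy; as an illustration of the general shape of regression-based IOL power prediction, here is the classic SRK formula, a simpler precursor of SRK/T and not one of the formulas tested, with hypothetical inputs:

```python
def srk_power(a_constant, axial_length_mm, mean_k_diopters):
    """Classic SRK regression formula for emmetropia:
    P = A - 2.5*L - 0.9*K
    where A is the lens-specific A-constant, L the axial length (mm),
    and K the mean corneal power (D)."""
    return a_constant - 2.5 * axial_length_mm - 0.9 * mean_k_diopters

# Hypothetical long eye: A-constant 118.4, axial length 27.0 mm, mean K 43.0 D
power = srk_power(118.4, 27.0, 43.0)
```

    The -2.5 D per mm of axial length makes clear why long eyes are the hard case: small biometry errors translate into large power errors, which the later-generation formulas in the study try to mitigate.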

  15. Diagnostic accuracy of cardiothoracic ratio on admission chest radiography to detect left or right ventricular systolic dysfunction: a retrospective study

    PubMed Central

    Chana, Harmeet S; Martin, Claire A; Cakebread, Holly E; Adjei, Felicia D

    2015-01-01

    Objectives To determine the diagnostic accuracy of the cardiothoracic ratio on postero-anterior or antero-posterior chest radiographs in predicting left ventricular or right ventricular dysfunction on echocardiography in an inpatient population. Design Retrospective study. Setting Two secondary care hospitals in the United Kingdom. Participants Four hundred consecutive inpatient echocardiograms were screened for inclusion, along with chest radiographs (both postero-anterior and antero-posterior). The cardiothoracic ratio was calculated from chest radiographs, along with quantitative and qualitative measures of left ventricular or right ventricular dysfunction on echocardiography. Main outcome measures Sensitivity and specificity of the cardiothoracic ratio across a range of values to detect moderate/severe left ventricular and/or right ventricular dysfunction on echocardiography. Results Overall, 272 records met the inclusion criteria. The prevalence of left ventricular/right ventricular dysfunction on echocardiography was 26% in an inpatient population with high clinical suspicion of cardiac disease referred for echocardiography. Over the range of cardiothoracic ratio values on postero-anterior films, a value of >0.55 yielded the best sensitivity (62.5%) and specificity (76.5%) for diagnosing left ventricular/right ventricular impairment (positive likelihood ratio 2.56), with a positive predictive value of 29.5%. Cardiothoracic ratio on antero-posterior film was not predictive of left ventricular/right ventricular impairment on echocardiography. Conclusions In the context of an acute admission, cardiothoracic ratio measured on postero-anterior or antero-posterior films has limited value in detecting moderate left ventricular and/or right ventricular systolic dysfunction. Previously established absolute values may be unreliable by modern standards. PMID:26152673
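
    The accuracy measures reported above all follow from a 2x2 table of test result against echocardiographic dysfunction. A hedged sketch with hypothetical counts (not the study's data):

```python
# Sensitivity, specificity, positive likelihood ratio, and positive
# predictive value from 2x2 counts. Counts below are hypothetical,
# not the study's data.

def diagnostic_measures(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)               # true positive rate
    specificity = tn / (tn + fp)               # true negative rate
    lr_positive = sensitivity / (1 - specificity)
    ppv = tp / (tp + fp)                       # positive predictive value
    return sensitivity, specificity, lr_positive, ppv

# e.g. 50 dysfunctional hearts flagged by CTR > 0.55, 30 missed (fn),
# 30 false alarms (fp), 170 correctly ruled out (tn):
sens, spec, lr_pos, ppv = diagnostic_measures(tp=50, fp=30, fn=30, tn=170)
print(round(sens, 3), round(spec, 3), round(lr_pos, 2), round(ppv, 3))
```

    Note that the positive predictive value, unlike sensitivity and specificity, depends on disease prevalence, which is why it is so low (29.5%) in the study despite a reasonable likelihood ratio.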

  16. A high accuracy femto-/picosecond laser damage test facility dedicated to the study of optical thin films

    SciTech Connect

    Mangote, B.; Gallais, L.; Zerrad, M.; Lemarchand, F.; Gao, L. H.; Commandre, M.; Lequime, M.

    2012-01-15

    A laser damage test facility delivering pulses from 100 fs to 3 ps and designed to operate at 1030 nm is presented. Details of its implementation and performance are given. The originality of this system lies in the online damage detection system based on Nomarski microscopy and in a non-conventional energy detection method based on a cooled CCD, which makes it possible to obtain the laser-induced damage threshold (LIDT) with high accuracy. Applications of this instrument to the study of thin films under laser irradiation are presented. In particular, the deterministic behavior of sub-picosecond damage is investigated in the case of fused silica and oxide films. It is demonstrated that the 0-1 damage probability transition is very sharp and that the LIDT is perfectly deterministic at a few hundred femtoseconds. Since the damage process in dielectric materials is the result of electronic processes, specific information such as the material bandgap is needed for the interpretation of results and the application of scaling laws. A review of the different approaches for estimating the absorption gap of optical dielectric coatings is conducted, and the results given by the different methods are compared and discussed. The LIDT and gap of several oxide materials are then measured with the presented instrument: Al2O3, Nb2O5, HfO2, SiO2, Ta2O5, and ZrO2. The obtained relation between the LIDT and gap at 1030 nm confirms the linear evolution of the threshold with the bandgap previously established at 800 nm, and our work expands the number of tested materials.
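
    The linear LIDT-versus-bandgap relation described above can be summarized by an ordinary least-squares line LIDT = a * Eg + b fitted over the measured materials. A minimal sketch; the bandgap and threshold values below are invented, not the paper's measurements:

```python
# Ordinary least-squares fit of damage threshold against material bandgap.
# Data points are hypothetical placeholders, not measured values.

def linear_fit(x, y):
    """Return slope a and intercept b of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

gaps = [3.2, 4.0, 5.1, 5.8, 7.5, 9.0]    # bandgap Eg in eV (hypothetical)
lidts = [0.9, 1.1, 1.5, 1.7, 2.3, 2.8]   # LIDT in J/cm^2 (hypothetical)
a, b = linear_fit(gaps, lidts)
print(round(a, 3), round(b, 3))
```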

  17. Positional Accuracy Assessment of the Openstreetmap Buildings Layer Through Automatic Homologous Pairs Detection: the Method and a Case Study

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.; Zamboni, G.

    2016-06-01

    OpenStreetMap (OSM) is currently the largest openly licensed collection of geospatial data. As OSM is increasingly exploited in a variety of applications, research has placed great attention on the assessment of its quality. This work focuses on assessing the quality of OSM buildings. While most of the studies available in the literature are limited to the evaluation of OSM building completeness, this work proposes an original approach to assess the positional accuracy of OSM buildings based on comparison with a reference dataset. The comparison relies on a quasi-automated detection of homologous pairs in the two datasets. Based on the homologous pairs found, warping algorithms such as affine transformations and multi-resolution splines can be applied to the OSM buildings to generate a new version with an optimal local match to the reference layer. A quality assessment of the OSM buildings of Milan Municipality (Northern Italy), covering an area of about 180 km2, is then presented. After computing some measures of completeness, the algorithm based on homologous points is run using the building layer of the official vector cartography of Milan Municipality as the reference dataset. Approximately 100000 homologous points are found, which show a systematic translation of about 0.4 m in both the X and Y directions and a mean distance of about 0.8 m between the datasets. Besides its efficiency and high degree of automation, the algorithm generates a warped version of the OSM buildings which, having by definition a closer match to the reference buildings, can eventually be integrated into the OSM database.
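
    The systematic translation and mean distance reported above come directly from the homologous pairs: the least-squares translation between two point sets is simply the mean displacement vector, and the mean pair distance summarizes how far apart the layers are. A sketch with hypothetical coordinates (not the Milan data):

```python
# Systematic offset between two building layers from homologous point
# pairs: least-squares translation = mean displacement; mean pair
# distance summarizes the residual separation. Coordinates are invented.
import math

def translation_and_mean_distance(ref_pts, osm_pts):
    n = len(ref_pts)
    dx = sum(r[0] - o[0] for r, o in zip(ref_pts, osm_pts)) / n
    dy = sum(r[1] - o[1] for r, o in zip(ref_pts, osm_pts)) / n
    mean_dist = sum(math.hypot(r[0] - o[0], r[1] - o[1])
                    for r, o in zip(ref_pts, osm_pts)) / n
    return (dx, dy), mean_dist

ref = [(10.0, 20.0), (30.0, 40.0), (50.0, 60.0)]   # reference cartography (m)
osm = [(9.6, 19.5), (29.5, 39.6), (49.7, 59.7)]    # OSM buildings (m)
(dx, dy), d = translation_and_mean_distance(ref, osm)
print(round(dx, 2), round(dy, 2), round(d, 2))
```

    An affine transformation or multi-resolution spline, as mentioned in the abstract, generalizes this by allowing the correction to vary locally instead of being a single global shift.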

  18. ForestPMPlot: A Flexible Tool for Visualizing Heterogeneity Between Studies in Meta-analysis

    PubMed Central

    Kang, Eun Yong; Park, Yurang; Li, Xiao; Segrè, Ayellet V.; Han, Buhm; Eskin, Eleazar

    2016-01-01

    Meta-analysis has become a popular approach for combining the results of different genetic association studies. A key challenge in meta-analysis is heterogeneity, or the differences in effect sizes between studies, which complicates the interpretation of meta-analyses. In this paper, we describe ForestPMPlot, a flexible visualization tool for analyzing the studies included in a meta-analysis. The tool's main feature is visualizing the differences in the effect sizes of the studies, to help explain why the studies exhibit heterogeneity for a particular phenotype and locus pair under different conditions. We show the application of this tool to interpret a meta-analysis of 17 mouse studies and to interpret a multi-tissue eQTL study. PMID:27194809
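
    The heterogeneity that ForestPMPlot visualizes is conventionally quantified with Cochran's Q (the weighted sum of squared deviations of study effects from the fixed-effect pooled estimate) and the I^2 statistic. A generic sketch of those standard formulas, with invented effect sizes and standard errors; this is not ForestPMPlot's code:

```python
# Fixed-effect pooled estimate, Cochran's Q, and I^2 for a set of
# study effect sizes with standard errors. Inputs are invented.

def cochran_q_i2(effects, ses):
    w = [1.0 / s ** 2 for s in ses]                     # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I^2: proportion of total variation attributable to heterogeneity
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return pooled, q, i2

pooled, q, i2 = cochran_q_i2([0.20, 0.35, 0.80], [0.10, 0.12, 0.15])
print(round(pooled, 3), round(q, 2), round(i2, 2))
```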

  19. Accuracy assessment of a marker-free method for registration of CT and stereo images applied in image-guided implantology: a phantom study.

    PubMed

    Mohagheghi, Saeed; Ahmadian, Alireza; Yaghoobee, Siamak

    2014-12-01

    To assess the accuracy of a proposed marker-free registration method against the conventional marker-based method in an image-guided dental system, and to investigate the best configurations of anatomical landmarks for various surgical fields, a phantom study was conducted using a CT-compatible dental phantom containing implanted targets. Two marker-free registration methods were evaluated: the first using dental anatomical landmarks and the second using a reference marker tool. Six implanted markers distributed in the inner space of the phantom were used as the targets; the values of target registration error (TRE) for each target were measured and compared with the marker-based method. Then, the effects of different landmark configurations on TRE values, measured using the Parsiss IV Guided Navigation system (Parsiss, Tehran, Iran), were investigated to find the landmark arrangement giving the minimum registration error in each target region. It was shown that marker-free registration can be as precise as the marker-based method. This has a great impact on image-guided implantology systems, as it removes the drawbacks of fiducial markers for patient and surgeon. It was also shown that smaller TRE values can be achieved by using appropriate landmark configurations and moving the center of the landmark set closer to the surgery target. Other common factors would not necessarily decrease the TRE value, so the conventional rules accepted in the clinical community for reducing TRE should be adapted to the selected field of dental surgery.
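
    Target registration error, as used above, is the distance between a target's position after applying the estimated registration transform and its true physical position. A simplified 2-D rigid-transform sketch with hypothetical coordinates (the study's registration is 3-D and uses the navigation system's own transform):

```python
# TRE: apply the estimated rigid transform to each target's image
# coordinates and measure the distance to its true position.
# Transform parameters and coordinates are hypothetical.
import math

def apply_rigid(theta, tx, ty, p):
    """Rotate p by theta (radians) about the origin, then translate."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

def tre(theta, tx, ty, image_targets, true_targets):
    return [math.dist(apply_rigid(theta, tx, ty, p), q)
            for p, q in zip(image_targets, true_targets)]

errors = tre(0.0, 0.5, -0.2,
             image_targets=[(10.0, 10.0), (20.0, 5.0)],
             true_targets=[(10.5, 9.8), (20.6, 4.9)])
print([round(e, 3) for e in errors])
```

    Crucially, TRE is measured at the surgical targets, not at the landmarks used to estimate the transform, which is why landmark configuration relative to the target matters so much.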

  20. A browser-based tool for space weather and space climate studies

    NASA Astrophysics Data System (ADS)

    Tanskanen, E. I.; Pérez-Suárez, D.

    2014-04-01

    A browser-based research tool has been developed for online time series analysis. Large amounts of high-resolution measurements are now available from different heliospheric locations. How best to handle the ever-increasing amount of information about near-Earth space weather conditions, and how to improve social data analysis tools for space studies, have become pressing issues. To address this problem, we have developed an interactive web interface, called Substorm Zoo, which we expect to become a powerful tool for scientists and a useful tool for the public.

  1. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  2. The Effect of Delayed-JOLs and Sentence Generation on Children's Monitoring Accuracy and Regulation of Idiom Study

    ERIC Educational Resources Information Center

    van Loon, Mariëtte H.; de Bruin, Anique B. H.; van Gog, Tamara; van Merriënboer, Jeroen J. G.

    2013-01-01

    When studying verbal materials, both adults and children are often poor at accurately monitoring their level of learning and regulating their subsequent restudy of materials, which leads to suboptimal test performance. The present experiment investigated how monitoring accuracy and regulation of study could be improved when learning idiomatic…

  3. A Monte Carlo Study of the Effect of Item Characteristic Curve Estimation on the Accuracy of Three Person-Fit Statistics

    ERIC Educational Resources Information Center

    St-Onge, Christina; Valois, Pierre; Abdous, Belkacem; Germain, Stephane

    2009-01-01

    To date, there have been no studies comparing parametric and nonparametric Item Characteristic Curve (ICC) estimation methods on the effectiveness of Person-Fit Statistics (PFS). The primary aim of this study was to determine if the use of ICCs estimated by nonparametric methods would increase the accuracy of item response theory-based PFS for…

  4. Intravital microscopy as a tool to study drug delivery in preclinical studies

    PubMed Central

    Amornphimoltham, Panomwat; Masedunskas, Andrius; Weigert, Roberto

    2010-01-01

    The technical developments in the field of non-linear microscopy have made intravital microscopy one of the most successful techniques for studying physiological and pathological processes in live animals. Intravital microscopy has been utilized to address many biological questions in basic research and is now a fundamental tool for preclinical studies, with an enormous potential for clinical applications. The ability to dynamically image cellular and subcellular structures, combined with the possibility of performing longitudinal studies, has empowered investigators to use this discipline to study the mechanisms of action of therapeutic agents and to assess their efficacy on their targets in vivo. The goal of this review is to provide a general overview of the recent advances in intravital microscopy and to discuss some of its applications in preclinical studies. PMID:20933026

  5. Effects of random study checks and guided notes study cards on middle school special education students' notetaking accuracy and science vocabulary quiz scores

    NASA Astrophysics Data System (ADS)

    Wood, Charles L.

    Federal legislation mandates that all students with disabilities have meaningful access to the general education curriculum and that students with and without disabilities be held equally accountable to the same academic standards (IDEIA, 2004; NCLB, 2001). Many students with disabilities, however, perform poorly in academic content courses, especially at the middle and secondary school levels. Previous research has reported increased notetaking accuracy and quiz scores over lecture content when students completed guided notes compared to taking their own notes. This study evaluated the effects of a pre-quiz review procedure and specially formatted guided notes on middle school special education students' learning of science vocabulary. It compared the effects of three experimental conditions, (a) Own Notes (ON), (b) Own Notes + Random Study Checks (ON+RSC), and (c) Guided Notes Study Cards + Random Study Checks (GNSC+RSC), on each student's accuracy of notes, next-day quiz scores, and review quiz scores. Each session, the teacher presented 12 science vocabulary terms and definitions during a lecture and students took notes. The students were given 5 minutes to study their notes at the end of each session and were reminded to study their notes at home and during study hall period. In the ON condition, students took notes on a sheet of paper with lines numbered from 1 to 12. Just before each next-day quiz in the ON+RSC condition, students used write-on response cards to answer two teacher-posed questions over randomly selected vocabulary terms from the previous day's lecture. If the answer on a randomly selected student's response card was correct, that student earned a lottery ticket for inexpensive prizes and a quiz bonus point for herself and each classmate. In the GNSC+RSC condition, students took notes on specially formatted guided notes that, after the lecture, they cut into a set of flashcards that could be used for study.
The students' mean notetaking accuracy was 75

  6. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    ERIC Educational Resources Information Center

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using 2 different scoring methods, the authors examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children ages 3;11-5;8 (years;months) participated--17 children with SLI and 17 typically developing…

  7. The effects of relatedness and GxE interaction on prediction accuracies in genomic selection: a study in cassava

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Prior to implementation of genomic selection, an evaluation of the potential accuracy of prediction can be obtained by cross validation. In this procedure, a population with both phenotypes and genotypes is split into training and validation sets. The prediction model is fitted using the training se...

  8. Referential Communication Accuracy of Mother-Child Pairs and Children's Later Scholastic Achievement: A Follow-Up Study.

    ERIC Educational Resources Information Center

    McDevitt, Teresa M.; And Others

    1987-01-01

    The relationship between the referential communication accuracy of mothers and their 4-year-old children and the children's achievement in vocabulary and mathematics at age 12 was examined in 47 American and 44 Japanese mother-child pairs. Positive correlations were found in both cultures. (Author/BN)

  9. Accuracy and efficacy of percutaneous biopsy and ablation using robotic assistance under computed tomography guidance: a phantom study

    PubMed Central

    Koethe, Yilun; Xu, Sheng; Velusamy, Gnanasekar; Wood, Bradford J.; Venkatesan, Aradhana M.

    2014-01-01

    Objective To compare the accuracy of a robotic interventional radiologist (IR) assistance platform with a standard freehand technique for computed-tomography (CT)-guided biopsy and simulated radiofrequency ablation (RFA). Methods The accuracy of freehand single-pass needle insertions into abdominal phantoms was compared with insertions facilitated with the use of a robotic assistance platform (n = 20 each). Post-procedural CTs were analysed for needle placement error. Percutaneous RFA was simulated by sequentially placing five 17-gauge needle introducers into 5-cm diameter masses (n = 5) embedded within an abdominal phantom. Simulated ablations were planned based on pre-procedural CT, before multi-probe placement was executed freehand. Multi-probe placement was then performed on the same 5-cm mass using the ablation planning software and robotic assistance. Post-procedural CTs were analysed to determine the percentage of untreated residual target. Results Mean needle tip-to-target errors were reduced with use of the IR assistance platform (both P < 0.0001). Reduced percentage residual tumour was observed with treatment planning (P = 0.02). Conclusion Improved needle accuracy and optimised probe geometry are observed during simulated CT-guided biopsy and percutaneous ablation with use of a robotic IR assistance platform. This technology may be useful for clinical CT-guided biopsy and RFA, when accuracy may have an impact on outcome. PMID:24220755

  10. Dietary Adherence Monitoring Tool for Free-living, Controlled Feeding Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To devise a dietary adherence monitoring tool for use in controlled human feeding trials involving free-living study participants. Methods: A scoring tool was devised to measure and track dietary adherence for an 8-wk randomized trial evaluating the effects of two different dietary patter...

  11. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    ERIC Educational Resources Information Center

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  12. Experience of Integrating Various Technological Tools into the Study and Future Teaching of Mathematics Education Students

    ERIC Educational Resources Information Center

    Gorev, Dvora; Gurevich-Leibman, Irina

    2015-01-01

    This paper presents our experience of integrating technological tools into our mathematics teaching (in both disciplinary and didactic courses) for student-teachers. In the first cycle of our study, a variety of technological tools were used (e.g., dynamic software, hypertexts, video and applets) in teaching two disciplinary mathematics courses.…

  13. GoPro as an Ethnographic Tool: A Wayfinding Study in an Academic Library

    ERIC Educational Resources Information Center

    Kinsley, Kirsten M.; Schoonover, Dan; Spitler, Jasmine

    2016-01-01

    In this study, researchers sought to capture students' authentic experience of finding books in the main library using a GoPro camera and the think-aloud protocol. The GoPro provided a first-person perspective and was an effective ethnographic tool for observing a student's individual experience, while also demonstrating what tools they use to…

  14. Tools to study and manage grazing behavior at multiple scales to enhance the sustainability of livestock

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Free-ranging animal behavior is a multifaceted and complex phenomenon within rangeland ecology that must be understood and ultimately managed. Improving behavioral studies requires tools appropriate for use at the landscape scale. Though tools alone do not assure research will generate accurate in...

  15. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    ERIC Educational Resources Information Center

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  16. K-12 Student Use of Web 2.0 Tools: A Global Study

    ERIC Educational Resources Information Center

    Toledo, Cheri; Shepard, MaryFriend

    2011-01-01

    Over the past decade, Internet use has increased 445% worldwide. This boom has enabled widespread access to online tools and digital spaces for educational practices. The results of this study of Web 2.0 tool use in kindergarten through high school (K-12) classrooms around the world will be presented. A web-based survey was sent out through online…

  17. Refining Ovarian Cancer Test accuracy Scores (ROCkeTS): protocol for a prospective longitudinal test accuracy study to validate new risk scores in women with symptoms of suspected ovarian cancer

    PubMed Central

    Sundar, Sudha; Rick, Caroline; Dowling, Francis; Au, Pui; Rai, Nirmala; Champaneria, Rita; Stobart, Hilary; Neal, Richard; Davenport, Clare; Mallett, Susan; Sutton, Andrew; Kehoe, Sean; Timmerman, Dirk; Bourne, Tom; Van Calster, Ben; Gentry-Maharaj, Aleksandra; Deeks, Jon

    2016-01-01

    Introduction Ovarian cancer (OC) is associated with non-specific symptoms such as bloating, making accurate diagnosis challenging: only 1 in 3 women with OC presents through primary care referral. National Institute for Health and Care Excellence guidelines recommend sequential testing with CA125 and routine ultrasound in primary care. However, these diagnostic tests have limited sensitivity or specificity. Improving the accuracy of triage in women with vague symptoms is likely to reduce mortality by streamlining referral and care pathways. The Refining Ovarian Cancer Test Accuracy Scores (ROCkeTS; HTA 13/13/01) project will derive and validate new tests/risk prediction models that estimate the probability of having OC in women with symptoms. This protocol refers to the prospective study only (phase III). Methods and analysis ROCkeTS comprises four parallel phases. The full ROCkeTS protocol can be found at http://www.birmingham.ac.uk/ROCKETS. Phase III is a prospective test accuracy study that will recruit 2450 patients from 15 UK sites. Recruited patients complete symptom and anxiety questionnaires, donate a serum sample and undergo ultrasound scored as per International Ovarian Tumour Analysis (IOTA) criteria. Recruitment is at rapid access clinics, emergency departments and elective clinics. Models to be evaluated include those based on ultrasound derived by the IOTA group and novel models derived from analysis of existing data sets. Estimates of the sensitivity, specificity, c-statistic (area under the receiver operating curve), positive predictive value and negative predictive value of the diagnostic tests will be evaluated, and a calibration plot for the models will be presented. ROCkeTS has received ethical approval from the NHS West Midlands REC (14/WM/1241) and is registered on the controlled trials website (ISRCTN17160843) and the National Institute of Health Research Cancer and Reproductive Health portfolios. PMID:27507231

  18. Test Expectancy Affects Metacomprehension Accuracy

    ERIC Educational Resources Information Center

    Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2011-01-01

    Background: Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and…

  19. Psychological Autopsy Studies as Diagnostic Tools: Are They Methodologically Flawed?

    ERIC Educational Resources Information Center

    Hjelmeland, Heidi; Dieserud, Gudrun; Dyregrov, Kari; Knizek, Birthe L.; Leenaars, Antoon A.

    2012-01-01

    One of the most established "truths" in suicidology is that almost all (90% or more) of those who kill themselves suffer from one or more mental disorders, and a causal link between the two is implied. Psychological autopsy (PA) studies constitute one main evidence base for this conclusion. However, there has been little reflection on the…

  20. Factor Analysis: A Tool for Studying Mathematics Anxiety.

    ERIC Educational Resources Information Center

    McAuliffe, Elizabeth A.; Trueblood, Cecil R.

    Mathematics anxiety and its relationship to other constructs was studied in 138 preservice elementary and special education teachers. The students, primarily women, were enrolled in a variety of professional courses and field experiences. Five instruments were administered, their factor structures were determined, and intercorrelations among the…

  1. Minecraft as a Creative Tool: A Case Study

    ERIC Educational Resources Information Center

    Cipollone, Maria; Schifter, Catherine C.; Moffat, Rick A.

    2014-01-01

    Many scholars are enthusiastic about the potential learning opportunities present in the sandbox-style gaming environment, Minecraft. In the following case study, the authors explored the use of Minecraft in a high school literature class and the presentation of characterization and plot in three student-made machinima, or films made in the game…

  2. Educator Study Groups: A Professional Development Tool to Enhance Inclusion

    ERIC Educational Resources Information Center

    Herner-Patnode, Leah

    2009-01-01

    Professional development can take many forms. The most effective development includes individual educators in the formation and planning process. Educator study groups are one form of professional development that allows major stakeholders in the education process the autonomy to develop individual and group goals. This often translates into an…

  3. The Writing Workshop as an Inservice Tool: A Case Study.

    ERIC Educational Resources Information Center

    Pollock, Jeri

    1994-01-01

    Presents a case study of an inservice writing workshop (at Our Lady of Mercy School in Rio de Janeiro, Brazil) designed to give teachers hands-on experience in applying computer writing to their individual subjects. Describes how a computer culture was developed at the school. (RS)

  4. Developing a Social Autopsy Tool for Dengue Mortality: A Pilot Study

    PubMed Central

    Arauz, María José; Ridde, Valéry; Hernández, Libia Milena; Charris, Yaneth; Carabali, Mabel; Villar, Luis Ángel

    2015-01-01

    Background Dengue fever is a public health problem in the tropical and sub-tropical world. Dengue cases, as well as dengue mortality, have grown dramatically in recent years. Colombia has experienced periodic dengue outbreaks with numerous dengue-related deaths, with the Santander department particularly affected. Although social determinants of health (SDH) shape health outcomes, including mortality, it is not yet understood how these affect dengue mortality. The aim of this pilot study was to develop and pre-test a social autopsy (SA) tool for dengue mortality. Methods and Findings The tool was developed and pre-tested in three steps. First, definitions of dengue fatal cases and ‘near misses’ (those who recovered from dengue complications) were elaborated. Second, a conceptual framework on determinants of dengue mortality was developed to guide the construction of the tool. Lastly, the tool was designed and pre-tested among three relatives of fatal cases and six near misses in 2013 in the metropolitan zone of Bucaramanga. After some modifications, the tool proved practical in the context of dengue mortality in Colombia. The tool aims to study the social, individual, and health-systems determinants of dengue mortality, and focuses on socioeconomic position and the intermediary SDH rather than the socioeconomic and political context. Conclusions The SA tool is based on the scientific literature, a validated conceptual framework, researchers’ and health professionals’ expertise, and a pilot study. It is the first time that a SA tool has been created for the dengue mortality context. Our work furthers the study of SDH and how these apply to neglected tropical diseases like dengue. This tool could be integrated into surveillance systems to provide complementary information on modifiable and avoidable death-related factors and, therefore, make it possible to formulate interventions for dengue mortality reduction. PMID:25658485

  5. Case studies: low cost, high-strength, large carbon foam tooling

    SciTech Connect

    Lucas, R.; Danford, H.

    2009-01-15

    A new carbon foam tooling system has been developed that results in a low-cost, high-strength material attractive for the creation of tooling for composite parts. Composites are stronger, lighter, and less subject to corrosion and fatigue than the materials currently used for the fabrication of advanced structures. Tools to manufacture these composite parts must be rigid, durable, and able to offer a coefficient of thermal expansion (CTE) closely matching that of the composites. Current technology makes it difficult to match the CTE of a composite part in the curing cycle with anything other than a carbon composite or a nickel-iron alloy such as Invar. Fabrication of metallic tooling requires many expensive stages of long duration with a large infrastructure investment. Carbon fiber reinforced polymer resin composite tooling has a shorter lead time but limited production use because of durability concerns. Coal-based carbon foam has a compatible CTE and strong durability, which make it an attractive alternative for use in tooling. The use of coal-based carbon foam in tooling for carbon composites is advantageous because of its low cost, light weight, machinability, vacuum integrity, and compatibility with a wide range of curing processes. Large-scale tooling case studies will be presented detailing carbon foam's potential for tooling applications.

  6. DeID - a data sharing tool for neuroimaging studies.

    PubMed

    Song, Xuebo; Wang, James; Wang, Anlin; Meng, Qingping; Prescott, Christian; Tsu, Loretta; Eckert, Mark A

    2015-01-01

    Funding institutions and researchers increasingly expect that data will be shared to increase scientific integrity and to provide other scientists with the opportunity to use the data with novel methods that may advance understanding in a particular field of study. In practice, sharing human subject data can be complicated because the data must be de-identified prior to sharing. Moreover, integrating the varied data types collected in a study can be challenging and time consuming. For example, sharing data from structural imaging studies of a complex disorder requires integrating imaging, demographic, and/or behavioral data in such a way that no subject identifiers are included in the de-identified dataset, with new subject labels or identification values that cannot be traced back to the original ones. We have developed a Java program that users can use to remove identifying information from neuroimaging datasets while still maintaining the association among different data types from the same subject for further studies. The software provides a series of user-interaction wizards that let users select the data variables to be de-identified, implements functions for auditing and validation of de-identified data, and enables the user to share the de-identified data in a single compressed package through various communication protocols, such as FTPS and SFTP. DeID runs on Windows, Linux, and Mac operating systems, and its open architecture allows it to be easily adapted to support a broader array of data types, with the goal of facilitating data sharing. DeID can be obtained at http://www.nitrc.org/projects/deid. PMID:26441500
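
    The core re-labeling step described above (new subject labels that cannot be traced back to the originals, while associations between data types are preserved) can be illustrated with a short sketch. This is a simplified illustration of the idea, not DeID's actual Java implementation; the `sub-` label format and field name are invented:

```python
# Replace subject identifiers with fresh random codes. The original->new
# mapping is kept private by the researcher; only the re-labeled records
# are shared. Same subject -> same new label, so data types stay linked.
import secrets

def deidentify(records, id_field="subject_id"):
    mapping = {}            # original -> new; retained privately, never shared
    shared = []
    for rec in records:
        orig = rec[id_field]
        if orig not in mapping:
            mapping[orig] = "sub-" + secrets.token_hex(4)  # untraceable label
        clean = dict(rec)
        clean[id_field] = mapping[orig]
        shared.append(clean)
    return shared, mapping

records = [{"subject_id": "JD-1972", "scan": "T1w"},
           {"subject_id": "JD-1972", "scan": "fMRI"},
           {"subject_id": "AK-1985", "scan": "T1w"}]
shared, mapping = deidentify(records)
print(shared)
```

    Using a cryptographically random code rather than, say, a hash of the original ID matters: a hash could be reversed by hashing candidate identifiers and comparing.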

  7. Matrix isolation as a tool for studying interstellar chemical reactions

    NASA Technical Reports Server (NTRS)

    Ball, David W.; Ortman, Bryan J.; Hauge, Robert H.; Margrave, John L.

    1989-01-01

    Since the identification of the OH radical as an interstellar species, over 50 molecular species have been identified as interstellar denizens. While the identification of new species appears straightforward, explaining their mechanisms of formation is not. Most astronomers concede that large bodies like interstellar dust grains are necessary for the adsorption of molecules and their reaction energies, but many of the mechanistic steps are unknown and speculative. It is proposed that data from matrix isolation experiments involving the reactions of refractory materials (especially C, Si, and Fe atoms and clusters) with small molecules (mainly H2, H2O, CO, CO2) are particularly applicable to explaining the mechanistic details of likely interstellar chemical reactions. In many cases, matrix isolation techniques are the sole method of studying such reactions; also in many cases, complexations and bond rearrangements yield molecules never before observed. The study of these reactions thus provides a logical basis for the mechanisms of interstellar reactions. A list of reactions that would simulate interstellar chemical reactions is presented. These reactions were studied using FTIR-matrix isolation techniques.

  8. Development of a Burn Escharotomy Assessment Tool: A Pilot Study.

    PubMed

    Ur, Rebecca; Holmes, James H; Johnson, James E; Molnar, Joseph A; Carter, Jeffrey E

    2016-01-01

    Severe burn injuries can require escharotomies, which are urgent, infrequent, and relatively high-risk procedures necessary to preserve limb perfusion and sometimes ventilation. The American Burn Association Advanced Burn Life Support© course educates surgeons and emergency providers about escharotomy incisions but lacks a biomimetic trainer on which to demonstrate, practice, or assess them. The goal was to build an affordable biomimetic trainer with discrete points of failure and to pilot a validation study. Fellowship-trained burn and plastic surgeons worked with special-effects artists and anatomists to develop a biomimetic trainer with three discrete points of failure: median or ulnar nerve injury, fasciotomy, and failure to check the distal pulse. Participants were divided into experienced and inexperienced groups and were surveyed pre- and post-procedure while performing a timed escharotomy on the biomimetic model. The trainer's total cost was less than $35 per participant. Eighteen participants were involved in the study. The inexperienced participants (0-1 prior escharotomies performed) had significantly more violations at the discrete points of failure than the more experienced participants (P = .036). Face validity was supported by 100% of participants agreeing that the model appeared similar to real life and was valuable in their training. Given the advancements in biomimetic models and the need to train surgeons in infrequent, emergent surgical procedures, an escharotomy trainer is needed today. The authors developed an affordable model, with a successful pilot study demonstrating discrimination between experienced and inexperienced surgeons. Additional research is needed to improve the reliability and assessment metrics.
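
    The reported group comparison (P = .036) is the kind of small-sample 2×2 test typically done with Fisher's exact test. A minimal one-sided version follows, with invented counts for illustration; the paper's raw tallies are not given in the abstract.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    probability, under the null, of a top-left cell >= the observed a."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    total = comb(n, row1)
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        # hypergeometric probability of exactly x in the top-left cell
        p += comb(col1, x) * comb(n - col1, row1 - x) / total
    return p

# hypothetical: 7 of 9 inexperienced vs 2 of 9 experienced participants
# violated a discrete point of failure
p = fisher_exact_one_sided(7, 2, 2, 7)
```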

  9. The space elevator: a new tool for space studies.

    PubMed

    Edwards, Bradley C

    2003-06-01

    The objective has been to develop a viable scenario for the construction, deployment and operation of a space elevator using current or near future technology. This effort has been primarily a paper study with several experimental tests of specific systems. Computer simulations, engineering designs, literature studies and inclusion of existing programs have been utilized to produce a design for the first space elevator. The results from this effort illustrate a viable design using current and near-term technology for the construction of the first space elevator. The timeline for possible construction is within the coming decades and estimated costs are less than $10 B. The initial elevator would have a 5 ton/day capacity and operating costs near $100/lb for payloads going to any Earth orbit or traveling to the Moon, Mars, Venus or the asteroids. An operational space elevator would allow for larger and much longer-term biological space studies at selectable gravity levels. The high-capacity and low operational cost of this system would also allow for inexpensive searches for life throughout our solar system and the first tests of environmental engineering. This work is supported by a grant from the NASA Institute for Advanced Concepts (NIAC).

  10. Applying measures of discriminatory accuracy to revisit traditional risk factors for being small for gestational age in Sweden: a national cross-sectional study

    PubMed Central

    Juárez, Sol Pía; Wagner, Phillip; Merlo, Juan

    2014-01-01

    Objectives Small for gestational age (SGA) is considered as an indicator of intrauterine growth restriction, and multiple maternal and newborn characteristics have been identified as risk factors for SGA. This knowledge is mainly based on measures of average association (ie, OR) that quantify differences in average risk between exposed and unexposed groups. Nevertheless, average associations do not assess the discriminatory accuracy of the risk factors (ie, their ability to discriminate the babies who will develop SGA from those who will not). Therefore, applying measures of discriminatory accuracy rather than measures of association only, our study revisits known risk factors of SGA and discusses their role from a public health perspective. Design Cross-sectional study. We measured maternal (ie, smoking, hypertension, age, marital status, education) and delivery (ie, sex, gestational age, birth order) characteristics and performed logistic regression models to estimate both ORs and measures of discriminatory accuracy, like the area under the receiver operating characteristic curve (AU-ROC) and the net reclassification improvement. Setting Data were obtained from the Swedish Medical Birth Registry. Participants Our sample included 731 989 babies born during 1987–1993. Results We replicated the expected associations. For instance, smoking (OR=2.57), having had a previous SGA baby (OR=5.48) and hypertension (OR=4.02) were strongly associated with SGA. However, they show a very small discriminatory accuracy (AU-ROC≈0.5). The discriminatory accuracy increased, but remained unsatisfactorily low (AU-ROC=0.6), when including all variables studied in the same model. Conclusions Traditional risk factors for SGA alone or in combination have a low accuracy for discriminating babies with SGA from those without SGA. A proper understanding of these findings is of fundamental relevance to address future research and to design policymaking recommendations in a more informed
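
    The gap between a large OR and a near-chance AUC can be reproduced analytically for a single binary risk factor. Only the OR of 2.57 comes from the abstract; the exposure prevalence below is an assumption for illustration.

```python
def auc_binary(p_case, p_ctrl):
    """AUC of one binary marker: P(randomly chosen case outranks a
    randomly chosen control), counting ties as one half."""
    return (p_case * (1 - p_ctrl)
            + 0.5 * (p_case * p_ctrl + (1 - p_case) * (1 - p_ctrl)))

def odds(p):
    return p / (1 - p)

p_ctrl = 0.20                 # assumed exposure prevalence in controls
o_case = 2.57 * odds(p_ctrl)  # OR = 2.57 as reported for smoking
p_case = o_case / (1 + o_case)

auc = auc_binary(p_case, p_ctrl)  # about 0.60: barely above chance
```

    Even an exposure that more than doubles the odds of SGA barely moves the AUC above 0.5, which is exactly the pattern the study reports.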

  11. Improvement of focus accuracy on processed wafer

    NASA Astrophysics Data System (ADS)

    Higashibata, Satomi; Komine, Nobuhiro; Fukuhara, Kazuya; Koike, Takashi; Kato, Yoshimitsu; Hashimoto, Kohji

    2013-04-01

    As feature sizes in semiconductor devices shrink, process fluctuations, especially in focus, strongly affect device performance. Because focus control is an ongoing challenge in optical lithography, various studies have sought to improve focus monitoring and control. Focus errors arise from wafers, exposure tools, reticles, QCs, and so on. Few studies have addressed minimizing the measurement errors of the auto focus (AF) sensors of exposure tools, especially when processed wafers are exposed. Among current focus measurement techniques, the phase shift grating (PSG) focus monitor 1) has already been proposed; its basic principle is that the intensity of the diffraction light of the mask pattern is made asymmetric by arranging a π/2 phase shift area on a reticle. The resist pattern exposed at a defocus position is shifted on the wafer, and the shifted pattern can easily be measured using an overlay inspection tool. However, it is difficult to measure the shifted pattern on a processed wafer because of interruptions caused by other patterns in the underlayer. In this paper, we therefore propose the "SEM-PSG" technique, in which the shift of the PSG resist mark is measured with a critical-dimension scanning electron microscope (CD-SEM) to measure the focus error on the processed wafer. First, we evaluate the accuracy of the SEM-PSG technique. Second, by applying the SEM-PSG technique and feeding the results back to the exposure, we evaluate the focus accuracy on processed wafers. By applying SEM-PSG feedback, the focus accuracy on the processed wafer was improved from 40 to 29 nm in 3σ.
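
    The accuracy figures quoted (40 to 29 nm) are 3σ values, i.e. three times the standard deviation of the residual focus errors. A sketch with invented measurements:

```python
import statistics

def three_sigma(errors_nm):
    """Focus accuracy as 3x the sample standard deviation (3 sigma)."""
    return 3 * statistics.stdev(errors_nm)

# hypothetical residual focus errors (nm) before and after SEM-PSG feedback
before = [-20, 5, 14, -8, 11, -16, 3, 9, -12, 7]
after = [-9, 3, 6, -4, 5, -8, 2, 4, -6, 3]
```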

  12. Formaldehyde crosslinking: a tool for the study of chromatin complexes.

    PubMed

    Hoffman, Elizabeth A; Frey, Brian L; Smith, Lloyd M; Auble, David T

    2015-10-30

    Formaldehyde has been used for decades to probe macromolecular structure and function and to trap complexes, cells, and tissues for further analysis. Formaldehyde crosslinking is routinely employed for detection and quantification of protein-DNA interactions, interactions between chromatin proteins, and interactions between distal segments of the chromatin fiber. Despite widespread use and a rich biochemical literature, important aspects of formaldehyde behavior in cells have not been well described. Here, we highlight features of formaldehyde chemistry relevant to its use in analyses of chromatin complexes, focusing on how its properties may influence studies of chromatin structure and function.

  13. Identity method-a new tool for studying chemical fluctuations

    SciTech Connect

    Mackowiak, M.

    2012-06-15

    Event-by-event fluctuations of the chemical composition of the hadronic system produced in nuclear collisions are believed to be sensitive to properties of the transition between confined and deconfined strongly interacting matter. In this paper a new technique for the study of chemical fluctuations, the identity method, is introduced and its features are discussed. The method is tested using data on central Pb+Pb collisions at 40 A GeV registered by the NA49 experiment at the CERN SPS.

  15. Studying PubMed usages in the field for complex problem solving: Implications for tool design.

    PubMed

    Mirel, Barbara; Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2013-05-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists' behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists' problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  16. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  17. Emerging tools to study proteoglycan function during skeletal development.

    PubMed

    Brown, D S; Eames, B F

    2016-01-01

    In the past 20 years, appreciation for the varied roles of proteoglycans (PGs), which are specific types of sugar-coated proteins, has increased dramatically. PGs in the extracellular matrix were long known to impart structural functions to many tissues, especially articular cartilage, which cushions bones and allows mobility at skeletal joints. Indeed, osteoarthritis is a debilitating disease associated with loss of PGs in articular cartilage. Today, however, PGs have a demonstrated role in cell biological processes, such as growth factor signalling, prompting new perspectives on the etiology of PG-associated diseases. Here, we review diseases associated with defects in PG synthesis and sulfation, also highlighting current understanding of the underlying genetics, biochemistry, and cell biology. Since most research has analyzed a class of PGs called heparan sulfate PGs, more attention is paid here to studies of chondroitin sulfate PGs (CSPGs), which are abundant in cartilage. Interestingly, CSPG synthesis is tightly linked to the cell biological processes of secretion and lysosomal degradation, suggesting that these systems may be linked genetically. Animal models of loss of CSPG function have revealed that CSPGs impact skeletal development. Specifically, our work from a mutagenesis screen in zebrafish led to the hypothesis that cartilage PGs normally delay the timing of endochondral ossification. Finally, we outline emerging approaches in zebrafish that may revolutionize the study of cartilage PG function, including transgenic methods and novel imaging techniques. Our recent work with X-ray fluorescent imaging, for example, enables direct correlation of PG function with PG-dependent biological processes. PMID:27312503

  18. Cell transfection as a tool to study growth hormone action

    SciTech Connect

    Norstedt, G.; Enberg, B.; Francis, S.

    1994-12-31

    The isolation of growth hormone receptor (GHR) cDNA clones has made possible the transfection of GHRs into cultured cells. Our aim in this minireview is to show how the application of such approaches has benefited GHR research. GH stimulation of cells expressing GHR cDNAs can cause alterations of cellular function that mimic those of the endogenous GHR. GHR cDNA-transfected cells also offer a system in which the mechanism of GH action can be studied. Such a system has been used to demonstrate that the GHR itself becomes tyrosine phosphorylated and that further phosphorylation of downstream proteins is important in GH action. The GH signals are transmitted to the nucleus, and GH-regulated genes have now begun to be characterized. The ability to use cell transfection for mechanistic studies of GH action will be instrumental in defining domains within the receptor that are of functional importance and in determining the pathways whereby GH signals are conveyed within the cell. 33 refs., 2 tabs.

  19. Exposure tool chuck flatness study and effects on lithography

    NASA Astrophysics Data System (ADS)

    Mukherjee-Roy, Moitreyee; Tan, Cher-Huan; Tan, Yong K.; Samudra, Ganesh S.

    2001-04-01

    The flatness of the chuck on the stepper or scanner is critical for good patterning performance, especially in the sub-quarter-micron regime. In this study an attempt has been made to understand the flatness signature of the chuck by measuring the flatness of a super-flat wafer in two different notch orientations and subtracting the signatures. If the chuck or the wafer were ideally flat, there would be no difference in flatness signatures between the two orientations. In practice, however, a difference was found, as neither the chuck nor the wafer is perfectly flat. This difference could be used to gain an understanding of the flatness signature of the scanner chuck itself. The signature could be used by equipment manufacturers as an additional method to measure chuck flatness, so that only superior chucks are used in equipment being made for sub-quarter-micron lithography. The second part of this study examined the effect of this flatness on the resulting CD on wafers. Wafers with different flatness signatures were exposed at different orientations and the CD variations were evaluated. All wafers showed improvements in the orientation of better flatness. For some wafers the improvement was significant, but for others the result was close to the CD variation due to rework. This could be attributed to the inherent signatures on the wafers and how abrupt the change in flatness was. The wafer deformation factor was not analyzed, for brevity, as it would make the problem far more complex.

  20. Sampling strategies for improving tree accuracy and phylogenetic analyses: a case study in ciliate protists, with notes on the genus Paramecium.

    PubMed

    Yi, Zhenzhen; Strüder-Kypke, Michaela; Hu, Xiaozhong; Lin, Xiaofeng; Song, Weibo

    2014-02-01

    In order to assess how dataset-selection for multi-gene analyses affects the accuracy of inferred phylogenetic trees in ciliates, we chose five genes and the genus Paramecium, one of the most widely used model protist genera, and compared tree topologies of the single- and multi-gene analyses. Our empirical study shows that: (1) Using multiple genes improves phylogenetic accuracy, even when their one-gene topologies are in conflict with each other. (2) The impact of missing data on phylogenetic accuracy is ambiguous: resolution power and topological similarity, but not number of represented taxa, are the most important criteria of a dataset for inclusion in concatenated analyses. (3) As an example, we tested the three classification models of the genus Paramecium with a multi-gene based approach, and only the monophyly of the subgenus Paramecium is supported.

  1. Study on the maximum accuracy of the pseudopotential density functional method with localized atomic orbitals versus plane-wave basis sets.

    PubMed

    Gusso, Michele

    2008-01-28

    A detailed study on the accuracy attainable with numerical atomic orbitals in the context of pseudopotential first-principles density functional theory is presented. Dimers of first- and second-row elements are analyzed: bond lengths, atomization energies, and Kohn-Sham eigenvalue spectra obtained with localized orbitals and with plane-wave basis sets are compared. For each dimer, the cutoff radius, the shape, and the number of the atomic basis orbitals are varied in order to maximize the accuracy of the calculations. Optimized atomic orbitals are obtained following two routes: (i) maximization of the projection of plane wave results into atomic orbital basis sets and (ii) minimization of the total energy with respect to a set of primitive atomic orbitals as implemented in the OPENMX software package. It is found that by optimizing the numerical basis, chemical accuracy can be obtained even with a small set of orbitals.

  2. Conductimetry: a new tool for studying inhibition of elastolysis.

    PubMed

    Saulnier, J; Bostancioglu, K; Favre-Bonvin, G; Wallach, J

    1988-05-01

    The conductimetric method was applied to the measurement of human leukocyte elastase activity, using insoluble elastin as a substrate. Initial rates of elastolysis were derived from the conductance changes. A linear relationship of enzyme activity with enzyme concentration was demonstrated up to 400 nM of enzyme for three different substrates. In this concentration range, inhibition of elastolysis by eglin c was studied at different eglin c concentrations. A 50% inhibitory concentration of 0.13-0.15 microM of eglin c was derived from our results, corresponding to an inhibitor/enzyme ratio of about 0.5 and indicating strong inhibition, as previously demonstrated by authors using synthetic substrates.
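
    The reported linearity of initial rate versus enzyme concentration up to 400 nM is the kind of relationship a least-squares fit makes explicit. The data points below are invented for illustration; only the concentration range comes from the abstract.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# hypothetical initial elastolysis rates (conductance units/min) vs [E] (nM)
enzyme_nM = [50, 100, 200, 300, 400]
rate = [0.9, 2.1, 4.0, 6.1, 7.9]
slope, intercept = linear_fit(enzyme_nM, rate)
```

    A near-zero intercept and a stable slope across the range are what justify reading initial rates directly as enzyme activity.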

  3. Methods and tools to enjoy and to study inaccessible Heritage

    NASA Astrophysics Data System (ADS)

    Capone, M.; Campi, M.

    2014-06-01

    Our research on a multi-purpose survey of cultural Heritage located in the UNESCO Historical Centre of Naples has the following goals: to test innovative strategies for improving public enjoyment of inaccessible sites; to explore the use of interactive systems for studying heritage remotely; and to explore how to access the information system through AR applications. In this paper we focus on the comparison between interactive systems for accessing 3D data and the photogrammetric processing of panoramic images. We investigated: (a) the use of 360° panoramas for 3D restitutions; (b) the use of 360° panoramas as an interface to 3D data, to extract real 3D coordinates and accurately measure distances; and (c) the use of 3D PDF to access a 3D database.

  4. Collagen matrix as a tool in studying fibroblastic cell behavior.

    PubMed

    Kanta, Jiří

    2015-01-01

    Type I collagen is a fibrillar protein, a member of a large family of collagen proteins. It is present in most body tissues, usually in combination with other collagens and other components of extracellular matrix. Its synthesis is increased in various pathological situations, in healing wounds, in fibrotic tissues and in many tumors. After extraction from collagen-rich tissues it is widely used in studies of cell behavior, especially those of fibroblasts and myofibroblasts. Cells cultured in a classical way, on planar plastic dishes, lack the third dimension that is characteristic of body tissues. Collagen I forms gel at neutral pH and may become a basis of a 3D matrix that better mimics conditions in tissue than plastic dishes.

  5. Sphingolipidomics: An Important Mechanistic Tool for Studying Fungal Pathogens

    PubMed Central

    Singh, Ashutosh; Del Poeta, Maurizio

    2016-01-01

    Sphingolipids form a unique and complex group of bioactive lipids in fungi. Structurally, the sphingolipids of fungi are quite diverse, with unique differences in the sphingoid backbone, the amide-linked fatty acyl chain, and the polar head group. Two of the most studied and conserved sphingolipid classes in fungi are the glucosyl- or galactosyl-ceramides and the phosphorylinositol-containing phytoceramides. Comprehensive structural characterization and quantification of these lipids is largely based on advanced mass spectrometry-based lipidomic methods. While separation of complex lipid mixtures is achieved through high-performance liquid chromatography, soft electrospray ionization tandem mass spectrometry allows high sensitivity and selectivity of detection. Herein, we present an overview of the lipid extraction, chromatographic separation, and mass spectrometry employed in qualitative and quantitative sphingolipidomics in fungi. PMID:27148190

  7. The developing mouse dentition: a new tool for apoptosis study.

    PubMed

    Peterková, Renata; Peterka, Miroslav; Lesot, Hervé

    2003-12-01

    Developing limbs and differentiating neural and blood cells are traditional models used to study programmed cell death in mammals. The developing mouse dentition can also be an attractive model for studying apoptosis regulation. Apoptosis is most prominent during early odontogenesis in mice. The embryonic tooth pattern comprises not only the anlagen of functional teeth (incisor, molars) but also the vestiges of ancestral tooth primordia that must be suppressed. Apoptosis is involved in (a) the elimination of vestigial tooth primordia in the prospective toothless gap (diastema) between the incisor and molars and (b) the shaping of the germs of functional teeth. This type of apoptosis occurs in the dental epithelium according to a characteristic temporo-spatial pattern. Where apoptosis concentrates, specific signaling is also found. We proposed a hypothesis to explain the stimulation of apoptosis in the dental epithelium by integrating two concepts: (1) the regulation of epithelial budding by positional information generated from interactions between growth-activating and growth-inhibiting signals, and (2) apoptosis stimulation by the failure of death-suppressing signals. During the budding of the dental epithelium, a local excess of growth inhibitors (e.g., Bmps) might lead to the epithelial cells' failure to receive adequate growth-activating (apoptosis-suppressing) signals (e.g., Fgfs). The resulting signal imbalance leads to cell "suicide" by apoptosis. Understanding apoptosis regulation in the vestigial tooth primordia can help to elucidate the mechanism of their suppression during evolution and to identify factors essential for tooth survival. The latter knowledge will be important for developing a technology of tooth engineering. PMID:15033770

  8. Phase segmentation of X-ray computer tomography rock images using machine learning techniques: an accuracy and performance study

    NASA Astrophysics Data System (ADS)

    Chauhan, Swarup; Rühaak, Wolfram; Anbergen, Hauke; Kabdenov, Alen; Freise, Marcus; Wille, Thorsten; Sass, Ingo

    2016-07-01

    The performance and accuracy of machine learning techniques for segmenting rock grain, matrix, and pore voxels from a 3-D volume of X-ray tomographic (XCT) grayscale rock images were evaluated. The segmentation and classification capability of unsupervised (k-means, fuzzy c-means, self-organized maps), supervised (artificial neural networks, least-squares support vector machines), and ensemble (bagging and boosting) classifiers was tested using XCT images of andesite volcanic rock, Berea sandstone, Rotliegend sandstone, and a synthetic sample. The averaged porosity obtained for andesite (15.8 ± 2.5 %), Berea sandstone (16.3 ± 2.6 %), Rotliegend sandstone (13.4 ± 7.4 %), and the synthetic sample (48.3 ± 13.3 %) is in very good agreement with the respective laboratory measurement data and varies by a factor of 0.2. The k-means algorithm is the fastest of all the machine learning algorithms, whereas the least-squares support vector machine is the most computationally expensive. The metrics entropy, purity, root mean square error, the receiver operating characteristic curve, and 10-fold cross-validation were used to determine the accuracy of the unsupervised, supervised, and ensemble classifier techniques. In general, the accuracy was found to be largely affected by the feature-vector selection scheme. As there is always a trade-off between performance and accuracy, it is difficult to isolate one particular machine learning algorithm best suited to the complex phase segmentation problem. Our investigation therefore provides parameters that can help in selecting the appropriate machine learning technique for phase segmentation.
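
    Of the unsupervised segmenters compared, k-means is the simplest and, per the study, the fastest. A deterministic 1-D sketch on invented grayscale intensities follows; real XCT segmentation would typically also use texture or neighborhood features.

```python
def kmeans_1d(values, k, iters=50):
    """Plain 1-D k-means on voxel intensities, with deterministic
    quantile-based center initialization."""
    srt = sorted(values)
    centers = [srt[(2 * i + 1) * len(srt) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # assign each voxel to its nearest center
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # recompute centers; keep the old one if a cluster empties
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# invented intensities: pores (dark), matrix (mid-gray), grains (bright)
voxels = [12, 15, 18, 20, 14, 120, 125, 130, 118, 122,
          240, 235, 245, 250, 238]
phase_centers = kmeans_1d(voxels, 3)
```

    The three returned centers define intensity thresholds separating pore, matrix, and grain voxels, from which porosity follows as the pore-voxel fraction.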

  9. The magnetism of speleothems: a novel tool for paleoclimatic studies

    NASA Astrophysics Data System (ADS)

    Lascu, I.; Feinberg, J. M.

    2011-12-01

    The magnetism of speleothems is an untapped resource of paleoclimatic, hydrogeologic, and geomagnetic information. Similar to other deposits containing magnetic minerals, speleothems chronicle the evolution of local environmental parameters via the concentration, composition and grain size of their magnetic mineral assemblages. Environmental magnetic studies on speleothems represent a new frontier in paleoclimate research because the low concentration of magnetic minerals in speleothems was often at or below the limit of detection of older magnetometers. Recent improvements in instrument sensitivity and resolution enable speleothems to reveal high quality data comparable to that of the most complete enviromagnetic records from traditional sedimentary deposits. The advantage of obtaining such records from speleothems is that they can be directly compared to temperature and precipitation reconstructions based on geochemical proxies from the same specimens. Well-dated environmental magnetic records from speleothems can yield high resolution reconstructions of regional and local climatic, erosional, or pedogenic histories, and may also contain information about the local hydrogeological conditions and aquifer architecture. In addition, speleothem records of the Earth's magnetic field intensity can provide information about the amplitude of shielding from cosmic rays, which determines the cosmogenic isotope production rate in the atmosphere, a key aspect of dating and paleoclimate reconstructions. Here we present two case studies of stalagmites whose magnetic properties can help decipher the local and regional environmental conditions at the time of speleothem formation. The first is a stalagmite from a cave in central China that grew continuously in the period 13,600-12,000 years BP, spanning the Late Glacial Allerød-Younger Dryas transition. The specimen is a fast-growing, annually-laminated speleothem. Continuous measurements using a pull-through cryogenic

  10. Understanding FRET as a Research Tool for Cellular Studies

    PubMed Central

    Shrestha, Dilip; Jenei, Attila; Nagy, Péter; Vereb, György; Szöllősi, János

    2015-01-01

    Communication of molecular species through dynamic association and/or dissociation at various cellular sites governs biological functions. Understanding these physiological processes requires delineation of molecular events occurring at the level of individual complexes in a living cell. Among the few non-invasive approaches with nanometer resolution are methods based on Förster Resonance Energy Transfer (FRET). FRET is effective at a distance of 1–10 nm, which is equivalent to the size of macromolecules, thus providing an unprecedented level of detail on molecular interactions. The emergence of fluorescent proteins and SNAP- and CLIP-tag proteins provided FRET with the capability to monitor changes in a molecular complex in real time, making it possible to establish the functional significance of the studied molecules in a native environment. FRET is now widely used in the biological sciences, including the fields of proteomics, signal transduction, diagnostics, and drug development, to address questions almost unimaginable with biochemical methods and conventional microscopies. However, the underlying physics of FRET often scares biologists. Therefore, in this review, our goal is to introduce FRET to non-physicists in a lucid manner. We also discuss our contributions to various FRET methodologies based on microscopy and flow cytometry, while describing their application for determining the molecular heterogeneity of the plasma membrane in various cell types. PMID:25815593
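
    The 1–10 nm working range comes from the sixth-power distance dependence of the Förster efficiency, E = 1 / (1 + (r/R0)^6), where R0 is the donor-acceptor distance at which transfer is 50% efficient. A direct sketch (the R0 value below is a typical assumed figure, not from the abstract):

```python
def fret_efficiency(r_nm, r0_nm):
    """Forster transfer efficiency; r0_nm is the 50%-efficiency distance."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# for an assumed R0 of 5 nm, efficiency collapses within a few nm
r0 = 5.0
e_at_r0 = fret_efficiency(5.0, r0)   # 0.5 by definition
e_far = fret_efficiency(10.0, r0)    # ~0.015: effectively no transfer
```

    The steepness of this curve is what makes FRET a "spectroscopic ruler": small changes in separation around R0 produce large, measurable changes in efficiency.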

  11. Scanning force microscopy as a tool for fracture studies

    SciTech Connect

    Thome, F.; Goeken, M.; Grosse Gehling, M.; Vehoff, H.

    1999-08-01

    Dynamic simulations of the fracture toughness as a function of the orientation and temperature were carried out and compared with experimental results obtained by in-situ loading pre-cracked NiAl single crystals inside a scanning force microscope (SFM). In order to compare the simulations with the experiments, the problem of the short crack with dislocations was solved for general loading and arbitrary slip line directions. The stress and strain field obtained could be directly connected to FEM calculations, which allowed the examination of the stability of micro cracks at notches. The effect of different fracture conditions for biaxial loading was studied in detail. The dynamic simulation yielded predictions of K_IC, slip line length and dislocation distributions as a function of loading rate, temperature and orientation. These predictions were tested by in-situ loading NiAl single crystals inside an SFM at various temperatures. The local COD, slip line length and apparent dislocation distribution at the surface were measured as a function of the applied load and the temperature. The experiments clearly demonstrated that dislocations are emitted from the crack tip before unstable crack jumps occur. The local COD could be directly related to the number of dislocations emitted from the crack tip. With increasing temperature, the number of dislocations and the local COD increased before unstable crack jumps or final fracture occurred.

  12. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic changes and the wildlife-livestock interface have led to the emergence of novel viral pathogens and zoonoses that have become a serious concern to avian, animal and human health. High biodiversity and bird migration facilitate the spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed approaches of next-generation sequencing (NGS) provide culture-independent methods that are useful for understanding viral diversity and for the discovery of novel viruses, thereby enabling better diagnosis and disease control. This review discusses the different possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus and avian gammacoronavirus associated with fulminating disease in guinea fowl, and is also used in describing viral diversity among avian species. The review also briefly discusses areas of viral-host interaction and disease-associated causalities with newly identified avian viruses. PMID:25790045

  14. Animal models as tools to study the pathophysiology of depression.

    PubMed

    Abelaira, Helena M; Réus, Gislaine Z; Quevedo, João

    2013-01-01

    The incidence of depressive illness is high worldwide, and the inadequacy of currently available drug treatments contributes to the significant health burden associated with depression. A basic understanding of the underlying disease processes in depression is lacking; therefore, recreating the disease in animal models is not possible. Popular current models of depression creatively merge ethologically valid behavioral assays with the latest technological advances in molecular biology. Within this context, this study aims to evaluate animal models of depression and determine which has the best face, construct, and predictive validity. These models differ in the degree to which they produce features that resemble a depressive-like state, and models that include stress exposure are widely used. Paradigms that employ acute or sub-chronic stress exposure include learned helplessness, the forced swimming test, the tail suspension test, maternal deprivation, chronic mild stress, and sleep deprivation, to name but a few, all of which employ relatively short-term exposure to inescapable or uncontrollable stress and can reliably detect antidepressant drug response.

  15. Immediate effects of lower cervical spine manipulation on handgrip strength and free-throw accuracy of asymptomatic basketball players: a pilot study

    PubMed Central

    Humphries, Kelley M.; Ward, John; Coats, Jesse; Nobert, Jeannique; Amonette, William; Dyess, Stephen

    2013-01-01

    Objective The purpose of this pilot study was to collect preliminary information for a study to determine the immediate effects of a single unilateral chiropractic manipulation to the lower cervical spine on handgrip strength and free-throw accuracy in asymptomatic male recreational basketball players. Methods For this study, 24 asymptomatic male recreational right-handed basketball players (age = 26.3 ± 9.2 years, height = 1.81 ± 0.07 m, body mass = 82.6 ± 10.4 kg [mean ± SD]) underwent baseline dominant handgrip isometric strength and free-throw accuracy testing in an indoor basketball court. They were then equally randomized to receive either (1) diversified left lower cervical spine chiropractic manipulative therapy (CMT) at C5/C6 or (2) placebo CMT at C5/C6 using an Activator adjusting instrument on zero force setting. Participants then underwent posttesting of isometric handgrip strength and free-throw accuracy. A paired-samples t test was used to make within-group pre to post comparisons and between-group pre to post comparisons. Results No statistically significant difference was shown between either of the 2 basketball performance variables measured in either group. Isometric handgrip strength marginally improved by 0.7 kg (mean) in the CMT group (P = .710). Free-throw accuracy increased by 13.2% in the CMT group (P = .058). The placebo CMT group performed the same or more poorly during their second test session. Conclusions The results of this preliminary study showed that a single lower cervical spine manipulation did not significantly impact basketball performance for this group of healthy asymptomatic participants. A slight increase in free-throw percentage was seen, which deserves further investigation. This pilot study demonstrates that a larger study to evaluate if CMT affects handgrip strength and free-throw accuracy is feasible. PMID:24396315

  16. Prostate intrafraction motion evaluation using kV fluoroscopy during treatment delivery: A feasibility and accuracy study

    SciTech Connect

    Adamson, Justus; Wu Qiuwen

    2008-05-15

    Margin reduction for prostate radiotherapy is limited by uncertainty in prostate localization during treatment. We investigated the feasibility and accuracy of measuring prostate intrafraction motion using kV fluoroscopy performed simultaneously with radiotherapy. Three gold coils used for target localization were implanted into the patient's prostate gland before undergoing hypofractionated online image-guided step-and-shoot intensity modulated radiation therapy (IMRT) on an Elekta Synergy linear accelerator. At each fraction, the patient was aligned using a cone-beam computed tomography (CBCT), after which the IMRT treatment delivery and fluoroscopy were performed simultaneously. In addition, a post-treatment CBCT was acquired with the patient still on the table. To measure the intrafraction motion, we developed an algorithm to register the fluoroscopy images to a reference image derived from the post-treatment CBCT, and we estimated coil motion in three-dimensional (3D) space by combining information from registrations at different gantry angles. We also detected the MV beam turning on and off using MV scatter incident in the same fluoroscopy images, and used this information to synchronize our intrafraction evaluation with the treatment delivery. In addition, we assessed the following: the method to synchronize with treatment delivery, the dose from kV imaging, the accuracy of the localization, and the error propagated into the 3D localization from motion between fluoroscopy acquisitions. With 0.16 mAs/frame and a bowtie filter implemented, the coils could be localized with the gantry at both 0° and 270° with the MV beam off, and at 270° with the MV beam on when multiple fluoroscopy frames were averaged. The localization in two-dimensions for phantom and patient measurements was performed with submillimeter accuracy. After backprojection into 3D the patient localization error was (-0.04±0.30) mm, (0.09±0.36) mm, and (0.03±0.68) mm in the

  18. Accuracy of autofluorescence in diagnosing oral squamous cell carcinoma and oral potentially malignant disorders: a comparative study with aero-digestive lesions

    PubMed Central

    Luo, Xiaobo; Xu, Hao; He, Mingjing; Han, Qi; Wang, Hui; Sun, Chongkui; Li, Jing; Jiang, Lu; Zhou, Yu; Dan, Hongxia; Feng, Xiaodong; Zeng, Xin; Chen, Qianming

    2016-01-01

    To date, various studies have investigated the accuracy of autofluorescence in diagnosing oral squamous cell carcinoma (OSCC) and oral potentially malignant disorders (OPMD), with diverse conclusions. This study aimed to assess its accuracy for OSCC and OPMD and to investigate its applicability in general dental practice. After a comprehensive literature search, a meta-analysis was conducted to calculate the pooled diagnostic indexes of autofluorescence for premalignant lesions (PML) and malignant lesions (ML) of the oral cavity, lung, esophagus, stomach and colorectum, and to compute indexes regarding the detection of OSCC aided by algorithms. In addition, a u test was performed. Twenty-four studies detecting OSCC and OPMD in 2761 lesions were included. The results demonstrated that the overall accuracy of autofluorescence for OSCC and OPMD was superior to that for PML and ML of the lung, esophagus and stomach, and slightly inferior to that for the colorectum. Additionally, the sensitivity and specificity for OSCC and OPMD were 0.89 and 0.8, respectively. Furthermore, the specificity could be remarkably improved by additional algorithms. With relatively high accuracy, autofluorescence could potentially be applied as an adjunct for early diagnosis of OSCC and OPMD. Moreover, approaches such as algorithms could enhance its specificity to ensure its efficacy in primary care. PMID:27416981
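    Pooled sensitivity and specificity like the 0.89/0.8 figures above come from combining the 2×2 tables of the included studies. A crude fixed-effect sketch that simply sums cells (the study itself would more likely use a bivariate random-effects model; the two tables below are hypothetical, not from the meta-analysis):

```python
def pooled_sens_spec(tables):
    """Naively pool 2x2 diagnostic tables given as (tp, fp, fn, tn).

    Summing cells across studies is a crude fixed-effect
    approximation used here only to show where the two numbers
    come from; proper meta-analyses model between-study variance.
    """
    tp = sum(t[0] for t in tables)
    fp = sum(t[1] for t in tables)
    fn = sum(t[2] for t in tables)
    tn = sum(t[3] for t in tables)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Two hypothetical studies, each as (tp, fp, fn, tn):
sens, spec = pooled_sens_spec([(80, 10, 20, 90), (45, 5, 5, 45)])
print(round(sens, 3), round(spec, 3))  # 0.833 0.9
```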

  19. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    PubMed

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of the human motor memory and thus can be operating in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured motor evoked potentials (MEPs) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more strongly than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more strongly than scissors, and this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating.

  20. Astrophysics with Microarcsecond Accuracy Astrometry

    NASA Technical Reports Server (NTRS)

    Unwin, Stephen C.

    2008-01-01

    Space-based astrometry promises to provide a powerful new tool for astrophysics. At a precision level of a few microarcseconds, a wide range of phenomena are opened up for study. In this paper we discuss the capabilities of the SIM Lite mission, the first space-based long-baseline optical interferometer, which will deliver parallaxes to 4 microarcsec. A companion paper in this volume will cover the development and operation of this instrument. At the level that SIM Lite will reach, better than 1 microarcsec in a single measurement, planets as small as one Earth can be detected around many dozens of the nearest stars. Not only can planet masses be definitively measured, but also the full orbital parameters determined, allowing study of system stability in multiple planet systems. This capability to survey our nearby stellar neighbors for terrestrial planets will be a unique contribution to our understanding of the local universe. SIM Lite will be able to tackle a wide range of interesting problems in stellar and Galactic astrophysics. By tracing the motions of stars in dwarf spheroidal galaxies orbiting our Milky Way, SIM Lite will probe the shape of the galactic potential, the history of the formation of the galaxy, and the nature of dark matter. Because it is flexibly scheduled, the instrument can dwell on faint targets, maintaining its full accuracy on objects as faint as V=19. This paper is a brief survey of the diverse problems in modern astrophysics that SIM Lite will be able to address.
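    What a 4 microarcsecond parallax floor buys in distance accuracy follows from the small-angle relation d[pc] = 1/p[arcsec], under which the fractional distance error is roughly sigma_p / p. A sketch (the 4 μas figure is the mission accuracy quoted above; the 1 kpc example distance is ours):

```python
def parallax_uas(distance_pc):
    """Parallax in microarcseconds for a distance in parsecs:
    p[arcsec] = 1 / d[pc], so p[uas] = 1e6 / d[pc]."""
    return 1e6 / distance_pc

def frac_dist_err(distance_pc, sigma_uas=4.0):
    """First-order fractional distance error ~ sigma_p / p,
    valid while sigma_p << p."""
    return sigma_uas / parallax_uas(distance_pc)

# A star at 1 kpc has a 1000 uas parallax, so a 4 uas measurement
# gives a distance good to 0.4%:
print(round(frac_dist_err(1000), 4))  # 0.004
```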

  1. Validity of ICD-9-CM codes for breast, lung and colorectal cancers in three Italian administrative healthcare databases: a diagnostic accuracy study protocol

    PubMed Central

    Abraha, Iosief; Serraino, Diego; Giovannini, Gianni; Stracci, Fabrizio; Casucci, Paola; Alessandrini, Giuliana; Bidoli, Ettore; Chiari, Rita; Cirocchi, Roberto; De Giorgi, Marcello; Franchini, David; Vitale, Maria Francesca; Fusco, Mario; Montedori, Alessandro

    2016-01-01

    Introduction Administrative healthcare databases are useful tools to study healthcare outcomes and to monitor the health status of a population. Patients with cancer can be identified through disease-specific codes, prescriptions and physician claims, but prior validation is required to achieve an accurate case definition. The objective of this protocol is to assess the accuracy of International Classification of Diseases Ninth Revision—Clinical Modification (ICD-9-CM) codes for breast, lung and colorectal cancers in identifying patients diagnosed with the relative disease in three Italian administrative databases. Methods and analysis Data from the administrative databases of Umbria Region (910 000 residents), Local Health Unit 3 of Napoli (1 170 000 residents) and Friuli-Venezia Giulia Region (1 227 000 residents) will be considered. In each administrative database, patients with the first occurrence of diagnosis of breast, lung or colorectal cancer between 2012 and 2014 will be identified using the following groups of ICD-9-CM codes in primary position: (1) 233.0 and (2) 174.x for breast cancer; (3) 162.x for lung cancer; (4) 153.x for colon cancer and (5) 154.0–154.1 and 154.8 for rectal cancer. Only incident cases will be considered, that is, excluding cases that have the same diagnosis in the 5 years (2007–2011) before the period of interest. A random sample of cases and non-cases will be selected from each administrative database and the corresponding medical charts will be assessed for validation by pairs of trained, independent reviewers. Case ascertainment within the medical charts will be based on (1) the presence of a primary nodular lesion in the breast, lung or colon–rectum, documented with imaging or endoscopy and (2) a cytological or histological documentation of cancer from a primary or metastatic site. Sensitivity and specificity with 95% CIs will be calculated. Dissemination Study results will be disseminated widely through
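    The protocol's endpoint, sensitivity and specificity with 95% CIs from chart-validated cases and non-cases, reduces to a 2×2 table. A minimal sketch using the Wilson score interval (the protocol does not specify its CI method, and the counts below are hypothetical):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a proportion k/n (z=1.96 for 95%)."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical chart-review counts for one ICD-9-CM code group:
tp, fp, fn, tn = 88, 6, 12, 94
sens = tp / (tp + fn)
spec = tn / (tn + fp)
print(round(sens, 2), [round(x, 3) for x in wilson_ci(tp, tp + fn)])
print(round(spec, 2), [round(x, 3) for x in wilson_ci(tn, tn + fp)])
```

The Wilson interval is preferred over the naive Wald interval here because validation samples are modest and proportions sit near 1, where Wald coverage degrades.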

  2. Assessment tools of energy balance-related behaviours used in European obesity prevention strategies: review of studies during preschool.

    PubMed

    Mouratidou, T; Mesana, M I; Manios, Y; Koletzko, B; Chinapaw, M J M; De Bourdeaudhuij, I; Socha, P; Iotova, V; Moreno, L A

    2012-03-01

    Valid and reliable measures of energy balance-related behaviours are required when evaluating the effectiveness of public health interventions aiming at prevention of childhood obesity. A structured descriptive review was performed to appraise food intake, physical activity and sedentary behaviour assessment tools used in obesity intervention strategies targeting mainly preschool children across Europe. In total, 25 papers are described, addressing energy balance-related behaviours as study outcomes and targeting individuals or clusters of individuals at school- or home-based environments. Parentally reported food records and 24-h recalls were commonly used to assess food intake. Subjective levels of physical activity and sedentary behaviour were commonly assessed via parentally reported questionnaires. Accelerometry was used to obtain objective measures of physical activity. Insufficient evidence of tool evaluation was provided. When feasible, food records and accelerometry are recommended as the most appropriate methods to assess food intake and physical activity, respectively, in young children. Sedentary behaviour could be assessed via questionnaires that include key indicators of sedentarism and are able to differentiate individual practices. The choice of methodology for the assessment of specific intervention effects should be equally balanced between required accuracy levels and feasibility, and be guided by the intervention targets.

  3. Accuracy of surface registration compared to conventional volumetric registration in patient positioning for head-and-neck radiotherapy: A simulation study using patient data

    SciTech Connect

    Kim, Youngjun; Li, Ruijiang; Na, Yong Hum; Xing, Lei; Lee, Rena

    2014-12-15

    Purpose: 3D optical surface imaging has been applied to patient positioning in radiation therapy (RT). The optical patient positioning system is advantageous over the conventional method using cone-beam computed tomography (CBCT) in that it is radiation free, frameless, and is capable of real-time monitoring. While the conventional radiographic method uses volumetric registration, the optical system uses surface matching for patient alignment. The relative accuracy of these two methods has not yet been sufficiently investigated. This study aims to investigate the theoretical accuracy of surface registration based on a simulation study using patient data. Methods: This study compares the relative accuracy of surface and volumetric registration in head-and-neck RT. The authors examined 26 patient data sets, each consisting of planning CT data acquired before treatment and patient setup CBCT data acquired at the time of treatment. As input data for surface registration, the patient's skin surfaces were created by contouring patient skin from the planning CT and treatment CBCT. Surface registration was performed using the iterative closest point algorithm with a point-to-plane metric, which minimizes the normal distance between source points and target surfaces. Six degrees of freedom (three translations and three rotations) were used in both surface and volumetric registrations and the results were compared. The accuracy of each method was estimated by digital phantom tests. Results: Based on the results of 26 patients, the authors found that the average and maximum root-mean-square translation deviation between the surface and volumetric registrations were 2.7 and 5.2 mm, respectively. The residual error of the surface registration was calculated to have an average of 0.9 mm and a maximum of 1.7 mm. Conclusions: Surface registration may lead to results different from those of the conventional volumetric registration. Only limited accuracy can be achieved for patient
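    The core step inside any rigid six-degree-of-freedom registration like the one described above is solving for the rotation and translation that best align two corresponded point sets. The study uses point-to-plane ICP; as a simpler building block, here is the closed-form point-to-point solve (Kabsch/SVD), with synthetic data rather than patient surfaces:

```python
import numpy as np

def kabsch(source, target):
    """Closed-form rigid alignment source -> target for corresponded
    3D point sets (one point per row), via SVD of the cross-covariance.
    Point-to-plane ICP iterates a step like this with normal-weighted
    correspondences; this is the simplest variant.
    """
    sc, tc = source.mean(axis=0), target.mean(axis=0)
    H = (source - sc).T @ (target - tc)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ sc
    return R, t

# Recover a known 90-degree rotation about z plus a unit translation:
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
pts = np.random.default_rng(0).random((10, 3))
R, t = kabsch(pts, pts @ R_true.T + 1.0)
print(np.allclose(R, R_true), np.allclose(t, 1.0))  # True True
```

With noiseless, exactly corresponded points the transform is recovered to machine precision; real surface registration must additionally estimate correspondences each iteration, which is where the two ICP variants differ.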

  4. Using checklists and algorithms to improve qualitative exposure judgment accuracy.

    PubMed

    Arnold, Susan F; Stenzel, Mark; Drolet, Daniel; Ramachandran, Gurumurthy

    2016-01-01

    Most exposure assessments are conducted without the aid of robust personal exposure data and are based instead on qualitative inputs such as education and experience, training, documentation on the process chemicals, tasks and equipment, and other information. Qualitative assessments determine whether there is any follow-up, and influence the type that occurs, such as quantitative sampling, worker training, and implementing exposure and risk management measures. Accurate qualitative exposure judgments ensure appropriate follow-up that in turn ensures appropriate exposure management. Studies suggest that qualitative judgment accuracy is low. A qualitative exposure assessment Checklist tool was developed to guide the application of a set of heuristics to aid decision making. Practicing hygienists (n = 39) and novice industrial hygienists (n = 8) were recruited for a study evaluating the influence of the Checklist on exposure judgment accuracy. Participants generated 85 pre-training judgments and 195 Checklist-guided judgments. Pre-training judgment accuracy was low (33%) and not statistically significantly different from random chance. A tendency for IHs to underestimate the true exposure was observed. Exposure judgment accuracy improved significantly (p < 0.001) to 63% when aided by the Checklist. Qualitative judgments guided by the Checklist tool were categorically accurate or over-estimated the true exposure by one category 70% of the time. The overall magnitude of exposure judgment precision also improved following training. Fleiss' κ, evaluating inter-rater agreement among novice assessors, was fair to moderate (κ = 0.39). Cohen's weighted and unweighted κ were good to excellent for novice (0.77 and 0.80) and practicing IHs (0.73 and 0.89), respectively. Checklist judgment accuracy was similar to quantitative exposure judgment accuracy observed in studies of similar design using personal exposure measurements, suggesting that the tool could be useful in
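    The unweighted Cohen's κ values quoted above compare observed rater agreement against agreement expected by chance. A minimal sketch from first principles (the ten exposure-category judgments are hypothetical, not data from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical labels:
    (p_observed - p_expected) / (1 - p_expected), where p_expected
    comes from each rater's marginal category frequencies."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical exposure-category judgments from two assessors:
a = [1, 2, 2, 3, 1, 2, 3, 3, 2, 1]
b = [1, 2, 3, 3, 1, 2, 3, 2, 2, 1]
print(round(cohens_kappa(a, b), 2))  # 0.7
```

The weighted variant reported in the study additionally credits near-misses (off by one category) with partial agreement, which is why its values can exceed the unweighted ones.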

  6. A comparative study of the quantitative accuracy of three-dimensional reconstructions of spinal cord from serial histological sections.

    PubMed

    Duerstock, B S; Bajaj, C L; Borgens, R B

    2003-05-01

    We evaluated the accuracy of estimating the volume of biological soft tissues from their three-dimensional (3D) computer wireframe models, reconstructed from histological data sets obtained from guinea-pig spinal cords. We compared quantification from two methods of three-dimensional surface reconstruction to standard quantitative techniques: the Cavalieri method, employing planimetry and point counting, and Geometric Best-Fitting. This involved measuring a group of spinal cord segments and test objects to evaluate the accuracy of our novel quantification approaches. Once a quantitative methodology was standardized, there was no statistical difference in volume measurement of spinal segments between quantification methods. We found that our 3D surface reconstructions' ability to model actual soft tissues precisely provided as accurate a volume quantification of complex anatomical structures as the standard approaches of Cavalieri estimation and Geometric Best-Fitting. Additionally, 3D reconstruction quantitatively interrogates and three-dimensionally images spinal cord segments and obscured internal pathological features with approximately the same effort required for standard quantification alone.
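    The Cavalieri point-counting estimator used as the reference method above is simple: V = t * a_p * sum(P_i), where P_i is the number of grid points hitting the structure on section i, a_p the area each grid point represents, and t the section spacing. A sketch with hypothetical numbers (not measurements from the study):

```python
def cavalieri_volume(point_counts, area_per_point, section_spacing):
    """Cavalieri estimator: V = t * a_p * sum(P_i).

    point_counts: grid points hitting the structure on each section.
    area_per_point: area represented by one grid point (e.g. mm^2).
    section_spacing: distance between sections (e.g. mm).
    """
    return section_spacing * area_per_point * sum(point_counts)

# 5 sections, 0.1 mm apart, each grid point representing 0.01 mm^2:
v = cavalieri_volume([12, 18, 22, 17, 9], 0.01, 0.1)
print(round(v, 3))  # 0.078 (mm^3)
```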

  7. Accuracy of the Chinese lunar calendar method to predict a baby's sex: a population-based study.

    PubMed

    Villamor, Eduardo; Dekker, Louise; Svensson, Tobias; Cnattingius, Sven

    2010-07-01

    We estimated the accuracy of a non-invasive, inexpensive method (the Chinese lunar calendar, CLC) to predict the sex of a baby from around the time of conception, using 2,840,755 singleton births occurring in Sweden between 1973 and 2006. Maternal lunar age and month of conception were estimated, and used to predict each baby's sex, according to a published algorithm. Kappa statistics were estimated for the actual vs. the CLC-predicted sex of the baby. Overall kappa was 0.0002 [95% CI -0.0009, 0.0014]. Accuracy was not modified by year of conception, maternal age, level of education, body mass index or parity. In a validation subset of 1000 births in which we used a website-customised algorithm to estimate lunar dates, kappa was -0.02 [95% CI -0.08, 0.04]. Simulating the misuse of the method by failing to convert Gregorian dates into lunar did not change the results. We conclude that the CLC method is no better at predicting the sex of a baby than tossing a coin and advise against painting the nursery based on this method's result. PMID:20618730

  9. Analysis and validation of automated skull stripping tools: a validation study based on 296 MR images from the Honolulu Asia aging study.

    PubMed

    Hartley, S W; Scher, A I; Korf, E S C; White, L R; Launer, L J

    2006-05-01

As population-based epidemiologic studies may acquire images from thousands of subjects, automated image post-processing is needed. However, error in these methods may be biased and related to subject characteristics relevant to the research question. Here, we compare two automated methods of brain extraction against manually segmented images and evaluate whether method accuracy is associated with subject demographic and health characteristics. MRI data (n = 296) are from the Honolulu Asia Aging Study, a population-based study of elderly Japanese-American men. The intracranial space was manually outlined on the axial proton density sequence by a single operator. The brain was extracted automatically using BET (Brain Extraction Tool) and BSE (Brain Surface Extractor) on axial proton density images. Total intracranial volume was calculated for the manually segmented images (ticvM), the BET-segmented images (ticvBET) and the BSE-segmented images (ticvBSE). Mean ticvBSE was closer to that of ticvM, but ticvBET was more highly correlated with ticvM than ticvBSE. BSE both significantly overestimated (positive error) and underestimated (negative error) ticv, but net error was relatively low. BET had large positive error and very low negative error. Method accuracy, measured in percent positive and negative error, varied slightly with age, head circumference, presence of the apolipoprotein E epsilon4 polymorphism, subcortical and cortical infarcts, and enlarged ventricles. This epidemiologic approach to the assessment of potential bias in image post-processing tasks shows that both skull-stripping programs performed well in this large image dataset when compared to manually segmented images. Although method accuracy was statistically associated with some subject characteristics, the extent of the misclassification (in terms of percent of brain volume) was small.
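The percent positive- and negative-error figures used in this record can be illustrated with boolean masks. A minimal sketch on hypothetical 2-D arrays standing in for MR volumes (the function and error definitions are our reading of the abstract, not the authors' code):

```python
import numpy as np

def mask_errors(manual, auto):
    """Percent over- and under-estimation of an automated brain mask
    relative to a manual segmentation (both boolean arrays)."""
    manual_vol = manual.sum()
    positive = np.logical_and(auto, ~manual).sum()  # voxels auto adds beyond manual
    negative = np.logical_and(manual, ~auto).sum()  # voxels auto misses
    return (100 * positive / manual_vol,              # % positive error
            100 * negative / manual_vol,              # % negative error
            100 * (positive - negative) / manual_vol) # % net error

manual = np.zeros((10, 10), bool); manual[2:8, 2:8] = True  # 36 "voxels"
auto   = np.zeros((10, 10), bool); auto[3:9, 2:8]   = True  # shifted by one row
pos, neg, net = mask_errors(manual, auto)
print(pos, neg, net)
```

Note how a shifted mask can have substantial positive and negative error yet zero net error, which is why the abstract reports all three.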

  10. A student-oriented study tool for heterogeneous HyperCard courseware.

    PubMed

    Rathe, R; Garren, T

    1991-01-01

    Several computer-assisted instruction initiatives at our institution rely on a shared laserdisc for image storage and HyperCard for interactive programming. This paper furnishes a brief overview of problems encountered with multimedia projects and how we are attempting to overcome them using a generic, student-oriented study tool. The tool provides bookmarks, annotations, quotations, and other utilities across our entire HyperCard courseware collection. PMID:1807701

  11. A mixed effect model for bivariate meta-analysis of diagnostic test accuracy studies using a copula representation of the random effects distribution.

    PubMed

    Nikoloulopoulos, Aristidis K

    2015-12-20

Diagnostic test accuracy studies typically report the number of true positives, false positives, true negatives and false negatives. There usually exists a negative association between the number of true positives and true negatives, because studies that adopt a less stringent criterion for declaring a test positive yield higher sensitivities and lower specificities. A generalized linear mixed model (GLMM) is currently recommended to synthesize diagnostic test accuracy studies. We propose a copula mixed model for bivariate meta-analysis of diagnostic test accuracy studies. Our general model includes the GLMM as a special case and can also operate on the original scale of sensitivity and specificity. Summary receiver operating characteristic curves are deduced for the proposed model through quantile regression techniques and different characterizations of the bivariate random effects distribution. Our general methodology is demonstrated with an extensive simulation study and illustrated by re-analysing the data of two published meta-analyses. Our study suggests that the copula mixed model can improve on the GLMM in fit to the data, and makes the argument for moving to copula random effects models. Our modelling framework is implemented in the package CopulaREMADA within the open-source statistical environment R.
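The negative sensitivity-specificity association described above comes from the positivity threshold. A small simulation with hypothetical Gaussian test scores (an illustration of the threshold effect only, not the copula model) makes the trade-off visible:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated continuous test scores: diseased subjects score higher on average.
diseased = rng.normal(1.0, 1.0, 500)
healthy  = rng.normal(0.0, 1.0, 500)

# A less stringent (lower) threshold raises sensitivity and lowers specificity.
for threshold in (-0.5, 0.0, 0.5, 1.0):
    sens = np.mean(diseased >= threshold)  # true-positive rate
    spec = np.mean(healthy  <  threshold)  # true-negative rate
    print(f"threshold={threshold:+.1f}  sens={sens:.2f}  spec={spec:.2f}")
```

Across studies that implicitly choose different thresholds, this mechanism induces the negative correlation that the bivariate random-effects distribution is meant to capture.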

  12. Accuracy and Uncertainty of Asymmetric Magnetization Transfer Ratio Quantification for Amide Proton Transfer (APT) Imaging at 3T: A Monte Carlo Study

    PubMed Central

    Yuan, Jing; Zhang, Qinwei; Wang, Yi-Xiang; Wei, Juan; Zhou, Jinyuan

    2014-01-01

Amide proton transfer (APT) imaging offers a novel and powerful MRI contrast mechanism for quantitative molecular imaging based on the principle of chemical exchange saturation transfer (CEST). Asymmetric magnetization transfer ratio (MTRasym) quantification is crucial for Z-spectrum analysis of APT imaging, but is still challenging, particularly at clinical field strength. This paper studies the accuracy and uncertainty in the quantification of MTRasym for APT imaging at 3T, using high-order polynomial fitting of the Z-spectrum through Monte Carlo simulation. Results show that polynomial fitting is a biased estimator that consistently underestimates MTRasym. For a fixed polynomial order, the accuracy of MTRasym is almost constant with regard to signal-to-noise ratio (SNR), while the uncertainty decreases exponentially with SNR. Higher-order polynomial fitting increases both the accuracy and the uncertainty of MTRasym. For different APT signal intensity levels, the relative accuracy and the absolute uncertainty remain constant for a fixed polynomial order. These results indicate the limitations and pitfalls of polynomial fitting for MTRasym quantification, so a better technique for MTRasym estimation is warranted. PMID:24110892

  13. Accuracy and uncertainty of asymmetric magnetization transfer ratio quantification for amide proton transfer (APT) imaging at 3T: a Monte Carlo study.

    PubMed

    Yuan, Jing; Zhang, Qinwei; Wang, Yi-Xiang; Wei, Juan; Zhou, Jinyuan

    2013-01-01

Amide proton transfer (APT) imaging offers a novel and powerful MRI contrast mechanism for quantitative molecular imaging based on the principle of chemical exchange saturation transfer (CEST). Asymmetric magnetization transfer ratio (MTR(asym)) quantification is crucial for Z-spectrum analysis of APT imaging, but is still challenging, particularly at clinical field strength. This paper studies the accuracy and uncertainty in the quantification of MTR(asym) for APT imaging at 3T, using high-order polynomial fitting of the Z-spectrum through Monte Carlo simulation. Results show that polynomial fitting is a biased estimator that consistently underestimates MTR(asym). For a fixed polynomial order, the accuracy of MTR(asym) is almost constant with regard to signal-to-noise ratio (SNR), while the uncertainty decreases exponentially with SNR. Higher-order polynomial fitting increases both the accuracy and the uncertainty of MTR(asym). For different APT signal intensity levels, the relative accuracy and the absolute uncertainty remain constant for a fixed polynomial order. These results indicate the limitations and pitfalls of polynomial fitting for MTR(asym) quantification, so a better technique for MTR(asym) estimation is warranted.
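The Monte Carlo procedure described in these two records (fit a polynomial to noisy Z-spectra, then read off MTRasym) can be sketched generically. The Z-spectrum shape, amplitudes and offsets below are hypothetical placeholders, not the authors' acquisition parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
offsets = np.linspace(-6, 6, 61)  # saturation offsets (ppm), hypothetical grid

def z_spectrum(w):
    """Toy Z-spectrum: Lorentzian direct-water saturation plus a small
    CEST dip at +3.5 ppm (illustrative amplitudes, not tissue values)."""
    direct = 1 - 0.8 / (1 + (w / 1.5) ** 2)
    cest = -0.05 / (1 + ((w - 3.5) / 0.8) ** 2)
    return direct + cest

truth = z_spectrum(-3.5) - z_spectrum(3.5)  # true MTRasym at 3.5 ppm

def monte_carlo(order, snr, trials=2000):
    """Bias (accuracy) and std (uncertainty) of polynomial-fit MTRasym."""
    est = np.empty(trials)
    sigma = 1.0 / snr
    clean = z_spectrum(offsets)
    u = offsets / 6.0  # normalize abscissa for a well-conditioned fit
    eval_pts = np.array([-3.5, 3.5]) / 6.0
    for i in range(trials):
        noisy = clean + rng.normal(0, sigma, offsets.size)
        coef = np.polynomial.polynomial.polyfit(u, noisy, order)
        fit = np.polynomial.polynomial.polyval(eval_pts, coef)
        est[i] = fit[0] - fit[1]
    return est.mean() - truth, est.std()

for order in (6, 12):
    bias, sd = monte_carlo(order, snr=100)
    print(f"order={order:2d}  bias={bias:+.4f}  sd={sd:.4f}")
```

Sweeping SNR instead of order reproduces the abstract's other claim: the bias is roughly SNR-independent while the spread shrinks as noise decreases.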

  14. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist.

    PubMed

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen K; Crane, Mark St J; Waltzek, Thomas B; Olesen, Niels J; Gallardo Lagno, Alicia

    2016-02-25

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies-paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

  15. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist.

    PubMed

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen K; Crane, Mark St J; Waltzek, Thomas B; Olesen, Niels J; Gallardo Lagno, Alicia

    2016-02-25

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies-paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species. PMID:26912041

  16. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist

    USGS Publications Warehouse

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A.; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen; St. J. Crane, Mark; Waltzek, Thomas B.; Olesen, Niels J; Lagno, Alicia Gallardo

    2016-01-01

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies—paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

  17. Predicting Out-of-Office Blood Pressure in the Clinic (PROOF-BP): Derivation and Validation of a Tool to Improve the Accuracy of Blood Pressure Measurement in Clinical Practice.

    PubMed

    Sheppard, James P; Stevens, Richard; Gill, Paramjit; Martin, Una; Godwin, Marshall; Hanley, Janet; Heneghan, Carl; Hobbs, F D Richard; Mant, Jonathan; McKinstry, Brian; Myers, Martin; Nunan, David; Ward, Alison; Williams, Bryan; McManus, Richard J

    2016-05-01

Patients often have lower (white coat effect) or higher (masked effect) ambulatory/home blood pressure readings compared with clinic measurements, resulting in misdiagnosis of hypertension. The present study assessed whether blood pressure and patient characteristics from a single clinic visit can accurately predict the difference between ambulatory/home and clinic blood pressure readings (the home-clinic difference). A linear regression model predicting the home-clinic blood pressure difference was derived in 2 data sets measuring automated clinic and ambulatory/home blood pressure (n=991) using candidate predictors identified from a literature review. The model was validated in 4 further data sets (n=1172) using area under the receiver operator characteristic curve analysis. A masked effect was associated with male sex, a positive clinic blood pressure change (difference between consecutive measurements during a single visit), and a diagnosis of hypertension. Increasing age, clinic blood pressure level, and pulse pressure were associated with a white coat effect. The model showed good calibration across data sets (Pearson correlation, 0.48-0.80) and performed well in predicting ambulatory hypertension (area under the receiver operator characteristic curve, 0.75; 95% confidence interval, 0.72-0.79 [systolic]; 0.87; 0.85-0.89 [diastolic]). Used as a triaging tool for ambulatory monitoring, the model improved classification of a patient's blood pressure status compared with other guideline-recommended approaches (93% [92% to 95%] classified correctly; United States, 73% [70% to 75%]; Canada, 74% [71% to 77%]; United Kingdom, 78% [76% to 81%]). This study demonstrates that patient characteristics from a single clinic visit can accurately predict a patient's ambulatory blood pressure. Use of this prediction tool to triage ambulatory monitoring could result in more accurate diagnosis of hypertension and hence more appropriate treatment.

  18. Predicting Out-of-Office Blood Pressure in the Clinic (PROOF-BP): Derivation and Validation of a Tool to Improve the Accuracy of Blood Pressure Measurement in Clinical Practice.

    PubMed

    Sheppard, James P; Stevens, Richard; Gill, Paramjit; Martin, Una; Godwin, Marshall; Hanley, Janet; Heneghan, Carl; Hobbs, F D Richard; Mant, Jonathan; McKinstry, Brian; Myers, Martin; Nunan, David; Ward, Alison; Williams, Bryan; McManus, Richard J

    2016-05-01

Patients often have lower (white coat effect) or higher (masked effect) ambulatory/home blood pressure readings compared with clinic measurements, resulting in misdiagnosis of hypertension. The present study assessed whether blood pressure and patient characteristics from a single clinic visit can accurately predict the difference between ambulatory/home and clinic blood pressure readings (the home-clinic difference). A linear regression model predicting the home-clinic blood pressure difference was derived in 2 data sets measuring automated clinic and ambulatory/home blood pressure (n=991) using candidate predictors identified from a literature review. The model was validated in 4 further data sets (n=1172) using area under the receiver operator characteristic curve analysis. A masked effect was associated with male sex, a positive clinic blood pressure change (difference between consecutive measurements during a single visit), and a diagnosis of hypertension. Increasing age, clinic blood pressure level, and pulse pressure were associated with a white coat effect. The model showed good calibration across data sets (Pearson correlation, 0.48-0.80) and performed well in predicting ambulatory hypertension (area under the receiver operator characteristic curve, 0.75; 95% confidence interval, 0.72-0.79 [systolic]; 0.87; 0.85-0.89 [diastolic]). Used as a triaging tool for ambulatory monitoring, the model improved classification of a patient's blood pressure status compared with other guideline-recommended approaches (93% [92% to 95%] classified correctly; United States, 73% [70% to 75%]; Canada, 74% [71% to 77%]; United Kingdom, 78% [76% to 81%]). This study demonstrates that patient characteristics from a single clinic visit can accurately predict a patient's ambulatory blood pressure. Use of this prediction tool to triage ambulatory monitoring could result in more accurate diagnosis of hypertension and hence more appropriate treatment. PMID:27001299
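The derive-then-validate pattern in this record (least-squares model of the home-clinic difference, then ROC analysis of the predicted ambulatory status) can be sketched on simulated data. All coefficients and the cohort below are hypothetical, the AUC is the standard Mann-Whitney formulation, and for brevity the model is fit and evaluated on the same simulated cohort:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical cohort: clinic systolic BP plus covariates. The "true"
# home-clinic difference depends on age and sex (illustrative model only).
n = 800
age    = rng.uniform(40, 80, n)
male   = rng.integers(0, 2, n)
clinic = rng.normal(140, 15, n)
diff   = 5 - 0.15 * (age - 60) + 3 * male + rng.normal(0, 5, n)  # home - clinic
home   = clinic + diff

# Derivation step: least-squares fit of the home-clinic difference.
X = np.column_stack([np.ones(n), age, male, clinic])
beta, *_ = np.linalg.lstsq(X, diff, rcond=None)
predicted_home = clinic + X @ beta

def auc(score, label):
    """Mann-Whitney formulation of the area under the ROC curve."""
    pos, neg = score[label], score[~label]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties    = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

# Validation step: AUC of predicted home BP for "home hypertension" (>=135).
label = home >= 135
print(f"AUC = {auc(predicted_home, label):.2f}")
```

An AUC well above 0.5 here only shows the pipeline; the study's reported 0.75 (systolic) and 0.87 (diastolic) came from independent validation data sets.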

  19. Relative Accuracy Evaluation

    PubMed Central

    Zhang, Yan; Wang, Hongzhi; Yang, Zhongsheng; Li, Jianzhong

    2014-01-01

The quality of data plays an important role in business analysis and decision making, and data accuracy is an important aspect of data quality. Thus one necessary task for data quality management is to evaluate the accuracy of the data. Because the accuracy of a whole data set may be low while that of a useful part is high, it is also necessary to evaluate the accuracy of query results, called relative accuracy. However, as far as we know, neither a metric nor effective methods for such accuracy evaluation have been proposed. Motivated by this, we propose a systematic method for relative accuracy evaluation. We design a relative accuracy evaluation framework for relational databases based on a new metric that measures accuracy using statistics. We apply the methods to evaluate the precision and recall of basic queries, which reflect the relative accuracy of the results. We also propose methods to handle data updates and to improve accuracy evaluation using functional dependencies. Extensive experimental results show the effectiveness and efficiency of our proposed framework and algorithms. PMID:25133752
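The core idea of relative accuracy, judging a query's result rather than the whole table, can be illustrated with set-based precision and recall. A minimal sketch with hypothetical rows (the function and data are our own, not the paper's framework):

```python
def query_accuracy(returned, correct):
    """Precision and recall of a query result set against the set of
    rows a perfectly accurate database would have returned."""
    returned, correct = set(returned), set(correct)
    true_positives = returned & correct
    precision = len(true_positives) / len(returned) if returned else 1.0
    recall    = len(true_positives) / len(correct)  if correct  else 1.0
    return precision, recall

# The whole table may be inaccurate while this query's slice is mostly clean.
returned = {"alice", "bob", "carol", "dave"}
correct  = {"alice", "bob", "carol", "erin"}
print(query_accuracy(returned, correct))  # → (0.75, 0.75)
```

High precision and recall for a query over a low-accuracy table is exactly the situation the paper's relative-accuracy metric is designed to expose.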

  20. Expected accuracy of tilt measurements on a novel hexapod-based digital zenith camera system: a Monte-Carlo simulation study

    NASA Astrophysics Data System (ADS)

Hirt, Christian; Papp, Gábor; Pál, András; Benedek, Judit; Szűcs, Eszter

    2014-08-01

    Digital zenith camera systems (DZCS) are dedicated astronomical-geodetic measurement systems for the observation of the direction of the plumb line. A DZCS key component is a pair of tilt meters for the determination of the instrumental tilt with respect to the plumb line. Highest accuracy (i.e., 0.1 arc-seconds or better) is achieved in practice through observation with precision tilt meters in opposite faces (180° instrumental rotation), and application of rigorous tilt reduction models. A novel concept proposes the development of a hexapod (Stewart platform)-based DZCS. However, hexapod-based total rotations are limited to about 30°-60° in azimuth (equivalent to ±15° to ±30° yaw rotation), which raises the question of the impact of the rotation angle between the two faces on the accuracy of the tilt measurement. The goal of the present study is the investigation of the expected accuracy of tilt measurements to be carried out on future hexapod-based DZCS, with special focus placed on the role of the limited rotation angle. A Monte-Carlo simulation study is carried out in order to derive accuracy estimates for the tilt determination as a function of several input parameters, and the results are validated against analytical error propagation. As the main result of the study, limitation of the instrumental rotation to 60° (30°) deteriorates the tilt accuracy by a factor of about 2 (4) compared to a 180° rotation between the faces. Nonetheless, a tilt accuracy at the 0.1 arc-second level is expected when the rotation is at least 45°, and 0.05 arc-second (about 0.25 microradian) accurate tilt meters are deployed. As such, a hexapod-based DZCS can be expected to allow sufficiently accurate determination of the instrumental tilt. This provides supporting evidence for the feasibility of such a novel instrumentation. The outcomes of our study are not only relevant to the field of DZCS, but also to all other types of instruments where the instrumental tilt
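The reported degradation factors can be reproduced with a small Monte Carlo, assuming a simplified two-axis tilt-meter model in which the zero offset is fixed in the instrument frame while the tilt is fixed in the ground frame (our own sketch, not the authors' simulation code; all numeric values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

def rot(a):
    """2-D rotation matrix for angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s], [s, c]])

def tilt_std(alpha_deg, sigma=0.05, trials=20000):
    """Monte Carlo std of the recovered tilt when the two faces are
    separated by alpha degrees (sigma = tilt-meter noise, arcsec)."""
    alpha = np.radians(alpha_deg)
    tilt = np.array([0.3, -0.2])  # true tilt (arcsec), arbitrary
    bias = np.array([1.0, -0.5])  # instrumental zero offset, cancels in m1 - m2
    A = np.eye(2) - rot(-alpha)   # since m1 - m2 = (I - R(-alpha)) @ tilt
    err = np.empty(trials)
    for i in range(trials):
        m1 = tilt + bias + rng.normal(0, sigma, 2)
        m2 = rot(-alpha) @ tilt + bias + rng.normal(0, sigma, 2)
        t_hat = np.linalg.solve(A, m1 - m2)
        err[i] = t_hat[0] - tilt[0]
    return err.std()

s180, s60, s30 = (tilt_std(a) for a in (180, 60, 30))
print(f"60 deg: x{s60 / s180:.1f}   30 deg: x{s30 / s180:.1f}")
```

The noise amplification of this estimator scales as 1/sin(alpha/2), so relative to a 180° rotation the tilt error grows by a factor of about 2 at 60° and about 4 at 30°, matching the factors quoted in the abstract.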

  1. Inertial Measures of Motion for Clinical Biomechanics: Comparative Assessment of Accuracy under Controlled Conditions – Changes in Accuracy over Time

    PubMed Central

    Lebel, Karina; Boissy, Patrick; Hamel, Mathieu; Duval, Christian

    2015-01-01

Background Interest in 3D inertial motion tracking devices (AHRS) has been growing rapidly among the biomechanical community. Although the convenience of such tracking devices seems to open a whole new world of possibilities for evaluation in clinical biomechanics, their limitations have not been extensively documented. The objectives of this study are: 1) to assess the change in absolute and relative accuracy of multiple units of 3 commercially available AHRS over time; and 2) to identify different sources of errors affecting AHRS accuracy and to document how they may affect the measurements over time. Methods This study used an instrumented Gimbal table on which AHRS modules were carefully attached and put through a series of velocity-controlled sustained motions, including 2-minute motion trials (2MT) and 12-minute multiple-dynamic-phase motion trials (12MDP). Absolute accuracy was assessed by comparison of the AHRS orientation measurements to those of an optical gold standard. Relative accuracy was evaluated using the variation in relative orientation between modules during the trials. Findings Both absolute and relative accuracy decreased over time during 2MT. 12MDP trials showed a significant decrease in accuracy over multiple phases, but accuracy could be enhanced significantly by resetting the reference point and/or compensating for the initial inertial-frame estimation reference for each phase. Interpretation The variation in AHRS accuracy observed between the different systems and with time can be attributed in part to the dynamic estimation error, but also and foremost, to the ability of AHRS units to locate the same inertial frame. Conclusions Mean accuracies obtained under the Gimbal table sustained conditions of motion suggest that AHRS are promising tools for clinical mobility assessment under constrained conditions of use. However, improvement in magnetic compensation and alignment between AHRS modules are desirable in order for AHRS to reach their

  2. A HTML5 open source tool to conduct studies based on Libet’s clock paradigm

    PubMed Central

    Garaizar, Pablo; Cubillas, Carmelo P.; Matute, Helena

    2016-01-01

    Libet’s clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex, and are rarely explained in sufficient detail as to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet’s clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including the use of two cognitive experiments conducted with voluntary participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3) since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area. PMID:27623167

  3. A HTML5 open source tool to conduct studies based on Libet's clock paradigm.

    PubMed

    Garaizar, Pablo; Cubillas, Carmelo P; Matute, Helena

    2016-01-01

    Libet's clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex, and are rarely explained in sufficient detail as to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet's clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including the use of two cognitive experiments conducted with voluntary participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3) since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area. PMID:27623167

  4. A HTML5 open source tool to conduct studies based on Libet's clock paradigm.

    PubMed

    Garaizar, Pablo; Cubillas, Carmelo P; Matute, Helena

    2016-09-13

    Libet's clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex, and are rarely explained in sufficient detail as to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet's clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including the use of two cognitive experiments conducted with voluntary participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3) since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area.

  5. Can Interactive Web-based CAD Tools Improve the Learning of Engineering Drawing? A Case Study

    NASA Astrophysics Data System (ADS)

    Pando Cerra, Pablo; Suárez González, Jesús M.; Busto Parra, Bernardo; Rodríguez Ortiz, Diana; Álvarez Peñín, Pedro I.

    2014-06-01

Many current Web-based learning environments facilitate the theoretical teaching of a subject but this may not be sufficient for those disciplines that require a significant use of graphic mechanisms to resolve problems. This research study looks at the use of an environment that can help students learn engineering drawing with Web-based CAD tools, including a self-correction component. A comparative study of 121 students was carried out. The students were divided into two experimental groups using Web-based interactive CAD tools and into two control groups using traditional learning tools. A statistical analysis of all the samples was carried out in order to study student behavior during the research and the effectiveness of these self-study tools in the learning process. The results showed that a greater number of students in the experimental groups passed the test and improved their test scores. Therefore, the use of Web-based interactive graphic tools to learn engineering drawing can be considered a significant improvement in the teaching of this kind of academic discipline.

  6. Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Clark, Robert Lynn

    2014-01-01

The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization with multiple dispersed locations in the United States. The results of the current research study could assist leaders to develop a communication…

  7. An Entrepreneurial Learning Exercise as a Pedagogical Tool for Teaching CSR: A Peruvian Study

    ERIC Educational Resources Information Center

    Farber, Vanina A.; Prialé, María Angela; Fuchs, Rosa María

    2015-01-01

    This paper reports on an exploratory cross-sectional study of the value of an entrepreneurial learning exercise as a tool for examining the entrepreneurship dimension of corporate social responsibility (CSR). The study used grounded theory to analyse diaries kept by graduate (MBA) students during the "20 Nuevos Soles Project". From the…

  8. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  9. The Use of Economic Impact Studies as a Service Learning Tool in Undergraduate Business Programs

    ERIC Educational Resources Information Center

    Misner, John M.

    2004-01-01

    This paper examines the use of community based economic impact studies as service learning tools for undergraduate business programs. Economic impact studies are used to measure the economic benefits of a variety of activities such as community redevelopment, tourism, and expansions of existing facilities for both private and public producers.…

  10. SU-E-J-147: Monte Carlo Study of the Precision and Accuracy of Proton CT Reconstructed Relative Stopping Power Maps

    SciTech Connect

    Dedes, G; Asano, Y; Parodi, K; Arbor, N; Dauvergne, D; Testa, E; Letang, J; Rit, S

    2015-06-15

    Purpose: The quantification of the intrinsic performance of proton computed tomography (pCT) as a modality for treatment planning in proton therapy. The performance of an ideal pCT scanner is studied as a function of various parameters. Methods: Using GATE/Geant4, we simulated an ideal pCT scanner and scans of several cylindrical phantoms with various tissue-equivalent inserts of different sizes. Insert materials were selected in order to be of clinical relevance. Tomographic images were reconstructed using a filtered backprojection algorithm taking into account the scattering of protons in the phantom. To quantify the performance of the ideal pCT scanner, we study the precision and the accuracy with respect to the theoretical relative stopping power ratio (RSP) values for different beam energies, imaging doses, insert sizes and detector positions. The planning range uncertainty resulting from the reconstructed RSP is also assessed by comparison with the range of the protons in the analytically simulated phantoms. Results: The results indicate that pCT can intrinsically achieve RSP resolution below 1% for most examined tissues, at beam energies below 300 MeV and imaging doses around 1 mGy. RSP map accuracy of better than 0.5% is observed for most tissue types within the studied dose range (0.2–1.5 mGy). Finally, the uncertainty in the proton range due to the accuracy of the reconstructed RSP map is well below 1%. Conclusion: This work explores the intrinsic performance of pCT as an imaging modality for proton treatment planning. The obtained results show that under ideal conditions, 3D RSP maps can be reconstructed with an accuracy better than 1%. Hence, pCT is a promising candidate for reducing the range uncertainties introduced by the use of X-ray CT along with a semiempirical calibration to RSP. Supported by the DFG Cluster of Excellence Munich-Centre for Advanced Photonics (MAP).

  11. Hermite finite elements for high accuracy electromagnetic field calculations: A case study of homogeneous and inhomogeneous waveguides

    NASA Astrophysics Data System (ADS)

    Boucher, C. R.; Li, Zehao; Ahheng, C. I.; Albrecht, J. D.; Ram-Mohan, L. R.

    2016-04-01

    Maxwell's vector field equations and their numerical solution represent significant challenges for physical domains with complex geometries. There are several limitations in the presently prevalent approaches to the calculation of field distributions in physical domains, in particular, with the vector finite elements. In order to quantify and resolve issues, we consider the modeling of the field equations for the prototypical examples of waveguides. We employ the finite element method with a new set of Hermite interpolation polynomials derived recently by us using group theoretic considerations. We show that (i) the approach presented here yields better accuracy by several orders of magnitude, with a smoother representation of fields than the vector finite elements for waveguide calculations. (ii) This method does not generate any spurious solutions that plague Lagrange finite elements, even though the C1-continuous Hermite polynomials are also scalar in nature. (iii) We present solutions for propagating modes in inhomogeneous waveguides satisfying dispersion relations that can be derived directly, and investigate their behavior as the ratio of dielectric constants is varied both theoretically and numerically. Additional comparisons and advantages of the proposed method are detailed in this article. The Hermite interpolation polynomials are shown to provide a robust, accurate, and efficient means of solving Maxwell's equations in a variety of media, potentially offering a computationally inexpensive means of designing devices for optoelectronics and plasmonics of increasing complexity.

  12. Studying the Effect of Adaptive Momentum in Improving the Accuracy of Gradient Descent Back Propagation Algorithm on Classification Problems

    NASA Astrophysics Data System (ADS)

    Rehman, Muhammad Zubair; Nawi, Nazri Mohd.

    Despite being widely used for practical problems around the world, the Gradient Descent Back-propagation algorithm suffers from slow convergence and convergence to local minima. Previous researchers have suggested modifications to improve convergence in the Gradient Descent Back-propagation algorithm, such as careful selection of input weights and biases, learning rate, momentum, network topology, activation function and the value of the 'gain' in the activation function. This research proposes an algorithm for improving the performance of the back-propagation algorithm, 'Gradient Descent with Adaptive Momentum (GDAM)', which keeps the gain value fixed during all network trials. The performance of GDAM is compared with 'Gradient Descent with fixed Momentum (GDM)' and 'Gradient Descent Method with Adaptive Gain (GDM-AG)'. The learning rate is fixed at 0.4 and maximum epochs are set to 3000, while the sigmoid activation function is used for the experimentation. The results show that GDAM is a better approach than previous methods, with an accuracy ratio of 1.0 for classification problems like Wine Quality, Mushroom and Thyroid disease.
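The update rule under comparison can be illustrated with a generic gradient-descent-with-momentum sketch. The abstract fixes only the learning rate (0.4); the momentum constant, the toy objective, and the function names here are illustrative assumptions, not the GDAM adaptation itself:

```python
# Generic gradient descent with momentum, sketched on f(w) = (w - 3)^2.
# The 0.4 learning rate matches the abstract; the constant momentum value
# is an assumption (GDAM's adaptive rule is not specified in the abstract).

def grad(w):
    """Gradient of the toy objective f(w) = (w - 3)^2."""
    return 2.0 * (w - 3.0)

def gd_momentum(w0, lr=0.4, momentum=0.5, epochs=100):
    """Iterate w <- w + v, with v <- momentum * v - lr * grad(w)."""
    w, velocity = w0, 0.0
    for _ in range(epochs):
        velocity = momentum * velocity - lr * grad(w)
        w += velocity
    return w

print(round(gd_momentum(w0=0.0), 6))  # prints 3.0, the minimum
```

The momentum term accumulates past gradients, which is what lets such methods escape the slow-convergence behavior the abstract attributes to plain gradient descent.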

  13. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    PubMed Central

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using two different scoring methods, we examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children aged 3;11 to 5;8 participated: 17 children with SLI and 17 typically developing children matched for age (TD-A children). Children completed real-word and nonword repetition tasks. The capacity of real-word and nonword repetition tasks to discriminate children with SLI from TD-A children was examined through binary logistic regression and receiver operating characteristic (ROC) curves. Results: Both real-word and nonword repetition showed good (or excellent) sensitivity and specificity in distinguishing children with SLI from their typically developing peers. Conclusions: Nonword repetition appears to be a useful diagnostic indicator for Italian, as in other languages. In addition, real-word repetition also holds promise. The contributions of each type of measure are discussed. PMID:22761319
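The sensitivity and specificity figures reported throughout these screening studies reduce to simple ratios over a 2x2 confusion table. A minimal sketch; the per-cell counts below are invented for illustration (only the group sizes of 17 come from the abstract):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for 17 affected and 17 unaffected children:
# 15 of 17 with SLI flagged by the task, 16 of 17 peers correctly cleared.
sens, spec = sensitivity_specificity(tp=15, fn=2, tn=16, fp=1)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# prints sensitivity=0.88, specificity=0.94
```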

  14. Cognitive Abilities Underlying Reading Accuracy, Fluency and Spelling Acquisition in Korean Hangul Learners from Grades 1 to 4: A Cross-Sectional Study.

    PubMed

    Park, Hyun-Rin; Uno, Akira

    2015-08-01

    The purpose of this cross-sectional study was to examine the cognitive abilities that predict reading and spelling performance in Korean children in Grades 1 to 4, depending on expertise and reading experience. Visual cognition, phonological awareness, naming speed and receptive vocabulary significantly predicted reading accuracy in children in Grades 1 and 2, whereas visual cognition, phonological awareness and rapid naming speed did not predict reading accuracy in children in higher grades. For reading fluency, phonological awareness, rapid naming speed and receptive vocabulary were crucial abilities in children in Grades 1 to 3, whereas phonological awareness was not a significant predictor in children in Grade 4. For spelling, reading ability and receptive vocabulary were the most important abilities for accurate Hangul spelling. The results suggest that the cognitive abilities required for reading and spelling change depending on expertise and reading experience.

  15. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part I. General Guidance and Tips

    PubMed Central

    Kim, Kyung Won; Lee, Juneyoung; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    In the field of diagnostic test accuracy (DTA), the use of systematic review and meta-analyses is steadily increasing. By means of objective evaluation of all available primary studies, these two processes generate an evidence-based systematic summary regarding a specific research topic. The methodology for systematic review and meta-analysis in DTA studies differs from that in therapeutic/interventional studies, and its content is still evolving. Here we review the overall process from a practical standpoint, which may serve as a reference for those who implement these methods. PMID:26576106

  17. Accuracy of self-reported intake of signature foods in a school meal intervention study: comparison between control and intervention period.

    PubMed

    Biltoft-Jensen, Anja; Damsgaard, Camilla Trab; Andersen, Rikke; Ygil, Karin Hess; Andersen, Elisabeth Wreford; Ege, Majken; Christensen, Tue; Sørensen, Louise Bergmann; Stark, Ken D; Tetens, Inge; Thorsen, Anne-Vibeke

    2015-08-28

    Bias in self-reported dietary intake is important when evaluating the effect of dietary interventions, particularly for intervention foods. However, few studies have investigated this in children, and none have investigated the reporting accuracy of fish intake in children using biomarkers. In a Danish school meal study, 8- to 11-year-old children (n 834) were served the New Nordic Diet (NND) for lunch. The present study examined the accuracy of self-reported intake of signature foods (berries, cabbage, root vegetables, legumes, herbs, potatoes, wild plants, mushrooms, nuts and fish) characterising the NND. Children, assisted by parents, self-reported their diet using Web-based Dietary Assessment Software for Children during the intervention and control (packed lunch) periods. The fish intake reported by children was compared with their ranking according to fasting whole-blood EPA and DHA concentration and weight percentage using Spearman correlations and cross-classification. Direct observation of school lunch intake (n 193) was used to score the accuracy of food reporting as matches, intrusions, omissions and faults. The reporting of all lunch foods had a higher percentage of matches than the reporting of signature foods in both periods, and the accuracy was higher during the control period than during the intervention period. Both Spearman's rank correlations and linear mixed models demonstrated positive associations between EPA+DHA and reported fish intake. The direct observations showed that both reported and real intake of signature foods did increase during the intervention period. In conclusion, the self-reported data represented a true increase in the intake of signature foods and can be used to examine dietary intervention effects. PMID:26189886

  18. Immersion defectivity study with volume production immersion lithography tool for 45 nm node and below

    NASA Astrophysics Data System (ADS)

    Nakano, Katsushi; Nagaoka, Shiro; Yoshida, Masato; Iriuchijima, Yasuhiro; Fujiwara, Tomoharu; Shiraishi, Kenichi; Owa, Soichi

    2008-03-01

    Volume production of 45 nm node devices utilizing Nikon's S610C immersion lithography tool has started. Defectivity reduction has been important to the success in achieving high yields in volume production with immersion lithography. In this study we evaluate several methods of defectivity reduction. The tools used in our defectivity analysis included a dedicated immersion cluster consisting of a Nikon S610C, a volume-production immersion exposure tool with an NA of 1.3, and a resist coater-developer, LITHIUS i+, from TEL. In our initial procedure we evaluated defectivity behavior by comparing a topcoat-less resist process to a conventional topcoat process. Because of its simplicity, the topcoat-less resist shows lower defect levels than the topcoat process. In a second study we evaluated defect reduction by introducing the TEL bevel rinse and pre-immersion bevel cleaning techniques. This technique was shown to successfully reduce defect levels by reducing the particles at the wafer bevel region. For the third defect reduction method, two types of tool cleaning processes are shown. Finally, we discuss the overall defectivity behavior at the 45 nm node. To facilitate an understanding of the root cause of the defects, defect source analysis (DSA) was applied to separate the defects into three classes according to the source of the defects. DSA revealed that more than 99% of defects relate to material and process, and less than 1% relate to the exposure tool. Material and process optimization through collaborative work between exposure tool vendors, track vendors and material vendors is a key to success in 45 nm node device manufacturing.

  19. Development of patient decision support tools for motor neuron disease using stakeholder consultation: a study protocol

    PubMed Central

    Hogden, Anne; Greenfield, David; Caga, Jashelle; Cai, Xiongcai

    2016-01-01

    Introduction Motor neuron disease (MND) is a terminal, progressive, multisystem disorder. Well-timed decisions are key to effective symptom management. To date, there are few published decision support tools, also known as decision aids, to guide patients in making ongoing choices for symptom management and quality of life. This protocol is to develop and validate decision support tools for patients and families to use in conjunction with health professionals in MND multidisciplinary care. The tools will inform patients and families of the benefits and risks of each option, as well as the consequences of accepting or declining treatment. Methods and analysis The study is being conducted from June 2015 to May 2016, using a modified Delphi process. A 2-stage, 7-step process will be used to develop the tools, based on existing literature and stakeholder feedback. The first stage will be to develop the decision support tools, while the second stage will be to validate both the tools and the process used to develop them. Participants will form expert panels, to provide feedback on which the development and validation of the tools will be based. Participants will be drawn from patients with MND, family carers and health professionals, support association workers, peak body representatives, and MND and patient decision-making researchers. Ethics and dissemination Ethical approval for the study has been granted by Macquarie University Human Research Ethics Committee (HREC), approval number 5201500658. Knowledge translation will be conducted via publications, seminar and conference presentations to patients and families, health professionals and researchers. PMID:27053272

  20. A validation study concerning the effects of interview content, retention interval, and grade on children’s recall accuracy for dietary intake and/or physical activity

    PubMed Central

    Baxter, Suzanne D.; Hitchcock, David B.; Guinn, Caroline H.; Vaadi, Kate K.; Puryear, Megan P.; Royer, Julie A.; McIver, Kerry L.; Dowda, Marsha; Pate, Russell R.; Wilson, Dawn K.

    2014-01-01

    interactions mentioned. Content effects depended on other factors. Grade effects were mixed. Dietary accuracy was better with same-day than previous-day retention interval. Conclusions Results do not support integrating dietary intake and physical activity in children’s recalls, but do support using shorter rather than longer retention intervals to yield more accurate dietary recalls. Further validation studies need to clarify age effects and identify evidence-based practices to improve children’s accuracy for recalling dietary intake and/or physical activity. PMID:24767807

  1. Causal Relation Analysis Tool of the Case Study in the Engineer Ethics Education

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshio; Morita, Keisuke; Yasui, Mitsukuni; Tanada, Ichirou; Fujiki, Hiroyuki; Aoyagi, Manabu

    In engineering ethics education, the virtual experiencing of dilemmas is essential. Learning through the case study method is a particularly effective means. Many case studies are, however, difficult to deal with because they often include many complex causal relationships and social factors. It would thus be convenient if there were a tool that could analyze the factors of a case example and organize them into a hierarchical structure to get a better understanding of the whole picture. The tool that was developed applies a cause-and-effect matrix and simple graph theory. It analyzes the causal relationship between facts in a hierarchical structure and organizes complex phenomena. The effectiveness of this tool is shown by presenting an actual example.

  2. Effects of accuracy motivation and anchoring on metacomprehension judgment and accuracy.

    PubMed

    Zhao, Qin

    2012-01-01

    The current research investigates how accuracy motivation impacts anchoring and adjustment in metacomprehension judgment, and how accuracy motivation and anchoring affect metacomprehension accuracy. Participants were randomly assigned to one of six conditions produced by a between-subjects factorial design involving accuracy motivation (incentive or none) and peer performance anchor (95%, 55%, or none). Two studies showed that accuracy motivation did not impact anchoring bias, but the adjustment-from-anchor process occurred. The accuracy incentive increased the anchor-judgment gap for the 95% anchor but not for the 55% anchor, which induced less certainty about the direction of adjustment. The findings offer support to the integrative theory of anchoring. Additionally, the two studies revealed a "power struggle" between accuracy motivation and anchoring in influencing metacomprehension accuracy. Accuracy motivation can improve metacomprehension accuracy in spite of the anchoring effect, but if the anchoring effect is too strong, it can overpower the motivation effect. The implications of the findings are discussed.

  3. Developing symptom-based predictive models of endometriosis as a clinical screening tool: results from a multicenter study

    PubMed Central

    Nnoaham, Kelechi E.; Hummelshoj, Lone; Kennedy, Stephen H.; Jenkinson, Crispin; Zondervan, Krina T.

    2012-01-01

    Objective To generate and validate symptom-based models to predict endometriosis among symptomatic women prior to undergoing their first laparoscopy. Design Prospective, observational, two-phase study, in which women completed a 25-item questionnaire prior to surgery. Setting Nineteen hospitals in 13 countries. Patient(s) Symptomatic women (n = 1,396) scheduled for laparoscopy without a previous surgical diagnosis of endometriosis. Intervention(s) None. Main Outcome Measure(s) Sensitivity and specificity of endometriosis diagnosis predicted by symptoms and patient characteristics from optimal models developed using multiple logistic regression analyses in one data set (phase I), and independently validated in a second data set (phase II) by receiver operating characteristic (ROC) curve analysis. Result(s) Three hundred sixty (46.7%) women in phase I and 364 (58.2%) in phase II were diagnosed with endometriosis at laparoscopy. Menstrual dyschezia (pain on opening bowels) and a history of benign ovarian cysts most strongly predicted both any and stage III and IV endometriosis in both phases. Prediction of any-stage endometriosis, although improved by ultrasound scan evidence of cyst/nodules, was relatively poor (area under the curve [AUC] = 68.3). Stage III and IV disease was predicted with good accuracy (AUC = 84.9, sensitivity of 82.3% and specificity 75.8% at an optimal cut-off of 0.24). Conclusion(s) Our symptom-based models predict any-stage endometriosis relatively poorly and stage III and IV disease with good accuracy. Predictive tools based on such models could help to prioritize women for surgical investigation in clinical practice and thus contribute to reducing time to diagnosis. We invite other researchers to validate the key models in additional populations. PMID:22657249
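The ROC-based validation described above can be sketched in a few lines: AUC estimated via the Mann-Whitney rank statistic, plus sensitivity and specificity at a probability cut-off. All scores below are invented; only the 0.24 cut-off is taken from the abstract:

```python
# Sketch of ROC evaluation for a model that outputs predicted probabilities.
# The score lists are invented illustrations, not study data.

def auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

def sens_spec_at(cutoff, scores_pos, scores_neg):
    """Sensitivity and specificity when scores >= cutoff are called positive."""
    sens = sum(s >= cutoff for s in scores_pos) / len(scores_pos)
    spec = sum(s < cutoff for s in scores_neg) / len(scores_neg)
    return sens, spec

pos = [0.9, 0.8, 0.6, 0.3, 0.2]   # model scores for diseased cases (invented)
neg = [0.7, 0.4, 0.2, 0.1, 0.05]  # model scores for non-diseased cases (invented)
print(auc(pos, neg))               # prints 0.78
print(sens_spec_at(0.24, pos, neg))  # prints (0.8, 0.6)
```

Sweeping the cut-off traces the full ROC curve; the study's reported optimum balances sensitivity against specificity in exactly this way.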

  4. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies

    PubMed Central

    Norling, Martin; Karlsson-Lindsjö, Oskar E; Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study the viral diversity of animals. Metagenomics can be considered to have three main steps: sample collection and preparation, sequencing, and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for experimental design and analysis of viral metagenomes. MetLab provides support in designing the metagenomics experiment by estimating the sequencing depth needed for the complete coverage of a species. This is achieved by applying a methodology to calculate the probability of coverage using an adaptation of Stevens' theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including quality control, assembly and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy-to-use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark against other existing software, with emphasis on detection of viruses as well as speed of the applications. It is packaged as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078
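The depth estimate is described as an adaptation of Stevens' theorem for covering a circle with randomly placed arcs. A minimal sketch of that calculation under a circular-genome approximation; the exact adaptation MetLab uses may differ, and expressing read length as a fraction of genome length is an illustrative simplification:

```python
from math import comb, floor

def coverage_probability(n_reads, read_frac):
    """Stevens' theorem: probability that n_reads random arcs, each covering
    a fraction read_frac of a circle, cover the whole circle."""
    total = 0.0
    for k in range(floor(1.0 / read_frac) + 1):
        # math.comb returns 0 when k > n_reads, so small n is handled cleanly.
        total += (-1) ** k * comb(n_reads, k) * (1.0 - k * read_frac) ** (n_reads - 1)
    return total

def reads_for_coverage(read_frac, target=0.99):
    """Smallest read count whose full-coverage probability reaches target."""
    n = 1
    while coverage_probability(n, read_frac) < target:
        n += 1
    return n

# e.g. 150 bp reads against a hypothetical 10 kb viral genome:
print(reads_for_coverage(read_frac=150 / 10_000))
```

Raising the target probability or shortening the reads increases the required read count sharply, which is the trade-off a depth-planning tool of this kind exposes.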

  6. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) development has tremendous advantages over proprietary software, fuelled primarily by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, licensed under the GNU GPL and written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.

  8. A pilot study to determine whether using a lightweight, wearable micro-camera improves dietary assessment accuracy and offers information on macronutrients and eating rate.

    PubMed

    Pettitt, Claire; Liu, Jindong; Kwasnicki, Richard M; Yang, Guang-Zhong; Preston, Thomas; Frost, Gary

    2016-01-14

    A major limitation in nutritional science is the lack of understanding of the nutritional intake of free-living people. There is an inverse relationship between accuracy of reporting of energy intake by all current nutritional methodologies and body weight. In this pilot study we aim to explore whether using a novel lightweight, wearable micro-camera improves the accuracy of dietary intake assessment. Doubly labelled water (DLW) was used to estimate energy expenditure and intake over a 14-d period, over which time participants (n 6) completed a food diary and wore a micro-camera on 2 of the days. Comparisons were made between the estimated energy intake from the reported food diary alone and together with the images from the micro-camera recordings. There was an average daily deficit of 3912 kJ using food diaries to estimate energy intake compared with estimated energy expenditure from DLW (P=0·0118), representing an under-reporting rate of 34 %. Analysis of food diaries alone showed a significant deficit in estimated daily energy intake compared with estimated intake from food diary analysis with images from the micro-camera recordings (405 kJ). Use of the micro-camera images in conjunction with food diaries improves the accuracy of dietary assessment and provides valuable information on macronutrient intake and eating rate. There is a need to develop this recording technique to remove user and assessor bias.

  9. Comparative evaluation of the accuracy of two electronic apex locators in determining the working length in teeth with simulated apical root resorption: An in vitro study

    PubMed Central

    Saraswathi, Vidya; Kedia, Archit; Purayil, Tina Puthen; Ballal, Vasudev; Saini, Aakriti

    2016-01-01

    Introduction: Accurate determination of working length (WL) is a critical factor for endodontic success. This is commonly achieved using an apex locator, which is influenced by the presence or absence of the apical constriction. Hence, this study was done to compare the accuracy of two generations of apex locators in teeth with simulated apical root resorption. Materials and Methods: Forty maxillary central incisors were selected and, after access preparation, were embedded in an alginate mold. On achieving partial set, the teeth were removed, and a 45° oblique cut was made at the apex. The teeth were replanted and stabilized in the mold, and WL was determined using two generations of apex locators (Raypex 5 and Apex NRG XFR). The actual length of the teeth (control) was determined by the visual method. Statistical Analysis: Results were subjected to statistical analysis using the paired t-test. Results: Raypex 5 and Apex NRG were accurate for only 33.75% and 23.75% of samples, respectively. However, with a ±0.5 mm acceptance limit, they showed an average accuracy of 56.2% and 57.5%, respectively. There was no significant difference in accuracy between the two apex locators. Conclusion: Neither of the two apex locators was 100% accurate in determining the WL. PMID:27656055

  11. When Does Choice of Accuracy Measure Alter Imputation Accuracy Assessments?

    PubMed

    Ramnarine, Shelina; Zhang, Juan; Chen, Li-Shiun; Culverhouse, Robert; Duan, Weimin; Hancock, Dana B; Hartz, Sarah M; Johnson, Eric O; Olfson, Emily; Schwantes-An, Tae-Hwi; Saccone, Nancy L

    2015-01-01

    Imputation, the process of inferring genotypes for untyped variants, is used to identify and refine genetic association findings. Inaccuracies in imputed data can distort the observed association between variants and a disease. Many statistics are used to assess accuracy; some compare imputed to genotyped data and others are calculated without reference to true genotypes. Prior work has shown that the Imputation Quality Score (IQS), which is based on Cohen's kappa statistic and compares imputed genotype probabilities to true genotypes, appropriately adjusts for chance agreement; however, it is not commonly used. To identify differences in accuracy assessment, we compared IQS with concordance rate, squared correlation, and accuracy measures built into imputation programs. Genotypes from the 1000 Genomes reference populations (AFR N = 246 and EUR N = 379) were masked to match the typed single nucleotide polymorphism (SNP) coverage of several SNP arrays and were imputed with BEAGLE 3.3.2 and IMPUTE2 in regions associated with smoking behaviors. Additional masking and imputation were conducted for sequenced subjects from the Collaborative Genetic Study of Nicotine Dependence and the Genetic Study of Nicotine Dependence in African Americans (N = 1,481 African Americans and N = 1,480 European Americans). Our results offer further evidence that concordance rate inflates accuracy estimates, particularly for rare and low frequency variants. For common variants, squared correlation, BEAGLE R2, IMPUTE2 INFO, and IQS produce similar assessments of imputation accuracy. However, for rare and low frequency variants, compared to IQS, the other statistics tend to be more liberal in their assessment of accuracy. IQS is important to consider when evaluating imputation accuracy, particularly for rare and low frequency variants. PMID:26458263
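
The core point, that raw concordance inflates accuracy for rare variants while a kappa-style statistic like IQS corrects for chance agreement, can be demonstrated with a toy example (ours, not from the paper): if imputation simply calls the common genotype everywhere, concordance looks excellent while the chance-corrected statistic is zero.

```python
# Toy demonstration of chance-corrected agreement (Cohen's kappa, the
# basis of IQS) vs raw concordance for a rare variant.
from collections import Counter

def concordance(imputed, true):
    return sum(i == t for i, t in zip(imputed, true)) / len(true)

def cohens_kappa(imputed, true):
    n = len(true)
    po = concordance(imputed, true)                            # observed agreement
    ci, ct = Counter(imputed), Counter(true)
    pe = sum(ci[g] * ct[g] for g in set(ci) | set(ct)) / n**2  # chance agreement
    return (po - pe) / (1 - pe) if pe < 1 else 0.0

# Rare variant: 98 major homozygotes (0), 2 heterozygotes (1);
# imputation just calls the common genotype for everyone.
true    = [0] * 98 + [1] * 2
imputed = [0] * 100

print(concordance(imputed, true))   # → 0.98 (looks excellent)
print(cohens_kappa(imputed, true))  # → 0.0  (no better than chance)
```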

  12. StatXFinder: a web-based self-directed tool that provides appropriate statistical test selection for biomedical researchers in their scientific studies.

    PubMed

    Suner, Aslı; Karakülah, Gökhan; Koşaner, Özgün; Dicle, Oğuz

    2015-01-01

    The improper use of statistical methods is common when analyzing and interpreting research data in the biological and medical sciences. The objective of this study was to develop a decision support tool encompassing the statistical tests commonly used in biomedical research by combining and updating existing decision trees for appropriate statistical test selection. First, the decision trees in textbooks, published articles, and online resources were scrutinized, and a more comprehensive, unified tree was devised by integrating 10 distinct decision trees. The questions in the decision steps were also revised, simplified, and enriched with examples. The decision tree was then implemented in the web environment as a tool titled StatXFinder. Finally, usability and satisfaction questionnaires were administered to users of the tool, and StatXFinder was reorganized in line with the feedback obtained. StatXFinder provides users with decision support in the selection of 85 distinct parametric and non-parametric statistical tests by posing 44 different yes-no questions. The accuracy rate of the statistical test recommendations obtained by 36 participants on the cases applied was 83.3 % for "difficult" tests and 88.9 % for "easy" tests. The mean system usability score of the tool was 87.43 ± 10.01 (minimum: 70, maximum: 100). No statistically significant difference was seen between total system usability score and participants' attributes (p value >0.05). The User Satisfaction Questionnaire showed that 97.2 % of the participants appreciated the tool, and almost all of them (35 of 36) would recommend it to others. In conclusion, StatXFinder can be utilized as an instructional and guiding tool for biomedical researchers with limited statistics knowledge. StatXFinder is freely available at http://webb.deu.edu.tr/tb/statxfinder. PMID:26543767
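
The yes/no decision-tree idea behind such a test-selection tool can be sketched minimally; the questions and recommendations below are a simplified textbook-style illustration, not StatXFinder's actual 44-question tree.

```python
# Minimal sketch of yes/no decision-tree test selection (illustrative only).

def recommend_test(outcome_continuous, two_groups, paired, normal):
    """Walk a tiny decision tree of yes/no questions to a test name."""
    if not outcome_continuous:
        return "McNemar's test" if paired else "chi-squared test"
    if two_groups:
        if paired:
            return "paired t-test" if normal else "Wilcoxon signed-rank test"
        return "independent t-test" if normal else "Mann-Whitney U test"
    return "one-way ANOVA" if normal else "Kruskal-Wallis test"

print(recommend_test(outcome_continuous=True, two_groups=True,
                     paired=False, normal=False))  # → Mann-Whitney U test
```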

  13. Fourth-grade children's dietary recall accuracy for energy intake at school meals differs by social desirability and body mass index percentile in a study concerning retention interval.

    PubMed

    Guinn, Caroline H; Baxter, Suzanne D; Royer, Julie A; Hardin, James W; Mackelprang, Alyssa J; Smith, Albert F

    2010-05-01

    Data from a study concerning retention interval and school-meal observation on children's dietary recalls were used to investigate relationships of social desirability score (SDS) and body mass index percentile (BMI%) to recall accuracy for energy for observed (n = 327) children, and to reported energy for observed and unobserved (n = 152) children. Report rates (reported/observed) correlated negatively with SDS and BMI%. Correspondence rates (correctly reported/observed) correlated negatively with SDS. Inflation ratios (overreported/observed) correlated negatively with BMI%. The relationship between reported energy and each of SDS and BMI% did not depend on observation status. Studies utilizing children's dietary recalls should assess SDS and BMI%. PMID:20460407
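
The three accuracy measures named in the abstract follow directly from their definitions (report rate = reported/observed, correspondence rate = correctly reported/observed, inflation ratio = overreported/observed); the sketch below uses hypothetical energy values to make the arithmetic concrete.

```python
# Sketch of the three recall-accuracy rates from the abstract's definitions;
# the energy values below are hypothetical.

def recall_rates(observed_kj, reported_kj, correct_kj, over_kj):
    return {
        "report_rate": reported_kj / observed_kj,          # reported / observed
        "correspondence_rate": correct_kj / observed_kj,   # correctly reported / observed
        "inflation_ratio": over_kj / observed_kj,          # overreported / observed
    }

rates = recall_rates(observed_kj=3000, reported_kj=2700,
                     correct_kj=2400, over_kj=300)
print(rates)  # report 0.9, correspondence 0.8, inflation 0.1
```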

  14. A Study of Turnitin as an Educational Tool in Student Dissertations

    ERIC Educational Resources Information Center

    Biggam, John; McCann, Margaret

    2010-01-01

    Purpose: This paper explores the use of Turnitin as a learning tool (particularly in relation to citing sources and paraphrasing) and as a vehicle for reducing incidences of plagiarism. Design/methodology/approach: The research was implemented using a case study of 49 final-year "honours" undergraduate students undertaking their year-long core…

  15. Music: Artistic Performance or a Therapeutic Tool? A Study on Differences

    ERIC Educational Resources Information Center

    Petersson, Gunnar; Nystrom, Maria

    2011-01-01

    The aim of this study is to analyze and describe how musicians who are also music therapy students separate music as artistic performance from music as a therapeutic tool. The data consist of 18 written reflections from music therapy students that were analyzed according to a phenomenographic method. The findings are presented as four…

  16. Exploring the Usage of a Video Application Tool: Experiences in Film Studies

    ERIC Educational Resources Information Center

    Ali, Nazlena Mohamad; Smeaton, Alan F.

    2011-01-01

    This paper explores our experiences in deploying a video application tool in film studies, and its evaluation in terms of realistic contextual end-users who have real tasks to perform in a real environment. We demonstrate our experiences and core lesson learnt in deploying our novel movie browser application with undergraduate and graduate…

  17. Basins and Wepp Climate Assessment Tools (Cat): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    This final report supports application of two recently developed...

  18. Study Abroad Programs as Tools of Internationalization: Which Factors Influence Hungarian Business Students to Participate?

    ERIC Educational Resources Information Center

    Huják, Janka

    2015-01-01

    The internationalization of higher education has been on the agenda for decades now all over the world. Study abroad programs are undoubtedly tools of the internationalization endeavors. The ERASMUS Student Mobility Program is one of the flagships of the European Union's educational exchange programs implicitly aiming for the internationalization…

  19. Social Networking as an Admission Tool: A Case Study in Success

    ERIC Educational Resources Information Center

    Hayes, Thomas J.; Ruschman, Doug; Walker, Mary M.

    2009-01-01

    The concept of social networking, the focus of this article, targets the development of online communities in higher education, and in particular, as part of the admission process. A successful case study of a university is presented on how one university has used this tool to compete for students. A discussion including suggestions on how to…

  20. The Life Story Board: A Feasibility Study of a Visual Interview Tool for School Counsellors

    ERIC Educational Resources Information Center

    Chase, Robert M.; Medina, Maria Fernanda; Mignone, Javier

    2012-01-01

    The article describes the findings of a pilot study of the Life Story Board (LSB), a novel visual information system with a play board and sets of magnetic cards designed to be a practical clinical tool for counsellors, therapists, and researchers. The LSB is similar to a multidimensional genogram, and serves as a platform to depict personal…

  1. Writing as a Learning Tool: Integrating Theory and Practice. Studies in Writing, Volume 7.

    ERIC Educational Resources Information Center

    Tynjala, Paivi, Ed.; Mason, Lucia, Ed.; Lonka, Kirsti, Ed.

    This book, the seventh volume in the Studies in Writing International Series on the Research of Learning and Instruction of Writing, is an account of the current state of using writing as a tool for learning. The book presents psychological and educational foundations of the writing across the curriculum movement and describes writing-to-learn…

  2. Critical Reflection as a Learning Tool for Nurse Supervisors: A Hermeneutic Phenomenological Study

    ERIC Educational Resources Information Center

    Urbas-Llewellyn, Agnes

    2013-01-01

    Critical reflection as a learning tool for nursing supervisors is a complex and multifaceted process not completely understood by healthcare leadership, specifically nurse supervisors. Despite a multitude of research studies on critical reflection, there remains a gap in the literature regarding the perceptions of the individual, the support…

  3. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  4. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  5. Validation Studies of the Five P's. The Five P's: A New Handicapped Preschool Children's Assessment Tool.

    ERIC Educational Resources Information Center

    Hicks, John S.

    This paper reports the results of the validity studies done on the Five P's (Parent/Professional Preschool Performance Profile), an assessment tool for parents and teachers developed by the Variety Preschooler's Workshop (Syosset, New York) to rate young handicapped children functioning between birth and 5 years of age on their observed…

  6. Volunteering in the Digital Age: A Study of Online Collaboration Tools from the Perspective of CSCL

    ERIC Educational Resources Information Center

    Kok, Ayse

    2011-01-01

    There is little evidence that helps to inform education, practice, policy, and research about issues surrounding the use of online collaboration tools for organisational initiatives (Brown & Duguid, 1991; Cook & Brown, 1999); let alone a single study conducted with regard to the volunteering practice of knowledge workers. The underlying…

  7. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less-known techniques that may also prove useful.

  8. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  9. IGG: A tool to integrate GeneChips for genetic studies.

    PubMed

    Li, M-X; Jiang, L; Ho, S-L; Song, Y-Q; Sham, P-C

    2007-11-15

    To facilitate genetic studies using high-throughput genotyping technologies, we have developed an open source tool to integrate genotype data across the Affymetrix and Illumina platforms. It can efficiently integrate a large amount of data from various GeneChips, add genotypes of the HapMap Project into a specific project, flexibly trim and export the integrated data with different formats of popular genetic analysis tools, and highly control the quality of genotype data. Furthermore, this tool has sufficiently simplified its usage through its user-friendly graphic interface and is independent of third-party databases. IGG has successfully been applied to a genome-wide linkage scan in a Charcot-Marie-Tooth disease pedigree by integrating three types of GeneChips and HapMap project genotypes. PMID:17872914

  10. Experience of integrating various technological tools into the study and future teaching of mathematics education students

    NASA Astrophysics Data System (ADS)

    Gorev, Dvora; Gurevich-Leibman, Irina

    2015-07-01

    This paper presents our experience of integrating technological tools into our mathematics teaching (in both disciplinary and didactic courses) for student-teachers. In the first cycle of our study, a variety of technological tools were used (e.g., dynamic software, hypertexts, video and applets) in teaching two disciplinary mathematics courses. We found that the tool most preferred by the students was dynamic software, while the applets were almost neglected. In the next cycle, we focused on using various applets in both disciplinary and didactic mathematics courses. We found that if the assignments were applet-oriented, i.e., adjusted to the chosen applet, or vice versa - the applet was chosen appropriately to suit the given assignment - then the students were able to make use of applets in an effective way. Furthermore, the students came to see the potential of applets for improving learning.

  12. The accuracy of breast volume measurement methods: A systematic review.

    PubMed

    Choppin, S B; Wheat, J S; Gee, M; Goyal, A

    2016-08-01

    Breast volume is a key metric in breast surgery and there are a number of different methods which measure it. However, a lack of knowledge regarding a method's accuracy and comparability has made it difficult to establish a clinical standard. We have performed a systematic review of the literature to examine the various techniques for measurement of breast volume and to assess their accuracy and usefulness in clinical practice. Each of the fifteen studies we identified had more than ten live participants and assessed volume measurement accuracy using a gold-standard based on the volume, or mass, of a mastectomy specimen. Many of the studies from this review report large (>200 ml) uncertainty in breast volume and many fail to assess measurement accuracy using appropriate statistical tools. Of the methods assessed, MRI scanning consistently demonstrated the highest accuracy with three studies reporting errors lower than 10% for small (250 ml), medium (500 ml) and large (1000 ml) breasts. However, as a high-cost, non-routine assessment other methods may be more appropriate. PMID:27288864
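
The <10 % error criterion reported for MRI can be made explicit with a one-line percentage-error calculation against the mastectomy-specimen gold standard; the volumes below are hypothetical.

```python
# Sketch of the percentage-error criterion against a specimen gold standard
# (values hypothetical).

def percent_error(estimated_ml: float, specimen_ml: float) -> float:
    return abs(estimated_ml - specimen_ml) / specimen_ml * 100

# An MRI-style estimate within 10 % for a medium (500 ml) breast:
print(round(percent_error(530, 500), 1))  # → 6.0
```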

  13. Do Proficiency and Study-Abroad Experience Affect Speech Act Production? Analysis of Appropriateness, Accuracy, and Fluency

    ERIC Educational Resources Information Center

    Taguchi, Naoko

    2011-01-01

    This cross-sectional study examined the effect of general proficiency and study-abroad experience in production of speech acts among learners of L2 English. Participants were 25 native speakers of English and 64 Japanese college students of English divided into three groups. Group 1 (n = 22) had lower proficiency and no study-abroad experience.…

  14. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of that and the other procedures used. The purpose of the accuracy assessment was to allow comparison of the cost and accuracy of various classification procedures as applied to various data types.
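
The basic assessment step described here, comparing classified pixels against independently sampled ground truth, reduces to an overall-accuracy calculation; the classes and samples below are our own illustration.

```python
# Sketch (ours) of overall classification accuracy against ground-truth samples.

def overall_accuracy(classified, ground_truth):
    """Fraction of sampled pixels whose class label matches the ground truth."""
    correct = sum(c == g for c, g in zip(classified, ground_truth))
    return correct / len(ground_truth)

ground_truth = ["water", "forest", "urban", "forest", "water"]
classified   = ["water", "forest", "urban", "water",  "water"]
print(overall_accuracy(classified, ground_truth))  # → 0.8
```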

  15. Smart tool holder

    DOEpatents

    Day, Robert Dean; Foreman, Larry R.; Hatch, Douglas J.; Meadows, Mark S.

    1998-01-01

    There is provided an apparatus for machining surfaces to accuracies within the nanometer range by use of electrical current flow through the contact of the cutting tool with the workpiece as a feedback signal to control depth of cut.

  16. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well-recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between
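
The core updating step, reusing archived hydraulic model output as a stage-discharge rating and reading off a new flood stage for a recalculated discharge, amounts to table interpolation; the rating values below are hypothetical.

```python
# Sketch of stage lookup from a stage-discharge rating derived from
# archived hydraulic model output (rating values hypothetical).
import bisect

def stage_for_discharge(discharge, rating):
    """Linear interpolation in a sorted (discharge, stage) rating table."""
    qs = [q for q, _ in rating]
    i = bisect.bisect_left(qs, discharge)
    if i == 0:
        return rating[0][1]          # below the table: clamp to lowest stage
    if i == len(rating):
        return rating[-1][1]         # above the table: clamp to highest stage
    (q0, s0), (q1, s1) = rating[i - 1], rating[i]
    return s0 + (s1 - s0) * (discharge - q0) / (q1 - q0)

rating = [(100.0, 10.0), (200.0, 12.0), (400.0, 15.0)]  # m3/s, m
print(stage_for_discharge(300.0, rating))  # → 13.5
```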

  17. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  18. Achieving plane wave accuracy in linear-scaling density functional theory applied to periodic systems: A case study on crystalline silicon

    NASA Astrophysics Data System (ADS)

    Skylaris, Chris-Kriton; Haynes, Peter D.

    2007-10-01

    Linear-scaling methods for density functional theory promise to revolutionize the scope and scale of first-principles quantum mechanical calculations. Crystalline silicon has been the system of choice for exploratory tests of such methods in the literature, yet attempts at quantitative comparisons under linear-scaling conditions with traditional methods or experimental results have not been forthcoming. A detailed study using the ONETEP code is reported here, demonstrating for the first time that plane wave accuracy can be achieved in linear-scaling calculations on periodic systems.

  19. A Normative Study of the Sport Concussion Assessment Tool (SCAT2) in Children and Adolescents

    PubMed Central

    Snyder, Aliyah R.; Bauer, Russell M.

    2014-01-01

    Recent clinical practice parameters encourage systematic use of concussion surveillance/management tools that evaluate participating athletes at baseline and after concussion. Office-based tools (Sports Concussion Assessment Tool [SCAT2]) require accurate baseline assessment to maximize utility but no normative data exist for children on the SCAT2, limiting identification of ‘normal’ or ‘impaired’ score ranges. The purpose of this study was to develop child and adolescent baseline norms for the SCAT2 to provide reference values for different age groups. A community-based approach was implemented to compile baseline performance data on the SCAT2 in 761 children aged 9 to 18 to create age- and sex-graded norms. Findings indicate a significant age effect on SCAT2 performance such that older adolescents and teenagers produced higher (better) total scores than younger children (ages 9 to 11) driven by age differences on individual components measuring cognition (SAC), postural stability (BESS), and symptom report. Females endorsed greater numbers of symptoms at baseline than males. Normative data tables are presented. Findings support the SCAT2 as a useful clinical tool for assessing baseline functioning in teenagers, but suggest clinical utility may be limited in children under age 11. Follow-up studies after incident concussion are needed to confirm this assumption. PMID:25244434

  20. Application of multicriteria decision analysis tools to two contaminated sediment case studies.

    PubMed

    Yatsalo, Boris I; Kiker, Gregory A; Kim, St Jongbum; Bridges, Todd S; Seager, Thomas P; Gardner, Kevin; Satterstrom, F Kyle; Linkov, Igor

    2007-04-01

    Environmental decision making is becoming increasingly more information intensive and complex. Our previous work shows that multicriteria decision analysis (MCDA) tools offer a scientifically sound decision analytical framework for environmental management, in general, and specifically for selecting optimal sediment management alternatives. Integration of MCDA into risk assessment and sediment management may require linkage of different models and software platforms whose results may lead to somewhat different conclusions. This paper illustrates the application of 3 different MCDA methods in 2 case studies involving contaminated sediment management. These case studies are based on real sediment management problems experienced by the US Army Corps of Engineers and other stakeholders in New York/New Jersey Harbor, USA, and the Cocheco River Superfund Site in New Hampshire, USA. Our analysis shows that application of 3 different MCDA tools points to similar management solutions no matter which tool is applied. MCDA tools and approaches were constructively used to elicit the strengths and weaknesses of each method when solving the problem. PMID:17477290
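
One of the simplest MCDA methods, a weighted sum over criteria, conveys the flavor of ranking sediment management alternatives; the alternatives, criteria, scores, and weights below are hypothetical and are not taken from the case studies.

```python
# Toy weighted-sum MCDA sketch (one of several MCDA methods; all values
# hypothetical). Higher criterion scores are assumed better.

def rank_alternatives(scores, weights):
    """scores: {alternative: {criterion: value}} -> list sorted best-first."""
    totals = {alt: sum(weights[c] * v for c, v in crit.items())
              for alt, crit in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

weights = {"cost": 0.3, "risk_reduction": 0.5, "habitat": 0.2}
scores = {
    "capping":          {"cost": 0.8, "risk_reduction": 0.6, "habitat": 0.5},
    "dredge_and_treat": {"cost": 0.3, "risk_reduction": 0.9, "habitat": 0.7},
}
print(rank_alternatives(scores, weights))  # dredge_and_treat ranks first
```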

  1. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  2. Rheo-attenuated total reflectance infrared spectroscopy: a new tool to study biopolymers.

    PubMed

    Boulet-Audet, Maxime; Vollrath, Fritz; Holland, Chris

    2011-03-01

    Whilst rheology is the reference technique to study the mechanical properties of unspun silk, we know little of the structure and the dynamics that generate them. By coupling infrared spectroscopy and shearing forces to study silk fibroin conversion, we are introducing a novel tool to address this gap in our knowledge. Here the silk conversion process has been studied dynamically using polarized attenuated total reflectance Fourier transform infrared spectroscopy whilst applying shear, thus revealing silk protein conformation and molecular orientation in situ. Our results show that the silk conversion process starts with a pre-alignment of the proteins followed by a rapid growth of the β-sheet formation and then a subsequent deceleration of the growth. We propose that this tool will provide further insight into not only silk but any biopolymer solution, opening a new window into biological materials.

  3. Systems biology and "omics" tools: a cooperation for next-generation mycorrhizal studies.

    PubMed

    Salvioli, Alessandra; Bonfante, Paola

    2013-04-01

    Omics tools constitute a powerful means of describing the complexity of plants and soil-borne microorganisms. Next generation sequencing technologies, coupled with emerging systems biology approaches, seem promising to represent a new strategy in the study of plant-microbe interactions. Arbuscular mycorrhizal fungi (AMF) are ubiquitous symbionts of plant roots, that provide their host with many benefits. However, as obligate biotrophs, AMF show a genetic, cellular and physiological complexity that makes the study of their biology as well as their effective agronomical exploitation rather difficult. Here, we speculate that the increasing availability of omics data on mycorrhiza and of computational tools that allow systems biology approaches represents a step forward in the understanding of arbuscular mycorrhizal symbiosis. Furthermore, the application of this study-perspective to agriculturally relevant model plants, such as tomato and rice, will lead to a better in-field exploitation of this beneficial symbiosis in the frame of low-input agriculture.

  4. A simulation study of predictive ability measures in a survival model II: explained randomness and predictive accuracy.

    PubMed

    Choodari-Oskooei, B; Royston, P; Parmar, Mahesh K B

    2012-10-15

    Several R²-type measures have been proposed to evaluate the predictive ability of a survival model. In Part I, we classified the measures into four categories and studied those in the explained-variation category. In this paper, we study the remaining measures in a similar fashion, discussing their strengths and shortcomings. Simulation studies are used to examine the performance of the measures with respect to the criteria we set out in Part I. Our simulation studies showed that, among the measures studied in this paper, those proposed by Kent and O'Quigley (ρ²_W and its approximation ρ²_W,A) and by Schemper and Kaider (R²_SK) perform better with respect to our criteria. However, our investigations showed that ρ²_W is adversely affected by the covariate distribution and by the presence of influential observations. The results show that the other measures perform poorly, primarily because they are affected either by the degree of censoring or by the length of follow-up.

  5. GEOSPATIAL DATA ACCURACY ASSESSMENT

    EPA Science Inventory

    The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue is related directly to the dramatic escalation in the developmen...

  6. Classification accuracy improvement

    NASA Technical Reports Server (NTRS)

    Kistler, R.; Kriegler, F. J.

    1977-01-01

    Improvements made in the processing system designed for MIDAS (prototype multivariate interactive digital analysis system) effect higher accuracy in pixel classification, resulting in significantly reduced processing time. The improved system realizes a cost-reduction factor of 20 or more.

  7. SLIMMER: SLIce MRI motion estimation and reconstruction tool for studies of fetal anatomy

    NASA Astrophysics Data System (ADS)

    Kim, Kio; Habas, Piotr A.; Rajagopalan, Vidya; Scott, Julia; Rousseau, Francois; Barkovich, A. James; Glenn, Orit A.; Studholme, Colin

    2011-03-01

    We describe a free software tool that combines a set of algorithms providing a framework for building 3D volumetric images of regions of moving anatomy from multiple fast multi-slice MRI studies. It is specifically motivated by the clinical application of unsedated fetal brain imaging, which has emerged as an important area for image analysis. The tool reads multiple DICOM image stacks acquired in any angulation into a consistent patient coordinate frame and allows the user to select regions to be locally motion corrected. It combines algorithms for slice motion estimation, bias field inconsistency correction and 3D volume reconstruction from multiple scattered slice stacks. The tool is built on the RView (http://rview.colin-studholme.net) medical image display software and allows the user to inspect slice stacks and apply both stack- and slice-level motion estimation that incorporates temporal constraints based on slice timing and interleave information read from the DICOM data. Following motion estimation, an algorithm for bias field inconsistency correction provides the user with the ability to remove artifacts arising from the motion of the local anatomy relative to the imaging coils. Full 3D visualization of the slice stacks and individual slice orientations is provided to assist in evaluating the quality of the motion correction and final image reconstruction. The tool has been evaluated on a range of clinical data acquired on GE, Siemens and Philips MRI scanners.

  8. Cost Minimization Using an Artificial Neural Network Sleep Apnea Prediction Tool for Sleep Studies

    PubMed Central

    Teferra, Rahel A.; Grant, Brydon J. B.; Mindel, Jesse W.; Siddiqi, Tauseef A.; Iftikhar, Imran H.; Ajaz, Fatima; Aliling, Jose P.; Khan, Meena S.; Hoffmann, Stephen P.

    2014-01-01

    Rationale: More than a million polysomnograms (PSGs) are performed annually in the United States to diagnose obstructive sleep apnea (OSA). Third-party payers now advocate a home sleep test (HST), rather than an in-laboratory PSG, as the diagnostic study for OSA regardless of clinical probability, but the economic benefit of this approach is not known. Objectives: We determined the diagnostic performance of OSA prediction tools including the newly developed OSUNet, based on an artificial neural network, and performed a cost-minimization analysis when the prediction tools are used to identify patients who should undergo HST. Methods: The OSUNet was trained to predict the presence of OSA in a derivation group of patients who underwent an in-laboratory PSG (n = 383). Validation group 1 consisted of in-laboratory PSG patients (n = 149). The network was trained further in 33 patients who underwent HST and then was validated in a separate group of 100 HST patients (validation group 2). Likelihood ratios (LRs) were compared with two previously published prediction tools. The total costs from the use of the three prediction tools and the third-party approach within a clinical algorithm were compared. Measurements and Main Results: The OSUNet had a higher +LR in all groups compared with the STOP-BANG and the modified neck circumference (MNC) prediction tools. The +LRs for STOP-BANG, MNC, and OSUNet in validation group 1 were 1.1 (1.0–1.2), 1.3 (1.1–1.5), and 2.1 (1.4–3.1); and in validation group 2 they were 1.4 (1.1–1.7), 1.7 (1.3–2.2), and 3.4 (1.8–6.1), respectively. With an OSA prevalence less than 52%, the use of all three clinical prediction tools resulted in cost savings compared with the third-party approach. Conclusions: The routine requirement of an HST to diagnose OSA regardless of clinical probability is more costly compared with the use of OSA clinical prediction tools that identify patients who should undergo this procedure when OSA is expected to
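    The +LR values quoted above come from the standard 2×2 screening-test algebra: sensitivity divided by the false-positive rate. A minimal sketch of that calculation follows; the confusion-table counts are invented for illustration and are not data from this study.

    ```python
    def positive_lr(tp: int, fn: int, fp: int, tn: int) -> float:
        """Positive likelihood ratio: +LR = sensitivity / (1 - specificity)."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity / (1.0 - specificity)

    # Invented example: 80 true positives, 20 false negatives,
    # 30 false positives, 70 true negatives.
    lr = positive_lr(80, 20, 30, 70)
    print(round(lr, 2))  # sensitivity 0.8, specificity 0.7 -> +LR = 2.67
    ```

    A +LR above roughly 2-3, as reported for OSUNet, means a positive prediction meaningfully raises the post-test probability of OSA.
    
    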

  9. Accuracy of thoracolumbar transpedicular and vertebral body percutaneous screw placement: coupling the Rosa® Spine robot with intraoperative flat-panel CT guidance--a cadaver study.

    PubMed

    Lefranc, M; Peltier, J

    2015-12-01

    The primary objective of the present study was to evaluate the accuracy of a new robotic device when coupled with intraoperative flat-panel CT guidance. Screws (D8-S1) were implanted during two separate cadaver sessions by coupling the Rosa® Spine robot with the flat-panel CT device. Of 38 implanted screws, 37 (97.4%) were fully contained within the pedicle. One screw breached the lateral cortex of one pedicle by <1 mm. The mean ± SD accuracy (relative to preoperative planning) was 2.05 ± 1.2 mm for the screw head, 1.65 ± 1.11 mm for the middle of the pedicle and 1.57 ± 1.01 mm for the screw tip. When coupled with intraoperative flat-panel CT guidance, the Rosa® Spine robot appears to be accurate in placing pedicle screws within both the pedicle and the vertebral body. Large clinical studies are mandatory to confirm this preliminary cadaveric report.
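    Accuracy relative to pre-operative planning, as reported above, reduces to the Euclidean distance between a planned and an achieved screw landmark. A minimal sketch with invented coordinates (not measurements from this study):

    ```python
    import math

    def deviation_mm(planned: tuple, actual: tuple) -> float:
        """Euclidean distance (mm) between planned and achieved landmark positions."""
        return math.dist(planned, actual)

    # Invented example: planned vs achieved screw-tip coordinates in mm.
    planned_tip = (10.0, 20.0, 30.0)
    actual_tip = (11.0, 21.0, 30.5)
    print(deviation_mm(planned_tip, actual_tip))  # 1.5
    ```

    Repeating this per landmark (head, mid-pedicle, tip) over all screws and summarizing as mean ± SD yields figures of the kind reported.
    
    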

  10. Experimental and numerical study of the accuracy of flame-speed measurements for methane/air combustion in a slot burner

    SciTech Connect

    Selle, L.; Ferret, B.; Poinsot, T.

    2011-01-15

    Measuring the velocities of premixed laminar flames with precision remains a controversial issue in the combustion community. This paper studies the accuracy of such measurements in two-dimensional slot burners and shows that while methane/air flame speeds can be measured with reasonable accuracy, the method may lack precision for other mixtures such as hydrogen/air. Curvature at the flame tip, strain on the flame sides and local quenching at the flame base can modify local flame speeds and require corrections, which are studied using two-dimensional DNS. Numerical simulations also provide stretch, displacement and consumption flame speeds along the flame front. For methane/air flames, DNS shows that the local stretch remains small, so the local consumption speed is very close to the unstretched premixed flame speed. The only correction needed to predict flame speeds correctly in this case is due to the finite aspect ratio of the slot used to inject the premixed gases, which induces a flow acceleration in the measurement region (this correction can be evaluated from velocity measurements in the slot section or from an analytical solution). The method is applied to methane/air flames with and without water addition, and the results are compared with experimental data from the literature. The paper then discusses the limitations of the slot-burner method for measuring flame speeds of other mixtures and shows that it is not well suited to mixtures with a Lewis number far from unity, such as hydrogen/air flames.
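    The slot-burner measurement rests on the textbook angle method: for a V-shaped flame, the unburned-gas velocity component normal to the flame front equals the flame speed, S_L = U·sin(α), with α the flame half-angle. A minimal sketch, ignoring the curvature, strain and flow-acceleration corrections the paper studies; the numbers are invented for illustration:

    ```python
    import math

    def flame_speed_from_angle(u_bulk_m_s: float, half_angle_deg: float) -> float:
        """Laminar flame speed via the slot-burner angle method.

        S_L = U * sin(alpha): the component of the unburned-gas velocity
        normal to the flame front, for flame half-angle alpha.
        """
        return u_bulk_m_s * math.sin(math.radians(half_angle_deg))

    # Invented example: bulk velocity 2.0 m/s, flame half-angle 11 degrees.
    s_l = flame_speed_from_angle(2.0, 11.0)
    print(f"{s_l:.2f} m/s")  # ~0.38 m/s, of the order of stoichiometric methane/air
    ```

    The corrections discussed above adjust either U (flow acceleration from the finite slot aspect ratio) or the locally measured angle before applying this relation.
    
    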

  11. Accuracy assessment on the analysis of unbound drug in plasma by comparing traditional centrifugal ultrafiltration with hollow fiber centrifugal ultrafiltration and application in pharmacokinetic study.

    PubMed

    Zhang, Lin; Zhang, Zhi-Qing; Dong, Wei-Chong; Jing, Shao-Jun; Zhang, Jin-Feng; Jiang, Ye

    2013-11-29

    In the present study, the accuracy of measuring unbound drug in plasma was assessed by comparing traditional centrifugal ultrafiltration (CF-UF) with hollow fiber centrifugal ultrafiltration (HFCF-UF). We used metformin (MET) as a model drug and studied the influence of centrifugation time, plasma condition and the number of freeze-thaw cycles on the ultrafiltrate volume, and the related effect on the measurement of MET. Our results demonstrated that ultrafiltrate volume is a crucial factor influencing the measurement accuracy of unbound drug in plasma. With traditional CF-UF, the ultrafiltrate volume cannot be well controlled owing to a series of factors. In contrast, with HFCF-UF the ultrafiltrate volume can be easily controlled by the inner capacity of the U-shaped hollow fiber inserted into the sample, given sufficient centrifugal force and centrifugation time, which contributes to a more accurate measurement. Moreover, the HFCF-UF method was successfully applied to real plasma samples and exhibited several advantages, including high precision, an extremely low detection limit and perfect recovery. The HFCF-UF method is also simple and fast in pretreatment, characteristics consistent with the practical requirements of current scientific research.
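    The quantity whose accuracy the ultrafiltrate volume ultimately affects is the unbound (free) fraction of drug in plasma. A minimal sketch of that calculation, with invented concentrations that are not data from this study:

    ```python
    def unbound_fraction(ultrafiltrate_conc: float, total_plasma_conc: float) -> float:
        """Fraction unbound: fu = free (ultrafiltrate) concentration / total plasma concentration."""
        return ultrafiltrate_conc / total_plasma_conc

    # Invented example: 0.25 ug/mL measured in the ultrafiltrate,
    # 1.00 ug/mL total drug measured in plasma.
    fu = unbound_fraction(0.25, 1.00)
    print(fu)  # 0.25
    ```

    Any bias in the ultrafiltrate concentration, e.g. from an uncontrolled ultrafiltrate volume, propagates directly into fu, which is why controlling that volume matters.
    
    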

  12. Image-Guided Localization Accuracy of Stereoscopic Planar and Volumetric Imaging Methods for Stereotactic Radiation Surgery and Stereotactic Body Radiation Therapy: A Phantom Study

    SciTech Connect

    Kim, Jinkoo; Jin, Jian-Yue; Walls, Nicole; Nurushev, Teamour; Movsas, Benjamin; Chetty, Indrin J.; Ryu, Samuel

    2011-04-01

    Purpose: To evaluate the positioning accuracies of two image-guided localization systems, ExacTrac and On-Board Imager (OBI), in a stereotactic treatment unit. Methods and Materials: An anthropomorphic pelvis phantom with eight internal metal markers (BBs) was used. The center of one BB was set as the plan isocenter. The phantom was set up on a treatment table with various initial setup errors. Then, the errors were corrected using each of the investigated systems. The residual errors were measured with respect to the radiation isocenter using orthogonal portal images with a field size of 3 × 3 cm². The angular localization discrepancies of the two systems and the correction accuracy of the robotic couch were also studied. A pair of pre- and post-cone beam computed tomography (CBCT) images was acquired for each angular correction. Then, the correction errors were estimated by using the internal BBs through fiducial marker-based registrations. Results: The isocenter localization errors (μ ± σ) in the left/right, posterior/anterior, and superior/inferior directions were, respectively, −0.2 ± 0.2 mm, −0.8 ± 0.2 mm, and −0.8 ± 0.4 mm for ExacTrac, and 0.5 ± 0.7 mm, 0.6 ± 0.5 mm, and 0.0 ± 0.5 mm for OBI CBCT. The registration angular discrepancy was 0.1 ± 0.2° between the two systems, and the maximum angle correction error of the robotic couch was 0.2° about all axes. Conclusion: Both the ExacTrac and the OBI CBCT systems showed approximately 1 mm isocenter localization accuracy. The angular discrepancy of the two systems was minimal, and the robotic couch angle correction was accurate. These positioning uncertainties should be taken as a lower bound because the results were based on a rigid dosimetry phantom.
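    The μ ± σ figures reported above are per-axis summaries of residual offsets over repeated setups. A minimal sketch of that summary for one axis, with invented offsets chosen only to mirror the scale of the reported left/right ExacTrac result:

    ```python
    import statistics

    # Invented residual isocenter offsets (mm) along the left/right axis
    # after image-guided correction, one value per phantom setup.
    residual_lr_mm = [-0.2, -0.4, 0.0, -0.2]

    mu = statistics.mean(residual_lr_mm)      # mean residual error
    sigma = statistics.stdev(residual_lr_mm)  # sample standard deviation

    print(f"{mu:.1f} ± {sigma:.1f} mm")  # -0.2 ± 0.2 mm
    ```

    Repeating this per axis and per system produces the table of localization errors in the Results.
    
    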

  13. Study on the wear mechanism and tool life of coated gun drill

    NASA Astrophysics Data System (ADS)

    Wang, Yongguo; Yan, Xiangping; Chen, Xiaoguang; Sun, Changyu; Zhang, Xi

    2010-12-01

    A comprehensive investigation of the wear progression of a solid carbide gun drill coated with TiAlN when machining S48CSiV steel at a cutting speed of 12.66 m/s has been performed. Cutting torque was recorded and the tool wear mechanism was studied. The surface morp