Science.gov

Sample records for accuracy studies tool

  1. QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies.

    PubMed

    Whiting, Penny F; Rutjes, Anne W S; Westwood, Marie E; Mallett, Susan; Deeks, Jonathan J; Reitsma, Johannes B; Leeflang, Mariska M G; Sterne, Jonathan A C; Bossuyt, Patrick M M

    2011-10-18

    In 2003, the QUADAS tool for systematic reviews of diagnostic accuracy studies was developed. Experience, anecdotal reports, and feedback suggested areas for improvement; therefore, QUADAS-2 was developed. This tool comprises 4 domains: patient selection, index test, reference standard, and flow and timing. Each domain is assessed in terms of risk of bias, and the first 3 domains are also assessed in terms of concerns regarding applicability. Signalling questions are included to help judge risk of bias. The QUADAS-2 tool is applied in 4 phases: summarize the review question, tailor the tool and produce review-specific guidance, construct a flow diagram for the primary study, and judge bias and applicability. This tool will allow for more transparent rating of bias and applicability of primary diagnostic accuracy studies.

  2. Quality Assessment of Comparative Diagnostic Accuracy Studies: Our Experience Using a Modified Version of the QUADAS-2 Tool

    ERIC Educational Resources Information Center

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-01-01

    Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As…

  3. Quality assessment of comparative diagnostic accuracy studies: our experience using a modified version of the QUADAS-2 tool.

    PubMed

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-09-01

Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As part of an assessment that included comparative diagnostic accuracy studies, we used a modified version of QUADAS-2 to assess study quality. We modified QUADAS-2 by duplicating questions relating to the index test, to assess the relevant potential sources of bias for both the index test and comparator test. We also added review-specific questions. We have presented our modified version of QUADAS-2 and outlined some key issues for consideration when assessing the quality of comparative diagnostic accuracy studies, to help guide other systematic reviewers conducting comparative diagnostic reviews. Until QUADAS is updated to incorporate assessment of comparative studies, QUADAS-2 can be used, although modification and careful thought are required. It is important to reflect upon whether aspects of study design and methodology favour one of the tests over another.

  4. Accuracy study of the main screening tools for temporomandibular disorder in children and adolescents.

    PubMed

    de Santis, Tatiana Oliveira; Motta, Lara Jansiski; Biasotto-Gonzalez, Daniela Aparecida; Mesquita-Ferrari, Raquel Agnelli; Fernandes, Kristianne Porta Santos; de Godoy, Camila Haddad Leal; Alfaya, Thays Almeida; Bussadori, Sandra Kalil

    2014-01-01

The aims of the present study were to assess the degree of sensitivity and specificity of the screening questionnaire recommended by the American Academy of Orofacial Pain (AAOP) and the patient-history index proposed by Helkimo (modified by Fonseca) and correlate the findings with a clinical exam. All participants answered the questionnaires and were submitted to a clinical exam by a dentist who had undergone calibration training. Both the AAOP questionnaire and Helkimo index achieved low degrees of sensitivity for the detection of temporomandibular disorder (TMD), but exhibited a high degree of specificity. With regard to concordance, the AAOP questionnaire and Helkimo index both achieved low levels of agreement with the clinical exam. The TMD assessment instruments available in the literature and examined herein exhibit low sensitivity and high specificity when administered to children and adolescents, which likely stems from difficulties in comprehension related to the age group studied and the language used in the self-explanatory questions.

  5. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy

    PubMed Central

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867
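
    The precision benefit described above can be sketched with a toy conjugate model (this is not the paper's tree-mortality analysis; the Beta-Binomial setup and all counts are hypothetical): an informative prior tightens the posterior relative to a vague one.

```python
# Illustrative sketch: how an informative prior increases posterior precision.
# Assumes a Beta-Binomial model for a mortality probability; numbers invented.
import math

def beta_sd(a: float, b: float) -> float:
    """Standard deviation of a Beta(a, b) distribution."""
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return math.sqrt(var)

def posterior(a: float, b: float, deaths: int, survivors: int):
    """Conjugate update: Beta prior + binomial data -> Beta posterior."""
    return a + deaths, b + survivors

# Hypothetical data: 5 deaths among 50 trees.
vague = posterior(1, 1, 5, 45)          # flat Beta(1, 1) prior
informative = posterior(4, 36, 5, 45)   # prior centered near 10% mortality

# The informative prior yields a narrower (more precise) posterior.
print(beta_sd(*vague) > beta_sd(*informative))  # True
```

    Whether the informative posterior is also more *accurate* depends on whether the prior's center matches reality, which is exactly the trade-off the study above evaluates empirically.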

  6. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification

    PubMed Central

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes and the categories according to an unquestionable reference (or gold standard). However, a reference of suboptimal reliability is sometimes employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, is known. In this paper, a procedure is proposed for determining whether the reference should be treated as a gold standard or as an imperfect reference. This procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the availability of the results of several independent studies. These should be carried out using similar designs to provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data. PMID:24106484

  7. Groves model accuracy study

    NASA Astrophysics Data System (ADS)

    Peterson, Matthew C.

    1991-08-01

    The United States Air Force Environmental Technical Applications Center (USAFETAC) was tasked to review the scientific literature for studies of the Groves Neutral Density Climatology Model and compare the Groves Model with others in the 30-60 km range. The tasking included a request to investigate the merits of comparing accuracy of the Groves Model to rocketsonde data. USAFETAC analysts found the Groves Model to be state of the art for middle-atmospheric climatological models. In reviewing previous comparisons with other models and with space shuttle-derived atmospheric densities, good density vs altitude agreement was found in almost all cases. A simple technique involving comparison of the model with range reference atmospheres was found to be the most economical way to compare the Groves Model with rocketsonde data; an example of this type is provided. The Groves 85 Model is used routinely in USAFETAC's Improved Point Analysis Model (IPAM). To create this model, Dr. Gerald Vann Groves produced tabulations of atmospheric density based on data derived from satellite observations and modified by rocketsonde observations. Neutral Density as presented here refers to the monthly mean density in 10-degree latitude bands as a function of altitude. The Groves 85 Model zonal mean density tabulations are given in their entirety.

  8. Ambulance smartphone tool for field triage of ruptured aortic aneurysms (FILTR): study protocol for a prospective observational validation of diagnostic accuracy

    PubMed Central

    Lewis, Thomas L; Fothergill, Rachael T; Karthikesalingam, Alan

    2016-01-01

Introduction Rupture of an abdominal aortic aneurysm (rAAA) carries a considerable mortality rate and is often fatal. rAAA can be treated through open or endovascular surgical intervention and it is possible that more rapid access to definitive intervention might be a key aspect of improving mortality for rAAA. Diagnosis is not always straightforward with up to 42% of rAAA initially misdiagnosed, introducing potentially harmful delay. There is a need for an effective clinical decision support tool for accurate prehospital diagnosis and triage to enable transfer to an appropriate centre. Methods and analysis Prospective multicentre observational study assessing the diagnostic accuracy of a prehospital smartphone triage tool for detection of rAAA. The study will be conducted across London in conjunction with London Ambulance Service (LAS). A logistic score predicting the risk of rAAA by assessing ten key parameters was developed and retrospectively validated through logistic regression analysis of ambulance records and Hospital Episode Statistics data for 2200 patients from 2005 to 2010. The triage tool is integrated into a secure mobile app for major smartphone platforms. Key parameters collected from the app will be retrospectively matched with final hospital discharge diagnosis for each patient encounter. The primary outcome is to assess the sensitivity, specificity and positive predictive value of the rAAA triage tool logistic score in prospective use as a mobile app for prehospital ambulance clinicians. Data collection started in November 2014 and the study will recruit a minimum of 1150 non-consecutive patients over a time period of 2 years. Ethics and dissemination Full ethical approval has been gained for this study. The results of this study will be disseminated in peer-reviewed publications, and international/national presentations. Trial registration number CPMS 16459; pre-results. PMID:27797986
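
    The accuracy metrics named as the primary outcome above (sensitivity, specificity, positive predictive value) come straight from a 2 × 2 table of triage-tool calls against final discharge diagnosis. A minimal sketch, with invented counts:

```python
# Sensitivity, specificity, and PPV from a 2x2 confusion table.
# The counts below are hypothetical, for illustration only.
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # true cases the tool flagged
        "specificity": tn / (tn + fp),  # non-cases the tool cleared
        "ppv": tp / (tp + fp),          # flagged patients truly positive
    }

m = diagnostic_metrics(tp=45, fp=30, fn=5, tn=920)
print(round(m["sensitivity"], 2), round(m["specificity"], 3), round(m["ppv"], 2))
# 0.9 0.968 0.6
```

    Note how PPV lags behind sensitivity when the condition is rare: even a small false-positive rate dilutes the pool of flagged patients.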

  9. EOS mapping accuracy study

    NASA Technical Reports Server (NTRS)

    Forrest, R. B.; Eppes, T. A.; Ouellette, R. J.

    1973-01-01

    Studies were performed to evaluate various image positioning methods for possible use in the earth observatory satellite (EOS) program and other earth resource imaging satellite programs. The primary goal is the generation of geometrically corrected and registered images, positioned with respect to the earth's surface. The EOS sensors which were considered were the thematic mapper, the return beam vidicon camera, and the high resolution pointable imager. The image positioning methods evaluated consisted of various combinations of satellite data and ground control points. It was concluded that EOS attitude control system design must be considered as a part of the image positioning problem for EOS, along with image sensor design and ground image processing system design. Study results show that, with suitable efficiency for ground control point selection and matching activities during data processing, extensive reliance should be placed on use of ground control points for positioning the images obtained from EOS and similar programs.

  10. Accuracy of overlay measurements: tool and mark asymmetry effects

    NASA Astrophysics Data System (ADS)

    Coleman, Daniel J.; Larson, Patricia J.; Lopata, Alexander D.; Muth, William A.; Starikov, Alexander

    1990-06-01

Results of recent investigations uncovering significant errors in overlay (O/L) measurements are reported. The two major contributors are related to the failures of symmetry of the overlay measurement tool and of the mark. These may result in measurement errors on the order of 100 nm. Methodology based on the conscientious verification of assumptions of symmetry is shown to be effective in identifying the extent and sources of such errors. This methodology can be used to arrive at an estimate of the relative accuracy of the O/L measurements, even in the absence of certified O/L reference materials. Routes to improve the accuracy of O/L measurements are outlined and some examples of improvements are given. Errors in O/L measurements associated with the asymmetry of the metrology tool can be observed by comparing the O/L measurements taken at 0 and 180 degree orientations of the sample in reference to the tool. Half the difference of these measurements serves as an estimate of such tool related bias in estimating O/L. This is called tool induced shift (TIS). Errors of this kind can be traced to asymmetries of tool components, e.g., camera, illumination misalignment, residual asymmetric aberrations, etc. Tool asymmetry leads to biased O/L estimates even on symmetric O/L measurement marks. Its impact on TIS depends on the optical properties of the structure being measured, the measurement procedure and on the combination of tool and sample asymmetries. It is also a function of design and manufacture of the O/L metrology tool. In the absence of certified O/L samples, measurement accuracy and repeatability may be improved by demanding that TIS be small for all tools on all structures.
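
    The TIS estimate described above is a one-line computation. The sketch below follows the abstract's stated convention (half the difference of the 0° and 180° readings isolates the tool's bias, assuming the 180° reading is expressed in sample coordinates); the nm values are hypothetical.

```python
# Tool-induced shift (TIS) from paired 0/180-degree overlay readings.
def tool_induced_shift(ovl_0: float, ovl_180: float) -> float:
    """Half the 0/180-degree measurement difference: the tool's bias."""
    return (ovl_0 - ovl_180) / 2.0

def corrected_overlay(ovl_0: float, ovl_180: float) -> float:
    """Overlay estimate with the tool bias averaged out."""
    return (ovl_0 + ovl_180) / 2.0

# Hypothetical readings: 112 nm at 0 degrees, 96 nm at 180 degrees.
print(tool_induced_shift(112.0, 96.0))   # 8.0 (nm of tool bias)
print(corrected_overlay(112.0, 96.0))    # 104.0 (nm corrected overlay)
```

    Averaging the two orientations cancels the tool asymmetry term, which is why a small TIS across all structures is a practical proxy for accuracy when no certified reference exists.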

  11. Wind Prediction Accuracy for Air Traffic Management Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Cole, Rod; Green, Steve; Jardin, Matt; Schwartz, Barry; Benjamin, Stan

    2000-01-01

The performance of Air Traffic Management and flight deck decision support tools depends in large part on the accuracy of the supporting 4D trajectory predictions. This is particularly relevant to conflict prediction and active advisories for the resolution of conflicts and the conformance with traffic-flow management flow-rate constraints (e.g., arrival metering / required time of arrival). Flight test results have indicated that wind prediction errors may represent the largest source of trajectory prediction error. The tests also discovered that relatively large errors (e.g., greater than 20 knots), existing in pockets of space and time critical to ATM DST performance (one or more sectors, greater than 20 minutes), are inadequately represented by the classic RMS aggregate prediction-accuracy studies of the past. To facilitate the identification and reduction of DST-critical wind-prediction errors, NASA has led a collaborative research and development activity with MIT Lincoln Laboratory and the Forecast Systems Lab of the National Oceanic and Atmospheric Administration (NOAA). This activity, begun in 1996, has focused on the development of key metrics for ATM DST performance, assessment of wind-prediction skill for state of the art systems, and development/validation of system enhancements to improve skill. A 13 month study was conducted for the Denver Center airspace in 1997. Two complementary wind-prediction systems were analyzed and compared to the forecast performance of the then standard 60 km Rapid Update Cycle - version 1 (RUC-1). One system, developed by NOAA, was the prototype 40-km RUC-2 that became operational at NCEP in 1999. RUC-2 introduced a faster cycle (1 hr vs. 3 hr) and improved mesoscale physics. The second system, Augmented Winds (AW), is a prototype en route wind application developed by MITLL based on the Integrated Terminal Wind System (ITWS). AW is run at a local facility (Center) level, and updates RUC predictions based on an

  12. The role of sensors in the accuracy of machine tools

    SciTech Connect

    McClure, E.R.

    1988-07-26

    Accuracy of machine tools is impossible without the assistance of sensors. The original manufacturers employed human senses, especially touch and sight, to enable the human brain to control manufacturing processes. Gradually, manufacturers found artificial means to overcome the limitations of human senses. More recently, manufacturers began to employ artificial means to overcome the limitations of the human brain to effect control of manufacturing processes. The resultant array of sensors and computers, coupled with artificial means to overcome the limitations of human skeletons and muscles is embodied in modern machine tools. The evolution continues, resulting in increasing human capacity to create and replicate products. Machine tools are used to make products, are assembled with products and are products themselves. Consequently, sensors play a role in both the manufacture and the use of machine tools. In order to fully manage the design, manufacture and operation of precise and accurate machine tools, engineers must examine and understand the nature of sources of errors and imperfections. Many errors are not directly measurable, e.g., thermal effects. Consequently, control of such errors requires that engineers base the selection and use of sensors on an understanding of the underlying cause and effect relationship. 15 refs., 4 figs.

  13. Comparison of Dimensional Accuracies Using Two Elastomeric Impression Materials in Casting Three-dimensional Tool Marks.

    PubMed

    Wang, Zhen

    2016-05-01

The purpose of this study was to evaluate two types of impression materials which were frequently used for casting three-dimensional tool marks in China, namely (i) dental impression material and (ii) special elastomeric impression material for tool mark casting. The two different elastomeric impression materials were compared under equal conditions. The parameters measured were dimensional accuracies, the number of air bubbles, the ease of use, and the sharpness and quality of the individual characteristics present on casts. The results showed that dental impression material had the advantage over special elastomeric impression material in casting tool marks at crime scenes, combining ease of use, dimensional accuracy, sharpness, and high quality.

  14. Astronomic Position Accuracy Capability Study.

    DTIC Science & Technology

    1979-10-01

portion of F. E. Warren AFB, Wyoming. The three points were called THEODORE ECC, TRACY, and JIM and consisted of metal tribrachs plastered to cinder...sets were computed as a deviation from the standard. Accuracy figures were determined from these residuals. Homogeneity of variances was tested using

  15. Machine tool accuracy characterization workshops. Final report, May 5, 1992--November 5, 1993

    SciTech Connect

    1995-01-06

The ability to assess the accuracy of machine tools is required by both tool builders and users. Builders must have this ability in order to predict the accuracy capability of a machine tool for different part geometries, to provide verifiable accuracy information for sales purposes, and to locate error sources for maintenance, troubleshooting, and design enhancement. Users require the same ability in order to make intelligent choices in selecting or procuring machine tools, to predict component manufacturing accuracy, and to perform maintenance and troubleshooting. In both instances, the ability to fully evaluate the accuracy capabilities of a machine tool and the source of its limitations is essential for using the tool to its maximum accuracy and productivity potential. This project was designed to transfer expertise in modern machine tool accuracy testing methods from LLNL to US industry, and to educate users on the use and application of emerging standards for machine tool performance testing.

  16. The Accuracy of IOS Device-based uHear as a Screening Tool for Hearing Loss: A Preliminary Study From the Middle East

    PubMed Central

    Al-Abri, Rashid; Al-Balushi, Mustafa; Kolethekkat, Arif; Bhargava, Deepa; Al-Alwi, Amna; Al-Bahlani, Hana; Al-Garadi, Manal

    2016-01-01

    Objectives To determine and explore the potential use of uHear as a screening test for determining hearing disability by evaluating its accuracy in a clinical setting and a soundproof booth when compared to the gold standard conventional audiometry.   Methods Seventy Sultan Qaboos University students above the age of 17 years who had normal hearing were recruited for the study. They underwent a hearing test using conventional audiometry in a soundproof room, a self-administered uHear evaluation in a side room resembling a clinic setting, and a self-administered uHear test in a soundproof booth. The mean pure tone average (PTA) of thresholds at 500, 1000, 2000 and 4000 Hz for all the three test modalities was calculated, compared, and analyzed statistically.   Results There were 36 male and 34 female students in the study. The PTA with conventional audiometry ranged from 1 to 21 dB across left and right ears. The PTA using uHear in the side room for the same participants was 25 dB in the right ear and 28 dB in the left ear (3–54 dB across all ears). The PTA for uHear in the soundproof booth was 18 dB and 17 dB (1–43 dB) in the right and left ears, respectively. Twenty-three percent of participants were reported to have a mild hearing impairment (PTA > 25 dB) using the soundproof uHear test, and this number was 64% for the same test in the side room. For the same group, only 3% of participants were reported to have a moderate hearing impairment (PTA > 40 dB) using the uHear test in a soundproof booth, and 13% in the side room.   Conclusion uHear in any setting lacks specificity in the range of normal hearing and is highly unreliable in giving the exact hearing threshold in clinical settings. However, there is a potential for the use of uHear if it is used to rule out moderate hearing loss, even in a clinical setting, as exemplified by our study. This method needs standardization through further research. PMID:27168926
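
    The pure tone average (PTA) used throughout the comparison above is simply the mean threshold over four standard frequencies, with the abstract's cutoffs (mild: PTA > 25 dB; moderate: PTA > 40 dB) applied to classify impairment. A sketch with hypothetical thresholds:

```python
# Pure tone average (PTA) and the impairment cutoffs used in the abstract.
# Threshold values below are invented for illustration.
PTA_FREQS_HZ = (500, 1000, 2000, 4000)

def pure_tone_average(thresholds_db: dict) -> float:
    """Mean threshold (dB) over the four standard PTA frequencies."""
    return sum(thresholds_db[f] for f in PTA_FREQS_HZ) / len(PTA_FREQS_HZ)

def classify(pta_db: float) -> str:
    if pta_db > 40:
        return "moderate impairment"
    if pta_db > 25:
        return "mild impairment"
    return "normal"

pta = pure_tone_average({500: 20, 1000: 25, 2000: 35, 4000: 40})
print(pta, classify(pta))  # 30.0 mild impairment
```

    A side-room uHear PTA of 28 dB versus a booth PTA of 17 dB for the same ears, as reported above, is exactly the kind of ambient-noise inflation that pushes normal-hearing participants over the 25 dB "mild" cutoff.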

  17. Application of a Monte Carlo accuracy assessment tool to TDRS and GPS

    NASA Technical Reports Server (NTRS)

    Pavloff, Michael S.

    1994-01-01

In support of a NASA study on the application of radio interferometry to satellite orbit determination, MITRE developed a simulation tool for assessing interferometric tracking accuracy. Initially, the tool was applied to the problem of determining optimal interferometric station siting for orbit determination of the Tracking and Data Relay Satellite (TDRS). Subsequently, the Orbit Determination Accuracy Estimator (ODAE) was expanded to model the general batch maximum likelihood orbit determination algorithms of the Goddard Trajectory Determination System (GTDS) with measurement types including not only group and phase delay from radio interferometry, but also range, range rate, angular measurements, and satellite-to-satellite measurements. The user of ODAE specifies the statistical properties of error sources, including inherent observable imprecision, atmospheric delays, station location uncertainty, and measurement biases. Upon Monte Carlo simulation of the orbit determination process, ODAE calculates the statistical properties of the error in the satellite state vector and any other parameters for which a solution was obtained in the orbit determination. This paper presents results from ODAE application to two different problems: (1) determination of optimal geometry for interferometric tracking of TDRS, and (2) expected orbit determination accuracy for Global Positioning System (GPS) tracking of low-earth orbit (LEO) satellites. Conclusions about optimal ground station locations for TDRS orbit determination by radio interferometry are presented, and the feasibility of GPS-based tracking for IRIDIUM, a LEO mobile satellite communications (MOBILSATCOM) system, is demonstrated.
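
    The Monte Carlo idea underlying ODAE is easy to sketch in miniature. The toy below is not ODAE's algorithm: it estimates a single hypothetical parameter from noisy measurements over many trials and reports the spread of the resulting estimation error; the model and all noise figures are invented.

```python
# Monte Carlo error statistics for a toy estimator: perturb measurements,
# re-estimate, and measure the spread of the estimation error.
import random
import statistics

def monte_carlo_error_std(true_value: float, sigma: float,
                          n_meas: int, n_trials: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    errors = []
    for _ in range(n_trials):
        # Noisy measurements of the same quantity; estimator is the mean.
        meas = [true_value + rng.gauss(0.0, sigma) for _ in range(n_meas)]
        errors.append(sum(meas) / n_meas - true_value)
    return statistics.stdev(errors)

# Theory predicts an error std of sigma / sqrt(n_meas) = 0.1 here.
print(round(monte_carlo_error_std(42.0, 1.0, 100, 2000), 2))
```

    ODAE does the same thing at full scale: each trial perturbs every specified error source, reruns the batch orbit solution, and the ensemble of state-vector errors yields the accuracy statistics.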

  18. Quantifying the prediction accuracy of a 1-D SVAT model at a range of ecosystems in the USA and Australia: evidence towards its use as a tool to study Earth's system interactions

    NASA Astrophysics Data System (ADS)

    Petropoulos, G. P.; North, M. R.; Ireland, G.; Srivastava, P. K.; Rendall, D. V.

    2015-10-01

This paper describes the validation of the SimSphere SVAT (Soil-Vegetation-Atmosphere Transfer) model conducted at a range of US and Australian ecosystem types. Specific focus was given to examining the model's ability to predict shortwave incoming solar radiation (Rg), net radiation (Rnet), latent heat (LE), sensible heat (H), air temperature at 1.3 m (Tair 1.3 m) and air temperature at 50 m (Tair 50 m). Model predictions were compared against corresponding in situ measurements acquired for a total of 72 selected days of the year 2011 obtained from eight sites belonging to the AmeriFlux (USA) and OzFlux (Australia) monitoring networks. Selected sites were representative of a variety of environmental, biome and climatic conditions, to allow for the inclusion of contrasting conditions in the model evaluation. Overall, results showed a good agreement between the model predictions and the in situ measurements, particularly so for the Rg, Rnet, Tair 1.3 m and Tair 50 m parameters. The simulated Rg parameter exhibited a root mean square deviation (RMSD) within 25 % of the observed fluxes for 58 of the 72 selected days, whereas an RMSD within ~ 24 % of the observed fluxes was reported for the Rnet parameter for all days of study (RMSD = 58.69 W m-2). A systematic underestimation of Rg and Rnet (mean bias error (MBE) = -19.48 and -16.46 W m-2) was also found. Simulations for the Tair 1.3 m and Tair 50 m showed good agreement with the in situ observations, exhibiting RMSDs of 3.23 and 3.77 °C (within ~ 15 and ~ 18 % of the observed) for all days of analysis, respectively. Comparable, yet slightly less satisfactory simulation accuracies were exhibited for the H and LE parameters (RMSDs = 38.47 and 55.06 W m-2, ~ 34 and ~ 28 % of the observed). Highest simulation accuracies were obtained for the open woodland savannah and mulga woodland sites for most of the compared parameters. 
The Nash-Sutcliffe efficiency index for all parameters ranges from 0.720 to 0.998, suggesting
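
    The three agreement statistics quoted above (RMSD, MBE, and the Nash-Sutcliffe efficiency) can be sketched from paired simulated and observed values; the sample numbers below are invented.

```python
# RMSD, mean bias error, and Nash-Sutcliffe efficiency for paired series.
import math

def rmsd(sim, obs):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

def mbe(sim, obs):
    """Mean bias error; negative indicates systematic underestimation."""
    return sum(s - o for s, o in zip(sim, obs)) / len(obs)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 no better than obs mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

obs = [100.0, 200.0, 300.0, 400.0]
sim = [90.0, 195.0, 290.0, 405.0]
print(round(rmsd(sim, obs), 2), round(mbe(sim, obs), 2), round(nse(sim, obs), 3))
# 7.91 -5.0 0.995
```

    The combination reported in the abstract (negative MBE alongside a high NSE) is exactly this pattern: a small systematic offset that barely dents overall explanatory skill.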

  19. Accuracy and Resolution in Micro-earthquake Tomographic Inversion Studies

    NASA Astrophysics Data System (ADS)

    Hutchings, L. J.; Ryan, J.

    2010-12-01

Accuracy and resolution are complementary properties necessary to interpret the results of earthquake location and tomography studies. Accuracy is how close an answer is to the “real world”, and resolution is how small a node spacing or earthquake error ellipse one can achieve. We have modified SimulPS (Thurber, 1986) in several ways to provide a tool for evaluating accuracy and resolution of potential micro-earthquake networks. First, we provide synthetic travel times from synthetic three-dimensional geologic models and earthquake locations. We use this to calculate errors in earthquake location and velocity inversion results when we perturb these models and try to invert to obtain these models. We create as many stations as desired and can create a synthetic velocity model with any desired node spacing. We apply this study to SimulPS and TomoDD inversion studies. “Real” travel times are perturbed with noise and hypocenters are perturbed to replicate a starting location away from the “true” location, and inversion is performed by each program. We establish travel times with the pseudo-bending ray tracer and use the same ray tracer in the inversion codes. This, of course, limits our ability to test the accuracy of the ray tracer. We developed relationships for the accuracy and resolution expected as a function of the number of earthquakes and recording stations for typical tomographic inversion studies. Velocity grid spacing started at 1 km, then was decreased to 500 m, 100 m, 50 m and finally 10 m to see if resolution with decent accuracy at that scale was possible. We considered accuracy to be good when we could invert a velocity model perturbed by 50% back to within 5% of the original model, and resolution to be the size of the grid spacing. We found that 100 m resolution could be obtained by using 120 stations with 500 events, but this is our current limit. The limiting factors are the size of computers needed for the large arrays in the inversion and a

  20. Intellijoint HIP®: a 3D mini-optical navigation tool for improving intraoperative accuracy during total hip arthroplasty

    PubMed Central

    Paprosky, Wayne G; Muir, Jeffrey M

    2016-01-01

    Total hip arthroplasty is an increasingly common procedure used to address degenerative changes in the hip joint due to osteoarthritis. Although generally associated with good results, among the challenges associated with hip arthroplasty are accurate measurement of biomechanical parameters such as leg length, offset, and cup position, discrepancies of which can lead to significant long-term consequences such as pain, instability, neurological deficits, dislocation, and revision surgery, as well as patient dissatisfaction and, increasingly, litigation. Current methods of managing these parameters are limited, with manual methods such as outriggers or calipers being used to monitor leg length; however, these are susceptible to small intraoperative changes in patient position and are therefore inaccurate. Computer-assisted navigation, while offering improved accuracy, is expensive and cumbersome, in addition to adding significantly to procedural time. To address the technological gap in hip arthroplasty, a new intraoperative navigation tool (Intellijoint HIP®) has been developed. This innovative, 3D mini-optical navigation tool provides real-time, intraoperative data on leg length, offset, and cup position and allows for improved accuracy and precision in component selection and alignment. Benchtop and simulated clinical use testing have demonstrated excellent accuracy, with the navigation tool able to measure leg length and offset to within <1 mm and cup position to within <1° in both anteversion and inclination. This study describes the indications, procedural technique, and early accuracy results of the Intellijoint HIP surgical tool, which offers an accurate and easy-to-use option for hip surgeons to manage leg length, offset, and cup position intraoperatively. PMID:27920583

  1. The Diagnostic Accuracy of Screening Tools to Detect Eating Disorders in Female Athletes.

    PubMed

    Wagner, Alyssa J; Erickson, Casey D; Tierney, Dayna K; Houston, Megan N; Bacon, Cailee E Welch

    2016-12-01

    Clinical Scenario: Eating disorders in female athletes are a commonly underdiagnosed condition. Better screening tools for eating disorders in athletic females could help increase diagnosis and help athletes get the treatment they need. Focused Clinical Question: Should screening tools be used to detect eating disorders in female athletes? Summary of Key Findings: The literature was searched for studies that included information regarding the sensitivity and specificity of screening tools for eating disorders in female athletes. The search returned 5 possible articles related to the clinical question; 3 studies met the inclusion criteria (2 cross-sectional studies, 1 cohort study) and were included. All 3 studies reported sensitivity and specificity for the Athletic Milieu Direct Questionnaire version 2, the Brief Eating Disorder in Athletes Questionnaire version 2, and the Physiologic Screening Test to Detect Eating Disorders Among Female Athletes. All 3 studies found that the respective screening tool was able to accurately identify female athletes with eating disorders; however, the screening tools varied in sensitivity and specificity values. Clinical Bottom Line: There is strong evidence to support the use of screening tools to detect eating disorders in female athletes. Screening tools with higher sensitivity and specificity have demonstrated a successful outcome of determining athletes with eating disorders or at risk for developing an eating disorder. Strength of Recommendation: There is grade A evidence available to demonstrate that screening tools accurately detect female athletes at risk for eating disorders.

  2. Accuracy of quick and easy undernutrition screening tools--Short Nutritional Assessment Questionnaire, Malnutrition Universal Screening Tool, and modified Malnutrition Universal Screening Tool--in patients undergoing cardiac surgery.

    PubMed

    van Venrooij, Lenny M W; van Leeuwen, Paul A M; Hopmans, Wendy; Borgmeijer-Hoelen, Mieke M M J; de Vos, Rien; De Mol, Bas A J M

    2011-12-01

    The objective of this study was to compare two quick-and-easy undernutrition screening tools, i.e., the Short Nutritional Assessment Questionnaire and the Malnutrition Universal Screening Tool, in patients undergoing cardiac surgery with respect to their accuracy in detecting undernutrition, measured by a low fat-free mass index (FFMI; calculated as kg/m(2)), and, secondly, to assess their association with postoperative adverse outcomes. Between February 2008 and December 2009, a single-center observational cohort study was performed (n=325). A low FFMI was set at ≤14.6 in women and ≤16.7 in men, measured using bioelectrical impedance spectroscopy. To compare the accuracy of the Malnutrition Universal Screening Tool and the Short Nutritional Assessment Questionnaire in detecting a low FFMI, sensitivity, specificity, and other accuracy test characteristics were calculated. The associations between the Malnutrition Universal Screening Tool and Short Nutritional Assessment Questionnaire and adverse outcomes were analyzed using logistic regression, with odds ratios and 95% confidence intervals (CI) presented. Sensitivity and receiver operating characteristic-based area under the curve for detecting a low FFMI were 59% and 19%, and 0.71 (95% CI: 0.60 to 0.82) and 0.56 (95% CI: 0.44 to 0.68), for the Malnutrition Universal Screening Tool and Short Nutritional Assessment Questionnaire, respectively. Accuracy of the Malnutrition Universal Screening Tool improved when age and sex were added to the nutritional screening process (sensitivity 74%, area under the curve: 0.72 [95% CI: 0.62 to 0.82]). This modified version of the Malnutrition Universal Screening Tool, but not the original Malnutrition Universal Screening Tool or Short Nutritional Assessment Questionnaire, was associated with prolonged intensive care unit and hospital stay (odds ratio: 2.1, 95% CI: 1.3 to 3.4; odds ratio: 1.6, 95% CI: 1.0 to 2.7). The accuracy to detect a low FFMI was considerably higher for the Malnutrition
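The sensitivity, specificity, and area-under-the-curve figures reported in abstracts like this one can be illustrated with a minimal sketch (ours, not the study's code; function names and all data are invented for illustration):

```python
# Hedged sketch: computing sensitivity, specificity, and a rank-based
# ROC AUC for a screening tool against a binary reference standard
# (e.g. a low FFMI). All data below are invented.

def sensitivity_specificity(screen, reference):
    """screen/reference: parallel lists of 0/1 results per patient."""
    tp = sum(1 for s, r in zip(screen, reference) if s and r)
    fn = sum(1 for s, r in zip(screen, reference) if not s and r)
    tn = sum(1 for s, r in zip(screen, reference) if not s and not r)
    fp = sum(1 for s, r in zip(screen, reference) if s and not r)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(scores, reference):
    """AUC via the Mann-Whitney statistic: the probability that a random
    true positive scores higher than a random true negative."""
    pos = [s for s, r in zip(scores, reference) if r]
    neg = [s for s, r in zip(scores, reference) if not r]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

screen = [1, 1, 0, 0, 1, 0, 0, 1]
ref    = [1, 1, 1, 0, 0, 0, 0, 1]
se, sp = sensitivity_specificity(screen, ref)                   # 0.75, 0.75
auc = roc_auc([0.9, 0.8, 0.4, 0.2, 0.7, 0.1, 0.3, 0.95], ref)  # 0.9375
```

The rank-based AUC used here is equivalent to integrating the ROC curve over all thresholds, which is why abstracts can report AUC for a scale without fixing a single cut-point.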

  3. NREL Evaluates Thermal Performance of Uninsulated Walls to Improve Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-03-01

    NREL researchers discover ways to increase accuracy in building energy simulation tools to improve predictions of potential energy savings in homes. Uninsulated walls are typical in older U.S. homes where the wall cavities were not insulated during construction or where the insulating material has settled. Researchers at the National Renewable Energy Laboratory (NREL) are investigating ways to more accurately calculate heat transfer through building enclosures to verify the benefit of energy efficiency upgrades that reduce energy use in older homes. In this study, scientists used computational fluid dynamics (CFD) analysis to calculate the energy loss/gain through building walls and visualize different heat transfer regimes within the uninsulated cavities. The effects of ambient outdoor temperature, the radiative properties of building materials, insulation levels, and the temperature dependence of conduction through framing members were considered. The research showed that the temperature dependence of conduction through framing members dominated the differences between this study and previous results - an effect not accounted for in existing building energy simulation tools. The study provides correlations for the resistance of the uninsulated assemblies that can be implemented into building simulation tools to increase the accuracy of energy use estimates in older homes, which are currently over-predicted.

  4. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    SciTech Connect

    Debono, Josephine C; Poulos, Ann E

    2015-03-15

    The aim of this study was to first evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies and a quality evaluation tool constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  5. Effect of force feedback from each DOF on the motion accuracy of a surgical tool in performing a robot-assisted tracing task.

    PubMed

    Samad, Manar D; Hu, Yaoping; Sutherland, Garnette R

    2010-01-01

    In robot-assisted surgery, it may be important to provide force feedback to the hand of the surgeon. Here we examine how force feedback from each degree of freedom (DOF) on a hand controller affects the motion accuracy of a surgical tool. We studied the motion accuracy of a needle-shaped tool in performing a robot-assisted tracing task. On a virtual simulation of the tool and neuroArm robot, human participants manipulated a hand controller to move the tool attached to the end-effector of the robot. They used the tool to trace a line on pipes (mimicking blood vessels) along 3 orthogonal directions, corresponding to 3 DOF on the hand controller. We observed that force feedback from each DOF on the hand controller had a significant effect on the motion accuracy of the tool during tracing. Varying the force conditions, however, yielded no significant difference in motion accuracy. These results indicate a need to revise the hand controller to achieve improved motion accuracy in performing robot-assisted tasks.

  6. Technical Highlight: NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools

    SciTech Connect

    Ridouane, E.H.

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes.

  7. Accuracy of Depression Screening Tools for Identifying Postpartum Depression Among Urban Mothers

    PubMed Central

    Chaudron, Linda H.; Szilagyi, Peter G.; Tang, Wan; Anson, Elizabeth; Talbot, Nancy L.; Wadkins, Holly I.M.; Tu, Xin; Wisner, Katherine L.

    2011-01-01

    Objective The goal was to describe the accuracy of the Edinburgh Postnatal Depression Scale (EPDS), Beck Depression Inventory II (BDI-II), and Postpartum Depression Screening Scale (PDSS) in identifying major depressive disorder (MDD) or minor depressive disorder (MnDD) in low-income, urban mothers attending well-child care (WCC) visits during the postpartum year. Design/Methods Mothers (N=198) attending WCC visits with their infants 0 to 14 months of age completed a psychiatric diagnostic interview (standard method) and 3 screening tools. The sensitivity and specificity of each screening tool were calculated in comparison with diagnoses of MDD or MDD/MnDD. Receiver operating characteristic curves were calculated and the areas under the curves for each tool were compared to assess accuracy for the entire sample (representing the postpartum year) and sub-samples (representing early, middle and late postpartum time frames). Optimal cut-points were calculated. Results At some point between 2 weeks and 14 months postpartum, 56% of mothers met criteria for either MDD (37%) or MnDD (19%). When used as continuous measures, all scales performed equally well (areas under the curves of ≥ 0.8). With traditional cut-points, the measures did not perform at the expected levels of sensitivity and specificity. Optimal cut-points for the BDI-II (≥14 for MDD, ≥11 for MDD/MnDD) and EPDS (≥9 for MDD, ≥7 for MDD/MnDD) were lower than currently recommended. For the PDSS, the optimal cut-point was consistent with current guidelines for MDD (≥80) but higher than recommended for MDD/MnDD (≥ 77). Conclusions Large proportions of low-income, urban mothers attending WCC visits experience MDD or MnDD during the postpartum year. The EPDS, BDI-II and PDSS have high accuracy in identifying depression but cutoff points may need to be altered to more accurately identify depression in urban, low-income mothers. PMID:20156899
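One common way an "optimal cut-point" for a depression scale is derived (a plausible sketch, not necessarily this study's exact analysis) is to scan candidate thresholds and maximize Youden's J = sensitivity + specificity − 1. All scores and diagnoses below are invented:

```python
# Hedged sketch: choosing a screening-scale cut-point by maximizing
# Youden's J statistic. Assumes both diagnosed and non-diagnosed
# subjects are present in the data.

def youden_optimal_cutpoint(scores, diagnosed):
    """scores: scale totals; diagnosed: 0/1 interview diagnoses.
    A subject screens positive when score >= cut. Returns (cut, J)."""
    best_cut, best_j = None, float("-inf")
    for cut in sorted(set(scores)):
        tp = sum(1 for s, d in zip(scores, diagnosed) if s >= cut and d)
        fn = sum(1 for s, d in zip(scores, diagnosed) if s < cut and d)
        tn = sum(1 for s, d in zip(scores, diagnosed) if s < cut and not d)
        fp = sum(1 for s, d in zip(scores, diagnosed) if s >= cut and not d)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

scores    = [12, 9, 15, 5, 8, 14, 4, 11]
diagnosed = [1, 0, 1, 0, 0, 1, 0, 1]
cut, j = youden_optimal_cutpoint(scores, diagnosed)   # cut 11, J = 1.0
```

With real, overlapping score distributions J is well below 1, and lowering the cut trades specificity for sensitivity, which is exactly the trade-off behind the lower-than-recommended cut-points reported above.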

  8. Expansion/De-expansion Tool to Quantify the Accuracy of Prostate Contours

    SciTech Connect

    Chung, Eugene; Stenmark, Matthew H.; Evans, Cheryl; Narayana, Vrinda; McLaughlin, Patrick W.

    2012-05-01

    Purpose: Accurate delineation of the prostate gland on computed tomography (CT) remains a persistent challenge and continues to introduce geometric uncertainty into the planning and delivery of external beam radiotherapy. We, therefore, developed an expansion/de-expansion tool to quantify the contour errors and determine the location of the deviations. Methods and Materials: A planning CT scan and magnetic resonance imaging scan were prospectively acquired for 10 patients with prostate cancer. The prostate glands were contoured by 3 independent observers using the CT data sets with instructions to contour the prostate without underestimation but to minimize overestimation. The standard prostate for each patient was defined using magnetic resonance imaging and CT on multiple planes. After registration of the CT and magnetic resonance imaging data sets, the CT-defined prostates were scored for accuracy. The contours were defined as ideal if they were within a 2.5-mm expansion of the standard without underestimation, acceptable if they were within a 5.0-mm expansion and a 2.5-mm de-expansion, and unacceptable if they extended >5.0 mm or underestimated the prostate by >2.5 mm. Results: A total of 636 CT slices were individually analyzed, with the vast majority scored as ideal or acceptable. However, none of the 30 prostate contour sets had all the contours scored as ideal or acceptable. For all 3 observers, the unacceptable contours were more likely from underestimation than overestimation of the prostate. The errors were more common at the base and apex than the mid-gland. Conclusions: The expansion/de-expansion tool allows for directed feedback on the location of contour deviations, as well as the determination of over- or underestimation of the prostate. This metric might help improve the accuracy of prostate contours.
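The slice-level scoring rule described above can be sketched as a small classifier (our reading of the stated criteria; the function name and the over/under-estimation inputs are our own framing):

```python
# Hedged sketch of the abstract's expansion/de-expansion scoring rule,
# applied per CT slice given the maximum over- and under-estimation (mm)
# of the contour relative to the MRI/CT-defined standard prostate.

def score_slice(max_over_mm, max_under_mm):
    # Ideal: within a 2.5-mm expansion of the standard, no underestimation.
    if max_over_mm <= 2.5 and max_under_mm == 0:
        return "ideal"
    # Acceptable: within a 5.0-mm expansion and a 2.5-mm de-expansion.
    if max_over_mm <= 5.0 and max_under_mm <= 2.5:
        return "acceptable"
    # Unacceptable: extends > 5.0 mm or underestimates by > 2.5 mm.
    return "unacceptable"
```

For example, a contour that overestimates by 4 mm but underestimates by at most 1 mm would score "acceptable", while any underestimation beyond 2.5 mm is "unacceptable", matching the authors' emphasis that underestimation was the dominant error.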

  9. NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes. Researchers at the National Renewable Energy Laboratory (NREL) have developed models for evaluating the thermal performance of walls in existing homes that will improve the accuracy of building energy simulation tools when predicting potential energy savings of existing homes. Uninsulated walls are typical in older homes where the wall cavities were not insulated during construction or where the insulating material has settled. Accurate calculation of heat transfer through building enclosures will help determine the benefit of energy efficiency upgrades in order to reduce energy consumption in older American homes. NREL performed detailed computational fluid dynamics (CFD) analysis to quantify the energy loss/gain through the walls and to visualize different airflow regimes within the uninsulated cavities. The effects of ambient outdoor temperature, radiative properties of building materials, and insulation level were investigated. The study showed that multi-dimensional airflows occur in walls with uninsulated cavities and that the thermal resistance is a function of the outdoor temperature - an effect not accounted for in existing building energy simulation tools. The study quantified the difference between CFD prediction and the approach currently used in building energy simulation tools over a wide range of conditions. For example, researchers found that CFD predicted lower heating loads and slightly higher cooling loads. Implementation of CFD results into building energy simulation tools such as DOE2 and EnergyPlus will likely reduce the predicted heating load of homes. Researchers also determined that a small air gap in a partially insulated cavity can lead to a significant reduction in thermal resistance. For instance, a 4-in. tall air gap

  10. Air traffic control surveillance accuracy and update rate study

    NASA Technical Reports Server (NTRS)

    Craigie, J. H.; Morrison, D. D.; Zipper, I.

    1973-01-01

    The results of an air traffic control surveillance accuracy and update rate study are presented. The objective of the study was to establish quantitative relationships between the surveillance accuracies, update rates, and the communication load associated with the tactical control of aircraft for conflict resolution. The relationships are established for typical types of aircraft, phases of flight, and types of airspace. Specific cases are analyzed to determine the surveillance accuracies and update rates required to prevent two aircraft from approaching each other too closely.

  11. On the accuracy of Hipparcos using binary stars as a calibration tool

    SciTech Connect

    Docobo, J. A.; Andrade, M. E-mail: manuel.andrade@usc.es

    2015-02-01

    Stellar binary systems, specifically those that present the most accurate available orbital elements, are a reliable tool to test the accuracy of astrometric observations. We selected all 35 binaries with these characteristics. Our objective is to provide standard uncertainties for the positions and parallaxes measured by Hipparcos relative to this trustworthy set, as well as to check supposed correlations between several parameters (measurement residuals, positions, magnitudes, and parallaxes). In addition, using the high-confidence subset of visual–spectroscopic binaries, we implemented a validation test of the Hipparcos trigonometric parallaxes of binary systems that allowed the evaluation of their reliability. Standard and non-standard statistical analysis techniques were applied in order to reach well-founded conclusions. In particular, errors-in-variables models such as the total least-squares method were used to validate Hipparcos parallaxes by comparison with those obtained directly from the orbital elements. Beforehand, we applied Thompson's τ technique in order to detect suspected outliers in the data. Furthermore, several statistical hypothesis tests were carried out to verify whether our results were statistically significant. A statistically significant trend indicating larger Hipparcos angular separations with respect to the reference values, of 5.2 ± 1.4 mas, was found at the 10^−8 significance level. Uncertainties in the polar coordinates θ and ρ of 1.8° and 6.3 mas, respectively, were estimated for the Hipparcos observations of binary systems. We also verified that the parallaxes of binary systems measured in this mission are absolutely compatible with the set of orbital parallaxes obtained from the most accurate orbits at least at the 95% confidence level. This methodology allows us to better estimate the accuracy of Hipparcos observations of binary systems. Indeed, further application to the data collected by Gaia should yield a
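The errors-in-variables idea behind the total least-squares comparison can be sketched with the closed-form orthogonal regression of one noisy measurement on another (a simplified stand-in for the full method; the function name and all numbers are ours, not mission data):

```python
import math

# Minimal orthogonal-regression (total-least-squares line fit) sketch
# for comparing two error-prone measurement sets, e.g. orbital vs
# Hipparcos parallaxes. Fits y = a + b*x minimizing perpendicular
# distances; assumes the x values are not all equal and sxy != 0.

def tls_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b

# Perfect agreement between the two parallax sets should recover
# slope 1 and intercept 0, i.e. no systematic bias.
parallaxes = [5.0, 10.0, 20.0, 40.0]
a, b = tls_line(parallaxes, parallaxes)
```

Unlike ordinary least squares, which assumes x is error-free, this fit treats both coordinates as noisy, which is the appropriate model when neither the orbital nor the Hipparcos parallax is a gold standard.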

  12. A study of laseruler accuracy and precision (1986-1987)

    SciTech Connect

    Ramachandran, R.S.; Armstrong, K.P.

    1989-06-22

    A study was conducted to investigate Laserruler accuracy and precision. Tests were performed on 0.050 in., 0.100 in., and 0.120 in. gauge block standards. Results showed an accuracy of 3.7 μin. for the 0.120 in. standard, with higher accuracies for the two thinner blocks. The Laserruler precision was 4.83 μin. for the 0.120 in. standard, 3.83 μin. for the 0.100 in. standard, and 4.2 μin. for the 0.050 in. standard.

  13. A promising tool to achieve chemical accuracy for density functional theory calculations on Y-NO homolysis bond dissociation energies.

    PubMed

    Li, Hong Zhi; Hu, Li Hong; Tao, Wei; Gao, Ting; Li, Hui; Lu, Ying Hua; Su, Zhong Min

    2012-01-01

    A DFT-SOFM-RBFNN method is proposed to improve the accuracy of DFT calculations on Y-NO (Y = C, N, O, S) homolysis bond dissociation energies (BDE) by combining density functional theory (DFT) and artificial intelligence/machine learning methods, which consist of self-organizing feature mapping neural networks (SOFMNN) and radial basis function neural networks (RBFNN). A descriptor refinement step including SOFMNN clustering analysis and correlation analysis is implemented. The SOFMNN clustering analysis is applied to classify descriptors, and the representative descriptors in the groups are selected as neural network inputs according to their closeness to the experimental values through correlation analysis. Redundant descriptors and intuitively biased choices of descriptors can be avoided by this newly introduced step. Using RBFNN calculation with the selected descriptors, chemical accuracy (≤1 kcal·mol(-1)) is achieved for all 92 calculated organic Y-NO homolysis BDE calculated by DFT-B3LYP, and the mean absolute deviations (MADs) of the B3LYP/6-31G(d) and B3LYP/STO-3G methods are reduced from 4.45 and 10.53 kcal·mol(-1) to 0.15 and 0.18 kcal·mol(-1), respectively. The improved results for the minimal basis set STO-3G reach the same accuracy as those of 6-31G(d), and thus B3LYP calculation with the minimal basis set is recommended to be used for minimizing the computational cost and to expand the applications to large molecular systems. Further extrapolation tests are performed with six molecules (two containing Si-NO bonds and two containing fluorine), and the accuracy of the tests was within 1 kcal·mol(-1). This study shows that DFT-SOFM-RBFNN is an efficient and highly accurate method for Y-NO homolysis BDE. The method may be used as a tool to design new NO carrier molecules.

  14. Cutting tool study: 21-6-9 stainless steel

    SciTech Connect

    McManigle, A.P.

    1992-07-29

    The Rocky Flats Plant conducted a study to test cermet cutting tools by performing machinability studies on War Reserve product under controlled conditions. The purpose of these studies was to determine the most satisfactory tools that optimize tool life, minimize costs, improve reliability and chip control, and increase productivity by performing the operations to specified accuracies. This study tested three manufacturers' cermet cutting tools and a carbide tool used previously by the Rocky Flats Plant for machining spherical-shaped 21-6-9 stainless steel forgings (Figure 1). The 80-degree diamond inserts were tested by experimenting with various chip-breaker geometries, cutting speeds, feedrates, and cermet grades on the outside contour roughing operation. The cermets tested were manufactured by Kennametal, Valenite, and NTK. The carbide tool ordinarily used for this operation is manufactured by Carboloy. Evaluation of the tools was conducted by investigating the number of passes per part and parts per insert, tool wear, cutting time, tool life, surface finish, and stem taper. Benefits to be gained from this study were: improved part quality, better chip control, increased tool life and utilization, and greater fabrication productivity. This was to be accomplished by performing the operation to specified accuracies within the scope of the tools tested.

  16. UMI-tools: modeling sequencing errors in Unique Molecular Identifiers to improve quantification accuracy.

    PubMed

    Smith, Tom; Heger, Andreas; Sudbery, Ian

    2017-03-01

    Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package.
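The network-based deduplication idea can be sketched as follows (a simplification of the published "directional" method, using its count rule count(a) ≥ 2·count(b) − 1; the function names and counts are ours):

```python
# Hedged sketch of directional-network UMI deduplication: link UMI u -> v
# when they differ by one base and count(u) >= 2*count(v) - 1, so
# low-count UMIs one error away from an abundant UMI are absorbed into
# it rather than counted as separate molecules. Counts are invented.

def hamming1(a, b):
    """True when the two UMIs differ at exactly one position."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def dedup_umis(counts):
    """counts: dict UMI -> read count. Returns inferred molecule count."""
    umis = sorted(counts, key=counts.get, reverse=True)
    assigned = set()
    n_molecules = 0
    for u in umis:
        if u in assigned:
            continue
        n_molecules += 1          # u founds a new molecule group
        assigned.add(u)
        stack = [u]
        while stack:              # absorb reachable lower-count neighbours
            cur = stack.pop()
            for v in umis:
                if (v not in assigned and hamming1(cur, v)
                        and counts[cur] >= 2 * counts[v] - 1):
                    assigned.add(v)
                    stack.append(v)
    return n_molecules

# ATTA and TTTG are one base from abundant ATTG (likely PCR/sequencing
# errors); GGCC is unrelated, so two molecules are inferred, not four.
counts = {"ATTG": 100, "ATTA": 2, "TTTG": 1, "GGCC": 50}
```

Naive exact-match deduplication would report four molecules here; the count-aware network collapses the error-derived UMIs and reports two.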

  17. UMI-tools: modeling sequencing errors in Unique Molecular Identifiers to improve quantification accuracy

    PubMed Central

    2017-01-01

    Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584

  18. Embodied Rules in Tool Use: A Tool-Switching Study

    ERIC Educational Resources Information Center

    Beisert, Miriam; Massen, Cristina; Prinz, Wolfgang

    2010-01-01

    In tool use, a transformation rule defines the relation between an operating movement and its distal effect. This rule is determined by the tool structure and requires no explicit definition. The present study investigates how humans represent and apply compatible and incompatible transformation rules in tool use. In Experiment 1, participants had…

  19. The accuracy of a patient or parent-administered bleeding assessment tool administered in a paediatric haematology clinic.

    PubMed

    Lang, A T; Sturm, M S; Koch, T; Walsh, M; Grooms, L P; O'Brien, S H

    2014-11-01

    Classifying and describing bleeding symptoms is essential in the diagnosis and management of patients with mild bleeding disorders (MBDs). There has been increased interest in the use of bleeding assessment tools (BATs) to more objectively quantify the presence and severity of bleeding symptoms. To date, the administration of BATs has been performed almost exclusively by clinicians; the accuracy of a parent-proxy BAT has not been studied. Our objective was to determine the accuracy of a parent-administered BAT by measuring the level of agreement between parent and clinician responses to the Condensed MCMDM-1VWD Bleeding Questionnaire. Our cross-sectional study included children 0-21 years presenting to a haematology clinic for initial evaluation of a suspected MBD or follow-up evaluation of a previously diagnosed MBD. The parent/caregiver completed a modified version of the BAT; the clinician separately completed the BAT through interview. The mean parent-report bleeding score (BS) was 6.09 (range: -2 to 25); the mean clinician report BS was 4.54 (range: -1 to 17). The mean percentage of agreement across all bleeding symptoms was 78% (mean κ = 0.40; Gwet's AC1 = 0.74). Eighty percent of the population had an abnormal BS (defined as ≥2) when rated by parents and 76% had an abnormal score when rated by clinicians (86% agreement, κ = 0.59, Gwet's AC1 = 0.79). While parents tended to over-report bleeding as compared to clinicians, overall, BSs were similar between groups. These results lend support for further study of a modified proxy-report BAT as a clinical and research tool.
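The agreement statistics reported above can be illustrated for a single binary symptom with a short sketch (our code and invented ratings; Gwet's AC1, which differs only in its chance-agreement term, is omitted):

```python
# Hedged sketch: percent agreement and Cohen's kappa between parent and
# clinician ratings of a binary bleeding symptom (1 = present, 0 = absent).

def percent_agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    n = len(a)
    po = percent_agreement(a, b)           # observed agreement
    p_a1 = sum(a) / n                      # rater A's "present" rate
    p_b1 = sum(b) / n                      # rater B's "present" rate
    pe = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)   # chance agreement
    return (po - pe) / (1 - pe)

parent    = [1, 1, 0, 1, 0, 1, 1, 0]
clinician = [1, 0, 0, 1, 0, 1, 0, 0]
po = percent_agreement(parent, clinician)   # 0.75
kappa = cohens_kappa(parent, clinician)     # 9/17 ≈ 0.53, "moderate"
```

Kappa discounts the agreement expected by chance alone, which is why the abstract can report 78% raw agreement alongside a much lower mean κ of 0.40.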

  20. High angular accuracy manufacture method of micro v-grooves based on tool alignment by on-machine measurement.

    PubMed

    Zhang, Xiaodong; Jiang, Lili; Zeng, Zhen; Fang, Fengzhou; Liu, Xianlei

    2015-10-19

    Micro v-grooves have found wide application in optics as one of the most important microstructures. However, their performance is significantly affected by angular geometry accuracy. Diamond cutting is commonly used to fabricate micro v-grooves, but it remains difficult to guarantee the cutting tool angle, which is limited by the measurement accuracy achievable during manufacture and mounting of the diamond tool. A cutting tool alignment method based on on-machine measurement is proposed to improve the fabricated accuracy of the v-groove angle. An on-machine probe is employed to scan the v-groove geometrical deviation precisely. The system error model, data processing algorithm, and tool alignment methods are analyzed in detail. Experimental results show that a measurement standard deviation within 0.01° can be achieved. Finally, retro-reflection mirrors are fabricated and measured by the proposed method for verification.

  1. ACCURACY OF LABORATORY REPORTING IN EPAS WET INTERLABORATORY VARIABILITY STUDY

    EPA Science Inventory

    In 1999 and 2000, EPA conducted an interlaboratory variability study of whole effluent toxicity (WET) test methods. This study provided an excellent opportunity to evaluate the accuracy with which laboratories analyzed and reported WET test data. Twenty-eight laboratories reporte...

  2. Study of accuracy of precipitation measurements using simulation method

    NASA Astrophysics Data System (ADS)

    Nagy, Zoltán; Lajos, Tamás; Morvai, Krisztián

    2013-04-01

    Precipitation is one of the most important meteorological parameters describing the state of the climate, and accurate measurement of precipitation is essential for deriving correct trends. The problem is that precipitation measurements are affected by systematic errors that lead to an underestimation of actual precipitation; these errors vary by precipitation type and gauge type. It is well known that wind speed is the most important environmental factor contributing to the underestimation of actual precipitation, especially for solid precipitation. To study and correct the errors of precipitation measurements, there are two basic possibilities: use the results and conclusions of international precipitation measurement intercomparisons, or build standard reference gauges (DFIR, pit gauge) and carry out one's own investigation. In 1999, the HMS attempted its own investigation and built standard reference gauges, but the cost-benefit ratio in the case of snow (use of the DFIR) was very poor: several winters passed without a significant amount of snow, while the condition of the DFIR continuously deteriorated. This motivated a new approach: modelling, carried out by the Budapest University of Technology and Economics, Department of Fluid Mechanics, using the FLUENT 6.2 model. The ANSYS Fluent package is a featured fluid dynamics solution for modelling flow and other related physical phenomena. It provides the tools needed to describe atmospheric processes and to design and optimize new equipment. The CFD package includes solvers that accurately simulate the behaviour of a broad range of flows, from single-phase to multi-phase. The questions we wanted to answer are as follows: How do the different types of gauges deform the airflow around themselves? Can a quantitative estimate of the wind-induced error be given? How does the use

  3. A New 3D Tool for Assessing the Accuracy of Bimaxillary Surgery: The OrthoGnathicAnalyser

    PubMed Central

    Xi, Tong; Schreurs, Ruud; de Koning, Martien; Bergé, Stefaan; Maal, Thomas

    2016-01-01

    Aim The purpose of this study was to present and validate an innovative semi-automatic approach to quantify the accuracy of the surgical outcome in relation to 3D virtual orthognathic planning among patients who underwent bimaxillary surgery. Material and Method For the validation of this new semi-automatic approach, CBCT scans of ten patients who underwent bimaxillary surgery were acquired pre-operatively. Individualized 3D virtual operation plans were made for all patients prior to surgery. During surgery, the maxillary and mandibular segments were positioned as planned by using 3D milled interocclusal wafers. Consequently, post-operative CBCT scan were acquired. The 3D rendered pre- and postoperative virtual head models were aligned by voxel-based registration upon the anterior cranial base. To calculate the discrepancies between the 3D planning and the actual surgical outcome, the 3D planned maxillary and mandibular segments were segmented and superimposed upon the postoperative maxillary and mandibular segments. The translation matrices obtained from this registration process were translated into translational and rotational discrepancies between the 3D planning and the surgical outcome, by using the newly developed tool, the OrthoGnathicAnalyser. To evaluate the reproducibility of this method, the process was performed by two independent observers multiple times. Results Low intra-observer and inter-observer variations in measurement error (mean error < 0.25 mm) and high intraclass correlation coefficients (> 0.97) were found, supportive of the observer independent character of the OrthoGnathicAnalyser. The pitch of the maxilla and mandible showed the highest discrepancy between the 3D planning and the postoperative results, 2.72° and 2.75° respectively. 
Conclusion This novel method provides a reproducible tool for the evaluation of bimaxillary surgery, making it possible to compare larger patient groups in an objective and time-efficient manner in order to
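
    The reported translation values and pitch/roll/yaw angles follow directly from the 4×4 rigid transformation produced by segment registration. A minimal sketch of that decomposition (the function name and the Rz·Ry·Rx angle convention are our assumptions, not the OrthoGnathicAnalyser's actual code):

```python
import math

def decompose_rigid_transform(T):
    """Split a 4x4 rigid transformation (row-major nested lists) into a
    translation vector and Tait-Bryan angles in degrees: pitch about x,
    roll about y, yaw about z, assuming R = Rz(yaw)*Ry(roll)*Rx(pitch)."""
    translation = (T[0][3], T[1][3], T[2][3])
    roll = math.degrees(math.asin(-T[2][0]))
    pitch = math.degrees(math.atan2(T[2][1], T[2][2]))
    yaw = math.degrees(math.atan2(T[1][0], T[0][0]))
    return translation, (pitch, roll, yaw)
```

    Comparing these per-segment values between the planned and achieved positions yields exactly the kind of translational and rotational discrepancies the abstract reports.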

  4. Bias due to composite reference standards in diagnostic accuracy studies.

    PubMed

    Schiller, Ian; van Smeden, Maarten; Hadgu, Alula; Libman, Michael; Reitsma, Johannes B; Dendukuri, Nandini

    2016-04-30

    Composite reference standards (CRSs) have been advocated in diagnostic accuracy studies in the absence of a perfect reference standard. The rationale is that combining results of multiple imperfect tests leads to a more accurate reference than any one test in isolation. Focusing on a CRS that classifies subjects as disease positive if at least one component test is positive, we derive algebraic expressions for sensitivity and specificity of this CRS, sensitivity and specificity of a new (index) test compared with this CRS, as well as the CRS-based prevalence. We use as a motivating example the problem of evaluating a new test for Chlamydia trachomatis, an asymptomatic disease for which no gold-standard test exists. As the number of component tests increases, sensitivity of this CRS increases at the expense of specificity, unless all tests have perfect specificity. Therefore, such a CRS can lead to significantly biased accuracy estimates of the index test. The bias depends on disease prevalence and accuracy of the CRS. Further, conditional dependence between the CRS and index test can lead to over-estimation of index test accuracy estimates. This commonly-used CRS combines results from multiple imperfect tests in a way that ignores information and therefore is not guaranteed to improve over a single imperfect reference unless each component test has perfect specificity, and the CRS is conditionally independent of the index test. When these conditions are not met, as in the case of C. trachomatis testing, more realistic statistical models should be researched instead of relying on such CRSs.
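
    Under the additional assumption that component tests are conditionally independent given true disease status, the any-positive rule has simple closed forms that make the specificity erosion explicit: sensitivity compounds as 1 − Π(1 − Se_i) while specificity shrinks as Π Sp_i. A small illustrative sketch (the values in the usage note are made up, not the paper's data):

```python
from math import prod

def crs_any_positive(sens, spec):
    """Sensitivity and specificity of a composite reference standard
    that calls a subject positive if at least one component test is
    positive, assuming conditional independence given disease status."""
    se_crs = 1.0 - prod(1.0 - se for se in sens)  # misses only if every component misses
    sp_crs = prod(spec)                           # correct only if every component is correct
    return se_crs, sp_crs
```

    Each component with imperfect specificity multiplies the composite's specificity down: three 90%-specific tests leave only 0.9³ ≈ 73% composite specificity, which is the bias mechanism the paper derives.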

  5. Effect of a Novel Clinical Decision Support Tool on the Efficiency and Accuracy of Treatment Recommendations for Cholesterol Management

    PubMed Central

    Scheitel, Marianne R.; Kessler, Maya E.; Shellum, Jane L.; Peters, Steve G.; Milliner, Dawn S.; Liu, Hongfang; Elayavilli, Ravikumar Komandur; Poterack, Karl A.; Miksch, Timothy A.; Boysen, Jennifer J.; Hankey, Ron A.

    2017-01-01

    Summary Background The 2013 American College of Cardiology / American Heart Association Guidelines for the Treatment of Blood Cholesterol emphasize treatment based on cardiovascular risk. But finding time in a primary care visit to manually calculate cardiovascular risk and prescribe treatment based on risk is challenging. We developed an informatics-based clinical decision support tool, MayoExpertAdvisor, to deliver automated cardiovascular risk scores and guideline-based treatment recommendations based on patient-specific data in the electronic health record. Objective To assess the impact of our clinical decision support tool on the efficiency and accuracy of clinician calculation of cardiovascular risk and its effect on the delivery of guideline-consistent treatment recommendations. Methods Clinicians were asked to review the EHR records of selected patients. We evaluated the amount of time and the number of clicks and keystrokes needed to calculate cardiovascular risk and provide a treatment recommendation with and without our clinical decision support tool. We also compared the treatment recommendations arrived at by clinicians with and without the use of our tool to those recommended by the guidelines. Results Clinicians saved 3 minutes and 38 seconds in completing both tasks with MayoExpertAdvisor, used 94 fewer clicks and 23 fewer keystrokes, and improved accuracy from the baseline of 60.61% to 100% for both the risk score calculation and guideline-consistent treatment recommendation. Conclusion Informatics solutions can greatly improve the efficiency and accuracy of individualized treatment recommendations and have the potential to increase guideline compliance. PMID:28174820

  6. Numerical Stability and Accuracy of Temporally Coupled Multi-Physics Modules in Wind-Turbine CAE Tools

    SciTech Connect

    Gasmi, A.; Sprague, M. A.; Jonkman, J. M.; Jones, W. B.

    2013-02-01

    In this paper we examine the stability and accuracy of numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. We employ two simple examples as test systems, while algorithm descriptions are kept general. Coupled-system governing equations are framed in monolithic and partitioned representations as differential-algebraic equations. Explicit and implicit loose partition coupling is examined. In explicit coupling, partitions are advanced in time from known information. In implicit coupling, there is dependence on other-partition data at the next time step; coupling is accomplished through a predictor-corrector (PC) approach. Numerical time integration of coupled ordinary-differential equations (ODEs) is accomplished with one of three fourth-order fixed-time-increment methods: Runge-Kutta (RK), Adams-Bashforth (AB), and Adams-Bashforth-Moulton (ABM). Through numerical experiments it is shown that explicit coupling can be dramatically less stable and less accurate than simulations performed with the monolithic system. However, PC implicit coupling restored stability and fourth-order accuracy for ABM; only second-order accuracy was achieved with RK integration. For systems without constraints, explicit time integration with AB and explicit loose coupling exhibited desired accuracy and stability.

  7. Study on High Accuracy Topographic Mapping via UAV-based Images

    NASA Astrophysics Data System (ADS)

    Chi, Yun-Yao; Lee, Ya-Fen; Tsai, Shang-En

    2016-10-01

    Unmanned aerial vehicles (UAVs) provide a promising tool for the acquisition of multi-temporal aerial stereo photos and high-resolution digital surface models. UAVs now fly with a high degree of autonomy using the global positioning system and an onboard digital camera and computer. UAV-based mapping is faster and cheaper, but its accuracy is a concern. This paper aims to assess the feasibility of producing high-accuracy topographic maps from quad-rotor UAV imagery and ground control points (GCPs). Survey data were collected in the Errn river basin area in Tainan, Taiwan. The UAV-based topographic map of the study area was calibrated against the local coordinates of GCPs measured with a total station to an accuracy better than 1/2000. The comparison shows that the accuracy of the UAV-based topographic map is acceptable when the datasets are overlaid. The results can serve as a reference for practical mapping survey work.

  8. Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.

    2016-09-01

    A cadastral map is a parcel-based information product specifically designed to define the limits of boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatially based technology, especially Geographical Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. As a result of the modernization, the new cadastral database is no longer based on single, static parcel paper maps, but on a global digital map. Despite the strict process of cadastral modernization, this reform has raised unexpected issues that remain essential to address. The main focus of this study is to review the issues generated by this transition. The transformed cadastral database should be further treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. This review will serve as a foundation for investigating a systematic and effective method for Positional Accuracy Improvement (PAI) in cadastral database modernization.

  9. High-accuracy mass spectrometry for fundamental studies.

    PubMed

    Kluge, H-Jürgen

    2010-01-01

    Mass spectrometry for fundamental studies in metrology and atomic, nuclear and particle physics requires extreme sensitivity and efficiency as well as ultimate resolving power and accuracy. An overview will be given on the global status of high-accuracy mass spectrometry for fundamental physics and metrology. Three quite different examples of modern mass spectrometric experiments in physics are presented: (i) the retardation spectrometer KATRIN at the Forschungszentrum Karlsruhe, employing electrostatic filtering in combination with magnetic-adiabatic collimation-the biggest mass spectrometer for determining the smallest mass, i.e. the mass of the electron anti-neutrino, (ii) the Experimental Cooler-Storage Ring at GSI-a mass spectrometer of medium size, relative to other accelerators, for determining medium-heavy masses and (iii) the Penning trap facility, SHIPTRAP, at GSI-the smallest mass spectrometer for determining the heaviest masses, those of super-heavy elements. Finally, a short view into the future will address the GSI project HITRAP at GSI for fundamental studies with highly-charged ions.

  10. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    NASA Astrophysics Data System (ADS)

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-05-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging.

  11. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management

    PubMed Central

    Cristescu, Romane H.; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-01-01

    Accurate data on presence/absence and spatial distribution for fauna species is key to their conservation. Collecting such data, however, can be time consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties and often contain a high false negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently out-performed human-only teams. Off-leash, the dog detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams. PMID:25666691

  12. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management.

    PubMed

    Cristescu, Romane H; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-02-10

    Accurate data on presence/absence and spatial distribution for fauna species is key to their conservation. Collecting such data, however, can be time consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties and often contain a high false negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently out-performed human-only teams. Off-leash, the dog detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams.

  13. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    PubMed Central

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen - van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  14. "Score the Core" Web-based pathologist training tool improves the accuracy of breast cancer IHC4 scoring.

    PubMed

    Engelberg, Jesse A; Retallack, Hanna; Balassanian, Ronald; Dowsett, Mitchell; Zabaglo, Lila; Ram, Arishneel A; Apple, Sophia K; Bishop, John W; Borowsky, Alexander D; Carpenter, Philip M; Chen, Yunn-Yi; Datnow, Brian; Elson, Sarah; Hasteh, Farnaz; Lin, Fritz; Moatamed, Neda A; Zhang, Yanhong; Cardiff, Robert D

    2015-11-01

    Hormone receptor status is an integral component of decision-making in breast cancer management. The IHC4 score is an algorithm that combines hormone receptor, HER2, and Ki-67 status to provide a semiquantitative prognostic score for breast cancer. High accuracy and low interobserver variance are important to ensure the score is accurately calculated; however, few previous efforts have been made to measure or decrease interobserver variance. We developed a Web-based training tool, called "Score the Core" (STC), using tissue microarrays to train pathologists to visually score estrogen receptor (using the 300-point H score), progesterone receptor (percent positive), and Ki-67 (percent positive). STC used a reference score calculated from a reproducible manual counting method. Pathologists in the Athena Breast Health Network and pathology residents at associated institutions completed the exercise. By using STC, pathologists improved their estrogen receptor H score and progesterone receptor and Ki-67 proportion assessments and demonstrated a good correlation between pathologist and reference scores. In addition, we collected information about pathologist performance that allowed us to compare individual pathologists and measures of agreement. Pathologists' assessment of the proportion of positive cells was closer to the reference than their assessment of the relative intensity of positive cells. Careful training and assessment should be used to ensure the accuracy of breast biomarkers. This is particularly important as breast cancer diagnostics become increasingly quantitative and reproducible. Our training tool is a novel approach to pathologist training that can serve as an important component of ongoing quality assessment and can improve the accuracy of breast cancer prognostic biomarkers.
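
    The 300-point H score mentioned above weights the percentage of cells at each staining intensity by 1 (weak), 2 (moderate), or 3 (strong) and sums the products. A minimal sketch of that arithmetic (the function name and input validation are ours; the weighting is the standard H-score definition):

```python
def h_score(pct_weak, pct_moderate, pct_strong):
    """300-point H score: percent of cells staining weakly, moderately,
    and strongly, weighted 1/2/3 and summed (range 0-300)."""
    if min(pct_weak, pct_moderate, pct_strong) < 0:
        raise ValueError("percentages must be non-negative")
    if pct_weak + pct_moderate + pct_strong > 100:
        raise ValueError("intensity percentages cannot exceed 100")
    return 1 * pct_weak + 2 * pct_moderate + 3 * pct_strong
```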

  15. Potential Child Abuse Screening in Emergency Department; a Diagnostic Accuracy Study

    PubMed Central

    Dinpanah, Hossein; Akbarzadeh Pasha, Abazar

    2017-01-01

    Introduction: Designing a tool that can differentiate those at risk of child abuse with great diagnostic accuracy is of great interest. The present study was designed to evaluate the diagnostic accuracy of the Escape instrument in triage of at-risk cases of child abuse presenting to the emergency department (ED). Method: The present diagnostic accuracy study was performed on 6120 children under 16 years old presenting to the ED during a 3-year period, using convenience sampling. Confirmation by the child abuse team (a pediatrician, a social worker, and a forensic physician) was considered the gold standard. Screening performance characteristics of Escape were calculated using STATA 21. Results: 6120 children with the mean age of 2.19 ± 1.12 years were screened (52.7% girls). 137 children were suspected victims of child abuse. Based on the child abuse team's opinion, 35 (0.5%) children were confirmed victims of child abuse. Sensitivity, specificity, positive and negative likelihood ratios, and positive and negative predictive values of this test with 95% CI were 100 (87.6 – 100), 98.3 (97.9 – 98.6), 25.5 (18.6 – 33.8), 100 (99.9 – 100), 0.34 (0.25 – 0.46), and 0 (0 – NAN), respectively. Area under the ROC curve was 99.2 (98.9 – 99.4). Conclusion: It seems that Escape is a suitable screening instrument for detection of at-risk cases of child abuse presenting to the ED. Based on the results of the present study, the accuracy of this screening tool is 99.2%, which is in the excellent range. PMID:28286815
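
    Screening characteristics like these follow from a standard 2×2 table; with the counts implied by the abstract (35 true positives among 137 flagged, no missed cases, 5983 true negatives) the headline figures can be reproduced. A sketch (the helper name is ours):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values, and likelihood
    ratios from the four cells of a 2x2 screening table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
        "LR+": sens / (1.0 - spec) if spec < 1.0 else float("inf"),
        "LR-": (1.0 - sens) / spec,
    }
```

    Plugging in tp=35, fp=102, fn=0, tn=5983 reproduces the reported 100% sensitivity, 98.3% specificity, and 25.5% positive predictive value.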

  16. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

    Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive, since it requires collecting independent, well-defined test points; quantitative analysis of relative positional error, however, is feasible.

  17. Template for Systems Engineering Tools Trade Study

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle D.

    2005-01-01

    A discussion of Systems Engineering tools brings out numerous preferences and reactions regarding tools of choice as well as the functions those tools are to perform. A recent study of Systems Engineering tools for a new Program illustrated the need for a generic template that new Programs or Projects can use to determine the toolset appropriate for their needs. This paper provides guidelines that new initiatives can follow and tailor to their specific needs, enabling them to make their choice of tools in an efficient and informed manner. Clearly, those who perform purely technical functions will need different tools than those who perform purely systems engineering functions. And everyone has tools they are comfortable with; that degree of comfort is frequently the deciding factor in tool choice rather than an objective study of all criteria and weighting factors. This paper strives to produce a comprehensive list of criteria for selection, with suggestions for weighting factors based on a number of assumptions regarding the given Program or Project. In addition, any given Program will begin with assumptions for its toolset based on Program size, tool cost, user base, and technical needs. In providing a template for tool selection, this paper guides the reader through assumptions based on Program need; decision criteria; potential weighting factors; the need for a compilation of available tools; the importance of tool demonstrations; and finally a down-selection of tools. While specific vendors cannot be mentioned in this work, it is expected that this template could serve other Programs in the formulation phase by removing some of the subjectivity from the trade study process.
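
    The down-selection step the template describes is, at its core, a weighted decision matrix: score each candidate tool against each criterion, multiply by the criterion weights, and sum. A generic sketch (the criteria, weights, and scores in the usage note are placeholders, not from the study):

```python
def rank_tools(weights, scores):
    """Weighted decision matrix. weights: {criterion: weight};
    scores: {tool: {criterion: score}}. Returns (tool, total) pairs
    sorted best-first by weighted total."""
    totals = {tool: sum(weights[c] * crit_scores[c] for c in weights)
              for tool, crit_scores in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

    For example, with weights {cost: 0.3, capability: 0.5, user_base: 0.2}, a tool scoring 3/5/4 on those criteria totals 4.2 and outranks one scoring 5/2/3.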

  18. Tool Version Management Technology: A Case Study.

    DTIC Science & Technology

    1990-11-01

    Technical Report CMU/SEI-90-TR-25 (AD-A235 639; ESD-90-TR-226), November 1990. Tool Version Management Technology: A Case Study. Peter H. Feiler, Grace F. Downey. Contents: 1. Introduction; 2. The Problem; 2.1. Tool Version Organization and Selection; 2.2. Stability of Selected Tool

  19. Visualization tool for improved accuracy in needle placement during percutaneous radio-frequency ablation of liver tumors

    NASA Astrophysics Data System (ADS)

    Stüdeli, Thomas; Kalkofen, Denis; Risholm, Petter; Ali, Wajid; Freudenthal, Adinda; Samset, Eigil

    2008-03-01

    The European research network "Augmented reality in Surgery" (ARIS*ER) developed a system that supports percutaneous radio frequency ablation of liver tumors. The system provides interventionists, during placement and insertion of the RFA needle, with information from pre-operative CT images and real-time tracking data. A visualization tool has been designed that aims to support (1) exploration of the abdomen, (2) planning of the needle trajectory, and (3) insertion of the needle in the most efficient way. This work describes a first evaluation of the system, in which user performance and feedback for two visualization concepts of the tool - needle view and user view - are compared. After being introduced to the system, ten subjects performed three needle placements with both concepts. Task fulfillment rate, time for completion of task, special incidences, and accuracy of needle placement were recorded and analyzed. The evaluation yielded mixed results, with both beneficial and less favorable effects on user performance and workload for each concept. Effects depend on the characteristics of intra-operative tasks as well as on task complexity, which varies with tumor location. The results give valuable input for the next design steps.

  20. [Accuracy of apposition achieved by mandibular osteosyntheses. Stereophotogrammetric study].

    PubMed

    Randzio, J; Ficker, E; Wintges, T; Laser, S

    1989-01-01

    The accuracy of apposition achieved by wire and plate osteosyntheses was measured with the aid of close-range stereophotogrammetry in cadaver mandibles. Both osteosynthesis methods are characterized by an increase in the intercondylar distance, which, on average, is about 3.3 mm greater after plate osteosynthesis and about 1.9 mm greater after wiring. Moreover, osteosyntheses of the base of the mandible may involve a tendency of the condyle to become caudally dislocated.

  1. Accuracy of the TeacherInsight Online Perceiver Tool in Determining the Effectiveness of High Rated and Low Rated Math and Science New Hire Teachers following One Year and Three Years of Single School District Employment

    ERIC Educational Resources Information Center

    Regan, Nicole A.

    2011-01-01

    The purpose of this study is to explore the accuracy of the TeacherInsight online perceiver tool (Gallup University, 2007) in determining the effectiveness of high rated (n =14) and low rated (n =36) math and science new hire teachers summative appraisal ratings, completed graduate coursework, and retention status following one year and three…

  2. Screening Characteristics of Bedside Ultrasonography in Confirming Endotracheal Tube Placement; a Diagnostic Accuracy Study

    PubMed Central

    Zamani Moghadam, Hamid; Sharifi, Mohamad Davood; Rajabi, Hasan; Mousavi Bazaz, Mojtaba; Alamdaran, Ali; Jafari, Niazmohammad; Hashemian, Seyed Amir Masoud; Talebi Deloei, Morteza

    2017-01-01

    Introduction: Confirmation of proper endotracheal tube placement is one of the most important and lifesaving issues of tracheal intubation. The present study aimed to evaluate the accuracy of tracheal ultrasonography performed by emergency residents in this regard. Method: This was a prospective, cross-sectional study evaluating the diagnostic accuracy of ultrasonography in confirming endotracheal tube placement, compared to a combination of 4 clinical confirmation methods (chest and epigastric auscultation, direct laryngoscopy, aspiration of the tube, and pulse oximetry) as the reference test. Results: 150 patients with the mean age of 58.52 ± 1.73 years were included (56.6% male). Sensitivity, specificity, positive predictive value, negative predictive value, and positive and negative likelihood ratios of tracheal ultrasonography in endotracheal tube confirmation were 96 (95% CI: 92-99), 88 (95% CI: 62-97), 98 (95% CI: 94-99), 78 (95% CI: 53-93), 64 (95% CI: 16-255), and 0.2 (95% CI: 0.1-0.6), respectively. Conclusion: The present study showed that tracheal ultrasonography by trained emergency medicine residents had excellent sensitivity (>90%) and good specificity (80-90%) for confirming endotracheal tube placement. Therefore, it seems that ultrasonography is a proper screening tool for determining endotracheal tube placement. PMID:28286826

  3. Scatter dose summation for irregular fields: speed and accuracy study.

    PubMed

    DeWyngaert, J K; Siddon, R L; Bjarngard, B E

    1986-05-01

    Using program IRREG as a standard, we have compared speed and accuracy of several algorithms that calculate the scatter dose in an irregular field. All the algorithms, in some manner, decompose the irregular field into component triangles and obtain the scatter dose as the sum of the contributions from those triangles. Two of the algorithms replace each such component triangle with a sector of a certain "effective radius": in one case the average radius of the triangle, in the other the radius of the sector having the same area as the component triangle. A third algorithm decomposes each triangle further into two right triangles and utilizes the precalculated "equivalent radius" of each, to find the scatter contribution. For points near the center of a square field, all the methods compare favorably in accuracy to program IRREG, with less than a 1% error in total dose and with approximately a factor of 3-5 savings in computation time. Even for extreme rectangular fields (2 cm X 30 cm), the methods using the average radius and the equivalent right triangles agree to within 2% in total dose and approximately a factor of 3-4 savings in computation time.
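
    All three algorithms are variants of Clarkson-style sector integration: the scatter contribution at a point is the average, over equal-angle sectors, of a scatter quantity evaluated at each sector's effective radius. A minimal sketch (the scatter-air-ratio lookup `sar` is a caller-supplied stand-in, and the toy linear model in the usage note is ours):

```python
def scatter_from_sectors(radii, sar):
    """Clarkson-style sector integration: average the scatter-air ratio
    over equal-angle sectors whose radii follow the irregular field
    boundary as seen from the calculation point."""
    if not radii:
        raise ValueError("need at least one sector radius")
    return sum(sar(r) for r in radii) / len(radii)
```

    A handy sanity check: for a circular field of constant radius R, the sector average reduces exactly to sar(R).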

  4. High accuracy NMR chemical shift corrected for bulk magnetization as a tool for structural elucidation of microemulsions. Part 2 - Anionic and nonionic dilutable microemulsions.

    PubMed

    Hoffman, Roy E; Darmon, Eliezer; Aserin, Abraham; Garti, Nissim

    2016-02-01

    In our previous report we suggested a new analytical tool, high accuracy NMR chemical shift corrected for bulk magnetization as a supplementary tool to study structural transitions and droplet size and shape of dilutable microemulsions. The aim of this study was to show the generality of this technique and to demonstrate that in almost any type of microemulsion this technique provides additional valuable structural information. The analysis made by the technique adds to the elucidation of some structural aspects that could not be clearly determined by other classical techniques. Therefore, in this part we are extending the study to three additional systems differing in the type of oil phase (toluene and cyclohexane), the nature of the surfactants (anionic and nonionic), and other microemulsion characteristics. We studied sodium dodecyl sulfate (SDS)-based anionic microemulsions with different oils and a nonionic microemulsion based on Tween 20 as the surfactant and toluene as the oil phase. All the microemulsions were fully dilutable with water. We found that the change in the slope of chemical shift against dilution reflects phase transition points of the microemulsion (O/W, bicontinuous, W/O). Chemical shift changes were clearly observed with the transition between spherical and non-spherical (wormlike, etc.) droplet shapes. We compared the interaction of cyclohexane and toluene and used the anisotropic effect of toluene's ring current to determine its preferred orientation relative to SDS. Chemical shifts of the microemulsion components are therefore a useful addition to the arsenal of techniques for characterizing microemulsions.

  5. Pose Estimation with a Kinect for Ergonomic Studies: Evaluation of the Accuracy Using a Virtual Mannequin

    PubMed Central

    Plantard, Pierre; Auvinet, Edouard; Le Pierres, Anne-Sophie; Multon, Franck

    2015-01-01

    Analyzing human poses with a Kinect is a promising method to evaluate potentials risks of musculoskeletal disorders at workstations. In ecological situations, complex 3D poses and constraints imposed by the environment make it difficult to obtain reliable kinematic information. Thus, being able to predict the potential accuracy of the measurement for such complex 3D poses and sensor placements is challenging in classical experimental setups. To tackle this problem, we propose a new evaluation method based on a virtual mannequin. In this study, we apply this method to the evaluation of joint positions (shoulder, elbow, and wrist), joint angles (shoulder and elbow), and the corresponding RULA (a popular ergonomics assessment grid) upper-limb score for a large set of poses and sensor placements. Thanks to this evaluation method, more than 500,000 configurations have been automatically tested, which would be almost impossible to evaluate with classical protocols. The results show that the kinematic information obtained by the Kinect software is generally accurate enough to fill in ergonomic assessment grids. However inaccuracy strongly increases for some specific poses and sensor positions. Using this evaluation method enabled us to report configurations that could lead to these high inaccuracies. As a supplementary material, we provide a software tool to help designers to evaluate the expected accuracy of this sensor for a set of upper-limb configurations. Results obtained with the virtual mannequin are in accordance with those obtained from a real subject for a limited set of poses and sensor placements. PMID:25599426

  6. Novel Genetic Analysis for Case-Control Genome-Wide Association Studies: Quantification of Power and Genomic Prediction Accuracy

    PubMed Central

    Lee, Sang Hong; Wray, Naomi R.

    2013-01-01

    Genome-wide association studies (GWAS) are routinely conducted for both quantitative and binary (disease) traits. We present two analytical tools for use in the experimental design of GWAS. Firstly, we present power calculations quantifying power in a unified framework for a range of scenarios. In this context we consider the utility of quantitative scores (e.g. endophenotypes) that may be available on cases only or on both cases and controls. Secondly, we consider the accuracy of prediction of genetic risk from genome-wide SNPs and derive an expression for genomic prediction accuracy using a liability threshold model for disease traits in a case-control design. The expected values based on our derived equations for both power and prediction accuracy agree well with observed estimates from simulations. PMID:23977056

  7. A diagnostic tool for determining the quality of accuracy validation. Assessing the method for determination of nitrate in drinking water.

    PubMed

    Escuder-Gilabert, L; Bonet-Domingo, E; Medina-Hernández, M J; Sagrado, S

    2007-01-01

    Realistic internal validation of a method implies performing validation experiments under intermediate precision conditions. The validation results can be organized in an X (Nr x Ns) (replicates x runs) data matrix, analysis of which enables assessment of the accuracy of the method. By means of Monte Carlo simulation, the uncertainty in the estimates of bias and precision can be assessed. A bivariate plot is presented for assessing whether the uncertainty intervals for the bias (E +/- U(E)) and intermediate precision (RSDi +/- U(RSDi)) fall within prefixed limits (the requirements for the method). As a case study, a method for determining the concentration of nitrate in drinking water at the official limit set by Directive 98/83/EC is assessed by use of the proposed plot.
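
    The computation the abstract describes, grand-mean bias and intermediate precision from a replicates x runs matrix, with Monte Carlo uncertainty, can be sketched as follows. This is only an illustration: the function names, the parametric (normal) resampling model, and the 95 % coverage factor are assumptions, not details taken from the paper.

```python
import numpy as np

def bias_and_precision(x, true_value):
    """Relative bias E (%) and intermediate precision RSDi (%) of an
    Nr x Ns (replicates x runs) matrix of validation results."""
    grand_mean = x.mean()
    e = 100.0 * (grand_mean - true_value) / true_value   # relative bias, %
    rsdi = 100.0 * x.std(ddof=1) / grand_mean            # intermediate precision, %
    return e, rsdi

def monte_carlo_uncertainty(x, true_value, n_sim=5000, seed=0):
    """Parametric bootstrap: resample matrices of the same shape from a
    normal model fitted to the data, then take ~95 % half-widths of the
    simulated bias and RSDi distributions as U(E) and U(RSDi)."""
    rng = np.random.default_rng(seed)
    mu, sd = x.mean(), x.std(ddof=1)
    es, rsdis = [], []
    for _ in range(n_sim):
        sim = rng.normal(mu, sd, size=x.shape)
        e, rsdi = bias_and_precision(sim, true_value)
        es.append(e)
        rsdis.append(rsdi)
    u_e = 1.96 * np.std(es, ddof=1)
    u_rsdi = 1.96 * np.std(rsdis, ddof=1)
    return u_e, u_rsdi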

  8. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    ERIC Educational Resources Information Center

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  9. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for earthquake research within Google Maps. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) together with the new tools, using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data on Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple horizontal and vertical straight lines, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows studying earthquake clustering and the shift of earthquake clusters between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.

  10. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult, dedicated software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to the solution of dynamic programming problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory, given a description of the process, the cost to be minimized, and equality and inequality constraints, using the method of orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the dynamic model to be optimized may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capabilities of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization by means of case studies on selected laboratory physical educational models.

  11. Accuracy of Self-Evaluation in Adults with ADHD: Evidence from a Driving Study

    ERIC Educational Resources Information Center

    Knouse, Laura E.; Bagwell, Catherine L.; Barkley, Russell A.; Murphy, Kevin R.

    2005-01-01

    Research on children with ADHD indicates an association with inaccuracy of self-appraisal. This study examines the accuracy of self-evaluations in clinic-referred adults diagnosed with ADHD. Self-assessments and performance measures of driving in naturalistic settings and on a virtual-reality driving simulator are used to assess accuracy of…

  12. Dynamic Development of Complexity and Accuracy: A Case Study in Second Language Academic Writing

    ERIC Educational Resources Information Center

    Rosmawati

    2014-01-01

    This paper reports on the development of complexity and accuracy in English as a Second Language (ESL) academic writing. Although research into complexity and accuracy development in second language (L2) writing has been well established, few studies have assumed the multidimensionality of these two constructs (Norris & Ortega, 2009) or…

  13. Alternative Confidence Interval Methods Used in the Diagnostic Accuracy Studies.

    PubMed

    Erdoğan, Semra; Gülhan, Orekıcı Temel

    2016-01-01

    Background/Aim. It is necessary to decide whether newly developed methods are better than a standard or reference test. To decide whether a new diagnostic test is better than the gold standard test/imperfect standard test, the differences of the estimated sensitivities/specificities are calculated with the help of information obtained from samples. However, to generalize this value to the population, it should be given with confidence intervals. The aim of this study is to evaluate, on a clinical application, the confidence interval methods developed for the difference between two dependent sensitivity/specificity values. Materials and Methods. In this study, confidence interval methods such as Asymptotic Intervals, Conditional Intervals, Unconditional Intervals, Score Intervals, and Nonparametric Methods Based on Relative Effects Intervals are used. As the clinical application, data used in the diagnostic study by Dickel et al. (2010) are taken as a sample. Results. The results for the alternative confidence interval methods for Nickel Sulfate, Potassium Dichromate, and Lanolin Alcohol are given as a table. Conclusion. When choosing among the confidence interval methods, researchers have to consider whether the case to be compared involves a single ratio or differences of dependent binary ratios, the correlation coefficient between the rates in the two dependent ratios, and the sample sizes.
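
    As a minimal sketch of one such interval, the asymptotic (Wald) confidence interval for the difference of two sensitivities estimated on the same diseased subjects, the following may help fix ideas. The counts in the test case are hypothetical and the Wald construction is only one of the methods the abstract names; nothing here is taken from the paper's data.

```python
import math

def paired_sensitivity_diff_ci(n11, n10, n01, n00, alpha=0.05):
    """Wald CI for the difference of two sensitivities measured on the
    same diseased subjects (paired design).

    n11: both tests positive; n10: only test 1 positive;
    n01: only test 2 positive; n00: both tests negative.
    """
    n = n11 + n10 + n01 + n00
    p1 = (n11 + n10) / n            # sensitivity of test 1
    p2 = (n11 + n01) / n            # sensitivity of test 2
    d = p1 - p2                     # equals (n10 - n01) / n
    # standard variance estimate for a paired difference of proportions
    var = (n10 + n01 - (n10 - n01) ** 2 / n) / n ** 2
    z = 1.959963984540054           # ~97.5th percentile of N(0, 1)
    half = z * math.sqrt(var)
    return d, d - half, d + half
```

    If the interval contains zero, as it does for the hypothetical counts above, the two dependent sensitivities are not distinguishable at the chosen level.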

  14. The Study: A Tool for Decision?

    DTIC Science & Technology

    2014-09-26

    ABSTRACT AUTHOR(S): John H. Van Zant, Jr., COL, AR TITLE: THE STUDY: A Tool For Decision? FORMAT: Individual Study Project DATE: 22 April 1985 PAGES...the CSA and HQ, USMC. Thus, while the questions were answered, the larger effort turned out to be almost anticlimactic. WHAT DID IT PROVIDE? The ACVT... change in the threat, the Army leadership became impatient for an answer. As a result, a side study was conducted between January and October 1980 which

  15. Twin Studies: A Unique Epidemiological Tool

    PubMed Central

    Sahu, Monalisha; Prasuna, Josyula G

    2016-01-01

    Twin studies are a special type of epidemiological study designed to measure the contribution of genetics, as opposed to the environment, to a given trait. Despite the fact that classical twin studies are still guided by assumptions made back in the 1920s, and that an inherent limitation lies in the study design itself, the results suggested by earlier twin studies have often been confirmed later by molecular genetic studies. The use of twin registries and various innovative yet complex software packages such as SAS and its extensions (e.g., SAS PROC GENMOD and SAS PROC PHREG) has increased the potential of this epidemiological tool to contribute significantly to the field of genetics and other life sciences. PMID:27385869

  16. Podcasts as Tools in Introductory Environmental Studies

    PubMed Central

    Vatovec, Christine; Balser, Teri

    2009-01-01

    Technological tools have increasingly become a part of the college classroom, often appealing to teachers because of their potential to increase student engagement with course materials. Podcasts in particular have gained popularity as tools to better inform students by providing access to lectures outside of the classroom. In this paper, we argue that educators should expand course materials to include prepublished podcasts to engage students with both course topics and a broader skill set for evaluating readily available media. We present a pre- and postassignment survey evaluation assessing student preferences for using podcasts and the ability of a podcast assignment to support learning objectives in an introductory environmental studies course. Overall, students reported that the podcasts were useful tools for learning, easy to use, and increased their understanding of course topics. However, students also provided insightful comments on visual versus aural learning styles, leading us to recommend assigning video podcasts or providing text-based transcripts along with audio podcasts. A qualitative analysis of survey data provides evidence that the podcast assignment supported the course learning objective for students to demonstrate critical evaluation of media messages. Finally, we provide recommendations for selecting published podcasts and designing podcast assignments. PMID:23653686

  17. Research of annealing mode for high accuracy stamped parts production from titanium alloy 83Ti-5Al-5Cr-5Mo after tooling

    NASA Astrophysics Data System (ADS)

    Balaykin, A. V.; Nosova, E. A.; Galkina, N. V.

    2016-11-01

    The aim of the work is to address the question of increasing the accuracy of tooled and annealed parts made from forged rod of a titanium alloy. Plate pieces were cut from the cross-section and annealed at 800°C for 1, 2, 3, 4 and 5 hours. A criterion combining minimum bending radius and springback angle was found. This criterion reaches its maximum values after tooling and annealing for 3 hours.

  18. Analysis of Accuracy in Pointing with Redundant Hand-held Tools: A Geometric Approach to the Uncontrolled Manifold Method

    PubMed Central

    Campolo, Domenico; Widjaja, Ferdinan; Xu, Hong; Ang, Wei Tech; Burdet, Etienne

    2013-01-01

    This work introduces a coordinate-independent method to analyse movement variability of tasks performed with hand-held tools, such as a pen or a surgical scalpel. We extend the classical uncontrolled manifold (UCM) approach by exploiting the geometry of rigid body motions, used to describe tool configurations. In particular, we analyse variability during a static pointing task with a hand-held tool, where subjects are asked to keep the tool tip in steady contact with another object. In this case the tool is redundant with respect to the task, as subjects control position/orientation of the tool, i.e. 6 degrees-of-freedom (dof), to maintain the tool tip position (3dof) steady. To test the new method, subjects performed a pointing task with and without arm support. The additional dof introduced in the unsupported condition, injecting more variability into the system, represented a resource to minimise variability in the task space via coordinated motion. The results show that all of the seven subjects channeled more variability along directions not directly affecting the task (UCM), consistent with previous literature but now shown in a coordinate-independent way. Variability in the unsupported condition was only slightly larger at the endpoint but much larger in the UCM. PMID:23592956

  19. Screening Characteristics of TIMI Score in Predicting Acute Coronary Syndrome Outcome; a Diagnostic Accuracy Study

    PubMed Central

    Alavi-Moghaddam, Mostafa; Safari, Saeed; Alavi-Moghaddam, Hamideh

    2017-01-01

    Introduction: In cases with a potential diagnosis of ischemic chest pain, screening high-risk patients for adverse outcomes would be very helpful. The present study was designed to determine the diagnostic accuracy of the thrombolysis in myocardial infarction (TIMI) score in patients with a potential diagnosis of ischemic chest pain. Method: This diagnostic accuracy study was designed to evaluate the screening performance characteristics of the TIMI score in predicting the 30-day outcomes of mortality, myocardial infarction (MI), and need for revascularization in patients presenting to the ED with a complaint of typical chest pain and a diagnosis of unstable angina or non-ST elevation MI. Results: 901 patients with a mean age of 58.17 ± 15.00 years (19-90) were studied (52.9% male). The mean TIMI score of the studied patients was 0.97 ± 0.93 (0-5), and scores of 0 to 2 were the most frequent, at 37.2%, 35.3%, and 21.4%, respectively. In total, 170 (18.8%) patients experienced the outcomes evaluated in this study. Total sensitivity, specificity, positive and negative predictive value, and positive and negative likelihood ratio of the TIMI score were 20 (95% CI: 17 – 24), 99 (95% CI: 97 – 100), 98 (95% CI: 93 – 100), 42 (95% CI: 39 – 46), 58 (95% CI: 14 – 229), and 1.3 (95% CI: 1.2 – 1.4), respectively. Areas under the ROC curve of this system for prediction of 30-day mortality, MI, and need for revascularization were 0.51 (95% CI: 0.47 – 0.55), 0.58 (95% CI: 0.54 – 0.62) and 0.56 (95% CI: 0.52 – 0.60), respectively. Conclusion: Based on the findings of the present study, it seems that the TIMI score has a high specificity in predicting the 30-day adverse outcomes of mortality, MI, and need for revascularization following acute coronary syndrome. However, since its sensitivity, negative predictive value, and negative likelihood ratio are low, it cannot be used as a proper screening tool for ruling out low-risk patients in the ED. PMID:28286825
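
    The screening characteristics reported above all derive from a 2x2 table of test result versus outcome. A minimal sketch of that calculation follows; the counts in the test case are hypothetical round numbers chosen for illustration, not the study's data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values, and likelihood
    ratios from a 2x2 table of test result vs. true outcome."""
    sens = tp / (tp + fn)           # true positive rate
    spec = tn / (tn + fp)           # true negative rate
    ppv = tp / (tp + fp)            # positive predictive value
    npv = tn / (tn + fn)            # negative predictive value
    lr_pos = sens / (1 - spec)      # positive likelihood ratio
    lr_neg = (1 - sens) / spec      # negative likelihood ratio
    return dict(sens=sens, spec=spec, ppv=ppv, npv=npv,
                lr_pos=lr_pos, lr_neg=lr_neg)
```

    Note that predictive values, unlike sensitivity and specificity, depend on the prevalence implied by the table, which is why a highly specific score can still have a modest NPV in a high-prevalence ED population.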

  20. Regression Modeling and Meta-Analysis of Diagnostic Accuracy of SNP-Based Pathogenicity Detection Tools for UGT1A1 Gene Mutation

    PubMed Central

    Rahim, Fakher; Galehdari, Hamid; Mohammadi-asl, Javad; Saki, Najmaldin

    2013-01-01

    Aims. This review summarizes all available evidence on the accuracy of SNP-based pathogenicity detection tools and introduces a regression model based on functional scores, mutation score, and genomic variation degree. Materials and Methods. A comprehensive search was performed to find all mutations related to Crigler-Najjar syndrome. Pathogenicity prediction was done using SNP-based pathogenicity detection tools including SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs related to missense mutations in the UGT1A1 gene were reviewed. Results. Comparing the diagnostic OR, our model showed high detection potential (diagnostic OR: 16.71, 95% CI: 3.38–82.69). The highest MCC and ACC belonged to our suggested model (46.8% and 73.3%), followed by SIFT (34.19% and 62.71%). The AUC analysis showed significantly better overall performance of our suggested model compared with the selected SNP-based pathogenicity detection tools (P = 0.046). Conclusion. Our suggested model is comparable to the well-established SNP-based pathogenicity detection tools and can appropriately reflect the role of a disease-associated SNP in both local and global structures. Although the accuracy of our suggested model is not especially high, the functional impact of the pathogenic mutations is highlighted at the protein level, which improves the understanding of the molecular basis of mutation pathogenesis. PMID:23997956

  1. Study of the disturbances effect on small satellite route tracking accuracy

    NASA Astrophysics Data System (ADS)

    Mashtakov, Y. V.; Ovchinnikov, M. Yu.; Tkachev, S. S.

    2016-12-01

    This paper studies the accuracy of tracking specified routes on the Earth's surface. An algorithm for synthesizing the satellite angular motion that provides surveying of such routes is proposed, together with the constraint imposed on the route curvature by the limited attitude control capability. A Lyapunov-based control algorithm is used to provide the necessary motion, and the effect of external disturbances on this motion is studied. In addition, closed-form equations are presented that link together the satellite parameters, the attitude and stabilization accuracy, and the route tracking errors.

  2. STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration

    PubMed Central

    Cohen, Jérémie F; Korevaar, Daniël A; Altman, Douglas G; Bruns, David E; Gatsonis, Constantine A; Hooft, Lotty; Irwig, Les; Levine, Deborah; Reitsma, Johannes B; de Vet, Henrica C W; Bossuyt, Patrick M M

    2016-01-01

    Diagnostic accuracy studies are, like other clinical studies, at risk of bias due to shortcomings in design and conduct, and the results of a diagnostic accuracy study may not apply to other patient groups and settings. Readers of study reports need to be informed about study design and conduct, in sufficient detail to judge the trustworthiness and applicability of the study findings. The STARD statement (Standards for Reporting of Diagnostic Accuracy Studies) was developed to improve the completeness and transparency of reports of diagnostic accuracy studies. STARD contains a list of essential items that can be used as a checklist, by authors, reviewers and other readers, to ensure that a report of a diagnostic accuracy study contains the necessary information. STARD was recently updated. All updated STARD materials, including the checklist, are available at http://www.equator-network.org/reporting-guidelines/stard. Here, we present the STARD 2015 explanation and elaboration document. Through commented examples of appropriate reporting, we clarify the rationale for each of the 30 items on the STARD 2015 checklist, and describe what is expected from authors in developing sufficiently informative study reports. PMID:28137831

  3. Fluorescence microscopy: A tool to study autophagy

    NASA Astrophysics Data System (ADS)

    Rai, Shashank; Manjithaya, Ravi

    2015-08-01

    Autophagy is a cellular recycling process through which a cell degrades old and damaged cellular components, such as organelles and proteins, and the degradation products are reused to provide energy and building blocks. Dysfunctional autophagy has been reported in several pathological conditions; hence, autophagy plays an important role in both cellular homeostasis and disease. Autophagy can be studied through various techniques, including fluorescence-based microscopy. With advances in fluorescence microscopy technology, several novel aspects of autophagy have been discovered, making it an essential tool for autophagy research. Moreover, the ability to tag subcellular targets with fluorescent proteins has enabled us to evaluate autophagy processes in real time under the fluorescence microscope. In this article, we demonstrate different aspects of autophagy in two different model systems, i.e., yeast and mammalian cells, with the help of fluorescence microscopy.

  4. Microinjection--a tool to study gravitropism.

    PubMed

    Scherp, P; Hasenstein, K H

    2003-01-01

    Despite extensive studies of plant gravitropism, this phenomenon is still poorly understood. The separation of gravity sensing, signal transduction and response is a common concept, but the mechanism of gravisensing in particular remains unclear. This paper focuses on microinjection as a powerful tool to investigate gravisensing in plants. We describe the microinjection of magnetic beads into rhizoids of the green alga Chara and the subsequent manipulation of the gravisensing system. After injection, an external magnet can control the movement of the magnetic beads. We demonstrate successful injection of magnetic beads into rhizoids and describe a multitude of experiments that can be carried out to investigate gravitropism in Chara rhizoids. In addition to examining mechanical properties, bead microinjection is also useful for probing the function of the cytoskeleton, by coating beads with drugs that interfere with it. The injection of fluorescently labeled beads or probes may reveal the involvement of the cytoskeleton during gravistimulation and response in living cells.

  5. Short-Term Forecasting of Loads and Wind Power for Latvian Power System: Accuracy and Capacity of the Developed Tools

    NASA Astrophysics Data System (ADS)

    Radziukynas, V.; Klementavičius, A.

    2016-04-01

    The paper analyses the performance of the recently developed short-term forecasting suite for the Latvian power system. The system load and wind power are forecasted using ANN and ARIMA models, respectively, and the forecasting accuracy is evaluated in terms of errors, mean absolute errors and mean absolute percentage errors. The influence of additional input variables on load forecasting errors is investigated. The interplay of hourly load and wind power forecasting errors is also evaluated for the Latvian power system, with historical loads (the year 2011) and planned wind power capacities (the year 2023).
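
    The error measures named above are standard. As a minimal sketch (not the paper's implementation), mean absolute error (MAE) and mean absolute percentage error (MAPE) for two equal-length series, assuming nonzero actual values for MAPE:

```python
def mae(actual, forecast):
    """Mean absolute error of a forecast against observed values."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean absolute percentage error, in %; requires nonzero actuals."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

    MAPE weights errors relative to the load level, which is why it is commonly reported alongside MAE for system-load forecasts of very different magnitudes.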

  6. Factoring vs linear modeling in rate estimation: a simulation study of relative accuracy.

    PubMed

    Maldonado, G; Greenland, S

    1998-07-01

    A common strategy for modeling dose-response in epidemiology is to transform ordered exposures and covariates into sets of dichotomous indicator variables (that is, to factor the variables). Factoring tends to increase estimation variance, but it also tends to decrease bias and thus may increase or decrease total accuracy. We conducted a simulation study to examine the impact of factoring on the accuracy of rate estimation. Factored and unfactored Poisson regression models were fit to follow-up study datasets that were randomly generated from 37,500 population model forms that ranged from subadditive to supramultiplicative. In the situations we examined, factoring sometimes substantially improved accuracy relative to fitting the corresponding unfactored model, sometimes substantially decreased accuracy, and sometimes made little difference. The difference in accuracy between factored and unfactored models depended in a complicated fashion on the difference between the true and fitted model forms, the strength of exposure and covariate effects in the population, and the study size. It may be difficult in practice to predict when factoring is increasing or decreasing accuracy. We recommend, therefore, that the strategy of factoring variables be supplemented with other strategies for modeling dose-response.

  7. New Tools to Study Contact Activation

    PubMed Central

    Rosén, Steffen

    2016-01-01

    The recent availability of a sensitive chromogenic method approach for determination of FXIa activity has been explored for designing sensitive methods for FXIIa and kallikrein, both using FXa formation as the read-out. For both enzymes the assay range 1–10 nmol/L provides a resolution of about 0.8 absorbance units with a total assay time of about 20 min. For studies on activation kinetics, subsampling and extensive dilution can be performed in MES–bovine serum albumin (BSA) buffer pH 5.7 for quenching of enzyme activity and with ensuing determination of FXa generation in a chromogenic FXIa method. Optionally, suitable inhibitors such as aprotinin and/or corn trypsin inhibitor may be included. The stability of FXIa, FXIIa, and kallikrein in MES–BSA buffer was shown to be at least 5 h on ice. In conclusion, the use of a sensitive chromogenic FXIa method either per se or in combination with MES–BSA buffer pH 5.7 are new and potentially valuable tools for the study of contact factor enzymes and their inhibitors. So far, dose–response studies of FXIIa and kallikrein have been limited to purified systems, and hence more data are required to learn whether these new methods might or might not be applicable to the determination of FXIIa and kallikrein activities in plasma. PMID:27921033

  9. Do knowledge, knowledge sources and reasoning skills affect the accuracy of nursing diagnoses? a randomised study

    PubMed Central

    2012-01-01

    Background This paper reports a study about the effect of knowledge sources, such as handbooks, an assessment format and a predefined record structure for diagnostic documentation, as well as the influence of knowledge, disposition toward critical thinking and reasoning skills, on the accuracy of nursing diagnoses. Knowledge sources can support nurses in deriving diagnoses. A nurse's disposition toward critical thinking and reasoning skills is also thought to influence the accuracy of his or her nursing diagnoses. Method A randomised factorial design was used in 2008–2009 to determine the effect of knowledge sources. We used the following instruments to assess the influence of knowledge, disposition, and reasoning skills on the accuracy of diagnoses: (1) a knowledge inventory, (2) the California Critical Thinking Disposition Inventory, and (3) the Health Science Reasoning Test. Nurses (n = 249) were randomly assigned to one of four factorial groups, and were instructed to derive diagnoses based on an assessment interview with a simulated patient/actor. Results The use of a predefined record structure resulted in a significantly higher accuracy of nursing diagnoses. A regression analysis reveals that almost half of the variance in the accuracy of diagnoses is explained by the use of a predefined record structure, a nurse's age and the reasoning skills of 'deduction' and 'analysis'. Conclusions Improving nurses' dispositions toward critical thinking and reasoning skills, and the use of a predefined record structure, improves the accuracy of nursing diagnoses. PMID:22852577

  10. An observational study of the accuracy and completeness of an anesthesia information management system: recommendations for documentation system changes.

    PubMed

    Wilbanks, Bryan A; Moss, Jacqueline A; Berner, Eta S

    2013-08-01

    Anesthesia information management systems must often be tailored to fit the environment in which they are implemented. Extensive customization necessitates that systems be analyzed for both accuracy and completeness of documentation design, to ensure that the final record is a true representation of practice. The purpose of this study was to determine the accuracy of a recently installed system in the capture of key perianesthesia data. This study used an observational design and was conducted with a convenience sample of nurse anesthetists. Observational data on the nurse anesthetists' delivery of anesthesia care were collected using a touch-screen tablet computer with a customized Access-database observational data collection tool. A questionnaire was also administered to these nurse anesthetists to assess perceived accuracy, completeness, and satisfaction with the electronic documentation system. The major sources of data not documented in the system were anesthesiologist presence (20%) and placement of intravenous lines (20%). The major sources of inaccuracies in documentation were gas flow rates (45%), medication administration times (30%), and documentation of neuromuscular function testing (20%); all of these inaccuracies were related to the use of charting templates that had not been altered to reflect the actual interventions performed.

  11. Diagnostic accuracies of clinical studies in patients with small cell carcinoma of the lung

    SciTech Connect

    Chak, L.Y.; Paryani, S.B.; Sikic, B.I.; Lockbaum, P.; Torti, F.M.; Carter, S.K.

    1983-05-01

    The diagnostic accuracy of clinical studies done in 38 patients with small cell carcinoma of the lung was analyzed by comparing the test results to autopsy findings. The chest radiograph was accurate in 31 of 38 patients (82%). The accuracy of the chest radiograph was higher in evaluating the lung parenchyma and mediastinum than in evaluating the hilum and pleura. Computerized tomographic brain scan was accurate in 11 of 12 patients. However, all the diagnostic studies used for assessing the liver, including physical examination, serum liver enzyme and bilirubin measurements, and radionuclide liver scan, were only moderately accurate. More accurate studies for detecting liver metastasis in patients with small cell carcinoma are needed.

  12. Specific Challenges in Conducting and Reporting Studies on the Diagnostic Accuracy of Ultrasonography in Bovine Medicine.

    PubMed

    Buczinski, Sébastien; O'Connor, Annette M

    2016-03-01

    Ultrasonography is used by bovine practitioners more for reproductive issues than as a diagnostic test for medical and surgical diseases. This article reviews the specific challenges and standards concerning reporting of studies on diagnostic accuracy of ultrasound in cattle for nonreproductive issues. Specific biases and applicability concerns in studies reporting ultrasonography as a diagnostic test are also reviewed. Better understanding of these challenges will help the practitioner to interpret and apply (or not) diagnostic accuracy study results depending on the field context. Examples of application of sensitivity and specificity results in a clinical context are given using the Bayes theorem.
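The clinical application of sensitivity and specificity via Bayes' theorem that the abstract mentions can be sketched as a small calculation; the function name and the numeric values below are illustrative assumptions, not figures from the article:

```python
def post_test_probability(prevalence, sensitivity, specificity, positive=True):
    """Bayes' theorem: update a pre-test (prevalence) probability of disease
    after a positive or negative test result."""
    if positive:
        # P(disease | T+) = Se*p / (Se*p + (1 - Sp)*(1 - p))
        num = sensitivity * prevalence
        den = num + (1.0 - specificity) * (1.0 - prevalence)
    else:
        # P(disease | T-) = (1 - Se)*p / ((1 - Se)*p + Sp*(1 - p))
        num = (1.0 - sensitivity) * prevalence
        den = num + specificity * (1.0 - prevalence)
    return num / den

# A hypothetical 90%-sensitive, 80%-specific ultrasound exam at 10% pre-test
# probability: a positive result raises the probability to 1/3.
```

This is why field context (the herd-level prevalence) matters: the same sensitivity and specificity yield very different post-test probabilities at different pre-test probabilities.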

  13. Visual DMDX: A web-based authoring tool for DMDX, a Windows display program with millisecond accuracy.

    PubMed

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2015-09-01

    DMDX is a software package for the experimental control and timing of stimulus display on Microsoft Windows systems. DMDX is reliable, flexible, millisecond accurate, and can be downloaded free of charge; it has therefore become very popular among experimental researchers. However, setting up a DMDX-based experiment is burdensome because of its command-based interface. Further, DMDX relies on RTF files in which parts of the stimuli, design, and procedure of an experiment are defined in a complicated (DMASTR-compatible) syntax. Other experiment software, such as E-Prime, PsychoPy, and WEXTOR, became successful as a result of integrated visual authoring tools. Such an intuitive interface was lacking for DMDX. We therefore created and present here Visual DMDX (http://visualdmdx.com/), an HTML5-based web interface for setting up experiments and exporting them to DMDX item files in RTF format. Visual DMDX offers most of the features available from the rich DMDX/DMASTR syntax, and it is a useful tool for supporting researchers who are new to DMDX. Both old and modern versions of DMDX syntax are supported. Further, with Visual DMDX, we go beyond DMDX by adding export to JSON (a versatile web format), easy backup, and a preview option for experiments. In two examples, one each on lexical decision making and affective priming, we explain step by step how to create experiments using Visual DMDX. We release Visual DMDX under an open-source license to foster collaboration in its continuous improvement.

  14. Immunogenetics as a tool in anthropological studies.

    PubMed

    Sanchez-Mazas, Alicia; Fernandez-Viña, Marcelo; Middleton, Derek; Hollenbach, Jill A; Buhler, Stéphane; Di, Da; Rajalingam, Raja; Dugoujon, Jean-Michel; Mack, Steven J; Thorsby, Erik

    2011-06-01

    The genes coding for the main molecules involved in the human immune system--immunoglobulins, human leucocyte antigen (HLA) molecules and killer-cell immunoglobulin-like receptors (KIR)--exhibit a very high level of polymorphism that reveals remarkable frequency variation in human populations. 'Genetic marker' (GM) allotypes located in the constant domains of IgG antibodies have been studied for over 40 years through serological typing, leading to the identification of a variety of GM haplotypes whose frequencies vary sharply from one geographic region to another. An impressive diversity of HLA alleles, which results in amino acid substitutions located in the antigen-binding region of HLA molecules, also varies greatly among populations. The KIR differ between individuals according to both gene content and allelic variation, and also display considerable population diversity. Whereas the molecular evolution of these polymorphisms has most likely been subject to natural selection, principally driven by host-pathogen interactions, their patterns of genetic variation worldwide show significant signals of human geographic expansion, demographic history and cultural diversification. As current developments in population genetic analysis and computer simulation improve our ability to discriminate among different--either stochastic or deterministic--forces acting on the genetic evolution of human populations, the study of these systems shows great promise for investigating both the peopling history of modern humans in the time since their common origin and human adaptation to past environmental (e.g. pathogenic) changes. Therefore, in addition to mitochondrial DNA, Y-chromosome, microsatellites, single nucleotide polymorphisms and other markers, immunogenetic polymorphisms represent essential and complementary tools for anthropological studies.

  15. Microinjection--a tool to study gravitropism

    NASA Technical Reports Server (NTRS)

    Scherp, P.; Hasenstein, K. H.

    2003-01-01

    Despite extensive studies on plant gravitropism, this phenomenon is still poorly understood. The separation of gravity sensing, signal transduction, and response is a common concept, but the mechanism of gravisensing in particular remains unclear. This paper focuses on microinjection as a powerful tool to investigate gravisensing in plants. We describe the microinjection of magnetic beads into rhizoids of the green alga Chara and the subsequent manipulation of the gravisensing system. After injection, an external magnet can control the movement of the magnetic beads. We demonstrate successful injection of magnetic beads into rhizoids and describe a multitude of experiments that can be carried out to investigate gravitropism in Chara rhizoids. In addition to examining mechanical properties, bead microinjection is also useful for probing the function of the cytoskeleton by coating beads with drugs that interfere with the cytoskeleton. The injection of fluorescently labeled beads or probes may reveal the involvement of the cytoskeleton during gravistimulation and response in living cells. © 2003 COSPAR. Published by Elsevier Ltd.

  16. Accuracy Studies of a Magnetometer-Only Attitude-and-Rate-Determination System

    NASA Technical Reports Server (NTRS)

    Challa, M. (Editor); Wheeler, C. (Editor)

    1996-01-01

    A personal-computer-based system was recently prototyped that uses measurements from a three-axis magnetometer (TAM) to estimate the attitude and rates of a spacecraft using no a priori knowledge of the spacecraft's state. Past studies using in-flight data from the Solar, Anomalous, and Magnetospheric Particles Explorer focused on the robustness of the system and demonstrated that attitude and rate estimates could be obtained accurately to 1.5 degrees (deg) and 0.01 deg per second (deg/sec), respectively, despite limitations in the data and in the accuracies of the truth models. This paper studies the accuracy of the Kalman filter in the system using several orbits of in-flight Earth Radiation Budget Satellite (ERBS) data, with attitude and rate truth models obtained from high-precision sensors, to demonstrate its practical capabilities. This paper shows the following: using telemetered TAM data, attitude accuracies of 0.2 to 0.4 deg and rate accuracies of 0.002 to 0.005 deg/sec (within ERBS attitude control requirements of 1 deg and 0.0005 deg/sec) can be obtained with minimal tuning of the filter; replacing the TAM data in the telemetry with simulated TAM data yields corresponding accuracies of 0.1 to 0.2 deg and 0.002 to 0.005 deg/sec, demonstrating that the filter's accuracy can be significantly enhanced by further calibrating the TAM. Factors affecting the filter's accuracy and techniques for tuning the system's Kalman filter are also presented.

  17. Using ProSight PTM and related tools for targeted protein identification and characterization with high mass accuracy tandem MS data.

    PubMed

    Leduc, Richard D; Kelleher, Neil L

    2007-09-01

    ProSight PTM v2.0, neuroProSight, and the Sequence Gazer allow the identification and characterization of proteins from high-mass-accuracy tandem mass spectrometric data of intact proteins and large peptides. Input data consist of one or more neutral precursor ion masses and a set of neutral b/y or c/z(.) fragment ion masses. These data are compared against "shotgun annotated" proteome databases or known protein sequences. With these tools it is possible not only to identify unknown proteins but also to determine the locations of post-translational modifications (PTMs) with 100% sequence coverage. Collectively, the tools create a search environment that allows five different search modes, including absolute mass and sequence tag searching, which are conveniently employed via a graphical user interface. Data management and chemical noise reduction tools are also available. These tools provide a complete environment for the identification and characterization of proteins from high-resolution tandem mass spectrometry of intact proteins and large peptides.

  18. Experimental studies of high-accuracy RFID localization with channel impairments

    NASA Astrophysics Data System (ADS)

    Pauls, Eric; Zhang, Yimin D.

    2015-05-01

    Radio frequency identification (RFID) systems present an incredibly cost-effective and easy-to-implement solution to close-range localization. One of the important applications of a passive RFID system is to determine the reader position through multilateration, based on the estimated distances between the reader and multiple distributed reference tags obtained from, e.g., the received signal strength indicator (RSSI) readings. In practice, the achievable accuracy of passive RFID reader localization suffers from many factors, such as distorted RSSI readings due to channel impairments, in terms of the susceptibility to reader antenna patterns and multipath propagation. Previous studies have shown that the accuracy of passive RFID localization can be significantly improved by properly modeling and compensating for such channel impairments. The objective of this paper is to report experimental study results that validate the effectiveness of such approaches for high-accuracy RFID localization. We also examine a number of practical issues arising in the underlying problem that limit the accuracy of reader-tag distance measurements and, therefore, of the estimated reader location. These issues include the variations in radiation characteristics among similar tags, the effects of tag orientations, and reader RSS quantization and measurement errors. As such, this paper reveals valuable insights into the issues and solutions toward achieving high-accuracy passive RFID localization.
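The multilateration step described above can be illustrated with a minimal sketch. This linearized least-squares formulation (subtracting one range equation from the others) is a generic textbook approach, not the authors' implementation, and the coordinates are made up:

```python
import numpy as np

def multilaterate(anchors, distances):
    """Estimate a 2-D reader position from distances to reference tags.

    Squaring each range equation |p - a_i|^2 = d_i^2 and subtracting the
    last one eliminates |p|^2, leaving a linear system in p = (x, y)."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[-1]
    d0 = d[-1]
    A, b = [], []
    for (xi, yi), di in zip(anchors[:-1], d[:-1]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(xi**2 - x0**2 + yi**2 - y0**2 + d0**2 - di**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

# Three non-collinear tags and exact distances recover the position exactly;
# with noisy RSSI-derived distances, least squares averages out part of the error.
est = multilaterate([(0, 0), (4, 0), (0, 4)], [2**0.5, 10**0.5, 10**0.5])
```

With more than three reference tags the system is overdetermined, which is precisely where RSSI distortion modeling pays off: reducing the bias of each distance estimate tightens the least-squares fit.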

  19. Dose calculation accuracies in whole breast radiotherapy treatment planning: a multi-institutional study.

    PubMed

    Hatanaka, Shogo; Miyabe, Yuki; Tohyama, Naoki; Kumazaki, Yu; Kurooka, Masahiko; Okamoto, Hiroyuki; Tachibana, Hidenobu; Kito, Satoshi; Wakita, Akihisa; Ohotomo, Yuko; Ikagawa, Hiroyuki; Ishikura, Satoshi; Nozaki, Miwako; Kagami, Yoshikazu; Hiraoka, Masahiro; Nishio, Teiji

    2015-07-01

    Our objective in this study was to evaluate the variation in the doses delivered among institutions due to dose calculation inaccuracies in whole breast radiotherapy. We have developed practical procedures for quality assurance (QA) of radiation treatment planning systems. These QA procedures are designed to be performed easily at any institution and to permit comparisons of results across institutions. The dose calculation accuracy was evaluated across seven institutions using various irradiation conditions. In some conditions, there was a >3 % difference between the calculated dose and the measured dose. The dose calculation accuracy differs among institutions because it is dependent on both the dose calculation algorithm and beam modeling. The QA procedures in this study are useful for verifying the accuracy of the dose calculation algorithm and of the beam model before clinical use for whole breast radiotherapy.

  20. Galaxy tools to study genome diversity

    PubMed Central

    2013-01-01

    Background Intra-species genetic variation can be used to investigate population structure, selection, and gene flow in non-model vertebrates; and due to the plummeting costs for genome sequencing, it is now possible for small labs to obtain full-genome variation data from their species of interest. However, those labs may not have easy access to, and familiarity with, computational tools to analyze those data. Results We have created a suite of tools for the Galaxy web server aimed at handling nucleotide and amino-acid polymorphisms discovered by full-genome sequencing of several individuals of the same species, or using a SNP genotyping microarray. In addition to providing user-friendly tools, a main goal is to make published analyses reproducible. While most of the examples discussed in this paper deal with nuclear-genome diversity in non-human vertebrates, we also illustrate the application of the tools to fungal genomes, human biomedical data, and mitochondrial sequences. Conclusions This project illustrates that a small group can design, implement, test, document, and distribute a Galaxy tool collection to meet the needs of a particular community of biologists. PMID:24377391

  1. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  2. Computational Tools for Genomic Studies in Plants.

    PubMed

    Martinez, Manuel

    2016-12-01

    In recent years, the genomic sequences of numerous plant species, including the main crop species, have been determined. Computational tools have been developed to track which plants have been sequenced and where each sequence is hosted. In this mini-review, the databases for genome projects, the databases created to host species/clade projects, and the databases developed to perform plant comparative genomics are reviewed. Because of their importance in modern research, an in-depth analysis of the plant comparative genomics databases has been performed. This comparative analysis focuses on the common and specific computational tools developed to achieve the particular objectives of each database. In addition, emerging high-performance bioinformatics tools specific to plant research are discussed, as is the kind of computational approaches that should be implemented in the coming years to efficiently analyze plant genomes.

  3. Data Analysis Tools for Visualization Study

    DTIC Science & Technology

    2015-08-01

    Test subjects viewed displays in which some items represented true threats. The correct answers and the selections by each subject were recorded as fixed-format text files. The tools parse the text files and insert the data into tables.

  4. A simulation tool for brassiness studies.

    PubMed

    Gilbert, Joël; Menguy, Ludovic; Campbell, Murray

    2008-04-01

    A frequency-domain numerical model of brass instrument sound production is proposed as a tool to predict their brassiness, defined as the rate of spectral enrichment with increasing dynamic level. It is based on generalized Burgers equations dedicated to weakly nonlinear wave propagation in nonuniform ducts, and is an extension of previous work by Menguy and Gilbert [Acta Acustica 86, 798-810 (2000)], initially limited to short cylindrical tubes. The relevance of the present tool is evaluated by carrying out simulations over distances longer than typical shock formation distances, and by doing preliminary simulations of periodic regimes in a typical brass trombone bore geometry.

  5. Effects of implant angulation, material selection, and impression technique on impression accuracy: a preliminary laboratory study.

    PubMed

    Rutkunas, Vygandas; Sveikata, Kestutis; Savickas, Raimondas

    2012-01-01

    The aim of this preliminary laboratory study was to evaluate the effects of 5- and 25-degree implant angulations in simulated clinical casts on an impression's accuracy when using different impression materials and tray selections. A convenience sample of each implant angulation group was selected for both open and closed trays in combination with one polyether and two polyvinyl siloxane impression materials. The influence of material and technique appeared to be significant for both 5- and 25-degree angulations (P < .05), and increased angulation tended to decrease impression accuracy. The open-tray technique was more accurate with highly nonaxially oriented implants for the small sample size investigated.

  6. Meta-analysis diagnostic accuracy of SNP-based pathogenicity detection tools: a case of UGT1A1 gene mutations

    PubMed Central

    Galehdari, Hamid; Saki, Najmaldin; Mohammadi-asl, Javad; Rahim, Fakher

    2013-01-01

    Crigler-Najjar syndrome (CNS) type I and type II are usually inherited as autosomal recessive conditions that result from mutations in the UGT1A1 gene. The main objective of the present review is to summarize all available evidence on the accuracy of SNP-based pathogenicity detection tools, compared against published clinical results, for predicting which nsSNPs lead to disease. A comprehensive search was performed to find all mutations related to CNS. Database searches included dbSNP, SNPdbe, HGMD, Swissvar, Ensembl, and OMIM, and all mutations related to CNS were extracted. Pathogenicity prediction was performed using the SNP-based detection tools SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs corresponding to missense mutations in the UGT1A1 gene were reviewed. Comparing diagnostic odds ratios (OR), PolyPhen2 and Mutpred had the highest value, 4.983 (95% CI: 1.24-20.02) in both, followed by SIFT (diagnostic OR: 3.25, 95% CI: 1.07-9.83). The highest MCC among the tools belonged to SIFT (34.19%), followed by Provean, PolyPhen2, and Mutpred (29.99%, 29.89%, and 29.89%, respectively). Likewise, the highest accuracy (ACC) was achieved by SIFT (62.71%), followed by PolyPhen2 and Mutpred (61.02% in both). Our results suggest that some of the well-established SNP-based pathogenicity detection tools can appropriately reflect the role of a disease-associated SNP in both local and global structures. PMID:23875061
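The summary statistics compared in this review (diagnostic odds ratio, Matthews correlation coefficient, and accuracy) are standard confusion-matrix quantities. A minimal sketch, with hypothetical counts rather than the review's data:

```python
import math

def diagnostic_metrics(tp, fp, fn, tn):
    """Confusion-matrix summaries used to rank prediction tools:
    diagnostic odds ratio (DOR), Matthews correlation coefficient (MCC),
    and overall accuracy (ACC). Assumes no zero cells for the DOR."""
    dor = (tp * tn) / (fp * fn)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    acc = (tp + tn) / (tp + fp + fn + tn)
    return dor, mcc, acc

# Hypothetical tool evaluated on 15 variants:
dor, mcc, acc = diagnostic_metrics(tp=6, fp=2, fn=3, tn=4)
```

Note that DOR and MCC can rank tools differently, as they do in the review: DOR only uses the product of correct versus incorrect cells, while MCC also penalizes imbalance between the error types.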

  7. Improving Accuracy of Decoding Emotions from Facial Expressions by Cooperative Learning Techniques, Two Experimental Studies.

    ERIC Educational Resources Information Center

    Klinzing, Hans Gerhard

    A program was developed for the improvement of social competence in general among professionals with the improvement of the accuracy of decoding emotions from facial expressions as the specific focus. It was integrated as a laboratory experience into traditional lectures at two German universities where studies were conducted to assess the…

  8. Accuracy and Precision of Partial-Volume Correction in Oncological PET/CT Studies.

    PubMed

    Cysouw, Matthijs C F; Kramer, Gerbrand Maria; Hoekstra, Otto S; Frings, Virginie; de Langen, Adrianus Johannes; Smit, Egbert F; van den Eertwegh, Alfons J M; Oprea-Lager, Daniela E; Boellaard, Ronald

    2016-10-01

    Accurate quantification of tracer uptake in small tumors using PET is hampered by the partial-volume effect as well as by the method of volume-of-interest (VOI) delineation. This study aimed to investigate the effect of partial-volume correction (PVC) combined with several VOI methods on the accuracy and precision of quantitative PET.

  9. Do Fixation Cues Ensure Fixation Accuracy in Split-Fovea Studies of Word Recognition?

    ERIC Educational Resources Information Center

    Jordan, Timothy R.; Paterson, Kevin B.; Kurtev, Stoyan; Xu, Mengyun

    2009-01-01

    Many studies have claimed that hemispheric processing is split precisely at the foveal midline and so place great emphasis on the precise location at which words are fixated. These claims are based on experiments in which a variety of fixation procedures were used to ensure fixation accuracy but the effectiveness of these procedures is unclear. We…

  10. Adjusting Expectations: The Study of Complexity, Accuracy, and Fluency in Second Language Acquisition

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2009-01-01

    It is a good practice to try to understand matters at hand by first stepping back and adopting an historical perspective, which I will begin this review by doing. Next, I will take up the challenges that each of the authors in the articles in this volume has presented for the study of complexity, accuracy, and fluency (CAF) in second language…

  11. Breaking the Code of Silence: A Study of Teachers' Nonverbal Decoding Accuracy of Foreign Language Anxiety

    ERIC Educational Resources Information Center

    Gregersen, Tammy

    2007-01-01

    This study examined teachers' accuracy in decoding nonverbal behaviour indicative of foreign language anxiety. Teachers and teacher trainees twice observed a videotape without sound of seven beginning French foreign language students as they participated in an oral exam; four of these students were defined as anxious language learners by the…

  12. Interferometric study of a machine tool

    NASA Astrophysics Data System (ADS)

    Hoefling, Roland; Vaclavik, Jaroslav; Neigebauer, Reimund

    1996-09-01

    This paper describes the use of a non-destructive optical technique, digital speckle pattern interferometry, for the deformation analysis of a machine tool. An interferometric set-up has been designed and measurements of the milling head deformation have been made on the horizontal single spindle milling machine center.

  13. A Study of Confidence and Accuracy Using the Rasch Modeling Procedures. Research Report. ETS RR-08-42

    ERIC Educational Resources Information Center

    Paek, Insu; Lee, Jihyun; Stankov, Lazar; Wilson, Mark

    2008-01-01

    This study investigated the relationship between students' actual performance (accuracy) and their subjective judgments of accuracy (confidence) on selected English language proficiency tests. The unidimensional and multidimensional IRT Rasch approaches were used to model the discrepancy between confidence and accuracy at the item and test level…

  14. Theoretical study of precision and accuracy of strain analysis by nano-beam electron diffraction.

    PubMed

    Mahr, Christoph; Müller-Caspary, Knut; Grieb, Tim; Schowalter, Marco; Mehrtens, Thorsten; Krause, Florian F; Zillmann, Dennis; Rosenauer, Andreas

    2015-11-01

    Measurement of lattice strain is important to characterize semiconductor nanostructures. As strain has large influence on the electronic band structure, methods for the measurement of strain with high precision, accuracy and spatial resolution in a large field of view are mandatory. In this paper we present a theoretical study of precision and accuracy of measurement of strain by convergent nano-beam electron diffraction. It is found that the accuracy of the evaluation suffers from halos in the diffraction pattern caused by a variation of strain within the area covered by the focussed electron beam. This effect, which is expected to be strong at sharp interfaces between materials with different lattice plane distances, will be discussed for convergent-beam electron diffraction patterns using a conventional probe and for patterns formed by a precessing electron beam. Furthermore, we discuss approaches to optimize the accuracy of strain measured at interfaces. The study is based on the evaluation of diffraction patterns simulated for different realistic structures that have been investigated experimentally in former publications. These simulations account for thermal diffuse scattering using the frozen-lattice approach and the modulation-transfer function of the image-recording system. The influence of Poisson noise is also investigated.

  15. A phantom study on the positioning accuracy of the Novalis Body system.

    PubMed

    Yan, Hui; Yin, Fang-Fang; Kim, Jae Ho

    2003-12-01

    A phantom study was conducted to investigate inherent positioning accuracy of an image-guided patient positioning system-the Novalis Body system for three-dimensional (3-D) conformal radiotherapy. This positioning system consists of two infrared (IR) cameras and one video camera and two kV x-ray imaging devices. The initial patient setup was guided by the IR camera system and the target localization was accomplished using the kV x-ray imaging system. In this study, the IR marker shift and phantom rotation were simulated and their effects on the positioning accuracy were examined by a Rando phantom. The effects of CT slice thickness and treatment sites on the positioning accuracy were tested. In addition, the internal target shift was simulated and its effect on the positioning accuracy was examined by a water tank. With the application of the Novalis Body system, the positioning error of the planned isocenter was significantly reduced. The experimental results with the simulated IR marker shifts indicated that the positioning errors of the planned isocenter were 0.6 +/- 0.3, 0.5 +/- 0.2, and 0.7 +/- 0.2 mm along the lateral, longitudinal, and vertical axes, respectively. The experimental results with the simulated phantom rotations indicated that the positioning errors of the planned isocenter were 0.6 +/- 0.3, 0.7 +/- 0.2, and 0.5 +/- 0.2 mm along the three axes, respectively. The experimental results with the simulated target shifts indicated that the positioning errors of the planned isocenter were 0.6 +/- 0.3, 0.7 +/- 0.2, and 0.5 +/- 0.2 mm along the three axes, respectively. On average, the positioning accuracy of 1 mm for the planned isocenter was achieved using the Novalis Body system.

  16. Studies of the accuracy of time integration methods for reaction-diffusion equations

    NASA Astrophysics Data System (ADS)

    Ropp, David L.; Shadid, John N.; Ober, Curtis C.

    2004-03-01

    In this study we present numerical experiments of time integration methods applied to systems of reaction-diffusion equations. Our main interest is in evaluating the relative accuracy and asymptotic order of accuracy of the methods on problems which exhibit an approximate balance between the competing component time scales. Nearly balanced systems can produce a significant coupling of the physical mechanisms and introduce a slow dynamical time scale of interest. These problems provide a challenging test for this evaluation and tend to reveal subtle differences between the various methods. The methods we consider include first- and second-order semi-implicit, fully implicit, and operator-splitting techniques. The test problems include a prototype propagating nonlinear reaction-diffusion wave, a non-equilibrium radiation-diffusion system, a Brusselator chemical dynamics system and a blow-up example. In this evaluation we demonstrate a "split personality" for the operator-splitting methods that we consider. While operator-splitting methods often obtain very good accuracy, they can also manifest a serious degradation in accuracy due to stability problems.
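First-order (Lie) operator splitting, one of the techniques evaluated above, can be demonstrated on a scalar surrogate problem; the equation u' = -u² - λu, its parameters, and the use of exact sub-step solutions are illustrative stand-ins, not the paper's actual test problems:

```python
import math

LAM = 1.0  # illustrative decay rate

def exact(u0, t):
    """Closed-form solution of u' = -u**2 - LAM*u via the substitution v = 1/u,
    which turns the equation into the linear ODE v' = LAM*v + 1."""
    v = (1.0 / u0 + 1.0 / LAM) * math.exp(LAM * t) - 1.0 / LAM
    return 1.0 / v

def lie_split(u0, t_end, n_steps):
    """First-order (Lie) splitting: advance the 'reaction' u' = -u**2 and the
    'diffusion-like' decay u' = -LAM*u alternately, each sub-step solved exactly,
    so all of the error comes from the splitting itself."""
    dt, u = t_end / n_steps, u0
    for _ in range(n_steps):
        u = u / (1.0 + dt * u)        # reaction sub-step, exact
        u = u * math.exp(-LAM * dt)   # decay sub-step, exact
    return u

# Halving the step size roughly halves the error: first-order accuracy.
e1 = abs(lie_split(1.0, 1.0, 50) - exact(1.0, 1.0))
e2 = abs(lie_split(1.0, 1.0, 100) - exact(1.0, 1.0))
```

Since each sub-step is exact, the observed error isolates the splitting error, which is the kind of diagnostic the study uses to reveal differences between semi-implicit, fully implicit, and operator-splitting schemes.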

  17. Simulation approach for the evaluation of tracking accuracy in radiotherapy: a preliminary study.

    PubMed

    Tanaka, Rie; Ichikawa, Katsuhiro; Mori, Shinichiro; Sanada, Sigeru

    2013-01-01

    Real-time tumor tracking in external radiotherapy can be achieved by diagnostic (kV) X-ray imaging with a dynamic flat-panel detector (FPD). It is important to keep the patient dose as low as possible while maintaining tracking accuracy. A simulation approach would be helpful to optimize the imaging conditions. This study was performed to develop a computer simulation platform based on a noise property of the imaging system for the evaluation of tracking accuracy at any noise level. Flat-field images were obtained using a direct-type dynamic FPD, and noise power spectrum (NPS) analysis was performed. The relationship between incident quantum number and pixel value was addressed, and a conversion function was created. The pixel values were converted into a map of quantum number using the conversion function, and the map was then input into the random number generator to simulate image noise. Simulation images were provided at different noise levels by changing the incident quantum numbers. Subsequently, an implanted marker was tracked automatically and the maximum tracking errors were calculated at different noise levels. The results indicated that the maximum tracking error increased with decreasing incident quantum number in flat-field images with an implanted marker. In addition, the range of errors increased with decreasing incident quantum number. The present method could be used to determine the relationship between image noise and tracking accuracy. The results indicated that the simulation approach would aid in determining exposure dose conditions according to the necessary tracking accuracy.
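The noise-resampling idea described above (convert pixel values to a map of expected incident quanta, then draw random counts at the chosen exposure level) can be sketched as follows; the function name is hypothetical, and the pixel-value-to-quanta conversion is assumed to have been applied already:

```python
import numpy as np

def simulate_noisy_frame(quanta_map, rng=None):
    """Resample a detector frame at a chosen exposure level.

    quanta_map -- expected incident quantum number per pixel, obtained
    from a (system-specific) pixel-value-to-quanta conversion function.
    Quantum noise is Poisson-distributed about the expected quanta."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.poisson(quanta_map).astype(float)

# Lowering the incident quanta (a lower-dose setting) raises the relative
# noise, since std/mean ~ 1/sqrt(N) for Poisson counting statistics.
frame = simulate_noisy_frame(np.full((200, 200), 1000.0),
                             np.random.default_rng(0))
```

Running the marker-tracking algorithm on frames simulated at successively lower quanta then maps image noise to maximum tracking error, which is the trade-off the study quantifies.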

  18. A Study of the dimensional accuracy obtained by low cost 3D printing for possible application in medicine

    NASA Astrophysics Data System (ADS)

    Kitsakis, K.; Alabey, P.; Kechagias, J.; Vaxevanidis, N.

    2016-11-01

    'Low-cost 3D printing' is a term that refers to the fused filament fabrication (FFF) technique, which constructs physical prototypes by depositing material layer by layer using a thermal nozzle head. Nowadays, 3D printing is widely used in medical applications such as tissue engineering, as well as a supporting tool for diagnosis and treatment in neurosurgery, orthopedic surgery, and dental-cranio-maxillo-facial surgery. 3D CAD medical models are usually obtained from MRI or CT scans and then sent to a 3D printer for physical model creation. The present paper gives a brief overview of the benefits and limitations of 3D printing applications in the field of medicine, together with a dimensional accuracy study of a low-cost 3D printing technique.

  19. Accuracy of bite mark analysis from food substances: A comparative study

    PubMed Central

    Daniel, M. Jonathan; Pazhani, Ambiga

    2015-01-01

    Aims and Objectives: The aims and objectives of the study were to compare the accuracy of bite mark analysis from three different food substances (apple, cheese, and chocolate) using two techniques, the manual docking procedure and the computer-assisted overlay generation technique, and to compare the accuracy of the two techniques for bite mark analysis on food substances. Materials and Methods: The individuals who participated in the study were made to bite on three food substances: apple, cheese, and chocolate. Dentate individuals were included in the study; edentulous individuals and individuals with a missing anterior tooth were excluded. The dental casts of each individual were applied to the positive cast of the bitten food substance to determine docking or matching. Then, computer-generated overlays were compared with the bite mark pattern on the foodstuff. Results: The results were tabulated; the comparison of bite mark analysis across the three food substances was analyzed by the Kruskal-Wallis ANOVA test, and the comparison of the two techniques was analyzed by Spearman's rho correlation coefficient. Conclusion: On comparing the bite mark analysis from the three food substances, accuracy was found to be greater for chocolate and cheese than for apple. PMID:26816463
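    Of the two statistics named above, Spearman's rho is simply a Pearson correlation computed on ranks. A self-contained sketch follows; the matching scores are invented for illustration, not data from the study.

```python
import math

def ranks(xs):
    """Average 1-based ranks, with ties assigned the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den

# hypothetical matching scores from the two bite-mark techniques
manual_docking = [8, 6, 7, 9, 5, 6]
overlay_method = [7, 6, 8, 9, 4, 5]
rho = spearman_rho(manual_docking, overlay_method)
```

    A rho near 1 would indicate the two techniques rank the bite marks similarly.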

  20. Defense Medical Human Resources System-internet (DMHRSi): A Case Study on Compliance and Accuracy

    DTIC Science & Technology

    2009-06-02

    further education focusing on the importance of DMHRSi. In a recent study, intensive care nurses increased physicians' adherence to following...promoting compliance and accuracy, ensure users have adequate education regarding DMHRSi, and use an organizational change model to improve user...

  1. Does the Reporting Quality of Diagnostic Test Accuracy Studies, as Defined by STARD 2015, Affect Citation?

    PubMed Central

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang

    2016-01-01

    Objective To determine the rate at which diagnostic test accuracy studies published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. Materials and Methods All eligible diagnostic test accuracy studies published in the Korean Journal of Radiology in 2011–2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that relate directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) from publication until March 2016, and the article exposure time (months between publication and March 2016), were extracted. Results Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5–25) and 11.4 (7–15), respectively. The mean citation number was 4 (0–21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). Conclusion The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate. PMID:27587959

  2. A Comparison of Parameter Study Creation and Job Submission Tools

    NASA Technical Reports Server (NTRS)

    DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We consider the differences between the available general-purpose parameter study and job submission tools. These tools necessarily share many features, but frequently differ in the way those features are designed and implemented. For this class of features, we only briefly outline the essential differences. However, we focus on the unique features which distinguish the ILab parameter study and job submission tool from other packages, and which make the ILab tool easier and more suitable for use in our research and engineering environment.

  3. Accuracy of self-evaluation in adults with ADHD: evidence from a driving study.

    PubMed

    Knouse, Laura E; Bagwell, Catherine L; Barkley, Russell A; Murphy, Kevin R

    2005-05-01

    Research on children with ADHD indicates an association with inaccuracy of self-appraisal. This study examines the accuracy of self-evaluations in clinic-referred adults diagnosed with ADHD. Self-assessments and performance measures of driving in naturalistic settings and on a virtual-reality driving simulator are used to assess accuracy of self-evaluations. The group diagnosed with ADHD (n= 44) has a higher rate of collisions, speeding tickets, and total driving citations in their driving history; report less use of safe driving behaviors in naturalistic settings; and use fewer safe driving behaviors in the simulator than the community comparison group (n= 44). Despite poorer performance, adults with ADHD provide similar driving self-assessments, thereby overestimating in naturalistic settings to a greater degree than the comparison group. These findings extend research in children with ADHD to an adult sample in an important domain of functioning and may relate to findings of executive deficits associated with ADHD.

  4. An Initial Study of Airport Arrival Capacity Benefits Due to Improved Scheduling Accuracy

    NASA Technical Reports Server (NTRS)

    Meyn, Larry; Erzberger, Heinz

    2005-01-01

    The long-term growth rate in air-traffic demand leads to future air-traffic densities that are unmanageable by today's air-traffic control system. In order to accommodate such growth, new technology and operational methods will be needed in the next-generation air-traffic control system. One proposal for such a system is the Automated Airspace Concept (AAC). One of the precepts of AAC is to direct aircraft using trajectories that are sent via an air-ground data link. This greatly improves the accuracy in directing aircraft to specific waypoints at specific times. Studies of the Center-TRACON Automation System (CTAS) have shown that increased scheduling accuracy enables increased arrival capacity at CTAS-equipped airports.

  5. Accuracy of Electronic Health Record Data for Identifying Stroke Cases in Large-Scale Epidemiological Studies: A Systematic Review from the UK Biobank Stroke Outcomes Group

    PubMed Central

    Woodfield, Rebecca; Grant, Ian; Sudlow, Cathie L. M.

    2015-01-01

    Objective Long-term follow-up of population-based prospective studies is often achieved through linkages to coded regional or national health care data. Our knowledge of the accuracy of such data is incomplete. To inform methods for identifying stroke cases in UK Biobank (a prospective study of 503,000 UK adults recruited in middle-age), we systematically evaluated the accuracy of these data for stroke and its main pathological types (ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage), determining the optimum codes for case identification. Methods We sought studies published from 1990 to November 2013, which compared coded data from death certificates, hospital admissions or primary care with a reference standard for stroke or its pathological types. We extracted information on a range of study characteristics and assessed study quality with the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). To assess accuracy, we extracted data on positive predictive values (PPV) and, where available, on sensitivity, specificity, and negative predictive values (NPV). Results 37 of 39 eligible studies assessed accuracy of International Classification of Diseases (ICD)-coded hospital or death certificate data. They varied widely in their settings, methods, reporting, quality, and in the choice and accuracy of codes. Although PPVs for stroke and its pathological types ranged from 6–97%, appropriately selected, stroke-specific codes (rather than broad cerebrovascular codes) consistently produced PPVs >70%, and in several studies >90%. The few studies with data on sensitivity, specificity and NPV showed higher sensitivity of hospital versus death certificate data for stroke, with specificity and NPV consistently >96%. Few studies assessed either primary care data or combinations of data sources. Conclusions Particular stroke-specific codes can yield high PPVs (>90%) for stroke/stroke types. Inclusion of primary care data and combining data sources should
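    The accuracy measures compared in this review (PPV, NPV, sensitivity, specificity) all derive from a 2x2 table of coded data against the reference standard. A minimal sketch with invented counts, not data from any of the included studies:

```python
def diagnostic_measures(tp, fp, fn, tn):
    """PPV, NPV, sensitivity and specificity from a 2x2 validation table.

    tp/fp: code-positive cases that are true/false strokes on review;
    fn/tn: code-negative cases that are/are not strokes on review.
    """
    return {
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# hypothetical validation of stroke-specific hospital codes vs case review
m = diagnostic_measures(tp=90, fp=10, fn=20, tn=880)
```

    With these illustrative counts the stroke-specific code achieves a PPV of 0.90 while missing some cases (sensitivity about 0.82), mirroring the review's pattern of high PPV but variable sensitivity.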

  6. A reference dataset for deformable image registration spatial accuracy evaluation using the COPDgene study archive.

    PubMed

    Castillo, Richard; Castillo, Edward; Fuentes, David; Ahmad, Moiz; Wood, Abbie M; Ludwig, Michelle S; Guerrero, Thomas

    2013-05-07

    Landmark point-pairs provide a strategy to assess deformable image registration (DIR) accuracy in terms of the spatial registration of the underlying anatomy depicted in medical images. In this study, we propose to augment a publicly available database (www.dir-lab.com) of medical images with large sets of manually identified anatomic feature pairs between breath-hold computed tomography (BH-CT) images for DIR spatial accuracy evaluation. Ten BH-CT image pairs were randomly selected from the COPDgene study cases. Each patient had received CT imaging of the entire thorax in the supine position at one-fourth dose normal expiration and maximum effort full dose inspiration. Using dedicated in-house software, an imaging expert manually identified large sets of anatomic feature pairs between images. Estimates of inter- and intra-observer spatial variation in feature localization were determined by repeat measurements of multiple observers over subsets of randomly selected features. 7298 anatomic landmark features were manually paired between the 10 sets of images. Quantity of feature pairs per case ranged from 447 to 1172. Average 3D Euclidean landmark displacements varied substantially among cases, ranging from 12.29 (SD: 6.39) to 30.90 (SD: 14.05) mm. Repeat registration of uniformly sampled subsets of 150 landmarks for each case yielded estimates of observer localization error, which ranged in average from 0.58 (SD: 0.87) to 1.06 (SD: 2.38) mm for each case. The additions to the online web database (www.dir-lab.com) described in this work will broaden the applicability of the reference data, providing a freely available common dataset for targeted critical evaluation of DIR spatial accuracy performance in multiple clinical settings. Estimates of observer variance in feature localization suggest consistent spatial accuracy for all observers across both four-dimensional CT and COPDgene patient cohorts.
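    The summary statistics reported above (mean and SD of 3D Euclidean landmark displacements) reduce to a short computation over paired coordinates. The landmark coordinates below are invented for illustration, in mm:

```python
import math
import statistics

def displacement_stats(landmarks_a, landmarks_b):
    """Mean and SD of 3D Euclidean distances between paired landmarks."""
    d = [math.dist(p, q) for p, q in zip(landmarks_a, landmarks_b)]
    return statistics.mean(d), statistics.stdev(d)

# hypothetical paired landmarks (expiration vs inspiration scans), mm
expiration  = [(10.0, 20.0, 30.0), (12.0, 22.0, 31.0), (15.0, 18.0, 29.0)]
inspiration = [(13.0, 24.0, 30.0), (12.0, 25.0, 35.0), (19.0, 18.0, 33.0)]
mean_d, sd_d = displacement_stats(expiration, inspiration)
```

    The same computation applied to repeat annotations of the same landmarks gives the observer-localization-error estimates quoted in the abstract.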

  7. The Effect of Study Design Biases on the Diagnostic Accuracy of Magnetic Resonance Imaging to Detect Silicone Breast Implant Ruptures: A Meta-Analysis

    PubMed Central

    Song, Jae W.; Kim, Hyungjin Myra; Bellfi, Lillian T.; Chung, Kevin C.

    2010-01-01

    Background All silicone breast implant recipients are recommended by the US Food and Drug Administration to undergo serial screening to detect implant rupture with magnetic resonance imaging (MRI). We performed a systematic review of the literature to assess the quality of diagnostic accuracy studies utilizing MRI or ultrasound to detect silicone breast implant rupture and conducted a meta-analysis to examine the effect of study design biases on the estimation of MRI diagnostic accuracy measures. Method Studies investigating the diagnostic accuracy of MRI and ultrasound in evaluating ruptured silicone breast implants were identified using MEDLINE, EMBASE, ISI Web of Science, and Cochrane library databases. Two reviewers independently screened potential studies for inclusion and extracted data. Study design biases were assessed using the QUADAS tool and the STARD checklist. Meta-analyses estimated the influence of biases on diagnostic odds ratios. Results Among 1175 identified articles, 21 met the inclusion criteria. Most studies using MRI (n = 10 of 16) and ultrasound (n = 10 of 13) examined symptomatic subjects. Meta-analyses revealed that MRI studies evaluating symptomatic subjects had 14-fold higher diagnostic accuracy estimates compared to studies using an asymptomatic sample (RDOR 13.8; 95% CI 1.83–104.6) and 2-fold higher diagnostic accuracy estimates compared to studies using a screening sample (RDOR 1.89; 95% CI 0.05–75.7). Conclusion Many of the published studies utilizing MRI or ultrasound to detect silicone breast implant rupture are flawed with methodological biases. These methodological shortcomings may result in overestimated MRI diagnostic accuracy measures and should be interpreted with caution when applying the data to a screening population. PMID:21364405

  8. Pitch discrimination accuracy in musicians vs nonmusicians: an event-related potential and behavioral study.

    PubMed

    Tervaniemi, Mari; Just, Viola; Koelsch, Stefan; Widmann, Andreas; Schröger, Erich

    2005-02-01

    Previously, professional violin players were found to automatically discriminate tiny pitch changes that are not discriminable by nonmusicians. The present study addressed pitch processing accuracy in musicians with expertise in playing a wide selection of instruments (e.g., piano, wind, and string instruments). Of specific interest was whether musicians with such divergent backgrounds also show facilitated accuracy at automatic and/or attentive levels of auditory processing. Thirteen professional musicians and 13 nonmusicians were presented with frequent standard sounds and rare deviant sounds (0.8, 2, or 4% higher in frequency). Auditory event-related potentials evoked by these sounds were recorded while the subjects first read a self-chosen book and second indicated behaviorally the detection of sounds with deviant frequency. Musicians detected the pitch changes faster and more accurately than nonmusicians. The N2b and P3 responses recorded during attentive listening had larger amplitudes in musicians than in nonmusicians. Interestingly, the superiority of musicians over nonmusicians in pitch discrimination accuracy was observed not only with the 0.8% but also with the 2% frequency changes. Moreover, even nonmusicians detected the smallest pitch changes of 0.8% quite reliably. However, the mismatch negativity (MMN) and P3a recorded during the reading condition did not differentiate musicians from nonmusicians. These results suggest that musical expertise may exert its effects merely at attentive levels of processing and not necessarily already at preattentive levels.

  9. High accuracy differential pressure measurements using fluid-filled catheters - A feasibility study in compliant tubes.

    PubMed

    Rotman, Oren Moshe; Weiss, Dar; Zaretsky, Uri; Shitzer, Avraham; Einav, Shmuel

    2015-09-18

    High accuracy differential pressure measurements are required in various biomedical and medical applications, such as in fluid-dynamic test systems, or in the cath-lab. Differential pressure measurements using fluid-filled catheters are relatively inexpensive, yet may be subjected to common mode pressure errors (CMP), which can significantly reduce the measurement accuracy. Recently, a novel correction method for high accuracy differential pressure measurements was presented, and was shown to effectively remove CMP distortions from measurements acquired in rigid tubes. The purpose of the present study was to test the feasibility of this correction method inside compliant tubes, which effectively simulate arteries. Two tubes with varying compliance were tested under dynamic flow and pressure conditions to cover the physiological range of radial distensibility in coronary arteries. A third, compliant model, with a 70% stenosis severity was additionally tested. Differential pressure measurements were acquired over a 3 cm tube length using a fluid-filled double-lumen catheter, and were corrected using the proposed CMP correction method. Validation of the corrected differential pressure signals was performed by comparison to differential pressure recordings taken via a direct connection to the compliant tubes, and by comparison to predicted differential pressure readings of matching fluid-structure interaction (FSI) computational simulations. The results show excellent agreement between the experimentally acquired and computationally determined differential pressure signals. This validates the application of the CMP correction method in compliant tubes of the physiological range for up to intermediate size stenosis severity of 70%.

  10. Sex differences in accuracy and precision when judging time to arrival: data from two Internet studies.

    PubMed

    Sanders, Geoff; Sinclair, Kamila

    2011-12-01

    We report two Internet studies that investigated sex differences in the accuracy and precision of judging time to arrival. We used accuracy to mean the ability to match the actual time to arrival and precision to mean the consistency with which each participant made their judgments. Our task was presented as a computer game in which a toy UFO moved obliquely towards the participant through a virtual three-dimensional space en route to a docking station. The UFO disappeared before docking and participants pressed their space bar at the precise moment they thought the UFO would have docked. Study 1 showed it was possible to conduct quantitative studies of spatiotemporal judgments in virtual reality via the Internet and confirmed reports that men are more accurate because women underestimate, but found no difference in precision measured as intra-participant variation. Study 2 repeated Study 1 with five additional presentations of one condition to provide a better measure of precision. Again, men were more accurate than women but there were no sex differences in precision. However, within the coincidence-anticipation timing (CAT) literature, of those studies that report sex differences, a majority found that males are both more accurate and more precise than females. Noting that many CAT studies report no sex differences, we discuss appropriate interpretations of such null findings. While acknowledging that CAT performance may be influenced by experience, we suggest that the sex difference may have originated among our ancestors with the evolutionary selection of men for hunting and women for gathering.

  11. Results of a remote multiplexer/digitizer unit accuracy and environmental study

    NASA Technical Reports Server (NTRS)

    Wilner, D. O.

    1977-01-01

    A remote multiplexer/digitizer unit (RMDU), a part of the airborne integrated flight test data system, was subjected to an accuracy study. The study was designed to show the effects of temperature, altitude, and vibration on the RMDU. The RMDU was subjected to tests at temperatures from -54 C (-65 F) to 71 C (160 F), and the resulting data are presented here, along with a complete analysis of the effects. The methods and means used for obtaining correctable data and correcting the data are also discussed.

  12. Updating Risk Prediction Tools: A Case Study in Prostate Cancer

    PubMed Central

    Ankerst, Donna P.; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J.; Feng, Ziding; Sanda, Martin G.; Partin, Alan W.; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M.

    2013-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, the risk algorithms need to be updated to include them. Typically the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [−2]proPSA measured on an external case-control study performed in Texas, USA. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. PMID:22095849
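    The Bayes-rule mechanism described here amounts to multiplying the prior odds from the original tool by a likelihood ratio for the new markers estimated in the external study. A minimal sketch follows; the function name and all numbers are illustrative assumptions, not values from the Prostate Cancer Prevention Trial Risk Calculator.

```python
def bayes_update_risk(prior_risk, likelihood_ratio):
    """Update a predicted risk with a marker likelihood ratio via Bayes rule.

    prior_risk: risk from the original prediction tool (between 0 and 1).
    likelihood_ratio: P(marker result | case) / P(marker result | control),
    estimated from the external marker study (an assumed value here).
    """
    prior_odds = prior_risk / (1.0 - prior_risk)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# e.g. a 20% prior risk combined with a marker likelihood ratio of 2.5
updated = bayes_update_risk(0.20, 2.5)
```

    A likelihood ratio of 1 leaves the risk unchanged, which is why the update only needs the marker distributions from the external study and not a refit of the original tool.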

  13. Status of the VOTech Design Study about User Tools

    NASA Astrophysics Data System (ADS)

    Dolensky, M.; Pierfederici, F.; Allen, M.; Boch, T.; Bonnarel, F.; Derrière, S.; Fernique, P.; Noddle, K.; Smareglia, R.

    2006-07-01

    The VOTech design study on future tools started in spring 2005. This project, co-funded by the EC, produces design documents and software prototypes for new VO-compliant end-user tools. It is based on the experience and feedback of precursor projects and on input from the scientific user community. This status report details a number of early deliverables available from the project pages wiki.eurovotech.org, section DS4. This includes a summary of existing tools, desired future tools as derived from the AVO SRM, requirements for a cross matcher, a simple method for transferring instrumental footprints, use cases for simulations and the evaluation of various technologies.

  14. Dimensional Accuracy of Hydrophilic and Hydrophobic VPS Impression Materials Using Different Impression Techniques - An Invitro Study

    PubMed Central

    Pilla, Ajai; Pathipaka, Suman

    2016-01-01

    Introduction The dimensional stability of the impression material could have an influence on the accuracy of the final restoration. Vinyl Polysiloxane impression materials (VPS) are most frequently used as the impression material in fixed prosthodontics. As VPS is hydrophobic when it is poured with gypsum products, manufacturers added intrinsic surfactants and marketed it as hydrophilic VPS. These hydrophilic VPS have shown increased wettability with gypsum slurries. VPS are available in different viscosities ranging from very low to very high for usage under different impression techniques. Aim To compare the dimensional accuracy of hydrophilic VPS and hydrophobic VPS using monophase, one-step and two-step putty wash impression techniques. Materials and Methods To test the dimensional accuracy of the impression materials, a stainless steel die was fabricated as prescribed by ADA specification no. 19 for elastomeric impression materials. A total of 60 impressions were made. The materials were divided into two groups, Group 1 hydrophilic VPS (Aquasil) and Group 2 hydrophobic VPS (Variotime). These were further divided into three subgroups A, B, C for the monophase, one-step and two-step putty wash techniques, with 10 samples in each subgroup. The dimensional accuracy of the impressions was evaluated after 24 hours using a vertical profile projector with lens magnification range of 20X-125X illumination. The study was analyzed through one-way ANOVA, post-hoc Tukey HSD test and unpaired t-test for mean comparison between groups. Results Results showed that the three different impression techniques (monophase, one-step and two-step putty wash) did cause a significant change in dimensional accuracy between hydrophilic VPS and hydrophobic VPS impression materials. One-way ANOVA disclosed that mean dimensional change and SD for hydrophilic VPS varied between 0.56% and 0.16%, which were low, suggesting hydrophilic VPS was satisfactory with all three impression techniques.
However, mean

  15. Are the surgeon's movements repeatable? An analysis of the feasibility and expediency of implementing support procedures guiding the surgical tools and increasing motion accuracy during the performance of stereotypical movements by the surgeon.

    PubMed

    Podsędkowski, Leszek Robert; Moll, Jacek; Moll, Maciej; Frącczak, Łukasz

    2014-03-01

    The developments in surgical robotics suggest that it will be possible to entrust surgical robots with a wider range of tasks. So far, it has not been possible to automate the surgery procedures related to soft tissue. Thus, the objective of the conducted studies was to confirm the hypothesis that the surgery telemanipulator can be equipped with certain routines supporting the surgeon in leading the surgical tools and increasing motion accuracy during stereotypical movements. As the first step in facilitating the surgery, an algorithm will be developed which will concurrently provide automation and allow the surgeon to maintain full control over the slave robot. The algorithm will assist the surgeon in performing typical movement sequences. This kind of support must, however, be preceded by determining the reference points for accurately defining the position of the stitched tissue. It is in relation to these points that the tool's trajectory will be created, along which the master manipulator will guide the surgeon's hand. The paper presents the first stage, concerning the selection of movements for which the support algorithm will be used. The work also contains an analysis of surgical movement repeatability. The suturing movement was investigated in detail by experimental research in order to determine motion repeatability and verify the position of the stitched tissue. Tool trajectory was determined by a motion capture stereovision system. The study has demonstrated that the suturing movement could be considered as repeatable; however, the trajectories performed by different surgeons exhibit some individual characteristics.

  16. Study on Increasing the Accuracy of Classification Based on Ant Colony algorithm

    NASA Astrophysics Data System (ADS)

    Yu, M.; Chen, D.-W.; Dai, C.-Y.; Li, Z.-L.

    2013-05-01

    The application of GIS advances the ability to analyze remote sensing images. Classification and extraction of remote sensing images are the primary information source for GIS in LUCC (land use and cover change) applications, so increasing classification accuracy is an important topic of remote sensing research. Adding features and developing new classification methods are two ways to improve classification accuracy. Within a defined mode framework, the agents of ant colony algorithms, from the field of nature-inspired computation, exhibit a uniform intelligent computation mode; applying ant colony algorithms to remote sensing image classification is a new, preliminary use of swarm intelligence. Studying the applicability of the ant colony algorithm with additional features, and exploring its advantages and performance, is therefore of considerable significance. The study area is the outskirts of Fuzhou, Fujian Province, which has complicated land use. A multi-source database was built, integrating spectral information (TM1-5, TM7, NDVI, NDBI), topographic characteristics (DEM, Slope, Aspect) and textural information (Mean, Variance, Homogeneity, Contrast, Dissimilarity, Entropy, Second Moment, Correlation). Classification rules based on the different characteristics were discovered from the samples through the ant colony algorithm, and a classification test was performed using these rules. For comparison, accuracies were also checked against the traditional maximum likelihood method, the C4.5 algorithm, and rough-set classification. The study showed that the accuracy of classification based on the ant colony algorithm is higher than that of the other methods. In addition, near-term land use and cover changes in Fuzhou were studied and the figures displayed using remote sensing technology based on the ant colony algorithm.

  17. The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology

    PubMed Central

    Perera, C; Chakrabarti, R; Islam, F M A; Crowston, J

    2015-01-01

    Purpose Smartphone-based Snellen visual acuity charts have become popular; however, their accuracy has not been established. This study aimed to evaluate the equivalence of a smartphone-based visual acuity chart with a standard 6-m Snellen visual acuity (6SVA) chart. Methods First, a review of available Snellen chart applications on iPhone was performed to determine the most accurate application based on optotype size. Subsequently, a prospective comparative study was performed by measuring conventional 6SVA and then iPhone visual acuity using the 'Snellen' application on an Apple iPhone 4. Results Eleven applications were identified, with accuracy of optotype size ranging from 4.4–39.9%. Eighty-eight patients from general medical and surgical wards in a tertiary hospital took part in the second part of the study. The mean difference in logMAR visual acuity between the two charts was 0.02 logMAR (95% limit of agreement −0.332, 0.372 logMAR). The largest mean difference in logMAR acuity was noted in the subgroup of patients with 6SVA worse than 6/18 (n=5), who had a mean difference of two Snellen visual acuity lines between the charts (0.276 logMAR). Conclusion We did not identify a Snellen visual acuity app at the time of the study that could predict a patient's standard Snellen visual acuity within one line. There was considerable variability in the optotype accuracy of apps. Further validation is required for assessment of acuity in patients with severe vision impairment. PMID:25931170
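    The agreement statistics quoted above (a mean difference with 95% limits of agreement) follow the standard Bland-Altman computation, sketched here with invented logMAR pairs rather than the study's data:

```python
import statistics

def bland_altman(chart_a, chart_b):
    """Mean difference and 95% limits of agreement (mean +/- 1.96 SD)."""
    diffs = [a - b for a, b in zip(chart_a, chart_b)]
    m = statistics.mean(diffs)
    s = statistics.stdev(diffs)
    return m, m - 1.96 * s, m + 1.96 * s

# hypothetical logMAR acuities: standard 6 m chart vs smartphone chart
standard   = [0.00, 0.18, 0.30, 0.48, 0.60, 0.18, 0.30]
smartphone = [0.10, 0.18, 0.18, 0.60, 0.48, 0.30, 0.30]
mean_diff, lo, hi = bland_altman(standard, smartphone)
```

    A small mean difference with wide limits of agreement, as in the study, indicates little systematic bias but substantial scatter between the two charts.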

  18. Automated Multi-Peak Tracking Kymography (AMTraK): A Tool to Quantify Sub-Cellular Dynamics with Sub-Pixel Accuracy

    PubMed Central

    Chaphalkar, Anushree R.; Jain, Kunalika; Gangan, Manasi S.

    2016-01-01

    Kymographs or space-time plots are widely used in cell biology to reduce the dimensions of a time-series in microscopy for both qualitative and quantitative insight into spatio-temporal dynamics. While multiple tools for image kymography have been described before, quantification remains largely manual. Here, we describe a novel software tool for automated multi-peak tracking kymography (AMTraK), which uses peak information and distance minimization to track and automatically quantify kymographs, integrated in a GUI. The program takes fluorescence time-series data as an input and tracks contours in the kymographs based on intensity and gradient peaks. By integrating a branch-point detection method, it can be used to identify merging and splitting events of tracks, important in separation and coalescence events. In tests with synthetic images, we demonstrate sub-pixel positional accuracy of the program. We test the program by quantifying sub-cellular dynamics in rod-shaped bacteria, microtubule (MT) transport and vesicle dynamics. A time-series of E. coli cell division with labeled nucleoid DNA is used to identify the time-point and rate at which the nucleoid segregates. The mean velocity of microtubule (MT) gliding motility due to a recombinant kinesin motor is estimated as 0.5 μm/s, in agreement with published values, and comparable to estimates using software for nanometer precision filament-tracking. We proceed to employ AMTraK to analyze previously published time-series microscopy data where kymographs had been manually quantified: clathrin polymerization kinetics during vesicle formation and anterograde and retrograde transport in axons. AMTraK analysis not only reproduces the reported parameters, it also provides an objective and automated method for reproducible analysis of kymographs from in vitro and in vivo fluorescence microscopy time-series of sub-cellular dynamics. PMID:27992448
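    The distance-minimization step AMTraK uses to link intensity peaks between successive kymograph time-rows can be sketched as a nearest-neighbour association; this is a simplification of the published method, and the jump threshold is an assumed parameter:

```python
def link_peaks(prev_peaks, curr_peaks, max_jump):
    """Pair peaks in the previous time-row with the nearest unclaimed peak
    in the current row, taking candidate pairs in order of distance and
    rejecting jumps larger than max_jump (pixels)."""
    candidates = sorted(
        (abs(p - c), i, j)
        for i, p in enumerate(prev_peaks)
        for j, c in enumerate(curr_peaks)
    )
    pairs, linked_prev, linked_curr = [], set(), set()
    for dist, i, j in candidates:
        if dist > max_jump:
            break                      # remaining candidates are farther still
        if i in linked_prev or j in linked_curr:
            continue
        linked_prev.add(i)
        linked_curr.add(j)
        pairs.append((i, j))
    return pairs

# two time-rows of detected peak positions (pixels), invented for illustration
row_t0 = [10.0, 55.0, 90.0]
row_t1 = [12.0, 54.0, 120.0]
links = link_peaks(row_t0, row_t1, max_jump=5.0)
```

    Repeating this linkage row by row yields the track contours; the peak at 90 px finds no partner within the threshold, which is how track endings (and, with extra logic, merges and splits) are detected.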

  19. Quality standards are needed for reporting of test accuracy studies for animal diseases.

    PubMed

    Gardner, Ian A

    2010-12-01

    The STARD statement (www.stard-statement.org) emphasizes complete and transparent reporting of key elements of test accuracy studies. Guidelines for authors in many biomedical journals recommend adherence to these standards but explicit recommendations by editors of veterinary journals are limited. Adherence to standards benefits end-users of tests including doctors, veterinarians and other healthcare professionals and the human and animal patients in which the tests are used. Reporting standards also provide a structured basis for researchers and graduate students to prepare manuscripts, and subsequently can be a useful adjunct to the peer-review process. This paper discusses the purpose of STARD and its possible modification for animal disease studies, variation in reporting and design quality in human and animal disease studies, use of a different instrument (QUADAS) for assessing methodological quality, and provides some recommendations for the future. Finally, the contributions of Dr. Hollis Erb to improvements in methodological and reporting qualities of test accuracy studies in Preventive Veterinary Medicine are described.

  20. Comparative study of public-domain supervised machine-learning accuracy on the UCI database

    NASA Astrophysics Data System (ADS)

    Eklund, Peter W.

    1999-02-01

    This paper surveys public domain supervised learning algorithms and performs accuracy (error rate) analysis of their classification performance on unseen instances for twenty-nine of the University of California at Irvine machine learning datasets. The learning algorithms represent three types of classifiers: decision trees, neural networks and rule-based classifiers. The study performs data analysis and examines the effect of irrelevant attributes to explain the performance characteristics of the learning algorithms. The survey concludes with some general recommendations about the selection of public domain machine-learning algorithms relative to the properties of the data examined.
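A small sketch of the kind of error-rate comparison such a survey performs: train two classifiers on a dataset, then measure the fraction of held-out instances each misclassifies. The toy dataset and the two classifiers (1-nearest-neighbour and a majority-label baseline) are stand-ins for illustration, not the UCI data or the surveyed public-domain tools.

```python
# Illustrative error-rate comparison on held-out instances. The data and
# classifiers are invented stand-ins for the surveyed algorithms.

def one_nn_predict(train, query):
    """1-nearest-neighbour: label of the closest training point."""
    return min(train, key=lambda p: sum((a - b) ** 2
               for a, b in zip(p[0], query)))[1]

def majority_predict(train, query):
    """Baseline: always predict the most frequent training label."""
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

def error_rate(predict, train, test):
    """Fraction of test instances the classifier gets wrong."""
    wrong = sum(1 for x, y in test if predict(train, x) != y)
    return wrong / len(test)

# Toy 2-class data: class 0 near the origin, class 1 near (10, 10).
train = [((0, 0), 0), ((1, 0), 0), ((0, 1), 0), ((10, 10), 1), ((9, 10), 1)]
test = [((0, 1), 0), ((10, 9), 1), ((1, 1), 0)]

err_1nn = error_rate(one_nn_predict, train, test)
err_baseline = error_rate(majority_predict, train, test)
```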

  1. Tools for Teaching Climate Change Studies

    SciTech Connect

    Maestas, A.M.; Jones, L.A.

    2005-03-18

The Atmospheric Radiation Measurement Climate Research Facility (ACRF) develops public outreach materials and educational resources for schools. Studies show that science education in rural and indigenous communities improves when educators integrate regional knowledge of climate and environmental issues into school curriculum and public outreach materials. In order to promote understanding of ACRF climate change studies, ACRF Education and Outreach has developed interactive kiosks about climate change for host communities close to the research sites. A kiosk for the North Slope of Alaska (NSA) community was installed at the Iñupiat Heritage Center in 2003, and a kiosk for the Tropical Western Pacific locales will be installed in 2005. The kiosks feature interviews with local community elders, regional agency officials, and Atmospheric Radiation Measurement (ARM) Program scientists, which highlight both research and local observations of some aspects of environmental and climatic change in the Arctic and Pacific. The kiosks offer viewers a unique opportunity to learn about the environmental concerns and knowledge of respected community elders, and also to understand state-of-the-art climate research. An archive of interviews from the communities will also be distributed with supplemental lessons and activities to encourage teachers and students to compare and contrast climate change studies and oral history observations from two distinct locations. The U.S. Department of Energy's ACRF supports education and outreach efforts for communities and schools located near its sites. ACRF Education and Outreach has developed interactive kiosks at the request of the communities to provide an opportunity for the public to learn about climate change from both scientific and indigenous perspectives. Kiosks include interviews with ARM scientists and provide users with basic information about climate change studies as well as interviews with elders and community leaders

  2. Effect of dynamic random leaks on the monitoring accuracy of home mechanical ventilators: a bench study

    PubMed Central

    2013-01-01

Background So far, the accuracy of tidal volume (VT) and leak measures provided by the built-in software of commercial home ventilators has only been tested using bench linear models with fixed calibrated and continuous leaks. The objective was to assess the reliability of the estimation of tidal volume (VT) and unintentional leaks in a single tubing bench model which introduces random dynamic leaks during inspiratory or expiratory phases. Methods The built-in software of four commercial home ventilators and a fifth ventilator-independent ad hoc designed external software tool were tested with two levels of leaks and two different models with excess leaks (inspiration or expiration). The external software analyzed the inspiratory and expiratory unintentional leaks separately. Results In basal condition, all ventilators but one underestimated tidal volume with values ranging from -1.5 ± 3.3% to -8.7 ± 3.27%. In the model with excess inspiratory leaks, VT was overestimated by all four commercial software tools, with values ranging from 18.27 ± 7.05% to 35.92 ± 17.7%, whereas the ventilator-independent software gave a smaller difference (3.03 ± 2.6%). Leaks were underestimated by two applications with values of -11.47 ± 6.32 and -5.9 ± 0.52 L/min. With expiratory leaks, VT was overestimated by the software of one ventilator and the ventilator-independent software and significantly underestimated by the other three, with deviations ranging from +10.94 ± 7.1% to -48 ± 23.08%. The four commercial tools tested overestimated unintentional leaks, with values between 2.19 ± 0.85 and 3.08 ± 0.43 L/min. Conclusions In a bench model, the presence of unintentional random leaks may be a source of error in the measurement of VT and leaks provided by the software of home ventilators. Analyzing leaks during inspiration and expiration separately may reduce this source of error. PMID:24325396
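The error metric underlying such bench comparisons is simply the signed percent deviation of the ventilator-reported tidal volume from the reference volume delivered by the bench model. A minimal sketch, with hypothetical values rather than the study's measurements:

```python
# Signed percent deviation of a reported tidal volume (VT) from the
# reference VT delivered by a bench model. Positive = overestimation.
# The example values are hypothetical.

def percent_error(reported_ml, reference_ml):
    """Return 100 * (reported - reference) / reference."""
    return 100.0 * (reported_ml - reference_ml) / reference_ml

# e.g. a ventilator reporting 540 mL when the bench delivered 500 mL
err = percent_error(540.0, 500.0)   # +8% overestimation
```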

  3. [Comparative study on hyperspectral inversion accuracy of soil salt content and electrical conductivity].

    PubMed

    Peng, Jie; Wang, Jia-Qiang; Xiang, Hong-Ying; Teng, Hong-Fen; Liu, Wei-Yang; Chi, Chun-Ming; Niu, Jian-Long; Guo, Yan; Shi, Zhou

    2014-02-01

The objective of the present article is to ascertain the mechanism of hyperspectral remote sensing monitoring of soil salinization, which is of great importance for improving the accuracy of hyperspectral remote sensing monitoring. Paddy soils in Wensu, Hetian and Baicheng counties of southern Xinjiang were selected. Hyperspectral data of soils were obtained. Soil salt content (S(t)) and electrical conductivity of 1:5 soil-to-water extracts (EC(1:5)) were determined. Relationships between S(t) and EC(1:5) were studied. Correlations between hyperspectral indices and S(t) and EC(1:5) were analyzed. The inversion accuracy of S(t) using the hyperspectral technique was compared with that of EC(1:5). Results showed that significant (p<0.01) relationships were found between S(t) and EC(1:5) for soils in Wensu and Hetian counties, with correlation coefficients of 0.86 and 0.45, respectively; there was no significant relationship between S(t) and EC(1:5) for soils in Baicheng county. Therefore, the correlations between S(t) and EC(1:5) varied with the studied sites. S(t) and EC(1:5) were significantly related to spectral reflectance, first derivative reflectance and continuum-removed reflectance, respectively; but correlation coefficients between S(t) and spectral indices were higher than those between EC(1:5) and spectral indices, which was obvious in some bands sensitive to soil salinization such as 660, 35, 1229, 1414, 1721, 1738, 1772, 2309 nm, and so on. Prediction equations of S(t) and EC(1:5) were established using multivariate linear regression, principal component regression and partial least-squares regression methods, respectively. Coefficients of determination, determination coefficients of prediction, and relative analytical errors of these equations were analyzed. Coefficients of determination and relative analytical errors of equations between S(t) and spectral indices were higher than those of equations between EC(1:5) and spectral indices. Therefore, the

  4. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed consolidating the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  5. Accuracy and repeatability of an inertial measurement unit system for field-based occupational studies.

    PubMed

    Schall, Mark C; Fethke, Nathan B; Chen, Howard; Oyama, Sakiko; Douphrate, David I

    2016-04-01

The accuracy and repeatability of an inertial measurement unit (IMU) system for directly measuring trunk angular displacement and upper arm elevation were evaluated over eight hours (i) in comparison to a gold-standard optical motion capture (OMC) system in a laboratory setting, and (ii) during a field-based assessment of dairy parlour work. Sample-to-sample root mean square differences between the IMU and OMC systems ranged from 4.1° to 6.6° for the trunk and from 7.2° to 12.1° for the upper arm, depending on the processing method. Estimates of mean angular displacement and angular displacement variation (the difference between the 90th and 10th percentiles of angular displacement) were observed to change <4.5° on average in the laboratory and <1.5° on average in the field per eight hours of data collection. Results suggest the IMU system may serve as an acceptable instrument for directly measuring trunk and upper arm postures in field-based occupational exposure assessment studies with long sampling durations. Practitioner Summary: Few studies have evaluated inertial measurement unit (IMU) systems in the field or over long sampling durations. Results of this study indicate that the IMU system evaluated has reasonably good accuracy and repeatability for use in a field setting over a long sampling duration.
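The sample-to-sample comparison reported here reduces to a root-mean-square difference between two synchronized angle series. A minimal sketch, with two short made-up series standing in for the IMU and OMC recordings:

```python
# Root-mean-square difference between two equal-length angle series,
# as used to compare an IMU against an optical (OMC) reference.
# The series below are invented for illustration.
import math

def rms_difference(a, b):
    """RMS of the pairwise differences between two equal-length series."""
    assert len(a) == len(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

imu = [10.0, 12.0, 15.0, 11.0]   # degrees, hypothetical IMU trunk angles
omc = [9.0, 13.0, 14.0, 10.0]    # degrees, hypothetical OMC reference

rmsd = rms_difference(imu, omc)
```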

  6. A PILOT STUDY OF THE ACCURACY OF CO2 SENSORS IN COMMERCIAL BUILDINGS

    SciTech Connect

    Fisk, William; Fisk, William J.; Faulkner, David; Sullivan, Douglas P.

    2007-09-01

Carbon dioxide (CO2) sensors are often deployed in commercial buildings to obtain CO2 data that are used to automatically modulate rates of outdoor air supply. The goal is to keep ventilation rates at or above design requirements and to save energy by avoiding ventilation rates exceeding design requirements. However, there have been many anecdotal reports of poor CO2 sensor performance in actual commercial building applications. This study evaluated the accuracy of 44 CO2 sensors located in nine commercial buildings to determine if CO2 sensor performance, in practice, is generally acceptable or problematic. CO2 measurement errors varied widely and were sometimes hundreds of parts per million. Despite its small size, this study provides a strong indication that the accuracy of CO2 sensors, as they are applied and maintained in commercial buildings, is frequently less than needed to measure typical values of maximum one-hour-average indoor-outdoor CO2 concentration differences with less than a 20 percent error. Thus, we conclude that there is a need for more accurate CO2 sensors and/or better sensor maintenance or calibration procedures.

  7. Do fixation cues ensure fixation accuracy in split-fovea studies of word recognition?

    PubMed

    Jordan, Timothy R; Paterson, Kevin B; Kurtev, Stoyan; Xu, Mengyun

    2009-07-01

    Many studies have claimed that hemispheric processing is split precisely at the foveal midline and so place great emphasis on the precise location at which words are fixated. These claims are based on experiments in which a variety of fixation procedures were used to ensure fixation accuracy but the effectiveness of these procedures is unclear. We investigated this issue using procedures matched to the original studies and an eye-tracker to monitor the locations actually fixated. Four common types of fixation cues were used: cross, two vertical gapped lines, two vertical gapped lines plus a secondary task in which a digit was presented at the designated fixation point, and a dot. Accurate fixations occurred on <35% of trials for all fixation conditions. Moreover, despite the usefulness often attributed to a secondary task, no increase in fixation accuracy was produced in this condition. The indications are that split-fovea theory should not assume that fixation of specified locations occurs in experiments without appropriate eye-tracking control or, indeed, that consistent fixation of specified locations is plausible under normal conditions of word recognition.

  8. An accuracy study of computer-planned implant placement in the augmented maxilla using osteosynthesis screws.

    PubMed

    Verhamme, L M; Meijer, G J; Soehardi, A; Bergé, S J; Xi, T; Maal, T J J

    2017-04-01

Previous research on the accuracy of flapless placement of virtually planned implants in the augmented maxilla revealed unfavourable discrepancies between implant planning and placement. By using the osteosynthesis screws placed during the augmentation procedure, the surgical template could be optimally stabilized. The purpose of this study was to validate this method by evaluating its clinically relevant accuracy. Twelve consecutive fully edentulous patients with extreme resorption of the maxilla were treated with a bone augmentation procedure. Virtual implant planning was performed and a surgical template was manufactured. Subsequently, six implants were installed using the surgical template, which was supported only by the osteosynthesis screws. Implant deviations between planning and placement were calculated. A total of 72 implants were installed. Mean deviations in the mesiodistal direction were 0.817 mm at the implant tip and 0.528 mm at the implant shoulder; the angular deviation was 2.924°. In the buccolingual direction, a deviation of 1.038 mm was registered at the implant tip and 0.633 mm at the implant shoulder; the angular deviation was 3.440°. This study showed that implant placement in the augmented maxilla using a surgical template supported by osteosynthesis screws is accurate.
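Planned-versus-placed deviations of this kind are typically computed as the Euclidean distance between corresponding points (tip, shoulder) and the angle between the two implant axes. A sketch under invented coordinates, not the study's actual software or data:

```python
# Deviation between a planned and a placed implant: point distances at
# tip and shoulder, plus the angle between the two implant axes.
# All coordinates (in mm) are hypothetical.
import math

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_deg(u, v):
    """Angle between two direction vectors, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

planned_tip, planned_shoulder = (0.0, 0.0, 0.0), (0.0, 0.0, 10.0)
placed_tip, placed_shoulder = (0.8, 0.0, 0.0), (0.5, 0.0, 10.0)

tip_dev = distance(planned_tip, placed_tip)              # mm
shoulder_dev = distance(planned_shoulder, placed_shoulder)
planned_axis = tuple(s - t for s, t in zip(planned_shoulder, planned_tip))
placed_axis = tuple(s - t for s, t in zip(placed_shoulder, placed_tip))
axis_dev = angle_deg(planned_axis, placed_axis)          # degrees
```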

  9. Assessing the accuracy of tympanometric evaluation of external auditory canal volume: a scientific study using an ear canal model.

    PubMed

    Al-Hussaini, A; Owens, D; Tomkinson, A

    2011-12-01

Tympanometric evaluation is routinely used as part of the complete otological examination. During tympanometric examination, middle ear pressure and ear canal volume are evaluated. Little has been reported on the accuracy and precision with which tympanometry evaluates external ear canal volume. This study examines the capability of the tympanometer to accurately evaluate external auditory canal volume in both simple and partially obstructed ear canal models, and assesses its suitability for studies examining the effectiveness of cerumolytics. An ear canal model was designed using simple laboratory equipment, including a 5 ml calibrated clinical syringe (Becton Dickinson, Spain). The ear canal model was attached to the sensing probe of a Kamplex tympanometer (Interacoustics, Denmark). Three basic trials were undertaken: evaluation of the tympanometer in simple canal volume measurement, in assessing canal volume with partial canal occlusion at different positions within the model, and in assessing canal volume with varying degrees of canal occlusion. In total, 1,290 individual test scenarios were completed over the three arms of the study. At volumes of 1.4 cm³ or below, a perfect relationship was noted between the actual and tympanometric volumes in the simple model (Spearman's ρ = 1), with weakening agreement as canal volume increased. Bland-Altman plotting confirmed the accuracy of this agreement. In the wax-substitute models, tympanometry was observed to have a close relationship (Spearman's ρ > 0.99) with the actual volume present, with worsening error above a volume of 1.4 cm³. Bland-Altman plotting and precision calculations provided evidence of accuracy. Size and position of the wax substitute had no statistical effect on results [Wilcoxon rank-sum test (WRST) p > 0.99], nor did degree of partial obstruction (WRST p > 0.99). The Kamplex tympanometer
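The Bland-Altman analysis mentioned here reduces to two quantities: the mean difference between the two measurement methods (the bias) and the 95% limits of agreement (bias ± 1.96 standard deviations of the differences). A sketch with hypothetical volumes, not the study's data:

```python
# Bland-Altman agreement quantities for two paired measurement series:
# bias (mean difference) and 95% limits of agreement. Values invented.
import math

def bland_altman(measured, reference):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [m - r for m, r in zip(measured, reference)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

tymp = [0.5, 0.9, 1.2, 1.4, 1.6]   # cm^3, hypothetical tympanometric volumes
true = [0.5, 1.0, 1.2, 1.4, 1.7]   # cm^3, hypothetical actual volumes
bias, lower, upper = bland_altman(tymp, true)
```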

  10. Influence of Pedometer Position on Pedometer Accuracy at Various Walking Speeds: A Comparative Study

    PubMed Central

    Lovis, Christian

    2016-01-01

Background Demographic growth in conjunction with the rise of chronic diseases is increasing the pressure on health care systems in most OECD countries. Physical activity is known to be an essential factor in improving or maintaining good health. Walking is especially recommended, as it is an activity that most people can easily perform without constraints. Pedometers have been extensively used as an incentive to motivate people to become more active. However, a recognized problem with these devices is their diminishing accuracy at decreased walking speeds. The arrival on the consumer market of new devices, worn indifferently at the waist, at the wrist, or as a necklace, gives rise to new questions regarding their accuracy at these different positions. Objective Our objective was to assess the performance of 4 pedometers (iHealth activity monitor, Withings Pulse O2, Misfit Shine, and Garmin vívofit) and compare their accuracy according to the position worn and at various walking speeds. Methods We conducted this study in a controlled environment with 21 healthy adults required to walk 100 m at 3 different paces (0.4 m/s, 0.6 m/s, and 0.8 m/s) regulated by means of a string attached between their legs at the level of their ankles and a metronome ticking the cadence. To obtain baseline values, we asked the participants to walk 200 m at their own pace. Results Accuracy decreased with walking speed for all pedometers (12% mean error at the self-selected pace, 27% mean error at 0.8 m/s, 52% mean error at 0.6 m/s, and 76% mean error at 0.4 m/s). Although the position of the pedometer on the person did not significantly influence its accuracy, some interesting tendencies can be highlighted in 2 settings: (1) positioning the pedometer at the waist at a speed greater than 0.8 m/s or as a necklace at the preferred speed tended to produce lower mean errors than at the wrist position; and (2) at a slow speed (0.4 m/s), pedometers

  11. Review of The SIAM 100-Digit Challenge: A Study in High-Accuracy Numerical Computing

    SciTech Connect

    Bailey, David

    2005-01-25

    In the January 2002 edition of SIAM News, Nick Trefethen announced the '$100, 100-Digit Challenge'. In this note he presented ten easy-to-state but hard-to-solve problems of numerical analysis, and challenged readers to find each answer to ten-digit accuracy. Trefethen closed with the enticing comment: 'Hint: They're hard! If anyone gets 50 digits in total, I will be impressed.' This challenge obviously struck a chord in hundreds of numerical mathematicians worldwide, as 94 teams from 25 nations later submitted entries. Many of these submissions exceeded the target of 50 correct digits; in fact, 20 teams achieved a perfect score of 100 correct digits. Trefethen had offered $100 for the best submission. Given the overwhelming response, a generous donor (William Browning, founder of Applied Mathematics, Inc.) provided additional funds to provide a $100 award to each of the 20 winning teams. Soon after the results were out, four participants, each from a winning team, got together and agreed to write a book about the problems and their solutions. The team is truly international: Bornemann is from Germany, Laurie is from South Africa, Wagon is from the USA, and Waldvogel is from Switzerland. This book provides some mathematical background for each problem, and then shows in detail how each of them can be solved. In fact, multiple solution techniques are mentioned in each case. The book describes how to extend these solutions to much larger problems and much higher numeric precision (hundreds or thousands of digit accuracy). The authors also show how to compute error bounds for the results, so that one can say with confidence that one's results are accurate to the level stated. Numerous numerical software tools are demonstrated in the process, including the commercial products Mathematica, Maple and Matlab. Computer programs that perform many of the algorithms mentioned in the book are provided, both in an appendix to the book and on a website. In the process, the

  12. Longitudinal Study: Efficacy of Online Technology Tools for Instructional Use

    NASA Technical Reports Server (NTRS)

    Uenking, Michael D.

    2011-01-01

Studies show that the student population (secondary and post-secondary) is becoming increasingly more technologically savvy. Use of the internet, computers, MP3 players, and other technologies along with online gaming has increased tremendously amongst this population, creating an apparent paradigm shift in the learning modalities of these students. Instructors and facilitators of learning can no longer rely solely on traditional lecture-based lesson formats. In order to achieve student academic success and satisfaction and to increase student retention, instructors must embrace the various technology tools that are available and employ them in their lessons. A longitudinal study (January 2009-June 2010) has been performed that encompasses the use of several technology tools in an instructional setting. The study provides further evidence that students not only like the tools that are being used, but prefer that these tools be used to help supplement and enhance instruction.

  13. A pilot study of the accuracy of onsite immunoassay urinalysis of illicit drug use in seriously mentally ill outpatients

    PubMed Central

    McDonell, Michael G.; Angelo, Frank; Sugar, Andrea; Rainey, Christina; Srebnik, Debra; Roll, John; Short, Robert; Ries, Richard K.

    2014-01-01

    Objectives This pilot study investigated the accuracy of onsite immunoassay urinalysis of illicit drug use in 42 outpatients with co-occurring substance use disorders and serious mental illness. Methods Up to 40 urine samples were submitted by each participant as part of a larger study investigating the efficacy of contingency management in persons with co-occurring disorders. Each sample was analyzed for the presence of amphetamine, methamphetamine, cocaine, marijuana, and opiates or their metabolites using onsite qualitative immunoassays. One onsite urinalysis was randomly selected from each participant for confirmatory gas chromatography–mass spectrometry (GC–MS) analyses. Results Agreement between immunoassay and GC–MS was calculated. Agreement was high, with 98% agreement for amphetamine, methamphetamine, opiate, and marijuana. Agreement for cocaine was 93%. Conclusions Results of this pilot study support the use of onsite immunoassay screening cups as an assessment and outcome measure in adults with serious mental illness. Scientific Significance Data suggest that onsite urinalysis screenings may be a helpful assessment tool for measuring clinical and research outcomes. PMID:21219262
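The agreement statistic reported here is the percentage of paired samples on which the onsite immunoassay and the confirmatory GC-MS give the same positive/negative result. A minimal sketch with invented result lists:

```python
# Percent agreement between onsite immunoassay results and confirmatory
# GC-MS results for the same samples. The result lists are invented.

def percent_agreement(onsite, confirmatory):
    """Percentage of paired results that match."""
    matches = sum(1 for a, b in zip(onsite, confirmatory) if a == b)
    return 100.0 * matches / len(onsite)

# True/False = drug detected / not detected, per sample (hypothetical)
immunoassay = [True, False, True, True, False, False, True, False, True, True]
gcms        = [True, False, True, True, False, True,  True, False, True, True]

agreement = percent_agreement(immunoassay, gcms)   # one mismatch in ten
```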

  14. Case studies on forecasting for innovative technologies: frequent revisions improve accuracy.

    PubMed

    Lerner, Jeffrey C; Robertson, Diane C; Goldstein, Sara M

    2015-02-01

    Health technology forecasting is designed to provide reliable predictions about costs, utilization, diffusion, and other market realities before the technologies enter routine clinical use. In this article we address three questions central to forecasting's usefulness: Are early forecasts sufficiently accurate to help providers acquire the most promising technology and payers to set effective coverage policies? What variables contribute to inaccurate forecasts? How can forecasters manage the variables to improve accuracy? We analyzed forecasts published between 2007 and 2010 by the ECRI Institute on four technologies: single-room proton beam radiation therapy for various cancers; digital breast tomosynthesis imaging technology for breast cancer screening; transcatheter aortic valve replacement for serious heart valve disease; and minimally invasive robot-assisted surgery for various cancers. We then examined revised ECRI forecasts published in 2013 (digital breast tomosynthesis) and 2014 (the other three topics) to identify inaccuracies in the earlier forecasts and explore why they occurred. We found that five of twenty early predictions were inaccurate when compared with the updated forecasts. The inaccuracies pertained to two technologies that had more time-sensitive variables to consider. The case studies suggest that frequent revision of forecasts could improve accuracy, especially for complex technologies whose eventual use is governed by multiple interactive factors.

  15. The effect of morphology on spelling and reading accuracy: a study on Italian children.

    PubMed

    Angelelli, Paola; Marinelli, Chiara Valeria; Burani, Cristina

    2014-01-01

    In opaque orthographies knowledge of morphological information helps in achieving reading and spelling accuracy. In transparent orthographies with regular print-to-sound correspondences, such as Italian, the mappings of orthography onto phonology and phonology onto orthography are in principle sufficient to read and spell most words. The present study aimed to investigate the role of morphology in the reading and spelling accuracy of Italian children as a function of school experience to determine whether morphological facilitation was present in children learning a transparent orthography. The reading and spelling performances of 15 third-grade and 15 fifth-grade typically developing children were analyzed. Children read aloud and spelled both low-frequency words and pseudowords. Low-frequency words were manipulated for the presence of morphological structure (morphemic words vs. non-derived words). Morphemic words could also vary for the frequency (high vs. low) of roots and suffixes. Pseudo-words were made up of either a real root and a real derivational suffix in a combination that does not exist in the Italian language or had no morphological constituents. Results showed that, in Italian, morphological information is a useful resource for both reading and spelling. Typically developing children benefitted from the presence of morphological structure when they read and spelled pseudowords; however, in processing low-frequency words, morphology facilitated reading but not spelling. These findings are discussed in terms of morpho-lexical access and successful cooperation between lexical and sublexical processes in reading and spelling.

  16. The effect of morphology on spelling and reading accuracy: a study on Italian children

    PubMed Central

    Angelelli, Paola; Marinelli, Chiara Valeria; Burani, Cristina

    2014-01-01

    In opaque orthographies knowledge of morphological information helps in achieving reading and spelling accuracy. In transparent orthographies with regular print-to-sound correspondences, such as Italian, the mappings of orthography onto phonology and phonology onto orthography are in principle sufficient to read and spell most words. The present study aimed to investigate the role of morphology in the reading and spelling accuracy of Italian children as a function of school experience to determine whether morphological facilitation was present in children learning a transparent orthography. The reading and spelling performances of 15 third-grade and 15 fifth-grade typically developing children were analyzed. Children read aloud and spelled both low-frequency words and pseudowords. Low-frequency words were manipulated for the presence of morphological structure (morphemic words vs. non-derived words). Morphemic words could also vary for the frequency (high vs. low) of roots and suffixes. Pseudo-words were made up of either a real root and a real derivational suffix in a combination that does not exist in the Italian language or had no morphological constituents. Results showed that, in Italian, morphological information is a useful resource for both reading and spelling. Typically developing children benefitted from the presence of morphological structure when they read and spelled pseudowords; however, in processing low-frequency words, morphology facilitated reading but not spelling. These findings are discussed in terms of morpho-lexical access and successful cooperation between lexical and sublexical processes in reading and spelling. PMID:25477855

  17. Understanding of accuracy on calculated soil moisture field for the study of land-atmosphere interaction

    NASA Astrophysics Data System (ADS)

    Yorozu, K.; Tanaka, K.; Nakakita, E.; Ikebuchi, S.

    2007-12-01

Understanding the state of soil moisture is effective for enhancing climate predictability on inter-seasonal or annual time scales. Thus, the Global Soil Wetness Project (GSWP) has been implemented as an environmental modeling research activity. The SiBUC (Simple Biosphere including Urban Canopy) land surface model is one of the participants in the 2nd GSWP, and it uses a mosaic approach to incorporate all kinds of land use. In order to estimate the global soil moisture field as accurately as possible and to utilize the products of the GSWP2 simulation more efficiently, SiBUC is run with its irrigation scheme activated. Integration of the one-way uncoupled SiBUC model from 1986 to 1995 produced the global soil moisture field. Both the model and the forcing data may contain uncertainty. However, the SiBUC model is one of the few models that can consider irrigation effects, and the meteorological forcing data provided by GSWP2 have the advantage of hybridization among reanalysis products, observation data and satellite data. In this sense, the GSWP2 product is assumed to be the most accurate global land surface hydrological data set available. Thus, these global products should be applied to land-atmosphere interaction studies, if possible. To do this, it is important to understand the accuracy of the calculated soil moisture field on inter-annual and shorter time scales. In this study, the calculated soil moisture field is validated against soil moisture observations in five regions (Illinois, USA; China; India; Mongolia; Russia). The Russian data comprise two sets: one located in spring wheat and another located in winter wheat. These observation data are provided by the Global Soil Moisture Data Bank (GSMDB). To understand the time-scale accuracy of the soil moisture field, three correlation coefficients are calculated between calculated and observed soil moisture: inter-annual, inter-seasonal and monthly mean correlations, respectively. As a result, if the median value in
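The validation described here rests on correlation coefficients between calculated and observed soil moisture series, computed at different time scales. A sketch of a Pearson correlation in pure Python, applicable to annual, seasonal, or monthly mean series alike; the series are invented:

```python
# Pearson correlation between a calculated and an observed soil moisture
# series (e.g. monthly means). The example series are invented.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

calculated = [0.20, 0.25, 0.30, 0.28, 0.22]   # hypothetical model output
observed = [0.18, 0.26, 0.31, 0.27, 0.21]     # hypothetical station data
r = pearson_r(calculated, observed)
```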

  18. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background: Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant with their 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and the impact on the systematic review findings. Methods: Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and negative likelihood ratios were recalculated. Results: Of the 66 authors, 68% were successfully contacted and 42% provided additional data, covering 29 of the 77 studies (38%). All authors who provided data did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and to provide data than authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten of the 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful, or vice versa, for the remaining two blood tests, and enabled the calculation of an estimate for a third blood test for which the data had previously been insufficient. We did not identify a clear pattern in the directional impact of additional data on estimates of diagnostic accuracy. Conclusions: We successfully contacted and received results from 42% of authors, who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients
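
Recalculating accuracy measures from a 2 × 2 table, as described in the Methods, is mechanical; a minimal sketch with hypothetical counts (not the review's data):

```python
def accuracy_from_2x2(tp, fp, fn, tn):
    # Diagnostic accuracy measures recomputed from a 2x2 table
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts: 40 true positives, 10 false positives,
# 20 false negatives, 130 true negatives
sens, spec, lr_pos, lr_neg = accuracy_from_2x2(tp=40, fp=10, fn=20, tn=130)
```

Recomputing these from the raw table is exactly what exposes the discordant results noted in the Background.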

  19. STARD-BLCM: Standards for the Reporting of Diagnostic accuracy studies that use Bayesian Latent Class Models.

    PubMed

    Kostoulas, Polychronis; Nielsen, Søren S; Branscum, Adam J; Johnson, Wesley O; Dendukuri, Nandini; Dhand, Navneet K; Toft, Nils; Gardner, Ian A

    2017-03-01

    The Standards for the Reporting of Diagnostic Accuracy (STARD) statement, which was recently updated to the STARD2015 statement, was developed to encourage complete and transparent reporting of test accuracy studies. Although STARD principles apply broadly, the checklist is limited to studies designed to evaluate the accuracy of tests when the disease status is determined from a perfect reference procedure or an imperfect one with known measures of test accuracy. However, a reference standard does not always exist, especially in the case of infectious diseases with a long latent period. In such cases, a valid alternative to classical test evaluation involves the use of latent class models that do not require a priori knowledge of disease status. Latent class models have been successfully implemented in a Bayesian framework for over 20 years. The objective of this work was to identify the STARD items that require modification and develop a modified version of STARD for studies that use Bayesian latent class analysis to estimate diagnostic test accuracy in the absence of a reference standard. Examples and elaborations for each of the modified items are provided. The new guidelines, termed STARD-BLCM (Standards for Reporting of Diagnostic accuracy studies that use Bayesian Latent Class Models), will facilitate improved quality of reporting on the design, conduct and results of diagnostic accuracy studies that use Bayesian latent class models.

  20. The Accuracy of Emergency Physicians in Ultrasonographic Screening of Acute Appendicitis; a Cross Sectional Study

    PubMed Central

    Karimi, Ebrahim; Aminianfar, Mohammad; Zarafshani, Keivan; Safaie, Arash

    2017-01-01

    Introduction: Diagnostic values reported for ultrasonographic screening of acute appendicitis vary widely and depend on the operator's skill, the patient's gender, weight, etc. The present study aimed to evaluate the effect of operator skill on the diagnostic accuracy of ultrasonography in detection of appendicitis by comparing the results of ultrasonography performed by radiologists and emergency physicians. Methods: This prospective diagnostic accuracy study was carried out on patients suspected of acute appendicitis presenting to the EDs of 2 hospitals. After the initial clinical examinations, all the patients underwent ultrasonography for appendicitis, first by an emergency physician and then by a radiologist. The final diagnosis of appendicitis was based on either the pathology report or 48-hour follow-up. Screening performance characteristics of appendix ultrasonography by the emergency physician and the radiologist were compared using STATA 11.0 software. Results: 108 patients with a mean age of 23.91 ± 7.46 years were studied (61.1% male). Appendicitis was confirmed in 37 (34.26%) cases. Cohen's kappa coefficient between ultrasonography by the radiologist and by the emergency physician in the diagnosis of acute appendicitis was 0.51 (95% CI: 0.35 – 0.76). The area under the ROC curve of ultrasonography in appendicitis diagnosis was 0.78 (95% CI: 0.69 – 0.86) for the emergency physician and 0.88 (95% CI: 0.81 – 0.94) for the radiologist (p = 0.052). Sensitivity and specificity of ultrasonography in appendicitis diagnosis were 83.87% (95% CI: 67.32 – 93.23) and 91.5% (95% CI: 81.89 – 96.52) for the radiologist, and 72.97% (95% CI: 55.61 – 85.63) and 83.10% (95% CI: 71.94 – 90.59) for the emergency physician, respectively. Conclusion: The findings of the present study showed that the diagnostic accuracy of ultrasonography carried out by a radiologist (89%) is slightly better than that of an emergency physician (80%) in the diagnosis of appendicitis, but neither is excellent. PMID:28286829
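
The inter-rater agreement reported above is Cohen's kappa; a minimal sketch for two binary raters, with hypothetical ratings (not the study's data):

```python
def cohens_kappa(a, b):
    # Cohen's kappa for two binary raters (1 = appendicitis, 0 = not)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    # Chance agreement from each rater's marginal positive/negative rates
    pa = (sum(a) / n) * (sum(b) / n) + ((n - sum(a)) / n) * ((n - sum(b)) / n)
    return (po - pa) / (1 - pa)

# Hypothetical paired readings: radiologist (a) vs. emergency physician (b)
a = [1, 1, 0, 0, 1, 0, 0, 0]
b = [1, 0, 0, 0, 1, 0, 1, 0]
k = cohens_kappa(a, b)
```

Kappa corrects raw agreement for the agreement expected by chance, which is why it can be modest (0.51 in the study) even when the two raters agree on most cases.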

  1. Case Studies of Software Development Tools for Parallel Architectures

    DTIC Science & Technology

    1993-06-01

    RL-TR-93-114, Final Technical Report, AD-A269 193: Case Studies of Software Development Tools for Parallel Architectures. Tools covered include Pisces, BALSA II, TANGO, PARET, VMMP, Omega/PegaSys, PSG, POKER, ISSOS and Unity.

  2. Accuracy in Rietveld quantitative phase analysis: a comparative study of strictly monochromatic Mo and Cu radiations.

    PubMed

    León-Reina, L; García-Maté, M; Álvarez-Pinazo, G; Santacruz, I; Vallcorba, O; De la Torre, A G; Aranda, M A G

    2016-06-01

    This study reports 78 Rietveld quantitative phase analyses using Cu Kα1, Mo Kα1 and synchrotron radiations. Synchrotron powder diffraction has been used to validate the most challenging analyses. From the results for three series with increasing contents of an analyte (an inorganic crystalline phase, an organic crystalline phase and a glass), it is inferred that Rietveld analyses from high-energy Mo Kα1 radiation have slightly better accuracies than those obtained from Cu Kα1 radiation. This behaviour has been established from the results of the calibration graphics obtained through the spiking method and also from Kullback-Leibler distance statistic studies. This outcome is explained, in spite of the lower diffraction power for Mo radiation when compared to Cu radiation, as arising because of the larger volume tested with Mo and also because higher energy allows one to record patterns with fewer systematic errors. The limit of detection (LoD) and limit of quantification (LoQ) have also been established for the studied series. For similar recording times, the LoDs in Cu patterns, ∼0.2 wt%, are slightly lower than those derived from Mo patterns, ∼0.3 wt%. The LoQ for a well crystallized inorganic phase using laboratory powder diffraction was established to be close to 0.10 wt% in stable fits with good precision. However, the accuracy of these analyses was poor with relative errors near to 100%. Only contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%.
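
The calibration graphics obtained through the spiking method amount to fitting a straight line of measured versus spiked analyte content; a minimal ordinary-least-squares sketch with hypothetical wt% values (not the study's data):

```python
def linfit(x, y):
    # Ordinary least-squares slope and intercept for a calibration line
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(a * a for a in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Hypothetical spiked vs. Rietveld-measured analyte contents (wt%)
spiked = [1.0, 2.0, 5.0, 10.0, 20.0]
measured = [1.1, 2.1, 5.2, 10.1, 19.8]
slope, intercept = linfit(spiked, measured)
```

A slope near 1 and an intercept near 0 indicate an accurate calibration; systematic deviations of the slope from unity are the kind of bias the spiking graphs are designed to reveal.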

  3. Diagnostic accuracy of bedside tests for predicting difficult intubation in Indian population: An observational study

    PubMed Central

    Dhanger, Sangeeta; Gupta, Suman Lata; Vinayagam, Stalin; Bidkar, Prasanna Udupi; Elakkumanan, Lenin Babu; Badhe, Ashok Shankar

    2016-01-01

    Background: Unanticipated difficult intubation can be challenging to anesthesiologists, and various bedside tests have been tried to predict difficult intubation. Aims: The aim of this study was to determine the incidence of difficult intubation in the Indian population and also to determine the diagnostic accuracy of bedside tests in predicting difficult intubation. Settings and Design: In this study, 200 patients aged 18–60 years with American Society of Anesthesiologists physical status I and II, scheduled for surgery under general anesthesia requiring endotracheal intubation, were enrolled. Patients with upper airway pathology, neck mass, and cervical spine injury were excluded from the study. Materials and Methods: An attending anesthesiologist conducted a preoperative assessment and recorded parameters such as body mass index, modified Mallampati grading, inter-incisor distance, neck circumference, and thyromental distance (NC/TMD). After standard anesthetic induction, laryngoscopy was performed, and intubation difficulty was assessed using the intubation difficulty scale on the basis of seven variables. Statistical Analysis: The Chi-square test or Student's t-test was applied as appropriate. The binary multivariate logistic regression (forward-Wald) model was used to determine the independent risk factors. Results: Among the 200 patients, 26 patients had difficult intubation, an incidence of 13%. Among the different variables, the Mallampati score and NC/TMD were independently associated with difficult intubation. Receiver operating characteristic curve analysis showed a cut-off point of 3 or 4 for the Mallampati score and 5.62 for NC/TMD to predict difficult intubation. Conclusion: The diagnostic accuracies of the NC/TMD ratio and the Mallampati score were better than those of other bedside tests in predicting difficult intubation in the Indian population. PMID:26957691

  4. Accuracy in Rietveld quantitative phase analysis: a comparative study of strictly monochromatic Mo and Cu radiations

    PubMed Central

    León-Reina, L.; García-Maté, M.; Álvarez-Pinazo, G.; Santacruz, I.; Vallcorba, O.; De la Torre, A. G.; Aranda, M. A. G.

    2016-01-01

    This study reports 78 Rietveld quantitative phase analyses using Cu Kα1, Mo Kα1 and synchrotron radiations. Synchrotron powder diffraction has been used to validate the most challenging analyses. From the results for three series with increasing contents of an analyte (an inorganic crystalline phase, an organic crystalline phase and a glass), it is inferred that Rietveld analyses from high-energy Mo Kα1 radiation have slightly better accuracies than those obtained from Cu Kα1 radiation. This behaviour has been established from the results of the calibration graphics obtained through the spiking method and also from Kullback–Leibler distance statistic studies. This outcome is explained, in spite of the lower diffraction power for Mo radiation when compared to Cu radiation, as arising because of the larger volume tested with Mo and also because higher energy allows one to record patterns with fewer systematic errors. The limit of detection (LoD) and limit of quantification (LoQ) have also been established for the studied series. For similar recording times, the LoDs in Cu patterns, ∼0.2 wt%, are slightly lower than those derived from Mo patterns, ∼0.3 wt%. The LoQ for a well crystallized inorganic phase using laboratory powder diffraction was established to be close to 0.10 wt% in stable fits with good precision. However, the accuracy of these analyses was poor with relative errors near to 100%. Only contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%. PMID:27275132

  5. A material sensitivity study on the accuracy of deformable organ registration using linear biomechanical models.

    PubMed

    Chi, Y; Liang, J; Yan, D

    2006-02-01

    Model-based deformable organ registration techniques using the finite element method (FEM) have recently been investigated intensively and applied to image-guided adaptive radiotherapy (IGART). These techniques assume that human organs are linearly elastic materials and that their mechanical properties are predetermined. Unfortunately, the accurate measurement of the tissue material properties is challenging and the properties usually vary between patients. A common issue is therefore the achievable accuracy of the calculation due to the limited access to tissue elastic material constants. In this study, we performed a systematic investigation on this subject based on tissue biomechanics and computer simulations to establish the relationships between achievable registration accuracy and tissue mechanical and organ geometrical properties. Primarily we focused on image registration for three organs: rectal wall, bladder wall, and prostate. The tissue anisotropy due to orientation preference in tissue fiber alignment is captured by using an orthotropic or a transversely isotropic elastic model. First we developed biomechanical models for the rectal wall, bladder wall, and prostate using simplified geometries and investigated the effect of varying material parameters on the resulting organ deformation. Then computer models based on patient image data were constructed, and image registrations were performed. The sensitivity of registration errors was studied by perturbing the tissue material properties from their mean values while fixing the boundary conditions. The simulation results demonstrated that registration error for a subvolume increases as its distance from the boundary increases. Also, a variable associated with material stability was found to be a dominant factor in registration accuracy in the context of material uncertainty. For hollow thin organs such as rectal walls and bladder walls, the registration errors are limited. Given 30% in material uncertainty

  6. A material sensitivity study on the accuracy of deformable organ registration using linear biomechanical models

    SciTech Connect

    Chi, Y.; Liang, J.; Yan, D.

    2006-02-15

    Model-based deformable organ registration techniques using the finite element method (FEM) have recently been investigated intensively and applied to image-guided adaptive radiotherapy (IGART). These techniques assume that human organs are linearly elastic materials and that their mechanical properties are predetermined. Unfortunately, the accurate measurement of the tissue material properties is challenging and the properties usually vary between patients. A common issue is therefore the achievable accuracy of the calculation due to the limited access to tissue elastic material constants. In this study, we performed a systematic investigation on this subject based on tissue biomechanics and computer simulations to establish the relationships between achievable registration accuracy and tissue mechanical and organ geometrical properties. Primarily we focused on image registration for three organs: rectal wall, bladder wall, and prostate. The tissue anisotropy due to orientation preference in tissue fiber alignment is captured by using an orthotropic or a transversely isotropic elastic model. First we developed biomechanical models for the rectal wall, bladder wall, and prostate using simplified geometries and investigated the effect of varying material parameters on the resulting organ deformation. Then computer models based on patient image data were constructed, and image registrations were performed. The sensitivity of registration errors was studied by perturbing the tissue material properties from their mean values while fixing the boundary conditions. The simulation results demonstrated that registration error for a subvolume increases as its distance from the boundary increases. Also, a variable associated with material stability was found to be a dominant factor in registration accuracy in the context of material uncertainty. For hollow thin organs such as rectal walls and bladder walls, the registration errors are limited. Given 30% in material uncertainty

  7. AD8 Informant Questionnaire for Cognitive Impairment: Pragmatic Diagnostic Test Accuracy Study.

    PubMed

    Larner, A J

    2015-09-01

    The diagnostic accuracy of the AD8 informant questionnaire for cognitive impairment was assessed in patients referred to a dedicated memory clinic. This pragmatic prospective study of consecutive referrals attending with an informant who completed AD8 (n = 212) lasted 12 months. Diagnosis used standard clinical diagnostic criteria for dementia and mild cognitive impairment as the reference standard (prevalence of cognitive impairment = 0.62). The AD8 proved acceptable to informants, and was quick and easy to use. Using the cutoff of ≥2/8, AD8 was highly sensitive (0.97) for diagnosis of cognitive impairment but specificity was poor (0.17). Combining AD8 with either the Mini-Mental State Examination or the Six-Item Cognitive Impairment Test showed little additional diagnostic benefit. In conclusion, AD8 is very sensitive for cognitive impairment in a memory clinic but specificity may be inadequate.
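
The practical consequence of the AD8's high sensitivity (0.97) but poor specificity (0.17) can be illustrated by computing the post-test probability at the stated prevalence of 0.62; a minimal Bayes-rule sketch (not the study's code):

```python
def post_test_prob(sens, spec, prev):
    # Positive predictive value via Bayes' theorem
    true_pos = sens * prev
    false_pos = (1 - spec) * (1 - prev)
    return true_pos / (true_pos + false_pos)

# Figures quoted in the abstract above: AD8 at cutoff >= 2/8
ppv = post_test_prob(sens=0.97, spec=0.17, prev=0.62)
```

At these figures the positive predictive value (about 0.66) barely exceeds the 0.62 prevalence: a positive AD8 adds little information, which is exactly the cost of the low specificity.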

  8. In vivo Study of the Accuracy of Dual-arch Impressions

    PubMed Central

    de Lima, Luciana Martinelli Santayana; Borges, Gilberto Antonio; Junior, Luiz Henrique Burnett; Spohr, Ana Maria

    2014-01-01

    Background: This study evaluated in vivo the accuracy of metal (Smart®) and plastic (Triple Tray®) dual-arch trays used with vinyl polysiloxane (Flexitime®), in the putty/wash viscosity, as well as polyether (Impregum Soft®) in the regular viscosity. Materials and Methods: In one patient, an implant-level transfer was screwed onto an implant in the mandibular right first molar, serving as a pattern. Ten impressions were made with each tray and impression material. The impressions were poured with Type IV gypsum. The width and height of the pattern and casts were measured in a profile projector (Nikon). The results were submitted to Student’s t-test for one sample (α = 0.05). Results: For the width distance, the plastic dual-arch trays with vinyl polysiloxane (4.513 mm) and with polyether (4.531 mm) were statistically wider than the pattern (4.489 mm). The metal dual-arch tray with vinyl polysiloxane (4.504 mm) and with polyether (4.500 mm) did not differ statistically from the pattern. For the height distance, only the metal dual-arch tray with polyether (2.253 mm) differed statistically from the pattern (2.310 mm). Conclusion: The metal dual-arch tray with vinyl polysiloxane, in the putty/wash viscosities, reproduced casts with less distortion in comparison with the same technique with the plastic dual-arch tray. The plastic or metal dual-arch trays with polyether reproduced casts with greater distortion. How to cite the article: Santayana de Lima LM, Borges GA, Burnett LH Jr, Spohr AM. In vivo study of the accuracy of dual-arch impressions. J Int Oral Health 2014;6(3):50-5. PMID:25083032

  9. EM-navigated catheter placement for gynecologic brachytherapy: an accuracy study

    NASA Astrophysics Data System (ADS)

    Mehrtash, Alireza; Damato, Antonio; Pernelle, Guillaume; Barber, Lauren; Farhat, Nabgha; Viswanathan, Akila; Cormack, Robert; Kapur, Tina

    2014-03-01

    Gynecologic malignancies, including cervical, endometrial, ovarian, vaginal and vulvar cancers, cause significant mortality in women worldwide. The standard care for many primary and recurrent gynecologic cancers consists of chemoradiation followed by brachytherapy. In high dose rate (HDR) brachytherapy, intracavitary applicators and/or interstitial needles are placed directly inside the cancerous tissue so as to provide catheters to deliver high doses of radiation. Although technology for the navigation of catheters and needles is well developed for procedures such as prostate biopsy, brain biopsy, and cardiac ablation, it is notably lacking for gynecologic HDR brachytherapy. Using a benchtop study that closely mimics the clinical interstitial gynecologic brachytherapy procedure, we developed a method for evaluating the accuracy of image-guided catheter placement. Future bedside translation of this technology offers the potential benefit of maximizing tumor coverage during catheter placement while avoiding damage to the adjacent organs, for example bladder, rectum and bowel. In the study, two independent experiments were performed on a phantom model to evaluate the targeting accuracy of an electromagnetic (EM) tracking system. The procedure was carried out using a laptop computer (2.1GHz Intel Core i7 computer, 8GB RAM, Windows 7 64-bit), an EM Aurora tracking system with a 1.3mm diameter 6 DOF sensor, and 6F (2 mm) brachytherapy catheters inserted through a Syed-Neblett applicator. The 3D Slicer and PLUS open source software were used to develop the system. The mean targeting error was less than 2.9mm, which is comparable to the targeting errors in commercial clinical navigation systems.

  10. Predictive accuracy of risk scales following self-harm: multicentre, prospective cohort study.

    PubMed

    Quinlivan, Leah; Cooper, Jayne; Meehan, Declan; Longson, Damien; Potokar, John; Hulme, Tom; Marsden, Jennifer; Brand, Fiona; Lange, Kezia; Riseborough, Elena; Page, Lisa; Metcalfe, Chris; Davies, Linda; O'Connor, Rory; Hawton, Keith; Gunnell, David; Kapur, Nav

    2017-03-16

    Background: Scales are widely used in psychiatric assessments following self-harm. Robust evidence for their diagnostic use is lacking. Aims: To evaluate the performance of risk scales (Manchester Self-Harm Rule, ReACT Self-Harm Rule, SAD PERSONS scale, Modified SAD PERSONS scale, Barratt Impulsiveness Scale), and patient and clinician estimates of risk, in identifying patients who repeat self-harm within 6 months. Method: A multisite prospective cohort study was conducted of adults aged 18 years and over referred to liaison psychiatry services following self-harm. A priori scale cut-offs were evaluated using diagnostic accuracy statistics. The area under the curve (AUC) was used to determine optimal cut-offs and compare global accuracy. Results: In total, 483 episodes of self-harm were included in the study. The episode-based 6-month repetition rate was 30% (n = 145). Sensitivity ranged from 1% (95% CI 0-5) for the SAD PERSONS scale to 97% (95% CI 93-99) for the Manchester Self-Harm Rule. Positive predictive values ranged from 13% (95% CI 2-47) for the Modified SAD PERSONS Scale to 47% (95% CI 41-53) for the clinician assessment of risk. The AUC ranged from 0.55 (95% CI 0.50-0.61) for the SAD PERSONS scale to 0.74 (95% CI 0.69-0.79) for the clinician global scale. The remaining scales performed significantly worse than clinician and patient estimates of risk (P < 0.001). Conclusions: Risk scales following self-harm have limited clinical utility and may waste valuable resources. Most scales performed no better than clinician or patient ratings of risk; some performed considerably worse. Positive predictive values were modest. In line with national guidelines, risk scales should not be used to determine patient management or to predict self-harm.

  11. EM-Navigated Catheter Placement for Gynecologic Brachytherapy: An Accuracy Study.

    PubMed

    Mehrtash, Alireza; Damato, Antonio; Pernelle, Guillaume; Barber, Lauren; Farhat, Nabgha; Viswanathan, Akila; Cormack, Robert; Kapur, Tina

    2014-03-12

    Gynecologic malignancies, including cervical, endometrial, ovarian, vaginal and vulvar cancers, cause significant mortality in women worldwide. The standard care for many primary and recurrent gynecologic cancers consists of chemoradiation followed by brachytherapy. In high dose rate (HDR) brachytherapy, intracavitary applicators and/or interstitial needles are placed directly inside the cancerous tissue so as to provide catheters to deliver high doses of radiation. Although technology for the navigation of catheters and needles is well developed for procedures such as prostate biopsy, brain biopsy, and cardiac ablation, it is notably lacking for gynecologic HDR brachytherapy. Using a benchtop study that closely mimics the clinical interstitial gynecologic brachytherapy procedure, we developed a method for evaluating the accuracy of image-guided catheter placement. Future bedside translation of this technology offers the potential benefit of maximizing tumor coverage during catheter placement while avoiding damage to the adjacent organs, for example bladder, rectum and bowel. In the study, two independent experiments were performed on a phantom model to evaluate the targeting accuracy of an electromagnetic (EM) tracking system. The procedure was carried out using a laptop computer (2.1GHz Intel Core i7 computer, 8GB RAM, Windows 7 64-bit), an EM Aurora tracking system with a 1.3mm diameter 6 DOF sensor, and 6F (2 mm) brachytherapy catheters inserted through a Syed-Neblett applicator. The 3D Slicer and PLUS open source software were used to develop the system. The mean of the targeting error was less than 2.9mm, which is comparable to the targeting errors in commercial clinical navigation systems.

  12. Physical Activity Level Improves the Predictive Accuracy of Cardiovascular Disease Risk Score: The ATTICA Study (2002–2012)

    PubMed Central

    Georgousopoulou, Ekavi N.; Panagiotakos, Demosthenes B.; Bougatsas, Dimitrios; Chatzigeorgiou, Michael; Kavouras, Stavros A.; Chrysohoou, Christina; Skoumas, Ioannis; Tousoulis, Dimitrios; Stefanadis, Christodoulos; Pitsavos, Christos

    2016-01-01

    Background: Although physical activity (PA) has long been associated with cardiovascular disease (CVD), assessment of PA status has never been used as a part of CVD risk prediction tools. The aim of the present work was to examine whether the inclusion of PA status in a CVD risk model improves its predictive accuracy. Methods: Data from the 10-year follow-up (2002–2012) of the n = 2020 participants (aged 18–89 years) of the ATTICA prospective study were used to test the research hypothesis. The HellenicSCORE (which incorporates age, sex, smoking, total cholesterol, and systolic blood pressure levels) was calculated to estimate the baseline 10-year CVD risk; assessment of PA status was based on the International Physical Activity Questionnaire. The estimated CVD risk was tested against the observed 10-year incidence (i.e., development of acute coronary syndromes, stroke, or other CVD according to the World Health Organization [WHO]-International Classification of Diseases [ICD]-10 criteria). Changes in the predictive ability of the nested CVD risk model that contained the HellenicSCORE plus PA assessment were evaluated using Harrell's C and the net reclassification index. Results: Both HellenicSCORE and PA status were predictors of future CVD events (P < 0.05). However, the classification bias of the model that included only the HellenicSCORE was significantly reduced when PA assessment was included (Harrell's C = 0.012, P = 0.032); this reduction remained significant even when adjusted for diabetes mellitus and dietary habits (P < 0.05). Conclusions: CVD risk scores seem to be more accurate when they incorporate individuals’ PA status, and thus may be more effective tools in primary prevention by efficiently allocating CVD candidates. PMID:27076890

  13. Study on Improvement of Accuracy in Inertial Photogrammetry by Combining Images with Inertial Measurement Unit

    NASA Astrophysics Data System (ADS)

    Kawasaki, Hideaki; Anzai, Shojiro; Koizumi, Toshio

    2016-06-01

    Inertial photogrammetry is defined as photogrammetry that uses a camera on which an inertial measurement unit (IMU) is mounted. In inertial photogrammetry, the position and inclination of the shooting camera are calculated using the IMU. An IMU is characterized by error growth over time because acceleration is integrated with respect to time. This study examines a procedure to estimate the position of the camera accurately while shooting, using the IMU together with structure from motion (SfM) technology, which is applied in many fields such as computer vision. When neither the coordinates of the camera position nor those of the feature points are known, SfM recovers the positional relationship between the camera and the feature points only up to an unknown scale, so the actual length of the positional coordinates is not determined. Even if the actual scale of the camera trajectory is unknown, the camera acceleration can be obtained by calculating the second-order derivative of the camera position with respect to the shooting time. The authors had previously determined the actual scale by assigning the IMU-derived position to the SfM-calculated position; however, accuracy decreased because of the error growth characteristic of the IMU. To solve this problem, a new calculation method is proposed: the difference between the IMU-measured acceleration and the camera-derived acceleration is minimized by the method of least squares, yielding the magnification required to convert the camera positions to actual dimensions. The actual length is then recovered by multiplying all the SfM point groups by the obtained magnification factor. This calculation method suppresses the error growth due to time accumulation in the IMU and improves the accuracy of inertial photogrammetry.
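
The least-squares magnification described above can be sketched in one dimension: differentiate the unscaled SfM positions twice, then solve for the scale that best matches the IMU accelerations. All numbers are hypothetical; the paper works with full 3-D trajectories:

```python
def sfm_scale_factor(sfm_pos, imu_acc, dt):
    # Unscaled acceleration via second-order central differences of SfM positions
    acc_sfm = [(sfm_pos[i - 1] - 2 * sfm_pos[i] + sfm_pos[i + 1]) / dt ** 2
               for i in range(1, len(sfm_pos) - 1)]
    # Least-squares magnification s minimising sum((imu - s * acc_sfm)^2)
    return sum(a * b for a, b in zip(imu_acc, acc_sfm)) / sum(a * a for a in acc_sfm)

# Hypothetical 1-D example: true positions 0.5*t^2 (constant 1 m/s^2 acceleration),
# with SfM reporting them at half scale, so the recovered magnification should be 2
sfm = [0.0, 0.25, 1.0, 2.25, 4.0]
imu = [1.0, 1.0, 1.0]
scale = sfm_scale_factor(sfm, imu, dt=1.0)
```

Because the fit uses accelerations rather than integrated positions, the IMU's time-accumulated drift does not enter the scale estimate, which is the key point of the method.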

  14. Accuracy evaluation of the optical surface monitoring system on EDGE linear accelerator in a phantom study.

    PubMed

    Mancosu, Pietro; Fogliata, Antonella; Stravato, Antonella; Tomatis, Stefano; Cozzi, Luca; Scorsetti, Marta

    2016-01-01

    Frameless stereotactic radiosurgery (SRS) requires dedicated systems to monitor the patient position during treatment to avoid target underdosage due to involuntary shifts. The optical surface monitoring system (OSMS) is evaluated here in a phantom-based study. The new EDGE linear accelerator from Varian (Varian, Palo Alto, CA) integrates, for cranial lesions, the common cone beam computed tomography (CBCT) and kV-MV portal imaging with the OSMS, a device able to detect movements of the patient's face in real time along all 6 couch axes (vertical, longitudinal, lateral, rotation along the vertical axis, pitch, and roll). We evaluated the capability of OSMS imaging to check the phantom's position and monitor its motion. To this end, a home-made cranial phantom was developed to evaluate the OSMS accuracy in 4 different experiments: (1) comparison with CBCT in isocenter location, (2) capability to recognize predefined shifts up to 2° or 3 cm, (3) evaluation at different couch angles, and (4) ability to properly reconstruct the surface when the linac gantry visually blocks one of the cameras. With a phantom, the OSMS proved accurate for positioning with respect to the CBCT imaging system, with differences of 0.6 ± 0.3 mm for the linear displacement vector and a maximum rotational inaccuracy of 0.3°. OSMS presented an accuracy of 0.3 mm for displacements up to 1 cm and 1°, and 0.5 mm for larger displacements. Different couch angles (45° and 90°) induced a mean vector uncertainty < 0.4 mm. Blocking one camera produced an uncertainty < 0.5 mm. Translations and rotations of a phantom can thus be accurately detected with the optical surface monitoring system.

  15. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    PubMed

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV.
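
The prevalence-tailoring step the authors advocate rests on the standard identities linking sensitivity, specificity, and prevalence to post-test probabilities. A minimal sketch, with hypothetical numbers rather than the paper's clinical examples:

```python
# How summary sensitivity/specificity from a meta-analysis can be tailored to
# a new population's disease prevalence (values below are invented).

def post_test_probabilities(sensitivity, specificity, prevalence):
    """Return (PPV, NPV) for a test applied at a given disease prevalence."""
    tp = sensitivity * prevalence              # true positive fraction
    fp = (1 - specificity) * (1 - prevalence)  # false positive fraction
    fn = (1 - sensitivity) * prevalence        # false negative fraction
    tn = specificity * (1 - prevalence)        # true negative fraction
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = post_test_probabilities(0.80, 0.90, 0.10)
print(round(ppv, 3), round(npv, 3))  # → 0.471 0.976
```

Recomputing these quantities at each new population's prevalence, instead of reusing the meta-analytic averages directly, is exactly why the paper's second example calibrates better when tailored.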

  16. Beyond the Correlation Coefficient in Studies of Self-Assessment Accuracy: Commentary on Zell & Krizan (2014).

    PubMed

    Dunning, David; Helzer, Erik G

    2014-03-01

    Zell and Krizan (2014, this issue) provide a comprehensive yet incomplete portrait of the factors influencing accurate self-assessment. This is no fault of their own. Much work on self-accuracy focuses on the correlation coefficient as the measure of accuracy, but it is not the only way self-accuracy can be measured. As such, its use can provide an incomplete and potentially misleading story. We urge researchers to explore measures of bias as well as correlation, because there are indirect hints that each responds to a different psychological dynamic. We further entreat researchers to develop other creative measures of accuracy and not to forget that self-accuracy may come not only from personal knowledge but also from insight about human nature more generally.

  17. Validity and accuracy of maternal tactile assessment for fever in under-five children in North Central Nigeria: a cross-sectional study

    PubMed Central

    Abdulkadir, Mohammed Baba; Johnson, Wahab Babatunde Rotimi; Ibraheem, Rasheedah Mobolaji

    2014-01-01

    Objectives This study seeks to determine not only the reliability of parental touch in detecting fever as compared to rectal thermometry in under-five children, but also the sociodemographic factors that may predict its reliability. Setting The study was carried out in the Emergency Paediatric Unit of a tertiary hospital in North Central Nigeria. Participants 409 children aged less than 5 years with a history of fever in the 48 h prior to presentation and their mothers were recruited consecutively. All the children recruited completed the study. Children with clinical parameters suggestive of shock, and those who were too ill, were excluded from the study. Primary and secondary outcome measures The primary outcome was the proportion of mothers who could accurately predict if their child was febrile or not (defined by rectal temperature) using tactile assessment only. Secondary outcomes were the validity and accuracy of touch in detecting fever and factors related to its accuracy. Results About 85% of the children were febrile using rectal thermometry. The sensitivity, specificity, positive predictive and negative predictive values for touch as a screening tool were 63%, 54%, 88.3% and 21%, respectively. High maternal socioeconomic status and low maternal age positively influenced the accuracy of touch in correctly determining the presence or absence of fever. Conclusions This study has shown that tactile assessment of temperature is not reliable and that absence of fever in a previously febrile child should be confirmed by objective methods of temperature measurement. PMID:25304190
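
The four screening metrics reported above all derive from a 2x2 table of tactile assessment against the rectal-thermometry reference in the standard way. A sketch with invented counts (not the study's data):

```python
# Sensitivity/specificity/PPV/NPV from a 2x2 table of index test vs. reference.

def screening_metrics(tp, fp, fn, tn):
    """tp: test+/disease+, fp: test+/disease-, fn: test-/disease+, tn: test-/disease-."""
    return {
        "sensitivity": tp / (tp + fn),  # febrile children felt as febrile
        "specificity": tn / (tn + fp),  # afebrile children felt as afebrile
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

m = screening_metrics(tp=90, fp=10, fn=60, tn=40)
print(m)  # → {'sensitivity': 0.6, 'specificity': 0.8, 'ppv': 0.9, 'npv': 0.4}
```

Note how PPV and NPV, unlike sensitivity and specificity, shift with the proportion of febrile children in the sample, which is why the study's high 85% fever prevalence yields a high PPV alongside a very low NPV.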

  18. Accuracy of various methods of localization of the orifice of the coronary sinus at electrophysiologic study.

    PubMed

    Davis, L M; Byth, K; Lau, K C; Uther, J B; Richards, D A; Ross, D L

    1992-08-01

    The coronary sinus (CS) orifice is an important reference point for determining electrode and, thereby, accessory pathway location at electrophysiologic study. The reliability of fluoroscopic landmarks used to identify the CS orifice is not known. This study compared the accuracy of several fluoroscopic landmarks for identifying the CS orifice with the location defined by radiopaque contrast injection of the CS. Forty patients were studied. Radiographic markers of the CS orifice that were examined included: (1) the point at which the CS catheter prolapsed during advancement, (2) the point of maximum convexity of the CS catheter when a superior vena caval approach was used, (3) the right side of the ventricular septum, and (4) the relation to the underlying vertebrae. The least-significant difference method of multiple comparisons was used for statistical analysis. The point at which the CS catheter prolapsed was the most accurate noncontrast method for determining the location of the CS orifice (p < 0.05), but was possible without the use of excessive force in only 48% of patients. The point of catheter prolapse was a median of 1 mm (range 0 to 11) from the true location of the os. Errors with other examined landmarks ranged up to 3 cm. Identification of the CS orifice is best performed by radiopaque contrast injection. The point of prolapse during catheter advancement in the CS is an accurate alternative when contrast injection is not feasible. Other noncontrast fluoroscopic landmarks are less reliable and are best avoided.

  19. Study of on-machine error identification and compensation methods for micro machine tools

    NASA Astrophysics Data System (ADS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-08-01

    Micro machining plays an important role in the manufacturing of miniature products which are made of various materials with complex 3D shapes and tight machining tolerance. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by the re-installment of the workpiece, the measurement and compensation method should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating the image re-constructive method, camera pixel correction, coordinate transformation, the error identification algorithm, and the trajectory auto-correction method, a vision-based error measurement and compensation method that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation was developed in this study. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to re-construct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual cutting points and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results

  20. Accuracy and Adoption of Wearable Technology Used by Active Citizens: A Marathon Event Field Study

    PubMed Central

    Suleder, Julian; Zowalla, Richard

    2017-01-01

    Background Today, runners use wearable technology such as global positioning system (GPS)–enabled sport watches to track and optimize their training activities, for example, when participating in a road race event. For this purpose, an increasing number of low-priced, consumer-oriented wearable devices are available. However, the variety of such devices is overwhelming. It is unclear which devices are used by active, healthy citizens and whether they can provide accurate tracking results in a diverse study population. No published literature has yet assessed the dissemination of wearable technology in such a cohort and related influencing factors. Objective The aim of this study was 2-fold: (1) to determine the adoption of wearable technology by runners, especially “smart” devices, and (2) to investigate the accuracy of tracked distances as recorded by such devices. Methods A pre-race survey was applied to assess which wearable technology was predominantly used by runners of different age, sex, and fitness level. A post-race survey was conducted to determine the accuracy of the devices that tracked the running course. Logistic regression analysis was used to investigate whether age, sex, fitness level, or track distance were influencing factors. Recorded distances of different device categories were tested with a 2-sample t test against each other. Results A total of 898 pre-race and 262 post-race surveys were completed. Most of the participants (approximately 75%) used wearable technology for training optimization and distance recording. Females (P=.02) and runners in higher age groups (50-59 years: P=.03; 60-69 years: P<.001; 70-79 years: P=.004) were less likely to use wearables. The mean of the track distances recorded by mobile phones with combined app (mean absolute error, MAE=0.35 km) and GPS-enabled sport watches (MAE=0.12 km) was significantly different (P=.002) for the half-marathon event. Conclusions A great variety of vendors (n=36) and devices
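
The accuracy measure used above, mean absolute error between device-recorded distances and the official course length, is straightforward to compute; the watch readings below are made up for illustration:

```python
# Mean absolute error (MAE) of recorded race distances against the true course
# length (sample readings are invented, not the study's data).

def mean_absolute_error(recorded, true_km):
    return sum(abs(r - true_km) for r in recorded) / len(recorded)

half_marathon = 21.0975  # official half-marathon distance in km
watch_readings = [21.15, 21.05, 21.20, 21.02]
print(round(mean_absolute_error(watch_readings, half_marathon), 3))
```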

  1. Accuracy and repeatability of Roentgen stereophotogrammetric analysis (RSA) for measuring knee laxity in longitudinal studies.

    PubMed

    Fleming, B C; Peura, G D; Abate, J A; Beynnon, B D

    2001-10-01

    Roentgen stereophotogrammetric analysis (RSA) can be used to assess temporal changes in anterior-posterior (A-P) knee laxity. However, the accuracy and precision of RSA is dependent on many factors and should be independently evaluated for a particular application. The objective of this study was to evaluate the use of RSA for measuring A-P knee laxity. The specific aims were to assess the variation or "noise" inherent to RSA, to determine the reproducibility of RSA for repeated A-P laxity testing, and to assess the accuracy of these measurements. Two experiments were performed. The first experiment utilized three rigid models of the tibiofemoral joint to assess the noise and to compare digitization errors of two independent examiners. No differences were found in the kinematic outputs of the RSA due to examiner, repeated trials, or the model used. In a second experiment, A-P laxity values between the A-P shear load limits of +/-60 N of five cadaver goat knees were measured to assess the error associated with repeated testing. The RSA laxity values were also compared to those obtained from a custom designed linkage system. The mean A-P laxity values with the knee 30 degrees, 60 degrees, and 90 degrees of flexion for the ACL-intact goat knee (+/-95% confidence interval) were 0.8 (+/-0.25), 0.9 (+/-0.29), and 0.4 (+/-0.22) mm, respectively. In the ACL-deficient knee, the A-P laxity values increased by an order of magnitude to 8.8 (+/-1.39), 7.6 (+/-1.32), and 3.1 (+/-1.20)mm, respectively. No significant differences were found between the A-P laxity values measured by RSA and the independent measurement technique. A highly significant linear relationship (r(2)=0.83) was also found between these techniques. This study suggests that the RSA method is an accurate and precise means to measure A-P knee laxity for repeated testing over time.

  2. Cumulative incidence of childhood autism: a total population study of better accuracy and precision.

    PubMed

    Honda, Hideo; Shimizu, Yasuo; Imai, Miho; Nitto, Yukari

    2005-01-01

    Most studies on the frequency of autism have had methodological problems. Most notable of these have been differences in diagnostic criteria between studies, degree of cases overlooked by the initial screening, and type of measurement. This study aimed to replicate the first report on childhood autism to address cumulative incidence as well as prevalence, as defined in the International Statistical Classification of Diseases and Related Health Problems, 10th revision (ICD-10) Diagnostic Criteria for Research. Here, the same methodological accuracy (exactness of a measurement to the true value) as the first study was used, but population size was four times larger to achieve greater precision (reduction of random error). A community-oriented system of early detection and early intervention for developmental disorders was established in the northern part of Yokohama, Japan. The city's routine health checkup for 18-month-old children served as the initial mass screening, and all facilities that provided child care services aimed to detect all cases of childhood autism and refer them to the Yokohama Rehabilitation Center. Cumulative incidence up to age 5 years was calculated for childhood autism among a birth cohort from four successive years (1988 to 1991). Cumulative incidence of childhood autism was 27.2 per 10000. Cumulative incidences by sex were 38.4 per 10000 in males, and 15.5 per 10000 in females. The male:female ratio was 2.5:1. The proportions of children with high-functioning autism who had Binet IQs of 70 and over and those with Binet IQs of 85 and over were 25.3% and 13.7% respectively. Data on cumulative incidence of childhood autism derived from this study are the first to be drawn from an accurate, as well as precise, screening methodology.

  3. Anti-aliasing filters for deriving high-accuracy DEMs from TLS data: A case study from Freeport, Texas

    NASA Astrophysics Data System (ADS)

    Xiong, Lin; Wang, Guoquan; Wessel, Paul

    2017-03-01

    Terrestrial laser scanning (TLS), also known as ground-based Light Detection and Ranging (LiDAR), has been frequently applied to build bare-earth digital elevation models (DEMs) for high-accuracy geomorphology studies. The point clouds acquired from TLS often achieve a spatial resolution at fingerprint (e.g., 3 cm×3 cm) to handprint (e.g., 10 cm×10 cm) level. A downsampling process has to be applied to decimate the massive point clouds and obtain manageable DEMs. It is well known that downsampling can result in aliasing that causes different signal components to become indistinguishable when the signal is reconstructed from the datasets with a lower sampling rate. Conventional DEMs are mainly the results of upsampling of sparse elevation measurements from land surveying, satellite remote sensing, and aerial photography. As a consequence, the effects of aliasing caused by downsampling have not been fully investigated in the open literature of DEMs. This study aims to investigate the spatial aliasing problem of regridding dense TLS data. The TLS data collected from the beach and dune area near Freeport, Texas in the summer of 2015 are used for this study. The core idea of the anti-aliasing procedure is to apply a low-pass spatial filter prior to conducting downsampling. This article describes the successful use of a fourth-order Butterworth low-pass spatial filter employed in the Generic Mapping Tools (GMT) software package as an anti-aliasing filter. The filter can be applied as an isotropic filter with a single cutoff wavelength or as an anisotropic filter with two different cutoff wavelengths in the X and Y directions. The cutoff wavelength for the isotropic filter is recommended to be three times the grid size of the target DEM.
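
A one-dimensional toy illustrates the core anti-aliasing idea, though not GMT's fourth-order Butterworth: low-pass filtering before decimation (here a simple boxcar average, an assumed stand-in for brevity) suppresses energy above the new Nyquist limit that would otherwise fold into the downsampled grid:

```python
# Low-pass before downsample: a 1-D aliasing demonstration (illustrative only).
import math

n, decim = 1000, 10
# 130 cycles over 1000 samples: period ~7.7 samples, shorter than twice the
# new sample spacing of 10, so direct decimation must alias.
signal = [math.sin(2 * math.pi * 130 * i / n) for i in range(n)]

naive = signal[::decim]                      # downsample with no pre-filter

window = decim                               # boxcar low-pass as anti-alias filter
smoothed = [sum(signal[i:i + window]) / window for i in range(n - window)]
filtered = smoothed[::decim]                 # downsample after filtering

# The filtered grid carries far less aliased energy than the naive one.
print(max(abs(v) for v in naive) > 3 * max(abs(v) for v in filtered))  # → True
```

A Butterworth filter, as used in the study, gives a much sharper cutoff than this boxcar while avoiding the ringing of an ideal low-pass, which is why the authors recommend it for regridding dense TLS point clouds.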

  4. Accuracy of positioning and irradiation targeting for multiple targets in intracranial image-guided radiation therapy: a phantom study.

    PubMed

    Tominaga, Hirofumi; Araki, Fujio; Shimohigashi, Yoshinobu; Ishihara, Terunobu; Kawasaki, Keiichi; Kanetake, Nagisa; Sakata, Junichi; Iwashita, Yuki

    2014-12-21

    This study investigated the accuracy of positioning and irradiation targeting for multiple off-isocenter targets in intracranial image-guided radiation therapy (IGRT). A phantom with nine circular targets was created to evaluate both accuracies. First, the central point of the isocenter target was positioned with a combination of an ExacTrac x-ray (ETX) and a 6D couch. The positioning accuracy was determined from the deviations of coordinates of the central point in each target obtained from the kV-cone beam computed tomography (kV-CBCT) for IGRT and the planning CT. Similarly, the irradiation targeting accuracy was evaluated from the deviations of the coordinates between the central point of each target and the central point of each multi-leaf collimator (MLC) field for multiple targets. Secondly, the 6D couch was intentionally rotated together with both roll and pitch angles of 0.5° and 1° at the isocenter and similarly the deviations were evaluated. The positioning accuracy for all targets was less than 1 mm after 6D positioning corrections. The irradiation targeting accuracy was up to 1.3 mm in the anteroposterior (AP) direction for a target 87 mm away from isocenter. For the 6D couch rotations with both roll and pitch angles of 0.5° and 1°, the positioning accuracy was up to 1.0 mm and 2.3 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The irradiation targeting accuracy was up to 2.1 mm and 2.6 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The off-isocenter irradiation targeting accuracy became worse than the positioning accuracy. Both off-isocenter accuracies worsened in proportion to rotation angles and the distance from the isocenter to the targets. It is necessary to examine the set-up margin for off-isocenter multiple targets at each institution because irradiation targeting accuracy is peculiar to the linac machine.

  5. Accuracy of positioning and irradiation targeting for multiple targets in intracranial image-guided radiation therapy: a phantom study

    NASA Astrophysics Data System (ADS)

    Tominaga, Hirofumi; Araki, Fujio; Shimohigashi, Yoshinobu; Ishihara, Terunobu; Kawasaki, Keiichi; Kanetake, Nagisa; Sakata, Junichi; Iwashita, Yuki

    2014-12-01

    This study investigated the accuracy of positioning and irradiation targeting for multiple off-isocenter targets in intracranial image-guided radiation therapy (IGRT). A phantom with nine circular targets was created to evaluate both accuracies. First, the central point of the isocenter target was positioned with a combination of an ExacTrac x-ray (ETX) and a 6D couch. The positioning accuracy was determined from the deviations of coordinates of the central point in each target obtained from the kV-cone beam computed tomography (kV-CBCT) for IGRT and the planning CT. Similarly, the irradiation targeting accuracy was evaluated from the deviations of the coordinates between the central point of each target and the central point of each multi-leaf collimator (MLC) field for multiple targets. Secondly, the 6D couch was intentionally rotated together with both roll and pitch angles of 0.5° and 1° at the isocenter and similarly the deviations were evaluated. The positioning accuracy for all targets was less than 1 mm after 6D positioning corrections. The irradiation targeting accuracy was up to 1.3 mm in the anteroposterior (AP) direction for a target 87 mm away from isocenter. For the 6D couch rotations with both roll and pitch angles of 0.5° and 1°, the positioning accuracy was up to 1.0 mm and 2.3 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The irradiation targeting accuracy was up to 2.1 mm and 2.6 mm in the AP direction for the target 87 mm away from the isocenter, respectively. The off-isocenter irradiation targeting accuracy became worse than the positioning accuracy. Both off-isocenter accuracies worsened in proportion to rotation angles and the distance from the isocenter to the targets. It is necessary to examine the set-up margin for off-isocenter multiple targets at each institution because irradiation targeting accuracy is peculiar to the linac machine.

  6. Popular Music as a Learning Tool in the Social Studies.

    ERIC Educational Resources Information Center

    Litevich, John A., Jr.

    This teaching guide reflects the belief that popular music is an effective tool for teachers to use in presenting social studies lessons to students. Titles of songs representative of popular music from 1955 to 1982 are listed by subject matter and suggest a possible lesson to be used in teaching that particular issue. Subject areas listed…

  7. Second International Diagnostic Accuracy Study for the Serological Detection of West Nile Virus Infection

    PubMed Central

    Papa, Anna; Sambri, Vittorio; Teichmann, Anette; Niedrig, Matthias

    2013-01-01

    Background In recent decades, sporadic cases and outbreaks in humans of West Nile virus (WNV) infection have increased. Serological diagnosis of WNV infection can be performed by enzyme-linked immunosorbent assay (ELISA), immunofluorescence assay (IFA), neutralization test (NT), and hemagglutination-inhibition assay. The aim of this study is to collect updated information regarding the performance accuracy of WNV serological diagnostics. Methodology/Principal findings In 2011, the European Network for the Diagnostics of Imported Viral Diseases-Collaborative Laboratory Response Network (ENIVD-CLRN) organized the second external quality assurance (EQA) study for the serological diagnosis of WNV infection. A serum panel of 13 samples (including sera reactive against WNV, plus specificity and negative controls) was sent to 48 laboratories involved in WNV diagnostics. Forty-seven of 48 laboratories from 30 countries participated in the study. Eight laboratories achieved 100% concurrent and correct results. The main obstacle to achieving similar performances in other laboratories was the cross-reactivity of antibodies amongst heterologous flaviviruses. No differences were observed in the performance of in-house and commercial tests used by the laboratories. IFA was significantly more specific than ELISA in detecting IgG antibodies. The overall analytical sensitivity and specificity of diagnostic tests for IgM detection were 50% and 95%, respectively. In comparison, the overall sensitivity and specificity of diagnostic tests for IgG detection were 86% and 69%, respectively. Conclusions/Significance This EQA study demonstrates that there is still a need to improve serological tests for WNV diagnosis. The low sensitivity of IgM detection suggests a risk of overlooking acute WNV infections, whereas the low specificity for IgG detection demonstrates a high level of cross-reactivity with heterologous flaviviruses. PMID:23638205

  8. Accuracy of a self-collection kit for the microbiological study of the vaginal content.

    PubMed

    Passos, Mauro Romero L; Varella, Renata Q; Barreto, Nero A; Garcia, Maria Luiza; Giraldo, Paulo C

    2007-04-01

    Diagnosis of vaginal discharge is frequently performed in an empirical way, leading to inadequate treatment. This study tested the accuracy of a self-collection kit for microbiological study of the vaginal content. One hundred and forty-two women of Family Health Program units in Niterói and Piraí cities were enrolled in order to have their vaginal content studied. A brief explanation and a self-collection kit were provided in order to sample the vaginal content. The self-collection kit was composed of one empty plastic tube, two glass slides, a long handle cytobrush, an identification card and guideline notes. The vaginal sample was applied on the glass slides by the women and stained by Gram technique. A second sampling was done by the medical personnel. The microbiological diagnosis in a blinded analysis was made under optical microscopy. A validation diagnosis test was done taking the medical collection results as a gold standard. A total of 106 women had followed the protocol and were included in the study. Microbiological analysis was unsatisfactory in 12 cases (6 cases of self-collection material and 6 cases of medical collection). The microbiological analyses in the self-collection and in the medical collection material were respectively: bacterial vaginosis in 21.7% and 17.9%, non bacillar flora in 10.3% and 11.3%, vaginal trichomoniasis in 5.66% and 5.6%, candidiasis in 3.78% and 2.8% and a normal microbiota in 52.8% and 56.6%. The Kappa coefficient suggested a "very good correlation" of the microbiological results between the two methods of collection (K=0.7945). The self-collection kit provides samples for microbiological analysis of the vaginal microbiota as good as medical collection.
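
The agreement statistic quoted above (Kappa = 0.7945) is Cohen's kappa, computable from the paired diagnoses of the two collection methods; the counts below are invented for illustration, not the study's data:

```python
# Cohen's kappa: chance-corrected agreement between two raters/methods.
from collections import Counter

def cohens_kappa(pairs):
    """pairs: list of (method_a_label, method_b_label) for each subject."""
    n = len(pairs)
    observed = sum(1 for a, b in pairs if a == b) / n
    a_counts = Counter(a for a, _ in pairs)
    b_counts = Counter(b for _, b in pairs)
    # Agreement expected by chance from each method's marginal frequencies.
    expected = sum(a_counts[c] * b_counts[c]
                   for c in set(a_counts) | set(b_counts)) / n**2
    return (observed - expected) / (1 - expected)

pairs = ([("bv", "bv")] * 20 + [("normal", "normal")] * 60 +
         [("bv", "normal")] * 5 + [("normal", "bv")] * 15)
print(round(cohens_kappa(pairs), 3))  # → 0.529
```

Values around 0.8, like the study's 0.7945, are conventionally read as very good agreement beyond chance.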

  9. Influence of instrument size on the accuracy of different apex locators: an in vitro study.

    PubMed

    Briseño-Marroquín, Benjamín; Frajlich, Santiago; Goldberg, Fernando; Willershausen, Brita

    2008-06-01

    The aim of this in vitro investigation was to determine the accuracy of 4 different electronic apex locators (EALs) with 3 different instrument sizes. For this study 146 roots were embedded in an agar solution. Electronic measurements were made to the physiologic foramen (apical constriction) with the Elements Apex Locator, Justy II, Raypex 5, and ProPex II and K-type files sizes 08, 10, and 15. Statistical significances were calculated with the sign test (P < .001). Exact measurements to the physiologic foramen were made with the Elements Apex Locator, 36.99%, 39.04%, and 44.93%; Justy II, 38.62%, 32.41%, and 43.41%; Raypex 5, 42.76%, 39.31%, and 39.06%; and ProPex II, 38.62%, 43.45%, and 40.63% of the time with instrument sizes 08, 10, and 15, respectively. No significant differences were found between the actual working length and EALs/instrument size. A nonsignificant higher number of unstable measurements were observed in all EALs with instrument size 15.

  10. Accuracy, Effectiveness and Improvement of Vibration-Based Maintenance in Paper Mills: Case Studies

    NASA Astrophysics Data System (ADS)

    AL-NAJJAR, B.

    2000-01-01

    Many current vibration-based maintenance (VBM) policies for rolling element bearings do not use as much as possible of their useful lives. Evidence and indications that the bearings' mean effective lives can be prolonged by more accurate diagnosis and prognosis are confirmed in cases of faulty bearing installation, faulty machinery design, harsh environmental conditions, and when a bearing is replaced as soon as its vibration level exceeds the normal. Analysis of data from roller bearings at two paper mills suggests that longer bearing lives can be safely achieved by increasing the accuracy of the vibration data. This paper relates bearing failure modes to the observed vibration spectra and their development patterns over the bearings' lives. A systematic approach, which describes the objectives and performance of studies in two Swedish paper mills, is presented. Explanations of the mechanisms behind some frequent modes of early failure and ways to avoid them are suggested. It is shown theoretically, and partly confirmed by the analysis of (unfortunately incomplete) data from two paper mills over many years, that accurate prediction of remaining bearing life requires: (a) enough vibration measurements, (b) numerate records of operating conditions, (c) better discrimination between frequencies in the spectrum and (d) correlation of (b) and (c). This is because life prediction depends on precise knowledge of primary, harmonic and side-band frequency amplitudes and their development over time. Further, the available data, which are collected from relevant plant activities, can be utilized to perform cyclic improvements in diagnosis, prognosis, experience and economy.

  11. Functional Knowledge Transfer for High-accuracy Prediction of Under-studied Biological Processes

    PubMed Central

    Rowland, Jessica; Guan, Yuanfang; Bongo, Lars A.; Burdine, Rebecca D.; Troyanskaya, Olga G.

    2013-01-01

    A key challenge in genetics is identifying the functional roles of genes in pathways. Numerous functional genomics techniques (e.g. machine learning) that predict protein function have been developed to address this question. These methods generally build from existing annotations of genes to pathways and thus are often unable to identify additional genes participating in processes that are not already well studied. Many of these processes are well studied in some organism, but not necessarily in an investigator's organism of interest. Sequence-based search methods (e.g. BLAST) have been used to transfer such annotation information between organisms. We demonstrate that functional genomics can complement traditional sequence similarity to improve the transfer of gene annotations between organisms. Our method transfers annotations only when functionally appropriate as determined by genomic data and can be used with any prediction algorithm to combine transferred gene function knowledge with organism-specific high-throughput data to enable accurate function prediction. We show that diverse state-of-the-art machine learning algorithms leveraging functional knowledge transfer (FKT) dramatically improve their accuracy in predicting gene-pathway membership, particularly for processes with little experimental knowledge in an organism. We also show that our method compares favorably to annotation transfer by sequence similarity. Next, we deploy FKT with a state-of-the-art SVM classifier to predict novel genes for 11,000 biological processes across six diverse organisms and expand the coverage of accurate function predictions to processes that are often ignored because of a dearth of annotated genes in an organism. Finally, we perform in vivo experimental investigation in Danio rerio and confirm the regulatory role of our top predicted novel gene, wnt5b, in leftward cell migration during heart development. FKT is immediately applicable to many bioinformatics techniques and will

  12. [Study monitoring: a useful tool for quality health research].

    PubMed

    Arias Valencia, Samuel Andrés; Hernández Pinzón, Giovanna

    2009-05-01

    As well as protecting the rights of participants, a study's ethics must encompass the quality of its execution. As such, international standards have been established for studies involving human subjects. The objective of this review is to evaluate the usefulness of the Guide to Good Clinical Practice and "study monitoring" as tools for producing quality research. The Guide provides ethical and scientific quality standards for designing, conducting, recording, and reporting studies involving human subjects. By implementing specific processes and procedures, study monitoring seeks to ensure that research is followed and evaluated from inception, through execution and closure, thus producing studies with high quality standards.

  13. A New Wind Profiler Trajectory Tool for Air Quality Studies

    NASA Astrophysics Data System (ADS)

    White, A. B.; Senff, C. J.; Keane, A. N.; Koury, J.

    2003-12-01

    The Cooperative Institute for Research in Environmental Sciences, the NOAA Environmental Technology Laboratory (NOAA/ETL), and the Science and Technology Corporation have developed a new online tool for producing forward and backward trajectories from hourly wind profiles measured by a network of boundary-layer wind profilers. The tool is intended to aid scientists and forecasters in the planning and execution of field operations during the 2004 Northeast North Atlantic Air Quality Study. This study will involve an international consortium of agencies and will include upwards of a dozen research aircraft and the NOAA research vessel Ronald H. Brown. The purpose of this talk is to demonstrate the tool and collect feedback from scientific investigators, which we will use to modify the tool before the 2004 field study. In addition, we will present preliminary results from the 2002 New England Air Quality Study that demonstrate the value of using continuous profiler observations instead of numerical model initialization fields to calculate trajectories for the meteorologically complex coastal zone of New England. The trajectory tool uses the horizontal wind profiles measured by the profiler network, which are collected in near-real time and archived at NOAA/ETL's facility in Boulder, Colorado. The vertical velocities are not used because of the large uncertainty in the profiler's vertical velocity measurement. To calculate hourly trajectory positions, the horizontal winds are interpolated in space using an inverse distance squared weighting. Users may request altitude ranges for the trajectories as well as start and end times and trajectory start and end points. Trajectories are plotted on a two-dimensional map background and are color coded by their respective altitude range.
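
    The core of such a trajectory calculation can be sketched in a few lines. The following is a minimal illustration of inverse-distance-squared interpolation of horizontal winds and a single explicit advection step; the function names, site layout, and units (km, km/h) are illustrative assumptions, not the tool's actual implementation.

```python
def idw_wind(sites, x, y, power=2.0):
    """Interpolate a horizontal wind vector (u, v) at position (x, y)
    from profiler sites given as (site_x, site_y, u, v) tuples,
    using inverse-distance weighting (weight = 1 / distance**power)."""
    num_u = num_v = denom = 0.0
    for sx, sy, u, v in sites:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:                  # exactly at a site: use its wind
            return u, v
        w = d2 ** (-power / 2.0)       # power=2 gives the 1/d^2 weighting
        num_u += w * u
        num_v += w * v
        denom += w
    return num_u / denom, num_v / denom

def advect(x, y, sites, dt_hours):
    """Advance a trajectory position by one time step using the
    interpolated wind (positions in km, wind in km/h)."""
    u, v = idw_wind(sites, x, y)
    return x + u * dt_hours, y + v * dt_hours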

  14. Accuracy Assessment of Using Rapid Prototyping Drill Templates for Atlantoaxial Screw Placement: A Cadaver Study

    PubMed Central

    Guo, Shuai; Lu, Teng; Hu, Qiaolong; Yang, Baohui; He, Xijing

    2016-01-01

    Purpose. To preliminarily evaluate the feasibility and accuracy of using rapid prototyping drill templates (RPDTs) for C1 lateral mass screw (C1-LMS) and C2 pedicle screw (C2-PS) placement. Methods. 23 formalin-fixed craniocervical cadaver specimens were randomly divided into two groups. In the conventional method group, intraoperative fluoroscopy was used to assist the screw placement. In the RPDT navigation group, specific RPDTs were constructed for each specimen and were used intraoperatively for screw placement navigation. The screw position, the operating time, and the fluoroscopy time for each screw placement were compared between the 2 groups. Results. Compared with the conventional method, the RPDT technique significantly increased the placement accuracy of the C2-PS (p < 0.05). In the axial plane, using RPDTs also significantly increased C1-LMS placement accuracy (p < 0.05). In the sagittal plane, although using RPDTs had a very high accuracy rate (100%) in C1-LMS placement, it was not statistically significant compared with the conventional method (p > 0.05). Moreover, the RPDT technique significantly decreased the operating and fluoroscopy times. Conclusion. Using RPDTs significantly increases the accuracy of C1-LMS and C2-PS placement while decreasing the screw placement time and the radiation exposure. Due to these advantages, this approach is worth promoting for use in the Harms technique. PMID:28004004

  15. [Study on high accuracy detection of multi-component gas in oil-immersed power transformer].

    PubMed

    Fan, Jie; Chen, Xiao; Huang, Qi-Feng; Zhou, Yu; Chen, Gang

    2013-12-01

    In order to solve the problems of low accuracy and mutual interference in multi-component gas detection, a high-accuracy multi-component gas detection system was designed. A narrow-bandwidth semiconductor laser was used as the light source, together with a novel long-path gas cell. By modulating the laser spectrum with a single sine signal and using space division multiplexing (SDM) and time division multiplexing (TDM) techniques, detection of multiple gas components was achieved. The experiments indicate that the linearity correlation coefficient is 0.99 and the relative measurement error is less than 4%. By gradually filling the gas cell with multi-component gas, the dynamic response time of the system was determined to be less than 15 s. The system offers high accuracy and quick response, and can be used for real-time online monitoring of fault gases in power transformers.

  16. Comparison study of algorithms and accuracy in the wavelength scanning interferometry.

    PubMed

    Muhamedsalih, Hussam; Gao, Feng; Jiang, Xiangqian

    2012-12-20

    Wavelength scanning interferometry (WSI) can be used to measure discontinuous surface profiles by producing phase shifts without any mechanical scanning process. The choice of algorithm for analyzing the WSI fringe pattern depends on the desired accuracy and computing speed. This paper provides a comparison of four different algorithms for analyzing the interference fringe pattern acquired from WSI. The mathematical description of these algorithms, their computing resolution, and their speed are presented. Two step-height samples were measured using the WSI. Experimental results demonstrate that the accuracy of the measured surface height varies from micrometer to nanometer levels depending on the algorithm used to analyze the captured interferograms.

  17. Alaska Case Study: Scientists Venturing Into Field with Journalists Improves Accuracy

    NASA Astrophysics Data System (ADS)

    Ekwurzel, B.; Detjen, J.; Hayes, R.; Nurnberger, L.; Pavangadkar, A.; Poulson, D.

    2008-12-01

    Issues such as climate change, stem cell research, and public health vaccination can be fraught with public misunderstanding, myths, and deliberate distortions of the fundamental science. Journalists are adept at creating print, radio, and video content that is both compelling and informative to the public. Yet most scientists have little time or training to devote to developing media content for the public and spend little time with journalists who cover science stories. We conducted a case study to examine whether the time and funding invested in exposing journalists to scientists in the field over several days would improve the accuracy of media stories about complex scientific topics. Twelve journalists were selected from the 70 who applied for a four-day environmental journalism fellowship in Alaska. The final group achieved the goal of a broad spectrum of media outlets (small regional to large national organizations), media (print, radio, online), and experience (early career to senior producers). Reporters met with a diverse group of scientists. The lessons learned and successful techniques will be presented. Initial results demonstrate that stories were highly accurate and rich with audio or visual content for lay audiences. The journalists have also maintained contact with the scientists, asking for leads on emerging stories and seeking new experts who can assist in their reporting. Science-based institutions should devote more funding to foster direct journalist-scientist interactions in the lab and field. These positive goals can be achieved: (1) more accurate dissemination of science information to the public; (2) a broader portion of the scientific community will become a resource to journalists instead of the same eloquent few in the community; (3) scientists will appreciate the skill and pressures of those who survive the media downsizing and provide media-savvy content; and (4) the public may incorporate science evidence

  18. Accuracy of migrant landbird habitat maps produced from LANDSAT TM data: Two case studies in southern Belize

    USGS Publications Warehouse

    Spruce, J.P.; Sader, S.; Robbins, C.S.; Dowell, B.A.; Wilson, Marcia H.; Sader, Steven A.

    1995-01-01

    The study investigated the utility of Landsat TM data for producing geo-referenced habitat maps for two study areas (Toledo and Stann Creek). Locational and non-site-specific map accuracy was evaluated by stratified random sampling and statistical analysis of satellite classification results (SCR) versus air photo interpretation results (PIR), for the overall classification and for individual classes. The effect of classification scheme specificity on map accuracy was also assessed. Decision criteria were developed for the minimum acceptable level of map performance (i.e., classification accuracy and scheme specificity). A satellite map was deemed acceptable if it had a useful degree of classification specificity, plus either adequate overall locational agreement (≥ 70%) and/or non-site-specific agreement (Chi-square goodness-of-fit test results indicating insufficient evidence to reject the null hypothesis that the overall classification distributions for the SCR and PIR are equal). For the most detailed revised classification, overall locational accuracy ranges from 52% (5 classes) for Toledo to 63% (9 classes) for Stann Creek. For the least detailed revised classification, overall locational accuracy ranges from 91% (2 classes) for Toledo to 86% (5 classes) for Stann Creek. Considering both locational and non-site-specific accuracy results, the most detailed yet sufficiently accurate classification for both sites includes low/medium/tall broadleaf forest, broadleaf forest scrub, and herb-dominated openings. For these classifications, the overall locational accuracy is 72% for Toledo (4 classes) and 75% for Stann Creek (7 classes). This level of classification detail is suitable for aiding many analyses of migrant landbird habitat use.
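
    The non-site-specific agreement test described above can be illustrated with a Pearson chi-square goodness-of-fit statistic. The class counts below are hypothetical, not the study's data; only the critical-value comparison mirrors the decision rule in the abstract.

```python
def chi_square_stat(observed, expected):
    """Pearson chi-square goodness-of-fit statistic comparing the
    satellite (SCR) class distribution to the photo-interpreted (PIR)
    distribution, taken here as the expected counts."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical per-class sample counts for a 5-class map:
scr = [120, 80, 60, 25, 15]   # satellite classification results
pir = [110, 90, 55, 30, 15]   # photo interpretation results (expected)

stat = chi_square_stat(scr, pir)
df = len(scr) - 1
CRIT_05_DF4 = 9.488           # chi-square critical value, alpha = 0.05, df = 4
agrees = stat < CRIT_05_DF4   # insufficient evidence to reject equal distributions
```

    When the statistic falls below the critical value, the two distributions are judged statistically indistinguishable, which is the "non-site-specific agreement" condition.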

  19. Applying Signal-Detection Theory to the Study of Observer Accuracy and Bias in Behavioral Assessment

    ERIC Educational Resources Information Center

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Bellaci, Emily; Miller, Jonathan; Karp, Hilary; Mahmood, Angela; Strobel, Maggie; Mullen, Shelley; Keyl, Alice; Toupard, Alexis

    2010-01-01

    We evaluated the feasibility and utility of a laboratory model for examining observer accuracy within the framework of signal-detection theory (SDT). Sixty-one individuals collected data on aggression while viewing videotaped segments of simulated teacher-child interactions. The purpose of Experiment 1 was to determine if brief feedback and…

  20. Accuracy of Person-Fit Statistics: A Monte Carlo Study of the Influence of Aberrance Rates

    ERIC Educational Resources Information Center

    St-Onge, Christina; Valois, Pierre; Abdous, Belkacem; Germain, Stephane

    2011-01-01

    Using a Monte Carlo experimental design, this research examined the relationship between answer patterns' aberrance rates and person-fit statistics (PFS) accuracy. It was observed that as the aberrance rate increased, the detection rates of PFS also increased until, in some situations, a peak was reached and then the detection rates of PFS…

  1. Accuracy of Range Restriction Correction with Multiple Imputation in Small and Moderate Samples: A Simulation Study

    ERIC Educational Resources Information Center

    Pfaffel, Andreas; Spiel, Christiane

    2016-01-01

    Approaches to correcting correlation coefficients for range restriction have been developed under the framework of large sample theory. The accuracy of missing data techniques for correcting correlation coefficients for range restriction has thus far only been investigated with relatively large samples. However, researchers and evaluators are…

  2. Accuracy of self-report of on-road crashes and traffic offences in a cohort of young drivers: the DRIVE study.

    PubMed

    Boufous, Soufiane; Ivers, Rebecca; Senserrick, Teresa; Stevenson, Mark; Norton, Robyn; Williamson, Ann

    2010-08-01

    In order to determine the accuracy of self-report of on-road crashes and traffic offences among participants in the DRIVE study, 2991 young drivers in New South Wales, Australia who completed the follow-up questionnaire were asked whether they had been involved in an on-road crash or had been convicted of a traffic offence while driving during the year prior to the survey. This information was linked to police crash data to determine the level of accuracy of self-report of on-road crashes. There was a high level of accuracy in young drivers' self-report of police-recorded crashes (85.1%; 95% CI 78.2% to 92.1%) and of police-recorded traffic offences (83.0%; 95% CI 79.4% to 86.6%). Results suggest that surveys may be useful tools for estimating the incidence of on-road crashes and traffic offences in young drivers. The findings are particularly relevant to jurisdictions where access to administrative data is limited.
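
    Confidence intervals of the kind reported above can be sketched with a normal-approximation (Wald) interval for a proportion. The sample size below is purely illustrative, chosen only to show the calculation; the study's actual denominators are not given in the abstract.

```python
import math

def wald_ci(p_hat, n, z=1.96):
    """95% normal-approximation (Wald) confidence interval for a
    proportion p_hat estimated from n observations."""
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Illustrative: an accuracy of 85.1% on a hypothetical n = 100 cases
lo, hi = wald_ci(0.851, 100)
```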

  3. A Comparative Study of Precise Point Positioning (PPP) Accuracy Using Online Services

    NASA Astrophysics Data System (ADS)

    Malinowski, Marcin; Kwiecień, Janusz

    2016-12-01

    Precise Point Positioning (PPP) is a technique used to determine the position of a receiver antenna without communication with a reference station. It may be an alternative to differential measurements, in which maintaining a connection with a single RTK station or a regional RTN network of reference stations is necessary. This situation is especially common in areas with poorly developed ground-station infrastructure. Most research conducted so far on the PPP technique has concerned the processing of entire-day observation sessions. This paper, however, presents the results of a comparative analysis of the accuracy of absolute position determination from observations lasting between 1 and 7 hours, using four permanent online services that compute PPP solutions: Automatic Precise Positioning Service (APPS), Canadian Spatial Reference System Precise Point Positioning (CSRS-PPP), GNSS Analysis and Positioning Software (GAPS), and magicPPP - Precise Point Positioning Solution (magicGNSS). On the basis of the acquired measurement results, it can be concluded that sessions of at least two hours yield an absolute position with an accuracy of 2-4 cm. The impact of simultaneous positioning of a three-point test network on the horizontal distances and relative height differences between the measured triangle vertices was also evaluated. Distances and relative height differences between points of the triangular test network measured with a Leica TDRA6000 laser station were adopted as references. The analyses show that measurement sessions of at least two hours can be used to determine horizontal distances or height differences with an accuracy of 1-2 cm. Rapid products employed in the PPP calculations yielded coordinate accuracies close to those of solutions based on Final products.

  4. Study of decoder complexity for HEVC and AVC standards based on tool-by-tool comparison

    NASA Astrophysics Data System (ADS)

    Ahn, Y. J.; Han, W. J.; Sim, D. G.

    2012-10-01

    High Efficiency Video Coding (HEVC) is the latest standardization effort of ISO/IEC MPEG and ITU-T VCEG for further improving the coding efficiency of the H.264/AVC standard. It has been reported that HEVC can provide subjective visual quality comparable to H.264/AVC at only half the bit-rate in many cases. In this paper, the decoder complexities of HEVC and H.264/AVC are studied to provide initial complexity estimates of the HEVC decoder relative to the H.264/AVC decoder. For this purpose, several selected coding tools, including intra prediction, motion compensation, transform, loop filters, and entropy coding, are analyzed in terms of the number of operations as well as their statistical differences.

  5. Comparative Accuracy Evaluation of Fine-Scale Global and Local Digital Surface Models: The Tshwane Case Study I

    NASA Astrophysics Data System (ADS)

    Breytenbach, A.

    2016-10-01

    Conducted in the City of Tshwane, South Africa, this study set out to test the accuracy of DSMs derived locally from different remotely sensed data. VHR digital mapping camera stereo-pairs, tri-stereo imagery collected by a Pléiades satellite, and data acquired by the TanDEM-X InSAR satellite configuration were fundamental in the construction of seamless DSM products at different postings, namely 2 m, 4 m, and 12 m. The three DSMs were sampled against independent control points originating from validated airborne LiDAR data. The reference surfaces were derived from the same dense point cloud at grid resolutions corresponding to those of the samples. The absolute and relative positional accuracies were computed using well-known DEM error metrics and accuracy statistics. Overall vertical accuracies were also assessed and compared across seven slope classes and nine primary land cover classes. Although all three DSMs displayed significantly more vertical errors where solid waterbodies, dense natural and/or alien woody vegetation, and, to a lesser degree, urban residential areas with significant canopy cover were encountered, all three surpassed their expected positional accuracies overall.
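
    The "well-known DEM error metrics" mentioned above typically include the mean error (bias), MAE, RMSE, and the robust NMAD. A minimal sketch of these against reference control heights, with hypothetical function and key names:

```python
import math
import statistics

def dem_error_metrics(dsm_heights, ref_heights):
    """Vertical error metrics for a DSM against reference (e.g. LiDAR)
    control heights: mean error (bias), mean absolute error, RMSE, and
    NMAD (normalized median absolute deviation, a robust alternative
    to RMSE that is less sensitive to outliers)."""
    errors = [d - r for d, r in zip(dsm_heights, ref_heights)]
    me = statistics.fmean(errors)
    mae = statistics.fmean(abs(e) for e in errors)
    rmse = math.sqrt(statistics.fmean(e * e for e in errors))
    med = statistics.median(errors)
    nmad = 1.4826 * statistics.median(abs(e - med) for e in errors)
    return {"ME": me, "MAE": mae, "RMSE": rmse, "NMAD": nmad}
```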

  6. Design and Preliminary Accuracy Studies of an MRI-Guided Transrectal Prostate Intervention System

    PubMed Central

    Krieger, Axel; Csoma, Csaba; Iordachita, Iulian I.; Guion, Peter; Singh, Anurag K.; Fichtinger, Gabor; Whitcomb, Louis L.

    2012-01-01

    This paper reports a novel system for magnetic resonance imaging (MRI) guided transrectal prostate interventions, such as needle biopsy, fiducial marker placement, and therapy delivery. The system utilizes a hybrid tracking method, comprising passive fiducial tracking for initial registration and subsequent incremental motion measurement along the degrees of freedom using fiber-optic encoders and mechanical scales. Targeting accuracy of the system was evaluated in prostate phantom experiments. The achieved targeting accuracy and procedure times were found to compare favorably with existing systems using passive and active tracking methods. Moreover, the portable design of the system, using only standard MRI image sequences and minimal custom scanner interfacing, allows the system to be easily used on different MRI scanners. PMID:18044553

  7. Brain temperature measurement: A study of in vitro accuracy and stability of smart catheter temperature sensors.

    PubMed

    Li, Chunyan; Wu, Pei-Ming; Wu, Zhizhen; Ahn, Chong H; LeDoux, David; Shutter, Lori A; Hartings, Jed A; Narayan, Raj K

    2012-02-01

    The injured brain is vulnerable to increases in temperature after severe head injury. Therefore, accurate and reliable measurement of brain temperature is important for optimizing patient outcome. In this work, we have fabricated, optimized, and characterized temperature sensors for use with a micromachined smart catheter for multimodal intracranial monitoring. The developed temperature sensors have a resistance of 100.79 ± 1.19 Ω, a sensitivity of 67.95 mV/°C over the operating range of 15-50°C, and a time constant of 180 ms. Under the optimized excitation current of 500 μA, an adequate signal-to-noise ratio was achieved without self-heating, and changes in immersion depth did not introduce clinically significant measurement errors (<0.01°C). We evaluated the accuracy and long-term drift (5 days) of twenty temperature sensors in comparison to two types of commercial temperature probes (a USB Reference Thermometer, a NIST-traceable bulk probe with 0.05°C accuracy; and the IT-21, a type T clinical microprobe with guaranteed 0.1°C accuracy) under controlled laboratory conditions. These in vitro experimental data showed that the temperature measurement performance of our sensors was accurate and reliable over the course of 5 days. The smart catheter temperature sensors provided accuracy and long-term stability comparable to those of commercial tissue-implantable microprobes, and therefore provide a means for temperature measurement in a microfabricated, multimodal cerebral monitoring device.
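
    A sensor with a fixed sensitivity like the 67.95 mV/°C quoted above can be read out with a simple linear calibration. The reference voltage and calibration temperature below are hypothetical placeholders, not values from the paper, and a constant sensitivity over 15-50°C is assumed.

```python
SENSITIVITY_MV_PER_C = 67.95   # sensitivity quoted in the abstract
V_REF_MV = 0.0                 # hypothetical output voltage at T_REF_C
T_REF_C = 25.0                 # hypothetical calibration temperature

def voltage_to_temp(v_mv):
    """One-point linear calibration: convert the amplified sensor
    voltage (mV) to temperature (degrees C), assuming constant
    sensitivity over the operating range."""
    return T_REF_C + (v_mv - V_REF_MV) / SENSITIVITY_MV_PER_C
```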

  8. Additional studies of forest classification accuracy as influenced by multispectral scanner spatial resolution

    NASA Technical Reports Server (NTRS)

    Sadowski, F. E.; Sarno, J. E.

    1976-01-01

    First, an analysis of forest feature signatures was used to help explain the large variation in classification accuracy that can occur among individual forest features for any one case of spatial resolution, and the inconsistent changes in classification accuracy that were demonstrated among features as spatial resolution was degraded. Second, the classification rejection threshold was varied in an effort to reduce the large proportion of unclassified resolution elements that previously appeared in the processing of coarse resolution data when a constant rejection threshold was used for all cases of spatial resolution. For the signature analysis, two-channel ellipse plots showing the feature signature distributions for several cases of spatial resolution indicated that the capability of signatures to correctly identify their respective features depends on the amount of statistical overlap among signatures. Reductions in signature variance that occur in data of degraded spatial resolution may not necessarily decrease the amount of statistical overlap among signatures having large variance and small mean separations. Features classified by such signatures may therefore continue to have similar numbers of misclassified elements in coarser resolution data, and thus not necessarily improve in classification accuracy.
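
    The statistical overlap between two class signatures can be quantified, for example, with the Bhattacharyya distance; its coefficient exp(-DB) lies in [0, 1], with 1 meaning complete overlap. This is a generic one-dimensional (single-channel) sketch under a Gaussian-signature assumption, not the study's own overlap measure.

```python
import math

def bhattacharyya_1d(mu1, var1, mu2, var2):
    """Bhattacharyya distance between two 1-D Gaussian signatures with
    means mu1, mu2 and variances var1, var2. Larger distance means
    less statistical overlap (better class separability)."""
    avg_var = 0.5 * (var1 + var2)
    return ((mu1 - mu2) ** 2 / (8.0 * avg_var)
            + 0.5 * math.log(avg_var / math.sqrt(var1 * var2)))

def overlap_coefficient(mu1, var1, mu2, var2):
    """Bhattacharyya coefficient in [0, 1]: 1 = identical signatures."""
    return math.exp(-bhattacharyya_1d(mu1, var1, mu2, var2))
```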

  9. Classification Accuracy of MMPI-2 Validity Scales in the Detection of Pain-Related Malingering: A Known-Groups Study

    ERIC Educational Resources Information Center

    Bianchini, Kevin J.; Etherton, Joseph L.; Greve, Kevin W.; Heinly, Matthew T.; Meyers, John E.

    2008-01-01

    The purpose of this study was to determine the accuracy of "Minnesota Multiphasic Personality Inventory" 2nd edition (MMPI-2; Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989) validity indicators in the detection of malingering in clinical patients with chronic pain using a hybrid clinical-known groups/simulator design. The…

  10. Dynamic Patterns in Development of Accuracy and Complexity: A Longitudinal Case Study in the Acquisition of Finnish

    ERIC Educational Resources Information Center

    Spoelman, Marianne; Verspoor, Marjolijn

    2010-01-01

    Within a Dynamic System Theory (DST) approach, it is assumed that language is in a constant flux, but that differences in the degree of variability can give insight into the developmental process. This longitudinal case study focuses on intra-individual variability in accuracy rates and complexity measures in Finnish learner language. The study…

  11. Study of the Effect of Modes of Electroerosion Treatment on the Microstructure and Accuracy of Precision Sizes of Small Parts

    NASA Astrophysics Data System (ADS)

    Korobova, N. V.; Aksenenko, A. Yu.; Bashevskaya, O. S.; Nikitin, A. A.

    2016-01-01

    Results of a study of the effect of the parameters of electroerosion treatment in a GF Agie Charmilles CUT 1000 OilTech wire-cutting bench on the size accuracy, the quality of the surface layer of cuts, and the microstructure of the surface of the treated parts are presented.

  12. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  13. Diabetic Retinopathy Screening Using Telemedicine Tools: Pilot Study in Hungary

    PubMed Central

    Eszes, Dóra J.; Szabó, Dóra J.; Russell, Greg; Kirby, Phil; Paulik, Edit; Nagymajtényi, László

    2016-01-01

    Introduction. Diabetic retinopathy (DR) is a sight-threatening complication of diabetes. Telemedicine tools can prevent blindness. We aimed to investigate patients' satisfaction when using such tools (fundus camera examination) and the effect of demographic and socioeconomic factors on participation in screening. Methods. A pilot study involving fundus camera screening and a self-administered questionnaire on participants' experience during fundus examination (comfort, reliability, and future interest in participation), as well as demographic and socioeconomic factors, was performed on 89 patients with known diabetes in Csongrád County, a southeastern region of Hungary. Results. Thirty percent of the patients had never participated in any ophthalmological screening, while 25.7% had DR of some grade, based upon a standard fundus camera examination and a UK-based DR grading protocol (Spectra™ software). A large majority of the patients were satisfied with the screening and found it reliable and acceptable to undergo examination under pupil dilation; 67.3% were willing to undergo nonmydriatic fundus camera examination again. There was a statistically significant relationship between economic activity, education, and marital status and future interest in participation. Discussion. Participants found digital retinal screening to be reliable and satisfactory. Telemedicine can be a strong tool, supporting eye care professionals and allowing for faster and more comfortable DR screening. PMID:28078306

  14. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  15. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  16. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet-lattice approach is taken to compute generalized forces, and a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis, including the analysis of the model in response to a 1-cos gust.

  17. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (EPA/600/R-11/123A). EPA also... Assessment Tool (BASINS CAT) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT),...

  18. Surgical accuracy of three-dimensional virtual planning: a pilot study of bimaxillary orthognathic procedures including maxillary segmentation.

    PubMed

    Stokbro, K; Aagaard, E; Torkov, P; Bell, R B; Thygesen, T

    2016-01-01

    This retrospective study evaluated the precision and positional accuracy of different orthognathic procedures following virtual surgical planning in 30 patients. To date, no studies of three-dimensional virtual surgical planning have evaluated the influence of segmentation on positional accuracy and transverse expansion. Furthermore, only a few have evaluated the precision and accuracy of genioplasty in placement of the chin segment. The virtual surgical plan was compared with the postsurgical outcome by using three linear and three rotational measurements. The influence of maxillary segmentation was analyzed in both superior and inferior maxillary repositioning. In addition, transverse surgical expansion was compared with the postsurgical expansion obtained. Overall, a high degree of linear accuracy between planned and postsurgical outcomes was found, but with a large standard deviation. The rotational difference showed an increase in pitch, mainly affecting the maxilla. Segmentation had no significant influence on maxillary placement. However, a posterior movement was observed in inferior maxillary repositioning. A lack of transverse expansion was observed in the segmented maxilla, independent of the degree of expansion.

  19. A Study of Soil Tillage Tools from Boronized Sintered Iron

    NASA Astrophysics Data System (ADS)

    Yazici, A.; Çavdar, U.

    2017-03-01

    A comparative analysis of the properties of boronized sintered iron and of the quenched steels 30MnB5 and 28MnCrB5 used for making soil tillage tools is performed. The microstructure, phase composition, hardness, and strength characteristics of the materials are studied. The composition of the boride phase formed in the sintered iron after boronizing is determined by an X-ray method. The losses to abrasive wear are evaluated with the help of a device containing a special bin with a sample of abrasive soil.

  20. Toward robust deconvolution of pass-through paleomagnetic measurements: new tool to estimate magnetometer sensor response and laser interferometry of sample positioning accuracy

    NASA Astrophysics Data System (ADS)

    Oda, Hirokuni; Xuan, Chuang; Yamamoto, Yuhji

    2016-07-01

    Pass-through superconducting rock magnetometers (SRM) offer rapid and high-precision remanence measurements for continuous samples that are essential for modern paleomagnetism studies. However, continuous SRM measurements are inevitably smoothed and distorted due to the convolution effect of the SRM sensor response. Deconvolution is necessary to restore accurate magnetization from pass-through SRM data, and robust deconvolution requires a reliable estimate of the SRM sensor response as well as an understanding of the uncertainties associated with the SRM measurement system. In this paper, we use the SRM at the Kochi Core Center (KCC), Japan, as an example to introduce a new tool and procedure for accurate and efficient estimation of SRM sensor response. To quantify uncertainties associated with the SRM measurement due to track positioning errors and test their effects on deconvolution, we employed laser interferometry for precise monitoring of track positions both with and without a u-channel sample placed on the SRM tray. The acquired KCC SRM sensor response shows a significant cross-term of Z-axis magnetization on the X-axis pick-up coil and full widths of ~46-54 mm at half-maximum response for the three pick-up coils, which are significantly narrower than those (~73-80 mm) for the liquid-He-free SRM at Oregon State University. Laser interferometry measurements on the KCC SRM tracking system indicate positioning uncertainties of ~0.1-0.2 and ~0.5 mm for tracking with and without a u-channel sample on the tray, respectively. Positioning errors appear to have reproducible components of up to ~0.5 mm, possibly due to patterns or damage on the tray surface or the rope used for the tracking system. Deconvolution of 50,000 simulated measurement data with realistic error introduced based on the position uncertainties indicates that although the SRM tracking system has recognizable positioning uncertainties, they do not significantly debilitate the use of deconvolution to accurately restore high
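    The deconvolution the authors rely on can be illustrated with a minimal frequency-domain (Wiener-style) sketch; the kernel, sampling, and noise level below are placeholders, not the KCC sensor response or the authors' algorithm.

    ```python
    import numpy as np

    def wiener_deconvolve(measured, response, noise_to_signal=1e-3):
        """Restore a magnetization profile from a pass-through measurement that
        was smoothed by the sensor response (circular convolution assumed).

        measured        : 1-D measured profile along the sample
        response        : sensor response kernel at the same sampling interval
        noise_to_signal : regularization constant that keeps noise from blowing up
        """
        n = len(measured)
        H = np.fft.rfft(response, n)
        Y = np.fft.rfft(measured, n)
        G = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)  # Wiener filter
        return np.fft.irfft(Y * G, n)
    ```

    Convolving a sharp spike with a Gaussian-like response and deconvolving recovers the spike's position; the regularization constant trades sharpness against noise amplification.
    
    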

  1. A retrospective study to validate an intraoperative robotic classification system for assessing the accuracy of kirschner wire (K-wire) placements with postoperative computed tomography classification system for assessing the accuracy of pedicle screw placements.

    PubMed

    Tsai, Tai-Hsin; Wu, Dong-Syuan; Su, Yu-Feng; Wu, Chieh-Hsin; Lin, Chih-Lung

    2016-09-01

    The purpose of this retrospective study is validation of an intraoperative robotic grading classification system for assessing the accuracy of Kirschner-wire (K-wire) placements against the postoperative computed tomography (CT)-based classification system for assessing the accuracy of pedicle screw placements. We conducted a retrospective review of prospectively collected data from 35 consecutive patients who underwent 176 robot-assisted pedicle screw instrumentations at Kaohsiung Medical University Hospital from September 2014 to November 2015. During the operation, we used a robotic grading classification system for verifying the intraoperative accuracy of K-wire placements. Three months after surgery, we used the common CT-based classification system to assess the postoperative accuracy of pedicle screw placements. The distributions of accuracy between the intraoperative robot-assisted and various postoperative CT-based classification systems were compared using kappa statistics of agreement. The intraoperative accuracies of K-wire placements before and after repositioning were classified as excellent (131/176, 74.4% and 133/176, 75.6%, respectively), satisfactory (36/176, 20.5% and 41/176, 23.3%, respectively), and malpositioned (9/176, 5.1% and 2/176, 1.1%, respectively), and the postoperative CT-based classification systems were evaluated. No screw placements were evaluated as unacceptable under any of these systems. Kappa statistics revealed no significant differences between the proposed system and the aforementioned classification systems (P < 0.001). Our results revealed no significant differences between the intraoperative robotic grading system and various postoperative CT-based grading systems. The robotic grading classification system is a feasible method for evaluating the accuracy of K-wire placements. Using the intraoperative robot grading system to classify the accuracy of K-wire placements enables predicting the postoperative accuracy of pedicle screw
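    The kappa statistic used to compare the grading systems is computed from the cross-tabulation of the two ratings. A minimal sketch; the table in the test is a made-up example, not the study's data.

    ```python
    import numpy as np

    def cohens_kappa(table):
        """Cohen's kappa for agreement between two classification systems.

        table[i][j] = number of placements rated category i by system 1
                      and category j by system 2 (a square contingency table).
        """
        t = np.asarray(table, dtype=float)
        n = t.sum()
        po = np.trace(t) / n                   # observed agreement
        pe = (t.sum(axis=0) @ t.sum(axis=1)) / n ** 2  # chance agreement
        return (po - pe) / (1.0 - pe)
    ```

    Kappa is 1 for perfect agreement, 0 when agreement equals what chance alone would produce.
    
    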

  2. Tools for Studying Quantum Emergence near Phase Transitions

    NASA Astrophysics Data System (ADS)

    Imada, Masatoshi; Onoda, Shigeki; Mizusaki, Takahiro; Watanabe, Shinji

    2003-12-01

    We review recent studies on developing tools for quantum complex phenomena. The tools have been applied for clarifying the perspective of the Mott transitions and the phase diagram of metals, Mott insulators and magnetically ordered phases in the two-dimensional Hubbard model. The path-integral renormalization-group (PIRG) method has made it possible to numerically study correlated electrons even with geometrical frustration effects without biases. It has numerically clarified the phase diagram at zero temperature, T = 0, in the parameter space of the onsite Coulomb repulsion, the geometrical frustration amplitude and the chemical potential. When the bandwidth is controlled at half filling, the first-order transition between insulating and metallic phases is evidenced. In contrast, the filling-control transition shows diverging critical fluctuations for spin and charge responses with decreasing doping concentration. Near the Mott transition, a nonmagnetic spin-liquid phase appears in a region with large frustration effects. The phase is characterized remarkably by gapless spin excitations and the vanishing dispersion of spin excitations. Magnetic orders quantum mechanically melt through diverging magnon mass. The correlator projection method (CPM) is formulated as an extension of the operator projection theory. This method also allows an extension of the dynamical mean-field theory (DMFT) with systematic inclusion of the momentum dependence in the self-energy. It has enabled determining the phase diagram at T > 0, where the boundary surface of the first-order metal-insulator transition at half filling terminates on the critical end curve at T = Tc. The critical end curve is characterized by the diverging compressibility. The single particle spectra show strong renormalization of low-energy spectra, generating largely momentum dependent and flat dispersion. The results of the two tools consistently suggest that the strong competitions of various phases with underlying

  3. Ejection time-corrected systolic velocity improves accuracy in the evaluation of myocardial dysfunction: a study in piglets.

    PubMed

    Odland, Hans Henrik; Kro, Grete Anette Birkeland; Munkeby, Berit H; Edvardsen, Thor; Saugstad, Ola Didrik; Thaulow, Erik

    2010-10-01

    This study aimed to assess the effect of correcting for the impact of heart rate (HR) or ejection time (ET) on myocardial velocities in the long axis in piglets undergoing hypoxia. The ability to eject a higher volume at a fixed ET is a characteristic of contractility in the heart. Systolic velocity of the atrioventricular annulus displacement is directly related to volume changes of the ventricle. Both ET and systolic velocity may be measured in a single heartbeat. In 29 neonatal pigs, systolic velocity and ET were measured with tissue Doppler techniques in the mitral valve annulus, the tricuspid valve annulus, and the septum. All ejection time corrected velocities (S((ET)), mean ± SEM, cm/s) decreased significantly during hypoxia (S(mva(ET)) 15.5 ± 0.2 to 13.2 ± 0.3 (p < 0.001), S(septal(ET)) 9.9 ± 0.1 to 7.8 ± 0.2 (p < 0.001), S(tva(ET)) 12.1 ± 0.2 to 9.8 ± 0.3 (p < 0.001)). The magnitude of change from baseline to hypoxia was greater for ejection time corrected systolic velocities than for RR-interval corrected velocities (mean ± SEM, cm/s); ΔS(mva(ET)) 2.3 ± 2.0 vs. ΔS(mva(RR)) 1.6 ± 1.1 (p = 0.02), ΔS(septal(ET)) 2.1 ± 1.0 vs. ΔS(septal(RR)) 1.6 ± 1.0 (p < 0.01), ΔS(tva(ET)) 2.3 ± 1.1 vs. ΔS(tva(RR)) 1.8 ± 1.3 (p = 0.04). The receiver operator characteristic (ROC) showed superior performance of S((ET)) compared with uncorrected velocities. The decrease in S((ET)) during hypoxia was not influenced by important hemodynamic determinants. ET-corrected systolic velocity improves accuracy and decreases variability in the evaluation of systolic longitudinal function and contractility during global hypoxia in neonatal pigs compared with systolic velocity alone. It is robust toward hemodynamic changes. This novel method has the potential of becoming a useful tool in clinical practice.

  4. Reconstructability analysis as a tool for identifying gene-gene interactions in studies of human diseases.

    PubMed

    Shervais, Stephen; Kramer, Patricia L; Westaway, Shawn K; Cox, Nancy J; Zwick, Martin

    2010-01-01

    There are a number of common human diseases for which the genetic component may include an epistatic interaction of multiple genes. Detecting these interactions with standard statistical tools is difficult because there may be an interaction effect, but minimal or no main effect. Reconstructability analysis (RA) uses Shannon's information theory to detect relationships between variables in categorical datasets. We applied RA to simulated data for five different models of gene-gene interaction, and find that even with heritability levels as low as 0.008, and with the inclusion of 50 non-associated genes in the dataset, we can identify the interacting gene pairs with an accuracy of ≥80%. We applied RA to a real dataset of type 2 non-insulin-dependent diabetes (NIDDM) cases and controls, and closely approximated the results of more conventional single SNP disease association studies. In addition, we replicated prior evidence for epistatic interactions between SNPs on chromosomes 2 and 15.
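    RA is grounded in Shannon information. A minimal proxy for the idea is to score a gene pair by the mutual information between the joint genotype and the phenotype, which picks up XOR-like epistatic effects that single-locus scores miss. This is an illustrative sketch of the information-theoretic principle, not the RA software.

    ```python
    import numpy as np
    from collections import Counter

    def mutual_information(x, y):
        """I(X;Y) in bits between two categorical sequences."""
        n = len(x)
        pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
        return sum((c / n) * np.log2(n * c / (px[a] * py[b]))
                   for (a, b), c in pxy.items())

    def pair_score(g1, g2, phenotype):
        """Information the joint genotype (g1, g2) carries about the phenotype."""
        return mutual_information(list(zip(g1, g2)), phenotype)
    ```

    For a pure XOR interaction, each locus alone carries zero information about the phenotype while the pair carries a full bit, which is exactly the main-effect-free situation the abstract describes.
    
    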

  5. Combining cow and bull reference populations to increase accuracy of genomic prediction and genome-wide association studies.

    PubMed

    Calus, M P L; de Haas, Y; Veerkamp, R F

    2013-10-01

    Genomic selection holds the promise to be particularly beneficial for traits that are difficult or expensive to measure, such that access to phenotypes on large daughter groups of bulls is limited. Instead, cow reference populations can be generated, potentially supplemented with existing information from the same or (highly) correlated traits available on bull reference populations. The objective of this study, therefore, was to develop a model to perform genomic predictions and genome-wide association studies based on a combined cow and bull reference data set, with the accuracy of the phenotypes differing between the cow and bull genomic selection reference populations. The developed bivariate Bayesian stochastic search variable selection model allowed for an unbalanced design by imputing residuals in the residual updating scheme for all missing records. The performance of this model is demonstrated on a real data example, where the analyzed trait, being milk fat or protein yield, was either measured only on a cow or a bull reference population, or recorded on both. Our results were that the developed bivariate Bayesian stochastic search variable selection model was able to analyze 2 traits, even though animals had measurements on only 1 of 2 traits. The Bayesian stochastic search variable selection model yielded consistently higher accuracy for fat yield compared with a model without variable selection, both for the univariate and bivariate analyses, whereas the accuracy of both models was very similar for protein yield. The bivariate model identified several additional quantitative trait loci peaks compared with the single-trait models on either trait. In addition, the bivariate models showed a marginal increase in accuracy of genomic predictions for the cow traits (0.01-0.05), although a greater increase in accuracy is expected as the size of the bull population increases. Our results emphasize that the chosen value of priors in Bayesian genomic prediction

  6. Accuracy Study of the Space-Time CE/SE Method for Computational Aeroacoustics Problems Involving Shock Waves

    NASA Technical Reports Server (NTRS)

    Wang, Xiao Yen; Chang, Sin-Chung; Jorgenson, Philip C. E.

    1999-01-01

    The space-time conservation element and solution element (CE/SE) method is used to study the sound-shock interaction problem. The order of accuracy of numerical schemes is investigated. The linear model problem, governed by the 1-D scalar convection equation, the sound-shock interaction problem governed by the 1-D Euler equations, and the 1-D shock-tube problem, which involves moving shock waves and contact surfaces, are solved to investigate the order of accuracy of numerical schemes. It is concluded that the accuracy of the CE/SE numerical scheme with designed 2nd-order accuracy becomes 1st order when a moving shock wave exists. However, the absolute error in the CE/SE solution downstream of the shock wave is on the same order as that obtained using a fourth-order accurate essentially nonoscillatory (ENO) scheme. No special techniques are used for either high-frequency low-amplitude waves or shock waves.
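    The notion of observed order of accuracy can be illustrated on a toy problem: solve the 1-D linear convection equation with a first-order upwind scheme on two grids and compare the errors, p = log(e_coarse/e_fine)/log(r). This is a generic sketch of the measurement, not the CE/SE scheme itself.

    ```python
    import numpy as np

    def upwind_error(n, t_final=0.5, cfl=0.5):
        """L1 error of first-order upwind for u_t + u_x = 0, periodic sine IC."""
        dx = 1.0 / n
        x = np.arange(n) * dx
        u = np.sin(2 * np.pi * x)
        steps = int(round(t_final / (cfl * dx)))
        for _ in range(steps):
            u = u - cfl * (u - np.roll(u, 1))   # backward difference, speed +1
        exact = np.sin(2 * np.pi * (x - t_final))
        return float(np.mean(np.abs(u - exact)))

    def observed_order(e_coarse, e_fine, refinement=2.0):
        """Observed order of accuracy from errors on two grids."""
        return float(np.log(e_coarse / e_fine) / np.log(refinement))
    ```

    For the smooth sine solution the measured order is close to the designed order of 1; for a discontinuous solution it would degrade, which is exactly the effect the abstract reports for the 2nd-order CE/SE scheme at a moving shock.
    
    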

  7. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5 nm, it becomes crucial also to include systematic error contributions, which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections, and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1 nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to metrology inaccuracy ~10 nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: imaging overlay and DBO (first-order diffraction-based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of the measurement quality metric, results in optimal overlay accuracy.

  8. A vine copula mixed effect model for trivariate meta-analysis of diagnostic test accuracy studies accounting for disease prevalence.

    PubMed

    Nikoloulopoulos, Aristidis K

    2015-08-11

    A bivariate copula mixed model has been recently proposed to synthesize diagnostic test accuracy studies, and it has been shown to be superior to the standard generalized linear mixed model in this context. Here, we use trivariate vine copulas to extend the bivariate meta-analysis of diagnostic test accuracy studies by accounting for disease prevalence. Our vine copula mixed model includes the trivariate generalized linear mixed model as a special case and can also operate on the original scale of sensitivity, specificity, and disease prevalence. Our general methodology is illustrated by re-analyzing the data of two published meta-analyses. Our study suggests that there can be an improvement over the trivariate generalized linear mixed model in fit to data, and makes the argument for moving to vine copula random effects models, especially because of their richness, including reflection-asymmetric tail dependence, and their computational feasibility despite their three-dimensionality.

  9. Borehole tool for studies in coalbed degasification wells

    NASA Astrophysics Data System (ADS)

    Serdyukov, SV; Patutin, AV; Shilova, TV

    2017-02-01

    The paper presents a downhole tool designed for gas-dynamic research to be carried out in coalbed methane drainage holes. Structurally, the tool has a twin-packer design. The tool allows hydraulic fracturing, gas-dynamic investigations using indicator diagrams and pressure drop and recovery curves, and local destressing of coal.

  10. Reversing the picture superiority effect: a speed-accuracy trade-off study of recognition memory.

    PubMed

    Boldini, Angela; Russo, Riccardo; Punia, Sahiba; Avons, S E

    2007-01-01

    Speed-accuracy trade-off methods have been used to contrast single- and dual-process accounts of recognition memory. With these procedures, subjects are presented with individual test items and required to make recognition decisions under various time constraints. In three experiments, we presented words and pictures to be intentionally learned; test stimuli were always visually presented words. At test, we manipulated the interval between the presentation of each test stimulus and that of a response signal, thus controlling the amount of time available to retrieve target information. The standard picture superiority effect was significant in long response deadline conditions (i.e., ≥2,000 msec). Conversely, a significant reverse picture superiority effect emerged at short response-signal deadlines (<200 msec). The results are congruent with views suggesting that both fast familiarity and slower recollection processes contribute to recognition memory. Alternative accounts are also discussed.

  11. Study regarding the spline interpolation accuracy of the experimentally acquired data

    NASA Astrophysics Data System (ADS)

    Oanta, Emil M.; Danisor, Alin; Tamas, Razvan

    2016-12-01

    Experimental data processing is an issue that must be solved in almost all domains of science. In engineering we usually have a large amount of data, and we try to extract the useful signal that is relevant for the phenomenon under investigation. The criteria used to consider some points more relevant than others may take into consideration various conditions, which may be either phenomenon-dependent or general. The paper presents some of the ideas and tests regarding the identification of the best set of criteria used to filter the initial set of points in order to extract a subset which best fits the approximated function. If the function has regions where it is either constant or slowly varying, fewer discretization points may be used. This leads to a simpler solution for processing the experimental data while keeping the accuracy within fairly good limits.
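    One simple realization of the idea that slowly varying regions need fewer discretization points is a greedy thinning pass: drop any interior sample that interpolation through the remaining points already reconstructs within a tolerance. A hedged sketch (linear interpolation stands in for the paper's spline):

    ```python
    import numpy as np

    def thin_points(x, y, tol):
        """Greedily drop interior samples that linear interpolation through the
        remaining points reconstructs to within `tol`.
        Returns the indices of the retained points (endpoints always kept)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        keep = list(range(len(x)))
        i = 1
        while i < len(keep) - 1:
            trial = keep[:i] + keep[i + 1:]
            rec = np.interp(x[keep[i]], x[trial], y[trial])
            if abs(rec - y[keep[i]]) <= tol:
                keep.pop(i)      # redundant at this tolerance; re-test same slot
            else:
                i += 1
        return keep
    ```

    On a straight line only the endpoints survive, while a kink in the data forces the point at the kink to be retained.
    
    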

  12. Comparison of accuracy of anterior and superomedial approaches to shoulder injection: an experimental study

    PubMed Central

    Chernchujit, Bancha; Zonthichai, Nutthapon

    2016-01-01

    Introduction: We aimed to compare the accuracy between the standard anterior technique of shoulder injection and the new superomedial technique modified from Neviaser arthroscopic portal placement. Intra-articular placement, especially at the long head of biceps (LHB) tendon, and needle depth were evaluated. Methods: Fifty-eight patients (ages 57 ± 10 years) requiring shoulder arthroscopy in the beach-chair position were recruited. Needle punctures for both techniques were performed by an experienced sports medicine orthopedist. Patients were anesthetized, and the shoulder placed in the neutral position. A single needle was passed through the skin, with only one redirection allowed per trial. The superomedial technique was performed, then the anterior technique. Posterior-portal arthroscopy determined whether needle placement was inside the joint. The percentage of intra-articular needle placements for each technique defined accuracy. When inside the joint, the needle’s precise location was determined and its depth measured. A marginal χ2 test compared results between techniques. Results: The superomedial technique was significantly more accurate than the anterior technique (84% vs. 55%, p < 0.05). For superomedial versus anterior attempts, the LHB tendon was penetrated in 4% vs. 28% of patients, respectively, and the superior labrum in 35% vs. 0% of patients, respectively; the needle depth was 42 ± 7 vs. 32 ± 7 mm, respectively (all p < 0.05). Conclusions: The superomedial technique was more accurate, penetrating the LHB tendon less frequently than the standard anterior technique. A small-diameter needle was needed to minimize superior labral injury. The superomedial technique required a longer needle to access the shoulder joint. PMID:27163102
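    A marginal χ² (McNemar-type) test for paired accuracy data of this kind depends only on the discordant pairs, i.e., patients where exactly one technique hit the joint. A minimal sketch; the counts in the test are illustrative, not the study's data.

    ```python
    import math

    def mcnemar(b, c):
        """Continuity-corrected McNemar chi-square and p-value (1 df).

        b : patients where technique 1 succeeded and technique 2 failed
        c : patients where technique 1 failed and technique 2 succeeded
        """
        chi2 = (abs(b - c) - 1.0) ** 2 / (b + c)
        # survival function of chi-square with 1 df, via the error function
        p = math.erfc(math.sqrt(chi2 / 2.0))
        return chi2, p
    ```

    With 20 vs. 5 discordant patients the test rejects equal accuracy at the 5% level, mirroring the kind of paired comparison reported in the abstract.
    
    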

  13. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and the sitting (of the two held per year) at which that mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students were those who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or
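    Because UTM easting/northing are metric planar coordinates, the home-to-College distance used in this kind of analysis reduces to a Euclidean computation within one UTM zone. A minimal sketch; the coordinates in the test are made up.

    ```python
    import math

    def utm_distance_km(easting1, northing1, easting2, northing2):
        """Straight-line distance in km between two points in the same UTM zone
        (coordinates in meters)."""
        return math.hypot(easting2 - easting1, northing2 - northing1) / 1000.0
    ```

    Crossing UTM zones would require reprojection first; within a zone the planar approximation is adequate at city scale.
    
    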

  14. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    PubMed

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study comparing three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse, and frequently perform better, than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer.
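    The "current standard model" that the copula approach generalizes, a bivariate logistic-normal random-effects model, can be sketched as a data simulator. All parameter values below are illustrative, not fitted to any real meta-analysis.

    ```python
    import numpy as np

    def simulate_accuracy_studies(n_studies, n_subjects, mean_se=0.9, mean_sp=0.8,
                                  tau=0.5, rho=-0.3, seed=0):
        """Simulate (TP, TN) counts per study under the standard bivariate
        random-effects model: logit(se) and logit(sp) jointly normal across studies.

        tau : between-study SD on the logit scale
        rho : correlation between sensitivity and specificity effects
        """
        rng = np.random.default_rng(seed)
        logit = lambda p: np.log(p / (1.0 - p))
        cov = tau ** 2 * np.array([[1.0, rho], [rho, 1.0]])
        b = rng.multivariate_normal([logit(mean_se), logit(mean_sp)], cov, n_studies)
        p = 1.0 / (1.0 + np.exp(-b))            # back to the probability scale
        tp = rng.binomial(n_subjects, p[:, 0])  # diseased correctly positive
        tn = rng.binomial(n_subjects, p[:, 1])  # healthy correctly negative
        return tp, tn
    ```

    The copula model in the paper replaces this Gaussian dependence on the logit scale with beta-binomial margins tied by a copula, which is what yields the closed likelihood.
    
    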

  15. Accuracy of Continuous Glucose Monitoring During Three Closed-Loop Home Studies Under Free-Living Conditions

    PubMed Central

    Thabit, Hood; Leelarathna, Lalantha; Wilinska, Malgorzata E.; Elleri, Daniella; Allen, Janet M.; Lubina-Solomon, Alexandra; Walkinshaw, Emma; Stadler, Marietta; Choudhary, Pratik; Mader, Julia K.; Dellweg, Sibylle; Benesch, Carsten; Pieber, Thomas R.; Arnolds, Sabine; Heller, Simon R.; Amiel, Stephanie A.; Dunger, David; Evans, Mark L.

    2015-01-01

    Abstract Objectives: Closed-loop (CL) systems modulate insulin delivery based on glucose levels measured by a continuous glucose monitor (CGM). Accuracy of the CGM affects CL performance and safety. We evaluated the accuracy of the Freestyle Navigator® II CGM (Abbott Diabetes Care, Alameda, CA) during three unsupervised, randomized, open-label, crossover home CL studies. Materials and Methods: Paired CGM and capillary glucose values (10,597 pairs) were collected from 57 participants with type 1 diabetes (41 adults [mean±SD age, 39±12 years; mean±SD hemoglobin A1c, 7.9±0.8%] recruited at five centers and 16 adolescents [mean±SD age, 15.6±3.6 years; mean±SD hemoglobin A1c, 8.1±0.8%] recruited at two centers). Numerical accuracy was assessed by absolute relative difference (ARD) and International Organization for Standardization (ISO) 15197:2013 15/15% limits, and clinical accuracy was assessed by Clarke error grid analysis. Results: Total duration of sensor use was 2,002 days (48,052 h). Overall sensor accuracy for the capillary glucose range (1.1–27.8 mmol/L) showed mean±SD and median (interquartile range) ARD of 14.2±15.5% and 10.0% (4.5%, 18.4%), respectively. Lowest mean ARD was observed in the hyperglycemic range (9.8±8.8%). Over 95% of pairs were in combined Clarke error grid Zones A and B (A, 80.1%, B, 16.2%). Overall, 70.0% of the sensor readings satisfied ISO criteria. Mean ARD was consistent (12.3%; 95% of the values fall within ±3.7%) and not different between participants (P=0.06) within the euglycemic and hyperglycemic range, when CL is actively modulating insulin delivery. Conclusions: Consistent accuracy of the CGM within the euglycemic–hyperglycemic range using the Freestyle Navigator II was observed and supports its use in home CL studies. Our results may contribute toward establishing normative CGM performance criteria for unsupervised home use of CL. PMID:26241693
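    The two numerical-accuracy metrics reported, absolute relative difference (ARD) and the ISO 15197:2013 "15/15%" limits, are straightforward to compute from paired sensor/reference values. A minimal sketch in mmol/L, taking 5.55 mmol/L (100 mg/dL) as the ISO threshold; the values in the test are made up.

    ```python
    import numpy as np

    def ard_percent(cgm, ref):
        """Absolute relative difference (%) of sensor vs. reference glucose."""
        cgm, ref = np.asarray(cgm, float), np.asarray(ref, float)
        return 100.0 * np.abs(cgm - ref) / ref

    def iso_15197_pass(cgm, ref):
        """ISO 15197:2013 '15/15' rule: within +/-0.83 mmol/L (15 mg/dL) when
        the reference is below 5.55 mmol/L, otherwise within +/-15%."""
        cgm, ref = np.asarray(cgm, float), np.asarray(ref, float)
        return np.where(ref < 5.55,
                        np.abs(cgm - ref) <= 0.83,
                        ard_percent(cgm, ref) <= 15.0)
    ```

    The fraction of paired readings for which `iso_15197_pass` is true is the "% of readings satisfying ISO criteria" figure quoted in the abstract.
    
    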

  16. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Services (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to efficiently access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NOAA's staff to conduct regional and local climate studies using state-of-the-art station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as they apply to diverse variables appropriate to each locality. The LCAT main emphasis is to enable studies of extreme meteorological and hydrological events such as tornadoes, flood, drought, severe storms, etc. LCAT will close a very critical gap in NWS local climate services because it will allow addressing climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from the LCAT outputs that could be easily incorporated into their own analysis and/or delivery systems. Presently we identified five existing requirements for local climate: (1) Local impacts of climate change; (2) Local impacts of climate variability; (3) Drought studies; (4) Attribution of severe meteorological and hydrological events; and (5) Climate studies for water resources. The methodologies for the first three requirements will be included in the LCAT first phase implementation. 
Local rate of climate change is defined as a slope of the mean trend estimated from the ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (running mean for optimal time periods), (3) exponentially

  17. Functional limits of agreement applied as a novel method comparison tool for accuracy and precision of inertial measurement unit derived displacement of the distal limb in horses.

    PubMed

    Olsen, Emil; Pfau, Thilo; Ritz, Christian

    2013-09-03

    Over-ground motion analysis in horses is limited by the small number of strides available and the constraints of the indoor gait laboratory. Inertial measurement units (IMUs) are transforming the knowledge of human motion and objective clinical assessment through the opportunity to obtain clinically relevant data under various conditions. When using IMUs on the limbs of horses to determine local position estimates, conditions with a high dynamic range of both accelerations and rotational velocities prove particularly challenging. Here we apply traditional method-agreement analysis and propose a novel functional data analysis approach to compare motion capture with IMUs placed over the fetlock joint in seven horses. We demonstrate acceptable accuracy and precision, at less than or equal to 5% of the range of motion, for detection of distal limb mounted cranio-caudal and vertical position. We do not recommend the use of the latero-medial position estimate of the distal metacarpus/metatarsus during walk, where the average error is 10% and the maximum error 111% of the range. We also show that functional data analysis and functional limits of agreement are sensitive methods for comparison of cyclical data and could be applied to differentiate changes in gait for individuals across time and conditions.

  18. Linked color imaging application for improving the endoscopic diagnosis accuracy: a pilot study

    PubMed Central

    Sun, Xiaotian; Dong, Tenghui; Bi, Yiliang; Min, Min; Shen, Wei; Xu, Yang; Liu, Yan

    2016-01-01

    Endoscopy has been widely used in diagnosing gastrointestinal mucosal lesions. However, objective endoscopic criteria are still lacking. Linked color imaging (LCI) is a newly developed endoscopic technique that enhances color contrast. Thus, we investigated the clinical application of LCI and further analyzed pixel brightness in the RGB color model. All lesions were observed by white light endoscopy (WLE), LCI and blue laser imaging (BLI). Matlab software was used to calculate pixel brightness for red (R), green (G) and blue (B). Of the endoscopic images of lesions, LCI had significantly higher R compared with BLI but higher G compared with WLE (all P < 0.05). R/(G + B) was significantly different among the 3 techniques and qualified as a composite LCI marker. Our correlation analysis of endoscopic diagnosis with pathology revealed that LCI was quite consistent with pathological diagnosis (P = 0.000) and that the color could predict certain kinds of lesions. The ROC curve demonstrated that at the cutoff of R/(G + B) = 0.646, the area under the curve was 0.646, and the sensitivity and specificity were 0.514 and 0.773, respectively. Taken together, LCI could improve the efficiency and accuracy of diagnosing gastrointestinal mucosal lesions and benefit targeted biopsy. R/(G + B) based on pixel brightness may be introduced as an objective criterion for evaluating endoscopic images. PMID:27641243
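The composite marker described above reduces to mean per-channel pixel brightness and the ratio R/(G + B), thresholded at the reported cutoff of 0.646. A minimal sketch, in which the pixel values and helper names are illustrative rather than the study's Matlab code:

```python
# Mean RGB channel brightness and the composite R/(G + B) marker,
# with the 0.646 cutoff reported in the study. Data are illustrative.

def channel_means(pixels):
    """pixels: iterable of (R, G, B) tuples -> mean R, G, B brightness."""
    n = 0
    sums = [0.0, 0.0, 0.0]
    for r, g, b in pixels:
        sums[0] += r
        sums[1] += g
        sums[2] += b
        n += 1
    return tuple(s / n for s in sums)

def lci_marker(pixels):
    """Composite marker R/(G + B) from mean channel brightness."""
    r, g, b = channel_means(pixels)
    return r / (g + b)

def classify(pixels, cutoff=0.646):
    """Flag a region when its R/(G + B) meets or exceeds the cutoff."""
    return lci_marker(pixels) >= cutoff

# Illustrative reddish mucosal region:
region = [(180, 90, 80), (170, 95, 85), (190, 88, 78)]
print(round(lci_marker(region), 3), classify(region))
```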

  19. Linked color imaging application for improving the endoscopic diagnosis accuracy: a pilot study.

    PubMed

    Sun, Xiaotian; Dong, Tenghui; Bi, Yiliang; Min, Min; Shen, Wei; Xu, Yang; Liu, Yan

    2016-09-19

    Endoscopy has been widely used in diagnosing gastrointestinal mucosal lesions. However, objective endoscopic criteria are still lacking. Linked color imaging (LCI) is a newly developed endoscopic technique that enhances color contrast. Thus, we investigated the clinical application of LCI and further analyzed pixel brightness in the RGB color model. All lesions were observed by white light endoscopy (WLE), LCI and blue laser imaging (BLI). Matlab software was used to calculate pixel brightness for red (R), green (G) and blue (B). Of the endoscopic images of lesions, LCI had significantly higher R compared with BLI but higher G compared with WLE (all P < 0.05). R/(G + B) was significantly different among the 3 techniques and qualified as a composite LCI marker. Our correlation analysis of endoscopic diagnosis with pathology revealed that LCI was quite consistent with pathological diagnosis (P = 0.000) and that the color could predict certain kinds of lesions. The ROC curve demonstrated that at the cutoff of R/(G + B) = 0.646, the area under the curve was 0.646, and the sensitivity and specificity were 0.514 and 0.773, respectively. Taken together, LCI could improve the efficiency and accuracy of diagnosing gastrointestinal mucosal lesions and benefit targeted biopsy. R/(G + B) based on pixel brightness may be introduced as an objective criterion for evaluating endoscopic images.

  20. A study of the accuracy of neutrally buoyant bubbles used as flow tracers in air

    NASA Technical Reports Server (NTRS)

    Kerho, Michael F.

    1993-01-01

    Research has been performed to determine the accuracy of neutrally buoyant and near neutrally buoyant bubbles used as flow tracers in air. Theoretical, computational, and experimental results are presented to evaluate the dynamics of bubble trajectories and the factors affecting their ability to trace flow-field streamlines. The equation of motion for a single bubble was obtained and evaluated using a computational scheme to determine the factors which affect a bubble's trajectory. A two-dimensional experiment was also conducted to experimentally determine bubble trajectories in the stagnation region of a NACA 0012 airfoil at 0 deg angle of attack using a commercially available helium bubble generation system. Physical properties of the experimental bubble trajectories were estimated using the computational scheme. These properties included the density ratio and diameter of the individual bubbles. The helium bubble system was then used to visualize and document the flow field about a 30 deg swept semispan wing with simulated glaze ice. Results were compared to Navier-Stokes calculations and surface oil flow visualization. The theoretical and computational analyses have shown that neutrally buoyant bubbles will trace even the most complex flow patterns. Experimental analysis revealed that the use of bubbles to trace flow patterns should be limited to qualitative measurements unless care is taken to ensure neutral buoyancy, because of the difficulty of producing neutrally buoyant bubbles.
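The trajectory computation above can be illustrated, in a much-simplified form, by integrating a tracer's equation of motion with Stokes-like drag: the tracer's ability to follow the flow depends on its response time, which in turn depends on density ratio and diameter. This is a sketch under those simplifying assumptions, not the paper's full equation of motion; all parameter values are illustrative.

```python
# Simplified tracer-fidelity sketch: du/dt = (u_flow - u) / tau against a
# linearly decelerating flow, integrated with explicit Euler. The final
# slip |u_flow - u| is zero for a perfect tracer. Values are illustrative.

def trace_error(tau, dt=1e-4, t_end=0.5):
    """Return the final slip velocity for a tracer with response time tau."""
    u_flow0, decel = 10.0, 10.0        # m/s, m/s^2 (illustrative flow)
    u = u_flow0                        # tracer starts matched to the flow
    t = 0.0
    while t < t_end:
        u_flow = u_flow0 - decel * t
        u += dt * (u_flow - u) / tau   # explicit Euler step
        t += dt
    return abs((u_flow0 - decel * t_end) - u)

# A small response time (near-neutral buoyancy, small bubble) traces the
# decelerating flow far better than a large one:
print(trace_error(tau=1e-3), trace_error(tau=1e-1))
```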

  1. Total Diet Studies as a Tool for Ensuring Food Safety

    PubMed Central

    Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung

    2015-01-01

    With the diversification and internationalization of the food industry and consumers' increased focus on health, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. A total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and to analyze trends in exposure to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: foods are analyzed in the form in which they will be consumed, and composite samples prepared from multiple processed ingredients are analyzed together, which is cost-effective. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881

  2. Numerical Relativity as a tool for studying the Early Universe

    NASA Astrophysics Data System (ADS)

    Garrison, David

    2013-04-01

    Numerical simulations are becoming a more effective tool for conducting detailed investigations into the evolution of our universe. In this presentation, I show how the framework of numerical relativity can be used for studying cosmological models. We are working to develop a large-scale simulation of the dynamical processes in the early universe. These take into account interactions of dark matter, scalar perturbations, gravitational waves, magnetic fields and a turbulent plasma. The code described in this report is a GRMHD code based on the Cactus framework and is structured to utilize one of several different differencing methods chosen at run-time. It is being developed and tested on the Texas Learning and Computation Center's Xanadu cluster.

  3. Total Diet Studies as a Tool for Ensuring Food Safety.

    PubMed

    Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung; Yoon, Hae-Jung

    2015-09-01

    With the diversification and internationalization of the food industry and consumers' increased focus on health, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. A total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and to analyze trends in exposure to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: foods are analyzed in the form in which they will be consumed, and composite samples prepared from multiple processed ingredients are analyzed together, which is cost-effective. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety.

  4. Concept study of an observation preparation tool for MICADO

    NASA Astrophysics Data System (ADS)

    Wegner, Michael; Schlichter, Jörg

    2016-07-01

    MICADO, the near-infrared Multi-AO Imaging Camera for Deep Observations and first light instrument for the European ELT, will provide capabilities for imaging, coronagraphy, and spectroscopy. As usual, MICADO observations will have to be prepared in advance, including AO and secondary guide star selection, offset/dither pattern definition, and an optimization for the most suitable configuration. A visual representation of the latter along with graphical and scripting interfaces is desirable. We aim at developing a flexible and user-friendly application that enhances or complements the ESO standard preparation software. Here, we give a summary of the requirements on such a tool, report on the status of our conceptual study and present a first proof-of-concept implementation.

  5. The Relationship Between Accuracy of Numerical Magnitude Comparisons and Children's Arithmetic Ability: A Study in Iranian Primary School Children.

    PubMed

    Tavakoli, Hamdollah Manzari

    2016-11-01

    The relationship between children's accuracy during numerical magnitude comparisons and arithmetic ability has been investigated by many researchers. Contradictory results have been reported from these studies due to the use of many different tasks and indices to determine the accuracy of numerical magnitude comparisons. In light of this inconsistency among measurement techniques, the present study aimed to investigate this relationship among Iranian second-grade children (n = 113) using a pre-established test (known as the Numeracy Screener) to measure numerical magnitude comparison accuracy. The results revealed that both the symbolic and non-symbolic items of the Numeracy Screener significantly correlated with arithmetic ability. However, after controlling for the effects of working memory, processing speed, and long-term memory, only performance on symbolic items accounted for unique variance in children's arithmetic ability. Furthermore, while working memory uniquely contributed to arithmetic ability in one- and two-digit arithmetic problem solving, processing speed uniquely explained only the variance in single-digit arithmetic skills, and long-term memory did not contribute any significant additional variance for one-digit or two-digit arithmetic problem solving.

  6. Accuracy of the unified approach in maternally influenced traits - illustrated by a simulation study in the honey bee (Apis mellifera)

    PubMed Central

    2013-01-01

    Background The honey bee is an economically important species. With a rapid decline of the honey bee population, it is necessary to implement an improved genetic evaluation methodology. In this study, we investigated the applicability of the unified approach and its impact on the accuracy of estimation of breeding values for maternally influenced traits on a simulated dataset for the honey bee. Due to the limitation to the number of individuals that can be genotyped in a honey bee population, the unified approach can be an efficient strategy to increase the genetic gain and to provide a more accurate estimation of breeding values. We calculated the accuracy of estimated breeding values for two evaluation approaches, the unified approach and the traditional pedigree based approach. We analyzed the effects of different heritabilities as well as genetic correlation between direct and maternal effects on the accuracy of estimation of direct, maternal and overall breeding values (sum of maternal and direct breeding values). The genetic and reproductive biology of the honey bee was accounted for by taking into consideration characteristics such as colony structure, uncertain paternity, overlapping generations and polyandry. In addition, we used a modified numerator relationship matrix and a realistic genome for the honey bee. Results For all values of heritability and correlation, the accuracy of overall estimated breeding values increased significantly with the unified approach. The increase in accuracy was always higher for the case when there was no correlation as compared to the case where a negative correlation existed between maternal and direct effects. Conclusions Our study shows that the unified approach is a useful methodology for genetic evaluation in honey bees, and can contribute immensely to the improvement of traits of apicultural interest such as resistance to Varroa or production and behavioural traits. 
In particular, the study is of great interest for

  7. [Analysis on evaluation tool for literature quality in clinical study].

    PubMed

    Liu, Qing; Zhai, Wei; Tan, Ya-qin; Huang, Juan

    2014-09-01

    The tools used for literature quality evaluation are introduced. The evaluation tools that are publicly and extensively used to assess the quality of clinical trial literature worldwide are analyzed, including the Jadad scale, the Consolidated Standards of Reporting Trials (CONSORT) statement and the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system, among others. Additionally, the current development, updates and applications of these tools are discussed.

  8. EMU battery/SMM power tool characterization study

    NASA Technical Reports Server (NTRS)

    Palandati, C.

    1982-01-01

    The power tool which will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. The results show that the EMU battery is capable of operating the power tool within a pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  9. EMU Battery/module Service Tool Characterization Study

    NASA Technical Reports Server (NTRS)

    Palandati, C. F.

    1984-01-01

    The power tool which will be used to replace the attitude control system in the SMM spacecraft is being modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery, a silver-zinc battery, was tested for the power tool application. The results obtained during testing show that the EMU battery is capable of operating the power tool within a pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  10. In vivo diagnostic accuracy of high resolution microendoscopy in differentiating neoplastic from non-neoplastic colorectal polyps: a prospective study

    PubMed Central

    Parikh, Neil; Perl, Daniel; Lee, Michelle H.; Shah, Brijen; Young, Yuki; Chang, Shannon S.; Shukla, Richa; Polydorides, Alexandros D.; Moshier, Erin; Godbold, James; Zhou, Elinor; Mitcham, Josephine; Richards-Kortum, Rebecca; Anandasabapathy, Sharmila

    2013-01-01

    High-resolution microendoscopy (HRME) is a low-cost, “optical biopsy” technology that allows for subcellular imaging. The purpose of this study was to determine the in vivo diagnostic accuracy of the HRME for the differentiation of neoplastic from non-neoplastic colorectal polyps and compare it to that of high-definition white-light endoscopy (WLE) with histopathology as the gold standard. Three endoscopists prospectively detected a total of 171 polyps from 94 patients that were then imaged by HRME and classified in real-time as neoplastic (adenomatous, cancer) or non-neoplastic (normal, hyperplastic, inflammatory). HRME had a significantly higher accuracy (94%), specificity (95%), and positive predictive value (87%) for the determination of neoplastic colorectal polyps compared to WLE (65%, 39%, and 55%, respectively). When looking at small colorectal polyps (less than 10 mm), HRME continued to significantly outperform WLE in terms of accuracy (95% vs. 64%), specificity (98% vs. 40%) and positive predictive value (92% vs. 55%). These trends continued when evaluating diminutive polyps (less than 5 mm) as HRME's accuracy (95%), specificity (98%), and positive predictive value (93%) were all significantly greater than their WLE counterparts (62%, 41%, and 53%, respectively). In conclusion, this in vivo study demonstrates that HRME can be a very effective modality in the differentiation of neoplastic and non-neoplastic colorectal polyps. A combination of standard white-light colonoscopy for polyp detection and HRME for polyp classification has the potential to truly allow the endoscopist to selectively determine which lesions can be left in situ, which lesions can simply be discarded, and which lesions need formal histopathologic analysis. PMID:24296752
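The accuracy, specificity, and positive predictive value figures quoted above all derive from a 2x2 confusion matrix against the histopathology gold standard. A minimal sketch of those metrics; the cell counts below are hypothetical, chosen only to sum to the study's 171 polyps:

```python
# Standard diagnostic-accuracy metrics from a 2x2 table
# (counts are hypothetical, not the study's actual cells).

def diagnostic_metrics(tp, fp, fn, tn):
    """Return accuracy, sensitivity, specificity and PPV from a 2x2 table."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

# Hypothetical counts for 171 polyps:
m = diagnostic_metrics(tp=95, fp=4, fn=6, tn=66)
print({k: round(v, 2) for k, v in m.items()})
```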

  11. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  12. Databases and registers: useful tools for research, no studies.

    PubMed

    Curbelo, Rafael J; Loza, Estíbaliz; de Yébenes, Maria Jesús García; Carmona, Loreto

    2014-04-01

    There are many misunderstandings about databases. Database is a commonly misused term in reference to any set of data entered into a computer. However, true databases serve a main purpose, organising data. They do so by establishing several layers of relationships; databases are hierarchical. Databases commonly organise data over different levels and over time, where time can be measured as the time between visits, or between treatments, or adverse events, etc. In this sense, medical databases are closely related to longitudinal observational studies, as databases allow the introduction of data on the same patient over time. Basically, we could establish four types of databases in medicine, depending on their purpose: (1) administrative databases, (2) clinical databases, (3) registers, and (4) study-oriented databases. But a database is a useful tool for a large variety of studies, not a type of study itself. Different types of databases serve very different purposes, and a clear understanding of the different research designs mentioned in this paper would prevent many of the databases we launch from being just a lot of work and very little science.

  13. Drosophila tools and assays for the study of human diseases

    PubMed Central

    Ugur, Berrak; Chen, Kuchuan; Bellen, Hugo J.

    2016-01-01

    ABSTRACT Many of the internal organ systems of Drosophila melanogaster are functionally analogous to those in vertebrates, including humans. Although humans and flies differ greatly in terms of their gross morphological and cellular features, many of the molecular mechanisms that govern development and drive cellular and physiological processes are conserved between both organisms. The morphological differences are deceiving and have led researchers to undervalue the study of invertebrate organs in unraveling pathogenic mechanisms of diseases. In this review and accompanying poster, we highlight the physiological and molecular parallels between fly and human organs that validate the use of Drosophila to study the molecular pathogenesis underlying human diseases. We discuss assays that have been developed in flies to study the function of specific genes in the central nervous system, heart, liver and kidney, and provide examples of the use of these assays to address questions related to human diseases. These assays provide us with simple yet powerful tools to study the pathogenic mechanisms associated with human disease-causing genes. PMID:26935102

  14. Assessing the accuracy of the International Classification of Diseases codes to identify abusive head trauma: a feasibility study

    PubMed Central

    Berger, Rachel P; Parks, Sharyn; Fromkin, Janet; Rubin, Pamela; Pecora, Peter J

    2016-01-01

    Objective To assess the accuracy of an International Classification of Diseases (ICD) code-based operational case definition for abusive head trauma (AHT). Methods Subjects were children <5 years of age evaluated for AHT by a hospital-based Child Protection Team (CPT) at a tertiary care paediatric hospital with a completely electronic medical record (EMR) system. Subjects were designated as non-AHT traumatic brain injury (TBI) or AHT based on whether the CPT determined that the injuries were due to AHT. The sensitivity and specificity of the ICD-based definition were calculated. Results There were 223 children evaluated for AHT: 117 AHT and 106 non-AHT TBI. The sensitivity and specificity of the ICD-based operational case definition were 92% (95% CI 85.8 to 96.2) and 96% (95% CI 92.3 to 99.7), respectively. All errors in sensitivity and three of the four specificity errors were due to coder error; one specificity error was a physician error. Conclusions In a paediatric tertiary care hospital with an EMR system, the accuracy of an ICD-based case definition for AHT was high. Additional studies are needed to assess the accuracy of this definition in all types of hospitals in which children with AHT are cared for. PMID:24167034
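The sensitivity and specificity above are reported with 95% confidence intervals. One common way to compute such an interval for a proportion is the Wilson score interval, sketched below; the counts used are illustrative round numbers, not the study's exact cell counts.

```python
# Wilson score interval for a binomial proportion (z = 1.96 for 95%).
# Illustrative counts only; not the study's exact 2x2 cells.
import math

def wilson_ci(successes, n, z=1.96):
    """Return (lower, upper) Wilson score interval for successes/n."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# e.g. 108 of 117 AHT cases correctly flagged (illustrative):
lo, hi = wilson_ci(108, 117)
print(round(108 / 117, 2), (round(lo, 3), round(hi, 3)))
```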

  15. Bellis perennis: a useful tool for protein localization studies.

    PubMed

    Jaedicke, Katharina; Rösler, Jutta; Gans, Tanja; Hughes, Jon

    2011-10-01

    Fluorescent fusion proteins together with transient transformation techniques are commonly used to investigate intracellular protein localisation in vivo. Biolistic transfection is reliable, efficient and avoids experimental problems associated with producing and handling fragile protoplasts. Onion epidermis pavement cells are frequently used with this technique, their excellent properties for microscopy resulting from their easy removal from the underlying tissues and large size. They also have advantages over mesophyll cells for fluorescence microscopy, as they are devoid of chloroplasts whose autofluorescence can pose problems. The arrested plastid development is peculiar to epidermal cells, however, and stands in the way of studies on protein targeting to plastids. We have developed a system enabling studies of in vivo protein targeting to organelles including chloroplasts within a photosynthetically active plant cell with excellent optical properties using a transient transformation procedure. We established biolistic transfection in epidermal pavement cells of the lawn daisy (Bellis perennis L., cultivar "Galaxy red") which unusually contain a moderate number of functional chloroplasts. These cells are excellent objects for fluorescence microscopy using current reporters, combining the advantages of the ease of biolistic transfection, the excellent optical properties of a single cell layer and access to chloroplast protein targeting. We demonstrate chloroplast targeting of plastid-localised heme oxygenase, and two further proteins whose localisation was equivocal. We also demonstrate unambiguous targeting to mitochondria, peroxisomes and nuclei. We thus propose that the Bellis system represents a valuable tool for protein localisation studies in living plant cells.

  16. Helicobacter pylori virulence factors as tools to study human migrations.

    PubMed

    Queiroz, Dulciene Maria de Magalhães; Cunha, Roberto Penna de Almeida; Saraiva, Ivan Euclides Borges; Rocha, Andreia Maria Camargos

    2010-12-15

    Helicobacter pylori is one of the most common infections worldwide. In most individuals it consists of a lifelong host-pathogen relationship without consequences, but in some subjects it is associated with peptic ulcer disease and gastric cancer. Polymorphisms in the genes that code for the bacterial virulence factors cagA and vacA are independently associated with severe outcomes of the infection and are geographically diverse. In the last decade, accumulated knowledge has allowed characterization of typical H. pylori strain patterns for all the major human populations; these patterns can be used to study the origin of specific human groups. Thus, the presence or absence of cagA, cagA EPIYA genotypes, and vacA subtypes can be used as tools to study not only the geographic origin of specific human populations but also to identify markers of historical contact between different ethnicities. We report here a study of a set of native Amazon Amerindians who had supposedly had some, but little, contact with European Brazilian colonizers and/or African slaves. They harbor H. pylori strains with a mixed pattern of Asian and Iberian Peninsula characteristics. It is possible that this finding represents H. pylori recombination upon brief contact between human groups. Alternatively, it could be due to a founder effect from a small cluster of Native Americans of Asian origin.

  17. A Study of the Training of Tool and Die Makers.

    ERIC Educational Resources Information Center

    Horowitz, Morris A.; Herrnstadt, Irwin L.

    To develop and test a methodology which would help determine the combination of education, training, and experience that is most likely to yield highly qualified workers in specific occupations, the tool and die maker trade was selected for examination in the Boston Metropolitan Area. Tool and die making was chosen because it is a clearly…

  18. A 3-D numerical study of pinhole diffraction to predict the accuracy of EUV point diffraction interferometry

    SciTech Connect

    Goldberg, K.A.; Tejnil, E.; Bokor, J.

    1995-12-01

    A 3-D electromagnetic field simulation is used to model the propagation of extreme ultraviolet (EUV), 13-nm, light through sub-1500 Å diameter pinholes in a highly absorptive medium. Deviations of the diffracted wavefront phase from an ideal sphere are studied within 0.1 numerical aperture to predict the accuracy of EUV point diffraction interferometers used in at-wavelength testing of nearly diffraction-limited EUV optical systems. Aberration magnitudes are studied for various 3-D pinhole models, including cylindrical and conical pinhole bores.

  19. Studies on Effect of Fused Deposition Modelling Process Parameters on Ultimate Tensile Strength and Dimensional Accuracy of Nylon

    NASA Astrophysics Data System (ADS)

    Basavaraj, C. K.; Vishwas, M.

    2016-09-01

    This paper discusses the process parameters for fused deposition modelling (FDM). Layer thickness, orientation angle and shell thickness are the process variables considered for the studies. Ultimate tensile strength, dimensional accuracy and manufacturing time are the response parameters. Taguchi's L9 orthogonal array was used to set the number of experimental runs. Taguchi's S/N ratio was used to identify the set of process parameters that gives good results for each response characteristic. The effectiveness of each parameter was investigated using analysis of variance. The material used for the process-parameter studies is Nylon.
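The S/N ratios used in a Taguchi analysis like the one above come in standard forms: "larger the better" (appropriate for tensile strength) and "smaller the better" (appropriate for dimensional error or build time). A sketch of both, with illustrative replicate values:

```python
# Taguchi S/N ratios in their two most common forms.
# Replicate values below are illustrative, not the study's data.
import math

def sn_larger_the_better(values):
    """S/N = -10 * log10(mean(1/y^2)); higher is better."""
    return -10 * math.log10(sum(1 / (y * y) for y in values) / len(values))

def sn_smaller_the_better(values):
    """S/N = -10 * log10(mean(y^2)); higher (less negative) is better."""
    return -10 * math.log10(sum(y * y for y in values) / len(values))

# Illustrative tensile-strength replicates (MPa) for one L9 run:
print(round(sn_larger_the_better([42.0, 44.5, 43.2]), 2))
```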

  20. Accuracy of tablet splitting: Comparison study between hand splitting and tablet cutter

    PubMed Central

    Habib, Walid A.; Alanizi, Abdulaziz S.; Abdelhamid, Magdi M.; Alanizi, Fars K.

    2013-01-01

    Background Tablet splitting is often used in pharmacy practice to adjust administered doses. It is also used as a method of reducing medication costs. Objective To investigate the accuracy of tablet splitting by comparing hand splitting with a tablet cutter for a low-dose drug tablet. Methods Salbutamol tablets (4 mg) were chosen as the low-dose tablets. A randomly selected equal number of tablets were split by hand and by a tablet cutter, and the remaining tablets were kept whole. Weight variation and drug content were analysed for salbutamol in 0.1 N HCl using a validated spectrophotometric method. The percentages by which each whole tablet's or half-tablet's drug content and weight differed from sample mean values were compared with USP specification ranges for drug content. The %RSD was also calculated in order to determine whether the drugs met the USP specification for %RSD. The tablets and half-tablets were scanned using electron microscopy to show any visual differences arising from splitting. Results 27.5% of samples differed from sample mean values by a percentage that fell outside the USP specification for weight, of which 15% of those from the tablet cutter and 25% of those split by hand fell outside the specifications. All whole tablets and half-tablets met the USP specifications for drug content, but the variation in content between the two halves reached 21.3% of total content in the case of hand splitting, versus only 7.13% for the tablet cutter. The %RSDs for drug content and weight met the USP specification for whole salbutamol tablets and for the half-tablets split by tablet cutter. The halves split by hand fell outside the specification for %RSD (drug content = 6.43%, weight = 8.33%). The differences were visually clear in the electron microscope scans. Conclusion Drug content variation in half-tablets appeared to be attributable to weight variation occurring during the splitting process. This could have serious clinical consequences for
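The %RSD comparison above is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch; the half-tablet weights below are made up for illustration:

```python
# %RSD (relative standard deviation) of half-tablet weights.
# Weights are illustrative, not the study's measurements.
import statistics

def percent_rsd(values):
    """Sample standard deviation as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

hand_split = [210, 175, 198, 224, 188, 205]     # mg, illustrative
cutter_split = [201, 195, 199, 204, 197, 202]   # mg, illustrative
print(round(percent_rsd(hand_split), 2), round(percent_rsd(cutter_split), 2))
```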

  1. CopyRighter: a rapid tool for improving the accuracy of microbial community profiles through lineage-specific gene copy number correction

    PubMed Central

    2014-01-01

    Background Culture-independent molecular surveys targeting conserved marker genes, most notably 16S rRNA, to assess microbial diversity remain semi-quantitative due to variations in the number of gene copies between species. Results Based on 2,900 sequenced reference genomes, we show that 16S rRNA gene copy number (GCN) is strongly linked to microbial phylogenetic taxonomy, potentially under-representing Archaea in amplicon microbial profiles. Using this relationship, we inferred the GCN of all bacterial and archaeal lineages in the Greengenes database within a phylogenetic framework. We created CopyRighter, new software which uses these estimates to correct 16S rRNA amplicon microbial profiles and associated quantitative (q)PCR total abundance. CopyRighter parses microbial profiles and, because GCN estimates are pre-computed for all taxa in the reference taxonomy, rapidly corrects GCN bias. Software validation with in silico and in vitro mock communities indicated that GCN correction results in more accurate estimates of microbial relative abundance and improves the agreement between metagenomic and amplicon profiles. Analyses of human-associated and anaerobic digester microbiomes illustrate that correction makes tangible changes to estimates of qPCR total abundance and α and β diversity, and can significantly change biological interpretation. For example, human gut microbiomes from twins were reclassified into three rather than two enterotypes after GCN correction. Conclusions The CopyRighter bioinformatic tool permits rapid correction of GCN bias in microbial surveys, resulting in improved estimates of microbial abundance and α and β diversity. PMID:24708850
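    The correction principle is simple: divide each taxon's read count by its estimated 16S copy number, then renormalize. A minimal sketch with hypothetical taxa and GCN values (the actual tool uses pre-computed, lineage-specific estimates for the Greengenes taxonomy):

```python
# Hypothetical amplicon read counts and estimated 16S copies per genome.
counts = {"Taxon_A": 600, "Taxon_B": 300, "Taxon_C": 100}
gcn    = {"Taxon_A": 6.0, "Taxon_B": 3.0, "Taxon_C": 1.0}

# Reads / copy number approximates cell counts rather than gene counts.
corrected = {t: c / gcn[t] for t, c in counts.items()}
total = sum(corrected.values())
rel_abund = {t: v / total for t, v in corrected.items()}
print(rel_abund)  # each taxon ~1/3: the copy-number bias is removed
```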

  2. Ciliobrevins as tools for studying dynein motor function

    PubMed Central

    Roossien, Douglas H.; Miller, Kyle E.; Gallo, Gianluca

    2015-01-01

    Dyneins are a small class of molecular motors that bind to microtubules and walk toward their minus ends. They are essential for the transport and distribution of organelles, signaling complexes and cytoskeletal elements. In addition, dyneins generate forces on microtubule arrays that power the beating of cilia and flagella, cell division, migration and growth cone motility. Classical approaches to the study of dynein function in axons involve the depletion of dynein, expression of mutant/truncated forms of the motor, or interference with accessory subunits. By necessity, these approaches require prolonged time periods for the expression or manipulation of cellular dynein levels. With the discovery of the ciliobrevins, a class of cell-permeable small molecule inhibitors of dynein, it is now possible to acutely disrupt dynein both globally and locally. In this review, we briefly summarize recent work using ciliobrevins to inhibit dynein and discuss the insights ciliobrevins have provided about dynein function in various cell types, with a focus on neurons. We temper this with a discussion of the need for studies that will elucidate the mechanism of action of ciliobrevins, as well as the need for experiments to further analyze the specificity of ciliobrevins for dynein. Although much remains to be learned about ciliobrevins, these small molecules are proving themselves to be valuable novel tools to assess the cellular functions of dynein. PMID:26217180

  3. Oral Fluency, Accuracy, and Complexity in Formal Instruction and Study Abroad Learning Contexts

    ERIC Educational Resources Information Center

    Mora, Joan C.; Valls-Ferrer, Margalida

    2012-01-01

    This study investigates the differential effects of two learning contexts, formal instruction (FI) at home and a study abroad period (SA), on the oral production skills of advanced-level Catalan-Spanish undergraduate learners of English. Speech samples elicited through an interview at three data collection times over a 2-year period were…

  4. ent-Steroids: novel tools for studies of signaling pathways.

    PubMed

    Covey, Douglas F

    2009-07-01

    Membrane receptors are often modulated by steroids and it is necessary to distinguish the effects of steroids at these receptors from effects occurring at nuclear receptors. Additionally, it may also be mechanistically important to distinguish between direct effects caused by binding of steroids to membrane receptors and indirect effects on membrane receptor function caused by steroid perturbation of the membrane containing the receptor. In this regard, ent-steroids, the mirror images of naturally occurring steroids, are novel tools for distinguishing between these various actions of steroids. The review provides a background for understanding the different actions that can be expected of steroids and ent-steroids in biological systems, references for the preparation of ent-steroids, a short discussion about relevant forms of stereoisomerism and the requirements that need to be fulfilled for the interaction between two molecules to be enantioselective. The review then summarizes results of biophysical, biochemical and pharmacological studies published since 1992 in which ent-steroids have been used to investigate the actions of steroids in membranes and/or receptor-mediated signaling pathways.

  5. Biosphere 2 Center as a unique tool for environmental studies.

    PubMed

    Walter, Achim; Lambrecht, Susanne Carmen

    2004-04-01

    The Biosphere 2 Laboratory of Biosphere 2 Center, Arizona, is a unique, self-contained glasshouse fostering several mesocosms of tropical and subtropical regions on an area of 12,700 m2. It was constructed around 1990 to test whether human life is possible in this completely sealed, self-sustaining artificial ecosystem. Mainly due to overly rich organic soils, the initial mission failed in a spectacular manner that raised enormous disbelief in the scientific seriousness of the project. From 1995 to 2003, the facility was operated by Columbia University under a completely new scientific management. The aim of the project was then to conduct research in the field of 'experimental climate change science'. Climatic conditions within the mesocosms can be precisely controlled. In studies with elevated CO2 and altered temperature and irrigation regimes performed in the rainforest, coral reef and agriforestry mesocosms, the facility proved to be a valuable tool for global climate change research. At the time of submission of this manuscript, Columbia University was relinquishing the management of the facility, although there was a contract to operate it until 2010, leaving it with an unclear destiny that might bring about anything from complete abandonment to a new flowering phase with a new mission.

  6. Tool deflection in the milling of titanium alloy: case study

    NASA Astrophysics Data System (ADS)

    Zebala, W.

    2015-09-01

    Tool deflection strongly influences workpiece quality. The author built a simulation model of the down milling of a titanium alloy (Ti6Al4V) with a tool made of sintered carbides. The material model consists of strain, strain-rate and thermal sensitivity formulations to predict the stress field distribution in the cutting zone. Numerical calculations were verified experimentally on a milling center equipped with measuring devices: a force dynamometer, a thermo-vision camera and a high-speed video camera.

  7. Assessment of the accuracy of ABC/2 variations in traumatic epidural hematoma volume estimation: a retrospective study

    PubMed Central

    Hu, Tingting; Zhang, Zhen

    2016-01-01

    Background. The traumatic epidural hematoma (tEDH) volume is often used to assist in tEDH treatment planning and outcome prediction. ABC/2 is a well-accepted volume estimation method that can be used for tEDH volume estimation. Previous studies have proposed different variations of ABC/2; however, it is unclear which variation provides higher accuracy. Given the promising clinical contribution of accurate tEDH volume estimation, we sought to assess the accuracy of several ABC/2 variations in tEDH volume estimation. Methods. The study group comprised 53 patients with tEDH who had undergone non-contrast head computed tomography scans. For each patient, the tEDH volume was automatically estimated by eight ABC/2 variations (four traditional and four newly derived) with an in-house program, and results were compared to those from manual planimetry. Linear regression, the closest value, percentage deviation, and Bland-Altman plots were adopted to comprehensively assess accuracy. Results. Among all ABC/2 variations assessed, the traditional variations y = 0.5 × A1B1C1 (or A2B2C1) and the newly derived variations y = 0.65 × A1B1C1 (or A2B2C1) achieved higher accuracy than the other variations. No significant differences were observed between the estimated volume values generated by these variations and those of planimetry (p > 0.05). Comparatively, the former performed better than the latter in general, with smaller mean percentage deviations (7.28 ± 5.90% and 6.42 ± 5.74% versus 19.12 ± 6.33% and 21.28 ± 6.80%, respectively) and more values closest to planimetry (18/53 and 18/53 versus 2/53 and 0/53, respectively). Moreover, deviations of most cases in the former fell within the range of <10% (71.70% and 84.91%, respectively), whereas deviations of most cases in the latter were in the range of 10–20% and >20% (90.57% and 96.23%, respectively). Discussion. In the current study, we adopted an automatic approach to assess the accuracy of several ABC/2 variations
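    The ABC/2 family of estimators all take the form factor × A × B × C for three orthogonal hematoma diameters. A minimal sketch comparing the 0.5 and 0.65 factors against a planimetry reference; the diameters and reference volume below are hypothetical:

```python
def abc_volume(a, b, c, factor=0.5):
    """Ellipsoid-style volume estimate: factor * A * B * C.
    factor=0.5 is the traditional ABC/2; the study also tests 0.65."""
    return factor * a * b * c

def percent_deviation(estimate, reference):
    """Percentage deviation of an estimate from the planimetry reference."""
    return 100 * abs(estimate - reference) / reference

# Hypothetical tEDH diameters (cm) and a planimetry reference volume (mL).
a, b, c, planimetry = 5.0, 4.0, 3.0, 32.0
for f in (0.5, 0.65):
    v = abc_volume(a, b, c, f)
    print(f"factor {f}: {v:.1f} mL, deviation {percent_deviation(v, planimetry):.1f}%")
```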

  8. ACCURACY AND PRECISION OF A METHOD TO STUDY KINEMATICS OF THE TEMPOROMANDIBULAR JOINT: COMBINATION OF MOTION DATA AND CT IMAGING

    PubMed Central

    Baltali, Evre; Zhao, Kristin D.; Koff, Matthew F.; Keller, Eugene E.; An, Kai-Nan

    2008-01-01

    The purpose of the study was to test the precision and accuracy of a method used to track selected landmarks during motion of the temporomandibular joint (TMJ). A precision phantom device was constructed and relative motions between two rigid bodies on the phantom device were measured using optoelectronic (OE) and electromagnetic (EM) motion tracking devices. The motion recordings were also combined with a 3D CT image for each type of motion tracking system (EM+CT and OE+CT) to mimic methods used in previous studies. In the OE and EM data collections, specific landmarks on the rigid bodies were determined using digitization. In the EM+CT and OE+CT data sets, the landmark locations were obtained from the CT images. 3D linear distances and 3D curvilinear path distances were calculated for the points. The accuracy and precision for all 4 methods were evaluated (EM, OE, EM+CT and OE+CT). In addition, results were compared with and without the CT imaging (EM vs. EM+CT, OE vs. OE+CT). All systems overestimated the actual 3D curvilinear path lengths. All systems also underestimated the actual rotation values. The accuracy of all methods was within 0.5 mm for 3D curvilinear path calculations, 0.05 mm for 3D linear distance calculations, and 0.2° for rotation calculations. In addition, Bland-Altman plots for each configuration of the systems suggest that measurements obtained from either system are repeatable and comparable. PMID:18617178
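    The two distance measures compared above are straightforward to compute from tracked landmark positions. A minimal sketch with a hypothetical trajectory (the study's landmarks came from digitization or CT images):

```python
import math

def linear_distance(points):
    """3D straight-line distance between the first and last positions."""
    return math.dist(points[0], points[-1])

def curvilinear_path(points):
    """3D path length: sum of distances between consecutive positions."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# Hypothetical landmark trajectory (mm) sampled during jaw motion.
traj = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (1.0, 1.0, 2.0)]
print(curvilinear_path(traj))  # 4.0
print(linear_distance(traj))   # sqrt(6), about 2.449
```

The path length is always at least the linear distance, which is why noisy tracking tends to overestimate curvilinear paths, as reported above.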

  9. The accuracy of the MMSE in detecting cognitive impairment when administered by general practitioners: A prospective observational study

    PubMed Central

    Pezzotti, Patrizio; Scalmana, Silvia; Mastromattei, Antonio; Di Lallo, Domenico

    2008-01-01

    Background The Mini-Mental State Examination (MMSE) has contributed to detecting cognitive impairment, yet few studies have evaluated its accuracy when used by general practitioners (GP) in an actual public-health setting. Objectives We evaluated the accuracy of MMSE scores obtained by GPs by comparing them to scores obtained by Alzheimer's Evaluation Units (UVA). Methods The study was observational in design and involved 59 voluntary GPs who, after having undergone training, administered the MMSE to patients with symptoms of cognitive disturbances. Individuals who scored ≤ 24 (adjusted by age and educational level) were referred to Alzheimer's Evaluation Units (UVA) for diagnosis (including the MMSE). UVAs were unblinded to the MMSE score of the GP. To measure interrater agreement, the weighted Kappa statistic was calculated. To evaluate factors associated with the magnitude of the difference between paired scores, a linear regression model was applied. To quantify the accuracy in discriminating no cognitive impairment from any cognitive impairment and from Alzheimer's disease (AD), the ROC curves (AUC) were calculated. Results For the 317 patients, the mean score obtained by GPs was significantly lower (15.8 vs. 17.4 for the UVAs; p < 0.01). However, overall concordance was good (Kappa = 0.86). Only the diagnosis made by the UVA was associated with the difference between paired scores: the adjusted mean difference was 3.1 for no cognitive impairment and 3.8 for mild cognitive impairment. The AUC of the scores for GPs was 0.80 (95%CI: 0.75–0.86) for discriminating between no impairment and any impairment and 0.89 (95%CI: 0.84–0.94) for distinguishing patients with AD, though the UVA scores discriminated better. Conclusion In a public-health setting involving patients with symptoms of cognitive disturbances, the MMSE used by the GPs was sufficiently accurate to detect patients with cognitive impairment, particularly those with dementia. PMID:18477390
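    Interrater agreement of the kind reported above is commonly quantified with a weighted Cohen's kappa. A pure-Python sketch using linear disagreement weights (the abstract does not state which weighting scheme was used, and the rating data below are hypothetical):

```python
def weighted_kappa(ratings_a, ratings_b, categories):
    """Linearly weighted Cohen's kappa for two raters over ordered categories."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(ratings_a)
    # Observed proportion matrix (rows: rater A, cols: rater B).
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(ratings_a, ratings_b):
        obs[idx[a]][idx[b]] += 1 / n
    pa = [sum(row) for row in obs]                               # A marginals
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]    # B marginals
    w = lambda i, j: abs(i - j) / (k - 1)                        # disagreement weight
    disagree_o = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    disagree_e = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1 - disagree_o / disagree_e

# Hypothetical severity ratings by a GP and a UVA for six patients.
cats = ["normal", "mild", "severe"]
gp  = ["normal", "mild", "mild", "severe", "normal", "severe"]
uva = ["normal", "mild", "severe", "severe", "mild", "severe"]
print(round(weighted_kappa(gp, uva, cats), 3))  # 0.625
```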

  10. Timing accuracy of Web experiments: a case study using the WebExp software package.

    PubMed

    Keller, Frank; Gunasekharan, Subahshini; Mayo, Neil; Corley, Martin

    2009-02-01

    Although Internet-based experiments are gaining in popularity, most studies rely on directly evaluating participants' responses rather than response times. In the present article, we present two experiments that demonstrate the feasibility of collecting response latency data over the World-Wide Web using WebExp, a software package designed to run psychological experiments over the Internet. Experiment 1 uses WebExp to collect measurements for known time intervals (generated using keyboard repetition). The resulting measurements are found to be accurate across platforms and load conditions. In Experiment 2, we use WebExp to replicate a lab-based self-paced reading study from the psycholinguistic literature. The data of the Web-based replication correlate significantly with those of the original study and show the same main effects and interactions. We conclude that WebExp can be used to obtain reliable response time data, at least for the self-paced reading paradigm.

  11. Textbook-Bundled Metacognitive Tools: A Study of LearnSmart's Efficacy in General Chemistry

    ERIC Educational Resources Information Center

    Thadani, Vandana; Bouvier-Brown, Nicole C.

    2016-01-01

    College textbook publishers increasingly bundle sophisticated technology-based study tools with their texts. These tools appear promising, but empirical work on their efficacy is needed. We examined whether LearnSmart, a study tool bundled with McGraw-Hill's textbook "Chemistry" (Chang & Goldsby, 2013), improved learning in an…

  12. Early Career Teachers' Accuracy in Predicting Behavioral Functioning: A Pilot Study of Teacher Skills

    ERIC Educational Resources Information Center

    Mortenson, Bruce P.; Rush, Karena S.; Webster, John; Beck, Twila

    2008-01-01

    The purpose of this study was to discern the current skill level of novice teachers in identifying the function of problem behaviors and illustrate the continued need for developing data collection skills with this population. Eighty-eight teachers with experience ranging from 1-5 years completed a series of open and forced-choice questions that…

  13. A Sixteen Journal Study of Accuracy of Direct Quotes and Associated Reference List Entries.

    ERIC Educational Resources Information Center

    White, Arden; And Others

    A study examined the nature and frequency of faults and errors in reference list entries and direct quotes selected from all 1988 issues of 16 social and biological science journals. All departures from the original (additions, omissions, or changes) were labelled as either a word or punctuation deviation. Of the 402 quotes verified, 33.33%…

  14. A Longitudinal Study of Complexity, Accuracy and Fluency Variation in Second Language Development

    ERIC Educational Resources Information Center

    Ferraris, Stefania

    2012-01-01

    This chapter presents the results of a study on interlanguage variation. The production of four L2 learners of Italian, tested four times at yearly intervals while engaged in four oral tasks, is compared to that of two native speakers, and analysed with quantitative CAF measures. Thus, time, task type, nativeness, as well as group vs. individual…

  15. Accurate radiometry from space: an essential tool for climate studies.

    PubMed

    Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma

    2011-10-28

    The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a 'primary standard' and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'.

  16. Accurate Radiometry from Space: An Essential Tool for Climate Studies

    NASA Technical Reports Server (NTRS)

    Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma

    2011-01-01

    The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a 'primary standard' and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'. Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance

  17. Accuracy of age estimation methods from orthopantomograph in forensic odontology: a comparative study.

    PubMed

    Khorate, Manisha M; Dinkar, A D; Ahmed, Junaid

    2014-01-01

    Changes related to chronological age are seen in both hard and soft tissue. A number of methods for age estimation have been proposed, which can be classified into four categories, namely clinical, radiological, histological and chemical analysis. In forensic odontology, age estimation based on tooth development is a universally accepted method. The panoramic radiographs of 500 healthy Goan Indian children (250 boys and 250 girls) aged between 4 and 22.1 years were selected. The modified Demirjian's method (1973/2004), the Acharya AB formula (2011), Dr Ajit D. Dinkar's (1984) regression equation, and Foti and coworkers' (2003) formulae (clinical and radiological) were applied for estimation of age. The results of our study show that Dr Ajit D. Dinkar's method is the most accurate, followed by the Acharya Indian-specific formula. Furthermore, by applying all these methods to one regional population, we have attempted to identify the dental age estimation methodology best suited for the Goan Indian population.

  18. The impact of registration accuracy on imaging validation study design: A novel statistical power calculation.

    PubMed

    Gibson, Eli; Fenster, Aaron; Ward, Aaron D

    2013-10-01

    Novel imaging modalities are pushing the boundaries of what is possible in medical imaging, but their signal properties are not always well understood. The evaluation of these novel imaging modalities is critical to achieving their research and clinical potential. Image registration of novel modalities to accepted reference standard modalities is an important part of characterizing the modalities and elucidating the effect of underlying focal disease on the imaging signal. The strengths of the conclusions drawn from these analyses are limited by statistical power. Based on the observation that in this context, statistical power depends in part on uncertainty arising from registration error, we derive a power calculation formula relating registration error, number of subjects, and the minimum detectable difference between normal and pathologic regions on imaging, for an imaging validation study design that accommodates signal correlations within image regions. Monte Carlo simulations were used to evaluate the derived models and test the strength of their assumptions, showing that the model yielded predictions of the power, the number of subjects, and the minimum detectable difference of simulated experiments accurate to within a maximum error of 1% when the assumptions of the derivation were met, and characterizing sensitivities of the model to violations of the assumptions. The use of these formulae is illustrated through a calculation of the number of subjects required for a case study, modeled closely after a prostate cancer imaging validation study currently taking place at our institution. The power calculation formulae address three central questions in the design of imaging validation studies: (1) What is the maximum acceptable registration error? (2) How many subjects are needed? (3) What is the minimum detectable difference between normal and pathologic image regions?
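    The three design questions above can be illustrated with a sample-size calculation in which registration error inflates the measurement variance. This is a standard two-sample normal-approximation formula, NOT the paper's correlation-aware derivation; `sigma_reg`, the extra standard deviation attributed to registration error, is an illustrative assumption:

```python
from math import ceil
from statistics import NormalDist

def subjects_needed(delta, sigma_signal, sigma_reg, alpha=0.05, power=0.8):
    """Subjects per group to detect a mean imaging-signal difference `delta`
    between normal and pathologic regions. Registration error is modeled
    as independent additive noise: total variance = signal + registration."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # target power
    sigma2 = sigma_signal ** 2 + sigma_reg ** 2
    return ceil(2 * sigma2 * (z_a + z_b) ** 2 / delta ** 2)

print(subjects_needed(delta=1.0, sigma_signal=1.0, sigma_reg=0.0))  # 16
print(subjects_needed(delta=1.0, sigma_signal=1.0, sigma_reg=0.5))  # 20
```

Raising `sigma_reg` (a looser registration-error budget) directly raises the number of subjects needed, which is the trade-off the paper's power formula makes explicit.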

  19. Assessing accuracy of a probabilistic model for very large fire in the Rocky Mountains: A High Park Fire case study

    NASA Astrophysics Data System (ADS)

    Stavros, E.; Abatzoglou, J. T.; Larkin, N.; McKenzie, D.; Steel, A.

    2012-12-01

    Across the western United States, the largest wildfires account for a major proportion of the area burned and substantially affect mountain forests and their associated ecosystem services, among which is pristine air quality. These fires commandeer national attention and significant fire suppression resources. Despite efforts to understand the influence of fuel loading, climate, and weather on annual area burned, few studies have focused on understanding what abiotic factors enable and drive the very largest wildfires. We investigated the correlation between both antecedent climate and in-situ biophysical variables and very large (>20,000 ha) fires in the western United States from 1984 to 2009. We built logistic regression models, at the spatial scale of the national Geographic Area Coordination Centers (GACCs), to estimate the probability that a given day is conducive to a very large wildfire. Models vary in accuracy and in which variables are the best predictors. In a case study of the conditions of the High Park Fire, neighboring Fort Collins, Colorado, occurring in early summer 2012, we evaluate the predictive accuracy of the Rocky Mountain model.
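    A fitted logistic regression model of this kind scores each day via the sigmoid of a linear predictor. A minimal sketch; the feature names and coefficients below are hypothetical placeholders, not the fitted GACC-specific values:

```python
from math import exp

def very_large_fire_probability(features, coef, intercept):
    """P(day conducive to a very large fire) = 1 / (1 + exp(-(b0 + b.x)))."""
    z = intercept + sum(coef[k] * v for k, v in features.items())
    return 1 / (1 + exp(-z))

# Hypothetical standardized biophysical predictors and coefficients.
coef = {"fuel_dryness": 2.1, "wind_speed": 0.8, "antecedent_precip": -1.5}
day = {"fuel_dryness": 1.2, "wind_speed": 0.9, "antecedent_precip": 0.3}
p = very_large_fire_probability(day, coef, intercept=-4.0)
print(f"P(day conducive to a >20,000 ha fire) = {p:.3f}")
```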

  20. Improved accuracy of markerless motion tracking on bone suppression images: preliminary study for image-guided radiation therapy (IGRT)

    NASA Astrophysics Data System (ADS)

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-05-01

    The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating images similar to the soft-tissue images obtained by the dual-energy subtraction technique. This study was performed to evaluate the usefulness of bone suppression image processing in image-guided radiation therapy; we demonstrated the improved accuracy of markerless motion tracking on bone suppression images. Chest fluoroscopic images of nine patients with lung nodules during respiration were obtained using a flat-panel detector system (120 kV, 0.1 mAs/pulse, 5 fps). Commercial bone suppression image processing software was applied to the fluoroscopic images to create corresponding bone suppression images. Regions of interest were manually located on lung nodules and automatic target tracking was conducted based on the template matching technique. To evaluate the accuracy of target tracking, the maximum tracking error in the resulting images was compared with that in conventional fluoroscopic images. The tracking errors were decreased by half in eight of nine cases. The average maximum tracking errors in bone suppression and conventional fluoroscopic images were 1.3 ± 1.0 and 3.3 ± 3.3 mm, respectively. The bone suppression technique was especially effective in the lower lung area, where pulmonary vessels, bronchi, and ribs show complex movements. The bone suppression technique improved tracking accuracy without special equipment or implantation of fiducial markers, and with only a small additional dose to the patient. Bone suppression fluoroscopy is thus a potential means of measuring respiratory displacement of the target. This paper was presented at RSNA 2013, and the work was carried out at Kanazawa University, Japan.
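    The template-matching step above can be illustrated with a minimal exhaustive search; this sketch scores candidate windows by sum of squared differences on a toy frame (real IGRT tracking operates on fluoroscopic frames, typically with optimized correlation-based matching):

```python
def best_match(frame, template):
    """Exhaustive template matching by sum of squared differences (SSD);
    returns the (row, col) of the best-matching window in the frame."""
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(len(frame) - th + 1):
        for c in range(len(frame[0]) - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Toy "nodule": a bright 2x2 blob embedded at (2, 3) in a 6x8 frame.
frame = [[0] * 8 for _ in range(6)]
for i in range(2):
    for j in range(2):
        frame[2 + i][3 + j] = 9
template = [[9, 9], [9, 9]]
print(best_match(frame, template))  # (2, 3)
```

The tracking error reported in the study is the distance between such a matched position and the true target position across frames.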

  1. Accuracy of Family History of Hemochromatosis or Iron Overload: The Hemochromatosis and Iron Overload Screening Study

    PubMed Central

    Acton, Ronald T.; Barton, James C.; Passmore, Leah V.; Adams, Paul C.; Mclaren, Gordon D.; Leiendecker–Foster, Catherine; Speechley, Mark R.; Harris, Emily L.; Castro, Oswaldo; Reiss, Jacob A.; Snively, Beverly M.; Harrison, Barbara W.; Mclaren, Christine E.

    2013-01-01

    Background & Aims The aim of this study was to assess the analytic validity of self-reported family history of hemochromatosis or iron overload. Methods A total of 141 probands, 549 family members, and 641 controls participated in the primary care Hemochromatosis and Iron Overload Screening Study. Participants received a postscreening clinical examination and completed questionnaires about personal and family histories of hemochromatosis or iron overload, arthritis, diabetes, liver disease, and heart disease. We evaluated sensitivities and specificities of proband-reported family history, and concordance of HFE genotype C282Y/C282Y in probands and siblings who reported having hemochromatosis or iron overload. Results The sensitivities of proband-reported family history ranged from 81.4% for hemochromatosis or iron overload to 18.4% for liver disease; specificities for diabetes, liver disease, and heart disease were greater than 94%. Hemochromatosis or iron overload was associated with a positive family history across all racial/ethnic groups in the study (odds ratio, 14.53; 95% confidence intervals, 7.41–28.49; P < .0001) and among Caucasians (odds ratio, 16.98; 95% confidence intervals, 7.53–38.32; P < .0001). There was 100% concordance of HFE genotype C282Y/C282Y in 6 probands and 8 of their siblings who reported having hemochromatosis or iron overload. Conclusions Self-reported family history of hemochromatosis or iron overload can be used to identify individuals whose risk of hemochromatosis or iron overload and associated conditions is increased. These individuals could benefit from further evaluation with iron phenotyping and HFE mutation analysis. PMID:18585964

  2. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2010-01-01

    The poster provides an overview of techniques to improve the sensitivity studies performed with the Mars Global Reference Atmospheric Model (Mars-GRAM). During the Mars Science Laboratory (MSL) site selection process, it was discovered that Mars-GRAM is less than realistic when used for sensitivity studies with TES MapYear = 0 and large optical depth values such as tau = 3. A preliminary fix has been made to Mars-GRAM by adding a density factor value that was determined for tau = 0.3, 1 and 3.

  3. Improving Mars-GRAM: Increasing the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    Extensively utilized for numerous mission applications, the Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model. In Monte Carlo mode, Mars-GRAM's perturbation modeling capability is used to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). During the Mars Science Laboratory (MSL) site selection process, Mars-GRAM was found to be inexact when used for sensitivity studies with MapYear=0 and large optical depth values such as tau=3. Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM) from the surface to 80 km altitude. With the MapYear parameter set to 0, Mars-GRAM utilizes results from an MGCM run with a fixed value of tau=3 at all locations for the entire year; imprecise atmospheric density and pressure at all altitudes are a consequence of this use of MGCM with tau=3. As a preliminary fix to this pressure-density problem, density factor values have been determined for tau=0.3, 1 and 3. These factors adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. These density factors are fixed values for all latitudes and Ls and are included in Mars-GRAM Release 1.3. Work currently under way to derive better multipliers, by including variations with latitude and/or Ls through direct comparison of MapYear 0 output against TES limb data, will be highlighted in the presentation. The TES limb data utilized in this process have been validated by a comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS). This comparison study was undertaken for locations on Mars of varying latitudes, Ls, and LTST. The more precise density factors will be included in Mars-GRAM 2005 Release 1.4 and thus improve the results of future sensitivity studies done for large
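    The preliminary fix described above amounts to scaling the MGCM MapYear-0 output by a fixed, dust-dependent factor. A minimal sketch; the factor values and the sample density are hypothetical placeholders, not the actual Release 1.3 constants:

```python
# Hypothetical density factors keyed by optical depth tau; in the actual
# Release 1.3 these are fixed for all latitudes and Ls.
DENSITY_FACTORS = {0.3: 1.0, 1.0: 0.95, 3.0: 0.85}

def corrected_density(mgcm_density, tau):
    """Apply the fixed (latitude/Ls-independent) density factor for a
    given optical depth, as in the preliminary Mars-GRAM fix."""
    return mgcm_density * DENSITY_FACTORS[tau]

# Hypothetical MGCM MapYear-0 density (kg/m^3) at some altitude.
print(corrected_density(0.015, 3.0))
```

The refinement mentioned above would replace the scalar lookup with factors that also vary with latitude and Ls.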

  4. Effect of considering the initial parameters on accuracy of experimental studies conclusions

    NASA Astrophysics Data System (ADS)

    Zagulova, D.; Nesterenko, A.; Kapilevich, L.; Popova, J.

    2015-11-01

    The presented paper provides evidence of the need to take into account the initial level of physiological parameters when conducting biomedical research, exemplified by certain indicators of the cardiorespiratory system. The analysis is based on data obtained via multiple surveys of medical and pharmaceutical college students. A negative correlation was revealed between the changes of the studied cardiorespiratory parameters in repeated measurements and their initial level. It is assumed that the dependence of the changes in physiological parameters on the baseline may be caused by the biorhythmic changes inherent in all body systems.

  5. When Advocacy Obscures Accuracy Online: Digital Pandemics of Public Health Misinformation Through an Antifluoride Case Study

    PubMed Central

    Getman, Rebekah; Saraf, Avinash; Zhang, Lily H.; Kalenderian, Elsbeth

    2015-01-01

    Objectives. In an antifluoridation case study, we explored digital pandemics and the social spread of scientifically inaccurate health information across the Web, and we considered the potential health effects. Methods. Using the social networking site Facebook and the open source applications Netvizz and Gephi, we analyzed the connectedness of antifluoride networks as a measure of social influence, the social diffusion of information based on conversations about a sample scientific publication as a measure of spread, and the engagement and sentiment about the publication as a measure of attitudes and behaviors. Results. Our study sample was significantly more connected than was the social networking site overall (P < .001). Social diffusion was evident; users were forced to navigate multiple pages or never reached the sample publication being discussed 60% and 12% of the time, respectively. Users had a 1 in 2 chance of encountering negative and nonempirical content about fluoride unrelated to the sample publication. Conclusions. Network sociology may be as influential as the information content and scientific validity of a particular health topic discussed using social media. Public health must employ social strategies for improved communication management. PMID:25602893

  6. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2009-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). It was discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM, when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3, is less than realistic. A comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS) has been undertaken for locations of varying latitudes, Ls, and LTST on Mars. The preliminary results from this study have validated the Thermal Emission Spectrometer (TES) limb data. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results used for Mars-GRAM with MapYear=0 were from an MGCM run with a fixed value of tau=3 for the entire year at all locations. Unrealistic energy absorption by uniform atmospheric dust leads to an unrealistic thermal energy balance on the polar caps. The outcome is an inaccurate cycle of condensation/sublimation of the polar caps and, as a consequence, an inaccurate cycle of total atmospheric mass and global-average surface pressure. Under an assumption of an unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure produces a corresponding percentage change in density at all altitudes. Consequently, the final result of a change in surface pressure is an imprecise atmospheric density at all altitudes. To solve this pressure-density problem, a density factor value was determined for tau=0.3, 1 and 3 that adjusts the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear=0 with MapYears 1 and 2 MGCM output.
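
    The hydrostatic argument above can be checked with a toy isothermal profile: scaling surface pressure by a given percentage scales density at every altitude by the same percentage, because the exponential altitude term is unchanged. Constants below are rough Mars values, for illustration only:

```python
import math

# Rough Mars values, for illustration only
R_CO2 = 192.0      # specific gas constant of CO2-dominated air, J/(kg K)
T = 210.0          # assumed constant temperature, K
G = 3.71           # surface gravity, m/s^2
H = R_CO2 * T / G  # isothermal scale height, m

def density(p_surface, z):
    """Hydrostatic, isothermal density at altitude z (m) for surface pressure (Pa)."""
    return p_surface / (R_CO2 * T) * math.exp(-z / H)

# A 10% increase in surface pressure gives a 10% increase in density
# at every altitude, since the exp(-z/H) factor cancels in the ratio.
for z in (0.0, 20e3, 60e3):
    assert abs(density(660.0, z) / density(600.0, z) - 1.10) < 1e-9
```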

  7. A pre–postintervention study to evaluate the impact of dose calculators on the accuracy of gentamicin and vancomycin initial doses

    PubMed Central

    Hamad, Anas; Cavell, Gillian; Hinton, James; Wade, Paul; Whittlesea, Cate

    2015-01-01

    Objectives Gentamicin and vancomycin are narrow-therapeutic-index antibiotics with potential for high toxicity requiring dose individualisation and continuous monitoring. Clinical decision support (CDS) tools have been effective in reducing gentamicin and vancomycin dosing errors. Online dose calculators for these drugs were implemented in a London National Health Service hospital. This study aimed to evaluate the impact of these calculators on the accuracy of gentamicin and vancomycin initial doses. Methods The study used a pre–postintervention design. Data were collected using electronic patient records and paper notes. Random samples of gentamicin and vancomycin initial doses administered during the 8 months before implementation of the calculators were assessed retrospectively against hospital guidelines. Following implementation of the calculators, doses were assessed prospectively. Any gentamicin dose not within ±10% and any vancomycin dose not within ±20% of the guideline-recommended dose were considered incorrect. Results The intranet calculator pages were visited 721 times (gentamicin=333; vancomycin=388) during the 2-month period following the calculators’ implementation. Gentamicin dose errors fell from 61.5% (120/195) to 44.2% (95/215), p<0.001. Incorrect vancomycin loading doses fell from 58.1% (90/155) to 32.4% (46/142), p<0.001. Incorrect vancomycin first maintenance doses fell from 55.5% (86/155) to 33.1% (47/142), p<0.001. Loading and first maintenance vancomycin doses were both incorrect in 37.4% (58/155) of patients before and 13.4% (19/142) after calculator implementation, p<0.001. Conclusions This study suggests that gentamicin and vancomycin dose calculators significantly improved the prescribing of initial doses of these agents. Therefore, healthcare organisations should consider using such CDS tools to support the prescribing of these high-risk drugs. PMID:26044758
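
    The study's correctness criterion can be expressed as a one-line check (a hedged sketch; the dose values below are hypothetical): a gentamicin dose counts as correct within ±10% of the guideline-recommended dose, a vancomycin dose within ±20%.

```python
# Sketch of the correctness criterion used in the study (doses hypothetical).

def dose_is_correct(prescribed_mg, guideline_mg, tolerance):
    """True if the prescribed dose is within the fractional tolerance."""
    return abs(prescribed_mg - guideline_mg) <= tolerance * guideline_mg

assert dose_is_correct(380, 400, 0.10)      # within 10% -> correct
assert not dose_is_correct(350, 400, 0.10)  # 12.5% below -> incorrect
assert dose_is_correct(1100, 1000, 0.20)    # within 20% -> correct
```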

  8. Developing a temperature sensitive tool for studying spin dissipation

    NASA Astrophysics Data System (ADS)

    Wickey, Kurtis Jon

    Measuring the thermodynamic properties of nanoscale structures is becoming increasingly important as heterostructures and devices shrink in size. For example, recent discoveries of spin thermal effects such as the spin Seebeck and spin Peltier effects show that thermal gradients can manipulate spin systems and vice versa. However, the relevant interactions occur within a spin diffusion length of a spin active interface, making study of these spin thermal effects challenging. In addition, recent ferromagnetic resonance studies of spatially confined nanomagnets have shown unique magnon modes in arrays and lines which may give rise to unique magnon-phonon interactions. In this case, the small volume of magnetic material presents a challenge to measurement; as a result, the bulk of the work is done on arrays, with measurements of the magnetization of individual particles possible through various microscopies but with limited access to thermal properties. Consequently, tools capable of measuring the thermal properties of nanoscale structures are required to fully explore this emerging science. One approach to addressing this challenge is the use of microscale suspended platforms that maximize their sensitivity to these spin thermal interactions through thermal isolation from their surroundings. Combining this thermal decoupling with sensitive thermometry allows for the measurement of nanojoule heat accumulations, such as those resulting from the small heat flows associated with spin transport and spin relaxation. As these heat flows may manifest themselves in a variety of spin-thermal effects, the development of measurement platforms that can be tailored to optimize their sensitivity to specific thermal measurements is essential. To address these needs, I have fabricated thermally isolated platforms using a unique focused ion beam (FIB) machining process that allows for flexible geometries as well as a wide choice of material systems. The thermal characteristics of these platforms were

  9. Molecular characterization of ten F8 splicing mutations in RNA isolated from patient's leucocytes: assessment of in silico prediction tools accuracy.

    PubMed

    Martorell, L; Corrales, I; Ramirez, L; Parra, R; Raya, A; Barquinero, J; Vidal, F

    2015-03-01

    Although 8% of reported FVIII gene (F8) mutations responsible for haemophilia A (HA) affect mRNA processing, very few have been fully characterized at the mRNA level and/or had their biological consequences systematically predicted by in silico analysis. This study aimed to elucidate the effect of potential splice site mutations (PSSM) on F8 mRNA processing, investigate their correlation with disease severity, and assess their concordance with in silico predictions. We studied F8 mRNA from the leucocytes of 10 HA patients with PSSM by RT-PCR and compared the experimental results with those predicted in silico. The mRNA analysis could explain all the phenotypes observed and demonstrated exon skipping in six cases (c.222G>A, c.601+1delG, c.602-11T>G, c.671-3C>G, c.6115+9C>G and c.6116-1G>A) and activation of cryptic splicing sites, both donor (c.1009+1G>A and c.1009+3A>C) and acceptor sites (c.266-3delC and c.5587-1G>A). In contrast, the in silico analysis was able to predict the score variation of most of the affected splice sites, but the precise mechanism could only be correctly determined for two of the 10 mutations analysed. In addition, we detected aberrant F8 transcripts even in healthy controls, which must be taken into account as they could mask the actual contribution of some PSSM. We conclude that F8 mRNA analysis using leucocytes still constitutes an excellent approach to investigate the transcriptional effects of PSSM in HA, whereas in silico prediction is not always reliable for diagnostic decision-making.

  10. A case control study to improve accuracy of an electronic fall prevention toolkit.

    PubMed

    Dykes, Patricia C; I-Ching, Evita Hou; Soukup, Jane R; Chang, Frank; Lipsitz, Stuart

    2012-01-01

    Patient falls are a serious and commonly reported adverse event in hospitals. In 2009, our team conducted the first randomized controlled trial of a health information technology-based intervention that significantly reduced falls in acute care hospitals. However, some patients on intervention units with access to the electronic toolkit fell. The purpose of this case control study was to use data mining and modeling techniques to identify the factors associated with falls in hospitalized patients when the toolkit was in place. Our ultimate aim was to apply our findings to improve the toolkit logic and to generate practice recommendations. The results of our evaluation suggest that the fall prevention toolkit logic is accurate, but strategies are needed to improve adherence to the fall prevention intervention recommendations generated by the electronic toolkit.

  11. Associations between visual perception accuracy and confidence in a dopaminergic manipulation study

    PubMed Central

    Andreou, Christina; Bozikas, Vasilis P.; Luedtke, Thies; Moritz, Steffen

    2015-01-01

    Delusions are defined as fixed erroneous beliefs that are based on misinterpretation of events or perception, and cannot be corrected by argumentation to the opposite. Cognitive theories of delusions regard this symptom as resulting from specific distorted thinking styles that lead to biased integration and interpretation of perceived stimuli (i.e., reasoning biases). In previous studies, we were able to show that one of these reasoning biases, overconfidence in errors, can be modulated by drugs that act on the dopamine system, a major neurotransmitter system implicated in the pathogenesis of delusions and other psychotic symptoms. Another processing domain suggested to involve the dopamine system and to be abnormal in psychotic disorders is sensory perception. The present study aimed to investigate whether (lower-order) sensory perception and (higher-order) overconfidence in errors are similarly affected by dopaminergic modulation in healthy subjects. Thirty-four healthy individuals were assessed upon administration of l-dopa, placebo, or haloperidol within a randomized, double-blind, cross-over design. Variables of interest were hits and false alarms in an illusory perception paradigm requiring speeded detection of pictures over a noisy background, and subjective confidence ratings for correct and incorrect responses. There was a significant linear increase of false alarm rates from haloperidol to placebo to l-dopa, whereas hit rates were not affected by dopaminergic manipulation. As hypothesized, confidence in error responses was significantly higher with l-dopa compared to placebo. Moreover, confidence in erroneous responses significantly correlated with false alarm rates. These findings suggest that overconfidence in errors and aberrant sensory processing might be both interdependent and related to dopaminergic transmission abnormalities in patients with psychosis. PMID:25932015
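
    The hit and false-alarm counts described above are often summarized by the signal-detection statistic d′ (the separation between the hit-rate and false-alarm-rate z-scores). The paper reports rates, not d′, so the sketch below, with made-up counts, is purely illustrative:

```python
from statistics import NormalDist

# Illustrative signal-detection summary; counts are hypothetical,
# not data from the study.

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate)."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# With the hit rate held fixed, a higher false-alarm rate (as reported
# under l-dopa) lowers measured sensitivity:
assert d_prime(80, 20, 10, 90) > d_prime(80, 20, 30, 70)
```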

  12. Updating Mars-GRAM to Increase the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hiliary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). During the Mars Science Laboratory (MSL) site selection process, it was discovered that Mars-GRAM, when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3, is less than realistic. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear set to 0 were from an MGCM run with a fixed value of tau=3 for the entire year at all locations. This has resulted in an imprecise atmospheric density at all altitudes. As a preliminary fix to this pressure-density problem, density factor values were determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. Currently, these density factors are fixed values for all latitudes and Ls. Results will be presented from work being done to derive better multipliers by including variation with latitude and/or Ls by comparison of MapYear 0 output directly against TES limb data. The addition of these more precise density factors to Mars-GRAM 2005 Release 1.4 will improve the results of the sensitivity studies done for large optical depths.

  13. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  14. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Astrophysics Data System (ADS)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  15. Accuracy of five electronic foramen locators with different operating systems: an ex vivo study

    PubMed Central

    de VASCONCELOS, Bruno Carvalho; BUENO, Michelli de Medeiros; LUNA-CRUZ, Suyane Maria; DUARTE, Marco Antonio Hungaro; FERNANDES, Carlos Augusto de Oliveira

    2013-01-01

    Objective: The aim of this study was to evaluate, ex vivo, the precision of five electronic root canal length measurement devices (ERCLMDs) with different operating systems: the Root ZX, Mini Apex Locator, Propex II, iPex, and RomiApex A-15, and the possible influence of the positioning of the instrument tips short of the apical foramen. Material and Methods: Forty-two mandibular bicuspids had their real canal lengths (RL) previously determined. Electronic measurements were performed 1.0 mm short of the apical foramen (-1.0), followed by measurements at the apical foramen (0.0). The data resulting from the comparison of the ERCLMD measurements and the RL were evaluated by the Wilcoxon and Friedman tests at a significance level of 5%. Results: Considering the measurements performed at 0.0 and -1.0, the precision rates for the ERCLMDs were: 73.5% and 47.1% (Root ZX), 73.5% and 55.9% (Mini Apex Locator), 67.6% and 41.1% (Propex II), 61.7% and 44.1% (iPex), and 79.4% and 44.1% (RomiApex A-15), respectively, considering ±0.5 mm of tolerance. Regarding the mean discrepancies, no differences were observed at 0.0; however, in the measurements at -1.0, the iPex, a multi-frequency ERCLMD, had significantly more discrepant readings short of the apical foramen than the other devices, except for the Propex II, which had intermediate results. When the ERCLMDs measurements at -1.0 were compared with those at 0.0, the Propex II, iPex and RomiApex A-15 presented significantly higher discrepancies in their readings. Conclusions: Under the conditions of the present study, all the ERCLMDs provided acceptable measurements at the 0.0 position. However, at the -1.0 position, the ERCLMDs had a lower precision, with statistically significant differences for the Propex II, iPex, and RomiApex A-15. PMID:23739852
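
    The precision rates above are simply the fraction of electronic readings falling within ±0.5 mm of the real canal length. A minimal sketch with hypothetical readings:

```python
# Sketch of the precision-rate computation (readings in mm are hypothetical,
# not data from the study).

def precision_rate(measured, actual, tolerance_mm=0.5):
    """Fraction of paired readings within +/-tolerance of the real length."""
    within = sum(1 for m, a in zip(measured, actual)
                 if abs(m - a) <= tolerance_mm)
    return within / len(measured)

rate = precision_rate([20.1, 19.2, 21.6, 18.4], [20.0, 19.8, 21.0, 18.5])
# 2 of the 4 hypothetical readings fall within +/-0.5 mm
```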

  16. [Studies on the accuracy and precision of total serum cholesterol in regional interlaboratory trials (author's transl)].

    PubMed

    Hohenwallner, W; Sommer, R; Wimmer, E

    1976-01-02

    The between-run precision of the Liebermann-Burchard reaction modified by Watson was, in our laboratory, 2-3%; the within-run coefficient of variation was 1-2%. The between-run precision of the enzymatic test was 3-4%; the within-run coefficient of variation was 3%. The regression equation for 92 serum specimens from patients was y = -17.31 + 1.04x, with a correlation coefficient of r = 0.996. Interlaboratory trials of serum cholesterol were studied in the normal and pathological range. Lyophilized serum samples prepared commercially and from fresh patient specimens were analysed by the Liebermann-Burchard method as well as by the enzymatic procedure. Acceptable results with the Liebermann-Burchard method were obtained in the different laboratories after using a common cholesterol standard. The coefficient of variation of the enzymatic test in the interlaboratory trial was higher than that of the Liebermann-Burchard reaction. Methodological difficulties of the Liebermann-Burchard reaction are discussed and compared with the specific enzymatic assay.

  17. Comparison of accuracy of two electronic apex locators in the presence of various irrigants: An in vitro study

    PubMed Central

    Mull, J Paras; Manjunath, Vinutha; Manjunath, MK

    2012-01-01

    Aim: This study was designed to compare the accuracy of Root ZX and SybronEndo Mini, electronic apex locators (EALs), in the presence of various irrigants. Materials and Methods: Sixty extracted, single-rooted human teeth were decoronated and the root canals coronally flared. The actual length (AL) was assessed visually and teeth mounted in the gelatin model. The electronic length (EL) measurements were recorded with both EALs in the presence of 0.9% saline; 1% sodium hypochlorite (NaOCl); 2% chlorhexidine digluconate (CHX), and 17% EDTA solution, at “0.5” reading on display. The differences between the EL and AL were compared. Results: The accuracy of EL measurement of Root ZX and Sybron Mini within±0.5 mm of AL was consistently high in the presence of NaOCl and found to be least with EDTA. Conclusion: EL measurements were shorter with 1% NaOCl, whereas longer with 2% CHX for both the devices. Sybron Mini was more accurate with 1% NaOCl and 2% CHX than Root ZX. PMID:22557820

  18. Influence of root canal curvature on the accuracy of an electronic apex locator: An in vitro study

    PubMed Central

    Santhosh, Lekha; Raiththa, Pooja; Aswathanarayana, Srirekha; Panchajanya, Srinivas; Reddy, Jayakumar Thimmaraya; Susheela, Shwetha Rajanna

    2014-01-01

    Objective: This study investigated whether the canal curvature has an influence on the accuracy of Electronic Apex Locator. Materials and Methods: Sixty mandibular posterior teeth were decoronated. A number (No.) 10 file was inserted into the mesiobuccal canal and radiographs were taken to determine the degree of curvature by Schneider's method. Samples were divided into three groups of mild (<20°), moderate (20-36°) and severe curvature (>36°). After enlarging the orifice, the actual canal length was determined by introducing a file until the tip emerged through the major foramen when observed under 20X magnification. The teeth were embedded in an alginate model and the Root ZX was used to determine the electronic length. The data was analyzed by Kruskal-Wallis test followed by Mann-Whitney test. Results: The difference in measurement of Actual and Electronic working length was statistically significant between group 1 and 2 (P < 0.05) as well as between group 1 and group 3 (P < 0.05) with group 1 showing the lowest difference. Conclusion: Considering ± 0.5 mm as tolerance limit for accuracy, the device was 95% accurate for the mild curvature group and 80% accurate for moderate and severe groups. PMID:25506150
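
    Schneider's angle can be computed from three digitized 2-D points: one on the coronal canal axis, the point of deviation, and the apex. The sketch below is a geometric illustration under that assumption, not the authors' measurement software:

```python
import math

# Geometric sketch of Schneider's method (illustrative assumption:
# the radiograph has been digitized into three 2-D points).

def schneider_angle(coronal, deviation, apex):
    """Angle (degrees) between the coronal axis and the deviation-to-apex line."""
    v1 = (deviation[0] - coronal[0], deviation[1] - coronal[1])
    v2 = (apex[0] - deviation[0], apex[1] - deviation[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    return abs(math.degrees(math.atan2(cross, dot)))

assert schneider_angle((0, 0), (0, -5), (0, -10)) == 0.0  # straight canal
# A right-angle bend measures 90 degrees:
assert abs(schneider_angle((0, 0), (0, -5), (5, -5)) - 90.0) < 1e-9
```

    Classifying the result against the study's cut-offs (<20° mild, 20-36° moderate, >36° severe) is then a simple comparison.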

  19. A Comparison of Dimensional Accuracy of Addition Silicone of Different Consistencies with Two Different Spacer Designs - In-vitro Study

    PubMed Central

    Eswaran, B; Eswaran, MA; Prabhu, R; Geetha, KR; Krishna, GP; Jagadeshwari

    2014-01-01

    Introduction: Dimensional accuracy of impression materials is crucial for the production of working casts in fixed prosthodontics. The accurate replication of tooth preparations and their arch position requires impression materials that exhibit limited distortion. Methods: This study comparatively evaluated the dimensional accuracy of addition silicones by measuring the linear changes in interpreparation distance across two impression techniques and two spacer designs. The impressions were made from a stainless steel master die simulating a three-unit bridge. A total of 80 die stone (type IV, Ultrarock) models were obtained from impressions made using two different parameters: the mixing technique (multimix versus monophasic) and the spacer design. Result: The interpreparation distance of the abutments in the casts was measured using a travelling microscope. Each sample was measured thrice and the mean value was calculated. The results were statistically analysed, and the values fell within the clinically acceptable range. Conclusion: The most accurate combination was the multimix technique with the spacer design that uses less bulk of impression material. PMID:25177635

  20. Primary care REFerral for EchocaRdiogram (REFER) in heart failure: a diagnostic accuracy study

    PubMed Central

    Taylor, Clare J; Roalfe, Andrea K; Iles, Rachel; Hobbs, FD Richard; Barton, P; Deeks, J; McCahon, D; Cowie, MR; Sutton, G; Davis, RC; Mant, J; McDonagh, T; Tait, L

    2017-01-01

    Background Symptoms of breathlessness, fatigue, and ankle swelling are common in general practice but deciding which patients are likely to have heart failure is challenging. Aim To evaluate the performance of a clinical decision rule (CDR), with or without N-Terminal pro-B type natriuretic peptide (NT-proBNP) assay, for identifying heart failure. Design and setting Prospective, observational, diagnostic validation study of patients aged >55 years, presenting with shortness of breath, lethargy, or ankle oedema, from 28 general practices in England. Method The outcome was test performance of the CDR and natriuretic peptide test in determining a diagnosis of heart failure. The reference standard was an expert consensus panel of three cardiologists. Results Three hundred and four participants were recruited, with 104 (34.2%; 95% confidence interval [CI] = 28.9 to 39.8) having a confirmed diagnosis of heart failure. The CDR+NT-proBNP had a sensitivity of 90.4% (95% CI = 83.0 to 95.3) and specificity 45.5% (95% CI = 38.5 to 52.7). NT-proBNP level alone with a cut-off <400 pg/ml had sensitivity 76.9% (95% CI = 67.6 to 84.6) and specificity 91.5% (95% CI = 86.7 to 95.0). At the lower cut-off of NT-proBNP <125 pg/ml, sensitivity was 94.2% (95% CI = 87.9 to 97.9) and specificity 49.0% (95% CI = 41.9 to 56.1). Conclusion At the low threshold of NT-proBNP <125 pg/ml, natriuretic peptide testing alone was better than a validated CDR+NT-proBNP in determining which patients presenting with symptoms went on to have a diagnosis of heart failure. The higher NT-proBNP threshold of 400 pg/ml may mean more than one in five patients with heart failure are not appropriately referred. Guideline natriuretic peptide thresholds may need to be revised. PMID:27919937
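
    Sensitivity and specificity follow directly from the 2×2 diagnostic table. The counts below are approximate back-calculations from the reported percentages (104 heart-failure cases, about 200 without), not the paper's raw data:

```python
# Sensitivity/specificity from a 2x2 table; counts are approximate
# reconstructions from the abstract's percentages, for illustration only.

def sens_spec(tp, fn, tn, fp):
    """(sensitivity, specificity) from true/false positives and negatives."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(94, 10, 91, 109)  # CDR + NT-proBNP, approximate
assert abs(sens - 0.904) < 0.001
assert abs(spec - 0.455) < 0.001
```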

  1. Accuracy of tumor motion compensation algorithm from a robotic respiratory tracking system: A simulation study

    SciTech Connect

    Seppenwoolde, Yvette; Berbeco, Ross I.; Nishioka, Seiko; Shirato, Hiroki; Heijmen, Ben

    2007-07-15

    The Synchrony™ Respiratory Tracking System (RTS) is a treatment option of the CyberKnife robotic treatment device to irradiate extra-cranial tumors that move due to respiration. Advantages of the RTS are that patients can breathe normally and that there is no loss of linac duty cycle, as with gated therapy. Tracking is based on a measured correspondence model (linear or polynomial) between internal tumor motion and external (chest/abdominal) marker motion. The radiation beam follows the tumor movement via the continuously measured external marker motion. To establish the correspondence model at the start of treatment, the 3D internal tumor position is determined at 15 discrete time points by automatic detection of implanted gold fiducials in two orthogonal x-ray images; simultaneously, the positions of the external markers are measured. During the treatment, the relationship between internal and external marker positions is continuously accounted for and is regularly checked and updated. Here we use computer simulations based on continuously and simultaneously recorded internal and external marker positions to investigate the effectiveness of tumor tracking by the RTS. The CyberKnife does not allow continuous acquisition of x-ray images to follow the moving internal markers (the typical imaging frequency is once per minute). Therefore, for the simulations, we have used data for eight lung cancer patients treated with respiratory gating. All of these patients had simultaneous and continuous recordings of both internal tumor motion and external abdominal motion. The available continuous relationship between internal and external markers for these patients allowed investigation of the consequences of the lower acquisition frequency of the RTS. With the use of the RTS, simulated treatment errors due to breathing motion were largely and consistently reduced over treatment time for all studied patients. A considerable part of the maximum reduction in treatment error
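
    A linear correspondence model of the kind described (internal position as a function of external marker position, fitted from paired samples at treatment start) can be sketched with ordinary least squares. This is an illustration, not the Synchrony implementation:

```python
# Illustrative least-squares fit of a linear correspondence model
# internal = slope * external + intercept, from paired position samples.
# Toy data below; not the Synchrony algorithm or patient data.

def fit_linear(external, internal):
    """Ordinary least squares slope and intercept."""
    n = len(external)
    mean_x = sum(external) / n
    mean_y = sum(internal) / n
    sxx = sum((x - mean_x) ** 2 for x in external)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(external, internal))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Toy paired positions (mm): external marker vs. internal tumor
a, b = fit_linear([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
predicted = a * 1.5 + b  # predicted tumor position for a new marker reading
```

    In practice the model would be refitted as new x-ray localizations arrive, mirroring the regular checking and updating described above.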

  2. Study of hot hardness characteristics of tool steels

    NASA Technical Reports Server (NTRS)

    Chevalier, J. L.; Dietrich, M. W.; Zaretsky, E. V.

    1972-01-01

    Hardness measurements of tool steel materials in an electric furnace at elevated temperatures and in a low oxygen environment are discussed. The development of an equation to predict short-term hardness as a function of the initial room temperature hardness of the steel is reported. The types of steel involved in the process are identified.

  3. Information Literacy and Office Tool Competencies: A Benchmark Study

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Lim, Jeen-Su

    2010-01-01

    Present information science literature recognizes the importance of information technology to achieve information literacy. The authors report the results of a benchmarking student survey regarding perceived functional skills and competencies in word-processing and presentation tools. They used analysis of variance and regression analysis to…

  4. Micro and nanotechnological tools for study of RNA.

    PubMed

    Yoshizawa, Satoko

    2012-07-01

    Micro and nanotechnologies originally contributed to engineering, especially in electronics. These technologies enable fabrication and assembly of materials at micrometer and nanometer scales and the manipulation of nano-objects. The power of these technologies has now been exploited in analyses of biologically relevant molecules. In this review, the use of micro and nanotechnological tools in RNA research is described.

  5. The Effect of File Size on the Accuracy of the Raypex 5 Apex Locator: An In Vitro Study

    PubMed Central

    Sadeghi, Shiva; Abolghasemi, Masoomeh

    2008-01-01

    Background and aims Determining the proper length of the root canals is essential for successful endodontic treatment. The purpose of this in vitro study was to evaluate the effect of file size on the accuracy of the Raypex 5 electronic apex locator for working length determination of uninstrumented canals. Materials and methods Twenty maxillary central incisors with single straight canals were used. Following access cavity preparation, electronic working length by means of Raypex 5 apex locator and actual working length were determined. Data were analyzed using ANOVA with repeated measurements and LSD test. Results There was no significant difference between electronic and actual working lengths when a size 15 K-file was used. Conclusion Under the conditions of the present study, a size 15 K-file is a more suitable size for determining working length. PMID:23285326

  6. Social Networking Tools and Teacher Education Learning Communities: A Case Study

    ERIC Educational Resources Information Center

    Poulin, Michael T.

    2014-01-01

    Social networking tools have become an integral part of a pre-service teacher's educational experience. As a result, the educational value of social networking tools in teacher preparation programs must be examined. The specific problem addressed in this study is that the role of social networking tools in teacher education learning communities…

  7. HEFCE's People Management Self-Assessment Tool: Ticking Boxes or Adding Value? A Case Study

    ERIC Educational Resources Information Center

    McDonald, Claire

    2009-01-01

    This article examines one specific organisational development tool in depth and uses a case study to investigate whether using the tool is more than a tick-box exercise and really can add value and help organisations to develop and improve. The People Management Self-Assessment Tool (SAT) is used to examine higher education institutions' (HEIs)…

  8. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    EPA Science Inventory

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  9. The dimensional accuracy of polyvinyl siloxane impression materials using two different impression techniques: An in vitro study

    PubMed Central

    Kumari, Nirmala; Nandeeshwar, D. B.

    2015-01-01

    Aim of the Study: To evaluate and compare the linear dimensional changes of three representative polyvinyl siloxane (PVS) impression materials and to compare the accuracy of the single mix with the double mix impression technique. Methodology: A study mold was prepared according to revised American Dental Association specification number 19 for nonaqueous elastic dental impression materials. The three PVS impression materials selected were Elite-HD, Imprint™ II Garant, and Aquasil Ultra Heavy. The two impression techniques used were the single mix and double mix impression techniques. A total of 60 specimens were made, and after 24 h the specimens were measured using a profile projector. Statistical Analysis: The data were analyzed using one-way analysis of variance (ANOVA), and significant differences were separated using the Student–Newman–Keuls test. Results: When all three study group impression materials were compared for the double mix technique, a statistically significant difference was found only between Imprint™ II Garant and Elite-HD (P < 0.05). Similarly, using the single mix technique, statistically significant differences were found between Elite-HD and Imprint™ II Garant (P < 0.05) and also between Aquasil Ultra Heavy and Elite-HD (P < 0.05). When the linear dimensional accuracy of all three impression materials in the double mix and single mix impression techniques was compared with the control group, Imprint™ II Garant showed values closest to those of the master die, followed by Aquasil Ultra Heavy and Elite-HD, respectively. Conclusion: Among the impression materials, Imprint™ II Garant showed the least dimensional change. Among the impression techniques, the double mix impression technique showed better results. PMID:26929515

  10. Pooled analysis of the accuracy of five cervical cancer screening tests assessed in eleven studies in Africa and India.

    PubMed

    Arbyn, Marc; Sankaranarayanan, Rengaswamy; Muwonge, Richard; Keita, Namory; Dolo, Amadou; Mbalawa, Charles Gombe; Nouhou, Hassan; Sakande, Boblewende; Wesley, Ramani; Somanathan, Thara; Sharma, Anjali; Shastri, Surendra; Basu, Parthasarathy

    2008-07-01

    Cervical cancer is the main cancer among women in sub-Saharan Africa, India and other parts of the developing world. Evaluation of the screening performance of effective, feasible and affordable early detection and management methods is a public health priority. Five screening methods were evaluated in 11 studies in India and Africa: naked-eye visual inspection of the cervix uteri after application of diluted acetic acid (VIA) or Lugol's iodine (VILI), visual inspection with a magnifying device (VIAM), the Pap smear, and human papillomavirus testing with the high-risk probe of the Hybrid Capture-2 assay (HC2). More than 58,000 women, aged 25-64 years, were tested with 2-5 screening tests, and outcome verification was done on all women independent of the screening test results. The outcome was the presence or absence of cervical intraepithelial neoplasia (CIN) of different degrees or invasive cervical cancer. Verification was based on colposcopy and histological interpretation of colposcopy-directed biopsies. Negative colposcopy was accepted as a truly negative outcome. VIA showed a sensitivity of 79% (95% CI 73-85%) and 83% (95% CI 77-89%), and a specificity of 85% (95% CI 81-89%) and 84% (95% CI 80-88%) for the outcomes CIN2+ and CIN3+, respectively. VILI was on average 10% more sensitive and equally specific. VIAM showed similar results to VIA. The Pap smear showed the lowest sensitivity, even at the lowest cutoff of atypical squamous cells of undetermined significance (57%; 95% CI 38-76%) for CIN2+, but its specificity was rather high (93%; 95% CI 89-97%). The HC2 assay showed a sensitivity for CIN2+ of 62% (95% CI 56-68%) and a specificity of 94% (95% CI 92-95%). Substantial interstudy variation was observed in the accuracy of the visual screening methods. Accuracy of the visual methods and cytology increased over time, whereas performance of HC2 was constant. Results of visual tests and colposcopy were highly correlated. This study was the largest ever done that evaluates the cross
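The sensitivity and specificity figures above come from verified 2×2 tables (test result versus CIN2+/CIN3+ outcome). A minimal sketch of how such point estimates and Wald-type 95% confidence intervals are computed; the counts below are hypothetical, not taken from the pooled study, and the pooled analysis itself would additionally combine such estimates across studies:

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity with Wald 95% confidence intervals."""
    def prop_ci(x, n):
        p = x / n
        half = 1.96 * math.sqrt(p * (1 - p) / n)  # Wald interval half-width
        return p, max(0.0, p - half), min(1.0, p + half)
    return {"sensitivity": prop_ci(tp, tp + fn),   # TP / (TP + FN)
            "specificity": prop_ci(tn, tn + fp)}   # TN / (TN + FP)

# Hypothetical counts for one screening test against a CIN2+ reference standard
result = sens_spec(tp=158, fn=42, tn=850, fp=150)
print(result["sensitivity"][0], result["specificity"][0])  # 0.79 0.85
```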

  11. A Comparison of Accuracy of Matrix Impression System with Putty Reline Technique and Multiple Mix Technique: An In Vitro Study

    PubMed Central

    Kumar, M Praveen; Patil, Suneel G; Dheeraj, Bhandari; Reddy, Keshav; Goel, Dinker; Krishna, Gopi

    2015-01-01

    Background: The difficulty in obtaining an acceptable impression increases exponentially as the number of abutments increases. Accuracy of the impression material and the use of a suitable impression technique are of utmost importance in the fabrication of a fixed partial denture. This study compared the accuracy of the matrix impression system with the conventional putty reline and multiple mix techniques for individual dies by comparing the inter-abutment distance in the casts obtained from the impressions. Materials and Methods: Three groups of 10 impressions each were made of a master die using three impression techniques (matrix impression system, putty reline technique, and multiple mix technique). Typodont teeth were embedded in a maxillary Frasaco model base. The left first premolar was removed to create a three-unit fixed partial denture situation, the left canine and second premolar were prepared conservatively, and hatch marks were made on the abutment teeth. The final casts obtained from the impressions were examined under a profile projector, and the inter-abutment distance was calculated for all the casts and compared. Results: The results from this study showed that in the mesiodistal dimensions the percentage deviation from the master model in Group I was 0.1 and 0.2, in Group II was 0.9 and 0.3, and in Group III was 1.6 and 1.5, respectively. In the labio-palatal dimensions the percentage deviation from the master model in Group I was 0.01 and 0.4, in Group II was 1.9 and 1.3, and in Group III was 2.2 and 2.0, respectively. In the cervico-incisal dimensions the percentage deviation from the master model in Group I was 1.1 and 0.2, in Group II was 3.9 and 1.7, and in Group III was 1.9 and 3.0, respectively. In the inter-abutment dimension of dies, the percentage deviation from the master model in Group I was 0.1, in Group II was 0.6, and in Group III was 1.0. Conclusion: The matrix impression system showed more accurate reproduction of individual dies when compared with putty reline

  12. Improved accuracy of quantitative parameter estimates in dynamic contrast-enhanced CT study with low temporal resolution

    SciTech Connect

    Kim, Sun Mo; Jaffray, David A.

    2016-01-15

    Purpose: A previously proposed method to reduce radiation dose to the patient in dynamic contrast-enhanced (DCE) CT is enhanced by principal component analysis (PCA) filtering, which improves the signal-to-noise ratio (SNR) of time-concentration curves in the DCE-CT study. The efficacy of the combined method in maintaining the accuracy of kinetic parameter estimates at low temporal resolution is investigated with pixel-by-pixel kinetic analysis of DCE-CT data. Methods: The method is based on DCE-CT scanning performed with low temporal resolution to reduce the radiation dose to the patient. The arterial input function (AIF) with high temporal resolution can be generated from a coarsely sampled AIF through a previously published method of AIF estimation. To increase the SNR of the time-concentration curves (tissue curves), a region-of-interest is first segmented into squares of 3 × 3 pixels. Subsequently, PCA filtering combined with a fraction-of-residual-information criterion is applied to all the segmented squares to further improve their SNRs. The proposed method was applied to the DCE-CT data sets of a cohort of 14 patients at varying levels of down-sampling. Kinetic analyses using the modified Tofts model and the singular value decomposition method were then carried out for each of the down-sampling schemes, with intervals from 2 to 15 s. The results were compared with analyses done on the measured data at high temporal resolution (i.e., the original scanning frequency) as the reference. Results: The patients' AIFs were estimated to high accuracy based on the 11 orthonormal bases of arterial impulse responses established in the previous paper. In addition, noise in the images was effectively reduced by using five principal components of the tissue curves for filtering. Kinetic analyses using the proposed method showed superior results compared to those with down-sampling alone; they were able to maintain the accuracy in the

  13. Optics of the human cornea influence the accuracy of stereo eye-tracking methods: a simulation study.

    PubMed

    Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M

    2017-02-01

    Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. Resulting errors are about ±1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.

  14. Relative accuracy of grid references derived from postcode and address in UK epidemiological studies of overhead power lines.

    PubMed

    Swanson, J; Vincent, T J; Bunch, K J

    2014-12-01

    In the UK, the location of an address, necessary for calculating the distance to overhead power lines in epidemiological studies, is available from different sources. We assess the accuracy of each. The grid reference specific to each address, provided by the Ordnance Survey product Address-Point, is generally accurate to a few metres, which will usually be sufficient for calculating magnetic fields from the power lines. The grid reference derived from the postcode rather than the individual address is generally accurate to tens of metres, and may be acceptable for assessing effects that vary in the general proximity of the power line, but is probably not acceptable for assessing magnetic-field effects.

  15. Optics of the human cornea influence the accuracy of stereo eye-tracking methods: a simulation study

    PubMed Central

    Barsingerhorn, A. D.; Boonstra, F. N.; Goossens, H. H. L. M.

    2017-01-01

    Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. Resulting errors are about ±1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea. PMID:28270978

  16. The accuracy of prostate volume measurement from ultrasound images: a quasi-Monte Carlo simulation study using magnetic resonance imaging.

    PubMed

    Azulay, David-Olivier D; Murphy, Philip; Graham, Jim

    2013-01-01

    Prostate volume is an important parameter to guide management of patients with benign prostatic hyperplasia (BPH) and to deliver clinical trial endpoints. Generally, simple 2D ultrasound (US) approaches are favoured despite the potential for greater accuracy afforded by magnetic resonance imaging (MRI) or complex US procedures. In this study, different approaches to estimating prostate size are evaluated with a simulation that selects multiple organ cross-sections and diameters from 22 MRI-defined prostate shapes. A quasi-Monte Carlo (qMC) approach is used to simulate multiple probe positions and angles within prescribed limits, resulting in a range of dimensions. The basic ellipsoid calculation, which uses two scanning planes, compares well with the MRI volume across the range of prostate shapes and sizes (R=0.992). However, using an appropriate linear regression model, accurate volume estimates can be made using prostate diameters calculated from a single scanning plane.
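The "basic ellipsoid calculation" mentioned above is commonly the prolate-ellipsoid formula V = (π/6)·L·W·H from three orthogonal diameters; that this is the exact variant used in the study is an assumption here. A minimal sketch with hypothetical diameters:

```python
import math

def ellipsoid_volume(length_cm, width_cm, height_cm):
    """Ellipsoid prostate-volume estimate: V = (pi/6) * L * W * H.
    Diameters in cm give volume in mL (1 cm^3 = 1 mL)."""
    return math.pi / 6.0 * length_cm * width_cm * height_cm

# Hypothetical diameters (cm), not measurements from the study
print(round(ellipsoid_volume(4.0, 4.5, 3.5), 1))  # 33.0
```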

  17. Storage of bronchoalveolar lavage fluid and accuracy of microbiologic diagnostics in the ICU: a prospective observational study

    PubMed Central

    2013-01-01

    Introduction: Early initiation of appropriate antimicrobial treatment is a cornerstone in managing pneumonia. Because microbiologic processing may not be available around the clock, optimal storage of specimens is essential for accurate microbiologic identification of pathogenic bacteria. The aim of our study was to determine the accuracy of two commonly used storage approaches for delayed processing of bronchoalveolar lavage in critically ill patients with suspected pneumonia. Methods: This study included 132 patients with clinically suspected pneumonia at two medical intensive care units of a tertiary care hospital. Bronchoalveolar lavage samples were obtained and divided into three aliquots: one was used for immediate culture, and two for delayed culture (DC) after storage for 24 hours at 4°C (DC4) and -80°C (DC-80), respectively. Results: Of 259 bronchoalveolar lavage samples, 84 (32.4%) were positive after immediate culture, with 115 relevant culture counts (≥10⁴ colony-forming units/ml). Reduced (<10⁴ colony-forming units/ml) or no growth of four and 57 of these isolates was observed in DC4 and DC-80, respectively. The difference between the mean bias of immediate culture and DC4 (-0.035; limits of agreement, -0.977 to 0.906) and of immediate culture and DC-80 (-1.832; limits of agreement, -4.914 to 1.267) was -1.788 ± 1.682 (P < 0.0001). Sensitivity and negative predictive value were 96.5% and 97.8% for DC4 and 50.4% and 75.4% for DC-80, respectively; the differences were statistically significant (P < 0.0001). Conclusions: Bronchoalveolar lavage samples can be processed for culture when stored up to 24 hours at 4°C without loss of diagnostic accuracy. Delayed culturing after storage at -80°C may not be reliable, in particular with regard to Gram-negative bacteria. PMID:23844796
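The bias and limits-of-agreement figures above follow the Bland–Altman approach for paired measurements (mean difference ± 1.96 standard deviations of the differences). A minimal sketch of that computation; the paired log10 culture counts below are hypothetical, not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired log10 culture counts: immediate vs. delayed culture
immediate = [4.2, 5.1, 3.8, 4.9, 5.5]
delayed = [4.0, 5.0, 3.9, 4.7, 5.2]
bias, (lower, upper) = bland_altman(immediate, delayed)
```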

  18. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    PubMed Central

    Sinaci, A. Anil; Laleci Erturkmen, Gokce B.; Gonul, Suat; Yuksel, Mustafa; Invernizzi, Paolo; Thakrar, Bharat; Pacaci, Anil; Cinar, H. Alper; Cicekli, Nihan Kesim

    2015-01-01

    Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved at a higher level with the use of common data elements in a standardized fashion, so that clinical researchers can work with different EHR systems independently of the underlying information model. The Postmarketing Safety Study Tool lets clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through the IHE data element exchange profile. The Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10 years of longitudinal data on average. Clinical researchers at Roche validated the tool with real-life use cases. PMID:26543873

  19. Precision and accuracy in the quantitative analysis of biological samples by accelerator mass spectrometry: application in microdose absolute bioavailability studies.

    PubMed

    Gao, Lan; Li, Jing; Kasserra, Claudia; Song, Qi; Arjomand, Ali; Hesk, David; Chowdhury, Swapan K

    2011-07-15

    Determination of the pharmacokinetics and absolute bioavailability of an experimental compound, SCH 900518, following an 89.7 nCi (100 μg) intravenous (iv) dose of (14)C-SCH 900518 2 h post 200 mg oral administration of nonradiolabeled SCH 900518 to six healthy male subjects has been described. The plasma concentration of SCH 900518 was measured using a validated LC-MS/MS system, and accelerator mass spectrometry (AMS) was used for quantitative plasma (14)C-SCH 900518 concentration determination. Calibration standards and quality controls were included for every batch of sample analysis by AMS to ensure acceptable quality of the assay. Plasma (14)C-SCH 900518 concentrations were derived from the regression function established from the calibration standards, rather than directly from isotopic ratios from AMS measurement. The precision and accuracy of quality controls and calibration standards met the requirements of bioanalytical guidance (U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Veterinary Medicine. Guidance for Industry: Bioanalytical Method Validation (ucm070107), May 2001. http://www.fda.gov/downloads/Drugs/GuidanceCompilanceRegulatoryInformation/Guidances/ucm070107.pdf ). The AMS measurement had a linear response range from 0.0159 to 9.07 dpm/mL for plasma (14)C-SCH 900518 concentrations. The CV and accuracy were 3.4-8.5% and 94-108% (82-119% for the lower limit of quantitation (LLOQ)), respectively, with a correlation coefficient of 0.9998. The absolute bioavailability was calculated from the dose-normalized area under the curve of iv and oral doses after the plasma concentrations were plotted vs the sampling time post oral dose. The mean absolute bioavailability of SCH 900518 was 40.8% (range 16.8-60.6%).
The typical accuracy and standard deviation in AMS quantitative analysis of drugs from human plasma samples have been reported for the first time, and the impact of these
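The dose-normalized AUC ratio described above reduces to F = (AUC_oral/Dose_oral) / (AUC_iv/Dose_iv). A minimal sketch of the arithmetic; the AUC values below are hypothetical and chosen only to illustrate the calculation, while the doses mirror the study design (200 mg oral, 100 μg iv microdose):

```python
def absolute_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """F = (AUC_oral / Dose_oral) / (AUC_iv / Dose_iv); doses in the same unit."""
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

# Hypothetical AUCs; doses in mg (200 mg oral, 0.1 mg iv microdose)
f = absolute_bioavailability(auc_oral=1200.0, dose_oral=200.0,
                             auc_iv=1.47, dose_iv=0.1)
print(round(f, 3))  # 0.408
```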

  20. Accuracy of a Wrist-Worn Wearable Device for Monitoring Heart Rates in Hospital Inpatients: A Prospective Observational Study

    PubMed Central

    Kroll, Ryan R; Boyd, J Gordon

    2016-01-01

    Background As the sensing capabilities of wearable devices improve, there is increasing interest in their application in medical settings. Capabilities such as heart rate monitoring may be useful in hospitalized patients as a means of enhancing routine monitoring or as part of an early warning system to detect clinical deterioration. Objective To evaluate the accuracy of heart rate monitoring by a personal fitness tracker (PFT) among hospital inpatients. Methods We conducted a prospective observational study of 50 stable patients in the intensive care unit who each completed 24 hours of heart rate monitoring using a wrist-worn PFT. Accuracy of heart rate recordings was compared with gold standard measurements derived from continuous electrocardiographic (cECG) monitoring. The accuracy of heart rates measured by pulse oximetry (Spo2.R) was also measured as a positive control. Results On a per-patient basis, PFT-derived heart rate values were slightly lower than those derived from cECG monitoring (average bias of −1.14 beats per minute [bpm], with limits of agreement of 24 bpm). By comparison, Spo2.R recordings produced more accurate values (average bias of +0.15 bpm, limits of agreement of 13 bpm, P<.001 as compared with PFT). Personal fitness tracker device performance was significantly better in patients in sinus rhythm than in those who were not (average bias −0.99 bpm vs −5.02 bpm, P=.02). Conclusions Personal fitness tracker–derived heart rates were slightly lower than those derived from cECG monitoring in real-world testing and not as accurate as Spo2.R-derived heart rates. Performance was worse among patients who were not in sinus rhythm. Further clinical evaluation is indicated to see if PFTs can augment early warning systems in hospitals. Trial Registration ClinicalTrials.gov NCT02527408; https://clinicaltrials.gov/ct2/show/NCT02527408 (Archived by WebCite at  http://www.webcitation.org/6kOFez3on) PMID:27651304

  1. A comparative evaluation of the marginal accuracy of crowns fabricated from four commercially available provisional materials: An in vitro study

    PubMed Central

    Amin, Bhavya Mohandas; Aras, Meena Ajay; Chitre, Vidya

    2015-01-01

    Purpose: The purpose of this in vitro study was to evaluate and compare the primary marginal accuracy of four commercially available provisional materials (Protemp 4, Luxatemp Star, Visalys Temp and DPI tooth moulding powder and liquid) at 2 time intervals (10 and 30 min). Materials and Methods: A customized stainless steel master model containing two interchangeable dies was used for fabrication of provisional crowns. Forty crowns (n = 10) were fabricated, and each crown was evaluated under a stereomicroscope. Vertical marginal discrepancies were noted and compared at 10 min from the start of mixing and then at 30 min. Observations and Results: Protemp 4 showed the least vertical marginal discrepancy (71.59 μm), followed by Luxatemp Star (91.93 μm) at 10 min. DPI showed a marginal discrepancy of 95.94 μm, while Visalys Temp crowns had a vertical marginal discrepancy of 106.81 μm. There was a significant difference between the marginal discrepancy values of Protemp 4 and Visalys Temp. At 30 min, there was a significant difference between the marginal discrepancy of Protemp 4 crowns (83.11 μm) and Visalys Temp crowns (128.97 μm) and between Protemp 4 and DPI (118.88 μm). No significant differences were observed between Protemp 4 and Luxatemp Star. Conclusion: The vertical marginal discrepancy of temporary crowns fabricated from the four commercially available provisional materials ranged from 71 to 106 μm immediately after fabrication (at 10 min from the start of mix) to 83–128 μm (30 min from the start of mix). The time elapsed after mixing had a significant influence on the marginal accuracy of the crowns. PMID:26097348

  2. Multi-centre evaluation of accuracy and reproducibility of planar and SPECT image quantification: An IAEA phantom study.

    PubMed

    Zimmerman, Brian E; Grošev, Darko; Buvat, Irène; Coca Pérez, Marco A; Frey, Eric C; Green, Alan; Krisanachinda, Anchali; Lassmann, Michael; Ljungberg, Michael; Pozzo, Lorena; Quadir, Kamila Afroj; Terán Gretter, Mariella A; Van Staden, Johann; Poli, Gian Luca

    2016-04-19

    Accurate quantitation of activity provides the basis for internal dosimetry of targeted radionuclide therapies. This study investigated quantitative imaging capabilities at sites with a variety of experience and equipment and assessed levels of errors in activity quantitation in Single-Photon Emission Computed Tomography (SPECT) and planar imaging. Participants from 9 countries took part in a comparison in which planar, SPECT and SPECT with X-ray computed tomography (SPECT-CT) imaging were used to quantify activities of four epoxy-filled cylinders containing (133)Ba, which was chosen as a surrogate for (131)I. The sources, with nominal volumes of 2, 4, 6 and 23 mL, were calibrated for (133)Ba activity by the National Institute of Standards and Technology, but the activity was initially unknown to the participants. Imaging was performed in a cylindrical phantom filled with water. Two trials were carried out in which the participants first estimated the activities using their local standard protocols, and then repeated the measurements using a standardized acquisition and analysis protocol. Finally, processing of the imaging data from the second trial was repeated by a single centre using a fixed protocol. In the first trial, the activities were underestimated by about 15% with planar imaging. SPECT with Chang's first-order attenuation correction (Chang-AC) and SPECT-CT overestimated the activity by about 10%. The second trial showed moderate improvements in accuracy and variability. Planar imaging was subject to methodological errors, e.g., in the use of a transmission scan for attenuation correction. The use of Chang-AC was subject to variability from the definition of phantom contours. The project demonstrated the need for training and standardized protocols to achieve good levels of quantitative accuracy and precision in a multicentre setting. Absolute quantification of simple objects with no background was possible with the strictest protocol to about 6% with

  3. Accuracy and stability of measuring GABA, glutamate, and glutamine by proton magnetic resonance spectroscopy: A phantom study at 4 Tesla

    NASA Astrophysics Data System (ADS)

    Henry, Michael E.; Lauriat, Tara L.; Shanahan, Meghan; Renshaw, Perry F.; Jensen, J. Eric

    2011-02-01

    Proton magnetic resonance spectroscopy has the potential to provide valuable information about alterations in gamma-aminobutyric acid (GABA), glutamate (Glu), and glutamine (Gln) in psychiatric and neurological disorders. In order to use this technique effectively, it is important to establish the accuracy and reproducibility of the methodology. In this study, phantoms with known metabolite concentrations were used to compare the accuracy of 2D J-resolved MRS, single-echo 30 ms PRESS, and GABA-edited MEGA-PRESS for measuring all three aforementioned neurochemicals simultaneously. The phantoms included metabolite concentrations above and below the physiological range and scans were performed at baseline, 1 week, and 1 month time-points. For GABA measurement, MEGA-PRESS proved optimal with a measured-to-target correlation of R2 = 0.999, with J-resolved providing R2 = 0.973 for GABA. All three methods proved effective in measuring Glu with R2 = 0.987 (30 ms PRESS), R2 = 0.996 (J-resolved) and R2 = 0.910 (MEGA-PRESS). J-resolved and MEGA-PRESS yielded good results for Gln measures with respective R2 = 0.855 (J-resolved) and R2 = 0.815 (MEGA-PRESS). The 30 ms PRESS method proved ineffective in measuring GABA and Gln. When measurement stability at in vivo concentration was assessed as a function of varying spectral quality, J-resolved proved the most stable and immune to signal-to-noise and linewidth fluctuation compared to MEGA-PRESS and 30 ms PRESS.
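The measured-to-target correlations quoted above (e.g., R² = 0.999 for MEGA-PRESS GABA) summarize how well measured concentrations track the known phantom concentrations under a simple least-squares fit. A minimal sketch of the coefficient of determination; the concentration values below are hypothetical, not the phantom data:

```python
def r_squared(x, y):
    """Coefficient of determination R^2 for a least-squares line y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical known vs. measured GABA concentrations (mM)
known = [0.5, 1.0, 2.0, 4.0, 8.0]
measured = [0.52, 0.97, 2.05, 3.90, 8.10]
```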

  4. Databases and Web Tools for Cancer Genomics Study

    PubMed Central

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-01-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community. PMID:25707591

  5. Databases and web tools for cancer genomics study.

    PubMed

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-02-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community.

  6. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective: To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting: The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods: We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings: Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions: By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  7. Does gadolinium-based contrast material improve diagnostic accuracy of local invasion in rectal cancer MRI? A multireader study.

    PubMed

    Gollub, Marc J; Lakhman, Yulia; McGinty, Katrina; Weiser, Martin R; Sohn, Michael; Zheng, Junting; Shia, Jinru

    2015-02-01

    OBJECTIVE. The purpose of this study was to compare reader accuracy and agreement on rectal MRI with and without gadolinium administration in the detection of T4 rectal cancer. MATERIALS AND METHODS. In this study, two radiologists and one fellow independently interpreted all posttreatment MRI studies for patients with locally advanced or recurrent rectal cancer using unenhanced images alone or combined with contrast-enhanced images, with a minimum interval of 4 weeks. Readers evaluated involvement of surrounding structures on a 5-point scale and were blinded to pathology and disease stage. Sensitivity, specificity, negative predictive value, positive predictive value, and AUC were calculated and kappa statistics were used to describe interreader agreement. RESULTS. Seventy-two patients (38 men and 34 women) with a mean age of 61 years (range, 32-86 years) were evaluated. Fifteen patients had 32 organs invaded. Global AUCs without and with gadolinium administration were 0.79 and 0.77, 0.91 and 0.86, and 0.83 and 0.78 for readers 1, 2, and 3, respectively. AUCs before and after gadolinium administration were similar. Kappa values before and after gadolinium administration for pairs of readers ranged from 0.5 to 0.7. CONCLUSION. On the basis of pathology as a reference standard, the use of gadolinium during rectal MRI did not significantly improve radiologists' agreement or ability to detect T4 disease.

  8. Scripted finite element tools for global electromagnetic induction studies

    NASA Astrophysics Data System (ADS)

    Ribaudo, Joseph T.; Constable, Catherine G.; Parker, Robert L.

    2012-02-01

    Numerical solution of global geomagnetic induction problems in two and three spatial dimensions can be conducted with commercially available, general-purpose, scripted, finite-element software. We show that FlexPDE is capable of solving a variety of global geomagnetic induction problems. The models treated can include arbitrary electrical conductivity of the core and mantle, and arbitrary spatial structure and time behaviour of the primary magnetic field. A thin surface layer of laterally heterogeneous conductivity, representing the oceans and crust, may be represented by a boundary condition at the Earth-space interface. We describe a numerical test, or validation, of the program by comparing its output to analytic and semi-analytic solutions for several electromagnetic induction problems: (1) concentric spherical shells representing a layered Earth in a time-varying, uniform, external magnetic field; (2) eccentrically nested conductive spheres in the same field; and (3) homogeneous spheres or cylinders, initially at rest, then rotating at a steady rate in a constant, uniform, external field. Calculations are performed in both the time and frequency domains, and in both 2-D and 3-D computational meshes, with adaptive mesh refinement. Root-mean-square accuracies of better than 1 per cent are achieved in all cases. A unique advantage of our technique is the ability to model Earth rotation in both the time and the frequency domain, which is especially useful for simulating satellite data.

  9. A comparative study of S/MAR prediction tools

    PubMed Central

    Evans, Kenneth; Ott, Sascha; Hansen, Annika; Koentges, Georgy; Wernisch, Lorenz

    2007-01-01

    Background S/MARs are regions of the DNA that are attached to the nuclear matrix. These regions are known to affect substantially the expression of genes. The computer prediction of S/MARs is a highly significant task which could contribute to our understanding of chromatin organisation in eukaryotic cells, the number and distribution of boundary elements, and the understanding of gene regulation in eukaryotic cells. However, while a number of S/MAR predictors have been proposed, their accuracy has so far not come under scrutiny. Results We have selected S/MARs with sufficient experimental evidence and used these to evaluate existing methods of S/MAR prediction. Our main results are: 1.) all existing methods have little predictive power, 2.) a simple rule based on AT-percentage is generally competitive with other methods, 3.) in practice, the different methods will usually identify different sub-sequences as S/MARs, 4.) more research on the H-Rule would be valuable. Conclusion A new insight is needed to design a method which will predict S/MARs well. Our data, including the control data, has been deposited as additional material and this may help later researchers test new predictors. PMID:17335576

  10. An Observational Study to Evaluate the Usability and Intent to Adopt an Artificial Intelligence–Powered Medication Reconciliation Tool

    PubMed Central

    Yuan, Michael Juntao; Poonawala, Robina

    2016-01-01

    Background Medication reconciliation (the process of creating an accurate list of all medications a patient is taking) is a widely practiced procedure to reduce medication errors. It is mandated by the Joint Commission and reimbursed by Medicare. Yet, in practice, medication reconciliation is often not effective owing to knowledge gaps in the team. A promising approach to improve medication reconciliation is to incorporate artificial intelligence (AI) decision support tools into the process to engage patients and bridge the knowledge gap. Objective The aim of this study was to improve the accuracy and efficiency of medication reconciliation by engaging the patient, the nurse, and the physician as a team via an iPad tool. With assistance from the AI agent, the patient reviews his or her own medication list from the electronic medical record (EMR) and annotates changes, before reviewing it together with the physician and making decisions on the shared iPad screen. Methods In this study, we developed iPad-based software tools, with AI decision support, that let patients “self-service” their medication reconciliation and then share the annotated, reconciled list with the physician. To evaluate the software tool’s user interface and workflow, ten patients in a primary care clinic were recruited and observed through the whole process during a pilot study, and were surveyed about the tool’s usability afterward. Results All patients were able to complete the medication reconciliation process correctly. Every patient found at least one error or other issue with their EMR medication list. All of them reported that the tool was easy to use, and 8 of 10 patients reported that they would use the tool in the future. However, few patients interacted with the learning modules in the tool. The physician and nurses reported the tool to be easy to use, easy to integrate into the existing workflow, and potentially time-saving. Conclusions We have

  11. Mechanism study on the wear of polycrystalline cubic boron nitride cutting tools

    NASA Astrophysics Data System (ADS)

    Jia, Yunhai; Li, Jiangang

    2010-12-01

    Samples of bearing steel, alloy cold-die steel, and cold-hardened cast iron were continuously machined by dry turning with polycrystalline cubic boron nitride (PcBN) cutting tools. After machining, the phases at the tool cutting edge were analyzed by X-ray diffraction, and the cutting-edge microstructure was observed by scanning electron microscopy. The wear mechanism of PcBN cutting tools in the turning process was then studied. The results showed that oxidation wear and felt wear were the main failure factors when dry turning the bearing steel and alloy cold-die steel samples with PcBN tools, while chemical wear and oxidation wear were the main failure factors when dry turning cold-hardened cast iron. In the turning process, the granularity of the cBN, together with the thermal stability and chemical characteristics of the felt material, plays a key role in cutting tool wear.

  12. The hidden KPI registration accuracy.

    PubMed

    Shorrosh, Paul

    2011-09-01

    Determining the registration accuracy rate is fundamental to improving revenue cycle key performance indicators. A registration quality assurance (QA) process allows errors to be corrected before bills are sent and helps registrars learn from their mistakes. Tools are available to help patient access staff who perform registration QA manually.

  13. Pitfalls at the root of facial assessment on photographs: a quantitative study of accuracy in positioning facial landmarks.

    PubMed

    Cummaudo, M; Guerzoni, M; Marasciuolo, L; Gibelli, D; Cigada, A; Obertovà, Z; Ratnayake, M; Poppa, P; Gabriel, P; Ritz-Timme, S; Cattaneo, C

    2013-05-01

    In recent years, facial analysis has gained great interest in forensic anthropology as well. The application of facial landmarks may bring relevant advantages to the analysis of 2D images by measuring distances and extracting quantitative indices. However, this is a complex task which depends upon the variability in positioning facial landmarks. In addition, the literature provides only general indications concerning the reliability of positioning facial landmarks on photographic material, and no study is available concerning the specific errors which may be encountered in such an operation. The aim of this study is to analyze the inter- and intra-observer error in defining facial landmarks on photographs by using software specifically developed for this purpose. Twenty-four operators were requested to define 22 facial landmarks on frontal view photographs and 11 on lateral view images; in addition, three operators repeated the procedure on the same photographs 20 times (at 24-h intervals). In the frontal view, the landmarks with the least dispersion were the pupil, cheilion, endocanthion, and stomion (sto), and the landmarks with the highest dispersion were gonion, zygion, frontotemporale, tragion, and selion (se). In the lateral view, the landmarks with the least dispersion were se, pronasale, subnasale, and sto, whereas the landmarks with the highest dispersion were gnathion, pogonion, and tragion. The results confirm that few anatomical points can be defined with the highest accuracy and show the importance of a preliminary investigation of reliability in positioning facial landmarks.
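    The landmark "dispersion" discussed above can be quantified in several ways; a minimal sketch (not necessarily the study's actual metric, and with invented coordinates) is the mean distance of each observer's placement from the centroid of all observers' placements:

```python
import math

def dispersion(points):
    """Mean Euclidean distance (px) of 2-D landmark placements from their centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

# Hypothetical placements by three observers for two landmarks:
pupil  = [(102.0, 55.0), (101.5, 54.8), (102.3, 55.4)]   # tight cluster, low dispersion
gonion = [(40.0, 160.0), (46.0, 151.0), (35.0, 168.0)]   # scattered, high dispersion
print(dispersion(pupil) < dispersion(gonion))  # → True
```

A well-defined point such as the pupil yields a sub-pixel dispersion here, while an ill-defined one such as gonion yields a dispersion of several pixels, mirroring the pattern the study reports.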

  14. A qualitative study into the difficulties experienced by healthcare decision makers when reading a Cochrane diagnostic test accuracy review

    PubMed Central

    2013-01-01

    Background Cochrane reviews are one of the best known and most trusted sources of evidence-based information in health care. While steps have been taken to make Cochrane intervention reviews accessible to a diverse readership, little is known about the accessibility of the newcomer to the Cochrane library: diagnostic test accuracy reviews (DTARs). The current qualitative study explored how healthcare decision makers, who varied in their knowledge and experience with test accuracy research and systematic reviews, read and made sense of DTARs. Methods A purposive sample of clinicians, researchers and policy makers (n = 21) took part in a series of think-aloud interviews, using as interview material the first three DTARs published in the Cochrane library. Thematic qualitative analysis of the transcripts was carried out to identify patterns in participants’ ‘reading’ and interpretation of the reviews and the difficulties they encountered. Results Participants unfamiliar with the design and methodology of DTARs found the reviews largely inaccessible and experienced a range of difficulties stemming mainly from the mismatch between background knowledge and level of explanation provided in the text. Experience with systematic reviews of interventions did not guarantee better understanding and, in some cases, led to confusion and misinterpretation. These difficulties were further exacerbated by poor layout and presentation, which affected even those with relatively good knowledge of DTARs and had a negative impact not only on their understanding of the reviews but also on their motivation to engage with the text. Comparison between the readings of the three reviews showed that more accessible presentation, such as presenting the results as natural frequencies, significantly increased participants’ understanding. Conclusions The study demonstrates that authors and editors should pay more attention to the presentation as well as the content of Cochrane DTARs

  15. Demonstration of saw blade accuracy and excursion: a cadaveric comparison study of blade types used in total knee arthroplasty.

    PubMed

    Wetzel, Robert J; Shah, Ritesh R; Puri, Lalit

    2013-06-01

    In total knee arthroplasty, outcomes depend in part on accurate osteotomies and the integrity of stabilizing structures. We compared accuracy and excursion between a conventional and an oscillating-tip saw blade. Two sets of osteotomies were made on cadaveric knees. Bi-planar accuracy was compared using computer navigation, and excursion was compared using methylene blue. Wilcoxon-Mann-Whitney testing demonstrated no significant difference in blade accuracy (p=0.35). Blades were within 0.5 degrees of neutral coronally and 2.0 degrees sagittally. The oscillating-tip blade left fewer dye markings on the surrounding tissues. Accurate osteotomies and soft tissue protection are critical to successful arthroplasties. Although comparative accuracy was equal, the oscillating-tip blade exhibited less excursion, suggesting a potential for fewer iatrogenic soft tissue injuries, which can lead to catastrophic failure.

  16. A Case Study of Using a Social Annotation Tool to Support Collaboratively Learning

    ERIC Educational Resources Information Center

    Gao, Fei

    2013-01-01

    The purpose of the study was to understand student interaction and learning supported by a collaborative social annotation tool, Diigo. Through a case study, the researcher examined how students participated and interacted when studying an online text with the social annotation tool Diigo, and how they perceived the experience. The findings…

  17. Validation of the North American Chest Pain Rule in Prediction of Very Low-Risk Chest Pain; a Diagnostic Accuracy Study

    PubMed Central

    Valadkhani, Somayeh; Jalili, Mohammad; Hesari, Elham; Mirfazaelian, Hadi

    2017-01-01

    Introduction: Acute coronary syndrome accounts for more than 15% of chest pain presentations. Recently, Hess et al. developed the North American Chest Pain Rule (NACPR) to identify very low-risk patients who can be safely discharged from the emergency department (ED). The present study aimed to validate this rule in the EDs of two academic hospitals. Methods: A prospective diagnostic accuracy study was conducted on consecutive patients 24 years of age and older presenting to the ED with a chief complaint of acute chest pain, from March 2013 to June 2013. Chest pain characteristics, cardiac history, electrocardiogram findings, and cardiac biomarker measurements were collected, and the screening performance characteristics of the NACPR, with 95% confidence intervals, were calculated using SPSS 21. Results: Of 400 eligible patients with completed follow-up, 69 (17.25%) developed myocardial infarction, 121 (30.25%) underwent coronary revascularization, and 4 (2%) died of cardiac or unidentifiable causes. Using the NACPR, 34 (8.50%) of the patients could be considered very low-risk and discharged after a brief ED assessment. Among these patients, none developed the above-mentioned adverse outcomes within 30 days. Sensitivity, specificity, positive predictive value, and negative predictive value of the rule were 100% (95% CI: 87.35-100.00), 45.35% (95% CI: 40.19-50.61), 14.52% (95% CI: 10.40-19.85), and 100% (95% CI: 97.18-100.00), respectively. Conclusions: This multicenter study showed that the NACPR is a good screening tool for early discharge of patients with very low-risk chest pain from the ED. PMID:28286818
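    The screening statistics reported in validation studies like this one all follow from a standard 2x2 table of rule result versus outcome. A minimal sketch with illustrative counts (not the study's data; a perfect rule-out corresponds to zero false negatives):

```python
def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, NPV (as percentages) from 2x2 counts."""
    sens = 100 * tp / (tp + fn)   # rule-positives among all patients with the outcome
    spec = 100 * tn / (tn + fp)   # rule-negatives among all patients without the outcome
    ppv  = 100 * tp / (tp + fp)   # outcome rate among rule-positive patients
    npv  = 100 * tn / (tn + fn)   # outcome-free rate among rule-negative patients
    return sens, spec, ppv, npv

# Illustrative: fn=0 (no discharged patient had an adverse outcome) forces
# sensitivity and NPV to 100%, the pattern reported for the NACPR.
sens, spec, ppv, npv = screening_metrics(tp=90, fp=200, fn=0, tn=110)
print(f"sensitivity={sens:.1f}% specificity={spec:.1f}% ppv={ppv:.1f}% npv={npv:.1f}%")
```

Note the trade-off visible in the study's numbers: a rule tuned for 100% sensitivity typically pays for it with modest specificity and PPV.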

  18. Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies

    PubMed Central

    Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon

    2012-01-01

    Objectives Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered by well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools, and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of consideration of the relative weights of different domains and elements. The development of a dedicated quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600

  19. SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams

    SciTech Connect

    Cline, K; Stanley, D; Defoor, D; Stathakis, S; Gutierrez, A; Papanikolaou, N; Kirby, N

    2015-06-15

    Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that this tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on user’s input. Once the framework is built, the user can input sets of multiple-choice questions, answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices and immediate feedback is provided. Once the user is finished studying, the tool records the day’s progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors for the tool’s use. There are software packages to create similar question banks, but this study tool has no additional cost for those that already have Excel. The study tool will be made openly available.

  20. Improved Accuracy of Continuous Glucose Monitoring Systems in Pediatric Patients with Diabetes Mellitus: Results from Two Studies

    PubMed Central

    2016-01-01

    Abstract Objective: This study was designed to evaluate accuracy, performance, and safety of the Dexcom (San Diego, CA) G4® Platinum continuous glucose monitoring (CGM) system (G4P) compared with the Dexcom G4 Platinum with Software 505 algorithm (SW505) when used as adjunctive management to blood glucose (BG) monitoring over a 7-day period in youth, 2–17 years of age, with diabetes. Research Design and Methods: Youth wore either one or two sensors placed on the abdomen or upper buttocks for 7 days, calibrating the device twice daily with a uniform BG meter. Participants had one in-clinic session on Day 1, 4, or 7, during which fingerstick BG measurements (self-monitoring of blood glucose [SMBG]) were obtained every 30 ± 5 min for comparison with CGM, and in youth 6–17 years of age, reference YSI glucose measurements were obtained from arterialized venous blood collected every 15 ± 5 min for comparison with CGM. The sensor was removed by the participant/family after 7 days. Results: In comparison of 2,922 temporally paired points of CGM with the reference YSI measurement for G4P and 2,262 paired points for SW505, the mean absolute relative difference (MARD) was 17% for G4P versus 10% for SW505 (P < 0.0001). In comparison of 16,318 temporally paired points of CGM with SMBG for G4P and 4,264 paired points for SW505, MARD was 15% for G4P versus 13% for SW505 (P < 0.0001). Similarly, error grid analyses indicated superior performance with SW505 compared with G4P in comparison of CGM with YSI and CGM with SMBG results, with greater percentages of SW505 results falling within error grid Zone A or the combined Zones A plus B. There were no serious adverse events or device-related serious adverse events for either the G4P or the SW505, and there was no sensor breakoff. Conclusions: The updated algorithm offers substantial improvements in accuracy and performance in pediatric patients with diabetes. Use of CGM with improved performance has
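    The mean absolute relative difference (MARD) reported above is the standard accuracy metric for CGM systems: the average of |CGM − reference| / reference over temporally paired readings. A minimal sketch with made-up paired glucose values (not the study's data):

```python
def mard(cgm, reference):
    """MARD (%) over temporally paired CGM/reference glucose readings (mg/dL)."""
    assert len(cgm) == len(reference) and len(cgm) > 0
    rel = [abs(c - r) / r for c, r in zip(cgm, reference)]
    return 100 * sum(rel) / len(rel)

# Hypothetical paired points: CGM readings vs. reference (e.g. YSI) values.
cgm_vals = [110, 150, 95, 210]
ysi_vals = [100, 160, 100, 200]
print(f"MARD = {mard(cgm_vals, ysi_vals):.1f}%")
```

A lower MARD means the sensor tracks the reference more closely, which is why the drop from 17% to 10% against YSI reflects a substantial accuracy improvement.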

  1. Three-dimensional volume tomographic study of the imaging accuracy of impacted teeth: MSCT and CBCT comparison--an in vitro study.

    PubMed

    Hofmann, Elisabeth; Medelnik, Jürgen; Fink, Martin; Lell, Michael; Hirschfelder, Ursula

    2013-06-01

    The aim of this study was to analyze the imaging accuracy of cone beam computed tomography (CBCT) data sets compared with multislice spiral computed tomography (MSCT) data sets in determining the exact mesiodistal width of unerupted porcine tooth germs and to compare the radiologically obtained results of width measurements with the actual mesiodistal dimension of the tooth germs. In MSCT and CBCT data sets, the largest diameter of 24 tooth germs was determined with the aid of the mesial and distal contact points. The reference method used was mesiodistal width measurement using sliding callipers after the tooth germs had been osteotomized. Accuracy and precision were ascertained with difference plots and a one-way model II analysis of variance with random effects. Analysis of accuracy revealed marked differences between the measuring methods in the difference plot: slightly higher mean values were measured by MSCT and markedly lower values by CBCT than by the reference method (calliper); the mean deviation was significantly greater for CBCT. The width of the confidence interval in the comparison of CBCT versus clinical measurements is more than 4 times higher than in the comparison of MSCT versus clinical values. Precision analysis found that repeatability was twice as high with CBCT as with clinical measurement, whereas MSCT and clinical measurement differed only slightly. The mesiodistal width of displaced teeth can be determined by MSCT but also by CBCT. MSCT is superior to CBCT in determining tooth width; the difference was statistically significant (P = 0.05).

  2. Accuracy of cut-off value by measurement of third molar index: Study of a Colombian sample.

    PubMed

    De Luca, Stefano; Aguilar, Lina; Rivera, Marcela; Palacio, Luz Andrea Velandia; Riccomi, Giulia; Bestetti, Fiorella; Cameriere, Roberto

    2016-04-01

    The aim of this cross-sectional study was to test the accuracy of the cut-off value of 0.08 for the third molar index (I3M) in assessing the legal adult age of 18 years in a sample of Colombian children and young adults. Digital orthopantomographs of 288 Colombian children and young adults (163 girls and 125 boys), aged between 13 and 22 years, were analysed. The concordance correlation coefficient (ρc) and Cohen's kappa coefficient (κ) showed that repeatability and reproducibility were high for both intra- and inter-observer error. κ for intra- and inter-observer agreement on the adult/minor decision was 0.913 and 0.877, respectively. The age distribution gradually decreases as I3M increases in both girls and boys. For girls, sensitivity was 95.1% (95% CI 87.1%-95%) and specificity was 93.8% (95% CI 87.1%-98.8%); the proportion of correctly classified individuals was 95.1%. For boys, sensitivity was 91.7% (95% CI 85.1%-96.8%) and specificity was 90.6% (95% CI 82.1%-97.8%); the proportion of correctly classified individuals was 89.7%. The cut-off value of 0.08 is highly useful for determining whether a subject is 18 years of age or older.
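    Cohen's kappa, used above for intra- and inter-observer agreement on the adult/minor decision, corrects observed agreement for the agreement expected by chance. A minimal sketch (the ratings below are invented, not the study's):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two equal-length sequences of categorical ratings."""
    n = len(rater1)
    cats = set(rater1) | set(rater2)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n          # observed agreement
    pe = sum((rater1.count(c) / n) * (rater2.count(c) / n)        # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)

r1 = ["adult", "adult", "minor", "minor", "adult", "minor"]
r2 = ["adult", "adult", "minor", "adult", "adult", "minor"]
print(round(cohens_kappa(r1, r2), 3))  # → 0.667
```

Values above roughly 0.8, like the 0.913 and 0.877 reported here, are conventionally read as almost perfect agreement.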

  3. A Stable Geodetic Reference Frame within the COCONET Footprint to Enable High-Accuracy Ground Deformation Study

    NASA Astrophysics Data System (ADS)

    Liu, H.; Wang, G.; Yu, J.

    2014-12-01

    COCONet (Continuously Operating Caribbean GPS Observational Network) is a multidisciplinary research infrastructure focused on improving the ability to understand, predict, and prepare for multiple natural hazards in the Caribbean, Central America, and the Northern Andes. GPS data alone cannot provide accurate ground deformation information over time and space: a precise regional reference frame is needed when interpreting GPS observations to address regional and local ground deformation, and failure to use one can bias the results. The main purpose of this study is to establish a stable geodetic reference frame within the COCONet footprint (abbreviated "COCONet-RF") and to provide positional time series and velocities (relative to COCONet-RF) for all permanent GPS stations within the COCONet footprint to the public. The GIPSY software package was used to calculate positions within IGS08. The local reference frame was realized by a 14-parameter Helmert transformation technique, and it will be updated periodically to stay synchronized with updates of the IGS reference frame. This stable COCONet-RF will provide a higher-accuracy geodetic infrastructure for delineating the magnitude and the spatial and temporal variations of ground deformation associated with landslides, faulting, subsidence, and volcanoes. Researchers who are not familiar with GPS data processing and reference frame transformation will be able to directly integrate COCONet products into their specific research.
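    A 14-parameter Helmert transformation is the standard 7-parameter similarity transform (3 translations, 1 scale, 3 rotations) plus a linear time rate for each parameter. A hedged sketch of the 7-parameter core in its small-angle form (parameter values below are arbitrary illustrations, not COCONet-RF's):

```python
import numpy as np

def helmert7(xyz, t, scale_ppb, rot_mas):
    """Small-angle 7-parameter Helmert transform of an ECEF position (metres).

    t: translation vector (m); scale_ppb: scale in parts per billion;
    rot_mas: rotations about x, y, z in milliarcseconds.
    """
    mas = np.radians(1 / 3.6e6)            # one milliarcsecond in radians
    rx, ry, rz = np.asarray(rot_mas, dtype=float) * mas
    s = 1 + scale_ppb * 1e-9
    R = np.array([[1.0, -rz,  ry],         # small-angle rotation matrix
                  [ rz, 1.0, -rx],
                  [-ry,  rx, 1.0]])
    return np.asarray(t, dtype=float) + s * R @ np.asarray(xyz, dtype=float)

# Illustrative station position on the equator, with made-up frame parameters.
p = helmert7([6378137.0, 0.0, 0.0], t=[0.01, -0.02, 0.005],
             scale_ppb=1.2, rot_mas=[0.1, -0.3, 0.2])
```

The 14-parameter version evaluates each of the seven parameters as `p(t) = p0 + pdot * (t - t0)` before applying the same transform, which is what keeps a regional frame aligned with successive IGS realizations.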

  4. Pose prediction accuracy in docking studies and enrichment of actives in the active site of GSK-3beta.

    PubMed

    Gadakar, Pravin Kumar; Phukan, Samiron; Dattatreya, Prasanna; Balaji, V N

    2007-01-01

    We present molecular docking studies on inhibitors of GSK-3beta kinase in the enzyme binding sites of the X-ray complexes (1H8F, 1PYX, 1O9U, 1Q4L, 1Q5K, and 1UV5) using the Schrödinger docking tool Glide. Cognate and cross-docking studies using the standard precision (SP) and extra precision (XP) algorithms have been carried out. Cognate docking studies demonstrate that docked poses similar to X-ray poses (root-mean-square deviations of less than 2 Å) are found within the top four ranks of the GlideScore and E-model scores. However, cross-docking studies typically produce poses that deviate significantly from the X-ray poses in all but a couple of cases, implying a potential for induced-fit effects in ligand binding. In this light, we have also carried out induced-fit docking studies in the active sites of 1O9U, 1Q4L, and 1Q5K. Specifically, conformational changes have been effected in the active sites of these three protein structures to dock noncognate ligands. Thus, for example, the active site of 1O9U has been induced to fit the ligands of 1Q4L, 1Q5K, and 1UV5. These studies produce ligand docked poses with significantly lower root-mean-square deviations relative to their X-ray crystallographic poses, compared to the corresponding values from the cross-docking studies. Furthermore, we have used an ensemble of the induced-fit models and X-ray structures to enhance the retrieval of active GSK-3beta inhibitors seeded in a decoy database normally used in Glide validation studies. Thus, our studies provide valuable insights into computational strategies useful for the identification of potential GSK-3beta inhibitors.

  5. Comparative evaluation of accuracy of two electronic apex locators in the presence of various irrigants: An in vitro study

    PubMed Central

    Jain, Saru; Kapur, Ravi

    2012-01-01

    Context: The establishment of an appropriate working length is one of the most critical steps in endodontic therapy. Electronic apex locators have been introduced to determine the working length. Their development has helped make assessment of the working length more accurate and predictable, along with reducing treatment time and radiation dose. Objectives: The aim of this study was to compare the efficacy of electronic apex locators after cleaning and shaping of the root canals, and to determine whether there was any alteration in accuracy when they were used in the presence of irrigants. Materials and Methods: Seventy extracted human permanent molars with mature apices were selected. Equal numbers of maxillary and mandibular permanent molars (35 each) were sectioned at the cemento-enamel junction. Access opening was done, and only the mesiobuccal root canal was studied for the purpose of standardization. Electronic working length measurements were taken before and after preparation of the mesiobuccal canal with Root ZX and ProPex II using various irrigants. Statistical Analysis Used: The data were statistically analyzed using a paired t-test at the 0.05 level of significance. Results: P-values comparing actual and final canal lengths for Root ZX were 0.001 with NaOCl, 0.006 with CHX, and 0.020 with LA; for ProPex II the P-value was 0.001. When the data were compared, the results were statistically significant (P < 0.05). Conclusion: Within the limitations of this study, Root ZX can be considered an accurate electronic apex locator, and with CHX as the irrigant its readings matched the actual canal length measurements most precisely. PMID:23230349

  6. Camera Calibration Accuracy at Different UAV Flying Heights

    NASA Astrophysics Data System (ADS)

    Yusoff, A. R.; Ariff, M. F. M.; Idris, K. M.; Majid, Z.; Chong, A. K.

    2017-02-01

    Unmanned Aerial Vehicles (UAVs) can be used to acquire highly accurate data in deformation surveys, and low-cost digital cameras are commonly used for UAV mapping. Camera calibration is therefore important for obtaining high-accuracy UAV mapping with low-cost digital cameras. The main focus of this study was to calibrate the UAV camera at different camera distances and check the measurement accuracy. The scope of this study included camera calibration in the laboratory and in the field, and an assessment of UAV image mapping accuracy using the calibration parameters obtained at the different camera distances. The camera distances used for the calibration image acquisition and the mapping accuracy assessment were 1.5 metres in the laboratory, and 15 and 25 metres in the field, using a Sony NEX6 digital camera. A large calibration field and a portable calibration frame were used as the tools for the camera calibration and for checking the accuracy of the measurement at different camera distances. The bundle adjustment concept was applied in the Australis software to perform the camera calibration and accuracy assessment. The results showed that a camera distance of 25 metres is the optimum object distance, as this gave the best accuracy in both the laboratory and the outdoor mapping. In conclusion, camera calibration at several camera distances should be applied to acquire better accuracy in mapping, and the best camera parameters should be selected for highly accurate UAV image mapping measurement.

  7. Towards early software reliability prediction for computer forensic tools (case study).

    PubMed

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
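The component-to-Markov-chain idea can be illustrated with a toy serial tool: each component hands control to the next with some success probability, and the tool's reliability is the probability of reaching an absorbing success state. The transition probabilities below are hypothetical, not taken from the paper:

```python
# States: 0..2 are components A, B, C; 3 = success, 4 = failure (absorbing).
# Transition probabilities are hypothetical illustration values.
P = [
    [0.0, 0.98, 0.0,  0.0,  0.02],   # A hands control to B, or fails
    [0.0, 0.0,  0.97, 0.0,  0.03],   # B hands control to C, or fails
    [0.0, 0.0,  0.0,  0.99, 0.01],   # C finishes successfully, or fails
    [0.0, 0.0,  0.0,  1.0,  0.0],    # success is absorbing
    [0.0, 0.0,  0.0,  0.0,  1.0],    # failure is absorbing
]

def absorption_probability(P, start, target, steps=100):
    """Probability mass on `target` after iterating the chain."""
    dist = [0.0] * len(P)
    dist[start] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist[target]

reliability = absorption_probability(P, start=0, target=3)
```

For this serial topology the result reduces to the product of the per-component success probabilities; richer topologies (retries, alternative paths) are where the Markov analysis earns its keep.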

  8. Use of Molecular Diagnostic Tools for the Identification of Species Responsible for Snakebite in Nepal: A Pilot Study

    PubMed Central

    Sharma, Sanjib Kumar; Kuch, Ulrich; Höde, Patrick; Bruhse, Laura; Pandey, Deb P.; Ghimire, Anup; Chappuis, François; Alirol, Emilie

    2016-01-01

    Snakebite is an important medical emergency in rural Nepal. Correct identification of the biting species is crucial for clinicians to choose appropriate treatment and anticipate complications. This is particularly important for neurotoxic envenoming which, depending on the snake species involved, may not respond to available antivenoms. Adequate species identification tools are lacking. This study used a combination of morphological and molecular approaches (PCR-aided DNA sequencing from swabs of bite sites) to determine the contribution of venomous and non-venomous species to the snakebite burden in southern Nepal. Out of 749 patients admitted with a history of snakebite to one of three study centres, the biting species could be identified in 194 (25.9%). Out of these, 87 had been bitten by a venomous snake, most commonly the Indian spectacled cobra (Naja naja; n = 42) and the common krait (Bungarus caeruleus; n = 22). When both morphological identification and PCR/sequencing results were available, a 100% agreement was noted. The probability of a positive PCR result was significantly lower among patients who had used inadequate “first aid” measures (e.g. tourniquets or local application of remedies). This study is the first to report the use of forensic genetics methods for snake species identification in a prospective clinical study. If high diagnostic accuracy is confirmed in larger cohorts, this method will be a very useful reference diagnostic tool for epidemiological investigations and clinical studies. PMID:27105074

  9. A comparative study between evaluation methods for quality control procedures for determining the accuracy of PET/CT registration

    NASA Astrophysics Data System (ADS)

    Cha, Min Kyoung; Ko, Hyun Soo; Jung, Woo Young; Ryu, Jae Kwang; Choe, Bo-Young

    2015-08-01

    The accuracy of registration between positron emission tomography (PET) and computed tomography (CT) images is one of the important factors for reliable diagnosis in PET/CT examinations. Although quality control (QC) for checking the alignment of PET and CT images should be performed periodically, the procedures have not been fully established. The aim of this study is to determine optimal QC procedures that can be performed at the user level to ensure the accuracy of PET/CT registration. Two phantoms were used to carry out this study: the American College of Radiology (ACR)-approved PET phantom and the National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) body phantom, containing fillable spheres. All PET/CT images were acquired on a Biograph TruePoint 40 PET/CT scanner using routine protocols. To measure registration error, the spatial coordinates of the estimated centers of the target slice (spheres) were calculated independently for the PET and the CT images in two ways. We compared the images from the ACR-approved PET phantom to those from the NEMA IEC body phantom. Also, we measured the total time required from phantom preparation to image analysis. The first analysis method showed a total difference of 0.636 ± 0.11 mm for the largest hot sphere and 0.198 ± 0.09 mm for the largest cold sphere in the case of the ACR-approved PET phantom. In the NEMA IEC body phantom, the total difference was 3.720 ± 0.97 mm for the largest hot sphere and 4.800 ± 0.85 mm for the largest cold sphere. The second analysis method showed that the differences in the x location at the line profile of the lesion on PET and CT were (1.33, 1.33) mm for a bone lesion, (-1.26, -1.33) mm for an air lesion and (-1.67, -1.60) mm for a hot sphere lesion for the ACR-approved PET phantom. For the NEMA IEC body phantom, the differences in the x location at the line profile of the lesion on PET and CT were (-1.33, 4.00) mm for the air
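A registration error of the first kind described above reduces to the distance between sphere-centre estimates computed independently on the PET and CT images. A minimal sketch with hypothetical coordinates (not the phantom data):

```python
from math import sqrt

def centroid(points):
    """Mean position of a set of 3-D points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def registration_error(pet_points, ct_points):
    """Euclidean distance (mm) between sphere-centre estimates on PET and CT."""
    p, c = centroid(pet_points), centroid(ct_points)
    return sqrt(sum((a - b) ** 2 for a, b in zip(p, c)))

# Hypothetical centre estimates (mm) of one sphere on each modality
pet = [(10.2, 5.1, 30.0), (10.4, 5.3, 30.2)]
ct  = [(10.0, 5.0, 30.1), (10.2, 5.2, 30.1)]
err = registration_error(pet, ct)
```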

  10. Adoption of online health management tools among healthy older adults: An exploratory study.

    PubMed

    Zettel-Watson, Laura; Tsukerman, Dmitry

    2016-06-01

    As the population ages and chronic diseases abound, overburdened healthcare systems will increasingly require individuals to manage their own health. Online health management tools, quickly increasing in popularity, have the potential to diminish or even replace in-person contact with health professionals, but overall efficacy and usage trends are unknown. The current study explored perceptions and usage patterns among users of online health management tools, and identified barriers and barrier-breakers among non-users. An online survey was completed by 169 computer users (aged 50+). Analyses revealed that a sizable minority (37%) of participants use online health management tools and most users (89%) are satisfied with these tools, but a limited range of tools are being used and usage occurs in relatively limited domains. Improved awareness and education for online health management tools could enhance people's abilities to remain at home as they age, reducing the financial burden on formal assistance programs.

  11. A browser-based tool for space weather and space climate studies

    NASA Astrophysics Data System (ADS)

    Tanskanen, E. I.; Pérez-Suárez, D.

    2014-04-01

    A browser-based research tool has been developed for online time series analysis. Large amounts of high-resolution measurements are now available from different heliospheric locations. How best to handle the ever-increasing amount of information about near-Earth space weather conditions, and how to improve the social data analysis tools for space studies, have become open questions. To address them, we have developed an interactive web interface, called Substorm Zoo, which we expect to become a powerful tool for scientists and a useful tool for the public.

  12. ForestPMPlot: A Flexible Tool for Visualizing Heterogeneity Between Studies in Meta-analysis.

    PubMed

    Kang, Eun Yong; Park, Yurang; Li, Xiao; Segrè, Ayellet V; Han, Buhm; Eskin, Eleazar

    2016-07-07

    Meta-analysis has become a popular tool for combining the results of different genetic association studies. A key challenge in meta-analysis is heterogeneity, or differences in effect sizes between studies. Heterogeneity complicates the interpretation of meta-analyses. In this paper, we describe ForestPMPlot, a flexible visualization tool for analyzing studies included in a meta-analysis. The main feature of the tool is visualizing the differences in the effect sizes of the studies to understand why the studies exhibit heterogeneity for a particular phenotype and locus pair under different conditions. We show the application of this tool to interpret a meta-analysis of 17 mouse studies, and to interpret a multi-tissue eQTL study.
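Heterogeneity of the kind ForestPMPlot visualizes is commonly quantified with Cochran's Q and the derived I² statistic. A minimal sketch with hypothetical effect sizes and variances (this is standard meta-analysis arithmetic, not ForestPMPlot's own code):

```python
def cochran_q(effects, variances):
    """Cochran's Q for between-study heterogeneity under a fixed-effect,
    inverse-variance-weighted meta-analysis."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))

# Hypothetical per-study effect sizes and sampling variances
betas = [0.30, 0.10, 0.45, 0.20]
varis = [0.01, 0.02, 0.015, 0.01]
q = cochran_q(betas, varis)
# I^2: share of total variation attributable to heterogeneity rather than chance
i2 = max(0.0, (q - (len(betas) - 1)) / q)
```

Under the null of no heterogeneity, Q follows a chi-squared distribution with k-1 degrees of freedom, so large Q (and I² near 1) flags the between-study differences a forest plot makes visible.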

  13. The ADENOMA Study. Accuracy of Detection using Endocuff Vision™ Optimization of Mucosal Abnormalities: study protocol for randomized controlled trial

    PubMed Central

    Bevan, Roisin; Ngu, Wee Sing; Saunders, Brian P.; Tsiamoulos, Zacharias; Bassett, Paul; Hoare, Zoe; Rees, Colin J.

    2016-01-01

    Background: Colonoscopy is the gold standard investigation for the diagnosis of bowel pathology and colorectal cancer screening. Adenoma detection rate is a marker of high quality colonoscopy and a high adenoma detection rate is associated with a lower incidence of interval cancers. Several technological advancements have been explored to improve adenoma detection rate. A new device called Endocuff Vision™ has been shown to improve adenoma detection rate in pilot studies. Methods/Design: This is a prospective, multicenter, randomized controlled trial comparing the adenoma detection rate in patients undergoing Endocuff Vision™-assisted colonoscopy with standard colonoscopy. All patients above 18 years of age referred for screening, surveillance, or diagnostic colonoscopy who are able to consent are invited to the study. Patients with absolute contraindications to colonoscopy, large bowel obstruction or pseudo-obstruction, colon cancer or polyposis syndromes, colonic strictures, severe diverticular segments, active colitis, anticoagulant therapy, or pregnancy are excluded. Patients are randomized according to site, age, sex, and bowel cancer screening status to receive Endocuff Vision™-assisted colonoscopy or standard colonoscopy on the day of procedure. Baseline data, colonoscopy, and polyp data including histology are collected. Nurse assessment of patient comfort and patient comfort questionnaires are completed post procedure. Patients are followed up at 21 days and complete a patient experience questionnaire. This study will take place across seven NHS Hospital Trusts: one in London and six within the Northern Region Endoscopy Group. A maximum of 10 colonoscopists per site will recruit a total of 1772 patients, with a maximum of four bowel screening colonoscopists permitted per site. Discussion: This is the first trial to evaluate the adenoma detection rate of Endocuff Vision™ in all screening, surveillance, and diagnostic patient groups. This timely

  14. In-vitro study on the accuracy of a simple-design CT-guided stent for dental implants

    PubMed Central

    Huh, Young-June; Choi, Bo-Ram; Huh, Kyung-Hoe; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun

    2012-01-01

    Purpose An individual surgical stent fabricated from computed tomography (CT) data, called a CT-guided stent, would be useful for accurate installation of implants. The purpose of the present study was to introduce a newly developed CT-guided stent with a simple design and evaluate the accuracy of the stent placement. Materials and Methods A resin template was fabricated from a hog mandible and a specially designed plastic plate, with 4 metal balls inserted in it for radiographic recognition, was attached to the occlusal surface of the template. With the surgical stent applied, CT images were taken, and virtual implants were placed using software. The spatial positions of the virtually positioned implants were acquired and implant guiding holes were drilled into the surgical stent using a specially designed 5-axis drilling machine. The surgical stent was placed on the mandible and CT images were taken again. The discrepancy between the central axis of the drilled holes on the second CT images and the virtually installed implants on the first CT images was evaluated. Results The deviation of the entry point and angulation of the central axis in the reference plane were 0.47±0.27 mm, 0.57±0.23 mm, and 0.64±0.16°, 0.57±0.15°, respectively. However, for the two different angulations in each group, the 20° angulation showed a greater error in the deviation of the entry point than did the 10° angulation. Conclusion The CT-guided template proposed in this study was highly accurate. It could replace existing implant guide systems to reduce costs and effort. PMID:23071963

  15. Accuracy and Predictability of PANC-3 Scoring System over APACHE II in Acute Pancreatitis: A Prospective Study

    PubMed Central

    Vishnu, Vikram Hubbanageri; Muniyappa, Shridhar; Prasath, Arun

    2017-01-01

    Introduction Acute Pancreatitis (AP) is one of the common conditions encountered in the emergency room. The course of the disease ranges from a mild form to a severe acute form. Most episodes are mild and subside spontaneously within 3 to 5 days. In contrast, in Severe Acute Pancreatitis (SAP), which occurs in around 15-20% of all cases, mortality can range from 10 to 85% across various centres and countries. In such a situation we need an indicator that can predict the outcome of an attack, as severe or mild, as early as possible, and such an indicator should be sensitive and specific enough to be trusted. PANC-3 is one such scoring system for predicting the outcome of an attack of AP. Aim To assess the accuracy and predictability of the PANC-3 scoring system over APACHE II in predicting severity in an attack of AP. Materials and Methods This prospective study was conducted on 82 patients admitted with a diagnosis of pancreatitis. Investigations to evaluate PANC-3 and APACHE II were done on all the patients and the PANC-3 and APACHE II scores were calculated. Results The PANC-3 score has a sensitivity of 82.6% and specificity of 77.9%; the test had a Positive Predictive Value (PPV) of 0.59 and Negative Predictive Value (NPV) of 0.92. The sensitivity of APACHE II in predicting SAP was 91.3% and specificity was 96.6%, with a PPV of 0.91 and an NPV of 0.96. Conclusion Our study shows that PANC-3 can be used to predict the severity of pancreatitis as efficiently as APACHE II. The interpretation of PANC-3 does not need expertise and it can be applied at the time of admission, which is an advantage when compared to classical scoring systems.
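The four reported metrics follow directly from a 2x2 table of test results against the reference outcome. The counts below were chosen to be consistent with the PANC-3 figures quoted above (n = 82), but they are an illustrative reconstruction, not the study's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts consistent with the reported PANC-3 figures (n = 82)
m = diagnostic_metrics(tp=19, fp=13, fn=4, tn=46)
```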

  16. [True color accuracy in digital forensic photography].

    PubMed

    Ramsthaler, Frank; Birngruber, Christoph G; Kröll, Ann-Katrin; Kettner, Mattias; Verhoff, Marcel A

    2016-01-01

    Forensic photographs must not only be unaltered and authentic, capture context-relevant images, and meet certain minimum requirements for image sharpness and information density; color accuracy also plays an important role, for instance in the assessment of injuries or taphonomic stages, or in the identification and evaluation of traces from photos. The perception of color not only varies subjectively from person to person, but as a discrete property of an image, color in digital photos is also to a considerable extent influenced by technical factors such as lighting, acquisition settings, camera, and output medium (print, monitor). For these reasons, consistent color accuracy has so far been limited in digital photography. Because images usually contain a wealth of color information, especially for complex or composite colors or shades of color, and the wavelength-dependent sensitivity to factors such as light and shadow may vary between cameras, the usefulness of issuing general recommendations for camera capture settings is limited. Our results indicate that true image colors can best and most realistically be captured with the SpyderCheckr technical calibration tool for digital cameras tested in this study. Apart from aspects such as the simplicity and quickness of the calibration procedure, a further advantage of the tool is that the results are independent of the camera used and can also be used for the color management of output devices such as monitors and printers. The SpyderCheckr color-code patches allow true colors to be captured more realistically than with a manual white balance tool or an automatic flash. We therefore recommend that the use of a color management tool be considered for the acquisition of all images that demand high true color accuracy (in particular in the setting of injury documentation).

  17. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  18. GoPro as an Ethnographic Tool: A Wayfinding Study in an Academic Library

    ERIC Educational Resources Information Center

    Kinsley, Kirsten M.; Schoonover, Dan; Spitler, Jasmine

    2016-01-01

    In this study, researchers sought to capture students' authentic experience of finding books in the main library using a GoPro camera and the think-aloud protocol. The GoPro provided a first-person perspective and was an effective ethnographic tool for observing a student's individual experience, while also demonstrating what tools they use to…

  19. Parents' and Service Providers' Perceptions of the Family Goal Setting Tool: A Pilot Study

    ERIC Educational Resources Information Center

    Rodger, Sylvia; O'Keefe, Amy; Cook, Madonna; Jones, Judy

    2012-01-01

    Background: This qualitative study describes parents' and service providers' experiences in using the Family Goal Setting Tool (FGST). This article looks specifically at the tool's perceived clinical utility during annual, collaborative goal setting. Methods: Participants included eight parents and ten service providers involved in a Family and…

  20. BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & ...

  1. Experience of Integrating Various Technological Tools into the Study and Future Teaching of Mathematics Education Students

    ERIC Educational Resources Information Center

    Gorev, Dvora; Gurevich-Leibman, Irina

    2015-01-01

    This paper presents our experience of integrating technological tools into our mathematics teaching (in both disciplinary and didactic courses) for student-teachers. In the first cycle of our study, a variety of technological tools were used (e.g., dynamic software, hypertexts, video and applets) in teaching two disciplinary mathematics courses.…

  2. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    ERIC Educational Resources Information Center

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  3. Tools to study and manage grazing behavior at multiple scales to enhance the sustainability of livestock

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Free-ranging animal behavior is a multifaceted and complex phenomenon within rangeland ecology that must be understood and ultimately managed. Improving behavioral studies requires tools appropriate for use at the landscape scale. Though tools alone do not assure research will generate accurate in...

  4. U.S. CASE STUDIES USING MUNICIPAL SOLID WASTE DECISION SUPPORT TOOL

    EPA Science Inventory

    The paper provides an overview of some case studies using the recently completed municipal solid waste decision support tool (MSW-DST) in communities across the U.S. The purpose of the overview is to help illustrate the variety of potential applications of the tool. The methodolo...

  5. Dietary Adherence Monitoring Tool for Free-living, Controlled Feeding Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To devise a dietary adherence monitoring tool for use in controlled human feeding trials involving free-living study participants. Methods: A scoring tool was devised to measure and track dietary adherence for an 8-wk randomized trial evaluating the effects of two different dietary patter...

  6. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    ERIC Educational Resources Information Center

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  7. Intravital microscopy as a tool to study drug delivery in preclinical studies

    PubMed Central

    Amornphimoltham, Panomwat; Masedunskas, Andrius; Weigert, Roberto

    2010-01-01

    The technical developments in the field of non-linear microscopy have made intravital microscopy one of the most successful techniques for studying physiological and pathological processes in live animals. Intravital microscopy has been utilized to address many biological questions in basic research and is now a fundamental tool for preclinical studies, with an enormous potential for clinical applications. The ability to dynamically image cellular and subcellular structures combined with the possibility to perform longitudinal studies have empowered investigators to use this discipline to study the mechanisms of action of therapeutic agents and assess the efficacy on their targets in vivo. The goal of this review is to provide a general overview of the recent advances in intravital microscopy and to discuss some of its applications in preclinical studies. PMID:20933026

  8. TSALLIS STATISTICS AS A TOOL FOR STUDYING INTERSTELLAR TURBULENCE

    SciTech Connect

    Esquivel, A.; Lazarian, A. E-mail: lazarian@astro.wisc.ed

    2010-02-10

    We used magnetohydrodynamic (MHD) simulations of interstellar turbulence to study the probability distribution functions (PDFs) of increments of density, velocity, and magnetic field. We found that the PDFs are well described by a Tsallis distribution, following the same general trends found in solar wind and electron MHD studies. We found that the PDFs of density are very different in subsonic and supersonic turbulence. In order to extend this work to ISM observations, we studied maps of column density obtained from three-dimensional MHD simulations. From the column density maps, we found the parameters that fit to Tsallis distributions and demonstrated that these parameters vary with the sonic and Alfven Mach numbers of turbulence. This opens avenues for using Tsallis distributions to study the dynamical and perhaps magnetic states of interstellar gas.
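The Tsallis (q-Gaussian) distribution referred to above generalizes the Gaussian with a parameter q that controls tail weight, which is what makes it a useful fitting form for PDFs of turbulent increments. A minimal sketch of its unnormalized density (q and beta values hypothetical):

```python
def q_gaussian_unnormalized(x, q, beta):
    """Unnormalized Tsallis q-Gaussian: [1 - (1 - q) * beta * x^2]^(1/(1-q)).
    As q -> 1 this tends to the Gaussian kernel exp(-beta * x^2)."""
    base = 1.0 - (1.0 - q) * beta * x * x
    if base <= 0.0:        # outside the compact support when q < 1
        return 0.0
    return base ** (1.0 / (1.0 - q))

# For q > 1 the tails are heavier than the (near-)Gaussian case
p_near_gauss = q_gaussian_unnormalized(2.0, q=1.001, beta=1.0)
p_heavy_tail = q_gaussian_unnormalized(2.0, q=1.5, beta=1.0)
```

Fitting q and beta to the measured PDFs of increments is what lets the fit parameters trace the sonic and Alfven Mach numbers of the turbulence.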

  9. A high accuracy femto-/picosecond laser damage test facility dedicated to the study of optical thin films

    NASA Astrophysics Data System (ADS)

    Mangote, B.; Gallais, L.; Zerrad, M.; Lemarchand, F.; Gao, L. H.; Commandré, M.; Lequime, M.

    2012-01-01

    A laser damage test facility delivering pulses from 100 fs to 3 ps and designed to operate at 1030 nm is presented. Details of its implementation and performance are given. The originality of this system lies in the online damage detection system based on Nomarski microscopy and in the use of a non-conventional energy detection method based on a cooled CCD, which offers the possibility of obtaining the laser-induced damage threshold (LIDT) with high accuracy. Applications of this instrument to the study of thin films under laser irradiation are presented. In particular, the deterministic behavior of sub-picosecond damage is investigated in the case of fused silica and oxide films. It is demonstrated that the transition from 0 to 1 damage probability is very sharp and the LIDT is perfectly deterministic at a few hundred femtoseconds. The damage process in dielectric materials being the result of electronic processes, specific information such as the material bandgap is needed for the interpretation of results and the application of scaling laws. A review of the different approaches for estimating the absorption gap of optical dielectric coatings is conducted, and the results given by the different methods are compared and discussed. The LIDT and gap of several oxide materials are then measured with the presented instrument: Al2O3, Nb2O5, HfO2, SiO2, Ta2O5, and ZrO2. The obtained relation between the LIDT and the gap at 1030 nm confirms the linear evolution of the threshold with the bandgap that exists at 800 nm, and our work expands the number of tested materials.

  10. Positional Accuracy Assessment of the OpenStreetMap Buildings Layer Through Automatic Homologous Pairs Detection: the Method and a Case Study

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.; Zamboni, G.

    2016-06-01

    OpenStreetMap (OSM) is currently the largest openly licensed collection of geospatial data. As OSM is increasingly exploited in a variety of applications, research has placed great attention on the assessment of its quality. This work focuses on assessing the quality of OSM buildings. While most of the studies available in the literature are limited to the evaluation of OSM building completeness, this work proposes an original approach to assess the positional accuracy of OSM buildings based on comparison with a reference dataset. The comparison relies on a quasi-automated detection of homologous pairs in the two datasets. Based on the homologous pairs found, warping algorithms such as affine transformations and multi-resolution splines can be applied to the OSM buildings to generate a new version having an optimal local match to the reference layer. A quality assessment of the OSM buildings of Milan Municipality (Northern Italy), covering an area of about 180 km2, is then presented. After computing some measures of completeness, the algorithm based on homologous points is run using the building layer of the official vector cartography of Milan Municipality as the reference dataset. Approximately 100,000 homologous points are found, which show a systematic translation of about 0.4 m in both the X and Y directions and a mean distance of about 0.8 m between the datasets. Besides its efficiency and high degree of automation, the algorithm generates a warped version of the OSM buildings which, having by definition a closer match to the reference buildings, can eventually be integrated in the OSM database.
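Once homologous pairs are found, a 2-D affine transform can be fitted to them by least squares. A minimal self-contained sketch (hypothetical point pairs, not the authors' implementation):

```python
def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src -> dst point pairs.
    Returns the two rows (a, b, tx) and (c, d, ty) of the transform."""
    def solve3(rows, rhs):
        # Gauss-Jordan elimination with partial pivoting for a 3x3 system
        m = [r[:] + [v] for r, v in zip(rows, rhs)]
        for i in range(3):
            piv = max(range(i, 3), key=lambda r: abs(m[r][i]))
            m[i], m[piv] = m[piv], m[i]
            for r in range(3):
                if r != i:
                    f = m[r][i] / m[i][i]
                    m[r] = [mr - f * mi for mr, mi in zip(m[r], m[i])]
        return [m[i][3] / m[i][i] for i in range(3)]

    # Accumulate the normal equations A^T A p = A^T b for each output coordinate
    ata = [[0.0] * 3 for _ in range(3)]
    atx = [0.0] * 3
    aty = [0.0] * 3
    for (x, y), (u, v) in zip(src, dst):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                ata[i][j] += row[i] * row[j]
            atx[i] += row[i] * u
            aty[i] += row[i] * v
    return solve3(ata, atx), solve3(ata, aty)

# Hypothetical homologous pairs: dst is src shifted by (0.4, 0.4), mimicking
# the systematic translation reported above
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
dst = [(x + 0.4, y + 0.4) for x, y in src]
row_u, row_v = fit_affine(src, dst)
```

For a pure translation the fitted rows reduce to (1, 0, tx) and (0, 1, ty); the residuals after warping are what quantify the local positional accuracy.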

  11. Hepcidin detects iron deficiency in Sri Lankan adolescents with a high burden of hemoglobinopathy: A diagnostic test accuracy study

    PubMed Central

    Wray, Katherine; Allen, Angela; Evans, Emma; Fisher, Chris; Premawardhena, Anuja; Perera, Lakshman; Rodrigo, Rexan; Goonathilaka, Gayan; Ramees, Lebbe; Webster, Craig; Armitage, Andrew E; Prentice, Andrew M

    2017-01-01

    Abstract Anemia affects over 800 million women and children globally. Measurement of hepcidin as an index of iron status shows promise, but its diagnostic performance where hemoglobinopathies are prevalent is unclear. We evaluated the performance of hepcidin as a diagnostic test of iron deficiency in adolescents across Sri Lanka. We selected 2273 samples from a nationally representative cross-sectional study of 7526 secondary schoolchildren across Sri Lanka and analyzed associations between hepcidin and participant characteristics, iron indices, inflammatory markers, and hemoglobinopathy states. We evaluated the diagnostic accuracy of hepcidin as a test for iron deficiency with estimation of the AUCROC, sensitivity/specificity at each hepcidin cutoff, and calculation of the Youden Index to find the optimal threshold. Hepcidin was associated with ferritin, sTfR, and hemoglobin. The AUCROC for hepcidin as a test of iron deficiency was 0.78; hepcidin outperformed Hb and sTfR. The Youden index-predicted cutoff to detect iron deficiency (3.2 ng/mL) was similar to thresholds previously identified to predict iron utilization and identify deficiency in African populations. Neither age, sex, nor α- or β-thalassemia trait affected the diagnostic properties of hepcidin. Hepcidin pre-screening would prevent most iron-replete thalassemia carriers from receiving iron whilst still ensuring that most iron-deficient children were supplemented. Our data indicate that the physiological relationship between hepcidin and iron status transcends specific populations. Measurement of hepcidin in individuals or populations could establish the need for iron interventions. PMID:27883199
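The Youden index used in this study picks the cutoff maximizing sensitivity + specificity - 1 along the ROC curve. A minimal sketch with hypothetical hepcidin values, treating low values as test-positive for iron deficiency as in the paper:

```python
def youden_optimal_cutoff(scores, labels):
    """Cutoff maximizing the Youden index J = sensitivity + specificity - 1.
    `labels` are 1 for truly deficient (positive), 0 otherwise; scores at
    or below the cutoff count as test-positive."""
    best = (-1.0, None)
    for cut in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s <= cut and l == 1)
        fn = sum(1 for s, l in zip(scores, labels) if s > cut and l == 1)
        tn = sum(1 for s, l in zip(scores, labels) if s > cut and l == 0)
        fp = sum(1 for s, l in zip(scores, labels) if s <= cut and l == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best[0]:
            best = (j, cut)
    return best  # (J, cutoff)

# Hypothetical hepcidin values (ng/mL) with iron-deficiency labels
hepcidin = [1.0, 2.5, 3.0, 4.0, 6.0, 8.0, 12.0, 15.0]
deficient = [1, 1, 1, 0, 0, 0, 1, 0]
j, cutoff = youden_optimal_cutoff(hepcidin, deficient)
```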

  12. Diagnostic Accuracy Study of Intraoperative and Perioperative Serum Intact PTH Level for Successful Parathyroidectomy in 501 Secondary Hyperparathyroidism Patients

    PubMed Central

    Zhang, Lina; Xing, Changying; Shen, Chong; Zeng, Ming; Yang, Guang; Mao, Huijuan; Zhang, Bo; Yu, Xiangbao; Cui, Yiyao; Sun, Bin; Ouyang, Chun; Ge, Yifei; Jiang, Yao; Yin, Caixia; Zha, Xiaoming; Wang, Ningning

    2016-01-01

    Parathyroidectomy (PTX) is an effective treatment for severe secondary hyperparathyroidism (SHPT); however, persistent SHPT may occur because of supernumerary and ectopic parathyroids. Here a diagnostic accuracy study of intraoperative and perioperative serum intact parathyroid hormone (iPTH) was performed to predict successful surgery in 501 patients who received total PTX + autotransplantation without thymectomy. Serum iPTH values before incision (io-iPTH0), 10 and 20 min after removing the last parathyroid (io-iPTH10, io-iPTH20), and on the first and fourth days after PTX (D1-iPTH, D4-iPTH) were recorded. Patients whose serum iPTH was >50 pg/mL at the first postoperative week were followed up within six months. PTX was defined as successful if iPTH was <300 pg/mL; otherwise the case was regarded as persistent SHPT. Successful PTX was achieved in 86.4% of patients, 9.8% remained persistent SHPT, and 3.8% were undetermined. Intraoperative serum iPTH demonstrated no significant differences between the two subgroups with or without chronic hepatitis. Receiver operating characteristic (ROC) curves showed that a decline in io-iPTH20 of >88.9% could predict successful PTX (area under the curve [AUC] 0.909, sensitivity 78.6%, specificity 88.5%), thereby avoiding unnecessary exploration and reducing operative complications. D4-iPTH >147.4 pg/mL could predict persistent SHPT (AUC 0.998, sensitivity 100%, specificity 99.5%), so that medical intervention or reoperation can start in a timely manner. PMID:27231027

  13. BACs as tools for the study of genomic imprinting.

    PubMed

    Tunster, S J; Van De Pette, M; John, R M

    2011-01-01

    Genomic imprinting in mammals results in the expression of genes from only one parental allele. Imprinting occurs as a consequence of epigenetic marks set down either in the father's or the mother's germ line and affects a very specific category of mammalian gene. A greater understanding of this distinctive phenomenon can be gained from studies using large genomic clones, called bacterial artificial chromosomes (BACs). Here, we review the important applications of BACs to imprinting research, covering physical mapping studies and the use of BACs as transgenes in mice to study gene expression patterns, to identify imprinting centres, and to isolate the consequences of altered gene dosage. We also highlight the significant and unique advantages that rapid BAC engineering brings to genomic imprinting research.

  14. Developing a Social Autopsy Tool for Dengue Mortality: A Pilot Study

    PubMed Central

    Arauz, María José; Ridde, Valéry; Hernández, Libia Milena; Charris, Yaneth; Carabali, Mabel; Villar, Luis Ángel

    2015-01-01

    Background Dengue fever is a public health problem in the tropical and sub-tropical world. Dengue cases have grown dramatically in recent years as well as dengue mortality. Colombia has experienced periodic dengue outbreaks with numerous dengue related-deaths, where the Santander department has been particularly affected. Although social determinants of health (SDH) shape health outcomes, including mortality, it is not yet understood how these affect dengue mortality. The aim of this pilot study was to develop and pre-test a social autopsy (SA) tool for dengue mortality. Methods and Findings The tool was developed and pre-tested in three steps. First, dengue fatal cases and ‘near misses’ (those who recovered from dengue complications) definitions were elaborated. Second, a conceptual framework on determinants of dengue mortality was developed to guide the construction of the tool. Lastly, the tool was designed and pre-tested among three relatives of fatal cases and six near misses in 2013 in the metropolitan zone of Bucaramanga. The tool turned out to be practical in the context of dengue mortality in Colombia after some modifications. The tool aims to study the social, individual, and health systems determinants of dengue mortality. The tool is focused on studying the socioeconomic position and the intermediary SDH rather than the socioeconomic and political context. Conclusions The SA tool is based on the scientific literature, a validated conceptual framework, researchers’ and health professionals’ expertise, and a pilot study. It is the first time that a SA tool has been created for the dengue mortality context. Our work furthers the study on SDH and how these are applied to neglected tropical diseases, like dengue. This tool could be integrated in surveillance systems to provide complementary information on the modifiable and avoidable death-related factors and therefore, be able to formulate interventions for dengue mortality reduction. PMID:25658485

  15. Case studies: low cost, high-strength, large carbon foam tooling

    SciTech Connect

    Lucas, R.; Danford, H.

    2009-01-15

    A new carbon foam tooling system has been developed that results in a low-cost, high-strength material attractive for the creation of tooling for composite parts. Composites are stronger, lighter, and less subject to corrosion and fatigue than the materials currently used for fabrication of advanced structures. Tools to manufacture these composite parts must be rigid, durable and able to offer a coefficient of thermal expansion (CTE) closely matching that of the composites. Current technology makes it difficult to match the CTE of a composite part in the curing cycle with anything other than a carbon composite or a nickel-iron alloy such as Invar. Fabrication of metallic tooling requires many expensive stages of long duration with a large infrastructure investment. Carbon fiber reinforced polymer resin composite tooling has a shorter lead time but limited production use because of durability concerns. Coal-based carbon foam material has a compatible CTE and strong durability, which make it an attractive alternative for use in tooling. The use of coal-based carbon foam in tooling for carbon composites is advantageous because of its low cost, light weight, machinability, vacuum integrity and compatibility with a wide range of curing processes. Large-scale tooling case studies will be presented detailing carbon foam's potential for tooling applications.

  16. Using Proteomics Bioinformatics Tools and Resources in Proteogenomic Studies.

    PubMed

    Vaudel, Marc; Barsnes, Harald; Ræder, Helge; Berven, Frode S

    Proteogenomic studies ally the omic fields related to gene expression into a combined approach to improve the characterization of biological samples. Part of this consists in mining proteomics datasets for non-canonical sequences of amino acids. These include intergenic peptides, products of mutations, or of RNA editing events hypothesized from genomic, epigenomic, or transcriptomic data. This approach poses new challenges for standard peptide identification workflows. In this chapter, we present the principles behind the use of peptide identification algorithms and highlight the major pitfalls of their application to proteogenomic studies.

  17. The Effect of Delayed-JOLs and Sentence Generation on Children's Monitoring Accuracy and Regulation of Idiom Study

    ERIC Educational Resources Information Center

    van Loon, Mariëtte H.; de Bruin, Anique B. H.; van Gog, Tamara; van Merriënboer, Jeroen J. G.

    2013-01-01

    When studying verbal materials, both adults and children are often poor at accurately monitoring their level of learning and regulating their subsequent restudy of materials, which leads to suboptimal test performance. The present experiment investigated how monitoring accuracy and regulation of study could be improved when learning idiomatic…

  18. Factor Analysis: A Tool for Studying Mathematics Anxiety.

    ERIC Educational Resources Information Center

    McAuliffe, Elizabeth A.; Trueblood, Cecil R.

    Mathematics anxiety and its relationship to other constructs was studied in 138 preservice elementary and special education teachers. The students, primarily women, were enrolled in a variety of professional courses and field experiences. Five instruments were administered, their factor structures were determined, and intercorrelations among the…

  19. Psychological Autopsy Studies as Diagnostic Tools: Are They Methodologically Flawed?

    ERIC Educational Resources Information Center

    Hjelmeland, Heidi; Dieserud, Gudrun; Dyregrov, Kari; Knizek, Birthe L.; Leenaars, Antoon A.

    2012-01-01

    One of the most established "truths" in suicidology is that almost all (90% or more) of those who kill themselves suffer from one or more mental disorders, and a causal link between the two is implied. Psychological autopsy (PA) studies constitute one main evidence base for this conclusion. However, there has been little reflection on the…

  20. Educator Study Groups: A Professional Development Tool to Enhance Inclusion

    ERIC Educational Resources Information Center

    Herner-Patnode, Leah

    2009-01-01

    Professional development can take many forms. The most effective development includes individual educators in the formation and planning process. Educator study groups are one form of professional development that allows major stakeholders in the education process the autonomy to develop individual and group goals. This often translates into an…

  1. The Environmental Quality Index: A Tool for Developmental Outcome Studies.

    ERIC Educational Resources Information Center

    Aylward, Glen P.; And Others

    A socioeconomic status (SES) and social support questionnaire was administered to the families of 559 singleton infants in a National Institute of Health collaborative study conducted at five university medical centers. A subset of six items measuring both SES and social support (which accounted for 87 percent of the common variance) was developed…

  2. Minecraft as a Creative Tool: A Case Study

    ERIC Educational Resources Information Center

    Cipollone, Maria; Schifter, Catherine C.; Moffat, Rick A.

    2014-01-01

    Many scholars are enthusiastic about the potential learning opportunities present in the sandbox-style gaming environment, Minecraft. In the following case study, the authors explored the use of Minecraft in a high school literature class and the presentation of characterization and plot in three student-made machinima, or films made in the game…

  3. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    ERIC Educational Resources Information Center

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using 2 different scoring methods, the authors examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children ages 3;11-5;8 (years;months) participated--17 children with SLI and 17 typically developing…

  4. A Comparative Study of the Variables Used to Measure Syntactic Complexity and Accuracy in Task-Based Research

    ERIC Educational Resources Information Center

    Inoue, Chihiro

    2016-01-01

    The constructs of complexity, accuracy and fluency (CAF) have been used extensively to investigate learner performance on second language tasks. However, a serious concern is that the variables used to measure these constructs are sometimes used conventionally without any empirical justification. It is crucial for researchers to understand how…

  5. The Accuracy of Recidivism Risk Assessments for Sexual Offenders: A Meta-Analysis of 118 Prediction Studies

    ERIC Educational Resources Information Center

    Hanson, R. Karl; Morton-Bourgon, Kelly E.

    2009-01-01

    This review compared the accuracy of various approaches to the prediction of recidivism among sexual offenders. On the basis of a meta-analysis of 536 findings drawn from 118 distinct samples (45,398 sexual offenders, 16 countries), empirically derived actuarial measures were more accurate than unstructured professional judgment for all outcomes…

  6. The accuracy of fine-needle aspiration cytology for diagnosis of parotid gland masses: a clinicopathological study of 114 patients

    PubMed Central

    GUDMUNDSSON, Jens Kristjan; AJAN, Aida; ABTAHI, Jahan

    2016-01-01

    ABSTRACT Objective Fine-needle aspiration cytology is a valuable method for preoperative assessment of head and neck tumors. However, its accuracy in detection of salivary gland masses is controversial compared with other methods. The aim of this work was to evaluate the effectiveness and accuracy of fine-needle aspiration cytology (FNAC) in the diagnosis of parotid gland masses. Material and Methods Over a 10-year period, 126 parotid gland masses were resected. Retrospective chart reviews of 114 patients were performed. The results of FNAC and final histological diagnosis were compared and the accuracy of FNAC was determined. Results Final histological evaluation revealed 11 malignant tumors and 103 benign lesions. Pleomorphic adenoma was the most common neoplasm (63%), followed by Warthin’s tumor (17.5%). The sensitivity of FNAC in detecting malignant tumors was 73% and the specificity was 97%. Positive predictive value (PPV) was 73% and negative predictive value (NPV) was 97%. The overall accuracy of FNAC in detecting parotid masses was 95%. False-negative diagnosis was found in mucoepidermoid carcinoma, acinic cell carcinoma, and epithelial-myoepithelial carcinoma whereas there was false-positive diagnosis in cases of pleomorphic adenoma and normal parotid gland tissue. Conclusion FNAC is a reliable minimally invasive diagnostic method with a high sensitivity in diagnosis of lesions in parotid glands. The sensitivity of detection of malignant tumors in parotid glands was low due to the biopsy technique used, and depended on tumor location. Postoperative complications decreased after superficial parotidectomy. PMID:28076460
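The figures reported in this abstract are consistent with a 2×2 confusion matrix of 8 true positives, 3 false negatives, 3 false positives, and 100 true negatives among the 114 patients (a reconstruction inferred from the stated totals, not values published in the abstract). The standard accuracy metrics then follow directly:

```python
# Reconstructed 2x2 table (11 malignant, 103 benign; cell counts inferred, not reported)
tp, fn = 8, 3     # malignant lesions: FNAC-positive / FNAC-negative
fp, tn = 3, 100   # benign lesions:    FNAC-positive / FNAC-negative

sensitivity = tp / (tp + fn)                 # 8/11    ~ 73%
specificity = tn / (tn + fp)                 # 100/103 ~ 97%
ppv = tp / (tp + fp)                         # 8/11    ~ 73%
npv = tn / (tn + fn)                         # 100/103 ~ 97%
accuracy = (tp + tn) / (tp + tn + fp + fn)   # 108/114 ~ 95%
```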

  7. Disease severity assessment in epidemiological studies: accuracy and reliability of visual estimates of Septoria leaf blotch (SLB) in winter wheat

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The accuracy and reliability of visual assessments of SLB severity by raters (i.e. one plant pathologist with extensive experience and three other raters trained prior to field observations using standard area diagrams and DISTRAIN) was determined by comparison with assumed actual values obtained by...

  8. The effects of relatedness and GxE interaction on prediction accuracies in genomic selection: a study in cassava

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Prior to implementation of genomic selection, an evaluation of the potential accuracy of prediction can be obtained by cross validation. In this procedure, a population with both phenotypes and genotypes is split into training and validation sets. The prediction model is fitted using the training se...

  9. The space elevator: a new tool for space studies.

    PubMed

    Edwards, Bradley C

    2003-06-01

    The objective has been to develop a viable scenario for the construction, deployment and operation of a space elevator using current or near future technology. This effort has been primarily a paper study with several experimental tests of specific systems. Computer simulations, engineering designs, literature studies and inclusion of existing programs have been utilized to produce a design for the first space elevator. The results from this effort illustrate a viable design using current and near-term technology for the construction of the first space elevator. The timeline for possible construction is within the coming decades and estimated costs are less than $10 B. The initial elevator would have a 5 ton/day capacity and operating costs near $100/lb for payloads going to any Earth orbit or traveling to the Moon, Mars, Venus or the asteroids. An operational space elevator would allow for larger and much longer-term biological space studies at selectable gravity levels. The high-capacity and low operational cost of this system would also allow for inexpensive searches for life throughout our solar system and the first tests of environmental engineering. This work is supported by a grant from the NASA Institute for Advanced Concepts (NIAC).

  10. Matrix isolation as a tool for studying interstellar chemical reactions

    NASA Technical Reports Server (NTRS)

    Ball, David W.; Ortman, Bryan J.; Hauge, Robert H.; Margrave, John L.

    1989-01-01

    Since the identification of the OH radical as an interstellar species, over 50 molecular species have been identified as interstellar denizens. While identification of new species appears straightforward, an explanation of their mechanisms of formation is not. Most astronomers concede that large bodies like interstellar dust grains are necessary for adsorption of molecules and dissipation of their energies of reaction, but many of the mechanistic steps are unknown and speculative. It is proposed that data from matrix isolation experiments involving the reactions of refractory materials (especially C, Si, and Fe atoms and clusters) with small molecules (mainly H2, H2O, CO, CO2) are particularly applicable to explaining mechanistic details of likely interstellar chemical reactions. In many cases, matrix isolation techniques are the sole method of studying such reactions; also in many cases, complexations and bond rearrangements yield molecules never before observed. The study of these reactions thus provides a logical basis for the mechanisms of interstellar reactions. A list of reactions is presented that would simulate interstellar chemical reactions. These reactions were studied using FTIR-matrix isolation techniques.

  11. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  12. Refining Ovarian Cancer Test accuracy Scores (ROCkeTS): protocol for a prospective longitudinal test accuracy study to validate new risk scores in women with symptoms of suspected ovarian cancer

    PubMed Central

    Sundar, Sudha; Rick, Caroline; Dowling, Francis; Au, Pui; Rai, Nirmala; Champaneria, Rita; Stobart, Hilary; Neal, Richard; Davenport, Clare; Mallett, Susan; Sutton, Andrew; Kehoe, Sean; Timmerman, Dirk; Bourne, Tom; Van Calster, Ben; Gentry-Maharaj, Aleksandra; Deeks, Jon

    2016-01-01

    Introduction Ovarian cancer (OC) is associated with non-specific symptoms such as bloating, making accurate diagnosis challenging: only 1 in 3 women with OC presents through primary care referral. National Institute for Health and Care Excellence guidelines recommend sequential testing with CA125 and routine ultrasound in primary care. However, these diagnostic tests have limited sensitivity or specificity. Improving accurate triage in women with vague symptoms is likely to improve mortality by streamlining referral and care pathways. The Refining Ovarian Cancer Test Accuracy Scores (ROCkeTS; HTA 13/13/01) project will derive and validate new tests/risk prediction models that estimate the probability of having OC in women with symptoms. This protocol refers to the prospective study only (phase III). Methods and analysis ROCkeTS comprises four parallel phases. The full ROCkeTS protocol can be found at http://www.birmingham.ac.uk/ROCKETS. Phase III is a prospective test accuracy study. The study will recruit 2450 patients from 15 UK sites. Recruited patients complete symptom and anxiety questionnaires, donate a serum sample and undergo ultrasound scored as per International Ovarian Tumour Analysis (IOTA) criteria. Recruitment is at rapid access clinics, emergency departments and elective clinics. Models to be evaluated include those based on ultrasound derived by the IOTA group and novel models derived from analysis of existing data sets. Estimates of sensitivity, specificity, c-statistic (area under the receiver operating characteristic curve), positive predictive value and negative predictive value of the diagnostic tests will be reported, and a calibration plot for the models will be presented. ROCkeTS has received ethical approval from the NHS West Midlands REC (14/WM/1241) and is registered on the controlled trials website (ISRCTN17160843) and the National Institute of Health Research Cancer and Reproductive Health portfolios. PMID:27507231
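The c-statistic named in this protocol is the probability that a randomly chosen case receives a higher model score than a randomly chosen non-case. A minimal rank-based sketch (the Mann-Whitney formulation; function name and example scores are illustrative, not ROCkeTS data):

```python
def c_statistic(case_scores, control_scores):
    """AUC via pairwise comparison: P(score_case > score_control),
    counting ties as one half (Mann-Whitney formulation)."""
    wins = 0.0
    for c in case_scores:
        for n in control_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

print(c_statistic([0.9, 0.8, 0.6], [0.7, 0.4, 0.2]))  # 8 of 9 pairs ordered correctly ~ 0.889
```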

  13. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  14. Test Expectancy Affects Metacomprehension Accuracy

    ERIC Educational Resources Information Center

    Thiede, Keith W.; Wiley, Jennifer; Griffin, Thomas D.

    2011-01-01

    Background: Theory suggests that the accuracy of metacognitive monitoring is affected by the cues used to judge learning. Researchers have improved monitoring accuracy by directing attention to more appropriate cues; however, this is the first study to more directly point students to more appropriate cues using instructions regarding tests and…

  15. Muscle biopsy as a tool in the study of aging.

    PubMed

    Coggan, A R

    1995-11-01

    The needle biopsy procedure provides a minimally invasive means of obtaining small samples of skeletal muscle from human volunteers. Such samples can be used to examine a variety of structural and functional characteristics of muscle, including fiber type and size, capillarization, enzymatic capacities, energy substrate or protein/mRNA concentrations, metabolic responses, and contractile properties. In conjunction with other methods, biopsy sampling can also be used to estimate total muscle mass and fiber number, and to determine rates of protein synthesis and degradation. Optimal handling and storage conditions vary widely, but in general, most of the above measurements can be made using frozen tissue, so that samples can be stored almost indefinitely. The procedure is also safe and generally well-tolerated, making it possible to perform longitudinal studies of the same person. The biopsy technique is therefore well suited for examining the underlying physiological mechanisms responsible for muscle wasting in the elderly, as well as for assessing the effects of nutritional, hormonal, and/or lifestyle (e.g., exercise) interventions intended to combat this problem. Although sample size limitations have been largely overcome by the development of microtechniques, more information is needed on how to minimize the variability introduced by studying only a small fraction of the whole muscle. Studies are also required to determine whether it is sufficient to biopsy only one muscle (and if so, which is optimal), or whether there are differential effects of aging in various muscle groups that would preclude extrapolating from one muscle to all muscles in the body.(ABSTRACT TRUNCATED AT 250 WORDS)

  16. Advanced study techniques: tools for HVDC systems design

    SciTech Connect

    Degeneff, R.C.

    1984-01-01

    High voltage direct current (HVDC) transmission systems, which offer functional as well as environmental and economic advantages, could see a 15% growth rate over the next decade. Design studies of HVDC system components are complicated by the need to cover 11 major elements: power system, insulation coordination, filter design, subsynchronous torsional interaction, circuit breaker requirements, power line carrier and radio interference, electric fields and audible noise, protective relaying, availability and reliability, efficiency, equipment specification, and HVDC simulator and Transient Network Analyzers. The author summarizes and illustrates each element. 6 figures, 1 table.

  17. Formaldehyde Crosslinking: A Tool for the Study of Chromatin Complexes*

    PubMed Central

    Hoffman, Elizabeth A.; Frey, Brian L.; Smith, Lloyd M.; Auble, David T.

    2015-01-01

    Formaldehyde has been used for decades to probe macromolecular structure and function and to trap complexes, cells, and tissues for further analysis. Formaldehyde crosslinking is routinely employed for detection and quantification of protein-DNA interactions, interactions between chromatin proteins, and interactions between distal segments of the chromatin fiber. Despite widespread use and a rich biochemical literature, important aspects of formaldehyde behavior in cells have not been well described. Here, we highlight features of formaldehyde chemistry relevant to its use in analyses of chromatin complexes, focusing on how its properties may influence studies of chromatin structure and function. PMID:26354429

  18. NANIVID: A New Research Tool for Tissue Microenvironment Studies

    NASA Astrophysics Data System (ADS)

    Raja, Waseem K.

    Metastatic tumors are heterogeneous in nature and composed of subpopulations of cells having various metastatic potentials. The time progression of a tumor creates a unique microenvironment to improve the invasion capabilities and survivability of cancer cells in different microenvironments. In the early stages of intravasation, cancer cells establish communication with other cell types through a paracrine loop and cover long distances by sensing growth factor gradients through extracellular matrices. Cellular migration both in vitro and in vivo is a complex process, and to understand motility in depth, sophisticated techniques are required to document and record events in real time. This study presents the design and optimization of a new versatile chemotaxis device called the NANIVID (NANo IntraVital Imaging Device), developed using advanced nano/microfabrication techniques. The current version of this device has been demonstrated to form a stable epidermal growth factor (EGF) gradient in vitro (2D and 3D), while a miniaturized NANIVID is used as an implantable device for intravital studies of chemotaxis and to collect cells in vivo. The device is fabricated using microfabrication techniques in which two substrates are bonded together using a thin polymer layer, creating a bonded device with one point-source (approximately 150 µm x 50 µm) outlet. The main structures of the device consist of two transparent substrates: one with etched chambers and a channel, while the second carries a microelectrode system to measure real-time cell arrival inside the device. The chamber of the device is loaded with a growth factor reservoir consisting of hydrogel to sustain a steady release of growth factor into the surrounding environment for long periods of time, establishing a concentration gradient from the device. The focus of this study was to design and optimize the new device for chemotaxis studies in breast cancer cells in cell culture.

  19. Identity method-a new tool for studying chemical fluctuations

    SciTech Connect

    Mackowiak, M.

    2012-06-15

    Event-by-event fluctuations of the chemical composition of the hadronic system produced in nuclear collisions are believed to be sensitive to properties of the transition between confined and deconfined strongly interacting matter. In this paper a new technique for the study of chemical fluctuations, the identity method, is introduced and its features are discussed. The method is tested using data on central PbPb collisions at 40 A GeV registered by the NA49 experiment at the CERN SPS.

  20. Value of systematic detection of physical child abuse at emergency rooms: a cross-sectional diagnostic accuracy study

    PubMed Central

    Sittig, Judith S; Uiterwaal, Cuno S P M; Moons, Karel G M; Russel, Ingrid M B; Nievelstein, Rutger A J; Nieuwenhuis, Edward E S; van de Putte, Elise M

    2016-01-01

    Objectives The aim of our diagnostic accuracy study Child Abuse Inventory at Emergency Rooms (CHAIN-ER) was to establish whether a widely used checklist accurately detects or excludes physical abuse among children presenting to ERs with physical injury. Design A large multicentre study with a 6-month follow-up. Setting 4 ERs in The Netherlands. Participants 4290 children aged 0–7 years attending the ER because of physical injury. All children were systematically tested with an easy-to-use child abuse checklist (index test). A national expert panel (reference standard) retrospectively assessed all children with positive screens and a 15% random sample of the children with negative screens for physical abuse, using additional information, namely, an injury history taken by a paediatrician, information provided by the general practitioner, youth doctor and social services by structured questionnaires, and 6-month follow-up information. Main outcome measure Physical child abuse. Secondary outcome measure Injury due to neglect and need for help. Results 4253/4290 (99%) parents agreed to follow-up. At a prevalence of 0.07% (3/4253) for inflicted injury by expert panel decision, the positive predictive value of the checklist was 0.03 (95% CI 0.006 to 0.085), and the negative predictive value 1.0 (0.994 to 1.0). There was 100% (93 to 100) agreement about inflicted injury in children with positive screens between the expert panel and child abuse experts. Conclusions Rare cases of inflicted injury among preschool children presenting at ERs for injury are very likely captured by easy-to-use checklists, but at very high false-positive rates. Subsequent assessment by child abuse experts can be safely restricted to children with positive screens, at very low risk of missing cases of inflicted injury. Because of the high false-positive rate, we advise careful prior consideration of cost-effectiveness and clinical and societal implications before de novo implementation.
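The near-zero PPV reported above is a direct consequence of Bayes' rule at such a low prevalence. A sketch with assumed test characteristics: the abstract does not report the checklist's sensitivity and specificity, so the values 1.00 and 0.977 below are illustrative assumptions chosen to reproduce the reported PPV of about 0.03.

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value from Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Prevalence reported in the abstract: 3 cases of inflicted injury among 4253 children
p = 3 / 4253
print(ppv(1.00, 0.977, p))  # ~0.03: even a near-perfect checklist yields mostly false positives
```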

  1. Sphingolipidomics: An Important Mechanistic Tool for Studying Fungal Pathogens

    PubMed Central

    Singh, Ashutosh; Del Poeta, Maurizio

    2016-01-01

    Sphingolipids form of a unique and complex group of bioactive lipids in fungi. Structurally, sphingolipids of fungi are quite diverse with unique differences in the sphingoid backbone, amide linked fatty acyl chain and the polar head group. Two of the most studied and conserved sphingolipid classes in fungi are the glucosyl- or galactosyl-ceramides and the phosphorylinositol containing phytoceramides. Comprehensive structural characterization and quantification of these lipids is largely based on advanced analytical mass spectrometry based lipidomic methods. While separation of complex lipid mixtures is achieved through high performance liquid chromatography, the soft – electrospray ionization tandem mass spectrometry allows a high sensitivity and selectivity of detection. Herein, we present an overview of lipid extraction, chromatographic separation and mass spectrometry employed in qualitative and quantitative sphingolipidomics in fungi. PMID:27148190

  2. Bromodeoxyuridine-labeled oligonucleotides as tools for oligonucleotide uptake studies.

    PubMed

    Maszewska, Maria; Kobylańska, Anna; Gendaszewska-Darmach, Edyta; Koziołkiewicz, Maria

    2002-12-01

    The mechanisms by which various oligonucleotides (ODNs) and their analogs enter cells are not fully understood. A common technique used in studies on cellular uptake of ODNs is their conjugation with fluorochromes. However, fluorescently labeled ODNs may vary from the parent compounds in charge and hydrophilicity, and they may interact differently with some components of cellular membranes. In this report, we present an alternative method based on the immunofluorescent detection of ODNs with incorporated 5-bromo-2'-deoxyuridine (BrdUrd). Localization of BrdUrd-modified ODNs has been achieved using FITC-labeled anti-BrdUrd antibodies. This technique allowed determination of the differences in cellular uptake of phosphodiester (PO) and phosphorothioate (PS) ODNs and their derivatives conjugated with cholesterol and menthol. The immunocytochemical method also has shown that the cellular uptake of some ODNs may be influenced by specific sequences that are responsible for the formation of higher-order structures.

  3. Photoemission Electron Microscopy as a Tool for Studying Steel Grains

    NASA Astrophysics Data System (ADS)

    Roese, Peter; Keutner, Christoph; Berges, Ulf; Espeter, Philipp; Westphal, Carsten

    2017-01-01

    Key properties of steel, such as stability, weldability, or the ability to absorb deformation energy, are defined by its grain structure. Knowledge of the micrometer and submicrometer structure is of particular interest for tailoring macroscopic steel properties. We report on photoemission electron microscopy (PEEM) studies, which in principle yield a higher magnification than comparable optical techniques. A flat surface without any topographic features was obtained by applying a non-etching preparation procedure. PEEM images showed tiny phase islands embedded within a steel phase matrix. Furthermore, we developed an analysis procedure for PEEM images of dual-phase steels. As a result, it is possible to identify the individual work functions of the different steel phases at the surface.

  5. Tools for Studying Electron and Spin Transport in Single Molecules

    NASA Astrophysics Data System (ADS)

    Ralph, Daniel C.

    2005-03-01

    Experiments in the field of single-molecule electronics are challenging in part because it can be very difficult to control and characterize the device structure. Molecules contacted by metal electrodes cannot easily be imaged by microscopy techniques. Moreover, if one attempts to characterize the device structure simply by measuring a current-voltage curve, it is easy to mistake nonlinear transport across a bare tunnel junction or a metallic short for a molecular signal. I will discuss the development of a set of experimental test structures that enable the properties of a molecular device to be tuned controllably in-situ, so that the transport mechanisms can be studied more systematically and compared with theoretical predictions. My collaborators and I are developing the means to use several different types of such experimental "knobs" in coordination: electrostatic gating to shift the energy levels in a molecule, mechanical motion to adjust the molecular configuration or the molecule-electrode coupling strength, illumination with light to promote electrons to excited states or to make and break chemical bonds, and the use of ferromagnetic electrodes to study spin-polarized transport. Our work so far has provided new insights into Kondo physics, the coupling between a molecule's electronic and mechanical degrees of freedom, and spin transport through a molecule between magnetic electrodes. Collaborators: Radek Bialczak, Alex Champagne, Luke Donev, Jonas Goldsmith, Jacob Grose, Janice Guikema, Jiwoong Park, Josh Parks, Abhay Pasupathy, Jason Petta, Sara Slater, Burak Ulgut, Alexander Soldatov, Héctor Abruña, and Paul McEuen.

  6. Using a Software Tool in Forecasting: a Case Study of Sales Forecasting Taking into Account Data Uncertainty

    NASA Astrophysics Data System (ADS)

    Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter

    2016-10-01

    Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, quality of input data, time period and other factors. The input data are usually not deterministic but often of a random nature, affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. We present a forecasting approach and, using a case study model, simulate the impact of uncertain input parameters on the target forecast value. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variables impact analysis.
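
    The uncertainty-aware forecasting described above can be illustrated with a minimal Monte Carlo sketch. The abstract does not describe the tool's internals; the triangular input distributions and all parameter values below are illustrative assumptions, not data from the study:

    ```python
    import random

    random.seed(42)

    def simulate_sales_forecast(n_trials=10_000):
        """Monte Carlo sales forecast: each uncertain input is drawn from a
        triangular distribution (low, high, mode) instead of a point estimate."""
        totals = []
        for _ in range(n_trials):
            units = random.triangular(900, 1500, 1100)   # units sold (assumed)
            price = random.triangular(9.0, 12.0, 10.0)   # unit price (assumed)
            totals.append(units * price)
        totals.sort()
        mean = sum(totals) / n_trials
        # 5th/95th percentiles bound the forecast instead of a single number
        p5, p95 = totals[int(0.05 * n_trials)], totals[int(0.95 * n_trials)]
        return mean, p5, p95

    mean, p5, p95 = simulate_sales_forecast()
    print(f"expected revenue: {mean:,.0f}, 90% interval [{p5:,.0f}, {p95:,.0f}]")
    ```

    Reporting an interval rather than a point forecast is what lets the downstream risk and sensitivity analyses quantify how input uncertainty propagates to the target value.
    
    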

  7. Improvement of focus accuracy on processed wafer

    NASA Astrophysics Data System (ADS)

    Higashibata, Satomi; Komine, Nobuhiro; Fukuhara, Kazuya; Koike, Takashi; Kato, Yoshimitsu; Hashimoto, Kohji

    2013-04-01

    As feature size shrinkage in semiconductor devices progresses, process fluctuation, especially focus, strongly affects device performance. Because focus control is an ongoing challenge in optical lithography, various studies have sought to improve focus monitoring and control. Focus errors are due to wafers, exposure tools, reticles, QCs, and so on. Few studies have addressed minimizing the measurement errors of the auto focus (AF) sensors of an exposure tool, especially when processed wafers are exposed. Among current focus measurement techniques, the phase shift grating (PSG) focus monitor has already been proposed; its basic principle is that the intensity of the diffraction light of the mask pattern is made asymmetric by arranging a π/2 phase shift area on a reticle. The resist pattern exposed at a defocus position is shifted on the wafer, and the shifted pattern can easily be measured using an overlay inspection tool. However, it is difficult to measure the shifted pattern on a processed wafer because of interruptions caused by other patterns in the underlayer. In this paper, we therefore propose the "SEM-PSG" technique, in which the shift of the PSG resist mark is measured with a critical dimension-scanning electron microscope (CD-SEM) to determine the focus error on the processed wafer. First, we evaluate the accuracy of the SEM-PSG technique. Second, by applying the SEM-PSG technique and feeding the results back to the exposure, we evaluate the focus accuracy on processed wafers. By applying SEM-PSG feedback, the focus accuracy on the processed wafer was improved from 40 to 29 nm in 3σ.

  8. In vitro study of accuracy of cervical pedicle screw insertion using an electronic conductivity device (ATPS part III).

    PubMed

    Koller, Heiko; Hitzl, Wolfgang; Acosta, Frank; Tauber, Mark; Zenner, Juliane; Resch, Herbert; Yukawa, Yasutsugu; Meier, Oliver; Schmidt, Rene; Mayer, Michael

    2009-09-01

    Reconstruction of the highly unstable, anteriorly decompressed cervical spine poses biomechanical challenges to current stabilization strategies, including circumferential instrumented fusion, to prevent failure. To avoid secondary posterior surgery, particularly in the elderly population, while increasing the primary construct rigidity of anterior-only reconstructions, the authors introduced the concept of anterior transpedicular screw (ATPS) fixation and plating. We demonstrated its morphological feasibility, its superior biomechanical pull-out characteristics compared with vertebral body screws, and the accuracy of inserting ATPS using a manual fluoroscopically assisted technique. Although accuracy was high, showing non-critical breaches in the axial and sagittal plane in 78 and 96% of cases, further research was indicated to refine the technique and increase accuracy. In light of a first clinical case series, the authors analyzed the impact of using an electronic conductivity device (ECD, PediGuard) on the accuracy of ATPS insertion. As experience so far exists only in thoracolumbar surgery, the versatility of the ECD was also assessed for posterior cervical pedicle screw fixation (pCPS). 30 ATPS and 30 pCPS were inserted alternately into the C3-T1 vertebrae of five fresh-frozen specimens. Fluoroscopic assistance was used only for entry point selection; pedicle tract preparation was done using the ECD. Preoperative CT scans were assessed for sclerosis at the pedicle entrance or core, and vertebrae with dense pedicles were excluded. Pre- and postoperative reconstructed CT scans were analyzed for pedicle screw positions according to a previously established grading system. Statistical analysis revealed an astonishingly high accuracy for the ATPS group, with no critical screw position (0%) in the axial or sagittal plane. In the pCPS group, 88.9% of the screws inserted showed non-critical screw positions, while 11.1% showed critical pedicle perforations. 
The usage of an ECD for posterior and

  9. Understanding FRET as a Research Tool for Cellular Studies

    PubMed Central

    Shrestha, Dilip; Jenei, Attila; Nagy, Péter; Vereb, György; Szöllősi, János

    2015-01-01

    Communication of molecular species through dynamic association and/or dissociation at various cellular sites governs biological functions. Understanding these physiological processes requires delineation of molecular events at the level of individual complexes in a living cell. Among the few non-invasive approaches with nanometer resolution are methods based on Förster Resonance Energy Transfer (FRET). FRET is effective at a distance of 1–10 nm, which is equivalent to the size of macromolecules, thus providing an unprecedented level of detail on molecular interactions. The emergence of fluorescent proteins and SNAP- and CLIP-tag proteins provided FRET with the capability to monitor changes in a molecular complex in real time, making it possible to establish the functional significance of the studied molecules in a native environment. FRET is now widely used in the biological sciences, including the fields of proteomics, signal transduction, diagnostics and drug development, to address questions almost unimaginable with biochemical methods and conventional microscopies. However, the underlying physics of FRET often scares biologists. Therefore, in this review, our goal is to introduce FRET to non-physicists in a lucid manner. We also discuss our contributions to various FRET methodologies based on microscopy and flow cytometry, while describing their application for determining the molecular heterogeneity of the plasma membrane in various cell types. PMID:25815593

  10. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic changes and the wildlife-livestock interface have led to the emergence of novel viral pathogens and zoonoses that have become a serious concern for avian, animal and human health. High biodiversity and bird migration facilitate the spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed next-generation sequencing (NGS) approaches provide culture-independent methods that are useful for understanding viral diversity and discovering novel viruses, thereby enabling better diagnosis and disease control. This review discusses the possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses, such as picobirnavirus, picornavirus, orthoreovirus and an avian gammacoronavirus associated with fulminating disease in guinea fowl, and is also used to describe viral diversity among avian species. The review also briefly discusses viral-host interactions and disease causalities associated with newly identified avian viruses.

  11. Bench study of the accuracy of a commercial AED arrhythmia analysis algorithm in the presence of electromagnetic interferences.

    PubMed

    Jekova, Irena; Krasteva, Vessela; Ménétré, Sarah; Stoyanov, Todor; Christov, Ivaylo; Fleischhackl, Roman; Schmid, Johann-Jakob; Didon, Jean-Philippe

    2009-07-01

    This paper presents a bench study of a commercial automated external defibrillator (AED). The objective was to evaluate the performance of the defibrillation advisory system and its robustness against electromagnetic interferences (EMI) with central frequencies of 16.7, 50 and 60 Hz. The shock advisory system uses two 50 and 60 Hz band-pass filters, an adaptive filter to identify and suppress 16.7 Hz interference, and a software technique for arrhythmia analysis based on morphology and frequency ECG parameters. The testing process used noise-free ECG strips from the internationally recognized MIT-VFDB ECG database that were superimposed with simulated EMI artifacts and supplied to the shock advisory system embedded in a real AED. Measurements under special consideration of the allowed variation of EMI frequency (15.7-17.4, 47-52, 58-62 Hz) and amplitude (1 and 8 mV) were performed to optimize external validity. Accuracy was reported using the American Heart Association (AHA) recommendations for arrhythmia analysis performance. For artifact-free signals, the AHA performance goals were exceeded for both sensitivity and specificity: 99% for ventricular fibrillation (VF), 98% for rapid ventricular tachycardia (VT), 90% for slow VT, 100% for normal sinus rhythm, 100% for asystole and 99% for other non-shockable rhythms. In the presence of EMI, the specificity for some non-shockable rhythms (NSR, N) may be affected in specific cases of a low signal-to-noise ratio and extreme frequencies, reducing specificity by no more than 7 percentage points. The specificity for asystole and the sensitivity for VF and rapid VT in the presence of any kind of simulated 16.7, 50 or 60 Hz EMI artifact were shown to remain equivalent to the levels required for noise-free signals. 
In conclusion, we proved that the shock advisory system working in a real AED operates accurately according to the AHA recommendations without artifacts and in the presence of EMI
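
    The mains-frequency suppression described above is commonly implemented as a narrow notch (band-reject) filter centered on the interference frequency. A minimal sketch, assuming a 500 Hz ECG sampling rate and a standard RBJ-cookbook biquad notch; this is not the AED's actual filter design:

    ```python
    import math

    def notch_coeffs(f0, fs, q=30.0):
        """Biquad notch coefficients (RBJ audio-EQ cookbook form) that
        attenuate a narrow band around f0 Hz; output is normalised so a[0]=1."""
        w0 = 2 * math.pi * f0 / fs
        alpha = math.sin(w0) / (2 * q)
        b = [1.0, -2 * math.cos(w0), 1.0]
        a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
        return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

    def biquad(x, b, a):
        """Direct-form I filtering of sequence x with normalised coefficients."""
        y, x_prev, y_prev = [], [0.0, 0.0], [0.0, 0.0]
        for xn in x:
            yn = (b[0] * xn + b[1] * x_prev[0] + b[2] * x_prev[1]
                  - a[1] * y_prev[0] - a[2] * y_prev[1])
            x_prev = [xn, x_prev[0]]
            y_prev = [yn, y_prev[0]]
            y.append(yn)
        return y

    fs = 500.0                      # assumed ECG sampling rate
    b, a = notch_coeffs(50.0, fs)   # suppress 50 Hz mains interference
    t = [i / fs for i in range(2000)]
    noisy = [math.sin(2 * math.pi * 50 * ti) for ti in t]  # pure 50 Hz artifact
    clean = biquad(noisy, b, a)
    # after the transient settles, the 50 Hz component is strongly attenuated
    print(max(abs(v) for v in clean[1000:]))
    ```

    A real shock advisory system must also track the allowed frequency drift (e.g. 47-52 Hz), which is why the abstract mentions an adaptive filter for the 16.7 Hz railway-power case.
    
    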

  12. Interactive tools for inpatient medication tracking: a multi-phase study with cardiothoracic surgery patients

    PubMed Central

    Woollen, Janet; Prey, Jennifer; Restaino, Susan; Bakken, Suzanne; Feiner, Steven; Sackeim, Alexander; Vawdrey, David K

    2016-01-01

    Objective Prior studies of computing applications that support patients’ medication knowledge and self-management offer valuable insights into effective application design, but do not address inpatient settings. This study is the first to explore the design and usefulness of patient-facing tools supporting inpatient medication management and tracking. Materials and Methods We designed myNYP Inpatient, a custom personal health record application, through an iterative, user-centered approach. Medication-tracking tools in myNYP Inpatient include interactive views of home and hospital medication data and features for commenting on these data. In a two-phase pilot study, patients used the tools during cardiothoracic postoperative care at Columbia University Medical Center. In Phase One, we provided 20 patients with the application for 24–48 h and conducted a closing interview after this period. In Phase Two, we conducted semi-structured interviews with 12 patients and 5 clinical pharmacists who evaluated refinements to the tools based on the feedback received during Phase One. Results Patients reported that the medication-tracking tools were useful. During Phase One, 14 of the 20 participants used the tools actively, to review medication lists and log comments and questions about their medications. Patients’ interview responses and audit logs revealed that they made frequent use of the hospital medications feature and found electronic reporting of questions and comments useful. We also uncovered important considerations for subsequent design of such tools. In Phase Two, the patients and pharmacists participating in the study confirmed the usability and usefulness of the refined tools. Conclusions Inpatient medication-tracking tools, when designed to meet patients’ needs, can play an important role in fostering patient participation in their own care and patient-provider communication during a hospital stay. PMID:26744489

  13. Phase segmentation of X-ray computer tomography rock images using machine learning techniques: an accuracy and performance study

    NASA Astrophysics Data System (ADS)

    Chauhan, Swarup; Rühaak, Wolfram; Anbergen, Hauke; Kabdenov, Alen; Freise, Marcus; Wille, Thorsten; Sass, Ingo

    2016-07-01

    Performance and accuracy of machine learning techniques to segment rock grains, matrix and pore voxels from a 3-D volume of X-ray tomographic (XCT) grayscale rock images were evaluated. The segmentation and classification capabilities of unsupervised (k-means, fuzzy c-means, self-organizing maps), supervised (artificial neural networks, least-squares support vector machines) and ensemble classifiers (bagging and boosting) were tested using XCT images of andesite volcanic rock, Berea sandstone, Rotliegend sandstone and a synthetic sample. The averaged porosity obtained for andesite (15.8 ± 2.5 %), Berea sandstone (16.3 ± 2.6 %), Rotliegend sandstone (13.4 ± 7.4 %) and the synthetic sample (48.3 ± 13.3 %) is in very good agreement with the respective laboratory measurement data and varies by a factor of 0.2. The k-means algorithm is the fastest of all the machine learning algorithms, whereas the least-squares support vector machine is the most computationally expensive. The metrics entropy, purity, root mean square error, receiver operating characteristic curve and 10-fold cross-validation were used to determine the accuracy of the unsupervised, supervised and ensemble classifier techniques. In general, accuracy was found to be largely affected by the feature vector selection scheme. As it is always a trade-off between performance and accuracy, it is difficult to single out one machine learning algorithm best suited to the complex phase segmentation problem. Therefore, our investigation provides parameters that can help in selecting the appropriate machine learning technique for phase segmentation.
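
    As a rough illustration of the unsupervised case, a 1-D k-means over grayscale voxel values can separate pore, matrix and grain phases and yield a porosity estimate. The synthetic gray levels and cluster parameters below are assumptions for the sketch, not data from the study:

    ```python
    import random

    def kmeans_1d(values, k=3, iters=50):
        """Plain 1-D k-means: cluster grayscale voxel values into k phases.
        Centroids start at evenly spaced quantiles for a stable initialisation."""
        vals = sorted(values)
        centroids = [vals[int((i + 0.5) * len(vals) / k)] for i in range(k)]
        for _ in range(iters):
            buckets = [[] for _ in range(k)]
            for v in values:
                nearest = min(range(k), key=lambda j: abs(v - centroids[j]))
                buckets[nearest].append(v)
            centroids = [sum(b) / len(b) if b else centroids[i]
                         for i, b in enumerate(buckets)]
        return sorted(centroids)

    # Synthetic grayscale voxels: dark pores, mid-gray matrix, bright grains
    rng = random.Random(1)
    voxels = ([rng.gauss(30, 5) for _ in range(200)]       # pore phase (20 %)
              + [rng.gauss(120, 10) for _ in range(500)]   # matrix phase
              + [rng.gauss(220, 8) for _ in range(300)])   # grain phase
    centroids = kmeans_1d(voxels)
    # Label every voxel by its nearest centroid; porosity = pore-phase fraction
    labels = [min(range(3), key=lambda j: abs(v - centroids[j])) for v in voxels]
    porosity = labels.count(0) / len(voxels)
    print([round(c) for c in centroids], round(porosity, 3))
    ```

    On real XCT volumes the feature vector usually includes more than the raw gray level (e.g. local texture), which is exactly the feature-selection sensitivity the study highlights.
    
    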

  14. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
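
    The precision and accuracy metrics computed by a benchmark like LFQbench can be sketched in simplified form: for a species spiked at a known ratio between two samples, accuracy is the deviation of the measured log-ratios from the expected one, and precision is their spread. The intensity pairs and exact metric definitions below are simplified assumptions, not the package's actual implementation:

    ```python
    import math
    import statistics

    def lfq_metrics(measured_pairs, expected_ratio):
        """Accuracy = median deviation of log2 ratios from the expected value;
        precision = standard deviation of the log2 ratios."""
        log_ratios = [math.log2(a / b) for a, b in measured_pairs]
        expected = math.log2(expected_ratio)
        accuracy = statistics.median(r - expected for r in log_ratios)
        precision = statistics.stdev(log_ratios)
        return accuracy, precision

    # Hypothetical peptide intensities for one species spiked at 2:1 (A:B)
    pairs = [(2040, 1000), (1980, 1010), (2100, 980), (1890, 1005), (2050, 995)]
    acc, prec = lfq_metrics(pairs, expected_ratio=2.0)
    print(round(acc, 3), round(prec, 3))
    ```

    In the benchmark, these statistics are computed per software tool and per isolation-window setup, which is what makes the cross-tool comparison possible.
    
    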

  15. A Preanalytic Validation Study of Automated Bone Scan Index: Effect on Accuracy and Reproducibility Due to the Procedural Variabilities in Bone Scan Image Acquisition.

    PubMed

    Anand, Aseem; Morris, Michael J; Kaboteh, Reza; Reza, Mariana; Trägårdh, Elin; Matsunaga, Naofumi; Edenbrandt, Lars; Bjartell, Anders; Larson, Steven M; Minarik, David

    2016-12-01

    The effect of the procedural variability in image acquisition on the quantitative assessment of bone scan is unknown. Here, we have developed and performed preanalytical studies to assess the impact of the variability in scanning speed and in vendor-specific γ-camera on reproducibility and accuracy of the automated bone scan index (BSI).

  16. Immediate effects of lower cervical spine manipulation on handgrip strength and free-throw accuracy of asymptomatic basketball players: a pilot study

    PubMed Central

    Humphries, Kelley M.; Ward, John; Coats, Jesse; Nobert, Jeannique; Amonette, William; Dyess, Stephen

    2013-01-01

    Objective The purpose of this pilot study was to collect preliminary information for a study to determine the immediate effects of a single unilateral chiropractic manipulation to the lower cervical spine on handgrip strength and free-throw accuracy in asymptomatic male recreational basketball players. Methods For this study, 24 asymptomatic male recreational right-handed basketball players (age = 26.3 ± 9.2 years, height = 1.81 ± 0.07 m, body mass = 82.6 ± 10.4 kg [mean ± SD]) underwent baseline dominant handgrip isometric strength and free-throw accuracy testing in an indoor basketball court. They were then equally randomized to receive either (1) diversified left lower cervical spine chiropractic manipulative therapy (CMT) at C5/C6 or (2) placebo CMT at C5/C6 using an Activator adjusting instrument on zero force setting. Participants then underwent posttesting of isometric handgrip strength and free-throw accuracy. A paired-samples t test was used to make within-group pre to post comparisons and between-group pre to post comparisons. Results No statistically significant difference was shown between either of the 2 basketball performance variables measured in either group. Isometric handgrip strength marginally improved by 0.7 kg (mean) in the CMT group (P = .710). Free-throw accuracy increased by 13.2% in the CMT group (P = .058). The placebo CMT group performed the same or more poorly during their second test session. Conclusions The results of this preliminary study showed that a single lower cervical spine manipulation did not significantly impact basketball performance for this group of healthy asymptomatic participants. A slight increase in free-throw percentage was seen, which deserves further investigation. This pilot study demonstrates that a larger study to evaluate if CMT affects handgrip strength and free-throw accuracy is feasible. PMID:24396315
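
    The paired-samples t test used for the within-group pre/post comparisons can be made concrete with a short sketch; the free-throw scores below are hypothetical, not the study's data:

    ```python
    import math
    import statistics

    def paired_t(pre, post):
        """Paired-samples t statistic for pre- vs post-intervention scores,
        with its degrees of freedom (n - 1)."""
        diffs = [b - a for a, b in zip(pre, post)]
        n = len(diffs)
        mean_d = statistics.mean(diffs)
        sd_d = statistics.stdev(diffs)
        return mean_d / (sd_d / math.sqrt(n)), n - 1

    # Hypothetical free-throw makes out of 20 for 6 players, pre vs post
    pre  = [11, 9, 13, 10, 12, 8]
    post = [13, 10, 14, 12, 12, 10]
    t, df = paired_t(pre, post)
    # |t| > 2.571 (two-tailed critical value, df = 5, alpha = .05) would be
    # significant; a p-value needs the t-distribution CDF (e.g. scipy.stats)
    print(round(t, 2), df)
    ```
    
    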

  17. Prostate intrafraction motion evaluation using kV fluoroscopy during treatment delivery: A feasibility and accuracy study

    SciTech Connect

    Adamson, Justus; Wu Qiuwen

    2008-05-15

    Margin reduction for prostate radiotherapy is limited by uncertainty in prostate localization during treatment. We investigated the feasibility and accuracy of measuring prostate intrafraction motion using kV fluoroscopy performed simultaneously with radiotherapy. Three gold coils used for target localization were implanted into the patient's prostate gland before undergoing hypofractionated online image-guided step-and-shoot intensity modulated radiation therapy (IMRT) on an Elekta Synergy linear accelerator. At each fraction, the patient was aligned using a cone-beam computed tomography (CBCT), after which the IMRT treatment delivery and fluoroscopy were performed simultaneously. In addition, a post-treatment CBCT was acquired with the patient still on the table. To measure the intrafraction motion, we developed an algorithm to register the fluoroscopy images to a reference image derived from the post-treatment CBCT, and we estimated coil motion in three-dimensional (3D) space by combining information from registrations at different gantry angles. We also detected the MV beam turning on and off using MV scatter incident in the same fluoroscopy images, and used this information to synchronize our intrafraction evaluation with the treatment delivery. In addition, we assessed the following: the method to synchronize with treatment delivery, the dose from kV imaging, the accuracy of the localization, and the error propagated into the 3D localization from motion between fluoroscopy acquisitions. With 0.16 mAs/frame and a bowtie filter implemented, the coils could be localized with the gantry at both 0° and 270° with the MV beam off, and at 270° with the MV beam on when multiple fluoroscopy frames were averaged. The localization in two-dimensions for phantom and patient measurements was performed with submillimeter accuracy. After backprojection into 3D the patient localization error was (−0.04±0.30) mm, (0.09±0.36) mm, and (0.03±0.68) mm in the

  18. Prostate intrafraction motion evaluation using kV fluoroscopy during treatment delivery: A feasibility and accuracy study

    PubMed Central

    Adamson, Justus; Wu, Qiuwen

    2008-01-01

    Margin reduction for prostate radiotherapy is limited by uncertainty in prostate localization during treatment. We investigated the feasibility and accuracy of measuring prostate intrafraction motion using kV fluoroscopy performed simultaneously with radiotherapy. Three gold coils used for target localization were implanted into the patient’s prostate gland before undergoing hypofractionated online image-guided step-and-shoot intensity modulated radiation therapy (IMRT) on an Elekta Synergy linear accelerator. At each fraction, the patient was aligned using a cone-beam computed tomography (CBCT), after which the IMRT treatment delivery and fluoroscopy were performed simultaneously. In addition, a post-treatment CBCT was acquired with the patient still on the table. To measure the intrafraction motion, we developed an algorithm to register the fluoroscopy images to a reference image derived from the post-treatment CBCT, and we estimated coil motion in three-dimensional (3D) space by combining information from registrations at different gantry angles. We also detected the MV beam turning on and off using MV scatter incident in the same fluoroscopy images, and used this information to synchronize our intrafraction evaluation with the treatment delivery. In addition, we assessed the following: the method to synchronize with treatment delivery, the dose from kV imaging, the accuracy of the localization, and the error propagated into the 3D localization from motion between fluoroscopy acquisitions. With 0.16 mAs∕frame and a bowtie filter implemented, the coils could be localized with the gantry at both 0° and 270° with the MV beam off, and at 270° with the MV beam on when multiple fluoroscopy frames were averaged. The localization in two-dimensions for phantom and patient measurements was performed with submillimeter accuracy. After backprojection into 3D the patient localization error was (−0.04±0.30) mm, (0.09±0.36) mm, and (0.03±0.68) mm in the right

  19. Diagnostic accuracy of developmental screening in primary care at the 18-month health supervision visit: a cross-sectional study

    PubMed Central

    van den Heuvel, Meta; Borkhoff, Cornelia M.; Koroshegyi, Christine; Zabih, Weeda; Reijneveld, Sijmen A.; Maguire, Jonathon; Birken, Catherine; Parkin, Patricia

    2016-01-01

    Background: Communication delays are often the first presenting problem in infants with a range of developmental disabilities. Our objective was to assess the validity of the 18-month Nipissing District Developmental Screen compared with the Infant Toddler Checklist, a validated tool for detecting expressive language and other communication delays. Methods: A cross-sectional design was used. Children aged 18-20 months were recruited during scheduled health supervision visits. Parents completed both the 18-month Nipissing District Developmental Screen and the Infant Toddler Checklist. We assessed criterion validity (diagnostic test properties, overall agreement) for 1 or more "no" responses (1+NDDS flag) and 2 or more "no" responses (2+NDDS flag) using the Infant Toddler Checklist as a criterion measure. Results: The study included 348 children (mean age 18.6 ± 0.7 mo). The 1+NDDS flag had good sensitivity (94%, 95% confidence interval [CI] 70%-100%, and 86%, 95% CI 64%-96%), poor specificity (63%, 95% CI 58%-68%, and 63%, 95% CI 58%-69%), and fair agreement (0.26) to identify expressive speech and other communication delays, respectively. The 2+NDDS flag had low to fair sensitivity (50%, 95% CI 26%-74%, and 73%, 95% CI 50%-88%), good specificity (86%, 95% CI 82%-90%, and 88%, 95% CI 84%-92%) and moderate agreement (0.45) to identify expressive speech and other communication delays, respectively. Interpretation: The low specificity of the 1+NDDS flag may lead to overdiagnosis, and the low sensitivity of the 2+NDDS flag may lead to underdiagnosis, suggesting that infants who could benefit from early intervention may not be identified. The Nipissing District Developmental Screen does not have adequate characteristics to accurately identify children with a range of communication delays. PMID:28018875
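
    Diagnostic test properties like those reported above follow from a 2x2 table of screen results against the criterion measure. A minimal sketch using Wald (normal-approximation) confidence intervals, which differ slightly from the exact intervals such a study would typically report; the counts are hypothetical:

    ```python
    import math

    def diagnostic_metrics(tp, fp, fn, tn, z=1.96):
        """Sensitivity and specificity with Wald 95% confidence intervals
        from a 2x2 screening-vs-reference table."""
        def prop_ci(k, n):
            p = k / n
            half = z * math.sqrt(p * (1 - p) / n)
            return p, max(0.0, p - half), min(1.0, p + half)
        return {"sensitivity": prop_ci(tp, tp + fn),   # TP / (TP + FN)
                "specificity": prop_ci(tn, tn + fp)}   # TN / (TN + FP)

    # Hypothetical counts: screen flag vs criterion-positive delay
    m = diagnostic_metrics(tp=16, fp=120, fn=1, tn=211)
    sens, lo, hi = m["sensitivity"]
    spec, slo, shi = m["specificity"]
    print(f"sensitivity {sens:.0%} ({lo:.0%}-{hi:.0%}), "
          f"specificity {spec:.0%} ({slo:.0%}-{shi:.0%})")
    ```

    The high-sensitivity/low-specificity pattern of the 1+NDDS flag and the reverse pattern of the 2+NDDS flag both fall out of this same table as the flag threshold moves.
    
    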

  20. Astrophysics with Microarcsecond Accuracy Astrometry

    NASA Technical Reports Server (NTRS)

    Unwin, Stephen C.

    2008-01-01

    Space-based astrometry promises to provide a powerful new tool for astrophysics. At a precision level of a few microarcseconds, a wide range of phenomena are opened up for study. In this paper we discuss the capabilities of the SIM Lite mission, the first space-based long-baseline optical interferometer, which will deliver parallaxes to 4 microarcsec. A companion paper in this volume will cover the development and operation of this instrument. At the level that SIM Lite will reach, better than 1 microarcsec in a single measurement, planets as small as one Earth can be detected around many dozens of the nearest stars. Not only can planet masses be definitively measured, but the full orbital parameters can also be determined, allowing study of system stability in multiple-planet systems. This capability to survey our nearby stellar neighbors for terrestrial planets will be a unique contribution to our understanding of the local universe. SIM Lite will be able to tackle a wide range of interesting problems in stellar and Galactic astrophysics. By tracing the motions of stars in dwarf spheroidal galaxies orbiting our Milky Way, SIM Lite will probe the shape of the galactic potential, the history of the formation of the galaxy, and the nature of dark matter. Because it is flexibly scheduled, the instrument can dwell on faint targets, maintaining its full accuracy on objects as faint as V=19. This paper is a brief survey of the diverse problems in modern astrophysics that SIM Lite will be able to address.
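
    The link between parallax precision and distance is easy to make concrete: distance in parsecs is the reciprocal of the parallax in arcseconds, and to first order the fractional distance error equals the fractional parallax error. A worked example at the quoted ~4 microarcsecond parallax accuracy:

    ```python
    def parallax_distance_pc(parallax_uas):
        """Distance in parsecs from a parallax given in microarcseconds."""
        return 1.0 / (parallax_uas * 1e-6)

    # A star at 1 kpc has a 1000 microarcsecond parallax; at ~4 microarcsecond
    # parallax accuracy the fractional distance error is about 0.4 %.
    p = 1000.0        # parallax, microarcseconds
    sigma_p = 4.0     # parallax uncertainty, microarcseconds
    d = parallax_distance_pc(p)
    frac_err = sigma_p / p   # first-order error propagation: sigma_d/d = sigma_p/p
    print(d, f"{frac_err:.1%}")
    ```
    
    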

  1. An observational, prospective study to evaluate the preoperative planning tool "CI-Wizard" for cochlear implant surgery.

    PubMed

    Pirlich, Markus; Tittmann, Mary; Franz, Daniela; Dietz, Andreas; Hofer, Mathias

    2017-02-01

    "CI-Wizard" is a new, three-dimensional software planning tool for cochlear implant surgery with manual and semi-automatic algorithms to visualize anatomical risk structures of the lateral skull base preoperatively. The primary endpoints of the study were the evaluation of the CI-Wizard's usability, accuracy, and subjectively perceived and objectively measured time in clinical practice. Between January 2014 and March 2015, n = 36 participants were included in this study. Participants were divided into three groups of equal number (n = 12) but different levels of experience. Senior doctors and consultants (group 1), residents (group 2) and medical students (group 3) segmented 12 different CT-scan data sets using the CI-Wizard (four per participant). In total, n = 144 data sets were collected. The usability of the CI-Wizard was measured by questionnaire with an interval rating scale. The Jaccard coefficient (JT) was used to evaluate the accuracy of the anatomical structures segmented. The subjectively perceived time was measured with an interval rating scale in the questionnaire and was compared with the objectively measured mean time (time interact). Across all three groups, the usability of the CI-Wizard was rated between 1 ("very good") and 2 ("with small defects"). Subjectively, the time was stated as "appropriate" by questionnaire. Objective measurements of the required duration revealed averages of t = 9.8 min for creating a target view. Concerning the accuracy, semi-automatically segmented anatomical structures such as the external acoustic canal (JT = 0.90), the tympanic cavity (JT = 0.87), the ossicles (JT = 0.63), the cochlea (JT = 0.66), and the semicircular canals (JT = 0.61) reached high Jaccard values, indicating a strong match between the participants' segmentations and the gold standard. The facial nerve (JT = 0.39) and round window (JT = 0.37) reached lower Jaccard values. Very little overlap tendency was

  2. Study on accuracy and interobserver reliability of the assessment of odontoid fracture union using plain radiographs or CT scans

    PubMed Central

    Kolb, Klaus; Zenner, Juliane; Reynolds, Jeremy; Dvorak, Marcel; Acosta, Frank; Forstner, Rosemarie; Mayer, Michael; Tauber, Mark; Auffarth, Alexander; Kathrein, Anton; Hitzl, Wolfgang

    2009-01-01

    In odontoid fracture research, outcome can be evaluated based on validated questionnaires, based on functional outcome in terms of atlantoaxial and total neck rotation, and based on the treatment-related union rate. Data on clinical and functional outcome are still sparse. In contrast, there is abundant information on union rates, although the rates frequently differ widely. Odontoid union is the most frequently assessed outcome parameter, and therefore it is imperative to investigate the interobserver reliability of fusion assessment using radiographs compared to CT scans. Our objective was to identify the diagnostic accuracy of plain radiographs in detecting union and non-union after odontoid fractures and compare this to CT scans as the standard of reference. Complete sets of biplanar plain radiographs and CT scans of 21 patients treated for odontoid fractures were subjected to interobserver assessment of fusion. Image sets were presented to 18 international observers with a mean experience in fusion assessment of 10.7 years. Patients selected had complete radiographic follow-up at a mean of 63.3 ± 53 months. Mean age of the patients at follow-up was 68.2 years. We calculated interobserver agreement of the diagnostic assessment using radiographs compared to using CT scans, as well as the sensitivity and specificity of the radiographic assessment. Agreement on the fusion status using radiographs compared to CT scans ranged between 62 and 90% depending on the observer. Concerning the assessment of non-union and fusion, the mean specificity was 62% and the mean sensitivity was 77%. Statistical analysis revealed an agreement of 80–100% between the biplanar radiographs and the reconstructed CT scans in only 48% of cases. In 50% of patients assessed there was an agreement of less than 80%. The mean sensitivity and specificity values indicate that radiographs are not a reliable measure to indicate odontoid fracture union or non-union. Regarding experience in years

  3. Validity of ICD-9-CM codes for breast, lung and colorectal cancers in three Italian administrative healthcare databases: a diagnostic accuracy study protocol

    PubMed Central

    Abraha, Iosief; Serraino, Diego; Giovannini, Gianni; Stracci, Fabrizio; Casucci, Paola; Alessandrini, Giuliana; Bidoli, Ettore; Chiari, Rita; Cirocchi, Roberto; De Giorgi, Marcello; Franchini, David; Vitale, Maria Francesca; Fusco, Mario; Montedori, Alessandro

    2016-01-01

    Introduction Administrative healthcare databases are useful tools to study healthcare outcomes and to monitor the health status of a population. Patients with cancer can be identified through disease-specific codes, prescriptions and physician claims, but prior validation is required to achieve an accurate case definition. The objective of this protocol is to assess the accuracy of International Classification of Diseases Ninth Revision—Clinical Modification (ICD-9-CM) codes for breast, lung and colorectal cancers in identifying patients diagnosed with the relative disease in three Italian administrative databases. Methods and analysis Data from the administrative databases of Umbria Region (910 000 residents), Local Health Unit 3 of Napoli (1 170 000 residents) and Friuli-Venezia Giulia Region (1 227 000 residents) will be considered. In each administrative database, patients with the first occurrence of diagnosis of breast, lung or colorectal cancer between 2012 and 2014 will be identified using the following groups of ICD-9-CM codes in primary position: (1) 233.0 and (2) 174.x for breast cancer; (3) 162.x for lung cancer; (4) 153.x for colon cancer and (5) 154.0–154.1 and 154.8 for rectal cancer. Only incident cases will be considered, that is, excluding cases that have the same diagnosis in the 5 years (2007–2011) before the period of interest. A random sample of cases and non-cases will be selected from each administrative database and the corresponding medical charts will be assessed for validation by pairs of trained, independent reviewers. Case ascertainment within the medical charts will be based on (1) the presence of a primary nodular lesion in the breast, lung or colon–rectum, documented with imaging or endoscopy and (2) a cytological or histological documentation of cancer from a primary or metastatic site. Sensitivity and specificity with 95% CIs will be calculated. Dissemination Study results will be disseminated widely through
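
    The protocol plans to report sensitivity and specificity with 95% CIs. One common interval for proportions is the Wilson score interval; the protocol does not specify which interval will be used, and the counts below are illustrative:

    ```python
    import math

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% CI for a proportion, e.g. a sensitivity
        estimated as successes/n validated cases."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    # e.g. 90 chart-confirmed cases correctly coded among 95 sampled cases
    lo, hi = wilson_ci(90, 95)
    ```

    Unlike the simple Wald interval, the Wilson interval stays within [0, 1] and behaves sensibly when the observed proportion is near 100%, as expected for well-performing ICD-9-CM code algorithms.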

  4. Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS)

    PubMed Central

    Downes, Martin J; Brennan, Marnie L; Williams, Hywel C; Dean, Rachel S

    2016-01-01

    Objectives The aim of this study was to develop a critical appraisal (CA) tool that addressed study design and reporting quality as well as the risk of bias in cross-sectional studies (CSSs). In addition, the aim was to produce a help document to guide the non-expert user through the tool. Design An initial scoping review of the published literature and key epidemiological texts was undertaken prior to the formation of a Delphi panel to establish key components for a CA tool for CSSs. A consensus of 80% was required from the Delphi panel for any component to be included in the final tool. Results An initial list of 39 components was identified through examination of existing resources. An international Delphi panel of 18 medical and veterinary experts was established. After 3 rounds of the Delphi process, the Appraisal tool for Cross-Sectional Studies (AXIS tool) was developed by consensus and consisted of 20 components. A detailed explanatory document was also developed with the tool, giving expanded explanation of each question and providing simple interpretations and examples of the epidemiological concepts being examined in each question to aid non-expert users. Conclusions CA of the literature is a vital step in evidence synthesis and therefore evidence-based decision-making in a number of different disciplines. The AXIS tool is therefore unique and was developed in a way that it can be used across disciplines to aid the inclusion of CSSs in systematic reviews, guidelines and clinical decision-making. PMID:27932337

  5. Accuracy of chimeric proteins in the serological diagnosis of chronic chagas disease – a Phase II study

    PubMed Central

    Celedon, Paola Alejandra Fiorani; Zanchin, Nilson Ivo Tonin; de Souza, Wayner Vieira; da Silva, Edimilson Domingos; Foti, Leonardo; Krieger, Marco Aurélio; Gomes, Yara de Miranda

    2017-01-01

    Background The performance of current serologic tests for diagnosing chronic Chagas disease (CD) is highly variable. The search for new diagnostic markers has been a constant challenge for improving accuracy and reducing the number of inconclusive results. Methodology/Principal findings Here, four chimeric proteins (IBMP-8.1 to -8.4) comprising immunodominant regions of different Trypanosoma cruzi antigens were tested by enzyme-linked immunosorbent assay. The proteins were used to detect specific anti-T. cruzi antibodies in the sera of 857 chagasic and 689 non-chagasic individuals to evaluate their accuracy for chronic CD diagnosis. The antigens were recombinantly expressed in Escherichia coli and purified by chromatographic methods. The sensitivity and specificity values ranged from 94.3% to 99.3% and 99.4% to 100%, respectively. The diagnostic odds ratio (DOR) values were 6,462 for IBMP-8.1, 3,807 for IBMP-8.2, 32,095 for IBMP-8.3, and 283,714 for IBMP-8.4. These chimeric antigens presented DORs that were higher than that of the commercial Pathozyme Chagas test. The antigens IBMP-8.3 and -8.4 also showed DORs higher than that of the Gold ELISA Chagas test. Mixtures with equimolar concentrations were tested in an attempt to improve diagnostic accuracy for negative samples with high signal and positive samples with low signal. However, no gain in accuracy was observed relative to the individual antigens. A total of 1,079 additional sera were used to test cross-reactivity to unrelated diseases. The cross-reactivity rates ranged from 0.37% to 0.74% even for Leishmania spp., a pathogen showing relatively high genome sequence identity to T. cruzi. Imprecision analyses showed that IBMP chimeras are very stable and the results are highly reproducible. Conclusions/Significance Our findings indicate that the IBMP-8.4 antigen can be safely used in serological tests for T. cruzi screening in blood banks and for chronic CD laboratory diagnosis. PMID:28273127
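
    The diagnostic odds ratio used above summarizes discrimination in a single number: the odds of a positive result in the diseased divided by the odds of a positive result in the non-diseased. A minimal sketch (inputs are illustrative values in the range reported above, not the study's exact estimates):

    ```python
    def diagnostic_odds_ratio(sensitivity, specificity):
        """DOR = (sens / (1 - sens)) * (spec / (1 - spec)).
        Undefined when sensitivity or specificity equals 1; in practice a
        continuity correction is applied to zero cells before computing it."""
        return (sensitivity / (1 - sensitivity)) * (specificity / (1 - specificity))

    # Illustrative values in the range reported for the IBMP antigens
    dor = diagnostic_odds_ratio(0.993, 0.998)
    ```

    Small gains near 100% sensitivity or specificity produce very large DOR changes, which is why the reported values span several orders of magnitude.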

  6. Using checklists and algorithms to improve qualitative exposure judgment accuracy.

    PubMed

    Arnold, Susan F; Stenzel, Mark; Drolet, Daniel; Ramachandran, Gurumurthy

    2016-01-01

    Most exposure assessments are conducted without the aid of robust personal exposure data and are based instead on qualitative inputs such as education and experience, training, documentation on the process chemicals, tasks and equipment, and other information. Qualitative assessments determine whether there is any follow-up, and influence the type that occurs, such as quantitative sampling, worker training, and implementing exposure and risk management measures. Accurate qualitative exposure judgments ensure appropriate follow-up that in turn ensures appropriate exposure management. Studies suggest that qualitative judgment accuracy is low. A qualitative exposure assessment Checklist tool was developed to guide the application of a set of heuristics to aid decision making. Practicing hygienists (n = 39) and novice industrial hygienists (n = 8) were recruited for a study evaluating the influence of the Checklist on exposure judgment accuracy. Participants generated 85 pre-training judgments and 195 Checklist-guided judgments. Pre-training judgment accuracy was low (33%) and not statistically significantly different from random chance. A tendency for IHs to underestimate the true exposure was observed. Exposure judgment accuracy improved significantly (p < 0.001) to 63% when aided by the Checklist. Qualitative judgments guided by the Checklist tool were categorically accurate or overestimated the true exposure by one category 70% of the time. The overall magnitude of exposure judgment precision also improved following training. Fleiss' κ, evaluating inter-rater agreement between novice assessors, was fair to moderate (κ = 0.39). Cohen's weighted and unweighted κ were good to excellent for novice (0.77 and 0.80) and practicing IHs (0.73 and 0.89), respectively. Checklist judgment accuracy was similar to quantitative exposure judgment accuracy observed in studies of similar design using personal exposure measurements, suggesting that the tool could be useful in
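
    Fleiss' κ, used above for the novice assessors, generalizes chance-corrected agreement to more than two raters. A minimal sketch, under the assumption of an equal number of raters per subject (the ratings table below is illustrative, not the study's data):

    ```python
    def fleiss_kappa(ratings):
        """Fleiss' kappa. ratings: rows = subjects, cols = categories,
        entry = number of raters assigning that category; every subject
        must be rated by the same number of raters."""
        n_subjects = len(ratings)
        n_raters = sum(ratings[0])
        n_cats = len(ratings[0])
        total = n_subjects * n_raters
        # Marginal proportion of assignments falling in each category
        p_j = [sum(row[j] for row in ratings) / total for j in range(n_cats)]
        # Per-subject observed agreement among rater pairs
        p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
               for row in ratings]
        p_bar = sum(p_i) / n_subjects
        p_exp = sum(p * p for p in p_j)
        return (p_bar - p_exp) / (1 - p_exp)

    # Illustrative: 4 tasks, 3 raters, 3 exposure categories
    table = [[3, 0, 0], [0, 3, 0], [2, 1, 0], [1, 1, 1]]
    kappa = fleiss_kappa(table)
    ```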

  7. Accuracy of surface registration compared to conventional volumetric registration in patient positioning for head-and-neck radiotherapy: A simulation study using patient data

    SciTech Connect

    Kim, Youngjun; Li, Ruijiang; Na, Yong Hum; Xing, Lei; Lee, Rena

    2014-12-15

    Purpose: 3D optical surface imaging has been applied to patient positioning in radiation therapy (RT). The optical patient positioning system is advantageous over the conventional method using cone-beam computed tomography (CBCT) in that it is radiation free, frameless, and is capable of real-time monitoring. While the conventional radiographic method uses volumetric registration, the optical system uses surface matching for patient alignment. The relative accuracy of these two methods has not yet been sufficiently investigated. This study aims to investigate the theoretical accuracy of surface registration based on a simulation study using patient data. Methods: This study compares the relative accuracy of surface and volumetric registration in head-and-neck RT. The authors examined 26 patient data sets, each consisting of planning CT data acquired before treatment and patient setup CBCT data acquired at the time of treatment. As input for surface registration, patient skin surfaces were created by contouring the skin in the planning CT and treatment CBCT. Surface registration was performed using the iterative closest point algorithm with a point-to-plane metric, which minimizes the normal distance between source points and target surfaces. Six degrees of freedom (three translations and three rotations) were used in both surface and volumetric registrations and the results were compared. The accuracy of each method was estimated by digital phantom tests. Results: Based on the results of 26 patients, the authors found that the average and maximum root-mean-square translation deviation between the surface and volumetric registrations were 2.7 and 5.2 mm, respectively. The residual error of the surface registration was calculated to have an average of 0.9 mm and a maximum of 1.7 mm. Conclusions: Surface registration may lead to results different from those of the conventional volumetric registration. Only limited accuracy can be achieved for patient
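
    The point-to-plane variant of iterative closest point referenced above is commonly solved by linearizing the rigid transform for small rotations. A sketch of one such linearized Gauss-Newton step on synthetic data (not the authors' implementation; the small-angle construction of the target is an assumption for illustration):

    ```python
    import numpy as np

    def point_to_plane_step(src, dst, normals):
        """One linearized point-to-plane ICP step: solve for a small rigid
        transform x = [rotation vector r, translation t] minimizing
        sum_i ( n_i . (src_i + r x src_i + t - dst_i) )^2."""
        a = np.hstack([np.cross(src, normals), normals])   # rows [src_i x n_i, n_i]
        b = np.einsum("ij,ij->i", normals, dst - src)      # normal-projected gaps
        x, *_ = np.linalg.lstsq(a, b, rcond=None)
        return x[:3], x[3:]

    rng = np.random.default_rng(0)
    src = rng.normal(size=(20, 3))                         # source surface points
    normals = rng.normal(size=(20, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    r_true = np.array([0.004, -0.003, 0.002])              # small rotation (rad)
    t_true = np.array([0.01, -0.02, 0.005])                # small translation
    dst = src + np.cross(r_true, src) + t_true             # linearized motion
    r_est, t_est = point_to_plane_step(src, dst, normals)
    ```

    In a full registration the step is iterated with re-established closest-point correspondences; here the target is built with the linearized model itself, so a single step recovers the transform.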

  8. Accuracy of the Chinese lunar calendar method to predict a baby's sex: a population-based study.

    PubMed

    Villamor, Eduardo; Dekker, Louise; Svensson, Tobias; Cnattingius, Sven

    2010-07-01

    We estimated the accuracy of a non-invasive, inexpensive method (the Chinese lunar calendar, CLC) to predict the sex of a baby from around the time of conception, using 2,840,755 singleton births occurring in Sweden between 1973 and 2006. Maternal lunar age and month of conception were estimated, and used to predict each baby's sex, according to a published algorithm. Kappa statistics were estimated for the actual vs. the CLC-predicted sex of the baby. Overall kappa was 0.0002 [95% CI -0.0009, 0.0014]. Accuracy was not modified by year of conception, maternal age, level of education, body mass index or parity. In a validation subset of 1000 births in which we used a website-customised algorithm to estimate lunar dates, kappa was -0.02 [95% CI -0.08, 0.04]. Simulating the misuse of the method by failing to convert Gregorian dates into lunar did not change the results. We conclude that the CLC method is no better at predicting the sex of a baby than tossing a coin and advise against painting the nursery based on this method's result.
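
    The kappa statistic used above corrects raw agreement for chance: a predictor no better than a coin toss yields κ ≈ 0, exactly as reported for the CLC method. A minimal sketch with a perfectly chance-level 2x2 table:

    ```python
    def cohens_kappa(a_yes_b_yes, a_yes_b_no, a_no_b_yes, a_no_b_no):
        """Cohen's kappa: chance-corrected agreement between predicted (a)
        and actual (b) binary outcomes, from the four 2x2 cell counts."""
        n = a_yes_b_yes + a_yes_b_no + a_no_b_yes + a_no_b_no
        p_obs = (a_yes_b_yes + a_no_b_no) / n
        p_a_yes = (a_yes_b_yes + a_yes_b_no) / n
        p_b_yes = (a_yes_b_yes + a_no_b_yes) / n
        p_exp = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
        return (p_obs - p_exp) / (1 - p_exp)

    # A chance-level predictor: agreement exactly as expected by chance
    kappa = cohens_kappa(250, 250, 250, 250)   # kappa == 0.0
    ```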

  9. Meta-GWAS Accuracy and Power (MetaGAP) Calculator Shows that Hiding Heritability Is Partially Due to Imperfect Genetic Correlations across Studies

    PubMed Central

    Rietveld, Cornelius A.; Johannesson, Magnus; Magnusson, Patrik K. E.; Uitterlinden, André G.; van Rooij, Frank J. A.; Hofman, Albert

    2017-01-01

    Large-scale genome-wide association results are typically obtained from a fixed-effects meta-analysis of GWAS summary statistics from multiple studies spanning different regions and/or time periods. This approach averages the estimated effects of genetic variants across studies. If genetic effects are heterogeneous across studies, the statistical power of a GWAS and the predictive accuracy of polygenic scores are attenuated, contributing to the so-called ‘missing heritability’. Here, we describe the online Meta-GWAS Accuracy and Power (MetaGAP) calculator (available at www.devlaming.eu), which quantifies this attenuation based on a novel multi-study framework. By means of simulation studies, we show that under a wide range of genetic architectures, the estimates of statistical power and predictive accuracy provided by this calculator are accurate. We compare the predictions from the MetaGAP calculator with actual results obtained in the GWAS literature. Specifically, we use genomic-relatedness-matrix restricted maximum likelihood to estimate the SNP heritability and cross-study genetic correlation of height, BMI, years of education, and self-rated health in three large samples. These estimates are used as input parameters for the MetaGAP calculator. Results from the calculator suggest that cross-study heterogeneity has led to attenuation of statistical power and predictive accuracy in recent large-scale GWAS efforts on these traits (e.g., for years of education, we estimate a relative loss of 51–62% in the number of genome-wide significant loci and a relative loss in polygenic score R2 of 36–38%). Hence, cross-study heterogeneity contributes to the missing heritability. PMID:28095416

  10. Thyroid Imaging Reporting and Data System and Ultrasound Elastography: Diagnostic Accuracy as a Tool in Recommending Repeat Fine-Needle Aspiration for Solid Thyroid Nodules with Non-Diagnostic Fine-Needle Aspiration Cytology.

    PubMed

    Park, Vivian Youngjean; Kim, Eun-Kyung; Kwak, Jin Young; Yoon, Jung Hyun; Kim, Min Jung; Moon, Hee Jung

    2016-02-01

    The Thyroid Imaging Reporting and Data System (TIRADS) has been found to be accurate in the stratification of malignancy risk, and elastography has been found to have a high negative predictive value in non-diagnostic thyroid nodules. Through assessment of 104 solid non-diagnostic thyroid nodules, this study investigated the role of both in recommending repeat ultrasonography-guided fine-needle aspiration for solid thyroid nodules with non-diagnostic cytology. All nodules were classified by TIRADS (categories 4a, 4b, 4c and 5), and elastography scores were assigned according to the Rago and Asteria criteria. The malignancy risks for TIRADS categories 4a, 4b, 4c and 5 were 12.5%, 25.0%, 25.8% and 16.7%, respectively. Elastography revealed the highest diagnostic performance for TIRADS category 4a, with a sensitivity, specificity, negative predictive value, positive predictive value and accuracy of 100%, 85.7%, 100%, 50% and 87.5% for the Asteria criteria. Observation may be considered for non-diagnostic solid nodules that have no other suspicious ultrasonographic features and are also benign on real-time strain elastography using the Asteria criteria.

  11. Predicting Out-of-Office Blood Pressure in the Clinic (PROOF-BP): Derivation and Validation of a Tool to Improve the Accuracy of Blood Pressure Measurement in Clinical Practice.

    PubMed

    Sheppard, James P; Stevens, Richard; Gill, Paramjit; Martin, Una; Godwin, Marshall; Hanley, Janet; Heneghan, Carl; Hobbs, F D Richard; Mant, Jonathan; McKinstry, Brian; Myers, Martin; Nunan, David; Ward, Alison; Williams, Bryan; McManus, Richard J

    2016-05-01

    Patients often have lower (white coat effect) or higher (masked effect) ambulatory/home blood pressure readings compared with clinic measurements, resulting in misdiagnosis of hypertension. The present study assessed whether blood pressure and patient characteristics from a single clinic visit can accurately predict the difference between ambulatory/home and clinic blood pressure readings (the home-clinic difference). A linear regression model predicting the home-clinic blood pressure difference was derived in 2 data sets measuring automated clinic and ambulatory/home blood pressure (n=991) using candidate predictors identified from a literature review. The model was validated in 4 further data sets (n=1172) using area under the receiver operating characteristic curve analysis. A masked effect was associated with male sex, a positive clinic blood pressure change (difference between consecutive measurements during a single visit), and a diagnosis of hypertension. Increasing age, clinic blood pressure level, and pulse pressure were associated with a white coat effect. The model showed good calibration across data sets (Pearson correlation, 0.48-0.80) and performed well in predicting ambulatory hypertension (area under the receiver operating characteristic curve, 0.75; 95% confidence interval, 0.72-0.79 [systolic]; 0.87; 0.85-0.89 [diastolic]). Used as a triaging tool for ambulatory monitoring, the model improved classification of a patient's blood pressure status compared with other guideline recommended approaches (93% [92% to 95%] classified correctly; United States, 73% [70% to 75%]; Canada, 74% [71% to 77%]; United Kingdom, 78% [76% to 81%]). This study demonstrates that patient characteristics from a single clinic visit can accurately predict a patient's ambulatory blood pressure. Use of this prediction tool for triaging of ambulatory monitoring could result in more accurate diagnosis of hypertension and hence more appropriate treatment.
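
    The area under the receiver operating characteristic curve used to validate the model equals the probability that a randomly chosen case receives a higher predicted value than a randomly chosen non-case (the Mann-Whitney interpretation). A minimal sketch with illustrative scores, not the study's predictions:

    ```python
    def auc(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney statistic:
        fraction of (positive, negative) pairs the positive outranks,
        counting ties as half."""
        wins = sum((sp > sn) + 0.5 * (sp == sn)
                   for sp in scores_pos for sn in scores_neg)
        return wins / (len(scores_pos) * len(scores_neg))

    # Illustrative model outputs, split by true ambulatory hypertension status
    a = auc(scores_pos=[0.9, 0.8, 0.75, 0.6], scores_neg=[0.7, 0.5, 0.4, 0.3])
    ```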

  12. Impact of Structured Rounding Tools on Time Allocation During Multidisciplinary Rounds: An Observational Study

    PubMed Central

    Kannampallil, Thomas G; Patel, Vimla L; Patel, Bela; Almoosa, Khalid F

    2016-01-01

    Background Recent research has shown evidence of disproportionate time allocation for patient communication during multidisciplinary rounds (MDRs). Studies have shown that patients discussed later during rounds receive less time. Objective The aim of our study was to investigate whether disproportionate time allocation effects persist with the use of structured rounding tools. Methods Using audio recordings of rounds (N=82 patients), we compared time allocation and communication breakdowns between a problem-based Subjective, Objective, Assessment, and Plan (SOAP) rounding tool and a system-based Handoff Intervention Tool (HAND-IT). Results We found no significant linear dependence of the order of patient presentation on the time spent or on communication breakdowns for both structured tools. However, for the problem-based tool, there was a significant linear relationship between the time spent on discussing a patient and the number of communication breakdowns (P<.05), with an average of 1.04 additional breakdowns with every 120 seconds in discussion. Conclusions The use of structured rounding tools potentially mitigates disproportionate time allocation and communication breakdowns during rounds, with the more structured HAND-IT almost completely eliminating such effects. These results have potential implications for planning, prioritization, and training for time management during MDRs. PMID:27940423

  13. Accuracy and uncertainty of asymmetric magnetization transfer ratio quantification for amide proton transfer (APT) imaging at 3T: a Monte Carlo study.

    PubMed

    Yuan, Jing; Zhang, Qinwei; Wang, Yi-Xiang; Wei, Juan; Zhou, Jinyuan

    2013-01-01

    Amide proton transfer (APT) imaging offers a novel and powerful MRI contrast mechanism for quantitative molecular imaging based on the principle of chemical exchange saturation transfer (CEST). Asymmetric magnetization transfer ratio (MTR(asym)) quantification is crucial for Z-spectrum analysis of APT imaging, but is still challenging, particularly at clinical field strength. This paper studies the accuracy and uncertainty in the quantification of MTR(asym) for APT imaging at 3T, using high-order polynomial fitting of the Z-spectrum through Monte Carlo simulation. Results show that polynomial fitting is a biased estimator that consistently underestimates MTR(asym). For a fixed polynomial order, the accuracy of MTR(asym) is almost constant with regard to signal-to-noise ratio (SNR), while the uncertainty decreases exponentially with SNR. Higher-order polynomial fitting increases both the accuracy and the uncertainty of MTR(asym). For different APT signal intensity levels, the relative accuracy and the absolute uncertainty remain constant for a fixed polynomial order. These results indicate the limitations and pitfalls of polynomial fitting for MTR(asym) quantification, so a better quantification technique for MTR(asym) estimation is warranted.
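
    The behaviour described above can be reproduced qualitatively with a small Monte Carlo experiment: fit a high-order polynomial to noisy realizations of a Z-spectrum and read off MTR(asym) at 3.5 ppm. The spectrum model, noise levels, and polynomial order below are illustrative assumptions, not the paper's simulation settings:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    offsets = np.linspace(-6, 6, 61)          # saturation offsets (ppm)

    def z_spectrum(w):
        """Synthetic Z-spectrum: direct water saturation (Lorentzian at 0 ppm)
        plus a small APT dip at +3.5 ppm. Illustrative shape only."""
        return 1 - 0.8 / (1 + (w / 1.5) ** 2) - 0.04 / (1 + ((w - 3.5) / 0.8) ** 2)

    def mtr_asym_estimate(noise_sd, order=12):
        """Fit a polynomial to one noisy realization and evaluate
        MTR_asym(3.5 ppm) = Z_fit(-3.5) - Z_fit(+3.5)."""
        z = z_spectrum(offsets) + rng.normal(0, noise_sd, offsets.size)
        coeffs = np.polynomial.polynomial.polyfit(offsets, z, order)
        fit = np.polynomial.polynomial.polyval
        return fit(-3.5, coeffs) - fit(3.5, coeffs)

    # Monte Carlo: the spread of the estimate shrinks as SNR increases
    est_low_snr = [mtr_asym_estimate(0.02) for _ in range(300)]
    est_high_snr = [mtr_asym_estimate(0.004) for _ in range(300)]
    ```

    Because the offset grid is symmetric, the symmetric part of the spectrum contributes nothing to the fitted asymmetry; the polynomial's imperfect representation of the narrow APT dip is what biases the estimate low.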

  14. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist.

    PubMed

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen K; Crane, Mark St J; Waltzek, Thomas B; Olesen, Niels J; Gallardo Lagno, Alicia

    2016-02-25

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies-paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

  15. Diagnostic Accuracy and Impact on Management of Ultrasonography-Guided Fine-Needle Aspiration to Detect Axillary Metastasis in Breast Cancer Patients: A Prospective Study

    PubMed Central

    Diaz-Ruiz, María Jesús; Arnau, Anna; Montesinos, Jesus; Miguel, Ana; Culell, Pere; Solernou, Lluis; Tortajada, Lidia; Vergara, Carmen; Yanguas, Carlos; Salvador-Tarrasón, Rafael

    2016-01-01

    Summary Background The axillary nodal status is essential to determine the stage of disease at diagnosis. Our aim was to prospectively assess the diagnostic accuracy of ultrasonography-guided fine-needle aspiration (US-FNA) for the detection of metastasis in axillary lymph nodes in patients with breast cancer (BC) and its impact on the therapeutic decision. Materials and Methods Ultrasonography (US) was performed in 407 axillae of 396 patients who subsequently underwent surgery. US-FNA was conducted when lymph nodes were detected by US. Axillary dissection (AD) was performed when US-FNA was positive for metastasis. Patients with negative US-FNA and breast tumors of 30 mm in size were candidates for selective sentinel lymph node biopsy (SLNB). The anatomopathological results of AD or SLNB were used as reference tests. Results Lymph nodes were detected by US in 207 (50.8%) axillae. Of these, US-FNA was performed on 180 (86.9%). 94 axillae (52.2%) were positive for carcinoma and 79 women received AD. US-FNA had 77.5% sensitivity, 100% specificity, 100% positive predictive value, 69.3% negative predictive value, and 85.1% diagnostic accuracy. US-FNA avoided SLNB in 18.1% of patients who underwent AD. Conclusions Axillary US-FNA is an accurate technique in the staging of patients with BC. It allows reducing the number of SLNB and, when positive, offers a fast and useful tool. PMID:27051394

  16. Deconvolution of u channel magnetometer data: Experimental study of accuracy, resolution, and stability of different inversion methods

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Bowles, Julie A.; Lascu, Ioan; Solheid, Peat

    2010-07-01

    We explore the effects of sampling density, signal/noise ratios, and position-dependent measurement errors on deconvolution calculations for u channel magnetometer data, using a combination of experimental and numerical approaches. Experiments involve a synthetic sample set made by setting hydraulic cement in a 30-cm u channel and slicing the hardened material into ~2-cm lengths, and a natural lake sediment u channel sample. The cement segments can be magnetized and measured individually, and reassembled for continuous u channel measurement and deconvolution; the lake sediment channel was first measured continuously and then sliced into discrete samples for individual measurement. Each continuous data set was deconvolved using the ABIC minimization code of Oda and Shibuya (1996) and two new approaches that we have developed, using singular-value decomposition and regularized least squares. These involve somewhat different methods to stabilize the inverse calculations and different criteria for identifying the optimum solution, but we find in all of our experiments that the three methods converge to essentially identical solutions. Repeat scans in several experiments show that measurement errors are not distributed with position-independent variance; errors in setting/determining the u channel position (standard deviation ~0.2 mm) translate in regions of strong gradients into measurement uncertainties much larger than those due to instrument noise and drift. When we incorporate these depth-dependent measurement uncertainties into the deconvolution calculations, the resulting models show decreased stability and accuracy compared to inversions assuming depth-independent measurement errors. The cement experiments involved varying directions and uniform intensities downcore, and very good accuracy was obtained using all of the methods when the signal/noise ratio was greater than a few hundred and the sampling interval no larger than half the length scale of
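
    Regularized least-squares deconvolution of the kind described above can be sketched by building the convolution matrix for an assumed sensor-response kernel and penalizing solution curvature (Tikhonov regularization with a second-difference operator). The Gaussian kernel, step signal, and regularization weight below are illustrative, not the instrument's actual response:

    ```python
    import numpy as np

    def deconvolve_tikhonov(measured, kernel, lam):
        """Regularized least squares: argmin ||K s - m||^2 + lam ||D2 s||^2,
        where K applies the (symmetric) sensor response and D2 penalizes
        curvature, stabilizing the ill-conditioned inversion."""
        n = measured.size
        half = kernel.size // 2
        K = np.zeros((n, n))                       # same-size convolution matrix
        for i in range(n):
            for j, w in enumerate(kernel):
                col = i + j - half
                if 0 <= col < n:
                    K[i, col] += w
        D2 = np.diff(np.eye(n), n=2, axis=0)       # second-difference operator
        lhs = K.T @ K + lam * (D2.T @ D2)
        return np.linalg.solve(lhs, K.T @ measured)

    rng = np.random.default_rng(1)
    x = np.arange(80)
    true_signal = np.where((x > 25) & (x < 55), 1.0, 0.0)   # magnetized interval
    kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
    kernel /= kernel.sum()                                   # assumed response
    measured = np.convolve(true_signal, kernel, mode="same")
    measured += rng.normal(0, 0.01, measured.size)           # instrument noise
    estimate = deconvolve_tikhonov(measured, kernel, lam=1e-3)
    ```

    The weight `lam` plays the role of the stabilization criterion discussed above: too small and noise is amplified by the near-singular inversion, too large and real downcore variation is smoothed away.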

  17. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist

    USGS Publications Warehouse

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A.; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen; St. J. Crane, Mark; Waltzek, Thomas B.; Olesen, Niels J; Lagno, Alicia Gallardo

    2016-01-01

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies—paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

  18. Added value of cost-utility analysis in simple diagnostic studies of accuracy: (18)F-fluoromethylcholine PET/CT in prostate cancer staging.

    PubMed

    Gerke, Oke; Poulsen, Mads H; Høilund-Carlsen, Poul Flemming

    2015-01-01

    Diagnostic studies of accuracy targeting sensitivity and specificity are commonly done in a paired design in which all modalities are applied in each patient, whereas cost-effectiveness and cost-utility analyses are usually assessed either directly alongside randomized controlled trials (RCTs) or indirectly by means of stochastic modeling based on larger RCTs. However, the conduct of RCTs is hampered in an environment such as ours, in which technology is rapidly evolving; as such, relatively few RCTs are available. Therefore, we investigated the extent to which paired diagnostic studies of accuracy can also be used to shed light on economic implications when considering a new diagnostic test. We propose a simple decision tree model-based cost-utility analysis of a diagnostic test compared to the current standard procedure and exemplify this approach with published data from lymph node staging of prostate cancer. Average procedure costs were taken from the Danish Diagnosis Related Groups Tariff in 2013, and life expectancy was estimated for an ideal 60-year-old patient based on prostate cancer stage and prostatectomy or radiation and chemotherapy. Quality-adjusted life-years (QALYs) were deduced from the literature, and an incremental cost-effectiveness ratio (ICER) was used to compare lymph node dissection with respective histopathological examination (reference standard) and (18)F-fluoromethylcholine positron emission tomography/computed tomography (FCH-PET/CT). Lower bounds of sensitivity and specificity of FCH-PET/CT were established at which replacing the reference standard with FCH-PET/CT comes with a trade-off between worse effectiveness and lower costs. Compared to the reference standard in a diagnostic accuracy study, any imperfections in the accuracy of a diagnostic test imply that replacing the reference standard generates a loss in effectiveness and utility. We conclude that diagnostic studies of accuracy can be put to a more extensive use
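    The ICER comparison described here reduces to a ratio of cost and effectiveness increments. A minimal sketch with purely hypothetical numbers, not the Danish tariff or QALY figures of the study:

```python
def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Incremental cost-effectiveness ratio: incremental cost per
    incremental unit of effectiveness (here, QALYs)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical: an imaging test that is cheaper but slightly less effective
# than the invasive reference standard (costs in arbitrary currency units).
ratio = icer(cost_new=2_000, effect_new=9.8, cost_ref=8_000, effect_ref=10.0)
# ratio is about 30,000 saved per QALY forgone: a trade-off between lower
# cost and worse effectiveness, as in the paper's lower-bound analysis.
```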

  19. Can Interactive Web-based CAD Tools Improve the Learning of Engineering Drawing? A Case Study

    NASA Astrophysics Data System (ADS)

    Pando Cerra, Pablo; Suárez González, Jesús M.; Busto Parra, Bernardo; Rodríguez Ortiz, Diana; Álvarez Peñín, Pedro I.

    2014-06-01

    Many current Web-based learning environments facilitate the theoretical teaching of a subject, but this may not be sufficient for disciplines that require significant use of graphic methods to solve problems. This research study looks at the use of an environment that can help students learn engineering drawing with Web-based CAD tools, including a self-correction component. A comparative study of 121 students was carried out. The students were divided into two experimental groups using Web-based interactive CAD tools and two control groups using traditional learning tools. A statistical analysis of all the samples was carried out in order to study student behavior during the research and the effectiveness of these self-study tools in the learning process. The results showed that a greater number of students in the experimental groups passed the test and improved their test scores. Therefore, the use of Web-based interactive graphic tools to learn engineering drawing can be considered a significant improvement in the teaching of this kind of academic discipline.

  20. A HTML5 open source tool to conduct studies based on Libet’s clock paradigm

    PubMed Central

    Garaizar, Pablo; Cubillas, Carmelo P.; Matute, Helena

    2016-01-01

    Libet’s clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex, and are rarely explained in sufficient detail as to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet’s clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including the use of two cognitive experiments conducted with voluntary participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3) since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area. PMID:27623167

  1. Study on computer controlled polishing machine with small air bag tool

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Ni, Ying; Yu, Jing-chi

    2007-12-01

    Laser and infrared optical technologies have developed rapidly in recent years. Small aspheric lenses of φ30 to 100 mm, which are commonly used in such optical systems, are in great demand. However, computer-controlled polishing technology for small-batch aspheric lenses is a bottleneck that hinders the development of laser and infrared optical technologies. In this article, the technique of computer controlled optical surfacing (CCOS) was used to solve the problems of batch polishing of aspheric lenses. First, material removal by a computer-controlled small polishing tool is simulated in detail. Then, according to the simulation result, polishing correction is completed by adjusting the tool's dwell-time function. Finally, an accuracy for a 70 mm aspheric lens of 0.45 μm in surface shape and 2.687 nm in roughness is achieved by efficient polishing with our home-made model computer-controlled polishing machine, which has three universal driving shafts. The efficiency of batch manufacturing of small aspheric lenses is thereby remarkably improved.
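    The dwell-time correction underlying CCOS treats predicted removal as the tool influence function (TIF) convolved with the dwell-time map. A one-dimensional sketch; the TIF values and dwell map below are hypothetical, not the machine's measured influence function:

```python
import numpy as np

def predicted_removal(dwell, tif):
    """Material removed along a 1-D scan path: the tool influence function
    (TIF) convolved with the dwell-time map (a discrete CCOS model)."""
    return np.convolve(dwell, tif, mode="same")

# Hypothetical normalized TIF and a uniform dwell map: interior removal is
# uniform, so a flat surface stays flat away from the edges.
tif = np.array([0.25, 0.5, 0.25])
dwell = np.ones(10)
removal = predicted_removal(dwell, tif)
```

    In practice the dwell map is solved for by inverting this convolution against the measured surface-error profile.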

  2. A HTML5 open source tool to conduct studies based on Libet's clock paradigm.

    PubMed

    Garaizar, Pablo; Cubillas, Carmelo P; Matute, Helena

    2016-09-13

    Libet's clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex, and are rarely explained in sufficient detail as to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet's clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including the use of two cognitive experiments conducted with voluntary participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3) since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area.

  3. Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Clark, Robert Lynn

    2014-01-01

    The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization, with multiple dispersed locations located in the United States. The results of the current research study could assist leaders to develop a communication…

  4. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  5. An Entrepreneurial Learning Exercise as a Pedagogical Tool for Teaching CSR: A Peruvian Study

    ERIC Educational Resources Information Center

    Farber, Vanina A.; Prialé, María Angela; Fuchs, Rosa María

    2015-01-01

    This paper reports on an exploratory cross-sectional study of the value of an entrepreneurial learning exercise as a tool for examining the entrepreneurship dimension of corporate social responsibility (CSR). The study used grounded theory to analyse diaries kept by graduate (MBA) students during the "20 Nuevos Soles Project". From the…

  6. The Use of Economic Impact Studies as a Service Learning Tool in Undergraduate Business Programs

    ERIC Educational Resources Information Center

    Misner, John M.

    2004-01-01

    This paper examines the use of community based economic impact studies as service learning tools for undergraduate business programs. Economic impact studies are used to measure the economic benefits of a variety of activities such as community redevelopment, tourism, and expansions of existing facilities for both private and public producers.…

  7. Inertial Measures of Motion for Clinical Biomechanics: Comparative Assessment of Accuracy under Controlled Conditions – Changes in Accuracy over Time

    PubMed Central

    Lebel, Karina; Boissy, Patrick; Hamel, Mathieu; Duval, Christian

    2015-01-01

    Background Interest in 3D inertial motion tracking devices (AHRS) has been growing rapidly among the biomechanical community. Although the convenience of such tracking devices seems to open a whole new world of possibilities for evaluation in clinical biomechanics, their limitations have not been extensively documented. The objectives of this study are: 1) to assess the change in absolute and relative accuracy of multiple units of 3 commercially available AHRS over time; and 2) to identify different sources of errors affecting AHRS accuracy and to document how they may affect the measurements over time. Methods This study used an instrumented Gimbal table on which AHRS modules were carefully attached and put through a series of velocity-controlled sustained motions including 2-minute motion trials (2MT) and 12-minute multiple dynamic phase motion trials (12MDP). Absolute accuracy was assessed by comparison of the AHRS orientation measurements to those of an optical gold standard. Relative accuracy was evaluated using the variation in relative orientation between modules during the trials. Findings Both absolute and relative accuracy decreased over time during 2MT. 12MDP trials showed a significant decrease in accuracy over multiple phases, but accuracy could be enhanced significantly by resetting the reference point and/or compensating for the initial Inertial frame estimation reference for each phase. Interpretation The variation in AHRS accuracy observed between the different systems and with time can be attributed in part to the dynamic estimation error, but also and foremost, to the ability of AHRS units to locate the same Inertial frame. Conclusions Mean accuracies obtained under the Gimbal table sustained conditions of motion suggest that AHRS are promising tools for clinical mobility assessment under constrained conditions of use. However, improvements in magnetic compensation and alignment between AHRS modules are desirable in order for AHRS to reach their

  8. Immersion defectivity study with volume production immersion lithography tool for 45 nm node and below

    NASA Astrophysics Data System (ADS)

    Nakano, Katsushi; Nagaoka, Shiro; Yoshida, Masato; Iriuchijima, Yasuhiro; Fujiwara, Tomoharu; Shiraishi, Kenichi; Owa, Soichi

    2008-03-01

    Volume production of 45 nm node devices using Nikon's S610C immersion lithography tool has started. Defectivity reduction has been key to achieving high yields in volume production with immersion lithography. In this study we evaluate several methods of defectivity reduction. The tools used in our defectivity analysis included a dedicated immersion cluster tool consisting of a Nikon S610C, a volume-production immersion exposure tool with an NA of 1.3, and a resist coater-developer LITHIUS i+ from TEL. In our initial procedure we evaluated defectivity behavior by comparing a topcoat-less resist process to a conventional topcoat process. Because of its simplicity, the topcoat-less resist shows lower defect levels than the topcoat process. In a second study we evaluated defect reduction by introducing the TEL bevel rinse and pre-immersion bevel cleaning techniques. These techniques were shown to successfully reduce defect levels by reducing particles at the wafer bevel region. As the third defect reduction method, two types of tool cleaning processes are shown. Finally, we discuss the overall defectivity behavior at the 45 nm node. To facilitate an understanding of the root cause of the defects, defect source analysis (DSA) was applied to separate the defects into three classes according to their source. DSA revealed that more than 99% of defects relate to material and process, and less than 1% relate to the exposure tool. Material and process optimization through collaborative work between exposure tool vendors, track vendors and material vendors is a key to the success of 45 nm node device manufacturing.

  9. Diagnostic Accuracy of Brain-derived Neurotrophic Factor and Nitric Oxide in Patients with Schizophrenia: A pilot study

    PubMed Central

    Lazarević, Dušan; Ćosić, Vladan; Knežević, Marinela Z.; Djordjević, Vidosava B.; Stojanović, Ivana

    2016-01-01

    Summary Background Brain-derived neurotrophic factor (BDNF) and nitric oxide (NO) play multiple roles in the developing and adult CNS. Since BDNF and NO metabolisms are dysregulated in schizophrenia, we measured these markers simultaneously in the blood of schizophrenics and assessed their diagnostic accuracy. Methods Thirty-eight patients with schizophrenia, classified according to demographic characteristics, symptomatology and therapy, and 39 age- and gender-matched healthy controls were enrolled. BDNF was determined by the ELISA technique, while the concentration of nitrite/nitrate (NO2−/NO3−) was measured by the colorimetric method. Results Serum BDNF levels were significantly lower (20.38±3.73 ng/mL, P = 1.339E-05), whilst plasma NO2−/NO3− concentrations were significantly higher (84.3 (72–121) μmol/L, P=4.357E-08) in patients with schizophrenia than in healthy controls (25.65±4.32 ng/mL; 60.9 (50–76) μmol/L, respectively). The lowest BDNF value (18.14±3.26 ng/mL) and the highest NO2−/NO3− concentration (115.3 (80–138) μmol/L) were found in patients treated with second-generation antipsychotics (SGA). Patients who fell ill before the age of 24, and patients who had been ill for up to one year, had significantly lower serum BDNF levels than those who fell ill after the age of 24 and those who had been ill for longer than one year. Both BDNF and NO2−/NO3− showed good diagnostic accuracy, but BDNF had better ROC curve characteristics, especially in patients with negative symptomatology. Conclusions BDNF and nitrite/nitrate showed inverse changes in schizophrenic patients. The most pronounced changes were found in patients treated with second-generation antipsychotics. Although BDNF is not specific to schizophrenia, it may be a clinically useful biomarker for the diagnosis of patients expressing predominantly negative symptoms. PMID:28356859

  10. SU-E-J-147: Monte Carlo Study of the Precision and Accuracy of Proton CT Reconstructed Relative Stopping Power Maps

    SciTech Connect

    Dedes, G; Asano, Y; Parodi, K; Arbor, N; Dauvergne, D; Testa, E; Letang, J; Rit, S

    2015-06-15

    Purpose: The quantification of the intrinsic performance of proton computed tomography (pCT) as a modality for treatment planning in proton therapy. The performance of an ideal pCT scanner is studied as a function of various parameters. Methods: Using GATE/Geant4, we simulated an ideal pCT scanner and scans of several cylindrical phantoms with various tissue equivalent inserts of different sizes. Insert materials were selected in order to be of clinical relevance. Tomographic images were reconstructed using a filtered backprojection algorithm taking into account the scattering of protons in the phantom. To quantify the performance of the ideal pCT scanner, we study the precision and accuracy with respect to the theoretical relative stopping power ratio (RSP) values for different beam energies, imaging doses, insert sizes and detector positions. The planning range uncertainty resulting from the reconstructed RSP is also assessed by comparison with the range of the protons in the analytically simulated phantoms. Results: The results indicate that pCT can intrinsically achieve RSP resolution below 1% for most examined tissues at beam energies below 300 MeV and imaging doses around 1 mGy. RSP map accuracy of better than 0.5% is observed for most tissue types within the studied dose range (0.2–1.5 mGy). Finally, the uncertainty in the proton range due to the accuracy of the reconstructed RSP map is well below 1%. Conclusion: This work explores the intrinsic performance of pCT as an imaging modality for proton treatment planning. The obtained results show that under ideal conditions, 3D RSP maps can be reconstructed with an accuracy better than 1%. Hence, pCT is a promising candidate for reducing the range uncertainties introduced by the use of X-ray CT along with a semiempirical calibration to RSP. Supported by the DFG Cluster of Excellence Munich-Centre for Advanced Photonics (MAP)

  11. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies

    PubMed Central

    Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study the viral diversity of animals. Metagenomics can be considered to have three main steps: sample collection and preparation, sequencing, and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for the experimental design and analysis of viral metagenomes. MetLab supports the design of metagenomics experiments by estimating the sequencing depth needed for complete coverage of a species. This is achieved by calculating the probability of coverage using an adaptation of Stevens' theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including quality control, assembly and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy-to-use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark against other existing software, with emphasis on the detection of viruses as well as the speed of the applications. MetLab is packaged as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078
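    The coverage probability alluded to can be illustrated with Stevens' classical covering formula: the probability that n arcs of fractional length a, placed uniformly at random on a circle, cover it completely. This is a generic sketch under the idealized assumption of equal-length reads on a circular genome, not MetLab's actual adaptation:

```python
from math import comb, floor

def stevens_coverage(n_reads, read_frac):
    """Probability that n_reads arcs, each covering fraction read_frac of a
    circular genome and placed uniformly at random, cover it completely
    (Stevens' theorem): sum over k of (-1)^k C(n,k) (1 - k a)^(n-1)."""
    a = read_frac
    return sum((-1) ** k * comb(n_reads, k) * (1 - k * a) ** (n_reads - 1)
               for k in range(floor(1 / a) + 1))

# Sequencing depth giving a 99% probability of full coverage, if each read
# spans 1% of the genome:
n = 1
while stevens_coverage(n, 0.01) < 0.99:
    n += 1
```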

  12. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) development has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (JAVA, C++, Python etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user-friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.

  13. Introducing a Novel Applicant Ranking Tool to Predict Future Resident Performance: A Pilot Study.

    PubMed

    Bowe, Sarah N; Weitzel, Erik K; Hannah, William N; Fitzgerald, Brian M; Kraus, Gregory P; Nagy, Christopher J; Harrison, Stephen A

    2017-01-01

    The purposes of this study are to (1) introduce our novel Applicant Ranking Tool that aligns with the Accreditation Council for Graduate Medical Education competencies and (2) share our preliminary results comparing applicant rank to current performance. After a thorough literature review and multiple roundtable discussions, an Applicant Ranking Tool was created. Feasibility, satisfaction, and critiques were discussed via an open feedback session. Inter-rater reliability was assessed using the weighted kappa statistic (κ) and the Kendall coefficient of concordance (W). Fisher's exact tests evaluated the ability of the tool to stratify performance into the top or bottom half of the class. Internal medicine and anesthesiology residents served as the pilot cohorts. The tool was considered user-friendly for both data input and analysis. Inter-rater reliability was strongest with intradisciplinary evaluation (W = 0.8-0.975). Resident performance was successfully stratified into those functioning in the upper vs. lower half of their class within the Clinical Anesthesia-3 grouping (p = 0.008). This novel Applicant Ranking Tool lends support for the use of both cognitive and noncognitive traits in predicting resident performance. While it will take years to determine whether this instrument can accurately predict future resident performance, this pilot study suggests the instrument is worthy of ongoing investigation.
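    The Kendall coefficient of concordance W used for the inter-rater reliability assessment can be computed directly from rank totals. A minimal sketch for the no-ties case; the rankings below are hypothetical, not the study's data:

```python
def kendalls_w(rankings):
    """Kendall's coefficient of concordance W for m raters each ranking the
    same n items (no ties): W = 12 S / (m^2 (n^3 - n))."""
    m, n = len(rankings), len(rankings[0])
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Perfect agreement between two hypothetical raters gives W = 1;
# exactly opposite rankings give W = 0.
w_agree = kendalls_w([[1, 2, 3], [1, 2, 3]])
w_oppose = kendalls_w([[1, 2, 3], [3, 2, 1]])
```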

  14. Hermite finite elements for high accuracy electromagnetic field calculations: A case study of homogeneous and inhomogeneous waveguides

    NASA Astrophysics Data System (ADS)

    Boucher, C. R.; Li, Zehao; Ahheng, C. I.; Albrecht, J. D.; Ram-Mohan, L. R.

    2016-04-01

    Maxwell's vector field equations and their numerical solution represent significant challenges for physical domains with complex geometries. There are several limitations in the presently prevalent approaches to the calculation of field distributions in physical domains, in particular, with the vector finite elements. In order to quantify and resolve issues, we consider the modeling of the field equations for the prototypical examples of waveguides. We employ the finite element method with a new set of Hermite interpolation polynomials derived recently by us using group theoretic considerations. We show that (i) the approach presented here yields better accuracy by several orders of magnitude, with a smoother representation of fields than the vector finite elements for waveguide calculations. (ii) This method does not generate any spurious solutions that plague Lagrange finite elements, even though the C1-continuous Hermite polynomials are also scalar in nature. (iii) We present solutions for propagating modes in inhomogeneous waveguides satisfying dispersion relations that can be derived directly, and investigate their behavior as the ratio of dielectric constants is varied both theoretically and numerically. Additional comparisons and advantages of the proposed method are detailed in this article. The Hermite interpolation polynomials are shown to provide a robust, accurate, and efficient means of solving Maxwell's equations in a variety of media, potentially offering a computationally inexpensive means of designing devices for optoelectronics and plasmonics of increasing complexity.

  15. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    PubMed Central

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using two different scoring methods, we examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children aged 3;11 to 5;8 participated – 17 children with SLI and 17 typically developing children matched for age (TD-A children). Children completed real-word and nonword repetition tasks. The capacity of real-word and nonword repetition tasks to discriminate children with SLI from TD-A children was examined through binary logistic regression and receiver operating characteristic curves. Results: Both real-word and nonword repetition showed good (or excellent) sensitivity and specificity in distinguishing children with SLI from their typically developing peers. Conclusions: Nonword repetition appears to be a useful diagnostic indicator for Italian, as in other languages. In addition, real-word repetition also holds promise. The contributions of each type of measure are discussed. PMID:22761319
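    The sensitivity and specificity underlying such diagnostic accuracy analyses come straight from the 2x2 confusion table. A minimal sketch with hypothetical counts, not the study's actual results:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 table of an index test
    against the reference diagnosis."""
    sensitivity = tp / (tp + fn)  # affected children correctly identified
    specificity = tn / (tn + fp)  # typically developing children correctly cleared
    return sensitivity, specificity

# Hypothetical counts for a 17 + 17 sample of this size: 15 of 17 SLI
# children flagged by the task, 16 of 17 TD children cleared.
sens, spec = diagnostic_accuracy(tp=15, fp=1, fn=2, tn=16)
```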

  16. Cognitive Abilities Underlying Reading Accuracy, Fluency and Spelling Acquisition in Korean Hangul Learners from Grades 1 to 4: A Cross-Sectional Study.

    PubMed

    Park, Hyun-Rin; Uno, Akira

    2015-08-01

    The purpose of this cross-sectional study was to examine the cognitive abilities that predict reading and spelling performance in Korean children in Grades 1 to 4, depending on expertise and reading experience. As a result, visual cognition, phonological awareness, naming speed and receptive vocabulary significantly predicted reading accuracy in children in Grades 1 and 2, whereas visual cognition, phonological awareness and rapid naming speed did not predict reading accuracy in children in higher grades. For reading fluency, phonological awareness, rapid naming speed and receptive vocabulary were crucial abilities in children in Grades 1 to 3, whereas phonological awareness was not a significant predictor in children in Grade 4. In spelling, reading ability and receptive vocabulary were the most important abilities for accurate Hangul spelling. The results suggested that the degree of cognitive abilities required for reading and spelling changed depending on expertise and reading experience.

  17. The Relationship Between Accuracy of Numerical Magnitude Comparisons and Children’s Arithmetic Ability: A Study in Iranian Primary School Children

    PubMed Central

    Tavakoli, Hamdollah Manzari

    2016-01-01

    The relationship between children’s accuracy during numerical magnitude comparisons and arithmetic ability has been investigated by many researchers. Contradictory results have been reported from these studies due to the use of many different tasks and indices to determine the accuracy of numerical magnitude comparisons. In light of this inconsistency among measurement techniques, the present study aimed to investigate this relationship among Iranian second grade children (n = 113) using a pre-established test (known as the Numeracy Screener) to measure numerical magnitude comparison accuracy. The results revealed that both the symbolic and non-symbolic items of the Numeracy Screener significantly correlated with arithmetic ability. However, after controlling for the effect of working memory, processing speed, and long-term memory, only performance on symbolic items accounted for the unique variances in children’s arithmetic ability. Furthermore, while working memory uniquely contributed to arithmetic ability in one- and two-digit arithmetic problem solving, processing speed uniquely explained only the variance in single-digit arithmetic skills, and long-term memory did not contribute any significant additional variance for one-digit or two-digit arithmetic problem solving. PMID:27872667

  18. Numerical accuracy of mean-field calculations in coordinate space

    NASA Astrophysics Data System (ADS)

    Ryssens, W.; Heenen, P.-H.; Bender, M.

    2015-12-01

    Background: Mean-field methods based on an energy density functional (EDF) are powerful tools used to describe many properties of nuclei across the entire nuclear chart. The accuracy required of energies for nuclear physics and astrophysics applications is of the order of 500 keV, and much effort is undertaken to build EDFs that meet this requirement. Purpose: Mean-field calculations have to be accurate enough to preserve the accuracy of the EDF. We study this numerical accuracy in detail for a specific numerical choice of representation for mean-field equations that can accommodate any kind of symmetry breaking. Method: The method that we use is a particular implementation of three-dimensional mesh calculations. Its numerical accuracy is governed by three main factors: the size of the box in which the nucleus is confined, the way numerical derivatives are calculated, and the distance between the points on the mesh. Results: We examine the dependence of the results on these three factors for spherical doubly magic nuclei, neutron-rich 34Ne, the fission barrier of 240Pu, and isotopic chains around Z = 50. Conclusions: Mesh calculations offer the user extensive control over the numerical accuracy of the solution scheme. When appropriate choices for the numerical scheme are made, the achievable accuracy is well below the model uncertainties of mean-field methods.
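    One of the three factors the abstract names, the distance between mesh points, controls the truncation error of finite-difference derivatives. The toy example below illustrates that dependence with a generic three-point stencil on a test function; it is an illustration of the principle, not the actual EDF solver:

    ```python
    import numpy as np

    def second_derivative(f, x, h):
        """Three-point central finite difference for f''(x) with spacing h."""
        return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

    # Test function with a known second derivative: (cos x)'' = -cos x
    exact = -np.cos(1.0)
    for h in (0.1, 0.05, 0.025):
        err = abs(second_derivative(np.cos, 1.0, h) - exact)
        print(f"h = {h:5.3f}  error = {err:.2e}")
    # The error shrinks roughly fourfold per halving of h: the stencil is second order.
    ```

    Higher-order stencils (more mesh points per derivative) trade cost for a faster decay of this truncation error, which is the kind of numerical-scheme choice the paper examines.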

  19. StatXFinder: a web-based self-directed tool that provides appropriate statistical test selection for biomedical researchers in their scientific studies.

    PubMed

    Suner, Aslı; Karakülah, Gökhan; Koşaner, Özgün; Dicle, Oğuz

    2015-01-01

    The improper use of statistical methods is common in analyzing and interpreting research data in biological and medical sciences. The objective of this study was to develop a decision support tool encompassing the commonly used statistical tests in biomedical research by combining and updating the existing decision trees for appropriate statistical test selection. First, the decision trees in textbooks, published articles, and online resources were scrutinized, and a more comprehensive unified tree was devised by integrating 10 distinct decision trees. The questions in the decision steps were also revised, simplified, and enriched with examples. Then, the decision tree was implemented in a web environment and the tool, titled StatXFinder, was developed. Finally, usability and satisfaction questionnaires were administered to users of the tool, and StatXFinder was reorganized in line with the feedback obtained from these questionnaires. StatXFinder provides users with decision support in the selection of 85 distinct parametric and non-parametric statistical tests by posing 44 different yes-no questions. The accuracy rate of the statistical test recommendations obtained by 36 participants on the cases applied was 83.3 % for "difficult" tests and 88.9 % for "easy" tests. The mean system usability score of the tool was 87.43 ± 10.01 (minimum: 70, maximum: 100). No statistically significant difference was observed between total system usability score and participants' attributes (p value > 0.05). The User Satisfaction Questionnaire showed that 97.2 % of the participants appreciated the tool, and almost all of the participants (35 of 36) would recommend it to others. In conclusion, StatXFinder can be utilized as an instructional and guiding tool for biomedical researchers with limited statistics knowledge. StatXFinder is freely available at http://webb.deu.edu.tr/tb/statxfinder.
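    A decision-support tree of this kind can be represented as nested yes-no branches walked until a leaf (a test recommendation) is reached. The toy tree below is a heavily simplified illustration of the data structure, not StatXFinder's actual 44-question tree:

    ```python
    # Simplified yes/no decision tree for statistical test selection
    # (illustrative only; not StatXFinder's real question set).
    TREE = {
        "question": "Comparing two independent groups?",
        "yes": {
            "question": "Is the outcome approximately normally distributed?",
            "yes": "independent-samples t-test",
            "no": "Mann-Whitney U test",
        },
        "no": {
            "question": "Comparing two paired measurements?",
            "yes": "paired t-test (or Wilcoxon signed-rank if non-normal)",
            "no": "consult a fuller tree (ANOVA, correlation, chi-square, ...)",
        },
    }

    def recommend(node, answers):
        """Walk the tree with a sequence of 'yes'/'no' answers until a leaf string."""
        it = iter(answers)
        while isinstance(node, dict):
            node = node[next(it)]
        return node

    print(recommend(TREE, ["yes", "no"]))  # two independent, non-normal groups
    ```

    A web front end such as StatXFinder's essentially renders one `question` at a time and descends the corresponding branch on each answer.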

  20. Writing as a Learning Tool: Integrating Theory and Practice. Studies in Writing, Volume 7.

    ERIC Educational Resources Information Center

    Tynjala, Paivi, Ed.; Mason, Lucia, Ed.; Lonka, Kirsti, Ed.

    This book, the seventh volume in the Studies in Writing International Series on the Research of Learning and Instruction of Writing, is an account of the current state of using writing as a tool for learning. The book presents psychological and educational foundations of the writing across the curriculum movement and describes writing-to-learn…

  1. Computer animation as a tool to study preferences in the cichlid Pelvicachromis taeniatus.

    PubMed

    Baldauf, S A; Kullmann, H; Thünken, T; Winter, S; Bakker, T C M

    2009-08-01

    Four choice experiments were conducted with both sexes of the cichlid Pelvicachromis taeniatus using computer-manipulated stimuli of digital images differing in movement, body shape or colouration. The results show that computer animations can be useful and flexible tools in studying preferences of a cichlid with complex and variable preferences for different visual cues.

  2. Critical Reflection as a Learning Tool for Nurse Supervisors: A Hermeneutic Phenomenological Study

    ERIC Educational Resources Information Center

    Urbas-Llewellyn, Agnes

    2013-01-01

    Critical reflection as a learning tool for nursing supervisors is a complex and multifaceted process not completely understood by healthcare leadership, specifically nurse supervisors. Despite a multitude of research studies on critical reflection, there remains a gap in the literature regarding the perceptions of the individual, the support…

  3. Volunteering in the Digital Age: A Study of Online Collaboration Tools from the Perspective of CSCL

    ERIC Educational Resources Information Center

    Kok, Ayse

    2011-01-01

    There is little evidence to inform education, practice, policy, and research about issues surrounding the use of online collaboration tools for organisational initiatives (Brown & Duguid, 1991; Cook & Brown, 1999), let alone a single study conducted with regard to the volunteering practice of knowledge workers. The underlying…

  4. Handwriting Characteristics among Secondary Students with and without Physical Disabilities: A Study with a Computerized Tool

    ERIC Educational Resources Information Center

    Li-Tsang, Cecilia W. P.; Au, Ricky K. C.; Chan, Michelle H. Y.; Chan, Lily W. L.; Lau, Gloria M. T.; Lo, T. K.; Leung, Howard W. H.

    2011-01-01

    The purpose of the present study was to investigate the handwriting characteristics of secondary school students with and without physical disabilities (PD). With the use of a computerized Chinese Handwriting Assessment Tool (CHAT), it was made possible to objectively assess and analyze in detail the handwriting characteristics of individual…

  5. Music: Artistic Performance or a Therapeutic Tool? A Study on Differences

    ERIC Educational Resources Information Center

    Petersson, Gunnar; Nystrom, Maria

    2011-01-01

    The aim of this study is to analyze and describe how musicians who are also music therapy students separate music as artistic performance from music as a therapeutic tool. The data consist of 18 written reflections from music therapy students that were analyzed according to a phenomenographic method. The findings are presented as four…

  6. Basins and Wepp Climate Assessment Tools (Cat): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    This final report supports application of two recently developed...

  7. Study Abroad Programs as Tools of Internationalization: Which Factors Influence Hungarian Business Students to Participate?

    ERIC Educational Resources Information Center

    Huják, Janka

    2015-01-01

    The internationalization of higher education has been on the agenda for decades now all over the world. Study abroad programs are undoubtedly tools of the internationalization endeavors. The ERASMUS Student Mobility Program is one of the flagships of the European Union's educational exchange programs implicitly aiming for the internationalization…

  8. A Study of Turnitin as an Educational Tool in Student Dissertations

    ERIC Educational Resources Information Center

    Biggam, John; McCann, Margaret

    2010-01-01

    Purpose: This paper explores the use of Turnitin as a learning tool (particularly in relation to citing sources and paraphrasing) and as a vehicle for reducing incidences of plagiarism. Design/methodology/approach: The research was implemented using a case study of 49 final-year "honours" undergraduate students undertaking their year-long core…

  9. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  10. The Life Story Board: A Feasibility Study of a Visual Interview Tool for School Counsellors

    ERIC Educational Resources Information Center

    Chase, Robert M.; Medina, Maria Fernanda; Mignone, Javier

    2012-01-01

    The article describes the findings of a pilot study of the Life Story Board (LSB), a novel visual information system with a play board and sets of magnetic cards designed to be a practical clinical tool for counsellors, therapists, and researchers. The LSB is similar to a multidimensional genogram, and serves as a platform to depict personal…

  11. Google Translate as a Supplementary Tool for Learning Malay: A Case Study at Universiti Sains Malaysia

    ERIC Educational Resources Information Center

    Bahri, Hossein; Mahadi, Tengku Sepora Tengku

    2016-01-01

    The present paper examines the use of Google Translate as a supplementary tool for helping international students at Universiti Sains Malaysia (USM) to learn and develop their knowledge and skills in learning Bahasa Malaysia (Malay Language). The participants of the study were 16 international students at the School of Languages, Literacies, and…

  12. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  13. The Personal Study Program as a Tool for Career Planning and Personalization of Adult Learning.

    ERIC Educational Resources Information Center

    Onnismaa, Jussi

    2003-01-01

    The personal study program (PSP) can be defined as a tool for the successful accomplishment of vocational adult training. The program defines the objectives of education and training and the best means of achieving these. Through counseling interaction, the adult learner may redefine his goals and relation to a future profession and so revise his…

  14. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  15. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear-surface regions that affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  16. Improving the accuracies of bathymetric models based on multiple regression for calibration (case study: Sarca River, Italy)

    NASA Astrophysics Data System (ADS)

    Niroumand-Jadidi, Milad; Vitti, Alfonso

    2016-10-01

    Optical imagery has the potential for extraction of spatially and temporally explicit bathymetric information in inland and coastal waters. Lyzenga's model and optimal band ratio analysis (OBRA) are the main bathymetric models; both provide linear relations with water depth. The former model is sensitive, and the latter quite robust, to substrate variability. Simple regression is the widely used approach for calibrating bathymetric models, whether Lyzenga's model or the OBRA model. In this research, multiple regression is examined for empirical calibration of the models in order to take advantage of all spectral channels of the imagery. This method is applied to both Lyzenga's model and the OBRA model for the bathymetry of a shallow Alpine river in Italy, using WorldView-2 (WV-2) and GeoEye images. In-situ depths were recorded using RTK GPS in two reaches. One half of the data is used for calibration of the models and the remaining half as independent check-points for accuracy assessment. In addition, a radiative transfer model is used to simulate a set of spectra over a range of depths, substrate types, and water column properties. The simulated spectra are convolved to the sensors' spectral bands for further bathymetric analysis. From the simulated spectra, it is concluded that multiple regression improves the robustness of Lyzenga's model with respect to substrate variability. The improvements from multiple regression are much more pronounced for Lyzenga's model than for the OBRA model. This is in line with findings from real imagery; for instance, multiple regression applied for calibration of the Lyzenga's and OBRA models demonstrated 22% and 9% higher determination coefficients (R2), respectively, as well as 3 cm and 1 cm better RMSEs compared to simple regression using the WV-2 image.
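    The calibration contrast the abstract describes, one log-transformed band (simple regression) versus all bands jointly (multiple regression), can be sketched as follows. The predictors below are synthetic stand-ins for log-band radiances, not the Sarca River imagery; the sketch only illustrates why a nested multiple-regression fit cannot have a lower in-sample R² than the single-band fit:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200  # synthetic "in-situ" calibration points (not the study's RTK GPS data)

    # Lyzenga-style assumption: depth is linear in log-transformed band radiances.
    X_log = rng.normal(size=(n, 4))                 # e.g. 4 log-band predictors
    true_w = np.array([1.2, -0.8, 0.5, 0.3])        # hypothetical band weights
    depth = X_log @ true_w + rng.normal(scale=0.2, size=n)

    def fit_r2(X, y):
        """OLS with intercept; return in-sample R^2."""
        A = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    r2_simple = fit_r2(X_log[:, :1], depth)   # single-band calibration
    r2_multi = fit_r2(X_log, depth)           # all bands jointly
    print(f"simple R2 = {r2_simple:.3f}, multiple R2 = {r2_multi:.3f}")
    ```

    As in the study, an honest comparison would evaluate both fits on held-out check-points rather than on the calibration half, since in-sample R² always favors the larger model.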

  17. A comparative study of submicron particle sizing platforms: accuracy, precision and resolution analysis of polydisperse particle size distributions.

    PubMed

    Anderson, Will; Kozak, Darby; Coleman, Victoria A; Jämting, Åsa K; Trau, Matt

    2013-09-01

    The particle size distribution (PSD) of a polydisperse or multimodal system can often be difficult to obtain due to the inherent limitations in established measurement techniques. For this reason, the resolution, accuracy and precision of three new and one established, commercially available and fundamentally different particle size analysis platforms were compared by measuring both individual and a mixed sample of monodisperse, sub-micron (220, 330, and 410 nm - nominal modal size) polystyrene particles. The platforms compared were the qNano Tunable Resistive Pulse Sensor, Nanosight LM10 Particle Tracking Analysis System, the CPS Instruments' UHR24000 Disc Centrifuge, and the routinely used Malvern Zetasizer Nano ZS Dyn