Science.gov

Sample records for accuracy studies tool

  1. Multinomial tree models for assessing the status of the reference in studies of the accuracy of tools for binary classification

    PubMed Central

    Botella, Juan; Huang, Huiling; Suero, Manuel

    2013-01-01

    Studies that evaluate the accuracy of binary classification tools are needed. Such studies provide 2 × 2 cross-classifications of test outcomes against the categories of an unquestionable reference (or gold standard). However, a reference of suboptimal reliability is sometimes employed. Several methods have been proposed to deal with studies where the observations are cross-classified with an imperfect reference. These methods require that the status of the reference, as a gold standard or as an imperfect reference, be known. In this paper, a procedure is proposed for determining whether it is appropriate to maintain the assumption that the reference is a gold standard or to treat it as imperfect. The procedure fits two nested multinomial tree models, and assesses and compares their absolute and incremental fit. Its implementation requires the results of several independent studies, carried out with similar designs, that provide frequencies of cross-classification between a test and the reference under investigation. The procedure is applied in two examples with real data. PMID:24106484
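
    Comparisons between nested multinomial models of this kind are conventionally made with a likelihood-ratio (G²) statistic. A minimal sketch, with hypothetical 2 × 2 frequencies and purely illustrative expected counts (a real analysis would obtain the expected counts by maximum likelihood under the tree model's constraints):

```python
import math
import numpy as np

def g_squared(observed, expected):
    """Likelihood-ratio statistic G^2 = 2 * sum O * ln(O / E)."""
    o = np.asarray(observed, dtype=float)
    e = np.asarray(expected, dtype=float)
    mask = o > 0  # cells with O = 0 contribute nothing to the sum
    return float(2.0 * np.sum(o[mask] * np.log(o[mask] / e[mask])))

# Hypothetical 2x2 cross-classification: test (+/-) vs. reference (+/-).
observed = [[80, 10],
            [15, 95]]

# Expected counts under a restricted (nested) model -- illustrative values
# only, standing in for a fitted multinomial tree model.
expected = [[75.0, 15.0],
            [20.0, 90.0]]

g2 = g_squared(observed, expected)
# With a one-parameter difference between the nested models (df = 1), the
# chi-square survival function has the closed form erfc(sqrt(x / 2)).
p = math.erfc(math.sqrt(g2 / 2.0))
print(f"G^2 = {g2:.2f}, p = {p:.4f}")
```

    A small G² (large p) would favor retaining the simpler model, i.e. the gold-standard assumption.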

  2. EOS mapping accuracy study

    NASA Technical Reports Server (NTRS)

    Forrest, R. B.; Eppes, T. A.; Ouellette, R. J.

    1973-01-01

    Studies were performed to evaluate various image positioning methods for possible use in the earth observatory satellite (EOS) program and other earth resource imaging satellite programs. The primary goal is the generation of geometrically corrected and registered images, positioned with respect to the earth's surface. The EOS sensors which were considered were the thematic mapper, the return beam vidicon camera, and the high resolution pointable imager. The image positioning methods evaluated consisted of various combinations of satellite data and ground control points. It was concluded that EOS attitude control system design must be considered as a part of the image positioning problem for EOS, along with image sensor design and ground image processing system design. Study results show that, with suitable efficiency for ground control point selection and matching activities during data processing, extensive reliance should be placed on use of ground control points for positioning the images obtained from EOS and similar programs.

  3. Wind Prediction Accuracy for Air Traffic Management Decision Support Tools

    NASA Technical Reports Server (NTRS)

    Cole, Rod; Green, Steve; Jardin, Matt; Schwartz, Barry; Benjamin, Stan

    2000-01-01

    The performance of Air Traffic Management (ATM) and flight deck decision support tools depends in large part on the accuracy of the supporting 4D trajectory predictions. This is particularly relevant to conflict prediction and active advisories for the resolution of conflicts and conformance with traffic-flow management flow-rate constraints (e.g., arrival metering / required time of arrival). Flight test results have indicated that wind prediction errors may represent the largest source of trajectory prediction error. The tests also discovered that relatively large errors (e.g., greater than 20 knots), existing in pockets of space and time critical to ATM DST performance (one or more sectors, greater than 20 minutes), are inadequately represented by the classic RMS aggregate prediction-accuracy studies of the past. To facilitate the identification and reduction of DST-critical wind-prediction errors, NASA has led a collaborative research and development activity with MIT Lincoln Laboratory and the Forecast Systems Lab of the National Oceanic and Atmospheric Administration (NOAA). This activity, begun in 1996, has focused on the development of key metrics for ATM DST performance, assessment of wind-prediction skill for state-of-the-art systems, and development/validation of system enhancements to improve skill. A 13-month study was conducted for the Denver Center airspace in 1997. Two complementary wind-prediction systems were analyzed and compared to the forecast performance of the then-standard 60-km Rapid Update Cycle - version 1 (RUC-1). One system, developed by NOAA, was the prototype 40-km RUC-2 that became operational at NCEP in 1999. RUC-2 introduced a faster cycle (1 hr vs. 3 hr) and improved mesoscale physics. The second system, Augmented Winds (AW), is a prototype en route wind application developed by MITLL based on the Integrated Terminal Wind System (ITWS). AW is run at a local facility (Center) level, and updates RUC predictions based on an

  4. Effect of Flexural Rigidity of Tool on Machining Accuracy during Microgrooving by Ultrasonic Vibration Cutting Method

    NASA Astrophysics Data System (ADS)

    Furusawa, Toshiaki

    2010-12-01

    It is necessary to form fine holes and grooves by machining in the manufacture of equipment in the medical and information fields, and the establishment of such a machining technology is required. In micromachining, the use of the ultrasonic vibration cutting method is expected and examined. In this study, I experimentally form microgrooves in stainless steel SUS304 by the ultrasonic vibration cutting method and examine the effects of the shape and material of the tool on the machining accuracy. As a result, the following are clarified. Evaluation of the machining accuracy in terms of the straightness of the finished surface revealed that the tools have an optimal rake angle, related to the increase in cutting resistance that results from increases in work hardening and the cutting area. The straightness is improved by using a tool with low flexural rigidity. In particular, the Young's modulus of the tool affects the cutting accuracy more significantly than its shape.

  5. Machine tool accuracy characterization workshops. Final report, May 5, 1992--November 5, 1993

    SciTech Connect

    1995-01-06

    The ability to assess the accuracy of machine tools is required by both tool builders and users. Builders must have this ability in order to predict the accuracy capability of a machine tool for different part geometries, to provide verifiable accuracy information for sales purposes, and to locate error sources for maintenance, troubleshooting, and design enhancement. Users require the same ability in order to make intelligent choices in selecting or procuring machine tools, to predict component manufacturing accuracy, and to perform maintenance and troubleshooting. In both instances, the ability to fully evaluate the accuracy capabilities of a machine tool and the source of its limitations is essential for using the tool to its maximum accuracy and productivity potential. This project was designed to transfer expertise in modern machine tool accuracy testing methods from LLNL to US industry, and to educate users on the use and application of emerging standards for machine tool performance testing.

  6. The Accuracy of iOS Device-based uHear as a Screening Tool for Hearing Loss: A Preliminary Study From the Middle East

    PubMed Central

    Al-Abri, Rashid; Al-Balushi, Mustafa; Kolethekkat, Arif; Bhargava, Deepa; Al-Alwi, Amna; Al-Bahlani, Hana; Al-Garadi, Manal

    2016-01-01

    Objectives To determine and explore the potential use of uHear as a screening test for determining hearing disability by evaluating its accuracy in a clinical setting and a soundproof booth when compared to the gold standard conventional audiometry.   Methods Seventy Sultan Qaboos University students above the age of 17 years who had normal hearing were recruited for the study. They underwent a hearing test using conventional audiometry in a soundproof room, a self-administered uHear evaluation in a side room resembling a clinic setting, and a self-administered uHear test in a soundproof booth. The mean pure tone average (PTA) of thresholds at 500, 1000, 2000 and 4000 Hz for all the three test modalities was calculated, compared, and analyzed statistically.   Results There were 36 male and 34 female students in the study. The PTA with conventional audiometry ranged from 1 to 21 dB across left and right ears. The PTA using uHear in the side room for the same participants was 25 dB in the right ear and 28 dB in the left ear (3–54 dB across all ears). The PTA for uHear in the soundproof booth was 18 dB and 17 dB (1–43 dB) in the right and left ears, respectively. Twenty-three percent of participants were reported to have a mild hearing impairment (PTA > 25 dB) using the soundproof uHear test, and this number was 64% for the same test in the side room. For the same group, only 3% of participants were reported to have a moderate hearing impairment (PTA > 40 dB) using the uHear test in a soundproof booth, and 13% in the side room.   Conclusion uHear in any setting lacks specificity in the range of normal hearing and is highly unreliable in giving the exact hearing threshold in clinical settings. However, there is a potential for the use of uHear if it is used to rule out moderate hearing loss, even in a clinical setting, as exemplified by our study. This method needs standardization through further research. PMID:27168926
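
    The PTA used throughout this abstract is simply the mean hearing threshold at the four test frequencies, and the impairment bands follow the stated cut-offs (> 25 dB mild, > 40 dB moderate). A small sketch with hypothetical thresholds (not the study's data):

```python
FREQS = (500, 1000, 2000, 4000)  # Hz, the four frequencies used in the study

def pure_tone_average(thresholds_db):
    """Mean of the hearing thresholds (dB) at 500, 1000, 2000 and 4000 Hz."""
    if len(thresholds_db) != len(FREQS):
        raise ValueError("expected one threshold per test frequency")
    return sum(thresholds_db) / len(thresholds_db)

def classify(pta_db):
    """Impairment bands using the abstract's cut-offs (illustrative only)."""
    if pta_db > 40:
        return "moderate impairment"
    if pta_db > 25:
        return "mild impairment"
    return "normal"

# Hypothetical thresholds for one ear.
pta = pure_tone_average([20, 25, 30, 35])
print(pta, classify(pta))  # 27.5 mild impairment
```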

  7. Comparison of Dimensional Accuracies Using Two Elastomeric Impression Materials in Casting Three-dimensional Tool Marks.

    PubMed

    Wang, Zhen

    2016-05-01

    The purpose of this study was to evaluate two types of impression material frequently used for casting three-dimensional tool marks in China: (i) dental impression material and (ii) special elastomeric impression material for tool mark casting. The two elastomeric impression materials were compared under equal conditions. The parameters measured were dimensional accuracy, the number of air bubbles, the ease of use, and the sharpness and quality of the individual characteristics present on casts. The results showed that the dental impression material had the advantage over the special elastomeric impression material for casting tool marks at crime scenes, as it combined ease of use, dimensional accuracy, sharpness, and high quality. PMID:27122422

  8. Acoustic Radiation Force Impulse Elastography: A Useful Tool for Differential Diagnosis of Thyroid Nodules and Recommending Fine-Needle Aspiration: A Diagnostic Accuracy Study.

    PubMed

    Zhang, Yi-Feng; Xu, Jun-Mei; Xu, Hui-Xiong; Liu, Chang; Bo, Xiao-Wan; Li, Xiao-Long; Guo, Le-Hang; Liu, Bo-Ji; Liu, Lin-Na; Xu, Xiao-Hong

    2015-10-01

    To investigate the diagnostic performance of the combined use of conventional ultrasound (US) and elastography, including conventional strain elastography such as elasticity imaging (EI) and acoustic radiation force impulse (ARFI) elastography, and to evaluate their usefulness in recommending fine-needle aspiration (FNA). A total of 556 pathologically proven thyroid nodules were evaluated by US, EI, and ARFI examinations in this study. Three blinded readers scored the likelihood of malignancy for 4 datasets (ie, US alone, US and EI, US and virtual touch tissue imaging [VTI], and US and virtual touch tissue quantification [VTQ]). The diagnostic performances of the 4 datasets in differentiating malignant from benign thyroid nodules were evaluated. The decision-making changes for FNA recommendation in the indeterminate nodules or the probably benign nodules on conventional US were evaluated after review of elastography. The diagnostic performance in terms of area under the ROC curve did not show any change after adding EI, VTI, or VTQ for analysis, and no differences were found among the readers; however, the specificity and positive predictive value (PPV) improved significantly after adding VTI or VTQ for analysis for the senior reader. For the indeterminate nodules on US that were pathologically benign, VTQ made correct decision-making changes from FNA biopsy to follow-up in a mean of 82.6% of nodules, which was significantly higher than those achieved by EI (46.8%) and VTI (54.4%) (both P < 0.05). With regard to the probably benign nodules on US that were pathologically malignant, EI made the highest rate of correct decision-making changes from follow-up to FNA biopsy, in a mean of 62.6% of nodules (compared with 41.5% for VTQ, P < 0.05). The results indicated that ARFI increases the specificity and PPV in diagnosing thyroid nodules. US combined with VTQ might be helpful in reducing unnecessary FNA for indeterminate nodules on US, whereas US combined with EI is useful to detect the false negative

  9. Application of a Monte Carlo accuracy assessment tool to TDRS and GPS

    NASA Technical Reports Server (NTRS)

    Pavloff, Michael S.

    1994-01-01

    In support of a NASA study on the application of radio interferometry to satellite orbit determination, MITRE developed a simulation tool for assessing interferometric tracking accuracy. Initially, the tool was applied to the problem of determining optimal interferometric station siting for orbit determination of the Tracking and Data Relay Satellite (TDRS). Subsequently, the Orbit Determination Accuracy Estimator (ODAE) was expanded to model the general batch maximum likelihood orbit determination algorithms of the Goddard Trajectory Determination System (GTDS), with measurement types including not only group and phase delay from radio interferometry, but also range, range rate, angular measurements, and satellite-to-satellite measurements. The user of ODAE specifies the statistical properties of error sources, including inherent observable imprecision, atmospheric delays, station location uncertainty, and measurement biases. Upon Monte Carlo simulation of the orbit determination process, ODAE calculates the statistical properties of the error in the satellite state vector and any other parameters for which a solution was obtained in the orbit determination. This paper presents results from ODAE application to two different problems: (1) determination of optimal geometry for interferometric tracking of TDRS, and (2) expected orbit determination accuracy for Global Positioning System (GPS) tracking of low-earth orbit (LEO) satellites. Conclusions about optimal ground station locations for TDRS orbit determination by radio interferometry are presented, and the feasibility of GPS-based tracking for IRIDIUM, a LEO mobile satellite communications (MOBILSATCOM) system, is demonstrated.
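
    The Monte Carlo approach described above, repeatedly simulating the estimation process with randomly drawn error sources and then taking statistics of the resulting state error, can be sketched with a toy linear least-squares problem. All values here are hypothetical and stand in for the GTDS batch maximum-likelihood estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_once(true_state, H, sigma):
    """One Monte Carlo trial: noisy measurements -> least-squares estimate."""
    z = H @ true_state + rng.normal(0.0, sigma, size=H.shape[0])
    est, *_ = np.linalg.lstsq(H, z, rcond=None)
    return est - true_state  # state-vector error for this trial

# Hypothetical linear measurement model (2-element state, 50 measurements).
true_state = np.array([7000.0, 0.5])
H = rng.normal(size=(50, 2))   # measurement partials
sigma = 0.1                    # 1-sigma measurement noise

errors = np.array([simulate_once(true_state, H, sigma) for _ in range(2000)])
print("mean error:", errors.mean(axis=0))     # bias, near zero here
print("1-sigma accuracy:", errors.std(axis=0))
```

    ODAE's real state vector and measurement models are far richer (atmospheric delays, station location errors, biases), but the error-statistics machinery is the same.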

  10. Method for estimating dynamic EM tracking accuracy of surgical navigation tools

    NASA Astrophysics Data System (ADS)

    Nafis, Christopher; Jensen, Vern; Beauregard, Lee; Anderson, Peter

    2006-03-01

    Optical tracking systems have been used for several years in image guided medical procedures. Vendors often state static accuracies of a single retro-reflective sphere or LED. Expensive coordinate measurement machines (CMM) are used to validate the positional accuracy over the specified working volume. Users are interested in the dynamic accuracy of their tools. The configuration of individual sensors into a unique tool, the calibration of the tool tip, and the motion of the tool contribute additional errors. Electromagnetic (EM) tracking systems are considered an enabling technology for many image guided procedures because they are not limited by line-of-sight restrictions, take minimum space in the operating room, and the sensors can be very small. It is often difficult to quantify the accuracy of EM trackers because they can be affected by field distortion from certain metal objects. Many high-accuracy measurement devices can affect the EM measurements being validated. EM Tracker accuracy tends to vary over the working volume and orientation of the sensors. We present several simple methods for estimating the dynamic accuracy of EM tracked tools. We discuss the characteristics of the EM Tracker used in the GE Healthcare family of surgical navigation systems. Results for other tracking systems are included.

  11. Quantifying the prediction accuracy of a 1-D SVAT model at a range of ecosystems in the USA and Australia: evidence towards its use as a tool to study Earth's system interactions

    NASA Astrophysics Data System (ADS)

    Petropoulos, G. P.; North, M. R.; Ireland, G.; Srivastava, P. K.; Rendall, D. V.

    2015-10-01

    This paper describes the validation of the SimSphere SVAT (Soil-Vegetation-Atmosphere Transfer) model conducted at a range of US and Australian ecosystem types. Specific focus was given to examining the model's ability in predicting shortwave incoming solar radiation (Rg), net radiation (Rnet), latent heat (LE), sensible heat (H), air temperature at 1.3 m (Tair 1.3 m) and air temperature at 50 m (Tair 50 m). Model predictions were compared against corresponding in situ measurements acquired for a total of 72 selected days of the year 2011 obtained from eight sites belonging to the AmeriFlux (USA) and OzFlux (Australia) monitoring networks. Selected sites were representative of a variety of environmental, biome and climatic conditions, to allow for the inclusion of contrasting conditions in the model evaluation. Overall, results showed a good agreement between the model predictions and the in situ measurements, particularly so for the Rg, Rnet, Tair 1.3 m and Tair 50 m parameters. The simulated Rg parameter exhibited a root mean square deviation (RMSD) within 25 % of the observed fluxes for 58 of the 72 selected days, whereas an RMSD within ~ 24 % of the observed fluxes was reported for the Rnet parameter for all days of study (RMSD = 58.69 W m-2). A systematic underestimation of Rg and Rnet (mean bias error (MBE) = -19.48 and -16.46 W m-2) was also found. Simulations for the Tair 1.3 m and Tair 50 m showed good agreement with the in situ observations, exhibiting RMSDs of 3.23 and 3.77 °C (within ~ 15 and ~ 18 % of the observed) for all days of analysis, respectively. Comparable, yet slightly less satisfactory simulation accuracies were exhibited for the H and LE parameters (RMSDs = 38.47 and 55.06 W m-2, ~ 34 and ~ 28 % of the observed). Highest simulation accuracies were obtained for the open woodland savannah and mulga woodland sites for most of the compared parameters. 
The Nash-Sutcliffe efficiency index for all parameters ranges from 0.720 to 0.998, suggesting
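
    The fit statistics quoted in this abstract (RMSD, MBE, Nash-Sutcliffe efficiency) have standard definitions; a short sketch with hypothetical values, not the study's data:

```python
import numpy as np

def rmsd(sim, obs):
    """Root mean square deviation between simulated and observed series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def mbe(sim, obs):
    """Mean bias error; negative values indicate systematic underestimation."""
    return float(np.mean(np.asarray(sim, float) - np.asarray(obs, float)))

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 no better than the observed mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

# Hypothetical net-radiation series (W m^-2), illustrative only.
obs = np.array([420.0, 380.0, 300.0, 250.0, 180.0])
sim = np.array([400.0, 365.0, 290.0, 240.0, 175.0])
print(rmsd(sim, obs), mbe(sim, obs), nash_sutcliffe(sim, obs))
```

    The negative MBE in this toy example mirrors the systematic underestimation of Rg and Rnet reported above.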

  12. Accuracy of McMonnies Questionnaire as a Screening Tool for Chinese Ophthalmic Outpatients

    PubMed Central

    Wang, Jiwei; Tang, Zheng; Kang, Mei; Deng, Qinglong; Yu, Jinming

    2016-01-01

    Objective To evaluate the accuracy of the McMonnies questionnaire (MQ) as a screening tool for dry eye (DE) among Chinese ophthalmic outpatients. Methods We recruited 27,718 cases from 94 hospitals (research centers), randomly selected from 45 cities in 23 provinces, from July to November 2013. Only symptomatic outpatients, who were at high risk of DE, were included. Outpatients meeting the criteria filled out questionnaires and then underwent clinical examinations by qualified medical practitioners. We used sensitivity, specificity, diagnostic odds ratio (DOR), and area under the receiver-operating characteristic curve (AUC) to evaluate the accuracy of the questionnaire in the diagnosis of dry eye. Results Across all subjects included in the study, sensitivity, specificity, and DOR were 0.77, 0.86, and 20.6, respectively. The AUC was 0.865 with a 95% CI of (0.861, 0.869). The prevalence of DE among outpatients reporting “constantly” as the frequency of symptoms was over 90%. Scratchiness was a more accurate diagnostic indication than dryness, soreness, grittiness, or burning. Different cut points of the McMonnies Index (MI) score can be used to optimize the screening results. Conclusions The MQ can be an effective screening tool for dry eye, and the MI score can be used to full advantage during the screening process. PMID:27073922
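
    Sensitivity, specificity, and the diagnostic odds ratio all derive from a 2 × 2 table of screening result against diagnosis. A sketch with hypothetical counts chosen only to reproduce the abstract's headline figures (0.77, 0.86, 20.6), not the study's raw data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity and diagnostic odds ratio from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    dor = (tp * tn) / (fp * fn)  # equivalently (sens/(1-sens)) / ((1-spec)/spec)
    return sens, spec, dor

# Hypothetical counts: 100 diseased, 100 non-diseased.
sens, spec, dor = diagnostic_accuracy(tp=77, fp=14, fn=23, tn=86)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} DOR={dor:.1f}")
```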

  13. Placement of the material temperature sensor during measuring the accuracy of CNC machine tools

    NASA Astrophysics Data System (ADS)

    Zhao, Dong-sheng; Jia, Min-qiang; Zhang, Jian; Sun, Lei; Li, Wei-jun

    2013-10-01

    In view of the dispute over the placement of the material temperature sensor when measuring the positional accuracy of a linear axis of a CNC machine tool, this paper presents the method and principle for deciding where to put the sensor. The positional accuracy of the linear axis is one of the most important performance parameters of a machine tool, and it must be measured during setup and inspection. The placement of the material temperature sensor has a great influence on the measurement accuracy. At present, there are two main views on this issue: one is to place the sensor on the table of the machine tool, the other is to place it on the feedback system. This disagreement often confuses measurers and can, in turn, degrade measurement quality. This paper classifies CNC machine tool positional-accuracy measurements according to their purposes and then presents the best sensor placement for each. It also addresses other relevant questions concerning the placement of the material temperature sensor.

  14. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    SciTech Connect

    Debono, Josephine C; Poulos, Ann E

    2015-03-15

    The aim of this study was to first evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies and a quality evaluation tool constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  15. NREL Evaluates Thermal Performance of Uninsulated Walls to Improve Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-03-01

    NREL researchers discover ways to increase accuracy in building energy simulation tools to improve predictions of potential energy savings in homes. Uninsulated walls are typical in older U.S. homes where the wall cavities were not insulated during construction or where the insulating material has settled. Researchers at the National Renewable Energy Laboratory (NREL) are investigating ways to more accurately calculate heat transfer through building enclosures to verify the benefit of energy efficiency upgrades that reduce energy use in older homes. In this study, scientists used computational fluid dynamics (CFD) analysis to calculate the energy loss/gain through building walls and visualize different heat transfer regimes within the uninsulated cavities. The effects of ambient outdoor temperature, the radiative properties of building materials, insulation levels, and the temperature dependence of conduction through framing members were considered. The research showed that the temperature dependence of conduction through framing members dominated the differences between this study and previous results - an effect not accounted for in existing building energy simulation tools. The study provides correlations for the resistance of the uninsulated assemblies that can be implemented into building simulation tools to increase the accuracy of energy use estimates in older homes, which are currently over-predicted.

  16. An evaluation of the accuracy and speed of metagenome analysis tools

    PubMed Central

    Lindgreen, Stinus; Adair, Karen L.; Gardner, Paul P.

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html PMID:26778510

  18. Criteria and tools for objectively analysing the vocal accuracy of a popular song.

    PubMed

    Larrouy-Maestri, Pauline; Morsomme, Dominique

    2014-04-01

    This study aims to validate our method for measuring vocal accuracy in a melodic context. We analysed the popular song 'Happy Birthday' sung by 63 occasional and 14 professional singers using AudioSculpt and OpenMusic (IRCAM, Paris, France). In the evaluation of pitch interval deviation, we replicated the profile of occasional singers described in the literature (the slower the performance, the more accurate it is). Our results also confirm that professional singers sing more accurately than occasional singers, but not when a Western operatic singing technique is involved. These results support the relevance of our method for analysing the vocal accuracy of occasional and professional singers and led us to discuss adaptations to be implemented for analysing the accuracy of operatic voices. PMID:22721558
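
    Pitch and interval deviations of this kind are conventionally expressed in cents (1/100 of an equal-tempered semitone, on a logarithmic frequency scale). A minimal sketch with a hypothetical sung pitch; the paper's own pipeline uses AudioSculpt/OpenMusic, not this code:

```python
import math

def cents(f_sung_hz, f_target_hz):
    """Deviation of a sung pitch from its target, in cents (100 cents = 1 semitone)."""
    return 1200.0 * math.log2(f_sung_hz / f_target_hz)

def interval_cents(f1_hz, f2_hz):
    """Size of the interval between two successive pitches, in cents."""
    return 1200.0 * math.log2(f2_hz / f1_hz)

# Hypothetical example: an A4 sung at 446 Hz against a 440 Hz target.
print(f"{cents(446.0, 440.0):.1f} cents sharp")
# A4 (440 Hz) up to B4 (~493.88 Hz) is roughly a whole tone (~200 cents).
print(f"{interval_cents(440.0, 493.88):.1f} cents")
```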

  19. Technical Highlight: NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools

    SciTech Connect

    Ridouane, E.H.

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes.

  20. Simultaneous nuclear data target accuracy study for innovative fast reactors.

    SciTech Connect

    Aliberti, G.; Palmiotti, G.; Salvatores, M.; Nuclear Engineering Division; INL; CEA Cadarache

    2007-01-01

    The present paper summarizes the major outcomes of a study conducted within a Nuclear Energy Agency Working Party on Evaluation Cooperation (NEA WPEC) initiative aiming to investigate data needs for future innovative nuclear systems, to quantify them and to propose a strategy to meet them. Within the NEA WPEC Subgroup 26 an uncertainty assessment has been carried out using covariance data recently processed by joint efforts of several US and European labs. In general, the uncertainty analysis shows that for the wide selection of fast reactor concepts considered, the present integral parameter uncertainties resulting from the assumed uncertainties on nuclear data are probably acceptable in the early phases of design feasibility studies. However, in the successive phase of preliminary conceptual designs and in later design phases of selected reactor and fuel cycle concepts, there will be the need for improved data and methods, in order to reduce margins, both for economic and safety reasons. It is then important to define priority issues as soon as possible, i.e., which nuclear data (isotope, reaction type, energy range) need improvement, in order to quantify target accuracies and to select a strategy to meet the requirements (e.g., by selected new differential measurements and by the use of integral experiments). In this context one should account for the wide range of high-accuracy integral experiments already performed and available in national or, better, international databases, in order to indicate new integral experiments that will be needed to account for new requirements due to innovative design features, and to provide the necessary full integral database to be used for validation of the design simulation tools.

  1. Dynamics of Complexity and Accuracy: A Longitudinal Case Study of Advanced Untutored Development

    ERIC Educational Resources Information Center

    Polat, Brittany; Kim, Youjin

    2014-01-01

    This longitudinal case study follows a dynamic systems approach to investigate an under-studied research area in second language acquisition, the development of complexity and accuracy for an advanced untutored learner of English. Using the analytical tools of dynamic systems theory (Verspoor et al. 2011) within the framework of complexity,…

  2. Expansion/De-expansion Tool to Quantify the Accuracy of Prostate Contours

    SciTech Connect

    Chung, Eugene; Stenmark, Matthew H.; Evans, Cheryl; Narayana, Vrinda; McLaughlin, Patrick W.

    2012-05-01

    Purpose: Accurate delineation of the prostate gland on computed tomography (CT) remains a persistent challenge and continues to introduce geometric uncertainty into the planning and delivery of external beam radiotherapy. We, therefore, developed an expansion/de-expansion tool to quantify the contour errors and determine the location of the deviations. Methods and Materials: A planning CT scan and magnetic resonance imaging scan were prospectively acquired for 10 patients with prostate cancer. The prostate glands were contoured by 3 independent observers using the CT data sets with instructions to contour the prostate without underestimation but to minimize overestimation. The standard prostate for each patient was defined using magnetic resonance imaging and CT on multiple planes. After registration of the CT and magnetic resonance imaging data sets, the CT-defined prostates were scored for accuracy. The contours were defined as ideal if they were within a 2.5-mm expansion of the standard without underestimation, acceptable if they were within a 5.0-mm expansion and a 2.5-mm de-expansion, and unacceptable if they extended >5.0 mm or underestimated the prostate by >2.5 mm. Results: A total of 636 CT slices were individually analyzed, with the vast majority scored as ideal or acceptable. However, none of the 30 prostate contour sets had all the contours scored as ideal or acceptable. For all 3 observers, the unacceptable contours were more likely from underestimation than overestimation of the prostate. The errors were more common at the base and apex than the mid-gland. Conclusions: The expansion/de-expansion tool allows for directed feedback on the location of contour deviations, as well as the determination of over- or underestimation of the prostate. This metric might help improve the accuracy of prostate contours.
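
    The scoring rule in Methods can be read as a threshold check on each contour's largest over- and underestimation relative to the standard prostate. A sketch of that reading, using the abstract's thresholds; treating "without underestimation" for the ideal score as exactly zero inward deviation is a simplifying assumption:

```python
def score_contour(max_over_mm, max_under_mm):
    """Score one CT contour slice against the standard prostate.

    max_over_mm: largest outward deviation (overestimation), mm.
    max_under_mm: largest inward deviation (underestimation), mm.
    Thresholds follow the abstract's expansion/de-expansion criteria.
    """
    if max_over_mm <= 2.5 and max_under_mm == 0.0:
        return "ideal"        # within 2.5 mm expansion, no underestimation
    if max_over_mm <= 5.0 and max_under_mm <= 2.5:
        return "acceptable"   # within 5.0 mm expansion and 2.5 mm de-expansion
    return "unacceptable"     # > 5.0 mm over or > 2.5 mm under

print(score_contour(2.0, 0.0))  # ideal
print(score_contour(4.0, 1.0))  # acceptable
print(score_contour(6.0, 0.0))  # unacceptable
```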

  3. Air traffic control surveillance accuracy and update rate study

    NASA Technical Reports Server (NTRS)

    Craigie, J. H.; Morrison, D. D.; Zipper, I.

    1973-01-01

    The results of an air traffic control surveillance accuracy and update rate study are presented. The objective of the study was to establish quantitative relationships between the surveillance accuracies, update rates, and the communication load associated with the tactical control of aircraft for conflict resolution. The relationships are established for typical types of aircraft, phases of flight, and types of airspace. Specific cases are analyzed to determine the surveillance accuracies and update rates required to prevent two aircraft from approaching each other too closely.

  4. NREL Evaluates the Thermal Performance of Uninsulated Walls to Improve the Accuracy of Building Energy Simulation Tools (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop models of uninsulated wall assemblies that help to improve the accuracy of building energy simulation tools when modeling potential energy savings in older homes. Researchers at the National Renewable Energy Laboratory (NREL) have developed models for evaluating the thermal performance of walls in existing homes that will improve the accuracy of building energy simulation tools when predicting potential energy savings of existing homes. Uninsulated walls are typical in older homes where the wall cavities were not insulated during construction or where the insulating material has settled. Accurate calculation of heat transfer through building enclosures will help determine the benefit of energy efficiency upgrades in order to reduce energy consumption in older American homes. NREL performed detailed computational fluid dynamics (CFD) analysis to quantify the energy loss/gain through the walls and to visualize different airflow regimes within the uninsulated cavities. The effects of ambient outdoor temperature, radiative properties of building materials, and insulation level were investigated. The study showed that multi-dimensional airflows occur in walls with uninsulated cavities and that the thermal resistance is a function of the outdoor temperature, an effect not accounted for in existing building energy simulation tools. The study quantified the difference between CFD prediction and the approach currently used in building energy simulation tools over a wide range of conditions. For example, researchers found that CFD predicted lower heating loads and slightly higher cooling loads. Implementation of CFD results into building energy simulation tools such as DOE2 and EnergyPlus will likely reduce the predicted heating load of homes. Researchers also determined that a small air gap in a partially insulated cavity can lead to a significant reduction in thermal resistance. For instance, a 4-in. tall air gap

  5. Tool wear studies in fabrication of microchannels in ultrasonic micromachining.

    PubMed

    Cheema, Manjot S; Dvivedi, Akshay; Sharma, Apurbba K

    2015-03-01

    Form accuracy of a machined component is one of the performance indicators of a machining process. Ultrasonic micromachining is one such process in which the form accuracy of the micromachined component significantly depends upon the form stability of the tool. Unlike macromachining, a very small amount of tool wear in micromachining can lead to considerable changes in the form accuracy of the machined component. Appropriate selection of tool material is essential to overcome this problem. The present study discusses the effect of tool material, abrasive size and step feed in the fabrication of microchannels by ultrasonic machining of borosilicate glass. The development of microchannels using ultrasonic micromachining has rarely been reported. It was observed that a tungsten carbide tool provided better form accuracy than the microchannel machined with a stainless steel tool. The tool wear mechanism in both materials is proposed by considering scanning electron micrographs of the tools as evidence. A one-factor-at-a-time approach was used to study the effect of various process parameters. PMID:25465965

  6. On the Hipparcos Accuracy Using Binary Stars as a Calibration Tool

    NASA Astrophysics Data System (ADS)

    Docobo, J. A.; Andrade, M.

    2015-02-01

    Stellar binary systems, specifically those that present the most accurate available orbital elements, are a reliable tool to test the accuracy of astrometric observations. We selected all 35 binaries with these characteristics. Our objective is to provide standard uncertainties for the positions and parallaxes measured by Hipparcos relative to this trustworthy set, as well as to check supposed correlations between several parameters (measurement residuals, positions, magnitudes, and parallaxes). In addition, using the high-confidence subset of visual-spectroscopic binaries, we implemented a validation test of the Hipparcos trigonometric parallaxes of binary systems that allowed the evaluation of their reliability. Standard and non-standard statistical analysis techniques were applied in order to achieve well-founded conclusions. In particular, errors-in-variables models such as the total least-squares method were used to validate Hipparcos parallaxes by comparison with those obtained directly from the orbital elements. Beforehand, we applied Thompson's τ technique in order to detect suspected outliers in the data. Furthermore, several statistical hypothesis tests were carried out to verify if our results were statistically significant. A statistically significant trend indicating larger Hipparcos angular separations with respect to the reference values by 5.2 ± 1.4 mas was found at the 10⁻⁸ significance level. Uncertainties in the polar coordinates θ and ρ of 1.°8 and 6.3 mas, respectively, were estimated for the Hipparcos observations of binary systems. We also verified that the parallaxes of binary systems measured in this mission are absolutely compatible with the set of orbital parallaxes obtained from the most accurate orbits at least at the 95% confidence level. This methodology allows us to better estimate the accuracy of Hipparcos observations of binary systems. Indeed, further application to the data collected by Gaia should yield a standard
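    The errors-in-variables comparison described here can be illustrated with a plain orthogonal (total) least-squares line fit, which allows measurement error in both coordinates, as is appropriate when comparing two measured quantities such as trigonometric and orbital parallaxes. This is a generic sketch of the technique, not the authors' implementation:

```python
import math

def tls_fit(x, y):
    """Orthogonal (total) least-squares line fit y = a + b*x.

    Unlike ordinary least squares, errors are allowed in both variables
    (equal error variances assumed).  Uses the closed-form orthogonal
    regression slope; assumes the cross-moment sxy is nonzero.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b
```

    Two parallax sets in perfect agreement would give slope 1 and intercept 0; a validation test then asks whether the fitted line differs significantly from that identity.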

  7. On the accuracy of Hipparcos using binary stars as a calibration tool

    SciTech Connect

    Docobo, J. A.; Andrade, M. E-mail: manuel.andrade@usc.es

    2015-02-01

    Stellar binary systems, specifically those that present the most accurate available orbital elements, are a reliable tool to test the accuracy of astrometric observations. We selected all 35 binaries with these characteristics. Our objective is to provide standard uncertainties for the positions and parallaxes measured by Hipparcos relative to this trustworthy set, as well as to check supposed correlations between several parameters (measurement residuals, positions, magnitudes, and parallaxes). In addition, using the high-confidence subset of visual–spectroscopic binaries, we implemented a validation test of the Hipparcos trigonometric parallaxes of binary systems that allowed the evaluation of their reliability. Standard and non-standard statistical analysis techniques were applied in order to achieve well-founded conclusions. In particular, errors-in-variables models such as the total least-squares method were used to validate Hipparcos parallaxes by comparison with those obtained directly from the orbital elements. Beforehand, we applied Thompson's τ technique in order to detect suspected outliers in the data. Furthermore, several statistical hypothesis tests were carried out to verify if our results were statistically significant. A statistically significant trend indicating larger Hipparcos angular separations with respect to the reference values by 5.2 ± 1.4 mas was found at the 10⁻⁸ significance level. Uncertainties in the polar coordinates θ and ρ of 1.°8 and 6.3 mas, respectively, were estimated for the Hipparcos observations of binary systems. We also verified that the parallaxes of binary systems measured in this mission are absolutely compatible with the set of orbital parallaxes obtained from the most accurate orbits at least at the 95% confidence level. This methodology allows us to better estimate the accuracy of Hipparcos observations of binary systems. Indeed, further application to the data collected by Gaia should yield a

  8. Cutting tool study: 21-6-9 stainless steel

    SciTech Connect

    McManigle, A.P.

    1992-07-29

    The Rocky Flats Plant conducted a study to test cermet cutting tools by performing machinability studies on War Reserve product under controlled conditions. The purpose of these studies was to determine the most satisfactory tools that optimize tool life, minimize costs, improve reliability and chip control, and increase productivity by performing the operations to specified accuracies. This study tested three manufacturers' cermet cutting tools and a carbide tool used previously by the Rocky Flats Plant for machining spherical-shaped 21-6-9 stainless steel forgings (Figure 1). The 80-degree diamond inserts were tested by experimenting with various chip-breaker geometries, cutting speeds, feedrates, and cermet grades on the outside contour roughing operation. The cermets tested were manufactured by Kennametal, Valenite, and NTK. The carbide tool ordinarily used for this operation is manufactured by Carboloy. Evaluation of the tools was conducted by investigating the number of passes per part and parts per insert, tool wear, cutting time, tool life, surface finish, and stem taper. Benefits to be gained from this study were: improved part quality, better chip control, increased tool life and utilization, and greater fabrication productivity. This was to be accomplished by performing the operation to specified accuracies within the scope of the tools tested.

  10. Embodied Rules in Tool Use: A Tool-Switching Study

    ERIC Educational Resources Information Center

    Beisert, Miriam; Massen, Cristina; Prinz, Wolfgang

    2010-01-01

    In tool use, a transformation rule defines the relation between an operating movement and its distal effect. This rule is determined by the tool structure and requires no explicit definition. The present study investigates how humans represent and apply compatible and incompatible transformation rules in tool use. In Experiment 1, participants had…

  11. High angular accuracy manufacture method of micro v-grooves based on tool alignment by on-machine measurement.

    PubMed

    Zhang, Xiaodong; Jiang, Lili; Zeng, Zhen; Fang, Fengzhou; Liu, Xianlei

    2015-10-19

    Micro v-grooves have found wide application in optics as one of the most important structures. However, their performance is significantly affected by angular geometry accuracy. Diamond cutting is commonly used to fabricate micro v-grooves, but guaranteeing the cutting tool angle remains difficult, limited by the measurement accuracy achievable in the manufacture and mounting of the diamond tool. A cutting tool alignment method based on on-machine measurement is proposed to improve the as-fabricated accuracy of the v-groove angle. An on-machine probe is employed to scan the v-groove geometrical deviation precisely. The system error model, data processing algorithm and tool alignment method are analyzed in detail. Experimental results show that a measurement standard deviation within 0.01° can be achieved. Retro-reflection mirrors are fabricated and measured by the proposed method for verification. PMID:26480443
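    The groove-angle evaluation implied by such on-machine profile scans can be sketched by fitting a line to each flank of the measured profile and taking the included angle between them. The helper functions below are hypothetical; real probe data would first need segmentation into flanks and outlier filtering:

```python
import math

def flank_slope(pts):
    """Ordinary least-squares slope of a set of (x, z) profile points."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    mz = sum(p[1] for p in pts) / n
    sxx = sum((p[0] - mx) ** 2 for p in pts)
    sxz = sum((p[0] - mx) * (p[1] - mz) for p in pts)
    return sxz / sxx

def groove_angle_deg(left_pts, right_pts):
    """Included angle (degrees) of a v-groove from two scanned flanks."""
    a1 = math.atan(flank_slope(left_pts))
    a2 = math.atan(flank_slope(right_pts))
    return math.degrees(abs(a1 - a2))
```

    Comparing the measured included angle against the nominal groove angle gives the angular deviation that the tool alignment step would then correct.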

  12. Does diagnosis affect the predictive accuracy of risk assessment tools for juvenile offenders: Conduct Disorder and Attention Deficit Hyperactivity Disorder.

    PubMed

    Khanna, Dinesh; Shaw, Jenny; Dolan, Mairead; Lennox, Charlotte

    2014-10-01

    Studies have suggested an increased risk of criminality in juveniles if they suffer from co-morbid Attention Deficit Hyperactivity Disorder (ADHD) along with Conduct Disorder. The Structured Assessment of Violence Risk in Youth (SAVRY), the Psychopathy Checklist Youth Version (PCL:YV), and Youth Level of Service/Case Management Inventory (YLS/CMI) have been shown to be good predictors of violent and non-violent re-offending. The aim was to compare the accuracy of these tools to predict violent and non-violent re-offending in young people with co-morbid ADHD and Conduct Disorder and Conduct Disorder only. The sample included 109 White-British adolescent males in secure settings. Results revealed no significant differences between the groups for re-offending. SAVRY factors had better predictive values than PCL:YV or YLS/CMI. Tools generally had better predictive values for the Conduct Disorder only group than the co-morbid group. Possible reasons for these findings have been discussed along with limitations of the study. PMID:25173178
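    The predictive accuracy of risk tools such as the SAVRY is conventionally summarised by the area under the ROC curve (AUC). A minimal rank-based AUC sketch follows; the scores are hypothetical and the study's own statistics are not reproduced here:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve: the probability that a randomly chosen
    re-offender scores higher on the tool than a randomly chosen
    non-re-offender (ties count one half)."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    An AUC of 0.5 means the tool discriminates no better than chance, which is the benchmark against which group differences (co-morbid versus Conduct Disorder only) would be compared.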

  13. "Virtual microscopy" and the internet as telepathology consultation tools: diagnostic accuracy in evaluating melanocytic skin lesions.

    PubMed

    Okada, D H; Binder, S W; Felten, C L; Strauss, J S; Marchevsky, A M

    1999-12-01

    The Internet offers a widely available, inexpensive tool for telepathology consultations. It allows the transfer of image and text files through electronic mail (e-mail) or file transfer protocols (FTP), using a variety of microcomputer platforms. We studied the use of the Internet and "virtual microscopy" tools for the diagnosis of 35 skin biopsies, including a variety of benign and malignant melanocytic lesions. Digitized images from these lesions were obtained at 40x and 100x optical magnification, using a high resolution digital camera (Microlumina, Leaf Systems, Southborough, MA), a light microscope with a phototube adapter and a microcomputer with a Pentium 166 MHz microprocessor. Two to four images of each case were arranged on a "canvas" to represent the majority or an entire biopsy level, using Photoshop software (Adobe Systems Inc., San Jose, CA). The images were compressed using Joint Photographers Expert Group (JPEG) format. The images were then viewed on a computer video monitor in a manner that closely resembles light microscopy, including scrolling by using the "hand tool" of Photoshop and changing magnification digitally up to 4 times without visible image degradation. The image files, ranging in size from 700 kilobytes to 2.1 megabytes (average 1.6 megabytes) were attached to e-mail messages that contained clinical information, using standard Multipurpose Internet Mail Extension (MIME) protocols and sent through the Internet, for interpretation by a dermatopathologist. The consultant could open the images from the e-mail message, using Microsoft Outlook Express (Microsoft Corp., Redmond, WA) and Photoshop software, scroll them, change magnification and render a diagnosis in a manner that closely simulates light microscopy. One hundred percent concordance was obtained between the telepathology and traditional hematoxylin and eosin slide diagnoses. The Internet and relatively inexpensive "virtual microscopy" tools offer a novel technology for

  14. Study of accuracy of precipitation measurements using simulation method

    NASA Astrophysics Data System (ADS)

    Nagy, Zoltán; Lajos, Tamás; Morvai, Krisztián

    2013-04-01

    Precipitation is one of the most important meteorological parameters describing the state of the climate, and accurate measurement of precipitation is essential for extracting correct trends. The problem is that precipitation measurements are affected by systematic errors leading to an underestimation of actual precipitation, and these errors vary by precipitation type and gauge type. It is well known that wind speed is the most important environmental factor contributing to the underestimation of actual precipitation, especially for solid precipitation. To study and correct the errors of precipitation measurements there are two basic possibilities: use of the results and conclusions of international precipitation measurement intercomparisons, or building standard reference gauges (DFIR, pit gauge) and carrying out one's own investigation. In 1999 the Hungarian Meteorological Service attempted its own investigation and built standard reference gauges, but in the case of snow (use of the DFIR) the cost-benefit ratio was very poor: several winters passed without a significant amount of snow, while the condition of the DFIR continuously deteriorated. This motivated a new approach, namely modelling performed by the Department of Fluid Mechanics of the Budapest University of Technology and Economics using the FLUENT 6.2 model. The ANSYS Fluent package is a computational fluid dynamics solution for modelling flow and other related physical phenomena. It provides the tools needed to describe atmospheric processes and to design and optimize new equipment. The CFD package includes solvers that accurately simulate the behaviour of a broad range of flows, from single-phase to multi-phase. The questions we wanted answered are as follows: How do the different types of gauges deform the airflow around themselves? Can a quantitative estimate of the wind-induced error be given? How does the use

  15. Accuracy of ambulatory blood pressure determination: a comparative study.

    PubMed

    Barthélémy, J C; Geyssant, A; Auboyer, C; Antoniadis, A; Berruyer, J; Lacour, J R

    1991-09-01

    This study was designed to discriminate, according to their accuracy, between three ambulatory pressurometers (Diasys 200R, Novacor; P IV, Del Mar Avionics; SpaceLab 90202, SpaceLab). The evaluation was performed against invasive arterial reference measurements. Accuracy was assessed by calculating the error on pressure (EOP) as the difference between invasive and non-invasive measurements of arterial blood pressure. For the systolic values, accuracy (mean of the EOP differences) and uncertainty (SD of these differences) were -0.9 +/- 9.7, -4.3 +/- 10.1 and -16.7 +/- 10.1 mmHg for, respectively, Diasys, P IV and SpaceLab. For diastolic values, they were, respectively, 5.9 +/- 6.7, 6.8 +/- 8.5 and 9.1 +/- 6.6 mmHg. The EOP was then separated into two types of error: (i) the error of dispersion, assessed by the homogeneity index calculated by a Lehmann analysis and leading to a statistical classification; and (ii) the error due to the drift of the EOP with the reference value, this last error being easier to correct. Two different behaviours were observed for the EOP: (i) the drift of the EOP of systolic values was significantly larger for the oscillometric (SpaceLab) than for the auscultatory (Diasys and P IV) method, with no difference between Diasys and P IV; (ii) the homogeneity index was not statistically different among the three devices. These data suggest that, if the drift of the EOP is corrected, there is no statistically significant difference in accuracy between these three pressurometers. However, under our experimental conditions, the two ambulatory pressurometers recording the Korotkoff sounds had better accuracy than the one using the oscillometric approach. PMID:1947731
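    The accuracy and uncertainty metrics used in this abstract are simply the mean and standard deviation of the invasive-minus-noninvasive differences. A short sketch with a hypothetical helper and made-up readings:

```python
import statistics

def eop_stats(invasive, noninvasive):
    """Error on pressure (EOP) per the abstract: accuracy is the mean of
    the invasive-minus-noninvasive differences, uncertainty their sample
    standard deviation."""
    diffs = [i - n for i, n in zip(invasive, noninvasive)]
    return statistics.mean(diffs), statistics.stdev(diffs)
```

    A mean near zero with a small SD indicates an accurate, consistent device; a large mean (like SpaceLab's -16.7 mmHg systolic figure) indicates systematic bias even if the spread is comparable.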

  16. Precision and Accuracy Studies with Kajaani Fiber Length Analyzers

    NASA Astrophysics Data System (ADS)

    Copur, Yalcin; Makkonen, Hannu

    The aim of this study was to test the measurement precision and accuracy of the Kajaani FS-100, giving attention to possible machine error in the measurements. The fiber lengths of pine pulps produced using polysulfide, kraft, biokraft and soda methods were determined using both the FS-100 and FiberLab automated fiber length analyzers, and the measured length values were compared for the two instruments. Measurement precision and accuracy were tested by replicated measurements using rayon staple fibers. Measurements performed on pulp samples showed typical length distributions for both analyzers. Results obtained from the Kajaani FS-100 and FiberLab showed a significant correlation. The shorter length measured with the FiberLab was found to be mainly due to the instrument calibration. The measurement repeatability tested for the Kajaani FS-100 indicated that the measurements are precise.

  17. A Study on the Effect of Input Parameters on Springback Prediction Accuracy

    NASA Astrophysics Data System (ADS)

    Han, Y. S.; Yang, W. H.; Choi, K. Y.; Kim, B. H.

    2011-08-01

    In this study, the input parameters affecting springback prediction accuracy in Pamstamp2G were investigated for a member part using Taguchi's method as a six-sigma tool, on the basis of experiments, in order to obtain more accurate springback predictions. The best combination of input parameters for higher springback prediction accuracy, determined for the member part, was then applied to a fender part. Cracks and wrinkles in the drawing and flanging operations must be eliminated before springback can be predicted accurately. Compensation of springback on the basis of the simulation was then carried out. It is concluded that 95% dimensional accuracy of the springback prediction was achieved in comparison with the tryout panel.

  18. Bias due to composite reference standards in diagnostic accuracy studies.

    PubMed

    Schiller, Ian; van Smeden, Maarten; Hadgu, Alula; Libman, Michael; Reitsma, Johannes B; Dendukuri, Nandini

    2016-04-30

    Composite reference standards (CRSs) have been advocated in diagnostic accuracy studies in the absence of a perfect reference standard. The rationale is that combining results of multiple imperfect tests leads to a more accurate reference than any one test in isolation. Focusing on a CRS that classifies subjects as disease positive if at least one component test is positive, we derive algebraic expressions for sensitivity and specificity of this CRS, sensitivity and specificity of a new (index) test compared with this CRS, as well as the CRS-based prevalence. We use as a motivating example the problem of evaluating a new test for Chlamydia trachomatis, an asymptomatic disease for which no gold-standard test exists. As the number of component tests increases, sensitivity of this CRS increases at the expense of specificity, unless all tests have perfect specificity. Therefore, such a CRS can lead to significantly biased accuracy estimates of the index test. The bias depends on disease prevalence and the accuracy of the CRS. Further, conditional dependence between the CRS and index test can lead to over-estimation of index test accuracy. This commonly used CRS combines results from multiple imperfect tests in a way that ignores information and therefore is not guaranteed to improve over a single imperfect reference unless each component test has perfect specificity, and the CRS is conditionally independent of the index test. When these conditions are not met, as in the case of C. trachomatis testing, more realistic statistical models should be researched instead of relying on such CRSs. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26555849
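    For the any-positive rule analysed here, and under the additional assumption that the component tests are conditionally independent given disease status, the CRS accuracy takes a simple closed form: sensitivity 1 − ∏(1 − Seᵢ) and specificity ∏ Spᵢ. A sketch of that special case (the paper's full derivation is more general):

```python
from math import prod

def crs_any_positive(sens, specs):
    """Sensitivity and specificity of a composite reference standard that
    calls a subject positive if at least one component test is positive,
    assuming the component tests are conditionally independent given
    disease status."""
    se = 1.0 - prod(1.0 - s for s in sens)  # misses only if every test misses
    sp = prod(specs)                        # correct only if every test is negative
    return se, sp
```

    With two tests of 80% sensitivity and 90% specificity each, the CRS reaches 96% sensitivity but only 81% specificity, illustrating the trade-off the authors describe: specificity can only erode as components are added unless every component is perfectly specific.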

  19. Meta-analysis of diagnostic accuracy studies in mental health

    PubMed Central

    Takwoingi, Yemisi; Riley, Richard D; Deeks, Jonathan J

    2015-01-01

    Objectives To explain methods for data synthesis of evidence from diagnostic test accuracy (DTA) studies, and to illustrate different types of analyses that may be performed in a DTA systematic review. Methods We described properties of meta-analytic methods for quantitative synthesis of evidence. We used a DTA review comparing the accuracy of three screening questionnaires for bipolar disorder to illustrate application of the methods for each type of analysis. Results The discriminatory ability of a test is commonly expressed in terms of sensitivity (proportion of those with the condition who test positive) and specificity (proportion of those without the condition who test negative). There is a trade-off between sensitivity and specificity, as an increasing threshold for defining test positivity will decrease sensitivity and increase specificity. Methods recommended for meta-analysis of DTA studies, such as the bivariate or hierarchical summary receiver operating characteristic (HSROC) model, jointly summarise sensitivity and specificity while taking into account this threshold effect, as well as allowing for between-study differences in test performance beyond what would be expected by chance. The bivariate model focuses on estimation of a summary sensitivity and specificity at a common threshold, while the HSROC model focuses on the estimation of a summary curve from studies that have used different thresholds. Conclusions Meta-analyses of diagnostic accuracy studies can provide answers to important clinical questions. We hope this article will provide clinicians with sufficient understanding of the terminology and methods to aid interpretation of systematic reviews and facilitate better patient care. PMID:26446042
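    The threshold effect described in this abstract is easy to demonstrate: defining test-positive as a score at or above a threshold, raising the threshold lowers sensitivity and raises specificity. A minimal sketch with hypothetical questionnaire scores:

```python
def sens_spec_at_threshold(scores_pos, scores_neg, t):
    """Sensitivity and specificity when 'score >= t' defines test-positive.

    scores_pos: scores of subjects who truly have the condition.
    scores_neg: scores of subjects who do not.
    Raising t lowers sensitivity and raises specificity, which is the
    threshold effect that DTA meta-analysis models must account for.
    """
    sens = sum(s >= t for s in scores_pos) / len(scores_pos)
    spec = sum(s < t for s in scores_neg) / len(scores_neg)
    return sens, spec
```

    Sweeping t over the observed scores traces out the ROC curve; the bivariate and HSROC models summarise such (sensitivity, specificity) pairs across studies rather than within one study.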

  20. A New 3D Tool for Assessing the Accuracy of Bimaxillary Surgery: The OrthoGnathicAnalyser

    PubMed Central

    Xi, Tong; Schreurs, Ruud; de Koning, Martien; Bergé, Stefaan; Maal, Thomas

    2016-01-01

    Aim The purpose of this study was to present and validate an innovative semi-automatic approach to quantify the accuracy of the surgical outcome in relation to 3D virtual orthognathic planning among patients who underwent bimaxillary surgery. Material and Method For the validation of this new semi-automatic approach, CBCT scans of ten patients who underwent bimaxillary surgery were acquired pre-operatively. Individualized 3D virtual operation plans were made for all patients prior to surgery. During surgery, the maxillary and mandibular segments were positioned as planned by using 3D milled interocclusal wafers. Consequently, post-operative CBCT scans were acquired. The 3D rendered pre- and postoperative virtual head models were aligned by voxel-based registration upon the anterior cranial base. To calculate the discrepancies between the 3D planning and the actual surgical outcome, the 3D planned maxillary and mandibular segments were segmented and superimposed upon the postoperative maxillary and mandibular segments. The translation matrices obtained from this registration process were translated into translational and rotational discrepancies between the 3D planning and the surgical outcome, by using the newly developed tool, the OrthoGnathicAnalyser. To evaluate the reproducibility of this method, the process was performed by two independent observers multiple times. Results Low intra-observer and inter-observer variations in measurement error (mean error < 0.25 mm) and high intraclass correlation coefficients (> 0.97) were found, supportive of the observer-independent character of the OrthoGnathicAnalyser. The pitch of the maxilla and mandible showed the highest discrepancy between the 3D planning and the postoperative results, 2.72° and 2.75° respectively. Conclusion This novel method provides a reproducible tool for the evaluation of bimaxillary surgery, making it possible to compare larger patient groups in an objective and time-efficient manner in order to

  1. [Rheocardiographic studies on the accuracy of cardiac stroke volume models].

    PubMed

    Baluev, E P; Parashin, V B

    1984-01-01

    Some results of studies on the accuracy of deriving pulsatile stroke volume from rheocardiograms are discussed. Using a physical model it is shown that the resulting values are strongly conditioned by the relative positions of the electrodes and the pulsating volume, by its shape, and by the geometry of the conducting medium. An approximate value of the stroke volume may be derived from semiempirical formulae coupling the relative variations of the resistance and the volume. It may differ from the true magnitude by a factor of 1.5-2. PMID:6503678
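    One simple form of the semiempirical relation alluded to here, under the idealising assumption of a uniform conducting volume so that relative volume and resistance changes are equal and opposite (dV/V = −dR/R), can be sketched as (hypothetical helper, illustrative values):

```python
def stroke_volume_estimate(v0, r0, delta_r):
    """Approximate pulsatile volume change from a resistance change,
    assuming a uniform conducting volume: dV/V = -dR/R.

    v0: baseline conducting volume (mL); r0: baseline resistance (ohm);
    delta_r: measured resistance change (ohm, negative on inflow).
    """
    return -v0 * delta_r / r0
```

    As the abstract notes, such estimates can differ from the true stroke volume by a factor of 1.5-2 because the real electrode geometry and tissue distribution violate the uniform-volume assumption.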

  2. High-accuracy mass spectrometry for fundamental studies.

    PubMed

    Kluge, H-Jürgen

    2010-01-01

    Mass spectrometry for fundamental studies in metrology and atomic, nuclear and particle physics requires extreme sensitivity and efficiency as well as ultimate resolving power and accuracy. An overview will be given on the global status of high-accuracy mass spectrometry for fundamental physics and metrology. Three quite different examples of modern mass spectrometric experiments in physics are presented: (i) the retardation spectrometer KATRIN at the Forschungszentrum Karlsruhe, employing electrostatic filtering in combination with magnetic-adiabatic collimation (the biggest mass spectrometer for determining the smallest mass, i.e. the mass of the electron anti-neutrino), (ii) the Experimental Cooler-Storage Ring at GSI (a mass spectrometer of medium size, relative to other accelerators, for determining medium-heavy masses) and (iii) the Penning trap facility SHIPTRAP at GSI (the smallest mass spectrometer for determining the heaviest masses, those of super-heavy elements). Finally, a short view into the future will address the HITRAP project at GSI for fundamental studies with highly charged ions. PMID:20530821

  3. Numerical Stability and Accuracy of Temporally Coupled Multi-Physics Modules in Wind-Turbine CAE Tools

    SciTech Connect

    Gasmi, A.; Sprague, M. A.; Jonkman, J. M.; Jones, W. B.

    2013-02-01

    In this paper we examine the stability and accuracy of numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. We employ two simple examples as test systems, while algorithm descriptions are kept general. Coupled-system governing equations are framed in monolithic and partitioned representations as differential-algebraic equations. Explicit and implicit loose partition coupling is examined. In explicit coupling, partitions are advanced in time from known information. In implicit coupling, there is dependence on other-partition data at the next time step; coupling is accomplished through a predictor-corrector (PC) approach. Numerical time integration of coupled ordinary-differential equations (ODEs) is accomplished with one of three, fourth-order fixed-time-increment methods: Runge-Kutta (RK), Adams-Bashforth (AB), and Adams-Bashforth-Moulton (ABM). Through numerical experiments it is shown that explicit coupling can be dramatically less stable and less accurate than simulations performed with the monolithic system. However, PC implicit coupling restored stability and fourth-order accuracy for ABM; only second-order accuracy was achieved with RK integration. For systems without constraints, explicit time integration with AB and explicit loose coupling exhibited desired accuracy and stability.
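    The explicit-versus-monolithic contrast reported here can be reproduced on a toy system. Below, the oscillator u' = v, v' = -u is advanced either monolithically with RK4 or with explicit loose coupling, in which each partition sees the other's state frozen at the start of the step; the loosely coupled scheme degrades to first-order accuracy regardless of the partition integrators. This is a simplified illustration of the phenomenon, not FAST's actual module coupling:

```python
import math

def monolithic_rk4(u, v, dt, steps):
    """Advance the coupled oscillator u' = v, v' = -u as one system (RK4)."""
    def f(s):
        return (s[1], -s[0])
    s = (u, v)
    for _ in range(steps):
        k1 = f(s)
        k2 = f((s[0] + 0.5 * dt * k1[0], s[1] + 0.5 * dt * k1[1]))
        k3 = f((s[0] + 0.5 * dt * k2[0], s[1] + 0.5 * dt * k2[1]))
        k4 = f((s[0] + dt * k3[0], s[1] + dt * k3[1]))
        s = (s[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
             s[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))
    return s

def explicit_loose(u, v, dt, steps):
    """Advance the same system with explicit loose coupling: each partition
    is updated using only the other partition's value from the start of the
    step, so the overall scheme is first-order accurate."""
    for _ in range(steps):
        u_new = u + dt * v      # partition 1 uses old v
        v_new = v + dt * (-u)   # partition 2 uses old u
        u, v = u_new, v_new
    return u, v
```

    With dt = 0.01 over 100 steps the monolithic RK4 result matches the exact solution cos(1) to well under 1e-6, while the loosely coupled result is off at the 1e-3 level, mirroring the paper's observation that explicit coupling can be dramatically less accurate than the monolithic system.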

  4. Development of Automated Image Analysis Tools for Verification of Radiotherapy Field Accuracy with an Electronic Portal Imaging Device.

    NASA Astrophysics Data System (ADS)

    Dong, Lei

    1995-01-01

    The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field shaping devices with the patient must be repeated daily up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), patients' portal images can be visualized daily in real-time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were

  5. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    NASA Astrophysics Data System (ADS)

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-05-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging.

  6. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis.

    PubMed

    Litjens, Geert; Sánchez, Clara I; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce 'deep learning' as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30-40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that 'deep learning' holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  7. The comparison index: A tool for assessing the accuracy of image segmentation

    NASA Astrophysics Data System (ADS)

    Möller, M.; Lymburner, L.; Volk, M.

    2007-08-01

    Segmentation algorithms applied to remote sensing data provide valuable information about the size, distribution and context of landscape objects at a range of scales. However, there is a need for well-defined and robust validation tools for assessing the reliability of segmentation results. Such tools are required to assess whether image segments are based on 'real' objects, such as field boundaries, or on artefacts of the image segmentation algorithm. These tools can be used to improve the reliability of any land-use/land-cover classifications or landscape analyses that are based on the image segments. The validation algorithm developed in this paper aims to: (a) localize and quantify segmentation inaccuracies; and (b) allow the assessment of segmentation results as a whole. The first aim is achieved using object metrics that enable the quantification of topological and geometric object differences. The second aim is achieved by combining these object metrics into a 'Comparison Index', which allows a relative comparison of different segmentation results. The approach demonstrates how the Comparison Index (CI) can be used to guide trial-and-error techniques, enabling the identification of a segmentation scale H that is close to optimal. Once this scale has been identified, a more detailed examination of the CI-H diagrams can be used to identify precisely what H value and associated parameter settings will yield the most accurate image segmentation results. The procedure is applied to segmented Landsat scenes in an agricultural area in Saxony-Anhalt, Germany. The segmentations were generated using the 'Fractal Net Evolution Approach', which is implemented in the eCognition software.
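
The abstract does not reproduce the paper's specific object metrics. As a minimal, hypothetical illustration of one common way to quantify geometric agreement between a segment and a reference object, an area-overlap (Jaccard) ratio over rasterized pixel sets can be used:

```python
def jaccard(mask_a, mask_b):
    """Area-overlap ratio between two objects given as sets of pixel
    coordinates: intersection area divided by union area (1 = identical)."""
    a, b = set(mask_a), set(mask_b)
    return len(a & b) / len(a | b)

# Reference object: a 10x10 block; segment: the same block shifted 2 px.
ref = {(x, y) for x in range(10) for y in range(10)}
seg = {(x, y) for x in range(2, 12) for y in range(10)}
overlap = jaccard(ref, seg)   # 80 shared pixels / 120 in the union
```

A CI-style comparison would combine several such topological and geometric metrics per object; this single ratio only sketches the idea of scoring segment/reference agreement numerically.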

  8. Nonparametric meta-analysis for diagnostic accuracy studies.

    PubMed

    Zapf, Antonia; Hoyer, Annika; Kramer, Katharina; Kuss, Oliver

    2015-12-20

    Summarizing the information of many studies using a meta-analysis is becoming increasingly important, also in the field of diagnostic studies. The special challenge in meta-analysis of diagnostic accuracy studies is that, in general, sensitivity and specificity are co-primary endpoints. Across the studies both endpoints are correlated, and this correlation has to be considered in the analysis. The standard approach for such a meta-analysis is the bivariate logistic random effects model. An alternative approach is to use marginal beta-binomial distributions for the true positives and the true negatives, linked by copula distributions. In this article, we propose a new, nonparametric approach of analysis, which has greater flexibility with respect to the correlation structure, and always converges. In a simulation study, it becomes apparent that the empirical coverage of all three approaches is in general below the nominal level. Regarding bias, empirical coverage, and mean squared error the nonparametric model is often superior to the standard model, and comparable with the copula model. The three approaches are also applied to two example meta-analyses. PMID:26174020

  9. Alaska national hydrography dataset positional accuracy assessment study

    USGS Publications Warehouse

    Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy

    2013-01-01

    Initial visual assessments showed a wide range in the quality of fit between features in the NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive (independent, well-defined test points must be collected), but quantitative analysis of relative positional error is feasible.

  10. Real-time diagnosis of H. pylori infection during endoscopy: Accuracy of an innovative tool (EndoFaster)

    PubMed Central

    Costamagna, Guido; Zullo, Angelo; Bizzotto, Alessandra; Hassan, Cesare; Riccioni, Maria Elena; Marmo, Clelia; Strangio, Giuseppe; Di Rienzo, Teresa Antonella; Cammarota, Giovanni; Gasbarrini, Antonio; Repici, Alessandro

    2015-01-01

    Background EndoFaster is a novel device able to perform real-time ammonium measurement in gastric juice allowing H. pylori diagnosis during endoscopy. This large study aimed to validate the accuracy of EndoFaster for real-time H. pylori detection. Methods Consecutive patients who underwent upper endoscopy in two centres were prospectively enrolled. During endoscopy, 4 ml of gastric juice were aspirated to perform automatic analysis by EndoFaster within 90 seconds, and H. pylori was considered present (>62 ppm/ml) or absent (≤62 ppm/ml). Accuracy was measured by using histology as gold standard, and 13C-urea breath test (UBT) in discordant cases. Accuracy, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) were calculated. Results Overall, 189 patients were enrolled, but in seven (3.4%) the aspirated gastric juice amount was insufficient to perform the test. The accuracy, sensitivity, specificity, PPV, and NPV were 87.4%, 90.3%, 85.5%, 80.2%, 93.1%, respectively, and 92.6%, 97.1%, 89.7%, 85.9%, 98.0%, respectively, when H. pylori status was reclassified according to the UBT result in discordant cases. Conclusions This study found a high accuracy/feasibility of EndoFaster for real-time H. pylori diagnosis. Use of EndoFaster may allow selecting those patients in whom routine gastric biopsies could be avoided.
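
The reported accuracy figures all follow from the standard 2x2 cross-classification definitions. A minimal sketch with hypothetical counts (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic-accuracy metrics from a 2x2 table of
    true/false positives and negatives (all returned as fractions)."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv":         tp / (tp + fp),   # positive predictive value
        "npv":         tn / (tn + fn),   # negative predictive value
    }

# Hypothetical counts for illustration only:
m = diagnostic_metrics(tp=65, fp=16, tn=95, fn=6)
```

The same five formulas applied to the study's actual cross-classification yield the percentages quoted in the abstract.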

  11. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

    PubMed Central

    Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen

    2016-01-01

    Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging. PMID:27212078

  12. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management.

    PubMed

    Cristescu, Romane H; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-01-01

    Accurate data on presence/absence and spatial distribution for fauna species is key to their conservation. Collecting such data, however, can be time consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties and often contain a high false negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently out-performed human-only teams. Off-leash, the dog detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams. PMID:25666691

  13. Accuracy and efficiency of detection dogs: a powerful new tool for koala conservation and management

    PubMed Central

    Cristescu, Romane H.; Foley, Emily; Markula, Anna; Jackson, Gary; Jones, Darryl; Frère, Céline

    2015-01-01

    Accurate data on presence/absence and spatial distribution for fauna species is key to their conservation. Collecting such data, however, can be time consuming, laborious and costly, in particular for fauna species characterised by low densities, large home ranges, cryptic or elusive behaviour. For such species, including koalas (Phascolarctos cinereus), indicators of species presence can be a useful shortcut: faecal pellets (scats), for instance, are widely used. Scat surveys are not without their difficulties and often contain a high false negative rate. We used experimental and field-based trials to investigate the accuracy and efficiency of the first dog specifically trained for koala scats. The detection dog consistently out-performed human-only teams. Off-leash, the dog detection rate was 100%. The dog was also 19 times more efficient than current scat survey methods and 153% more accurate (the dog found koala scats where the human-only team did not). This clearly demonstrates that the use of detection dogs decreases false negatives and survey time, thus allowing for a significant improvement in the quality and quantity of data collection. Given these unequivocal results, we argue that to improve koala conservation, detection dog surveys for koala scats could in the future replace human-only teams. PMID:25666691

  14. "Score the Core" Web-based pathologist training tool improves the accuracy of breast cancer IHC4 scoring.

    PubMed

    Engelberg, Jesse A; Retallack, Hanna; Balassanian, Ronald; Dowsett, Mitchell; Zabaglo, Lila; Ram, Arishneel A; Apple, Sophia K; Bishop, John W; Borowsky, Alexander D; Carpenter, Philip M; Chen, Yunn-Yi; Datnow, Brian; Elson, Sarah; Hasteh, Farnaz; Lin, Fritz; Moatamed, Neda A; Zhang, Yanhong; Cardiff, Robert D

    2015-11-01

    Hormone receptor status is an integral component of decision-making in breast cancer management. IHC4 score is an algorithm that combines hormone receptor, HER2, and Ki-67 status to provide a semiquantitative prognostic score for breast cancer. High accuracy and low interobserver variance are important to ensure the score is accurately calculated; however, few previous efforts have been made to measure or decrease interobserver variance. We developed a Web-based training tool, called "Score the Core" (STC) using tissue microarrays to train pathologists to visually score estrogen receptor (using the 300-point H score), progesterone receptor (percent positive), and Ki-67 (percent positive). STC used a reference score calculated from a reproducible manual counting method. Pathologists in the Athena Breast Health Network and pathology residents at associated institutions completed the exercise. By using STC, pathologists improved their estrogen receptor H score and progesterone receptor and Ki-67 proportion assessment and demonstrated a good correlation between pathologist and reference scores. In addition, we collected information about pathologist performance that allowed us to compare individual pathologists and measures of agreement. Pathologists' assessment of the proportion of positive cells was closer to the reference than their assessment of the relative intensity of positive cells. Careful training and assessment should be used to ensure the accuracy of breast biomarkers. This is particularly important as breast cancer diagnostics become increasingly quantitative and reproducible. Our training tool is a novel approach for pathologist training that can serve as an important component of ongoing quality assessment and can improve the accuracy of breast cancer prognostic biomarkers. PMID:26410019
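
The 300-point H score used here for estrogen receptor assessment is conventionally computed as the sum, over staining-intensity categories 0-3, of the intensity times the percentage of cells at that intensity. A minimal sketch:

```python
def h_score(pct_by_intensity):
    """H score: sum of staining intensity (0-3) times the percentage of
    cells at that intensity; ranges 0 (all unstained) to 300 (all strong)."""
    assert abs(sum(pct_by_intensity.values()) - 100) < 1e-9, "percentages must total 100"
    return sum(i * pct for i, pct in pct_by_intensity.items())

# e.g. 10% unstained, 20% weak, 40% moderate, 30% strong staining:
score = h_score({0: 10, 1: 20, 2: 40, 3: 30})  # 0 + 20 + 80 + 90 = 190
```

Progesterone receptor and Ki-67, by contrast, are scored as simple percent-positive values, which is why the training tool treats them separately from the H score.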

  15. A hyperspectral imager for high radiometric accuracy Earth climate studies

    NASA Astrophysics Data System (ADS)

    Espejo, Joey; Drake, Ginger; Heuerman, Karl; Kopp, Greg; Lieber, Alex; Smith, Paul; Vermeer, Bill

    2011-10-01

    We demonstrate a visible and near-infrared prototype pushbroom hyperspectral imager for Earth climate studies that is capable of using direct solar viewing for on-orbit cross calibration and degradation tracking. Direct calibration to solar spectral irradiances allows the Earth-viewing instrument to achieve required climate-driven absolute radiometric accuracies of <0.2% (1σ). A solar calibration requires viewing scenes having radiances 10^5 times higher than typical Earth scenes. To facilitate this calibration, the instrument features an attenuation system that uses an optimized combination of different precision aperture sizes, neutral density filters, and variable integration timing for Earth and solar viewing. The optical system consists of a three-mirror anastigmat telescope and an Offner spectrometer. The as-built system has a 12.2° cross track field of view with 3 arcmin spatial resolution and covers a 350-1050 nm spectral range with 10 nm resolution. A polarization compensated configuration using the Offner in an out-of-plane alignment is demonstrated as a viable approach to minimizing polarization sensitivity. The mechanical design takes advantage of relaxed tolerances in the optical design by using rigid, non-adjustable diamond-turned tabs for optical mount locating surfaces. We show that this approach achieves the required optical performance. A prototype spaceflight unit is also demonstrated to prove the applicability of these solar cross calibration methods to on-orbit environments. This unit is evaluated for optical performance prior to and after GEVS shake, thermal vacuum, and lifecycle tests.
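
The required solar attenuation can be viewed as a simple product of the three adjustable factors the abstract names; a sketch with illustrative values (not the as-built instrument's parameters):

```python
def attenuation_factor(aperture_area_ratio, nd_transmission, integration_ratio):
    """Combined signal attenuation for solar vs Earth viewing: product of
    the aperture area ratio, neutral-density filter transmission, and
    integration-time ratio (each <= 1 in the solar configuration)."""
    return aperture_area_ratio * nd_transmission * integration_ratio

# Illustrative split of the ~1e-5 attenuation needed for 10^5-brighter scenes:
f = attenuation_factor(aperture_area_ratio=1 / 100,
                       nd_transmission=1 / 100,
                       integration_ratio=1 / 10)
```

Splitting the attenuation across three mechanisms keeps each individual element (aperture, filter, timing) in a range that can be manufactured and calibrated precisely, which is the design rationale the paper describes.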

  16. When does haste make waste? Speed-accuracy tradeoff, skill level, and the tools of the trade.

    PubMed

    Beilock, Sian L; Bertenthal, Bennett I; Hoerger, Michael; Carr, Thomas H

    2008-12-01

    Novice and skilled golfers took a series of golf putts with a standard putter (Exp. 1) or a distorted funny putter (consisting of an s-shaped and arbitrarily weighted putter shaft; Exp. 2) under instructions to either (a) take as much time as needed to be accurate or to (b) putt as fast as possible while still being accurate. Planning and movement time were measured for each putt. In both experiments, novices produced the typical speed-accuracy trade-off. Going slower, in terms of both the planning and movement components of execution, improved performance. In contrast, skilled golfers benefited from reduced performance time when using the standard putter in Exp. 1 and, specifically, taking less time to plan improved performance. In Exp. 2, skilled golfers improved by going slower when using the funny putter, but only when it was unfamiliar. Thus, skilled performance benefits from speed instructions when wielding highly familiar tools (i.e., the standard putter), is harmed when using new tools (i.e., the funny putter), and benefits again from speed instructions as the new tool becomes familiar. Planning time absorbs these changes. PMID:19102617

  17. Template for Systems Engineering Tools Trade Study

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle D.

    2005-01-01

    A discussion of Systems Engineering tools brings out numerous preferences and reactions regarding tools of choice as well as the functions those tools are to perform. A recent study of Systems Engineering Tools for a new Program illustrated the need for a generic template for use by new Programs or Projects to determine the toolset appropriate for their use. This paper will provide the guidelines new initiatives can follow and tailor to their specific needs, to enable them to make their choice of tools in an efficient and informed manner. Clearly, those who perform purely technical functions will need different tools than those who perform purely systems engineering functions. And, everyone has tools they are comfortable with. That degree of comfort is frequently the deciding factor in tool choice rather than an objective study of all criteria and weighting factors. This paper strives to produce a comprehensive list of criteria for selection with suggestions for weighting factors based on a number of assumptions regarding the given Program or Project. In addition, any given Program will begin with assumptions for its toolset based on Program size, tool cost, user base and technical needs. In providing a template for tool selection, this paper will guide the reader through assumptions based on Program need; decision criteria; potential weighting factors; the need for a compilation of available tools; the importance of tool demonstrations; and finally a down selection of tools. While specific vendors cannot be mentioned in this work, it is expected that this template could serve other Programs in the formulation phase by alleviating the trade study process of some of its subjectivity.
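
The criteria-and-weighting approach described above amounts to a weighted-sum decision matrix; a minimal sketch with hypothetical criteria, weights, and tool ratings (the paper names no vendors):

```python
def weighted_scores(criteria_weights, tool_ratings):
    """Weighted-sum trade-study score for each candidate tool.

    criteria_weights: {criterion: weight}, weights summing to 1.
    tool_ratings: {tool: {criterion: rating}}, e.g. ratings on a 1-5 scale.
    """
    return {tool: sum(criteria_weights[c] * ratings[c]
                      for c in criteria_weights)
            for tool, ratings in tool_ratings.items()}

# Hypothetical weights reflecting the paper's example criteria:
weights = {"cost": 0.3, "user_base": 0.2, "technical_fit": 0.5}
ratings = {
    "Tool A": {"cost": 4, "user_base": 3, "technical_fit": 5},
    "Tool B": {"cost": 5, "user_base": 4, "technical_fit": 3},
}
scores = weighted_scores(weights, ratings)
best = max(scores, key=scores.get)
```

Making the weights explicit is precisely what removes the "degree of comfort" subjectivity the paper warns about: changing a weight forces the trade study to state why one criterion matters more than another.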

  18. Accuracy of Nurse-Performed Lung Ultrasound in Patients With Acute Dyspnea: A Prospective Observational Study.

    PubMed

    Mumoli, Nicola; Vitale, Josè; Giorgi-Pierfranceschi, Matteo; Cresci, Alessandra; Cei, Marco; Basile, Valentina; Brondi, Barbara; Russo, Elisa; Giuntini, Lucia; Masi, Lorenzo; Cocciolo, Massimo; Dentali, Francesco

    2016-03-01

    In clinical practice lung ultrasound (LUS) is becoming an easy and reliable noninvasive tool for the evaluation of dyspnea. The aim of this study was to assess the accuracy of nurse-performed LUS, in particular, in the diagnosis of acute cardiogenic pulmonary congestion. We prospectively evaluated all the consecutive patients admitted for dyspnea in our Medicine Department between April and July 2014. At admission, serum brain natriuretic peptide (BNP) levels were measured and LUS was performed by trained nurses blinded to clinical and laboratory data. The accuracy of nurse-performed LUS alone and combined with BNP for the diagnosis of acute cardiogenic dyspnea was calculated. Two hundred twenty-six patients (41.6% men, mean age 78.7 ± 12.7 years) were included in the study. Nurse-performed LUS alone had a sensitivity of 95.3% (95% CI: 92.6-98.1%), a specificity of 88.2% (95% CI: 84.0-92.4%), a positive predictive value of 87.9% (95% CI: 83.7-92.2%) and a negative predictive value of 95.5% (95% CI: 92.7-98.2%). The combination of nurse-performed LUS with BNP level (cut-off 400 pg/mL) resulted in a higher sensitivity (98.9%, 95% CI: 97.4-100%), negative predictive value (98.8%, 95% CI: 97.2-100%), and corresponding negative likelihood ratio (0.01, 95% CI: 0.0, 0.07). Nurse-performed LUS had a good accuracy in the diagnosis of acute cardiogenic dyspnea. Use of this technique in combination with BNP seems to be useful in ruling out cardiogenic dyspnea. Other studies are warranted to confirm our preliminary findings and to establish the role of this tool in other settings. PMID:26945396
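
The negative likelihood ratio quoted above follows from the standard formula LR- = (1 - sensitivity) / specificity, which measures how strongly a negative result rules out disease (smaller is better). A sketch using the study's reported point estimates for LUS alone:

```python
def neg_likelihood_ratio(sensitivity, specificity):
    """LR- = (1 - sensitivity) / specificity, both given as fractions.
    A negative result divides the pre-test odds of disease by 1/LR-."""
    return (1 - sensitivity) / specificity

# LUS alone, using the study's reported sensitivity and specificity:
lr_lus = neg_likelihood_ratio(0.953, 0.882)   # about 0.05
```

Raising sensitivity to the combined LUS+BNP value of 98.9% pushes LR- down toward the 0.01 the study reports, which is what makes the combination useful for ruling out cardiogenic dyspnea.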

  19. Accuracy of optical dental digitizers: an in vitro study.

    PubMed

    Vandeweghe, Stefan; Vervack, Valentin; Vanhove, Christian; Dierens, Melissa; Jimbo, Ryo; De Bruyn, Hugo

    2015-01-01

    The aim of this study was to evaluate the accuracy, in terms of trueness and precision, of optical dental scanners. An experimental acrylic resin cast was created and digitized using a microcomputed tomography (microCT) scanner, which served as the reference model. Five polyether impressions were made of the acrylic resin cast to create five stone casts. Each dental digitizer (Imetric, Lava ST, Smart Optics, KaVo Everest) made five scans of the acrylic resin cast and one scan of every stone cast. The scans were superimposed and compared using metrology software. Deviations were calculated between the datasets obtained from the dental digitizers and the microCT scanner (= trueness) and between datasets from the same dental digitizer (= precision). With the exception of the Smart Optics scanner, there were no significant differences in trueness for the acrylic resin cast. For the stone casts, however, the Lava ST performed better than Imetric, which did better than the KaVo scanner. The Smart Optics scanner demonstrated the highest deviation. All digitizers demonstrated a significantly higher trueness for the acrylic resin cast compared to the stone casts, except for the Lava ST. The Lava ST was significantly more precise compared to the other scanners. Imetric and Smart Optics also demonstrated a higher level of precision compared to the KaVo scanner. All digitizers demonstrated some degree of error. Stone cast copies are less accurate because of difficulties with scanning the rougher surface or dimensional deformations caused during the production process. For complex, large-span reconstructions, a highly accurate scanner should be selected. PMID:25734714
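
Trueness and precision as defined above can be operationalized on repeated measurements. A minimal sketch that reduces each scan to a single surface-deviation value (a stand-in for the metrology software's full 3D comparison):

```python
from itertools import combinations

def trueness(scans, reference):
    """Mean absolute deviation of each scan from the reference value:
    how close, on average, a digitizer gets to the 'true' geometry."""
    return sum(abs(s - reference) for s in scans) / len(scans)

def precision(scans):
    """Mean absolute pairwise deviation between repeated scans of the
    same object: how consistently the digitizer reproduces itself."""
    pairs = list(combinations(scans, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

# Illustrative per-scan deviation values (e.g. in micrometers):
scans = [10.2, 10.0, 9.9, 10.1, 10.3]
```

A scanner can be precise but not true (tightly clustered scans that are all offset from the microCT reference), which is why the study reports the two quantities separately.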

  20. Accuracy Study of a 2-Component Point Doppler Velocimeter (PDV)

    NASA Technical Reports Server (NTRS)

    Kuhlman, John; Naylor, Steve; James, Kelly; Ramanath, Senthil

    1997-01-01

    A two-component Point Doppler Velocimeter (PDV) which has recently been developed is described, and a series of velocity measurements which have been obtained to quantify the accuracy of the PDV system are summarized. This PDV system uses molecular iodine vapor cells as frequency discriminating filters to determine the Doppler shift of laser light which is scattered off of seed particles in a flow. The majority of results which have been obtained to date are for the mean velocity of a rotating wheel, although preliminary data are described for fully-developed turbulent pipe flow. Accuracy of the present wheel velocity data is approximately +/- 1 % of full scale, while linearity of a single channel is on the order of +/- 0.5 % (i.e., +/- 0.6 m/sec and +/- 0.3 m/sec, out of 57 m/sec, respectively). The observed linearity of these results is on the order of the accuracy to which the speed of the rotating wheel has been set for individual data readings. The absolute accuracy of the rotating wheel data is shown to be consistent with the level of repeatability of the cell calibrations. The preliminary turbulent pipe flow data show consistent turbulence intensity values, and mean axial velocity profiles generally agree with pitot probe data. However, there is at present an offset error in the radial velocity which is on the order of 5-10 % of the mean axial velocity.

  1. Challenges in diagnostic accuracy studies in primary care: the fecal calprotectin example

    PubMed Central

    2013-01-01

    Background Low disease prevalence and lack of uniform reference standards in primary care induce methodological challenges for investigating the diagnostic accuracy of a test. We present a study design that copes with these methodological challenges and discuss the methodological implications of our choices, using a quality assessment tool for diagnostic accuracy studies (QUADAS-2). Design The study investigates the diagnostic value of fecal calprotectin for detecting inflammatory bowel disease in children presenting with chronic gastrointestinal symptoms in primary care. It is a prospective cohort study including two cohorts of children: one cohort will be recruited in primary care and the other in secondary/tertiary care. Test results of fecal calprotectin will be compared to one of the two reference standards for inflammatory bowel disease: endoscopy with histopathological examination of mucosal biopsies or assessment of clinical symptoms at 1-year follow-up. Discussion According to QUADAS-2 the use of two reference standards and the recruitment of patients in two populations may cause differential verification bias and spectrum bias, respectively. The clinical relevance of this potential bias and methods to adjust for this are presented. This study illustrates the importance of awareness of the different kinds of bias that result from choices in the design phase of a diagnostic study in a low prevalence setting. This approach is exemplary for other diagnostic research in primary care. PMID:24274463

  2. High-accuracy diagnostic tool for electron cloud observation in the LHC based on synchronous phase measurements

    NASA Astrophysics Data System (ADS)

    Esteban Müller, J. F.; Baudrenghien, P.; Mastoridis, T.; Shaposhnikova, E.; Valuch, D.

    2015-11-01

    Electron cloud effects, which include heat load in the cryogenic system, pressure rise, and beam instabilities, are among the main intensity limitations for LHC operation with 25 ns spaced bunches. A new observation tool was proposed and developed to monitor e-cloud activity; it was already used successfully during LHC Run 1 (2010-2012) and is being used intensively in operation during the start of LHC Run 2 (2015-2018). It is based on the fact that the power loss of each bunch due to e-cloud can be estimated using bunch-by-bunch measurement of the synchronous phase. The measurements were done using the existing beam phase module of the low-level rf control system. In order to achieve the very high accuracy required, corrections for reflection in the cables and for systematic errors need to be applied, followed by post-processing of the measurements. Results clearly show the e-cloud buildup along the bunch trains and its time evolution during each LHC fill as well as from fill to fill. Measurements during the 2012 LHC scrubbing run reveal a progressive reduction in the e-cloud activity and therefore a decrease in the secondary electron yield. The total beam power loss can be computed as a sum of the contributions from all bunches and compared with the heat load deposited in the cryogenic system.

  3. Free Mesh Method: fundamental conception, algorithms and accuracy study

    PubMed Central

    YAGAWA, Genki

    2011-01-01

    The finite element method (FEM) has been commonly employed in a variety of fields as a computer simulation method to solve such problems as solid, fluid, electro-magnetic phenomena and so on. However, creation of a quality mesh for the problem domain is a prerequisite when using FEM, which becomes a major part of the cost of a simulation. It is natural, therefore, that the concept of meshless methods has evolved. The free mesh method (FMM) is among the typical meshless methods intended for particle-like finite element analysis of problems that are difficult to handle using global mesh generation, especially on parallel processors. FMM is an efficient node-based finite element method that employs a local mesh generation technique and a node-by-node algorithm for the finite element calculations. In this paper, FMM and its variation are reviewed, focusing on their fundamental conception, algorithms and accuracy. PMID:21558752

  4. Study on machining mechanism of nanotwinned CBN cutting tool

    NASA Astrophysics Data System (ADS)

    Chen, Junyun; Jin, Tianye; Wang, Jinhu; Zhao, Qingliang; Lu, Ling

    2014-08-01

    The recently developed nanotwinned cubic boron nitride (nt-CBN), with its isotropic nano-sized microstructure, possesses extremely high hardness (~100 GPa Hv), very large fracture toughness (>12 MPa·m^1/2) and excellent high-temperature stability. nt-CBN is therefore a promising tool material for ultra-precision cutting of hardened steel, which is widely used in mold inserts for mass-produced optical and opto-electronic products. In view of its poor machinability, its machining mechanism is studied in this paper. Three feasible methods, mechanical lapping, laser machining and ion beam sputtering, were applied to process nt-CBN. The results indicate that, among the three methods, mechanical lapping not only achieves the highest machining accuracy, because material is removed entirely in ductile mode, but also offers a satisfactorily high material removal rate. Mechanical lapping is therefore appropriate for finish machining of nt-CBN cutting tools. Laser machining can only be used for contour or rough machining of the cutting tool because of its poorer machined surface quality. As for ion beam sputtering, the material removal rate is too low despite its high machining accuracy. Additionally, no phase transition was found in any of the machining processes applied to nt-CBN.

  5. Pose estimation with a Kinect for ergonomic studies: evaluation of the accuracy using a virtual mannequin.

    PubMed

    Plantard, Pierre; Auvinet, Edouard; Pierres, Anne-Sophie Le; Multon, Franck

    2015-01-01

    Analyzing human poses with a Kinect is a promising method to evaluate potential risks of musculoskeletal disorders at workstations. In ecological situations, complex 3D poses and constraints imposed by the environment make it difficult to obtain reliable kinematic information. Thus, being able to predict the potential accuracy of the measurement for such complex 3D poses and sensor placements is challenging in classical experimental setups. To tackle this problem, we propose a new evaluation method based on a virtual mannequin. In this study, we apply this method to the evaluation of joint positions (shoulder, elbow, and wrist), joint angles (shoulder and elbow), and the corresponding RULA (a popular ergonomic assessment grid) upper-limb score for a large set of poses and sensor placements. Thanks to this evaluation method, more than 500,000 configurations have been tested automatically, which would be almost impossible with classical protocols. The results show that the kinematic information obtained by the Kinect software is generally accurate enough to fill in ergonomic assessment grids. However, inaccuracy increases strongly for some specific poses and sensor positions, and this evaluation method enabled us to report the configurations that lead to these high inaccuracies. As supplementary material, we provide a software tool to help designers evaluate the expected accuracy of this sensor for a set of upper-limb configurations. Results obtained with the virtual mannequin are in accordance with those obtained from a real subject for a limited set of poses and sensor placements. PMID:25599426
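
    The joint angles fed into a RULA-style grid can be derived from the sensor's estimated 3D joint positions with basic vector geometry. A minimal sketch (not the authors' software; the positions and helper name are illustrative):

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Elbow angle from shoulder, elbow and wrist positions (metres, camera frame)
shoulder, elbow, wrist = [0.0, 0.0, 2.0], [0.3, 0.0, 2.0], [0.3, 0.3, 2.0]
angle = joint_angle(shoulder, elbow, wrist)  # 90 degrees for this right angle
```

    The clipping of the cosine guards against floating-point values slightly outside [-1, 1], which would otherwise make `arccos` return NaN.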

  6. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for earthquake research within Google Maps. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) and its new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data on top of Google Maps and to plot various seismicity graphs. The tool box has been extended to draw line segments on the map, multiple straight lines both horizontal and vertical, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. It also offers regional segmentation (NxN), which allows the study of earthquake clustering and of cluster shift between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility supports statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is its additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow the calculation of statistics as well as deterministic precursors. We plan to present many new results based on this newly developed platform.
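
    The 'b' value mentioned above is the slope of the Gutenberg-Richter frequency-magnitude relation; a common way to estimate it from a catalogue is Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). A minimal sketch (not the platform's actual code; the catalogue and completeness magnitude Mc are invented):

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for events with M >= mc."""
    m = [x for x in magnitudes if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - mc)

# Tiny synthetic catalogue; real estimates need many events above Mc
mags = [2.0, 2.2, 2.5, 3.0, 3.1, 2.8, 2.4, 2.1]
b = b_value(mags, mc=2.0)
```

    A common refinement subtracts half the magnitude binning width from Mc to correct for discretized magnitudes; the sketch omits this.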

  7. STARD 2015: An Updated List of Essential Items for Reporting Diagnostic Accuracy Studies.

    PubMed

    Bossuyt, Patrick M; Reitsma, Johannes B; Bruns, David E; Gatsonis, Constantine A; Glasziou, Paul P; Irwig, Les; Lijmer, Jeroen G; Moher, David; Rennie, Drummond; de Vet, Henrica C W; Kressel, Herbert Y; Rifai, Nader; Golub, Robert M; Altman, Douglas G; Hooft, Lotty; Korevaar, Daniël A; Cohen, Jérémie F

    2015-12-01

    Incomplete reporting has been identified as a major source of avoidable waste in biomedical research. Essential information is often not provided in study reports, impeding the identification, critical appraisal, and replication of studies. To improve the quality of reporting of diagnostic accuracy studies, the Standards for Reporting of Diagnostic Accuracy Studies (STARD) statement was developed. Here we present STARD 2015, an updated list of 30 essential items that should be included in every report of a diagnostic accuracy study. This update incorporates recent evidence about sources of bias and variability in diagnostic accuracy and is intended to facilitate the use of STARD. As such, STARD 2015 may help to improve completeness and transparency in reporting of diagnostic accuracy studies. PMID:26509226

  9. High accuracy NMR chemical shift corrected for bulk magnetization as a tool for structural elucidation of dilutable microemulsions. Part 1 - Proof of concept.

    PubMed

    Hoffman, Roy E; Darmon, Eliezer; Aserin, Abraham; Garti, Nissim

    2016-02-01

    In microemulsions, changes in droplet size and shape, and possible phase transformations, occur under various conditions. Microemulsions are difficult to characterize with most analytical tools because of their nano-sized structure and dynamic nature, so several methods are usually combined to obtain reliable information and guide the scientist in understanding their physical behavior. We felt there was a need for a technique that complements those in use today and provides more information on microemulsion behavior, mainly as a function of dilution with water. The improvement of NMR chemical shift measurements to be independent of bulk magnetization effects makes it possible to study very weak intermolecular chemical shift effects. In the present study, we used NMR high-resolution magic angle spinning to measure the chemical shift very accurately, free of bulk magnetization effects. The chemical shift of the microemulsion components was measured as a function of water content in order to validate the method on an interesting and promising U-type dilutable microemulsion that had previously been studied by a variety of techniques. Phase transition points of the microemulsion (O/W, bicontinuous, W/O) and changes in droplet shape were successfully detected using high-accuracy chemical shift measurements. We analyzed the results and found them to be compatible with the previous studies, paving the way for high-accuracy chemical shifts to be used in the study of other microemulsion systems. We detected two transition points along the water dilution line of the concentrate (reverse micelles), corresponding to the transitions from swollen W/O nano-droplets to a bicontinuous phase to O/W droplets, along with the changes in droplet size and shape. The method is in excellent agreement with the other techniques applied previously and stands out as easy and valid. PMID:25113928

  10. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult, dedicated software tools are widely used. These packages are often third-party products built on top of standard simulation software; typical examples, such as TOMLAB and DYNOPT, can be applied effectively to dynamic programming problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory, given a description of the process, the cost to be minimized, and equality and inequality constraints, using orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the dynamic model to be optimized can be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capabilities of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT for dynamic optimization problems through case studies of selected laboratory physical models used in education.
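
    DYNOPT itself uses orthogonal collocation in MATLAB; the underlying idea of turning an optimal control problem into a finite-dimensional nonlinear program by parameterizing the control can be shown with a much simpler direct approach. A hedged Python sketch (single shooting with piecewise-constant control and Euler integration, not DYNOPT's method) for min ∫u² dt subject to x' = u, x(0) = 0, x(1) = 1, whose analytic optimum is u ≡ 1 with cost 1:

```python
import numpy as np
from scipy.optimize import minimize

# Minimise the integral of u(t)^2 over [0,1] subject to x' = u, x(0)=0, x(1)=1.
N = 20
dt = 1.0 / N

def terminal_state(u):
    return np.sum(u) * dt          # Euler integration of x' = u from x(0) = 0

def cost(u):
    return np.sum(u**2) * dt       # rectangle-rule approximation of the integral

res = minimize(cost, np.zeros(N), method="SLSQP",
               constraints={"type": "eq",
                            "fun": lambda u: terminal_state(u) - 1.0})
u_opt = res.x                      # close to 1.0 on every interval
```

    Collocation methods such as DYNOPT's additionally parameterize the state profile and enforce the dynamics as algebraic constraints at the collocation points, which scales better for stiff ODE/DAE models.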

  11. Dynamic Development of Complexity and Accuracy: A Case Study in Second Language Academic Writing

    ERIC Educational Resources Information Center

    Rosmawati

    2014-01-01

    This paper reports on the development of complexity and accuracy in English as a Second Language (ESL) academic writing. Although research into complexity and accuracy development in second language (L2) writing has been well established, few studies have assumed the multidimensionality of these two constructs (Norris & Ortega, 2009) or…

  12. Effects of Varying Feedback Accuracy on Task Acquisition: A Computerized Translational Study

    ERIC Educational Resources Information Center

    Hirst, Jason M.; DiGennaro Reed, Florence D.; Reed, Derek D.

    2013-01-01

    Research has shown that the accuracy of instructions influences responding immediately and under later conditions. The purpose of the present study was to extend this literature and use a translational approach to assess the short- and long-term effects of feedback accuracy on the acquisition of a task. Three levels of inaccurate feedback were…

  13. Accuracy of Self-Evaluation in Adults with ADHD: Evidence from a Driving Study

    ERIC Educational Resources Information Center

    Knouse, Laura E.; Bagwell, Catherine L.; Barkley, Russell A.; Murphy, Kevin R.

    2005-01-01

    Research on children with ADHD indicates an association with inaccuracy of self-appraisal. This study examines the accuracy of self-evaluations in clinic-referred adults diagnosed with ADHD. Self-assessments and performance measures of driving in naturalistic settings and on a virtual-reality driving simulator are used to assess accuracy of…

  14. Alternative Confidence Interval Methods Used in the Diagnostic Accuracy Studies

    PubMed Central

    Gülhan, Orekıcı Temel

    2016-01-01

    Background/Aim. It is necessary to decide whether newly developed methods are better than the standard or reference test. To decide whether a new diagnostic test is better than the gold standard test (or an imperfect standard test), the differences in estimated sensitivity/specificity are calculated from information obtained from samples. However, to generalize this value to the population, it should be reported with confidence intervals. The aim of this study is to evaluate, on a clinical application, the confidence interval methods developed for the difference between two dependent sensitivity/specificity values. Materials and Methods. In this study, confidence interval methods such as asymptotic intervals, conditional intervals, unconditional intervals, score intervals, and nonparametric methods based on relative effects are used. As the clinical application, data used in the diagnostic study by Dickel et al. (2010) are taken as a sample. Results. The results of the alternative confidence interval methods for nickel sulfate, potassium dichromate, and lanolin alcohol are tabulated. Conclusion. In choosing a confidence interval method, researchers have to consider whether the comparison involves a single ratio or the difference of dependent binary ratios, the correlation coefficient between the rates in the two dependent ratios, and the sample sizes. PMID:27478491
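
    The simplest of the interval families discussed is the asymptotic (Wald) interval. When two tests are applied to the same diseased subjects, their sensitivities are dependent, and the interval for the difference can be computed from the paired 2x2 counts. A sketch under that assumption (the counts are invented, not Dickel et al.'s data):

```python
import math
from statistics import NormalDist

def paired_diff_ci(a, b, c, d, alpha=0.05):
    """Wald CI for the difference of two dependent proportions.
    Paired counts: a = both tests positive, b = only test 1 positive,
    c = only test 2 positive, d = both negative."""
    n = a + b + c + d
    diff = (b - c) / n                               # p1 - p2 = (a+b)/n - (a+c)/n
    se = math.sqrt(b + c - (b - c) ** 2 / n) / n     # Wald SE for paired data
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return diff - z * se, diff + z * se

lo, hi = paired_diff_ci(a=40, b=10, c=4, d=6)
```

    Score-type intervals generally behave better than this Wald form for small samples or proportions near 0 or 1, which is part of what the study compares.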

  15. A computational tool for identifying minimotifs in protein-protein interactions and improving the accuracy of minimotif predictions.

    PubMed

    Rajasekaran, Sanguthevar; Merlin, Jerlin Camilus; Kundeti, Vamsi; Mi, Tian; Oommen, Aaron; Vyas, Jay; Alaniz, Izua; Chung, Keith; Chowdhury, Farah; Deverasatty, Sandeep; Irvey, Tenisha M; Lacambacal, David; Lara, Darlene; Panchangam, Subhasree; Rathnayake, Viraj; Watts, Paula; Schiller, Martin R

    2011-01-01

    Protein-protein interactions are important to understanding cell functions; however, our theoretical understanding is limited. There is a general discontinuity between the well-accepted physical and chemical forces that drive protein-protein interactions and the large collections of identified protein-protein interactions in various databases. Minimotifs are short functional peptide sequences that provide a basis to bridge this gap in knowledge. However, there is no systematic way to study minimotifs in the context of protein-protein interactions or vice versa. Here we have engineered a set of algorithms that can be used to identify minimotifs in known protein-protein interactions and implemented this for use by scientists in Minimotif Miner. By globally testing these algorithms on verified data and on 100 individual proteins as test cases, we demonstrate the utility of these new computation tools. This tool also can be used to reduce false-positive predictions in the discovery of novel minimotifs. The statistical significance of these algorithms is demonstrated by an ROC analysis (P = 0.001). PMID:20938975
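
    The core of any minimotif search is scanning protein sequences for short consensus patterns. A toy sketch of the idea (not the Minimotif Miner algorithms; the patterns and sequence are illustrative):

```python
import re

def find_minimotifs(sequence, patterns):
    """Scan a protein sequence (1-letter codes) for short motif patterns
    expressed as regular expressions; return (name, position, match) hits."""
    hits = []
    for name, pat in patterns.items():
        for m in re.finditer(pat, sequence):
            hits.append((name, m.start(), m.group()))
    return hits

# Two illustrative consensus patterns: P-x-x-P and an N-glycosylation-like site
patterns = {"PxxP": r"P..P", "N-glyc": r"N[^P][ST]"}
seq = "MKTPAAPQNGSV"
hits = find_minimotifs(seq, patterns)
```

    Reducing false positives, as the paper describes, then amounts to filtering such raw hits against contextual evidence, for instance requiring the motif-bearing protein to interact with a known partner of the motif.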

  16. Twin Studies: A Unique Epidemiological Tool

    PubMed Central

    Sahu, Monalisha; Prasuna, Josyula G

    2016-01-01

    Twin studies are a special type of epidemiological study designed to measure the contribution of genetics, as opposed to the environment, to a given trait. Despite the fact that classical twin studies are still guided by assumptions made back in the 1920s, and that an inherent limitation lies in the study design itself, results suggested by earlier twin studies have often been confirmed later by molecular genetic studies. The use of twin registries and of various innovative yet complex software packages, such as SAS and its extensions (e.g., SAS PROC GENMOD and SAS PROC PHREG), has increased the potential of this epidemiological tool to contribute significantly to genetics and other life sciences. PMID:27385869

  17. Evaluation of accuracy of cone beam computed tomography for measurement of periodontal defects: A clinical study

    PubMed Central

    Banodkar, Akshaya Bhupesh; Gaikwad, Rajesh Prabhakar; Gunjikar, Tanay Udayrao; Lobo, Tanya Arthur

    2015-01-01

    Aims: The aim of the present study was to evaluate the accuracy of cone beam computed tomography (CBCT) measurements of alveolar bone defects caused by periodontal disease, by comparing them with direct surgical measurements, the gold standard. Materials and Methods: One hundred periodontal bone defects in fifteen patients suffering from periodontitis and scheduled for flap surgery were included in the study. On the day of surgery, prior to anesthesia, a CBCT scan of the quadrant to be operated on was taken. After reflection of the flap, clinical measurements of each periodontal defect were made using a reamer and a digital vernier caliper. The measurements taken during surgery were then compared with the CBCT measurements and subjected to statistical analysis using Pearson's correlation test. Results: Overall, there was a very high correlation of 0.988 between the surgical and CBCT measurements. By defect type, the correlation was higher for horizontal defects than for vertical defects. Conclusions: CBCT is highly accurate in the measurement of periodontal defects and proves to be a very useful tool in periodontal diagnosis and treatment assessment. PMID:26229268
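
    The comparison above rests on Pearson's product-moment correlation between paired surgical and CBCT readings, which is straightforward to compute. A sketch with invented (not the study's) data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative paired defect depths in mm (surgical vs. CBCT)
surgical = [4.1, 5.0, 6.2, 3.8, 7.5]
cbct     = [4.0, 5.1, 6.0, 3.9, 7.4]
r = pearson_r(surgical, cbct)  # close to 1 for near-identical measurements
```

    Note that a high r shows the two methods co-vary, not that they agree in absolute terms; agreement analyses (e.g., Bland-Altman) are often reported alongside it.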

  18. The predictive accuracy of PREDICT: a personalized decision-making tool for Southeast Asian women with breast cancer.

    PubMed

    Wong, Hoong-Seam; Subramaniam, Shridevi; Alias, Zarifah; Taib, Nur Aishah; Ho, Gwo-Fuang; Ng, Char-Hong; Yip, Cheng-Har; Verkooijen, Helena M; Hartman, Mikael; Bhoo-Pathy, Nirmala

    2015-02-01

    Web-based prognostication tools may provide a simple and economically feasible option to aid prognostication and selection of chemotherapy in early breast cancers. We validated PREDICT, a free online breast cancer prognostication and treatment benefit tool, in a resource-limited setting. All 1480 patients who underwent complete surgical treatment for stages I to III breast cancer from 1998 to 2006 were identified from the prospective breast cancer registry of University Malaya Medical Centre, Kuala Lumpur, Malaysia. Calibration was evaluated by comparing the model-predicted overall survival (OS) with patients' actual OS. Model discrimination was tested using receiver-operating characteristic (ROC) analysis. Median age at diagnosis was 50 years. The median tumor size at presentation was 3 cm and 54% of patients had lymph node-negative disease. About 55% of women had estrogen receptor-positive breast cancer. Overall, the model-predicted 5 and 10-year OS was 86.3% and 77.5%, respectively, whereas the observed 5 and 10-year OS was 87.6% (difference: -1.3%) and 74.2% (difference: 3.3%), respectively; P values for goodness-of-fit test were 0.18 and 0.12, respectively. The program was accurate in most subgroups of patients, but significantly overestimated survival in patients aged <40 years, and in those receiving neoadjuvant chemotherapy. PREDICT performed well in terms of discrimination; areas under ROC curve were 0.78 (95% confidence interval [CI]: 0.74-0.81) and 0.73 (95% CI: 0.68-0.78) for 5 and 10-year OS, respectively. Based on its accurate performance in this study, PREDICT may be clinically useful in prognosticating women with breast cancer and personalizing breast cancer treatment in resource-limited settings. PMID:25715267

  19. Photochemical tools to study dynamic biological processes

    PubMed Central

    Specht, Alexandre; Bolze, Frédéric; Omran, Ziad; Nicoud, Jean-François; Goeldner, Maurice

    2009-01-01

    Light-responsive biologically active compounds offer the possibility of studying the dynamics of biological processes. Phototriggers and photoswitches have been designed that can rapidly initiate a wide range of dynamic biological phenomena. In this article, we discuss recent developments in the field of light-triggered chemical tools, especially how two-photon excitation, “caged” fluorophores, and the photoregulation of protein activities in combination with time-resolved x-ray techniques should break new ground in the understanding of dynamic biological processes. PMID:20119482

  20. Analysis of Accuracy in Pointing with Redundant Hand-held Tools: A Geometric Approach to the Uncontrolled Manifold Method

    PubMed Central

    Campolo, Domenico; Widjaja, Ferdinan; Xu, Hong; Ang, Wei Tech; Burdet, Etienne

    2013-01-01

    This work introduces a coordinate-independent method to analyse the movement variability of tasks performed with hand-held tools, such as a pen or a surgical scalpel. We extend the classical uncontrolled manifold (UCM) approach by exploiting the geometry of rigid body motions, used to describe tool configurations. In particular, we analyse variability during a static pointing task with a hand-held tool, where subjects are asked to keep the tool tip in steady contact with another object. In this case the tool is redundant with respect to the task, as subjects control the position/orientation of the tool, i.e. 6 degrees of freedom (dof), to maintain the tool tip position (3 dof) steady. To test the new method, subjects performed the pointing task with and without arm support. The additional dof introduced in the unsupported condition, injecting more variability into the system, represented a resource for minimising variability in the task space via coordinated motion. The results show that all seven subjects channeled more variability along directions not directly affecting the task (the UCM), consistent with previous literature but now shown in a coordinate-independent way. Variability in the unsupported condition was only slightly larger at the endpoint but much larger within the UCM. PMID:23592956
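
    A UCM analysis partitions trial-to-trial variability into the component lying in the null space of the task Jacobian (motion that leaves the task unaffected) and its orthogonal complement (motion that does affect the task). A minimal numerical sketch of that partition for a generic linearized task (not the paper's rigid-body formulation; the Jacobian and data are invented):

```python
import numpy as np

def ucm_variance(devs, J):
    """Split per-dof variance of elemental-variable deviations into the part in
    the null space of the task Jacobian J (UCM) and its orthogonal complement."""
    _, s, Vt = np.linalg.svd(J)
    rank = int(np.sum(s > 1e-10))
    null_basis = Vt[rank:].T                    # n x (n - rank), orthonormal
    ucm_coord = devs @ null_basis               # coordinates within the UCM
    ort_part = devs - ucm_coord @ null_basis.T  # component affecting the task
    n = J.shape[1]
    v_ucm = np.sum(ucm_coord ** 2) / (len(devs) * (n - rank))
    v_ort = np.sum(ort_part ** 2) / (len(devs) * rank)
    return v_ucm, v_ort

rng = np.random.default_rng(0)
J = np.array([[1.0, 1.0, 1.0]])                 # toy task: sum of three dof
devs = rng.normal(size=(500, 3))                # trial-to-trial deviations
devs -= devs.mean(axis=0)
v_ucm, v_ort = ucm_variance(devs, J)            # both near 1 for isotropic noise
```

    The UCM signature reported in such studies is v_ucm substantially exceeding v_ort; for the isotropic noise above the two are equal, serving as the null case.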

  1. Eyewitness memory of a supermarket robbery: a case study of accuracy and confidence after 3 months.

    PubMed

    Odinot, Geralda; Wolters, Gezinus; van Koppen, Peter J

    2009-12-01

    In this case study, 14 witnesses of an armed robbery were interviewed after 3 months. Security camera recordings were used to assess memory accuracy. Of all information that could be remembered, about 84% was correct. Although accurately recalled information carried a higher confidence level on average than inaccurately recalled information, the mean accuracy-confidence correlation was rather modest (0.38). These findings indicate that confidence is not a reliable predictor of accuracy. A higher level of self-reported post-event thinking about the incident was associated with higher confidence levels, while a higher level of self-reported emotional impact was associated with greater accuracy. A potential source of (mis)information, a reconstruction of the robbery broadcast on TV, did not alter the original memories of the witnesses. PMID:18719983

  2. Fluorescence microscopy: A tool to study autophagy

    NASA Astrophysics Data System (ADS)

    Rai, Shashank; Manjithaya, Ravi

    2015-08-01

    Autophagy is a cellular recycling process through which a cell degrades old and damaged components, such as organelles and proteins, and reuses the degradation products to provide energy and building blocks. Dysfunctional autophagy is reported in several pathological situations; hence, autophagy plays an important role in both cellular homeostasis and disease. Autophagy can be studied through various techniques, including fluorescence-based microscopy. With advances in fluorescence microscopy technology, several novel aspects of autophagy have been discovered, making it an essential tool for autophagy research. Moreover, the ability to tag subcellular targets with fluorescent proteins has enabled us to follow autophagy processes in real time under the fluorescence microscope. In this article, we demonstrate different aspects of autophagy in two model organisms, yeast and mammalian cells, with the help of fluorescence microscopy.

  3. Short-Term Forecasting of Loads and Wind Power for Latvian Power System: Accuracy and Capacity of the Developed Tools

    NASA Astrophysics Data System (ADS)

    Radziukynas, V.; Klementavičius, A.

    2016-04-01

    The paper analyses the performance of the recently developed short-term forecasting suite for the Latvian power system. The system load and wind power are forecasted using ANN and ARIMA models, respectively, and the forecasting accuracy is evaluated in terms of errors, mean absolute errors and mean absolute percentage errors. The influence of additional input variables on load forecasting errors is investigated. The interplay of hourly load and wind power forecasting errors is also evaluated for the Latvian power system with historical loads (the year 2011) and planned wind power capacities (the year 2023).
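
    The error measures used for the evaluation, mean absolute error (MAE) and mean absolute percentage error (MAPE), are simple to compute. A sketch with invented load values (not the Latvian data):

```python
def mae(actual, forecast):
    """Mean absolute error, in the units of the series."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def mape(actual, forecast):
    """Mean absolute percentage error (%); assumes no zero actual values."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

# Illustrative hourly loads (MW) and their forecasts
actual   = [900.0, 950.0, 1000.0, 980.0]
forecast = [890.0, 960.0,  990.0, 1000.0]
err_mae, err_mape = mae(actual, forecast), mape(actual, forecast)
```

    MAPE is popular for load forecasting because loads stay well away from zero; for wind power, which routinely drops to zero, errors are often normalized by installed capacity instead.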

  4. Do knowledge, knowledge sources and reasoning skills affect the accuracy of nursing diagnoses? a randomised study

    PubMed Central

    2012-01-01

    Background This paper reports a study of the effect of knowledge sources, such as handbooks, an assessment format and a predefined record structure for diagnostic documentation, as well as the influence of knowledge, disposition toward critical thinking and reasoning skills, on the accuracy of nursing diagnoses. Knowledge sources can support nurses in deriving diagnoses. A nurse's disposition toward critical thinking and reasoning skills is also thought to influence the accuracy of his or her nursing diagnoses. Method A randomised factorial design was used in 2008-2009 to determine the effect of knowledge sources. We used the following instruments to assess the influence of ready knowledge, disposition, and reasoning skills on the accuracy of diagnoses: (1) a knowledge inventory, (2) the California Critical Thinking Disposition Inventory, and (3) the Health Science Reasoning Test. Nurses (n = 249) were randomly assigned to one of four factorial groups and were instructed to derive diagnoses based on an assessment interview with a simulated patient/actor. Results The use of a predefined record structure resulted in significantly higher accuracy of nursing diagnoses. A regression analysis reveals that almost half of the variance in the accuracy of diagnoses is explained by the use of a predefined record structure, a nurse's age and the reasoning skills of 'deduction' and 'analysis'. Conclusions Improving nurses' dispositions toward critical thinking and reasoning skills, together with the use of a predefined record structure, improves the accuracy of nursing diagnoses. PMID:22852577

  5. Points Clouds Generation Using TLS and Dense-Matching Techniques. A Test on Approachable Accuracies of Different Tools

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Spanò, A.

    2013-07-01

    3D detailed models derived from digital survey techniques have been increasingly developed and applied in many fields, ranging from land and urban area surveys using remotely sensed data, to landscape assets, and finally to Cultural Heritage items. The highly detailed content and accuracy of such models make them attractive and usable for a wide range of purposes. The present paper focuses on a test of point cloud generation from archaeological data; active and passive sensor techniques and the related image matching systems were used in order to evaluate and compare the accuracy of the results achievable using proper TLS versus low-cost image-matching software and techniques. After a short review of the available methods, some of the results attained are discussed; the test area consists of a set of mosaic floorings in a late Roman domus located in Aquileia (UD, Italy), requiring a very high level of detail and high scale and precision. The experimental section describes the tests applied in order to compare the different software packages and the employed methods.

  6. Study the effect of gray component replacement level on reflectance spectra and color reproduction accuracy

    NASA Astrophysics Data System (ADS)

    Spiridonov, I.; Shopova, M.; Boeva, R.

    2013-03-01

    The aim of this study is to investigate the effect of gray component replacement (GCR) levels on the reflectance spectra of different ink overprints and on color reproduction accuracy. GCR is the method most commonly implemented in practice for generating the achromatic composition. The experiments in this study have been performed under real production conditions with a special test form generated by specialized software. Measuring the reflectance spectra of the printed colors gives a complete picture of the effect of different GCR levels on color reproduction accuracy. For better data analysis and modeling of the processes, we converted the reflectance spectra to CIE L*a*b* color coordinates. Color accuracy for different GCR amounts was assessed by calculating the color difference ΔE*ab. In addition, for the specific printing conditions, we created ICC profiles with different GCR amounts and compared the resulting color gamuts. For the first time, a methodology is implemented for examining and estimating the effect of GCR levels on color reproduction accuracy by studying a large number of colors across the entire visible spectrum. Implementing the results of this experiment in practice will lead to improved gray balance and better color accuracy. Another important effect of this research is the reduction of printing production costs through decreased ink consumption, an indirect reduction of emissions during ink manufacture, and easier deinking during paper recycling.
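
    The color difference ΔE*ab used for the accuracy assessment is, in its classic CIE76 form, simply the Euclidean distance between two CIELAB coordinates. A sketch with illustrative coordinates:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference ΔE*ab between two (L*, a*, b*) coordinates."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# A ΔE*ab around 1 is commonly taken as a just-noticeable difference
reference = (50.0, 2.0, -3.0)   # L*, a*, b* of the target colour
printed   = (51.0, 2.5, -2.0)   # same patch printed with a different GCR level
de = delta_e_ab(reference, printed)
```

    Later formulas (CIE94, CIEDE2000) weight the lightness, chroma and hue terms to match perception more closely, but CIE76 remains the usual baseline in print quality work.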

  7. Accuracy Studies of a Magnetometer-Only Attitude-and-Rate-Determination System

    NASA Technical Reports Server (NTRS)

    Challa, M. (Editor); Wheeler, C. (Editor)

    1996-01-01

    A personal computer based system was recently prototyped that uses measurements from a three-axis magnetometer (TAM) to estimate the attitude and rates of a spacecraft with no a priori knowledge of the spacecraft's state. Past studies using in-flight data from the Solar, Anomalous, and Magnetospheric Particle Explorer focused on the robustness of the system and demonstrated that attitude and rate estimates could be obtained accurately to 1.5 degrees (deg) and 0.01 deg per second (deg/sec), respectively, despite limitations in the data and in the accuracies of the truth models. This paper studies the accuracy of the Kalman filter in the system, using several orbits of in-flight Earth Radiation Budget Satellite (ERBS) data and attitude and rate truth models obtained from high-precision sensors, to demonstrate its practical capabilities. This paper shows the following: using telemetered TAM data, attitude accuracies of 0.2 to 0.4 deg and rate accuracies of 0.002 to 0.005 deg/sec (within the ERBS attitude control requirements of 1 deg and 0.0005 deg/sec) can be obtained with minimal tuning of the filter; replacing the TAM data in the telemetry with simulated TAM data yields corresponding accuracies of 0.1 to 0.2 deg and 0.002 to 0.005 deg/sec, demonstrating that the filter's accuracy can be significantly enhanced by further calibrating the TAM. Factors affecting the filter's accuracy and techniques for tuning the system's Kalman filter are also presented.
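
    The tuning referred to above comes down to choosing the filter's process-noise and measurement-noise variances. A scalar toy Kalman filter (a generic illustration only; the actual TAM filter estimates full attitude and rates) shows the role of these two knobs:

```python
def kalman_step(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.
    x, p: state estimate and its variance; z: new measurement;
    q: process-noise variance, r: measurement-noise variance (tuning knobs)."""
    # Predict (identity dynamics in this sketch)
    p = p + q
    # Update
    k = p / (p + r)              # Kalman gain: weight given to the measurement
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0                  # poor initial guess with large uncertainty
for z in [0.9, 1.1, 1.0, 0.95, 1.05]:   # noisy measurements of a constant ~1.0
    x, p = kalman_step(x, p, z, q=1e-4, r=0.01)
```

    Raising q makes the filter trust new measurements more (faster, noisier tracking); raising r does the opposite, which is the trade-off behind the "minimal tuning" remark in the abstract.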

  8. Prediction accuracy of a sample-size estimation method for ROC studies

    PubMed Central

    Chakraborty, Dev P.

    2010-01-01

    Rationale and Objectives Sample-size estimation is an important consideration when planning a receiver operating characteristic (ROC) study. The aim of this work was to assess the prediction accuracy of a sample-size estimation method using the Monte Carlo simulation method. Materials and Methods Two ROC ratings simulators characterized by low reader and high case variabilities (LH) and high reader and low case variabilities (HL) were used to generate pilot data sets in 2 modalities. Dorfman-Berbaum-Metz multiple-reader multiple-case (DBM-MRMC) analysis of the ratings yielded estimates of the modality-reader, modality-case and error variances. These were input to the Hillis-Berbaum (HB) sample-size estimation method, which predicted the number of cases needed to achieve 80% power for 10 readers and an effect size of 0.06 in the pivotal study. Predictions that generalized to readers and cases (random-all), to cases only (random-cases) and to readers only (random-readers) were generated. A prediction-accuracy index defined as the probability that any single prediction yields true power in the range 75% to 90% was used to assess the HB method. Results For random-case generalization the HB-method prediction-accuracy was reasonable, ~ 50% for 5 readers in the pilot study. Prediction-accuracy was generally higher under low reader variability conditions (LH) than under high reader variability conditions (HL). Under ideal conditions (many readers in the pilot study) the DBM-MRMC based HB method overestimated the number of cases. The overestimates could be explained by the observed large variability of the DBM-MRMC modality-reader variance estimates, particularly when reader variability was large (HL). The largest benefit of increasing the number of readers in the pilot study was realized for LH, where 15 readers were enough to yield prediction accuracy > 50% under all generalization conditions, but the benefit was lesser for HL where prediction accuracy was ~ 36% for 15

  9. Experimental studies of high-accuracy RFID localization with channel impairments

    NASA Astrophysics Data System (ADS)

    Pauls, Eric; Zhang, Yimin D.

    2015-05-01

    Radio frequency identification (RFID) systems present an incredibly cost-effective and easy-to-implement solution to close-range localization. One of the important applications of a passive RFID system is to determine the reader position through multilateration based on the estimated distances between the reader and multiple distributed reference tags obtained from, e.g., the received signal strength indicator (RSSI) readings. In practice, the achievable accuracy of passive RFID reader localization suffers from many factors, such as the distorted RSSI reading due to channel impairments in terms of the susceptibility to reader antenna patterns and multipath propagation. Previous studies have shown that the accuracy of passive RFID localization can be significantly improved by properly modeling and compensating for such channel impairments. The objective of this paper is to report experimental study results that validate the effectiveness of such approaches for high-accuracy RFID localization. We also examine a number of practical issues arising in the underlying problem that limit the accuracy of reader-tag distance measurements and, therefore, the estimated reader localization. These issues include the variations in tag radiation characteristics for similar tags, effects of tag orientations, and reader RSS quantization and measurement errors. As such, this paper reveals valuable insights into the issues and solutions toward achieving high-accuracy passive RFID localization.
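    The RSSI-based multilateration pipeline described above has two steps: invert a path-loss model to get reader-tag distances, then solve for the reader position. A minimal 2-D sketch; the path-loss parameters and the tag layout are assumptions for illustration, not values from the experiments:

```python
import math

def rssi_to_distance(rssi, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Invert a log-distance path-loss model (assumed parameters):
    d = 10 ** ((P0 - RSSI) / (10 * n)), with d in metres."""
    return 10 ** ((rssi_at_1m - rssi) / (10.0 * path_loss_exp))

def multilaterate(anchors, dists):
    """Least-squares 2-D reader position from >= 3 reference-tag positions
    and estimated distances, linearized about the first anchor."""
    (x1, y1), d1 = anchors[0], dists[0]
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        ax, ay = 2.0 * (xi - x1), 2.0 * (yi - y1)
        rhs = d1 ** 2 - di ** 2 + xi ** 2 - x1 ** 2 + yi ** 2 - y1 ** 2
        a11 += ax * ax; a12 += ax * ay; a22 += ay * ay
        b1 += ax * rhs; b2 += ay * rhs
    det = a11 * a22 - a12 * a12  # normal equations of the linear system
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Reader at (2, 1) with reference tags at the corners of a 4 m square
tags = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
true_d = [math.hypot(x - 2.0, y - 1.0) for x, y in tags]
x, y = multilaterate(tags, true_d)
print(round(x, 3), round(y, 3))  # → 2.0 1.0
```

    With distorted RSSI readings, the same normal equations yield the least-squares position, which is where the channel-impairment modeling discussed in the paper pays off.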

  10. Dose calculation accuracies in whole breast radiotherapy treatment planning: a multi-institutional study.

    PubMed

    Hatanaka, Shogo; Miyabe, Yuki; Tohyama, Naoki; Kumazaki, Yu; Kurooka, Masahiko; Okamoto, Hiroyuki; Tachibana, Hidenobu; Kito, Satoshi; Wakita, Akihisa; Ohotomo, Yuko; Ikagawa, Hiroyuki; Ishikura, Satoshi; Nozaki, Miwako; Kagami, Yoshikazu; Hiraoka, Masahiro; Nishio, Teiji

    2015-07-01

    Our objective in this study was to evaluate the variation in the doses delivered among institutions due to dose calculation inaccuracies in whole breast radiotherapy. We have developed practical procedures for quality assurance (QA) of radiation treatment planning systems. These QA procedures are designed to be performed easily at any institution and to permit comparisons of results across institutions. The dose calculation accuracy was evaluated across seven institutions using various irradiation conditions. In some conditions, there was a >3 % difference between the calculated dose and the measured dose. The dose calculation accuracy differs among institutions because it is dependent on both the dose calculation algorithm and beam modeling. The QA procedures in this study are useful for verifying the accuracy of the dose calculation algorithm and of the beam model before clinical use for whole breast radiotherapy. PMID:25646770
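    The >3 % criterion above is a percent difference of the calculated dose against the measured dose. A minimal sketch with hypothetical point doses:

```python
def percent_difference(calculated, measured):
    """Percent difference of a TPS-calculated dose against the measured dose,
    with the measurement taken as the reference value."""
    return 100.0 * (calculated - measured) / measured

# Hypothetical point doses (Gy) for one irradiation condition
diff = percent_difference(calculated=2.07, measured=2.00)
print(round(diff, 1), abs(diff) <= 3.0)  # → 3.5 False
```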

  11. Precision Fabrication of a Large-Area Sinusoidal Surface Using a Fast-Tool-Servo Technique: Improvement of Local Fabrication Accuracy

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Tano, Makoto; Araki, Takeshi; Kiyono, Satoshi

    This paper describes a diamond turning fabrication system for a sinusoidal grid surface. The wavelength and amplitude of the sinusoidal wave in each direction are 100µm and 100nm, respectively. The fabrication system, which is based on a fast-tool-servo (FTS), has the ability to generate the angle grid surface over an area of φ 150mm. This paper focuses on the improvement of the local fabrication accuracy. The areas considered are each approximately 1 × 1mm, and can be imaged by an interference microscope. Specific fabrication errors of the manufacturing process, caused by the round nose geometry of the diamond cutting tool and the data digitization, are successfully identified by Discrete Fourier Transform of the microscope images. Compensation processes are carried out to reduce the errors. As a result, the fabrication errors in local areas of the angle grid surface are reduced to approximately 1/10 of their original values.

  12. Immunogenetics as a tool in anthropological studies.

    PubMed

    Sanchez-Mazas, Alicia; Fernandez-Viña, Marcelo; Middleton, Derek; Hollenbach, Jill A; Buhler, Stéphane; Di, Da; Rajalingam, Raja; Dugoujon, Jean-Michel; Mack, Steven J; Thorsby, Erik

    2011-06-01

    The genes coding for the main molecules involved in the human immune system--immunoglobulins, human leucocyte antigen (HLA) molecules and killer-cell immunoglobulin-like receptors (KIR)--exhibit a very high level of polymorphism that reveals remarkable frequency variation in human populations. 'Genetic marker' (GM) allotypes located in the constant domains of IgG antibodies have been studied for over 40 years through serological typing, leading to the identification of a variety of GM haplotypes whose frequencies vary sharply from one geographic region to another. An impressive diversity of HLA alleles, which results in amino acid substitutions located in the antigen-binding region of HLA molecules, also varies greatly among populations. The KIR differ between individuals according to both gene content and allelic variation, and also display considerable population diversity. Whereas the molecular evolution of these polymorphisms has most likely been subject to natural selection, principally driven by host-pathogen interactions, their patterns of genetic variation worldwide show significant signals of human geographic expansion, demographic history and cultural diversification. As current developments in population genetic analysis and computer simulation improve our ability to discriminate among different--either stochastic or deterministic--forces acting on the genetic evolution of human populations, the study of these systems shows great promise for investigating both the peopling history of modern humans in the time since their common origin and human adaptation to past environmental (e.g. pathogenic) changes. Therefore, in addition to mitochondrial DNA, Y-chromosome, microsatellites, single nucleotide polymorphisms and other markers, immunogenetic polymorphisms represent essential and complementary tools for anthropological studies. PMID:21480890

  13. Immunogenetics as a tool in anthropological studies

    PubMed Central

    Sanchez-Mazas, Alicia; Fernandez-Viña, Marcelo; Middleton, Derek; Hollenbach, Jill A; Buhler, Stéphane; Di, Da; Rajalingam, Raja; Dugoujon, Jean-Michel; Mack, Steven J; Thorsby, Erik

    2011-01-01

    The genes coding for the main molecules involved in the human immune system – immunoglobulins, human leucocyte antigen (HLA) molecules and killer-cell immunoglobulin-like receptors (KIR) – exhibit a very high level of polymorphism that reveals remarkable frequency variation in human populations. ‘Genetic marker’ (GM) allotypes located in the constant domains of IgG antibodies have been studied for over 40 years through serological typing, leading to the identification of a variety of GM haplotypes whose frequencies vary sharply from one geographic region to another. An impressive diversity of HLA alleles, which results in amino acid substitutions located in the antigen-binding region of HLA molecules, also varies greatly among populations. The KIR differ between individuals according to both gene content and allelic variation, and also display considerable population diversity. Whereas the molecular evolution of these polymorphisms has most likely been subject to natural selection, principally driven by host–pathogen interactions, their patterns of genetic variation worldwide show significant signals of human geographic expansion, demographic history and cultural diversification. As current developments in population genetic analysis and computer simulation improve our ability to discriminate among different – either stochastic or deterministic – forces acting on the genetic evolution of human populations, the study of these systems shows great promise for investigating both the peopling history of modern humans in the time since their common origin and human adaptation to past environmental (e.g. pathogenic) changes. Therefore, in addition to mitochondrial DNA, Y-chromosome, microsatellites, single nucleotide polymorphisms and other markers, immunogenetic polymorphisms represent essential and complementary tools for anthropological studies. PMID:21480890

  14. Microinjection--a tool to study gravitropism

    NASA Technical Reports Server (NTRS)

    Scherp, P.; Hasenstein, K. H.

    2003-01-01

    Despite extensive studies on plant gravitropism this phenomenon is still poorly understood. The separation of gravity sensing, signal transduction and response is a common concept but especially the mechanism of gravisensing remains unclear. This paper focuses on microinjection as a powerful tool to investigate gravisensing in plants. We describe the microinjection of magnetic beads in rhizoids of the green alga Chara and the related subsequent manipulation of the gravisensing system. After injection, an external magnet can control the movement of the magnetic beads. We demonstrate successful injection of magnetic beads into rhizoids and describe a multitude of experiments that can be carried out to investigate gravitropism in Chara rhizoids. In addition to examining mechanical properties, bead microinjection is also useful for probing the function of the cytoskeleton by coating beads with drugs that interfere with the cytoskeleton. The injection of fluorescently labeled beads or probes may reveal the involvement of the cytoskeleton during gravistimulation and response in living cells. © 2003 COSPAR. Published by Elsevier Ltd. All rights reserved.

  15. The Accuracy of a Simple, Low-Cost GPS Data Logger/Receiver to Study Outdoor Human Walking in View of Health and Clinical Studies

    PubMed Central

    Noury-Desvaux, Bénédicte; Abraham, Pierre; Mahé, Guillaume; Sauvaget, Thomas; Leftheriotis, Georges; Le Faucheur, Alexis

    2011-01-01

    Introduction Accurate and objective measurements of physical activity and lower-extremity function are important in health and disease monitoring, particularly given the current epidemic of chronic diseases and their related functional impairment. Purpose The aim of the present study was to determine the accuracy of a handy (lightweight, small, only one stop/start button) and low-cost (∼$75 with its external antenna) Global Positioning System (GPS) data logger/receiver (the DG100) as a tool to study outdoor human walking from the perspective of health and clinical research studies. Methods Healthy subjects performed two experiments that consisted of different prescribed outdoor walking protocols. Experiment 1. We studied the accuracy of the DG100 for detecting bouts of walking and resting. Experiment 2. We studied the accuracy of the DG100 for estimating distances and speeds of walking. Results Experiment 1. The performance in the detection of bouts, expressed as the percentage of walking and resting bouts that were correctly detected, was 92.4% [95% Confidence Interval: 90.6–94.3]. Experiment 2. The coefficients of variation [95% Confidence Interval] for the accuracy of estimating the distances and speeds of walking were low: 3.1% [2.9–3.3] and 2.8% [2.6–3.1], respectively. Conclusion The DG100 produces acceptable accuracy both in detecting bouts of walking and resting and in estimating distances and speeds of walking during the detected walking bouts. However, before we can confirm that the DG100 can be used to study walking with respect to health and clinical studies, the inter- and intra-DG100 variability should be studied. Trial Registration ClinicalTrials.gov NCT00485147 PMID:21931593
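    Experiment 1's bout detection can be sketched as a simple threshold classifier over consecutive GPS speed samples; the threshold and the sample values below are assumed for illustration and are not from the study:

```python
def detect_bouts(speeds_kmh, walk_threshold=2.0):
    """Classify consecutive GPS speed samples (one per logging interval) into
    walking/resting bouts. The km/h threshold is an assumed illustration value."""
    bouts = []
    for s in speeds_kmh:
        state = "walk" if s >= walk_threshold else "rest"
        if bouts and bouts[-1][0] == state:
            bouts[-1][1] += 1          # extend the current bout
        else:
            bouts.append([state, 1])   # start a new bout
    return [(state, n) for state, n in bouts]

samples = [0.3, 0.5, 4.1, 4.4, 4.0, 0.2, 0.1, 3.8, 3.9]
print(detect_bouts(samples))  # → [('rest', 2), ('walk', 3), ('rest', 2), ('walk', 2)]
```

    Comparing bouts detected this way against the prescribed walking protocol is what yields the 92.4% detection figure reported above.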

  16. Assessing the Accuracy of a Child's Account of Sexual Abuse: A Case Study.

    ERIC Educational Resources Information Center

    Orbach, Yael; Lamb, Michael E.

    1999-01-01

    This study examined the accuracy of a 13-year-old girl's account of a sexually abusive incident. Information given by the victim was compared with an audiotaped record. Over 50% of information reported by the victim was corroborated by the audio record and 64% was confirmed by more than one source. (Author/CR)

  17. Do Fixation Cues Ensure Fixation Accuracy in Split-Fovea Studies of Word Recognition?

    ERIC Educational Resources Information Center

    Jordan, Timothy R.; Paterson, Kevin B.; Kurtev, Stoyan; Xu, Mengyun

    2009-01-01

    Many studies have claimed that hemispheric processing is split precisely at the foveal midline and so place great emphasis on the precise location at which words are fixated. These claims are based on experiments in which a variety of fixation procedures were used to ensure fixation accuracy but the effectiveness of these procedures is unclear. We…

  18. Who Should Mark What? A Study of Factors Affecting Marking Accuracy in a Biology Examination

    ERIC Educational Resources Information Center

    Suto, Irenka; Nadas, Rita; Bell, John

    2011-01-01

    Accurate marking is crucial to the reliability and validity of public examinations, in England and internationally. Factors contributing to accuracy have been conceptualised as affecting either marking task demands or markers' personal expertise. The aim of this empirical study was to develop this conceptualisation through investigating the…

  19. The Accuracy of Surgeons' Provided Estimates for the Duration of Hysterectomies: A Pilot Study

    PubMed Central

    Roque, Dario R.; Robison, Katina; Raker, Christina A.; Wharton, Gary G.; Frishman, Gary N.

    2016-01-01

    Study Objective To determine the accuracy of gynecologic surgeons' estimate of operative times for hysterectomies and to compare these with the existing computer-generated estimate at our institution. Design Pilot prospective cohort study (Canadian Task Force classification II-2). Setting Academic tertiary women's hospital in the Northeast United States. Participants Thirty gynecologic surgeons including 23 general gynecologists, 4 gynecologic oncologists, and 3 urogynecologists. Intervention Via a 6-question survey, surgeons were asked to predict the operative time for a hysterectomy they were about to perform. The surgeons' predictions were then compared with the time predicted by the scheduling system at our institution and with the actual operative time, to determine accuracy and differences between actual and predicted times. Patient and surgery data were collected to perform a secondary analysis to determine factors that may have significantly affected the prediction. Measurements and Main Results Of 75 hysterectomies analyzed, 36 were performed abdominally, 18 vaginally, and 21 laparoscopically. Accuracy was established if the actual procedure time was within the 15-minute increment predicted by either the surgeons or the scheduling system. The surgeons accurately predicted the duration of 20 hysterectomies (26.7%), whereas the accuracy of the scheduling system was only 9.3%. The scheduling system accuracy was significantly less precise than the surgeons, primarily due to overestimation (p = .01); operative time was overestimated on average 34 minutes. The scheduling system overestimated the time required to a greater extent than the surgeons for nearly all data examined, including patient body mass index, surgical approach, indication for surgery, surgeon experience, uterine size, and previous abdominal surgery. Conclusion Although surgeons' accuracy in predicting operative time was poor, it was significantly better than that of the computerized scheduling

  20. Tools for the study of dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    This thesis covers a range of topics in numerical and analytical relativity, centered around introducing tools and methodologies for the study of dynamical spacetimes. The scope of the studies is limited to classical (as opposed to quantum) vacuum spacetimes described by Einstein's general theory of relativity. The numerical works presented here are carried out within the Spectral Einstein Code (SpEC) infrastructure, while analytical calculations extensively utilize Wolfram's Mathematica program. We begin by examining highly dynamical spacetimes such as binary black hole mergers, which can be investigated using numerical simulations. However, there are difficulties in interpreting the output of such simulations. One difficulty stems from the lack of a canonical coordinate system (henceforth referred to as gauge freedom) and tetrad, against which quantities such as Newman-Penrose Psi4 (usually interpreted as the gravitational wave part of curvature) should be measured. We tackle this problem in Chapter 2 by introducing a set of geometrically motivated coordinates that are independent of the simulation gauge choice, as well as a quasi-Kinnersley tetrad, also invariant under gauge changes in addition to being optimally suited to the task of gravitational wave extraction. Another difficulty arises from the need to condense the overwhelming amount of data generated by the numerical simulations. In order to extract physical information in a succinct and transparent manner, one may define a version of gravitational field lines and field strength using spatial projections of the Weyl curvature tensor. Introduction, investigation and utilization of these quantities will constitute the main content in Chapters 3 through 6. For the last two chapters, we turn to the analytical study of a simpler dynamical spacetime, namely a perturbed Kerr black hole. We will introduce in Chapter 7 a new analytical approximation to the quasi-normal mode (QNM) frequencies, and relate various

  1. Visual DMDX: A web-based authoring tool for DMDX, a Windows display program with millisecond accuracy.

    PubMed

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2015-09-01

    DMDX is a software package for the experimental control and timing of stimulus display for Microsoft Windows systems. DMDX is reliable, flexible, millisecond accurate, and can be downloaded free of charge; therefore it has become very popular among experimental researchers. However, setting up a DMDX-based experiment is burdensome because of its command-based interface. Further, DMDX relies on RTF files in which parts of the stimuli, design, and procedure of an experiment are defined in a complicated (DMASTR-compatible) syntax. Other experiment software, such as E-Prime, PsychoPy, and WEXTOR, became successful as a result of integrated visual authoring tools. Such an intuitive interface was lacking for DMDX. We therefore created and present here Visual DMDX (http://visualdmdx.com/), an HTML5-based web interface to set up experiments and export them to the DMDX item file format in RTF. Visual DMDX offers most of the features available from the rich DMDX/DMASTR syntax, and it is a useful tool to support researchers who are new to DMDX. Both old and modern versions of DMDX syntax are supported. Further, with Visual DMDX, we go beyond DMDX by having added export to JSON (a versatile web format), easy backup, and a preview option for experiments. In two examples, one experiment each on lexical decision making and affective priming, we explain in a step-by-step fashion how to create experiments using Visual DMDX. We release Visual DMDX under an open-source license to foster collaboration in its continuous improvement. PMID:24912762

  2. Galaxy tools to study genome diversity

    PubMed Central

    2013-01-01

    Background Intra-species genetic variation can be used to investigate population structure, selection, and gene flow in non-model vertebrates; and due to the plummeting costs for genome sequencing, it is now possible for small labs to obtain full-genome variation data from their species of interest. However, those labs may not have easy access to, and familiarity with, computational tools to analyze those data. Results We have created a suite of tools for the Galaxy web server aimed at handling nucleotide and amino-acid polymorphisms discovered by full-genome sequencing of several individuals of the same species, or using a SNP genotyping microarray. In addition to providing user-friendly tools, a main goal is to make published analyses reproducible. While most of the examples discussed in this paper deal with nuclear-genome diversity in non-human vertebrates, we also illustrate the application of the tools to fungal genomes, human biomedical data, and mitochondrial sequences. Conclusions This project illustrates that a small group can design, implement, test, document, and distribute a Galaxy tool collection to meet the needs of a particular community of biologists. PMID:24377391

  3. Studies on dynamic motion compensation and positioning accuracy on star tracker.

    PubMed

    Jun, Zhang; Yuncai, Hao; Li, Wang; Da, Liu

    2015-10-01

    Error from motion is the dominant restriction on the improvement of dynamic performance on a star tracker. As a remarkable motion error, the degree of nonuniformity of the star image velocity field on the detector is studied, and thus a general model for the moving star spot is built. To minimize velocity nonuniformity, a novel general method is proposed to derive the proper motion compensation and location accuracy in cases of both uniform velocity and acceleration. Using this method, a theoretic analysis on the accuracy of time-delayed integration and similar techniques, which are thought of as state-of-the-art approaches to reduce error from motion, is conducted. The simulations and experimental results validate the proposed method. Our method shows a more steady performance than the dynamic binning algorithm. The positional error could be neglected when the smear length is far less than 3.464 times the scale of the star spot, which suggests accuracy can be maintained by changing the frame-integration time inversely proportional to the velocity on the focal plane. It also shows that the acceleration effect must be compensated to achieve accuracy close to the Cramér-Rao lower bound. PMID:26479618
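    The quoted rule, keeping the smear length (velocity times integration time) below roughly 3.464, i.e. 2*sqrt(3), times the spot scale, can be sketched as follows; the units and numbers are illustrative, not from the paper:

```python
def max_integration_time(spot_scale, velocity, smear_factor=3.464):
    """Longest frame-integration time that keeps the smear length
    (velocity * t) below smear_factor * spot_scale.
    Units (e.g. pixels and pixels/s) are illustrative."""
    return smear_factor * spot_scale / velocity

# Doubling the focal-plane velocity halves the admissible integration time
t_slow = max_integration_time(spot_scale=2.0, velocity=10.0)
t_fast = max_integration_time(spot_scale=2.0, velocity=20.0)
print(round(t_slow, 4), round(t_fast, 4))  # → 0.6928 0.3464
```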

  4. Conventional Analyses of Data from Dietary Validation Studies May Misestimate Reporting Accuracy: Illustration from a Study of the Effect of Interview Modality on Children’s Reporting Accuracy

    PubMed Central

    Smith, Albert F.; Baxter, Suzanne Domel; Hardin, James W.; Nichols, Michele D.

    2008-01-01

    Objective To compare two approaches to analyzing energy- and nutrient-converted data from dietary validation (and relative validation) studies—conventional analyses, in which the accuracy of reported items is not ascertained, and reporting-error sensitive analyses, in which reported items are classified as matches (items actually eaten) or intrusions (items not actually eaten), and reported amounts are classified as corresponding or overreported. Design Subjects were observed eating school breakfast and lunch, and interviewed that evening about that day’s intake. For conventional analyses, reference and reported information were converted to energy and macronutrients; then t-tests, correlation coefficients, and report rates (reported/reference) were calculated. For reporting-error sensitive analyses, reported items were classified as matches or intrusions, reported amounts were classified as corresponding or overreported, and correspondence rates (corresponding amount/reference amount) and inflation ratios (overreported amount/reference amount) were calculated. Subjects Sixty-nine fourth-grade children (35 girls) from 10 elementary schools in Georgia (US). Results For energy and each macronutrient, conventional analyses found that reported amounts were significantly less than reference amounts (ps < .021; paired t-tests); correlations between reported and reference amounts exceeded 0.52 (ps < .001); and median report rates ranged from 76% to 95%. Analyses sensitive to reporting errors found median correspondence rates between 67% and 79%, and that median inflation ratios, which ranged from 7% to 17%, differed significantly from 0 (ps < .0001; sign tests). Conclusions Conventional analyses of energy and nutrient data from dietary-reporting validation (and relative validation) studies may overestimate accuracy and mask the complexity of dietary reporting error. PMID:17381899
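    The two analyses contrasted above can be sketched on toy data. The per-item tuple structure and the kcal values are assumptions for illustration; the metric definitions (report rate, correspondence rate, inflation ratio) follow the ones stated in the abstract:

```python
def reporting_metrics(items):
    """items: list of (reference_kcal, reported_kcal, matched) per food item.
    matched=True marks a match (item actually eaten); False marks an intrusion
    (reference_kcal is then 0). This data structure is a hypothetical
    illustration, not the study's format."""
    ref_total = sum(ref for ref, _, _ in items if ref > 0)
    reported_total = sum(rep for _, rep, _ in items)
    # Conventional analysis: report rate ignores whether items were truly eaten
    report_rate = reported_total / ref_total
    # Error-sensitive analysis: split reports into corresponding vs. overreported
    corresponding = sum(min(ref, rep) for ref, rep, m in items if m)
    overreported = (sum(max(0.0, rep - ref) for ref, rep, m in items if m)
                    + sum(rep for _, rep, m in items if not m))
    return report_rate, corresponding / ref_total, overreported / ref_total

# Two matches (one overreported), one intrusion, one omitted reference item
meals = [(300, 250, True), (150, 200, True), (0, 120, False), (100, 0, True)]
rr, corr_rate, infl = reporting_metrics(meals)
print(round(rr, 2), round(corr_rate, 2), round(infl, 2))  # → 1.04 0.73 0.31
```

    Note how the conventional report rate (1.04) looks near-perfect even though a quarter of the reported energy is either overreported or intruded, which is exactly the masking effect the study warns about.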

  5. Accuracy of bite mark analysis from food substances: A comparative study

    PubMed Central

    Daniel, M. Jonathan; Pazhani, Ambiga

    2015-01-01

    Aims and Objectives: The aims and objectives of the study were to compare the accuracy of bite mark analysis from three different food substances-apple, cheese and chocolate using two techniques-the manual docking procedure and computer assisted overlay generation technique and to compare the accuracy of the two techniques for bite mark analysis on food substances. Materials and Methods: The individuals who participated in the study were made to bite on three food substances-apple, cheese, and chocolate. Dentate individuals were included in the study. Edentulous individuals and individuals having a missing anterior tooth were excluded from the study. The dental casts of the individual were applied to the positive cast of the bitten food substance to determine docking or matching. Then, computer generated overlays were compared with bite mark pattern on the foodstuff. Results: The results were tabulated and the comparison of bite mark analysis on the three different food substances was analyzed by the Kruskal-Wallis ANOVA test and the comparison of the two techniques was analyzed by Spearman's rho correlation coefficient. Conclusion: On comparing the bite marks analysis from the three food substances-apple, cheese and chocolate, the accuracy was found to be greater for chocolate and cheese than apple. PMID:26816463

  6. Does the Reporting Quality of Diagnostic Test Accuracy Studies, as Defined by STARD 2015, Affect Citation?

    PubMed Central

    Choi, Young Jun; Chung, Mi Sun; Koo, Hyun Jung; Park, Ji Eun; Yoon, Hee Mang

    2016-01-01

    Objective To determine the rate at which diagnostic test accuracy studies that are published in a general radiology journal adhere to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) 2015, and to explore the relationship between adherence rate and citation rate while avoiding confounding by journal factors. Materials and Methods All eligible diagnostic test accuracy studies that were published in the Korean Journal of Radiology in 2011–2015 were identified. Five reviewers assessed each article for yes/no compliance with 27 of the 30 STARD 2015 checklist items (items 28, 29, and 30 were excluded). The total STARD score (number of fulfilled STARD items) was calculated. The score of the 15 STARD items that related directly to the Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 was also calculated. The number of times each article was cited (as indicated by the Web of Science) after publication until March 2016 and the article exposure time (time in months between publication and March 2016) were extracted. Results Sixty-three articles were analyzed. The mean (range) total and QUADAS-2-related STARD scores were 20.0 (14.5–25) and 11.4 (7–15), respectively. The mean citation number was 4 (0–21). Citation number did not associate significantly with either STARD score after accounting for exposure time (total score: correlation coefficient = 0.154, p = 0.232; QUADAS-2-related score: correlation coefficient = 0.143, p = 0.266). Conclusion The degree of adherence to STARD 2015 was moderate for this journal, indicating that there is room for improvement. When adjusted for exposure time, the degree of adherence did not affect the citation rate. PMID:27587959
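    Accounting for exposure time, as in the correlations reported above, amounts to a first-order partial correlation. A minimal sketch on hypothetical scores (not the journal's data):

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def partial_correlation(x, y, z):
    """Correlation of x and y after controlling for z
    (e.g. STARD score vs. citations, controlling for exposure time)."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical STARD totals, citation counts, and exposure times (months)
stard = [18, 20, 22, 24, 19, 21]
cites = [2, 3, 5, 8, 1, 4]
months = [10, 20, 30, 50, 5, 25]
print(round(partial_correlation(stard, cites, months), 3))
```

    When exposure time drives both citations and (via publication date) checklist adherence, the partial correlation can be much weaker than the raw one, which is the confounding the authors adjust for.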

  7. Accuracy of Electronic Health Record Data for Identifying Stroke Cases in Large-Scale Epidemiological Studies: A Systematic Review from the UK Biobank Stroke Outcomes Group

    PubMed Central

    Woodfield, Rebecca; Grant, Ian; Sudlow, Cathie L. M.

    2015-01-01

    Objective Long-term follow-up of population-based prospective studies is often achieved through linkages to coded regional or national health care data. Our knowledge of the accuracy of such data is incomplete. To inform methods for identifying stroke cases in UK Biobank (a prospective study of 503,000 UK adults recruited in middle-age), we systematically evaluated the accuracy of these data for stroke and its main pathological types (ischaemic stroke, intracerebral haemorrhage, subarachnoid haemorrhage), determining the optimum codes for case identification. Methods We sought studies published from 1990-November 2013, which compared coded data from death certificates, hospital admissions or primary care with a reference standard for stroke or its pathological types. We extracted information on a range of study characteristics and assessed study quality with the Quality Assessment of Diagnostic Studies tool (QUADAS-2). To assess accuracy, we extracted data on positive predictive values (PPV) and—where available—on sensitivity, specificity, and negative predictive values (NPV). Results 37 of 39 eligible studies assessed accuracy of International Classification of Diseases (ICD)-coded hospital or death certificate data. They varied widely in their settings, methods, reporting, quality, and in the choice and accuracy of codes. Although PPVs for stroke and its pathological types ranged from 6–97%, appropriately selected, stroke-specific codes (rather than broad cerebrovascular codes) consistently produced PPVs >70%, and in several studies >90%. The few studies with data on sensitivity, specificity and NPV showed higher sensitivity of hospital versus death certificate data for stroke, with specificity and NPV consistently >96%. Few studies assessed either primary care data or combinations of data sources. Conclusions Particular stroke-specific codes can yield high PPVs (>90%) for stroke/stroke types. Inclusion of primary care data and combining data sources should
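    The accuracy measures extracted in the review (PPV, NPV, sensitivity, specificity) follow directly from a 2 × 2 cross-classification of coded data against the reference standard. A minimal sketch; the counts below are hypothetical, not from any study in the review:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """PPV, NPV, sensitivity and specificity from a 2x2 cross-classification
    of coded data (test) against the reference standard."""
    return {
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical counts: 90 true strokes coded as stroke, 10 false positives,
# 30 strokes missed by the codes, 870 correctly coded non-strokes
m = diagnostic_accuracy(tp=90, fp=10, fn=30, tn=870)
print({k: round(v, 3) for k, v in m.items()})
```

    Note that PPV depends on how common stroke is in the coded population, which is one reason the review finds such a wide PPV range (6–97%) across settings.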

  8. Comparative study of dimensional accuracy of different impression techniques using addition silicone impression material.

    PubMed

    Penaflor, C F; Semacio, R C; De Las Alas, L T; Uy, H G

    1998-01-01

    This study compared the dimensional accuracy of the single, double with spacer, double with cut-out and double mix impression techniques using addition silicone impression material. A typhodont containing an Ivorine teeth model with six (6) full-crown tooth preparations was used as the positive control. Two stone replication models for each impression technique were made as test materials. Accuracy of the techniques was assessed by measuring four dimensions on the stone dies poured from the impressions of the Ivorine teeth model. Results indicated that most of the measurements for height, width and diameter slightly decreased, and a few increased, compared with the Ivorine teeth model. The double with cut-out and double mix techniques showed the least difference from the master model compared with the other two impression techniques. PMID:10202524

  9. An Initial Study of Airport Arrival Capacity Benefits Due to Improved Scheduling Accuracy

    NASA Technical Reports Server (NTRS)

    Meyn, Larry; Erzberger, Heinz

    2005-01-01

    The long-term growth rate in air-traffic demand leads to future air-traffic densities that are unmanageable by today's air-traffic control system. In order to accommodate such growth, new technology and operational methods will be needed in the next generation air-traffic control system. One proposal for such a system is the Automated Airspace Concept (AAC). One of the precepts of AAC is to direct aircraft using trajectories that are sent via an air-ground data link. This greatly improves the accuracy in directing aircraft to specific waypoints at specific times. Studies of the Center-TRACON Automation System (CTAS) have shown that increased scheduling accuracy enables increased arrival capacity at CTAS-equipped airports.

  10. SYRCLE’s risk of bias tool for animal studies

    PubMed Central

    2014-01-01

    Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCT) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063

  11. Accuracy Assessment Study of UNB3m Neutral Atmosphere Model for Global Tropospheric Delay Mitigation

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf

    2015-12-01

    Tropospheric delay is the second major source of error, after ionospheric delay, for satellite navigation systems. The troposphere can delay the transmitted signal by over 2 m at zenith and by 20 m at satellite elevation angles of 10 degrees and below. Positioning errors of 10 m or greater can result from inaccurate mitigation of the tropospheric delay. Many mitigation techniques are available, comprising surface meteorological models and global empirical models; surface meteorological models require surface meteorological data to achieve high-accuracy mitigation, whereas global empirical models do not. Several hybrid neutral atmosphere delay models have been developed by researchers at the University of New Brunswick (UNB), Canada, over the past decade or so. The most widely applicable current version is UNB3m, which uses the Saastamoinen zenith delays, Niell mapping functions, and a look-up table of annual means and amplitudes for temperature, pressure, and water vapour pressure, varying with latitude and height. This paper presents an assessment of the behaviour of the UNB3m model against highly accurate IGS tropospheric estimates for three IGS stations at different latitudes and heights. The study was performed over four nonconsecutive weeks in different seasons of one year (October 2014 to July 2015). It can be concluded that the UNB3m model gives an average tropospheric delay correction accuracy of 0.050 m for low-latitude regions in all seasons. The model's accuracy is about 0.075 m for medium-latitude regions, while its highest accuracy is about 0.014 m for high-latitude regions.
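    The Saastamoinen zenith hydrostatic delay that UNB3m builds on is a one-line formula in surface pressure, latitude, and height. A sketch of that component alone (this is not the full UNB3m model, which adds the wet delay, Niell mapping functions, and the latitude-dependent look-up table):

```python
import math

def saastamoinen_zhd(pressure_hpa, lat_rad, height_m):
    """Saastamoinen zenith hydrostatic delay in metres.

    pressure_hpa : surface total pressure in hPa (mbar)
    lat_rad      : geodetic latitude in radians
    height_m     : station orthometric height in metres
    The denominator corrects for gravity variation with latitude and height.
    """
    denom = 1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.00028e-3 * height_m
    return 0.0022768 * pressure_hpa / denom

# Standard sea-level pressure at 45 degrees latitude gives ~2.3 m at zenith,
# consistent with the "over 2 m" figure quoted in the abstract:
print(round(saastamoinen_zhd(1013.25, math.radians(45.0), 0.0), 3))  # → 2.307
```

At low elevation angles this zenith value is multiplied by a mapping function that grows roughly like 1/sin(elevation), which is how the delay reaches the 20 m scale at 10 degrees and below.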

  12. A reference dataset for deformable image registration spatial accuracy evaluation using the COPDgene study archive

    NASA Astrophysics Data System (ADS)

    Castillo, Richard; Castillo, Edward; Fuentes, David; Ahmad, Moiz; Wood, Abbie M.; Ludwig, Michelle S.; Guerrero, Thomas

    2013-05-01

    Landmark point-pairs provide a strategy to assess deformable image registration (DIR) accuracy in terms of the spatial registration of the underlying anatomy depicted in medical images. In this study, we propose to augment a publicly available database (www.dir-lab.com) of medical images with large sets of manually identified anatomic feature pairs between breath-hold computed tomography (BH-CT) images for DIR spatial accuracy evaluation. Ten BH-CT image pairs were randomly selected from the COPDgene study cases. Each patient had received CT imaging of the entire thorax in the supine position at one-fourth dose normal expiration and maximum effort full dose inspiration. Using dedicated in-house software, an imaging expert manually identified large sets of anatomic feature pairs between images. Estimates of inter- and intra-observer spatial variation in feature localization were determined by repeat measurements of multiple observers over subsets of randomly selected features. 7298 anatomic landmark features were manually paired between the 10 sets of images. The number of feature pairs per case ranged from 447 to 1172. Average 3D Euclidean landmark displacements varied substantially among cases, ranging from 12.29 (SD: 6.39) to 30.90 (SD: 14.05) mm. Repeat registration of uniformly sampled subsets of 150 landmarks for each case yielded estimates of observer localization error, which ranged on average from 0.58 (SD: 0.87) to 1.06 (SD: 2.38) mm per case. The additions to the online web database (www.dir-lab.com) described in this work will broaden the applicability of the reference data, providing a freely available common dataset for targeted critical evaluation of DIR spatial accuracy performance in multiple clinical settings. Estimates of observer variance in feature localization suggest consistent spatial accuracy for all observers across both four-dimensional CT and COPDgene patient cohorts.
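    The headline numbers in this record (average 3D Euclidean landmark displacement per case) reduce to a mean of point-to-point distances over the paired landmark coordinates. An illustrative helper with hypothetical coordinates (this is not the dir-lab software):

```python
import math

def mean_displacement(pairs):
    """Mean 3D Euclidean distance between corresponding landmark pairs,
    e.g. expiration vs. inspiration coordinates in mm."""
    dists = [math.dist(a, b) for a, b in pairs]
    return sum(dists) / len(dists)

# Hypothetical landmark pairs, (x, y, z) in mm:
pairs = [((10.0, 20.0, 30.0), (13.0, 24.0, 30.0)),  # displacement 5 mm
         ((5.0, 5.0, 5.0), (5.0, 5.0, 17.0))]       # displacement 12 mm
print(mean_displacement(pairs))  # → 8.5
```

The same reduction, applied to landmarks after DIR warping rather than before, is what turns such a landmark set into a spatial accuracy score for a registration algorithm.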

  13. The Effect of Study Design Biases on the Diagnostic Accuracy of Magnetic Resonance Imaging to Detect Silicone Breast Implant Ruptures: A Meta-Analysis

    PubMed Central

    Song, Jae W.; Kim, Hyungjin Myra; Bellfi, Lillian T.; Chung, Kevin C.

    2010-01-01

    Background All silicone breast implant recipients are recommended by the US Food and Drug Administration to undergo serial screening to detect implant rupture with magnetic resonance imaging (MRI). We performed a systematic review of the literature to assess the quality of diagnostic accuracy studies utilizing MRI or ultrasound to detect silicone breast implant rupture and conducted a meta-analysis to examine the effect of study design biases on the estimation of MRI diagnostic accuracy measures. Methods Studies investigating the diagnostic accuracy of MRI and ultrasound in evaluating ruptured silicone breast implants were identified using MEDLINE, EMBASE, ISI Web of Science, and Cochrane library databases. Two reviewers independently screened potential studies for inclusion and extracted data. Study design biases were assessed using the QUADAS tool and the STARD checklist. Meta-analyses estimated the influence of biases on diagnostic odds ratios. Results Among 1175 identified articles, 21 met the inclusion criteria. Most studies using MRI (n=10 of 16) and ultrasound (n=10 of 13) examined symptomatic subjects. Meta-analyses revealed that MRI studies evaluating symptomatic subjects had 14-fold higher diagnostic accuracy estimates compared to studies using an asymptomatic sample (RDOR 13.8; 95% CI 1.83–104.6) and 2-fold higher diagnostic accuracy estimates compared to studies using a screening sample (RDOR 1.89; 95% CI 0.05–75.7). Conclusion Many of the published studies utilizing MRI or ultrasound to detect silicone breast implant rupture are flawed with methodological biases. These methodological shortcomings may result in overestimated MRI diagnostic accuracy measures and should be interpreted with caution when applying the data to a screening population. PMID:21364405

  14. A Comparison of Parameter Study Creation and Job Submission Tools

    NASA Technical Reports Server (NTRS)

    DeVivo, Adrian; Yarrow, Maurice; McCann, Karen M.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We consider the differences between the available general-purpose parameter study and job submission tools. These tools necessarily share many features, though frequently with differences in the way they are designed and implemented. For this class of features, we will only briefly outline the essential differences. However, we will focus on the unique features which distinguish the ILab parameter study and job submission tool from other packages, and which make the ILab tool easier and more suitable for use in our research and engineering environment.

  15. High accuracy differential pressure measurements using fluid-filled catheters - A feasibility study in compliant tubes.

    PubMed

    Rotman, Oren Moshe; Weiss, Dar; Zaretsky, Uri; Shitzer, Avraham; Einav, Shmuel

    2015-09-18

    High accuracy differential pressure measurements are required in various biomedical and medical applications, such as in fluid-dynamic test systems or in the cath-lab. Differential pressure measurements using fluid-filled catheters are relatively inexpensive, yet may be subject to common mode pressure errors (CMP), which can significantly reduce the measurement accuracy. Recently, a novel correction method for high accuracy differential pressure measurements was presented, and was shown to effectively remove CMP distortions from measurements acquired in rigid tubes. The purpose of the present study was to test the feasibility of this correction method inside compliant tubes, which effectively simulate arteries. Two tubes with varying compliance were tested under dynamic flow and pressure conditions to cover the physiological range of radial distensibility in coronary arteries. A third, compliant model with a 70% stenosis severity was additionally tested. Differential pressure measurements were acquired over a 3 cm tube length using a fluid-filled double-lumen catheter, and were corrected using the proposed CMP correction method. Validation of the corrected differential pressure signals was performed by comparison to differential pressure recordings taken via a direct connection to the compliant tubes, and by comparison to predicted differential pressure readings of matching fluid-structure interaction (FSI) computational simulations. The results show excellent agreement between the experimentally acquired and computationally determined differential pressure signals. This validates the application of the CMP correction method in compliant tubes of the physiological range, up to an intermediate stenosis severity of 70%. PMID:26087881

  16. Dimensional Accuracy of Hydrophilic and Hydrophobic VPS Impression Materials Using Different Impression Techniques - An Invitro Study

    PubMed Central

    Pilla, Ajai; Pathipaka, Suman

    2016-01-01

    Introduction The dimensional stability of the impression material can influence the accuracy of the final restoration. Vinyl Polysiloxane impression materials (VPS) are most frequently used as the impression material in fixed prosthodontics. As VPS is hydrophobic when poured with gypsum products, manufacturers added intrinsic surfactants and marketed the result as hydrophilic VPS. These hydrophilic VPS have shown increased wettability with gypsum slurries. VPS are available in viscosities ranging from very low to very high for use with different impression techniques. Aim To compare the dimensional accuracy of hydrophilic VPS and hydrophobic VPS using monophase, one-step and two-step putty wash impression techniques. Materials and Methods To test the dimensional accuracy of the impression materials, a stainless steel die was fabricated as prescribed by ADA specification no. 19 for elastomeric impression materials. A total of 60 impressions were made. The materials were divided into two groups, Group 1 hydrophilic VPS (Aquasil) and Group 2 hydrophobic VPS (Variotime). These were further divided into three subgroups A, B, C for the monophase, one-step and two-step putty wash techniques, with 10 samples in each subgroup. The dimensional accuracy of the impressions was evaluated after 24 hours using a vertical profile projector with a lens magnification range of 20X-125X with illumination. The study was analysed through one-way ANOVA, post-hoc Tukey HSD test and unpaired t-test for mean comparison between groups. Results Results showed that the three different impression techniques (monophase, one-step, two-step putty wash) did cause significant changes in dimensional accuracy between hydrophilic VPS and hydrophobic VPS impression materials. One-way ANOVA showed that the mean dimensional change and SD for hydrophilic VPS varied between 0.16% and 0.56%, which was low, suggesting hydrophilic VPS was satisfactory with all three impression techniques. 
However, mean

  17. DIAGNOSTIC TOOL DEVELOPMENT AND APPLICATION THROUGH REGIONAL CASE STUDIES

    EPA Science Inventory

    Case studies are a useful vehicle for developing and testing conceptual models, classification systems, diagnostic tools and models, and stressor-response relationships. Furthermore, case studies focused on specific places or issues of interest to the Agency provide an excellent ...

  18. Status of the VOTech Design Study about User Tools

    NASA Astrophysics Data System (ADS)

    Dolensky, M.; Pierfederici, F.; Allen, M.; Boch, T.; Bonnarel, F.; Derrière, S.; Fernique, P.; Noddle, K.; Smareglia, R.

    2006-07-01

    The VOTech design study on future tools started in spring 2005. This project, co-funded by the EC, produces design documents and software prototypes for new VO-compliant end-user tools. It is based on the experience and feedback of precursor projects and on input from the scientific user community. This status report details a number of early deliverables available from the project pages wiki.eurovotech.org, section DS4. This includes a summary of existing tools, desired future tools as derived from the AVO SRM, requirements for a cross matcher, a simple method for transferring instrumental footprints, use cases for simulations and the evaluation of various technologies.

  19. A unification of models for meta-analysis of diagnostic accuracy studies without a gold standard.

    PubMed

    Liu, Yulun; Chen, Yong; Chu, Haitao

    2015-06-01

    Several statistical methods for meta-analysis of diagnostic accuracy studies have been discussed in the presence of a gold standard. However, in practice, the selected reference test may be imperfect due to measurement error, non-existence, invasive nature, or the expensive cost of a gold standard. It has been suggested that treating an imperfect reference test as a gold standard can lead to substantial bias in the estimation of diagnostic test accuracy. Recently, two models have been proposed to account for an imperfect reference test, namely, a multivariate generalized linear mixed model (MGLMM) and a hierarchical summary receiver operating characteristic (HSROC) model. Both models are very flexible in accounting for heterogeneity in the accuracies of tests across studies as well as the dependence between tests. In this article, we show that these two models, although differently formulated, are closely related and are equivalent in the absence of study-level covariates. Furthermore, we provide the exact relations between the parameters of these two models and the assumptions under which the two models can be reduced to equivalent submodels. On the other hand, we show that some submodels of the MGLMM do not have corresponding equivalent submodels of the HSROC model, and vice versa. With three real examples, we illustrate the cases when fitting the MGLMM and HSROC models leads to equivalent submodels and hence identical inference, and the cases when the inferences from the two models are slightly different. Our results generalize the important relations between the bivariate generalized linear mixed model and the HSROC model when the reference test is a gold standard. PMID:25358907

  20. Diagnostic accuracy of the Eurotest for dementia: a naturalistic, multicenter phase II study

    PubMed Central

    Carnero-Pardo, Cristobal; Gurpegui, Manuel; Sanchez-Cantalejo, Emilio; Frank, Ana; Mola, Santiago; Barquero, M Sagrario; Montoro-Rios, M Teresa

    2006-01-01

    Background Available screening tests for dementia are of limited usefulness because they are influenced by the patient's culture and educational level. The Eurotest, an instrument based on the knowledge and handling of money, was designed to overcome these limitations. The objective of this study was to evaluate the diagnostic accuracy of the Eurotest in identifying dementia in customary clinical practice. Methods A cross-sectional, multi-center, naturalistic phase II study was conducted. The Eurotest was administered to consecutive patients, older than 60 years, in general neurology clinics. The patients' condition was classified as dementia or no dementia according to DSM-IV diagnostic criteria. We calculated sensitivity (Sn), specificity (Sp) and area under the ROC curves (aROC) with 95% confidence intervals. The influence of social and educational factors on scores was evaluated with multiple linear regression analysis, and the influence of these factors on diagnostic accuracy was evaluated with logistic regression. Results Sixteen neurologists recruited a total of 516 participants: 101 with dementia, 380 without dementia, and 35 who were excluded. Of the 481 participants who took the Eurotest, 38.7% were totally or functionally illiterate and 45.5% had received no formal education. Mean time needed to administer the test was 8.2 ± 2.0 minutes. The best cut-off point was 20/21, with Sn = 0.91 (0.84–0.96), Sp = 0.82 (0.77–0.85), and aROC = 0.93 (0.91–0.95). Neither the scores on the Eurotest nor its diagnostic accuracy were influenced by social or educational factors. Conclusion This naturalistic and pragmatic study shows that the Eurotest is a rapid, simple and useful screening instrument, which is free from educational influences, and has appropriate internal and external validity. PMID:16606455
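    A 20/21 cut-off point means scores at or below 20 are read as positive for dementia; sensitivity and specificity then follow directly from the two score distributions. A sketch with hypothetical scores (not the study's data):

```python
def sn_sp_at_cutoff(scores_dementia, scores_no_dementia, cutoff):
    """Sensitivity and specificity for a screening test where scores at or
    below `cutoff` are read as positive (a 'cutoff/cutoff+1' cut-off point)."""
    tp = sum(s <= cutoff for s in scores_dementia)      # true positives
    tn = sum(s > cutoff for s in scores_no_dementia)    # true negatives
    return tp / len(scores_dementia), tn / len(scores_no_dementia)

# Hypothetical Eurotest-style scores for the two diagnostic groups:
sn, sp = sn_sp_at_cutoff([12, 18, 20, 22], [19, 25, 28, 30, 33], cutoff=20)
print(sn, sp)  # → 0.75 0.8
```

Sweeping `cutoff` over the score range and plotting sensitivity against 1 − specificity traces the ROC curve whose area (aROC) the study reports.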

  1. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses the ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station made it possible to consolidate the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. 
We report both on updated

  2. Diagnostic Accuracy of Noncontrast CT in Detecting Acute Appendicitis: A Meta-analysis of Prospective Studies.

    PubMed

    Xiong, Bing; Zhong, Baishu; Li, Zhenwei; Zhou, Feng; Hu, Ruying; Feng, Zhan; Xu, Shunliang; Chen, Feng

    2015-06-01

    The aim of the study is to evaluate the diagnostic accuracy of noncontrast CT in detecting acute appendicitis. Prospective studies in which noncontrast CT was performed to evaluate acute appendicitis were found on PubMed, EMBASE, and Cochrane Library. Pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were assessed. The summary receiver-operating characteristic curve was constructed and the area under the curve was calculated. Seven original studies investigating a total of 845 patients were included in this meta-analysis. The pooled sensitivity and specificity were 0.90 (95% CI: 0.86-0.92) and 0.94 (95% CI: 0.92-0.97), respectively. The pooled positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were 12.90 (95% CI: 4.80-34.67), 0.09 (95% CI: 0.04-0.20), and 162.76 (95% CI: 31.05-853.26), respectively. The summary receiver-operating characteristic curve was symmetrical and the area under the curve was 0.97 (95% CI: 0.95-0.99). In conclusion, noncontrast CT has high diagnostic accuracy in detecting acute appendicitis, which is adequate for clinical decision making. PMID:26031278
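    The likelihood ratios and diagnostic odds ratio reported here are simple functions of sensitivity and specificity. A sketch of those definitions (note the abstract's pooled LRs and DOR are pooled study-by-study across the seven studies, so they differ somewhat from the values implied by plugging in the pooled sensitivity and specificity alone):

```python
def likelihood_ratios(sens, spec):
    """Positive/negative likelihood ratios and the diagnostic odds ratio
    implied by a given sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)         # how much a positive result raises the odds
    lr_neg = (1.0 - sens) / spec         # how much a negative result lowers the odds
    dor = lr_pos / lr_neg                # overall discriminative power
    return lr_pos, lr_neg, dor

# Using the pooled point estimates from the abstract:
lr_pos, lr_neg, dor = likelihood_ratios(0.90, 0.94)
print(lr_pos, lr_neg, dor)  # LR+ ≈ 15, LR- ≈ 0.106, DOR ≈ 141
```

An LR+ above 10 and an LR- below 0.1 are the conventional thresholds for a test that can meaningfully rule disease in and out, which is what supports the abstract's "adequate for clinical decision making" conclusion.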

  3. Tools for Teaching Climate Change Studies

    SciTech Connect

    Maestas, A.M.; Jones, L.A.

    2005-03-18

    The Atmospheric Radiation Measurement Climate Research Facility (ACRF) develops public outreach materials and educational resources for schools. Studies show that science education in rural and indigenous communities improves when educators integrate regional knowledge of climate and environmental issues into school curricula and public outreach materials. In order to promote understanding of ACRF climate change studies, ACRF Education and Outreach has developed interactive kiosks about climate change for host communities close to the research sites. A kiosk for the North Slope of Alaska (NSA) community was installed at the Iñupiat Heritage Center in 2003, and a kiosk for the Tropical Western Pacific locales will be installed in 2005. The kiosks feature interviews with local community elders, regional agency officials, and Atmospheric Radiation Measurement (ARM) Program scientists, which highlight both research and local observations of some aspects of environmental and climatic change in the Arctic and Pacific. The kiosks offer viewers a unique opportunity to learn about the environmental concerns and knowledge of respected community elders, and to also understand state-of-the-art climate research. An archive of interviews from the communities will also be distributed with supplemental lessons and activities to encourage teachers and students to compare and contrast climate change studies and oral history observations from two distinct locations. The U.S. Department of Energy's ACRF supports education and outreach efforts for communities and schools located near its sites. ACRF Education and Outreach has developed interactive kiosks at the request of the communities to provide an opportunity for the public to learn about climate change from both scientific and indigenous perspectives. Kiosks include interviews with ARM scientists and provide users with basic information about climate change studies as well as interviews with elders and community leaders.

  4. A PILOT STUDY OF THE ACCURACY OF CO2 SENSORS IN COMMERCIAL BUILDINGS

    SciTech Connect

    Fisk, William; Fisk, William J.; Faulkner, David; Sullivan, Douglas P.

    2007-09-01

    Carbon dioxide (CO2) sensors are often deployed in commercial buildings to obtain CO2 data that are used to automatically modulate rates of outdoor air supply. The goal is to keep ventilation rates at or above design requirements and to save energy by avoiding ventilation rates exceeding design requirements. However, there have been many anecdotal reports of poor CO2 sensor performance in actual commercial building applications. This study evaluated the accuracy of 44 CO2 sensors located in nine commercial buildings to determine if CO2 sensor performance, in practice, is generally acceptable or problematic. CO2 measurement errors varied widely and were sometimes hundreds of parts per million. Despite its small size, this study provides a strong indication that the accuracy of CO2 sensors, as they are applied and maintained in commercial buildings, is frequently less than needed to measure typical values of maximum one-hour-average indoor-outdoor CO2 concentration differences with less than a 20 percent error. Thus, we conclude that there is a need for more accurate CO2 sensors and/or better sensor maintenance or calibration procedures.

  5. Sex differences in accuracy and precision when judging time to arrival: data from two Internet studies.

    PubMed

    Sanders, Geoff; Sinclair, Kamila

    2011-12-01

    We report two Internet studies that investigated sex differences in the accuracy and precision of judging time to arrival. We used accuracy to mean the ability to match the actual time to arrival and precision to mean the consistency with which each participant made their judgments. Our task was presented as a computer game in which a toy UFO moved obliquely towards the participant through a virtual three-dimensional space en route to a docking station. The UFO disappeared before docking and participants pressed their space bar at the precise moment they thought the UFO would have docked. Study 1 showed it was possible to conduct quantitative studies of spatiotemporal judgments in virtual reality via the Internet and confirmed reports that men are more accurate because women underestimate, but found no difference in precision measured as intra-participant variation. Study 2 repeated Study 1 with five additional presentations of one condition to provide a better measure of precision. Again, men were more accurate than women but there were no sex differences in precision. However, within the coincidence-anticipation timing (CAT) literature, of those studies that report sex differences, a majority found that males are both more accurate and more precise than females. Noting that many CAT studies report no sex differences, we discuss appropriate interpretations of such null findings. While acknowledging that CAT performance may be influenced by experience, we suggest that the sex difference may have originated among our ancestors with the evolutionary selection of men for hunting and women for gathering. PMID:21125324

  6. Case studies on forecasting for innovative technologies: frequent revisions improve accuracy.

    PubMed

    Lerner, Jeffrey C; Robertson, Diane C; Goldstein, Sara M

    2015-02-01

    Health technology forecasting is designed to provide reliable predictions about costs, utilization, diffusion, and other market realities before the technologies enter routine clinical use. In this article we address three questions central to forecasting's usefulness: Are early forecasts sufficiently accurate to help providers acquire the most promising technology and payers to set effective coverage policies? What variables contribute to inaccurate forecasts? How can forecasters manage the variables to improve accuracy? We analyzed forecasts published between 2007 and 2010 by the ECRI Institute on four technologies: single-room proton beam radiation therapy for various cancers; digital breast tomosynthesis imaging technology for breast cancer screening; transcatheter aortic valve replacement for serious heart valve disease; and minimally invasive robot-assisted surgery for various cancers. We then examined revised ECRI forecasts published in 2013 (digital breast tomosynthesis) and 2014 (the other three topics) to identify inaccuracies in the earlier forecasts and explore why they occurred. We found that five of twenty early predictions were inaccurate when compared with the updated forecasts. The inaccuracies pertained to two technologies that had more time-sensitive variables to consider. The case studies suggest that frequent revision of forecasts could improve accuracy, especially for complex technologies whose eventual use is governed by multiple interactive factors. PMID:25646112

  7. Rapid Diagnostic Algorithms as a Screening Tool for Tuberculosis: An Assessor Blinded Cross-Sectional Study

    PubMed Central

    Ratzinger, Franz; Bruckschwaiger, Harald; Wischenbart, Martin; Parschalk, Bernhard; Fernandez-Reyes, Delmiro; Lagler, Heimo; Indra, Alexandra; Graninger, Wolfgang; Winkler, Stefan; Krishna, Sanjeev; Ramharter, Michael

    2012-01-01

    Background A major obstacle to effectively treat and control tuberculosis is the absence of an accurate, rapid, and low-cost diagnostic tool. A new approach for the screening of patients for tuberculosis is the use of rapid diagnostic classification algorithms. Methods We tested a previously published diagnostic algorithm based on four biomarkers as a screening tool for tuberculosis in a Central European patient population using an assessor-blinded cross-sectional study design. In addition, we developed an improved diagnostic classification algorithm based on a study population at a tertiary hospital in Vienna, Austria, by supervised computational statistics. Results The diagnostic accuracy of the previously published diagnostic algorithm for our patient population consisting of 206 patients was 54% (CI: 47%–61%). An improved model was constructed using inflammation parameters and clinical information. A diagnostic accuracy of 86% (CI: 80%–90%) was demonstrated by 10-fold cross validation. An alternative model relying solely on clinical parameters exhibited a diagnostic accuracy of 85% (CI: 79%–89%). Conclusion Here we show that a rapid diagnostic algorithm based on clinical parameters is only slightly improved by inclusion of inflammation markers in our cohort. Our results also emphasize the need for validation of new diagnostic algorithms in different settings and patient populations. PMID:23185397

  8. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant from 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and the impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and negative likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 out of 77 studies (38%). All authors who provided data did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and provide data compared to authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten out of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests and enabled the calculation of an estimate for a third blood test for which previously the data had been insufficient to do so. We did not identify a clear pattern for the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors, who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients yielded additional data, but the impact on summary estimates of diagnostic accuracy was generally small.
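The statistics recalculated in this record follow from standard 2 × 2 table formulas; a minimal sketch in Python (the counts and the function name `diagnostic_metrics` are illustrative, not taken from the review):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sensitivity = tp / (tp + fn)                 # true-positive rate
    specificity = tn / (tn + fp)                 # true-negative rate
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity
    return sensitivity, specificity, lr_positive, lr_negative

# Hypothetical counts for one blood test scored against biopsy
sens, spec, lr_pos, lr_neg = diagnostic_metrics(tp=45, fp=10, fn=5, tn=40)
```

With these counts the test has sensitivity 0.90, specificity 0.80, and likelihood ratios of 4.5 and 0.125, illustrating why a handful of reclassified cells from a contacted author can shift a test between "less useful" and "moderately useful" categories.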

  9. Understanding of accuracy on calculated soil moisture field for the study of land-atmosphere interaction

    NASA Astrophysics Data System (ADS)

    Yorozu, K.; Tanaka, K.; Nakakita, E.; Ikebuchi, S.

    2007-12-01

Understanding the state of soil moisture helps enhance climate predictability on inter-seasonal to annual time scales. Thus, the Global Soil Wetness Project (GSWP) has been implemented as an environmental modeling research activity. The SiBUC (Simple Biosphere including Urban Canopy) land surface model is one of the participants in the 2nd GSWP, and it uses a mosaic approach to incorporate all kinds of land use. In order to estimate the global soil moisture field as accurately as possible and to utilize the products of the GSWP2 simulation more efficiently, SiBUC is run with its irrigation scheme activated. Integration of the one-way uncoupled SiBUC model from 1986 to 1995 produced a global soil moisture field. Both the model and the forcing data may contain uncertainty. However, SiBUC is one of the few models that can consider irrigation effects. In addition, the advantage of the meteorological forcing data provided by GSWP2 is its hybridization of reanalysis products, observation data, and satellite data. In this sense, the GSWP2 product is assumed to be the most accurate global land surface hydrological data set available. Thus, these global products should be applied to land-atmosphere interaction studies, if possible. To do this, it is important to understand the accuracy of the calculated soil moisture field at inter-annual and higher time scales. In this study, the calculated soil moisture field is validated against soil moisture observations in five regions (Illinois: USA, China, India, Mongolia, Russia). The Russian data comprise two sets: one located in a spring wheat region and another in a winter wheat region. These observation data are provided by the Global Soil Moisture Data Bank (GSMDB). To understand the time scale accuracy of the soil moisture field, three correlation coefficients are calculated between calculated and observed soil moisture: inter-annual, inter-seasonal, and monthly mean correlations, respectively. As a result, if the median value in
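The multi-time-scale correlations described above can be sketched as follows; the series, the noise structure, and the helper names are hypothetical, chosen only to show how anomaly-scale noise washes out at the inter-annual scale:

```python
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

def annual_means(monthly, months_per_year=12):
    """Collapse a monthly series into yearly means (inter-annual signal)."""
    return [mean(monthly[i:i + months_per_year])
            for i in range(0, len(monthly), months_per_year)]

# Hypothetical 3-year monthly series: a seasonal cycle plus a weak trend,
# with alternating-sign "observation" noise that cancels within each year.
cycle = [0.30, 0.32, 0.35, 0.38, 0.36, 0.33,
         0.30, 0.28, 0.27, 0.29, 0.31, 0.33]
model = [cycle[i % 12] + 0.01 * (i // 12) for i in range(36)]
obs = [m + 0.01 * (-1) ** i for i, m in enumerate(model)]

r_monthly = pearson(model, obs)                              # monthly scale
r_annual = pearson(annual_means(model), annual_means(obs))   # inter-annual
```

Here `r_annual` is essentially 1 while `r_monthly` is noticeably lower, which is the kind of contrast the study uses to characterize accuracy at different time scales.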

  10. Longitudinal Study: Efficacy of Online Technology Tools for Instructional Use

    NASA Technical Reports Server (NTRS)

    Uenking, Michael D.

    2011-01-01

Studies show that the student population (secondary and post-secondary) is becoming increasingly more technologically savvy. Use of the internet, computers, MP3 players, and other technologies, along with online gaming, has increased tremendously among this population, creating an apparent paradigm shift in the learning modalities of these students. Instructors and facilitators of learning can no longer rely solely on traditional lecture-based lesson formats. In order to achieve student academic success and satisfaction and to increase student retention, instructors must embrace the various technology tools that are available and employ them in their lessons. A longitudinal study (January 2009-June 2010) has been performed that encompasses the use of several technology tools in an instructional setting. The study provides further evidence that students not only like the tools that are being used, but prefer that these tools be used to help supplement and enhance instruction.

  11. Accuracy in Rietveld quantitative phase analysis: a comparative study of strictly monochromatic Mo and Cu radiations

    PubMed Central

    León-Reina, L.; García-Maté, M.; Álvarez-Pinazo, G.; Santacruz, I.; Vallcorba, O.; De la Torre, A. G.; Aranda, M. A. G.

    2016-01-01

    This study reports 78 Rietveld quantitative phase analyses using Cu Kα1, Mo Kα1 and synchrotron radiations. Synchrotron powder diffraction has been used to validate the most challenging analyses. From the results for three series with increasing contents of an analyte (an inorganic crystalline phase, an organic crystalline phase and a glass), it is inferred that Rietveld analyses from high-energy Mo Kα1 radiation have slightly better accuracies than those obtained from Cu Kα1 radiation. This behaviour has been established from the results of the calibration graphics obtained through the spiking method and also from Kullback–Leibler distance statistic studies. This outcome is explained, in spite of the lower diffraction power for Mo radiation when compared to Cu radiation, as arising because of the larger volume tested with Mo and also because higher energy allows one to record patterns with fewer systematic errors. The limit of detection (LoD) and limit of quantification (LoQ) have also been established for the studied series. For similar recording times, the LoDs in Cu patterns, ∼0.2 wt%, are slightly lower than those derived from Mo patterns, ∼0.3 wt%. The LoQ for a well crystallized inorganic phase using laboratory powder diffraction was established to be close to 0.10 wt% in stable fits with good precision. However, the accuracy of these analyses was poor with relative errors near to 100%. Only contents higher than 1.0 wt% yielded analyses with relative errors lower than 20%. PMID:27275132

  12. Diagnostic accuracy of bedside tests for predicting difficult intubation in Indian population: An observational study

    PubMed Central

    Dhanger, Sangeeta; Gupta, Suman Lata; Vinayagam, Stalin; Bidkar, Prasanna Udupi; Elakkumanan, Lenin Babu; Badhe, Ashok Shankar

    2016-01-01

Background: Unanticipated difficult intubation can be challenging for anesthesiologists, and various bedside tests have been tried to predict difficult intubation. Aims: The aim of this study was to determine the incidence of difficult intubation in the Indian population and also to determine the diagnostic accuracy of bedside tests in predicting difficult intubation. Settings and Design: In this study, 200 patients in the age group 18–60 years, of American Society of Anesthesiologists physical status I and II, scheduled for surgery under general anesthesia requiring endotracheal intubation, were enrolled. Patients with upper airway pathology, neck mass, and cervical spine injury were excluded from the study. Materials and Methods: An attending anesthesiologist conducted the preoperative assessment and recorded parameters such as body mass index, modified Mallampati grading, inter-incisor distance, neck circumference, and thyromental distance (from which the NC/TMD ratio was derived). After standard anesthetic induction, laryngoscopy was performed and intubation difficulty was assessed using the intubation difficulty scale on the basis of seven variables. Statistical Analysis: The Chi-square test or Student's t-test was performed as appropriate. A binary multivariate logistic regression (forward-Wald) model was used to determine the independent risk factors. Results: Among the 200 patients, 26 had difficult intubation, an incidence of 13%. Among the different variables, the Mallampati score and NC/TMD were independently associated with difficult intubation. Receiver operating characteristic curve analysis showed a cut-off point of 3 or 4 for the Mallampati score and 5.62 for the NC/TMD to predict difficult intubation. Conclusion: The diagnostic accuracy of the NC/TMD ratio and Mallampati score was better than that of other bedside tests for predicting difficult intubation in the Indian population. PMID:26957691
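ROC cut-off points like the 5.62 quoted above are commonly chosen by maximizing the Youden index (sensitivity + specificity − 1); a minimal sketch under that assumption — the NC/TMD values and labels below are invented, and the abstract does not state which criterion the authors used:

```python
def youden_cutoff(values, labels):
    """Threshold maximizing sensitivity + specificity - 1.

    values: test measurements (higher = more likely difficult airway)
    labels: 1 for difficult intubation, 0 for easy.
    """
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        sens = sum(v >= cut for v in pos) / len(pos)
        spec = sum(v < cut for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Hypothetical NC/TMD ratios: difficult cases cluster at higher values
nc_tmd = [5.1, 5.3, 5.4, 5.5, 5.7, 5.8, 6.0, 6.2]
difficult = [0, 0, 0, 0, 1, 1, 1, 1]
cut, j = youden_cutoff(nc_tmd, difficult)
```

With this toy data the procedure picks 5.7, the smallest value that perfectly separates the two groups (Youden index 1.0).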

  13. A material sensitivity study on the accuracy of deformable organ registration using linear biomechanical models

    SciTech Connect

    Chi, Y.; Liang, J.; Yan, D.

    2006-02-15

Model-based deformable organ registration techniques using the finite element method (FEM) have recently been investigated intensively and applied to image-guided adaptive radiotherapy (IGART). These techniques assume that human organs are linearly elastic material, and their mechanical properties are predetermined. Unfortunately, accurate measurement of tissue material properties is challenging, and the properties usually vary between patients. A common issue is therefore the achievable accuracy of the calculation given the limited access to tissue elastic material constants. In this study, we performed a systematic investigation of this subject based on tissue biomechanics and computer simulations to establish the relationships between achievable registration accuracy and tissue mechanical and organ geometrical properties. Primarily we focused on image registration for three organs: rectal wall, bladder wall, and prostate. The tissue anisotropy due to orientation preference in tissue fiber alignment is captured by using an orthotropic or a transversely isotropic elastic model. First we developed biomechanical models for the rectal wall, bladder wall, and prostate using simplified geometries and investigated the effect of varying material parameters on the resulting organ deformation. Then computer models based on patient image data were constructed, and image registrations were performed. The sensitivity of registration errors was studied by perturbing the tissue material properties from their mean values while fixing the boundary conditions. The simulation results demonstrated that registration error for a subvolume increases as its distance from the boundary increases. Also, a variable associated with material stability was found to be a dominant factor in registration accuracy in the context of material uncertainty. For hollow thin organs such as rectal walls and bladder walls, the registration errors are limited. Given 30% in material uncertainty

  14. Accuracy and Linearity of Positive Airway Pressure Devices: A Technical Bench Testing Study

    PubMed Central

    Torre-Bouscoulet, Luis; López-Escárcega, Elodia; Carrillo-Alduenda, José Luis; Arredondo-del-Bosque, Fernando; Reyes-Zúñiga, Margarita; Castorena-Maldonado, Armando

    2010-01-01

Study Objectives: To analyze the accuracy and linearity of different CPAP devices outside of the manufacturers' own quality control environment. Methods: Accuracy (how well readings agree with the gold standard) and linearity were evaluated by comparing programmed pressure to measured CPAP pressure using an instrument established as the gold standard. Comparisons were made centimeter-by-centimeter (linearity) throughout the entire programming spectrum of each device (from 4 to 20 cm H2O). Results: A total of 108 CPAP devices were tested (1836 measurements); mean use of the devices was 956 hours. Twenty-two of them were new. The intra-class correlation coefficient (ICC) decreased from 0.97 at pressures programmed between 4 and 10 cm H2O, to 0.84 at pressures of 16 to 20 cm H2O. Despite this high ICC, the 95% limits of agreement ranged from −1 to 1 cm H2O. The same behavior was observed in relation to hours of use: the ICC for readings taken on devices with < 2,000 hours of use was 0.99, while that of the 50 measurements made on devices with > 6,000 hours was 0.97 (limits of agreement from −1.3 to 2.5 cm H2O). “Adequate adjustments” were documented in 97% of measurements when the definition was ± 1 cm H2O of the programmed pressure, but this index of adequate adjustment readings decreased to 85% when the ± 0.5 cm H2O criterion was applied. Conclusions: In general, the CPAP devices were accurate and linear throughout the spectrum of programmable pressures; however, strategies to assure short- and long-term equipment reliability are required in conditions of routine use. Citation: Torre-Bouscoulet L; López-Escárcega E; Carrillo-Alduenda JL; Arredondo-del-Bosque F; Reyes-Zúñiga M; Castorena-Maldonado A. Accuracy and linearity of positive airway pressure devices: a technical bench testing study. J Clin Sleep Med 2010;6(4):369-373. PMID:20726286
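The 95% agreement limits quoted above are Bland–Altman-style limits (bias ± 1.96 SD of the programmed-minus-measured differences); a minimal sketch with invented bench readings, not the study's data:

```python
from statistics import mean, stdev

def limits_of_agreement(programmed, measured):
    """Bland-Altman style 95% limits: bias +/- 1.96 SD of the differences."""
    diffs = [m - p for p, m in zip(programmed, measured)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, bias - half_width, bias + half_width

# Hypothetical bench data: programmed vs. manometer-measured pressure (cm H2O)
programmed = [4, 6, 8, 10, 12, 14, 16, 18, 20]
measured = [4.1, 6.0, 7.8, 10.2, 11.9, 14.3, 15.7, 18.4, 19.6]
bias, lower, upper = limits_of_agreement(programmed, measured)
```

A device can show a near-zero bias (here essentially 0 cm H2O) while its limits of agreement still span about ±0.5 cm H2O, which is why the article reports both the ICC and the agreement limits.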

  15. Compensation of kinematic geometric parameters error and comparative study of accuracy testing for robot

    NASA Astrophysics Data System (ADS)

    Du, Liang; Shi, Guangming; Guan, Weibin; Zhong, Yuansheng; Li, Jin

    2014-12-01

Geometric error is the main error source of an industrial robot and is a more significant factor than the other error sources. A compensation model for kinematic geometric error is proposed in this article. Because many methods can be used to test robot accuracy, a way to determine which method is better is needed. In this article, two approaches, a Laser Tracker System (LTS) and a Three Coordinate Measuring instrument (TCM), were used to test robot accuracy according to the relevant standard, and the two methods were compared. Based on the compensation results, the method that more clearly improves robot accuracy was identified.

  16. The Precision and Accuracy of AIRS Level 1B Radiances for Climate Studies

    NASA Technical Reports Server (NTRS)

    Hearty, Thomas J.; Gaiser, Steve; Pagano, Tom; Aumann, Hartmut

    2004-01-01

We investigate uncertainties in the Atmospheric Infrared Sounder (AIRS) radiances based on in-flight and preflight calibration algorithms and observations. The global coverage and spectral resolution (λ/Δλ ≈ 1200) of AIRS enable it to produce a data set that can be used as a climate data record over the lifetime of the instrument. Therefore, we examine the effects of the uncertainties in the calibration and the detector stability on future climate studies. The uncertainties of the parameters that go into the AIRS radiometric calibration are propagated to estimate the accuracy of the radiances and any climate data record created from AIRS measurements. The calculated radiance uncertainties are consistent with observations. Algorithm enhancements may be able to reduce the radiance uncertainties by as much as 7%. We find that the orbital variation of the gain contributes a brightness temperature bias of < 0.01 K.

  17. Speed and accuracy of facial expression classification in avoidant personality disorder: a preliminary study.

    PubMed

    Rosenthal, M Zachary; Kim, Kwanguk; Herr, Nathaniel R; Smoski, Moria J; Cheavens, Jennifer S; Lynch, Thomas R; Kosson, David S

    2011-10-01

    The aim of this preliminary study was to examine whether individuals with avoidant personality disorder (APD) could be characterized by deficits in the classification of dynamically presented facial emotional expressions. Using a community sample of adults with APD (n = 17) and non-APD controls (n = 16), speed and accuracy of facial emotional expression recognition was investigated in a task that morphs facial expressions from neutral to prototypical expressions (Multi-Morph Facial Affect Recognition Task; Blair, Colledge, Murray, & Mitchell, 2001). Results indicated that individuals with APD were significantly more likely than controls to make errors when classifying fully expressed fear. However, no differences were found between groups in the speed to correctly classify facial emotional expressions. The findings are some of the first to investigate facial emotional processing in a sample of individuals with APD and point to an underlying deficit in processing social cues that may be involved in the maintenance of APD. PMID:22448805

  18. Analysis tools for turbulence studies at Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Burns, C.; Shehata, S.; White, A. E.; Cziegler, I.; Dominguez, A.; Terry, J. L.; Pace, D. C.

    2010-11-01

    A new suite of analysis tools written in IDL is being developed to support experimental investigation of turbulence at Alcator C-Mod. The tools include GUIs for spectral analysis (coherence, cross-phase and bicoherence) and characteristic frequency calculations. A user-friendly interface for the GENRAY code, to facilitate in-between shot ray-tracing analysis, is also being developed. The spectral analysis tool is being used to analyze data from existing edge turbulence diagnostics, such as the O-mode correlation reflectometer and Gas Puff Imaging, during I-mode, ITB and EDA H-mode plasmas. GENRAY and the characteristic frequency tool are being used to study diagnostic accessibility limits set by wave propagation and refraction for X-mode Doppler Backscattering and Correlation Electron Cyclotron Emission (CECE) systems that are being planned for core turbulence studies at Alcator C-Mod.

  19. In vivo Study of the Accuracy of Dual-arch Impressions

    PubMed Central

    de Lima, Luciana Martinelli Santayana; Borges, Gilberto Antonio; Junior, Luiz Henrique Burnett; Spohr, Ana Maria

    2014-01-01

Background: This study evaluated in vivo the accuracy of metal (Smart®) and plastic (Triple Tray®) dual-arch trays used with vinyl polysiloxane (Flexitime®), in the putty/wash viscosity, as well as polyether (Impregum Soft®) in the regular viscosity. Materials and Methods: In one patient, an implant-level transfer was screwed on an implant in the mandibular right first molar, serving as a pattern. Ten impressions were made with each tray and impression material. The impressions were poured with Type IV gypsum. The width and height of the pattern and casts were measured in a profile projector (Nikon). The results were submitted to Student’s t-test for one sample (α = 0.05). Results: For the width distance, the plastic dual-arch trays with vinyl polysiloxane (4.513 mm) and with polyether (4.531 mm) were statistically wider than the pattern (4.489 mm). The metal dual-arch tray with vinyl polysiloxane (4.504 mm) and with polyether (4.500 mm) did not differ statistically from the pattern. For the height distance, only the metal dual-arch tray with polyether (2.253 mm) differed statistically from the pattern (2.310 mm). Conclusion: The metal dual-arch tray with vinyl polysiloxane, in the putty/wash viscosities, reproduced casts with less distortion in comparison with the same technique with the plastic dual-arch tray. The plastic or metal dual-arch trays with polyether reproduced casts with greater distortion. How to cite the article: Santayana de Lima LM, Borges GA, Burnett LH Jr, Spohr AM. In vivo study of the accuracy of dual-arch impressions. J Int Oral Health 2014;6(3):50-5. PMID:25083032

  20. EM-navigated catheter placement for gynecologic brachytherapy: an accuracy study

    NASA Astrophysics Data System (ADS)

    Mehrtash, Alireza; Damato, Antonio; Pernelle, Guillaume; Barber, Lauren; Farhat, Nabgha; Viswanathan, Akila; Cormack, Robert; Kapur, Tina

    2014-03-01

Gynecologic malignancies, including cervical, endometrial, ovarian, vaginal and vulvar cancers, cause significant mortality in women worldwide. The standard care for many primary and recurrent gynecologic cancers consists of chemoradiation followed by brachytherapy. In high dose rate (HDR) brachytherapy, intracavitary applicators and/or interstitial needles are placed directly inside the cancerous tissue so as to provide catheters to deliver high doses of radiation. Although technology for the navigation of catheters and needles is well developed for procedures such as prostate biopsy, brain biopsy, and cardiac ablation, it is notably lacking for gynecologic HDR brachytherapy. Using a benchtop study that closely mimics the clinical interstitial gynecologic brachytherapy procedure, we developed a method for evaluating the accuracy of image-guided catheter placement. Future bedside translation of this technology offers the potential benefit of maximizing tumor coverage during catheter placement while avoiding damage to adjacent organs such as the bladder, rectum, and bowel. In the study, two independent experiments were performed on a phantom model to evaluate the targeting accuracy of an electromagnetic (EM) tracking system. The procedure was carried out using a laptop computer (2.1 GHz Intel Core i7, 8 GB RAM, Windows 7 64-bit), an EM Aurora tracking system with a 1.3 mm diameter 6-DOF sensor, and 6F (2 mm) brachytherapy catheters inserted through a Syed-Neblett applicator. The 3D Slicer and PLUS open source software were used to develop the system. The mean targeting error was less than 2.9 mm, which is comparable to the targeting errors of commercial clinical navigation systems.

  1. Physical Activity Level Improves the Predictive Accuracy of Cardiovascular Disease Risk Score: The ATTICA Study (2002–2012)

    PubMed Central

    Georgousopoulou, Ekavi N.; Panagiotakos, Demosthenes B.; Bougatsas, Dimitrios; Chatzigeorgiou, Michael; Kavouras, Stavros A.; Chrysohoou, Christina; Skoumas, Ioannis; Tousoulis, Dimitrios; Stefanadis, Christodoulos; Pitsavos, Christos

    2016-01-01

Background: Although physical activity (PA) has long been associated with cardiovascular disease (CVD), assessment of PA status has never been used as a part of CVD risk prediction tools. The aim of the present work was to examine whether the inclusion of PA status in a CVD risk model improves its predictive accuracy. Methods: Data from the 10-year follow-up (2002–2012) of the n = 2020 participants (aged 18–89 years) of the ATTICA prospective study were used to test the research hypothesis. The HellenicSCORE (which incorporates age, sex, smoking, total cholesterol, and systolic blood pressure levels) was calculated to estimate the baseline 10-year CVD risk; assessment of PA status was based on the International Physical Activity Questionnaire. The estimated CVD risk was tested against the observed 10-year incidence (i.e., development of acute coronary syndromes, stroke, or other CVD according to the World Health Organization [WHO]-International Classification of Diseases [ICD]-10 criteria). Changes in the predictive ability of the nested CVD risk model that contained the HellenicSCORE plus PA assessment were evaluated using Harrell's C and the net reclassification index. Results: Both HellenicSCORE and PA status were predictors of future CVD events (P < 0.05). The classification bias of the model that included only the HellenicSCORE was significantly reduced when PA assessment was added (Harrell's C = 0.012, P = 0.032); this reduction remained significant even when adjusted for diabetes mellitus and dietary habits (P < 0.05). Conclusions: CVD risk scores seem to be more accurate when they incorporate individuals’ PA status and thus may be more effective tools in primary prevention by more efficiently identifying CVD candidates. PMID:27076890
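The net reclassification index mentioned above can, for a single risk threshold, be sketched as follows; the threshold, risk values, and function name are hypothetical, not the ATTICA study's:

```python
def net_reclassification_index(old_risk, new_risk, events, threshold=0.10):
    """Categorical NRI for a single risk threshold (e.g. 10% 10-year risk).

    old_risk, new_risk: predicted probabilities from the two models
    events: 1 if the participant had a CVD event, else 0.
    """
    up_e = down_e = up_ne = down_ne = 0
    n_events = sum(events)
    n_nonevents = len(events) - n_events
    for old, new, y in zip(old_risk, new_risk, events):
        moved_up = old < threshold <= new      # reclassified to high risk
        moved_down = new < threshold <= old    # reclassified to low risk
        if y == 1:
            up_e += moved_up
            down_e += moved_down
        else:
            up_ne += moved_up
            down_ne += moved_down
    # Net proportion correctly moved up (events) plus down (non-events)
    return (up_e - down_e) / n_events + (down_ne - up_ne) / n_nonevents

# Hypothetical risks before and after adding physical-activity status
old = [0.05, 0.12, 0.08, 0.15, 0.04, 0.11]
new = [0.11, 0.14, 0.06, 0.09, 0.03, 0.13]
ev = [1, 1, 0, 0, 0, 1]
nri = net_reclassification_index(old, new, ev)
```

A positive NRI (here 2/3, because one event moves correctly up and one non-event correctly down) indicates that the expanded model reclassifies participants in the right direction on balance.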

  2. Crowdsourcing and the Accuracy of Online Information Regarding Weight Gain in Pregnancy: A Descriptive Study

    PubMed Central

    Verma, Bianca A; Shull, Trevor; Moniz, Michelle H; Kohatsu, Lauren; Plegue, Melissa A; Collins-Thompson, Kevyn

    2016-01-01

    Background Excess weight gain affects nearly half of all pregnancies in the United States and is a strong risk factor for adverse maternal and fetal outcomes, including long-term obesity. The Internet is a prominent source of information during pregnancy; however, the accuracy of this online information is unknown. Objective To identify, characterize, and assess the accuracy of frequently accessed webpages containing information about weight gain during pregnancy. Methods A descriptive study was used to identify and search frequently used phrases related to weight gain during pregnancy on the Google search engine. The first 10 webpages of each query were characterized by type and then assessed for accuracy and completeness, as compared to Institute of Medicine guidelines, using crowdsourcing. Results A total of 114 queries were searched, yielding 305 unique webpages. Of these webpages, 181 (59.3%) included information regarding weight gain during pregnancy. Out of 181 webpages, 62 (34.3%) contained no specific recommendations, 48 (26.5%) contained accurate but incomplete recommendations, 41 (22.7%) contained complete and accurate recommendations, and 22 (12.2%) were inaccurate. Webpages were most commonly from for-profit websites (112/181, 61.9%), followed by government (19/181, 10.5%), medical organizations or associations (13/181, 7.2%), and news sites (12/181, 6.6%). The largest proportion of for-profit sites contained no specific recommendations (44/112, 39.3%). Among pages that provided inaccurate information (22/181, 12.2%), 68% (15/22) were from for-profit sites. Conclusions For-profit websites dominate the online space with regard to weight gain during pregnancy and largely contain incomplete, inaccurate, or no specific recommendations. This represents a significant information gap regarding an important risk factor for obesity among mothers and infants. 
Our findings suggest that greater clinical and public health efforts are needed to disseminate accurate information about weight gain during pregnancy.

  3. Studying Doctoral Education: Using Activity Theory to Shape Methodological Tools

    ERIC Educational Resources Information Center

    Beauchamp, Catherine; Jazvac-Martek, Marian; McAlpine, Lynn

    2009-01-01

    The study reported here, one part of a larger study on doctoral education, describes a pilot study that used Activity Theory to shape a methodological tool for better understanding the tensions inherent in the doctoral experience. As doctoral students may function within a range of activity systems, we designed data collection protocols based on…

  4. Screw Placement Accuracy and Outcomes Following O-Arm-Navigated Atlantoaxial Fusion: A Feasibility Study.

    PubMed

    Smith, Jacob D; Jack, Megan M; Harn, Nicholas R; Bertsch, Judson R; Arnold, Paul M

    2016-06-01

    Study Design Case series of seven patients. Objective C2 stabilization can be challenging due to the complex anatomy of the upper cervical vertebrae. We describe seven cases of C1-C2 fusion using intraoperative navigation to aid in the screw placement at the atlantoaxial (C1-C2) junction. Methods Between 2011 and 2014, seven patients underwent posterior atlantoaxial fusion using intraoperative frameless stereotactic O-arm Surgical Imaging and StealthStation Surgical Navigation System (Medtronic, Inc., Minneapolis, Minnesota, United States). Outcome measures included screw accuracy, neurologic status, radiation dosing, and surgical complications. Results Four patients had fusion at C1-C2 only, and in the remaining three, fixation extended down to C3 due to anatomical considerations for screw placement recognized on intraoperative imaging. Out of 30 screws placed, all demonstrated minimal divergence from desired placement in either C1 lateral mass, C2 pedicle, or C3 lateral mass. No neurovascular compromise was seen following the use of intraoperative guided screw placement. The average radiation dosing due to intraoperative imaging was 39.0 mGy. All patients were followed for a minimum of 12 months. All patients went on to solid fusion. Conclusion C1-C2 fusion using computed tomography-guided navigation is a safe and effective way to treat atlantoaxial instability. Intraoperative neuronavigation allows for high accuracy of screw placement, limits complications by sparing injury to the critical structures in the upper cervical spine, and can help surgeons make intraoperative decisions regarding complex pathology. PMID:27190736

  5. Study on Improvement of Accuracy in Inertial Photogrammetry by Combining Images with Inertial Measurement Unit

    NASA Astrophysics Data System (ADS)

    Kawasaki, Hideaki; Anzai, Shojiro; Koizumi, Toshio

    2016-06-01

Inertial photogrammetry is defined as photogrammetry that involves using a camera on which an inertial measurement unit (IMU) is mounted. In inertial photogrammetry, the position and inclination of a shooting camera are calculated using the IMU. An IMU is characterized by error growth caused by time accumulation, because acceleration is integrated with respect to time. This study examines a procedure to estimate the position of the camera accurately while shooting, using the IMU together with structure from motion (SfM) technology, which is applied in many fields such as computer vision. When neither the coordinates of the camera position nor those of the feature points are known, SfM recovers the positional relationship between the camera and the feature points only up to a similarity transform; therefore, the actual scale of the coordinates is not determined. If the actual scale of the camera position is unknown, the camera acceleration can still be obtained by calculating the second-order derivative of the camera position with respect to shooting time. The authors had previously determined the actual scale by assigning the IMU-derived position to the SfM-calculated position; hence, accuracy decreased because of the error growth characteristic of the IMU. To solve this problem, a new calculation method is proposed. In this method, the difference between the IMU-measured acceleration and the camera-derived acceleration is minimized by the method of least squares, yielding the magnification required to recover the actual dimensions from the camera positions. The actual scale is then obtained by multiplying all the SfM point groups by this magnification factor. The method suppresses the error growth caused by time accumulation in the IMU and improves the accuracy of inertial photogrammetry.
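The least-squares magnification described above can be sketched in one dimension; the trajectory, sampling rate, and function names below are illustrative assumptions, not the authors' implementation:

```python
def second_derivative(x, dt):
    """Central-difference second derivative of a sampled trajectory."""
    return [(x[i - 1] - 2 * x[i] + x[i + 1]) / dt ** 2
            for i in range(1, len(x) - 1)]

def scale_factor(sfm_positions, imu_accel, dt):
    """Least-squares magnification k with imu_accel ~= k * d2(sfm)/dt2."""
    a_sfm = second_derivative(sfm_positions, dt)
    return (sum(a * b for a, b in zip(a_sfm, imu_accel))
            / sum(a * a for a in a_sfm))

# Hypothetical 1-D example: true trajectory x(t) = 0.5*t^2 (accel 1 m/s^2),
# reported by SfM at an unknown scale of 1/4.
dt = 0.1
sfm_x = [0.5 * (i * dt) ** 2 / 4 for i in range(20)]   # scale ambiguity
imu_a = [1.0] * 18                                     # measured acceleration
k = scale_factor(sfm_x, imu_a, dt)
```

Fitting the scale to accelerations rather than integrating the IMU twice is what avoids the position drift: here `k` recovers the true magnification of 4 exactly, because differentiation involves no time accumulation.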

  6. Model accuracy impact through rescaled observations in hydrological data assimilation studies

    NASA Astrophysics Data System (ADS)

    Tugrul Yilmaz, M.; Crow, Wade T.; Ryu, Dongryeol

    2015-04-01

Relative magnitudes of signal and noise in soil moisture datasets (e.g. satellite-, model-, or station-based) vary significantly. The optimality of the analysis when assimilating observations into a model depends on the degree to which the differences between the signal variances of the model and the observations are minimized. Rescaling techniques that aim to reduce such differences generally focus only on matching certain statistics of the model and the observations, while the impact of their relative accuracy on the optimality of the analysis remains unexplored. In this study, the impacts of the relative accuracies of the seasonality and anomaly components of modeled and observation-based soil moisture time series on the optimality of the assimilation analysis are investigated. Experiments using well-controlled synthetic and real datasets are performed. Experiments are performed by rescaling observations to the model with varying aggressiveness: i) rescaling the entire observation time series in one piece or each month separately; ii) rescaling the observation seasonality and anomaly components separately; iii) inserting the model seasonality directly into the observations while only the anomaly components are rescaled. A simple Antecedent Precipitation Index (API) model is selected in both the synthetic and real dataset experiments. Observations are assimilated into the API model using a Kalman filter. The real dataset experiments use the Land Parameter Retrieval Model (LPRM) product based on observations from the Advanced Microwave Scanning Radiometer on the Aqua platform (AMSR-E) over four USDA-ARS watersheds, while ground-based observations collected over these watersheds are used for validation. Results show that it is favorable to rescale observations more aggressively to a model when the model is more accurate (has a higher signal-to-noise ratio than the observations), while rescaling the observations strongly to the model degrades the analysis if the observations are more skillful.
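The least aggressive of the rescaling strategies above, matching the mean and standard deviation of the whole observation series to the model's before assimilation, can be sketched as follows; the series values are hypothetical:

```python
from statistics import mean, stdev

def rescale_to_model(obs, model):
    """Linearly rescale obs so its mean and standard deviation match the
    model series (a common pre-assimilation bias-correction step)."""
    mu_o, sd_o = mean(obs), stdev(obs)
    mu_m, sd_m = mean(model), stdev(model)
    return [mu_m + (o - mu_o) * sd_m / sd_o for o in obs]

# Hypothetical satellite soil moisture vs. API-modelled series (m^3/m^3)
obs = [0.10, 0.14, 0.12, 0.18, 0.16]
model = [0.25, 0.30, 0.27, 0.35, 0.33]
scaled = rescale_to_model(obs, model)
```

After rescaling, `scaled` has the model's mean and standard deviation while preserving the observations' temporal anomalies, which is exactly the behavior whose aggressiveness the study varies.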

  7. Dual-energy CT for the diagnosis of gout: an accuracy and diagnostic yield study

    PubMed Central

    Bongartz, Tim; Glazebrook, Katrina N; Kavros, Steven J; Murthy, Naveen S; Merry, Stephen P; Franz, Walter B; Michet, Clement J; Veetil, Barath M Akkara; Davis, John M; Mason, Thomas G; Warrington, Kenneth J; Ytterberg, Steven R; Matteson, Eric L; Crowson, Cynthia S; Leng, Shuai; McCollough, Cynthia H

    2015-01-01

    Objectives To assess the accuracy of dual-energy CT (DECT) for diagnosing gout, and to explore whether it can have any impact on clinical decision making beyond the established diagnostic approach of polarising microscopy of synovial fluid (diagnostic yield). Methods Diagnostic single-centre study of 40 patients with active gout and 41 individuals with other types of joint disease. Sensitivity and specificity of DECT for diagnosing gout were calculated against a combined reference standard (polarising and electron microscopy of synovial fluid). To explore the diagnostic yield of DECT scanning, a third cohort was assembled, consisting of patients with inflammatory arthritis and risk factors for gout who had negative synovial fluid polarising microscopy results. Among these patients, the proportion of subjects with DECT findings indicating a diagnosis of gout was assessed. Results The sensitivity and specificity of DECT for diagnosing gout were 0.90 (95% CI 0.76 to 0.97) and 0.83 (95% CI 0.68 to 0.93), respectively. All false-negative results occurred in patients with acute, recent-onset gout. All false-positive results occurred in patients with advanced knee osteoarthritis. DECT in the diagnostic yield cohort revealed evidence of uric acid deposition in 14 out of 30 patients (46.7%). Conclusions DECT provides good diagnostic accuracy for detection of monosodium urate (MSU) deposits in patients with gout. However, sensitivity is lower in patients with recent-onset disease. DECT has a significant impact on clinical decision making when gout is suspected but polarising microscopy of synovial fluid fails to demonstrate the presence of MSU crystals. PMID:24671771
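    The reported sensitivity and specificity follow directly from the 2 × 2 counts. A sketch, assuming counts back-computed from the reported proportions (36/40 gout patients test-positive, 34/41 controls test-negative) and using a Wilson score interval, which may differ slightly from whatever exact method the authors used:

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    # Wilson score 95% interval for a binomial proportion.
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

tp, fn = 36, 4   # counts inferred from sensitivity 0.90 in 40 gout patients
tn, fp = 34, 7   # counts inferred from specificity ~0.83 in 41 controls
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
sens_ci = wilson_ci(tp, tp + fn)
spec_ci = wilson_ci(tn, tn + fp)
```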

  8. Summarising and validating test accuracy results across multiple studies for use in clinical practice.

    PubMed

    Riley, Richard D; Ahmed, Ikhlaaq; Debray, Thomas P A; Willis, Brian H; Noordzij, J Pieter; Higgins, Julian P T; Deeks, Jonathan J

    2015-06-15

    Following a meta-analysis of test accuracy studies, the translation of summary results into clinical practice is potentially problematic. The sensitivity, specificity and positive (PPV) and negative (NPV) predictive values of a test may differ substantially from the average meta-analysis findings, because of heterogeneity. Clinicians thus need more guidance: given the meta-analysis, is a test likely to be useful in new populations, and if so, how should test results inform the probability of existing disease (for a diagnostic test) or future adverse outcome (for a prognostic test)? We propose ways to address this. Firstly, following a meta-analysis, we suggest deriving prediction intervals and probability statements about the potential accuracy of a test in a new population. Secondly, we suggest strategies on how clinicians should derive post-test probabilities (PPV and NPV) in a new population based on existing meta-analysis results and propose a cross-validation approach for examining and comparing their calibration performance. Application is made to two clinical examples. In the first example, the joint probability that both sensitivity and specificity will be >80% in a new population is just 0.19, because of a low sensitivity. However, the summary PPV of 0.97 is high and calibrates well in new populations, with a probability of 0.78 that the true PPV will be at least 0.95. In the second example, post-test probabilities calibrate better when tailored to the prevalence in the new population, with cross-validation revealing a probability of 0.97 that the observed NPV will be within 10% of the predicted NPV. PMID:25800943
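    One concrete way to tailor post-test probabilities to a new population, as the abstract advocates, is Bayes' theorem applied to the summary sensitivity and specificity with the new population's prevalence. The numeric values below are illustrative assumptions, not the paper's data:

```python
def post_test_probs(sens, spec, prev):
    # Bayes' theorem: PPV and NPV tailored to a population's prevalence.
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Hypothetical summary accuracy and prevalence in the new population.
ppv, npv = post_test_probs(sens=0.90, spec=0.83, prev=0.20)
```

    The same sensitivity and specificity give very different PPVs as prevalence changes, which is why the paper recommends calibrating against the new population rather than reusing an average meta-analytic PPV.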

  9. Accuracy evaluation of the optical surface monitoring system on EDGE linear accelerator in a phantom study.

    PubMed

    Mancosu, Pietro; Fogliata, Antonella; Stravato, Antonella; Tomatis, Stefano; Cozzi, Luca; Scorsetti, Marta

    2016-01-01

    Frameless stereotactic radiosurgery (SRS) requires dedicated systems to monitor the patient position during treatment to avoid target underdosage due to involuntary shifts. The optical surface monitoring system (OSMS) is here evaluated in a phantom-based study. The new EDGE linear accelerator from Varian (Varian, Palo Alto, CA) integrates, for cranial lesions, the common cone beam computed tomography (CBCT) and kV-MV portal imaging with the optical surface monitoring system (OSMS), a device able to detect real-time patient face movements along all 6 couch axes (vertical, longitudinal, lateral, rotation along the vertical axis, pitch, and roll). We have evaluated the OSMS imaging capability in checking a phantom's position and monitoring its motion. With this aim, a home-made cranial phantom was developed to evaluate the OSMS accuracy in 4 different experiments: (1) comparison with CBCT in isocenter location, (2) capability to recognize predefined shifts up to 2° or 3 cm, (3) evaluation at different couch angles, and (4) ability to properly reconstruct the surface when the linac gantry visually blocks one of the cameras. With a phantom, the OSMS proved accurate for positioning with respect to the CBCT imaging system, with differences of 0.6 ± 0.3 mm for linear vector displacement and a maximum rotational inaccuracy of 0.3°. OSMS presented an accuracy of 0.3 mm for displacements up to 1 cm and 1°, and 0.5 mm for larger displacements. Different couch angles (45° and 90°) induced a mean vector uncertainty < 0.4 mm. Coverage of 1 camera produced an uncertainty < 0.5 mm. Translations and rotations of a phantom can thus be accurately detected with the optical surface detector system. PMID:26994827

  10. Screw Placement Accuracy and Outcomes Following O-Arm-Navigated Atlantoaxial Fusion: A Feasibility Study

    PubMed Central

    Smith, Jacob D.; Jack, Megan M.; Harn, Nicholas R.; Bertsch, Judson R.; Arnold, Paul M.

    2015-01-01

    Study Design Case series of seven patients. Objective C2 stabilization can be challenging due to the complex anatomy of the upper cervical vertebrae. We describe seven cases of C1–C2 fusion using intraoperative navigation to aid in the screw placement at the atlantoaxial (C1–C2) junction. Methods Between 2011 and 2014, seven patients underwent posterior atlantoaxial fusion using intraoperative frameless stereotactic O-arm Surgical Imaging and StealthStation Surgical Navigation System (Medtronic, Inc., Minneapolis, Minnesota, United States). Outcome measures included screw accuracy, neurologic status, radiation dosing, and surgical complications. Results Four patients had fusion at C1–C2 only, and in the remaining three, fixation extended down to C3 due to anatomical considerations for screw placement recognized on intraoperative imaging. Out of 30 screws placed, all demonstrated minimal divergence from desired placement in either C1 lateral mass, C2 pedicle, or C3 lateral mass. No neurovascular compromise was seen following the use of intraoperative guided screw placement. The average radiation dosing due to intraoperative imaging was 39.0 mGy. All patients were followed for a minimum of 12 months. All patients went on to solid fusion. Conclusion C1–C2 fusion using computed tomography-guided navigation is a safe and effective way to treat atlantoaxial instability. Intraoperative neuronavigation allows for high accuracy of screw placement, limits complications by sparing injury to the critical structures in the upper cervical spine, and can help surgeons make intraoperative decisions regarding complex pathology. PMID:27190736

  11. A method for studying knife tool marks on bone.

    PubMed

    Shaw, Kai-Ping; Chung, Ju-Hui; Chung, Fang-Chun; Tseng, Bo-Yuan; Pan, Chih-Hsin; Yang, Kai-Ting; Yang, Chun-Pang

    2011-07-01

    The characteristics of knife tool marks retained on hard tissues can be used to outline the shape and angle of a knife. The purpose of this study was to describe such marks on bone tissues that had been chopped with knives. A chopping stage with a gravity accelerator and a fixed bone platform was designed to reconstruct the chopping action. A digital microscope was used to measure the knife angle (θ) and the retained V-shaped tool mark angle (ψ) in a pig skull. The κ value (elasticity coefficient; θ/ψ) was derived and recorded after the knife angle (θ) and the accompanying velocity were compared with the proportional impulsive force of the knife and ψ on the bone. A constant impulsive force revealed a correlation between the V-shaped tool mark angle (ψ) and the elasticity coefficient (κ). These results describe the tool marks--crucial in the medicolegal investigation--of a knife on hard tissues. PMID:21480893

  12. Assessing the accuracy of GIS-based elementary multi criteria decision analysis as a spatial prediction tool - A case of predicting potential zones of sustainable groundwater resources

    NASA Astrophysics Data System (ADS)

    Adiat, K. A. N.; Nawawi, M. N. M.; Abdullah, K.

    2012-05-01

    Inappropriate handling and integration of data from various sources can make any spatial prediction task difficult and inaccurate. This study attempts to offer a solution to this problem by exploring the capability of GIS-based elementary multi criteria decision analysis (MCDA) as a spatial prediction tool. To achieve the set objectives, spatial prediction of potential zones of sustainable groundwater resources in a given study area was used as a case study. A total of five criteria/factors believed to influence groundwater storage potential in the area were selected. Each criterion/factor was assigned an appropriate weight based on Saaty's 9-point scale, and the weights were normalized through the analytic hierarchy process (AHP). The process was integrated in the GIS environment to produce the groundwater potential prediction map for the area. The effect of the coherence of the criteria on the efficiency of MCDA as a prediction tool was also examined. The prediction map produced was found to be 81.25% accurate. Examination of the effect of coherence of criteria revealed that the ability of the method to produce accurate predictions depends on the exhaustiveness of the set of criteria used. The study established that the GIS-based elementary MCDA technique is capable of producing accurate and reliable predictions, particularly if the set of criteria used for the prediction is coherent.
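    The AHP weighting step described above can be sketched as follows: pairwise comparisons on Saaty's 1-9 scale are normalized into weights via the principal eigenvector, and a consistency ratio checks the judgments. The 3-criterion comparison matrix here is hypothetical, purely for illustration:

```python
def ahp_weights(matrix, iters=100):
    # Principal-eigenvector weights of a pairwise comparison matrix,
    # computed by power iteration.
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical comparisons for 3 criteria (e.g. lithology vs. lineament
# density vs. slope) on Saaty's 1-9 scale; reciprocals below the diagonal.
m = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     3.0],
     [1 / 5.0, 1 / 3.0, 1.0]]
weights = ahp_weights(m)

# Consistency check: lambda_max -> consistency index -> consistency ratio
# (random index 0.58 for n = 3; judgments acceptable when CR < 0.1).
n = len(m)
v = [sum(m[i][j] * weights[j] for j in range(n)) for i in range(n)]
lam = sum(v[i] / weights[i] for i in range(n)) / n
cr = ((lam - n) / (n - 1)) / 0.58
```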

  13. Accuracy and reproducibility of tumor positioning during prolonged and multi-modality animal imaging studies

    NASA Astrophysics Data System (ADS)

    Zhang, Mutian; Huang, Minming; Le, Carl; Zanzonico, Pat B.; Claus, Filip; Kolbert, Katherine S.; Martin, Kyle; Ling, C. Clifton; Koutcher, Jason A.; Humm, John L.

    2008-10-01

    Dedicated small-animal imaging devices, e.g. positron emission tomography (PET), computed tomography (CT) and magnetic resonance imaging (MRI) scanners, are being increasingly used for translational molecular imaging studies. The objective of this work was to determine the positional accuracy and precision with which tumors in situ can be reliably and reproducibly imaged on dedicated small-animal imaging equipment. We designed, fabricated and tested a custom rodent cradle with a stereotactic template to facilitate registration among image sets. To quantify tumor motion during our small-animal imaging protocols, 'gold standard' multi-modality point markers were inserted into tumor masses on the hind limbs of rats. Three types of imaging examination were then performed with the animals continuously anesthetized and immobilized: (i) consecutive microPET and MR images of tumor xenografts in which the animals remained in the same scanner for 2 h duration, (ii) multi-modality imaging studies in which the animals were transported between distant imaging devices and (iii) serial microPET scans in which the animals were repositioned in the same scanner for subsequent images. Our results showed that the animal tumor moved by less than 0.2-0.3 mm over a continuous 2 h microPET or MR imaging session. The process of transporting the animal between instruments introduced additional errors of ~0.2 mm. In serial animal imaging studies, the positioning reproducibility within ~0.8 mm could be obtained.

  14. Effects of sampling and mineral separation on accuracy of detrital zircon studies

    NASA Astrophysics Data System (ADS)

    Sláma, Jiří; Košler, Jan

    2012-05-01

    We investigated some of the sampling and mineral separation biases that affect the accuracy of detrital zircon provenance studies. The study was carried out on a natural catchment in the Scottish Highlands that represents a simple two-component source system, and on samples of synthetic sediment prepared for this study to test the effects of heavy mineral separation on the resulting zircon age spectra. The results suggest that the zircon fertility of the source rocks and the physical properties of zircon are the most important factors affecting the distribution of zircon age populations in the stream sediments. Sample preparation and the selection of zircons for analysis may result in a preferential loss of information from small zircon grains. Together with the preference for larger crystals during handpicking, this can result in a several-fold difference compared to the real age distribution in the sediment sample. These factors appear to be more important for the reproducibility of zircon age spectra than is the number of zircon grains analyzed per sample.

  15. Accuracy and repeatability of Roentgen stereophotogrammetric analysis (RSA) for measuring knee laxity in longitudinal studies.

    PubMed

    Fleming, B C; Peura, G D; Abate, J A; Beynnon, B D

    2001-10-01

    Roentgen stereophotogrammetric analysis (RSA) can be used to assess temporal changes in anterior-posterior (A-P) knee laxity. However, the accuracy and precision of RSA depend on many factors and should be independently evaluated for a particular application. The objective of this study was to evaluate the use of RSA for measuring A-P knee laxity. The specific aims were to assess the variation or "noise" inherent to RSA, to determine the reproducibility of RSA for repeated A-P laxity testing, and to assess the accuracy of these measurements. Two experiments were performed. The first experiment utilized three rigid models of the tibiofemoral joint to assess the noise and to compare digitization errors of two independent examiners. No differences were found in the kinematic outputs of the RSA due to examiner, repeated trials, or the model used. In a second experiment, A-P laxity values between the A-P shear load limits of +/-60 N of five cadaver goat knees were measured to assess the error associated with repeated testing. The RSA laxity values were also compared to those obtained from a custom-designed linkage system. The mean A-P laxity values with the knee at 30, 60, and 90 degrees of flexion for the ACL-intact goat knee (+/-95% confidence interval) were 0.8 (+/-0.25), 0.9 (+/-0.29), and 0.4 (+/-0.22) mm, respectively. In the ACL-deficient knee, the A-P laxity values increased by an order of magnitude to 8.8 (+/-1.39), 7.6 (+/-1.32), and 3.1 (+/-1.20) mm, respectively. No significant differences were found between the A-P laxity values measured by RSA and the independent measurement technique. A highly significant linear relationship (r2 = 0.83) was also found between these techniques. This study suggests that the RSA method is an accurate and precise means of measuring A-P knee laxity for repeated testing over time. PMID:11522316

  16. Study of on-machine error identification and compensation methods for micro machine tools

    NASA Astrophysics Data System (ADS)

    Wang, Shih-Ming; Yu, Han-Jen; Lee, Chun-Yi; Chiu, Hung-Sheng

    2016-08-01

    Micro machining plays an important role in the manufacturing of miniature products that are made of various materials with complex 3D shapes and tight machining tolerances. To further improve the accuracy of a micro machining process without increasing the manufacturing cost of a micro machine tool, an effective machining error measurement method and a software-based compensation method are essential. To avoid introducing additional errors caused by re-installment of the workpiece, the measurement and compensation should be conducted on-machine. In addition, because the contour of a miniature workpiece machined with a micro machining process is very tiny, the measurement method should be non-contact. By integrating an image reconstruction method, camera pixel correction, coordinate transformation, an error identification algorithm, and a trajectory auto-correction method, a vision-based error measurement and compensation method was developed in this study that can inspect micro machining errors on-machine and automatically generate an error-corrected numerical control (NC) program for error compensation. With the use of the Canny edge detection algorithm and camera pixel calibration, the edges of the contour of a machined workpiece were identified and used to reconstruct the actual contour of the workpiece. The actual contour was then mapped to the theoretical contour to identify the actual cutting points and compute the machining errors. With the use of a moving matching window and calculation of the similarity between the actual and theoretical contours, the errors between the actual and theoretical cutting points were calculated and used to correct the NC program. With the use of the error-corrected NC program, the accuracy of a micro machining process can be effectively improved. To prove the feasibility and effectiveness of the proposed methods, micro-milling experiments on a micro machine tool were conducted, and the results

  17. A study of the parameters affecting the accuracy of the total pore blocking method.

    PubMed

    Liekens, Anuschka; Cabooter, Deirdre; Denayer, Joeri; Desmet, Gert

    2010-10-22

    We report on a study wherein we investigate the different factors affecting the accuracy of the total pore blocking method to determine the interstitial volume of reversed-phase packed bed columns. Octane, nonane, decane and dodecane were all found to be suitable blocking agents, whereas heptane already dissolves too well in the applied fully aqueous buffers. The method of moments needs to be used to accurately determine the elution times, and a proper correction for the frit volume is needed. Failing to do so can lead to errors on the observed interstitial volume of the order of 2% or more. It has also been shown that the application of a high flow rate or a high pressure does not force the blocking agent out of the mesopores of the particles. The only potential source of loss of blocking agent is dissolution into the mobile phase (even though this is a buffered fully aqueous solution). This effect however only becomes significant after the elution of 400 geometrical column volumes, i.e., orders more than needed for a regular total pore blocking experiment. PMID:20580009

  18. An experimental study of the accuracy in measurement of modulation transfer function using an edge method

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Hoon; Kim, Ye-seul; Park, Hye-Suk; Lee, Young-Jin; Kim, Hee-Joung

    2015-03-01

    Image evaluation is necessary in digital radiography (DR), which is widely used in medical imaging. Among image evaluation parameters, the modulation transfer function (MTF) is an important factor in the field of medical imaging and is necessary to obtain the detective quantum efficiency (DQE), which represents the overall signal-to-noise performance of the detector. However, accurate measurement of the MTF is still not easy because of geometric effects, electronic noise, quantum noise, and truncation error. Therefore, in order to improve the accuracy of the MTF, four experimental methods were tested in this study: changing the tube current, applying a smoothing method to the edge spread function (ESF), adjusting the line spread function (LSF) range, and changing the tube angle. Our results showed that fluctuation in the MTF was decreased by high tube current and by smoothing. However, the tube current should not exceed detector saturation, and smoothing the ESF causes distortion in the ESF and MTF. In addition, decreasing the LSF range diminished both the fluctuation and the number of sampling points in the MTF, and a high tube angle degrades the MTF. Based on these results, excessively low tube current and the smoothing method should be avoided. Also, an optimal LSF range balancing reduced fluctuation against the number of sampling points is necessary, and a precise tube angle is essential to obtain an accurate MTF. In conclusion, our results demonstrate how an accurate MTF can be acquired.
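    The edge-method pipeline the abstract refers to (ESF → LSF → MTF) can be sketched numerically. The edge profile below is an idealized, noise-free stand-in invented for illustration, not measured data:

```python
from math import cos, sin, pi, hypot

# Idealized, noise-free edge spread function sampled across a detector row.
esf = [0.0] * 16 + [0.25, 0.75] + [1.0] * 16

# LSF: derivative of the ESF (central differences at interior samples).
lsf = [(esf[i + 1] - esf[i - 1]) / 2.0 for i in range(1, len(esf) - 1)]

# MTF: magnitude of the discrete Fourier transform of the LSF,
# normalized to its zero-frequency value.
n = len(lsf)
def dft_mag(k):
    re = sum(lsf[j] * cos(2 * pi * k * j / n) for j in range(n))
    im = -sum(lsf[j] * sin(2 * pi * k * j / n) for j in range(n))
    return hypot(re, im)

mtf = [dft_mag(k) / dft_mag(0) for k in range(n // 2)]
```

    The practical difficulties the study investigates (noise, LSF range, truncation) all act on the `lsf` array before the transform; e.g. truncating the LSF window too aggressively removes tails that carry low-frequency information.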

  19. Extreme-ultraviolet phase-shifting point-diffraction interferometer: a wave-front metrology tool with subangstrom reference-wave accuracy.

    PubMed

    Naulleau, P P; Goldberg, K A; Lee, S H; Chang, C; Attwood, D; Bokor, J

    1999-12-11

    The phase-shifting point-diffraction interferometer (PS/PDI) was recently developed and implemented at Lawrence Berkeley National Laboratory to characterize extreme-ultraviolet (EUV) projection optical systems for lithography. Here we quantitatively characterize the accuracy and precision of the PS/PDI. Experimental measurements are compared with theoretical results. Two major classes of errors affect the accuracy of the interferometer: systematic effects arising from measurement geometry and systematic and random errors due to an imperfect reference wave. To characterize these effects, and hence to calibrate the interferometer, a null test is used. This null test also serves as a measure of the accuracy of the interferometer. We show the EUV PS/PDI, as currently implemented, to have a systematic error-limited reference-wave accuracy of 0.0028 waves (lambda/357 or 0.038 nm at lambda = 13.5 nm) within a numerical aperture of 0.082. PMID:18324274

  20. SMS as a Learning Tool: An Experimental Study

    ERIC Educational Resources Information Center

    Plana, Mar Gutiérrez-Colon; Torrano, Pere Gallardo; Grova, M. Elisa

    2012-01-01

    The aim of this experimental study was to find out the potential of using mobile phones in teaching English as a foreign language (EFL), specifically the use of Short Message Service (SMS) as a support tool in the EFL class. The research questions formulated for this project are the following: (1) Is using SMS messages via a mobile phone an…

  1. Popular Music as a Learning Tool in the Social Studies.

    ERIC Educational Resources Information Center

    Litevich, John A., Jr.

    This teaching guide reflects the belief that popular music is an effective tool for teachers to use in presenting social studies lessons to students. Titles of songs representative of popular music from 1955 to 1982 are listed by subject matter and suggest a possible lesson to be used in teaching that particular issue. Subject areas listed…

  2. Softdesk energy: A case study in early design tool integration

    SciTech Connect

    Gowri, K.; Chassin, D.P.; Friedrich, M.

    1998-04-01

    Softdesk Energy is a design tool that integrates building energy analysis capability into a highly automated production drafting environment (AutoCAD and Softdesk AutoArchitect). This tool gives users of computer-aided design/drafting (CAD) software the opportunity to evaluate the energy impact of design decisions much earlier in the design process than previously possible with energy analysis software. The authors review the technical challenges of integrating analytic methods into design tools, the opportunities such integrated tools create for building designers, and a usage scenario from the perspective of a current user of Softdesk Energy. A comparison between the simplified calculations in Softdesk Energy and detailed simulations using DOE-2 energy analysis is made to evaluate the applicability of the Softdesk Energy approach. As a unique example of integrating design and drafting, Softdesk Energy provides an opportunity to study the strengths and weaknesses of integrated design tools and gives some insight into the future direction of CAD software toward meeting the needs of diverse design disciplines.

  3. Hepatic perfusion in a tumor model using DCE-CT: an accuracy and precision study

    NASA Astrophysics Data System (ADS)

    Stewart, Errol E.; Chen, Xiaogang; Hadway, Jennifer; Lee, Ting-Yim

    2008-08-01

    In the current study we investigate the accuracy and precision of hepatic perfusion measurements based on the Johnson and Wilson model with the adiabatic approximation. VX2 carcinoma cells were implanted into the livers of New Zealand white rabbits. Simultaneous dynamic contrast-enhanced computed tomography (DCE-CT) and radiolabeled microsphere studies were performed under steady-state normo-, hyper- and hypo-capnia. The hepatic arterial blood flows (HABF) obtained using both techniques were compared with ANOVA. The precision was assessed by the coefficient of variation (CV). Under normo-capnia the microsphere HABF were 51.9 ± 4.2, 40.7 ± 4.9 and 99.7 ± 6.0 ml min-1 (100 g)-1 while DCE-CT HABF were 50.0 ± 5.7, 37.1 ± 4.5 and 99.8 ± 6.8 ml min-1 (100 g)-1 in normal tissue, tumor core and rim, respectively. There were no significant differences between HABF measurements obtained with both techniques (P > 0.05). Furthermore, a strong correlation was observed between HABF values from both techniques: slope of 0.92 ± 0.05, intercept of 4.62 ± 2.69 ml min-1 (100 g)-1 and R2 = 0.81 ± 0.05 (P < 0.05). The Bland-Altman plot comparing DCE-CT and microsphere HABF measurements gives a mean difference of -0.13 ml min-1 (100 g)-1, which is not significantly different from zero. DCE-CT HABF is precise, with CV of 5.7, 24.9 and 1.4% in the normal tissue, tumor core and rim, respectively. Non-invasive measurement of HABF with DCE-CT is accurate and precise. DCE-CT can be an important extension of CT to assess hepatic function besides morphology in liver diseases.
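    The Bland-Altman comparison used in this abstract (mean difference and limits of agreement between DCE-CT and microsphere HABF) is straightforward to sketch. The paired readings below are hypothetical, not the study's raw data:

```python
def bland_altman(a, b):
    # Bias (mean difference) and 95% limits of agreement between
    # two measurement methods on the same subjects.
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired HABF readings in ml/min per 100 g.
dce = [50.0, 37.1, 99.8, 52.3, 41.0]   # DCE-CT
mic = [51.9, 40.7, 99.7, 50.8, 39.5]   # radiolabeled microspheres
bias, lo, hi = bland_altman(dce, mic)
```

    A bias close to zero with narrow limits of agreement is what supports the abstract's conclusion that the two techniques agree.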

  4. DNA barcoding and minibarcoding as a powerful tool for feather mite studies.

    PubMed

    Doña, Jorge; Diaz-Real, Javier; Mironov, Sergey; Bazaga, Pilar; Serrano, David; Jovani, Roger

    2015-09-01

    Feather mites (Astigmata: Analgoidea and Pterolichoidea) are among the most abundant and commonly occurring bird ectosymbionts. Basic questions on the ecology and evolution of feather mites remain unanswered because feather mite species identification is often only possible for adult males, and it is laborious even for specialized taxonomists, thus precluding large-scale identifications. Here, we tested DNA barcoding as a useful molecular tool to identify feather mites from passerine birds. Three hundred and sixty-one specimens of 72 species of feather mites from 68 species of European passerine birds from Russia and Spain were barcoded. The accuracy of barcoding and minibarcoding was tested. Moreover, threshold choice (a controversial issue in barcoding studies) was also explored in a new way, by calculating through simulations the effect of sampling effort (in species number and species composition) on threshold calculations. We found one 200-bp minibarcode region that showed the same accuracy as the full-length barcode (602 bp) and was surrounded by conserved regions potentially useful for group-specific degenerate primers. Species identification accuracy was perfect (100%) but decreased when singletons or species of the Proctophyllodes pinnatus group were included. In fact, barcoding confirmed previous taxonomic issues within the P. pinnatus group. Following an integrative taxonomy approach, we compared our barcode study with previous taxonomic knowledge on feather mites, discovering three new putative cryptic species and validating three previous morphologically different (but still undescribed) new species. PMID:25655349

  5. Alaska Case Study: Scientists Venturing Into Field with Journalists Improves Accuracy

    NASA Astrophysics Data System (ADS)

    Ekwurzel, B.; Detjen, J.; Hayes, R.; Nurnberger, L.; Pavangadkar, A.; Poulson, D.

    2008-12-01

    Issues such as climate change, stem cell research, public health vaccination, etc., can be fraught with public misunderstanding, myths, as well as deliberate distortions of the fundamental science. Journalists are adept at creating print, radio, and video content that can be both compelling and informative to the public. Yet most scientists have little time or training to devote to developing media content for the public and spend little time with journalists who cover science stories. We conducted a case study to examine whether the time and funding invested in exposing journalists to scientists in the field over several days would improve the accuracy of media stories about complex scientific topics. Twelve journalists were selected from the 70 who applied for a four-day environmental journalism fellowship in Alaska. The final group achieved the goal of a broad geographic spectrum of media outlets (small regional to large national organizations), medium (print, radio, online), and experience (early career to senior producers). Reporters met with a diverse group of scientists. The lessons learned and successful techniques will be presented. Initial results demonstrate that stories were highly accurate and rich with audio or visual content for lay audiences. The journalists have also maintained contact with the scientists, asking for leads on emerging stories and seeking new experts who can assist in their reporting. Science-based institutions should devote more funding to foster direct journalist-scientist interactions in the lab and field. These positive goals can be achieved: (1) more accurate dissemination of science information to the public; (2) a broader portion of the scientific community will become a resource to journalists instead of the same eloquent few; (3) scientists will appreciate the skill and pressures of those who survive media downsizing and provide media-savvy content; and (4) the public may incorporate science evidence

  6. Accuracy of migrant landbird habitat maps produced from LANDSAT TM data: Two case studies in southern Belize

    USGS Publications Warehouse

    Spruce, J.P.; Sader, S.; Robbins, C.S.; Dowell, B.A.

    1995-01-01

    The study investigated the utility of Landsat TM data for producing geo-referenced habitat maps for two study areas (Toledo and Stann Creek). Locational and non-site-specific map accuracy was evaluated by stratified random sampling and statistical analysis of satellite classification results (SCR) versus air photo interpretation results (PIR) for the overall classification and individual classes. The effect of classification scheme specificity on map accuracy was also assessed. A decision criterion was developed for the minimum acceptable level of map performance (i.e., classification accuracy and scheme specificity). A satellite map was deemed acceptable if it had a useful degree of classification specificity, plus either adequate overall locational agreement (≥70%) and/or non-site-specific agreement (Chi-square goodness-of-fit test results indicating insufficient evidence to reject the null hypothesis that the overall classification distributions for the SCR and PIR are equal). For the most detailed revised classification, overall locational accuracy ranges from 52% (5 classes) for Toledo to 63% (9 classes) for Stann Creek. For the least detailed revised classification, overall locational accuracy ranges from 91% (2 classes) for Toledo to 86% (5 classes) for Stann Creek. Considering both locational and non-site-specific accuracy results, the most detailed yet sufficiently accurate classification for both sites includes low/medium/tall broadleaf forest, broadleaf forest scrub and herb-dominated openings. For these classifications, the overall locational accuracy is 72% for Toledo (4 classes) and 75% for Stann Creek (7 classes). This level of classification detail is suitable for aiding many analyses of migrant landbird habitat use.
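    The Chi-square goodness-of-fit check for non-site-specific agreement can be sketched as follows. The class counts and the 5-class scheme here are invented for illustration; the study's actual frequencies are not reproduced in this record:

```python
def chi_square_stat(observed, expected):
    # Pearson's goodness-of-fit statistic comparing the satellite map's
    # class-frequency distribution (observed) against the photo
    # interpretation's (expected).
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical sample counts per habitat class for a 5-class scheme.
scr = [120, 85, 60, 25, 10]   # satellite classification results
pir = [110, 90, 65, 22, 13]   # air photo interpretation results
stat = chi_square_stat(scr, pir)

# df = 4; chi-square critical value at alpha = 0.05 is 9.488.
# stat below the critical value -> insufficient evidence to reject
# the null hypothesis that the two distributions are equal.
agrees = stat < 9.488
```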

  7. Accuracy of stated energy contents of restaurant foods in a multi-site study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Context National recommendations for prevention and treatment of obesity emphasize reducing energy intake. Foods purchased in restaurants provide approximately 35% of daily energy intake, but the accuracy of information on the energy contents of these foods is unknown. Objective To examine the a...

  8. Fast and Confident: Postdicting Eyewitness Identification Accuracy in a Field Study

    ERIC Educational Resources Information Center

    Sauerland, Melanie; Sporer, Siegfried L.

    2009-01-01

    The combined postdictive value of postdecision confidence, decision time, and Remember-Know-Familiar (RKF) judgments as markers of identification accuracy was evaluated with 10 targets and 720 participants. In a pedestrian area, passers-by were asked for directions. Identifications were made from target-absent or target-present lineups. Fast…

  9. Applying Signal-Detection Theory to the Study of Observer Accuracy and Bias in Behavioral Assessment

    ERIC Educational Resources Information Center

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Bellaci, Emily; Miller, Jonathan; Karp, Hilary; Mahmood, Angela; Strobel, Maggie; Mullen, Shelley; Keyl, Alice; Toupard, Alexis

    2010-01-01

    We evaluated the feasibility and utility of a laboratory model for examining observer accuracy within the framework of signal-detection theory (SDT). Sixty-one individuals collected data on aggression while viewing videotaped segments of simulated teacher-child interactions. The purpose of Experiment 1 was to determine if brief feedback and…
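    Within an SDT framework, observer accuracy is typically summarized by sensitivity (d′) and response bias (c). A minimal sketch with hypothetical detection counts (the study's actual analysis may differ):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response bias (c) from 2x2 detection counts,
    with a log-linear correction to avoid infinite z-scores at rates of 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    bias = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, bias

# Hypothetical observer scoring aggression events in a video segment:
d, c = dprime_criterion(hits=40, misses=10, false_alarms=5, correct_rejections=45)
```

    A positive c indicates a conservative observer (under-reporting), which SDT separates cleanly from low sensitivity; raw percent agreement conflates the two.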

  10. Accuracy of navigation-assisted acetabular component positioning studied by computed tomography measurements: methods and results.

    PubMed

    Ybinger, Thomas; Kumpan, W; Hoffart, H E; Muschalik, B; Bullmann, W; Zweymüller, K

    2007-09-01

    The postoperative position of the acetabular component is key for the outcome of total hip arthroplasty. Various aids have been developed to support the surgeon during implant placement. In a prospective study involving 4 centers, the computer-recorded cup alignment of 37 hip systems at the end of navigation-assisted surgery was compared with the cup angles measured on postoperative computerized tomograms. This comparison showed an average difference of 3.5 degrees (SD, 4.4 degrees) for inclination and 6.5 degrees (SD, 7.3 degrees) for anteversion angles. The differences in inclination correlated with the thickness of the soft tissue overlying the anterior superior iliac spine (r = 0.44; P = .007), whereas the differences in anteversion showed a correlation with the thickness of the soft tissue overlying the pubic tubercles (r = 0.52; P = .001). In centers experienced in the use of navigational tools, deviations were smaller than in units with little experience in their use. PMID:17826270

  11. Improvements are needed in reporting of accuracy studies for diagnostic tests used for detection of finfish pathogens.

    PubMed

    Gardner, Ian A; Burnley, Timothy; Caraguel, Charles

    2014-12-01

    Indices of test accuracy, such as diagnostic sensitivity and specificity, are important considerations in test selection for a defined purpose (e.g., screening or confirmation) and affect the interpretation of test results. Many biomedical journals recommend that authors clearly and transparently report test accuracy studies following the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines ( www.stard-statement.org ). This allows readers to evaluate overall study validity and assess potential bias in diagnostic sensitivity and specificity estimates. The purpose of the present study was to evaluate the reporting quality of studies evaluating test accuracy for finfish diseases using the 25 items in the STARD checklist. Based on a database search, 11 studies that included estimates of diagnostic accuracy were identified for independent evaluation by three reviewers. For each study, STARD checklist items were scored as "yes," "no," or "not applicable." Only 10 of the 25 items were consistently reported in most (≥80%) papers, and reporting of the other items was highly variable (mostly between 30% and 60%). Three items ("number, training, and expertise of readers and testers"; "time interval between index tests and reference standard"; and "handling of indeterminate results, missing data, and outliers of the index tests") were reported in less than 10% of papers. Two items ("time interval between index tests and reference standard" and "adverse effects from testing") were considered minimally relevant to fish health because test samples usually are collected postmortem. Modification of STARD to fit finfish studies should increase use by authors and thereby improve the overall reporting quality regardless of how the study was designed. Furthermore, the use of STARD may lead to the improved design of future studies. PMID:25252270
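    The checklist scoring described above ("yes"/"no"/"not applicable" per item per study) reduces to per-item reporting rates. A minimal sketch with hypothetical scores, not the study's actual data:

```python
def reporting_rates(scores):
    """Per-item reporting rate (%) across studies.
    scores: dict mapping checklist item -> list of 'yes'/'no'/'na', one per study.
    'na' entries are excluded from the denominator."""
    rates = {}
    for item, marks in scores.items():
        applicable = [m for m in marks if m != "na"]
        rates[item] = 100.0 * applicable.count("yes") / len(applicable) if applicable else None
    return rates

# Hypothetical scores for three checklist items across five studies:
scores = {
    "index test described": ["yes", "yes", "yes", "yes", "no"],
    "reader training reported": ["no", "no", "yes", "no", "no"],
    "adverse effects": ["na", "na", "no", "na", "na"],
}
rates = reporting_rates(scores)
```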

  12. Tools of the trade: studying molecular networks in plants.

    PubMed

    Proost, Sebastian; Mutwil, Marek

    2016-04-01

    Driven by recent technological improvements, genes can now be studied in a larger biological context. Genes and their protein products rarely operate as single entities, and large-scale mapping of protein-protein interactions can unveil the molecular complexes that form in the cell to carry out various functions. Expression analysis under multiple conditions, supplemented with protein-DNA binding data, can highlight when genes are active and how they are regulated. Representing these data in networks and finding strongly connected sub-graphs has proven to be a powerful tool for predicting the function of unknown genes. As such networks gradually become available for various plant species, it becomes possible to study how networks evolve. This review summarizes currently available network data and related tools for plants. Furthermore, we aim to provide an outlook on future analyses that can be done in plants based on work done in other fields. PMID:26990519

  13. Polarization as a tool for studying particle properties

    SciTech Connect

    Grosse-Wiesmann, P.

    1988-05-01

    The use of polarized beams in e⁺e⁻ collisions at the Z⁰ pole provides a powerful tool for the separation of the charge and spin of the produced fermions. Such a separation is essential for many investigations of particle properties. It is shown that this technique can be used to substantially improve studies of CP violation in neutral B mesons and the charged structure of τ decays.

  14. Call Accuracy and Distance from the Play: A Study with Brazilian Soccer Referees

    PubMed Central

    DE OLIVEIRA, MARIO CESAR; ORBETELLI, ROGERIO; DE BARROS NETO, TURIBIO LEITE

    2011-01-01

    Refereeing decisions in soccer have always been a controversial issue. In order to better understand this subject, foul calls made by Brazilian soccer referees were evaluated to determine the potential relationship between the distance from the referee to a foul play and the accuracy of the call. Soccer matches supervised by the São Paulo State Football Federation were recorded and 321 foul calls were analyzed. No significant association was found between the referee’s distance from a foul play and the accuracy of the call (p = 0.561). However, there was a significant increase in the number of correct calls in the last 15 minutes of the second half compared with the number of correct calls in the first 30 minutes of the same half (p = 0.003). PMID:27182355
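    The early-versus-late comparison of correct calls is the kind of question a 2×2 chi-square test of independence answers. A sketch with hypothetical call counts (not the paper's raw data):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] (rows: match period, columns: correct/incorrect calls)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (illustrative only):
# first 30 min of 2nd half: 90 correct, 40 incorrect
# last 15 min of 2nd half:  60 correct, 10 incorrect
stat = chi2_2x2(90, 40, 60, 10)
significant = stat > 3.841  # chi-square critical value for df = 1, alpha = 0.05
```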

  15. Design and Preliminary Accuracy Studies of an MRI-Guided Transrectal Prostate Intervention System

    PubMed Central

    Krieger, Axel; Csoma, Csaba; Iordachita, Iulian I.; Guion, Peter; Singh, Anurag K.; Fichtinger, Gabor; Whitcomb, Louis L.

    2012-01-01

    This paper reports a novel system for magnetic resonance imaging (MRI) guided transrectal prostate interventions, such as needle biopsy, fiducial marker placement, and therapy delivery. The system utilizes a hybrid tracking method, comprised of passive fiducial tracking for initial registration and subsequent incremental motion measurement along the degrees of freedom using fiber-optical encoders and mechanical scales. Targeting accuracy of the system is evaluated in prostate phantom experiments. Achieved targeting accuracy and procedure times were found to compare favorably with existing systems using passive and active tracking methods. Moreover, the portable design of the system using only standard MRI image sequences and minimal custom scanner interfacing allows the system to be easily used on different MRI scanners. PMID:18044553

  16. Classification Accuracy of MMPI-2 Validity Scales in the Detection of Pain-Related Malingering: A Known-Groups Study

    ERIC Educational Resources Information Center

    Bianchini, Kevin J.; Etherton, Joseph L.; Greve, Kevin W.; Heinly, Matthew T.; Meyers, John E.

    2008-01-01

    The purpose of this study was to determine the accuracy of "Minnesota Multiphasic Personality Inventory" 2nd edition (MMPI-2; Butcher, Dahlstrom, Graham, Tellegen, & Kaemmer, 1989) validity indicators in the detection of malingering in clinical patients with chronic pain using a hybrid clinical-known groups/simulator design. The sample consisted…

  17. A Cross-National Comparison Study on the Accuracy of Self-Efficacy Beliefs of Middle-School Mathematics Students

    ERIC Educational Resources Information Center

    Chen, Peggy; Zimmerman, Barry

    2007-01-01

    In this cross-national study, the authors compared mathematics self-efficacy beliefs of American (n = 107) and Taiwanese (n = 188) middle-school students for level and calibration (accuracy and bias). Taiwanese students surpassed Americans in math achievement. American students evidenced slightly higher self-efficacy levels for easy math items but…

  18. Accuracy of genomic selection in barley breeding programs: a simulation study based on the real SNP data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The aim of this study was to compare the accuracy of genomic selection (i.e., selection based on genome-wide markers) to phenotypic selection through simulations based on real barley SNPs data (1325 SNPs x 863 breeding lines). We simulated 100 QTL at randomly selected SNPs, which were dropped from t...

  19. Dynamic Patterns in Development of Accuracy and Complexity: A Longitudinal Case Study in the Acquisition of Finnish

    ERIC Educational Resources Information Center

    Spoelman, Marianne; Verspoor, Marjolijn

    2010-01-01

    Within a Dynamic System Theory (DST) approach, it is assumed that language is in a constant flux, but that differences in the degree of variability can give insight into the developmental process. This longitudinal case study focuses on intra-individual variability in accuracy rates and complexity measures in Finnish learner language. The study…

  20. Additional studies of forest classification accuracy as influenced by multispectral scanner spatial resolution

    NASA Technical Reports Server (NTRS)

    Sadowski, F. E.; Sarno, J. E.

    1976-01-01

    First, an analysis of forest feature signatures was used to help explain the large variation in classification accuracy that can occur among individual forest features for any one case of spatial resolution and the inconsistent changes in classification accuracy that were demonstrated among features as spatial resolution was degraded. Second, the classification rejection threshold was varied in an effort to reduce the large proportion of unclassified resolution elements that previously appeared in the processing of coarse resolution data when a constant rejection threshold was used for all cases of spatial resolution. For the signature analysis, two-channel ellipse plots showing the feature signature distributions for several cases of spatial resolution indicated that the capability of signatures to correctly identify their respective features is dependent on the amount of statistical overlap among signatures. Reductions in signature variance that occur in data of degraded spatial resolution may not necessarily decrease the amount of statistical overlap among signatures having large variance and small mean separations. Features classified by such signatures may thus continue to have similar amounts of misclassified elements in coarser resolution data, and thus, not necessarily improve in classification accuracy.
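    The statistical overlap among class signatures can be quantified, for example, with the Bhattacharyya distance between class distributions. A 1-D Gaussian sketch with hypothetical signature means and standard deviations (not the report's data), illustrating the point above: shrinking variance at coarser resolution need not remove overlap when the mean separation is small.

```python
import math

def bhattacharyya_distance(m1, s1, m2, s2):
    """Bhattacharyya distance between two 1-D Gaussian class signatures.
    Larger distance means less statistical overlap (better separability)."""
    v1, v2 = s1 ** 2, s2 ** 2
    return 0.25 * math.log(0.25 * (v1 / v2 + v2 / v1 + 2)) + 0.25 * (m1 - m2) ** 2 / (v1 + v2)

# Two forest-class signatures with a small mean separation (hypothetical values):
fine = bhattacharyya_distance(100, 8, 104, 8)    # fine resolution, large variance
coarse = bhattacharyya_distance(100, 4, 104, 4)  # degraded resolution shrinks variance
```

    Both distances remain small (overlap coefficient exp(-D) stays above 0.88 in either case), so the classes remain confusable at either resolution, consistent with the misclassification behavior described above.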

  1. [Study on the wavelength accuracy of the 2-D slit-array Hadamard spectrometer].

    PubMed

    Chi, Ming-Bo; Hao, Peng; Wu, Yi-Hui

    2013-01-01

    The 2-D slit-array mask is a new design of Hadamard spectrometer mask. The authors discuss how machining errors, which cause inconsistencies in the size and location of slits within the same column, affect the wavelength accuracy of the Hadamard spectrometer; they propose a way to reduce the influence of differences in slit height and in slit location perpendicular to the spectrum, and then estimate the spectral shift caused by relative location shifts along the spectrum between slits in the same column. A simulation model was built, and the measurement errors in the decoded spectrum generated by one column of slits on the mask were calculated for cases where inconsistency errors in width and location along the spectrum exist between the slits in another column. Based on the simulation calculations, the required machining precision of the mask can be determined. This research is relevant to the design of 2-D slit-array masks fabricated with MEMS (micro-electro-mechanical systems) techniques and to the correction of the decoded spectrum, providing the spectrometer with a reasonable wavelength accuracy. PMID:23586265

  2. [Study on high accuracy detection of multi-component gas in oil-immerse power transformer].

    PubMed

    Fan, Jie; Chen, Xiao; Huang, Qi-Feng; Zhou, Yu; Chen, Gang

    2013-12-01

    In order to solve the problem of low accuracy and mutual interference in multi-component gas detection, a multi-component gas detection network with high accuracy was designed. A narrow-bandwidth semiconductor laser was used as the light source, together with a novel long-path gas cell. By modulating the laser spectrum with a single sine signal and using space-division multiplexing (SDM) and time-division multiplexing (TDM) techniques, detection of multi-component gas was achieved. The experiments indicate that the linearity correlation coefficient is 0.99 and the relative measurement error is less than 4%. The system's dynamic response time is less than 15 s when a volume of multi-component gas is gradually filled into the gas cell. The system offers high accuracy and quick response, and can be used for real-time on-line monitoring of fault gases in power transformers. PMID:24611396

  3. Recognition Accuracy Using 3D Endoscopic Images for Superficial Gastrointestinal Cancer: A Crossover Study

    PubMed Central

    Nomura, Kosuke; Kaise, Mitsuru; Kikuchi, Daisuke; Iizuka, Toshiro; Fukuma, Yumiko; Kuribayashi, Yasutaka; Tanaka, Masami; Toba, Takahito; Furuhata, Tsukasa; Yamashita, Satoshi; Matsui, Akira; Mitani, Toshifumi; Hoteya, Shu

    2016-01-01

    Aim. To determine whether 3D endoscopic images improved recognition accuracy for superficial gastrointestinal cancer compared with 2D images. Methods. We created an image catalog using 2D and 3D images of 20 specimens resected by endoscopic submucosal dissection. The twelve participants were allocated into two groups. Group 1 evaluated only 2D images at first, group 2 evaluated 3D images, and, after an interval of 2 weeks, group 1 next evaluated 3D and group 2 evaluated 2D images. The evaluation items were as follows: (1) diagnostic accuracy of the tumor extent and (2) confidence levels in assessing (a) tumor extent, (b) morphology, (c) microsurface structure, and (d) comprehensive recognition. Results. The use of 3D images resulted in an improvement in diagnostic accuracy in both group 1 (2D: 76.9%, 3D: 78.6%) and group 2 (2D: 79.9%, 3D: 83.6%), with no statistically significant difference. The confidence levels were higher for all items ((a) to (d)) when 3D images were used. With respect to experience, the degree of the improvement showed the following trend: novices > trainees > experts. Conclusions. By conversion into 3D images, there was a significant improvement in the diagnostic confidence level for superficial tumors, and the improvement was greater in individuals with lower endoscopic expertise. PMID:27597863


  5. Study of decoder complexity for HEVC and AVC standards based on tool-by-tool comparison

    NASA Astrophysics Data System (ADS)

    Ahn, Y. J.; Han, W. J.; Sim, D. G.

    2012-10-01

    High Efficiency Video Coding (HEVC) is the latest standardization effort of ISO/IEC MPEG and ITU-T VCEG for further improving the coding efficiency of the H.264/AVC standard. It has been reported that HEVC can provide subjective visual quality comparable to H.264/AVC at only half the bit-rate in many cases. In this paper, decoder complexities of HEVC and H.264/AVC are studied to provide initial complexity estimates of the HEVC decoder compared with the H.264/AVC decoder. For this purpose, several selected coding tools, including intra prediction, motion compensation, transform, loop filters, and the entropy coder, have been analyzed in terms of the number of operations as well as their statistical differences.

  6. Improving accuracy and usability of growth charts: case study in Rwanda

    PubMed Central

    Brown, Suzana; McSharry, Patrick

    2016-01-01

    Objectives We evaluate and compare manually collected paper records against electronic records for monitoring the weights of children under the age of 5. Setting Data were collected by 24 community health workers (CHWs) in 2 Rwandan communities, 1 urban and 1 rural. Participants The same CHWs collected paper and electronic records. Paper data contain weight and age for 320 boys and 380 girls. Electronic data contain weight and age for 922 girls and 886 boys. Electronic data were collected over 9 months; most of the data is cross-sectional, with about 330 children with time-series data. Both data sets are compared with the international standard provided by the WHO growth chart. Primary and secondary outcome measures The plan was to collect 2000 individual records for the electronic data set—we finally collected 1878 records. Paper data were collected by the same CHWs, but most data were fragmented and hard to read. We transcribed data only from children for whom we were able to obtain the date of birth, to determine the exact age at the time of measurement. Results Mean absolute error (MAE) and mean absolute percentage error (MAPE) provide a way to quantify the magnitude of the error in using a given model. Comparing a model, log(weight)=a+b log(age), shows that electronic records provide considerable improvements over paper records, with 40% reduction in both performance metrics. Electronic data improve performance over the WHO model by 10% in MAPE and 7% in MAE. Results are statistically significant using the Kolmogorov-Smirnov test at p<0.01. Conclusions This study demonstrates that using modern electronic tools for health data collection is allowing better tracking of health indicators. We have demonstrated that electronic records facilitate development of a country-specific model that is more accurate than the international standard provided by the WHO growth chart. PMID:26817635
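    The model comparison above fits log(weight) = a + b·log(age) by least squares and scores it with MAE and MAPE. A self-contained sketch with hypothetical age/weight values (not the study's records):

```python
import math

def fit_loglog(ages, weights):
    """Least-squares fit of log(weight) = a + b * log(age)."""
    xs = [math.log(t) for t in ages]
    ys = [math.log(w) for w in weights]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept a, slope b

def mae_mape(actual, predicted):
    """Mean absolute error and mean absolute percentage error."""
    errs = [abs(a - p) for a, p in zip(actual, predicted)]
    mae = sum(errs) / len(errs)
    mape = 100 * sum(e / a for e, a in zip(errs, actual)) / len(actual)
    return mae, mape

# Hypothetical ages (months) and weights (kg), illustrative only:
ages = [3, 6, 12, 24, 48]
weights = [5.8, 7.5, 9.5, 12.0, 16.0]
a, b = fit_loglog(ages, weights)
pred = [math.exp(a) * t ** b for t in ages]
mae, mape = mae_mape(weights, pred)
```

    Comparing MAE/MAPE for paper-derived versus electronic data against the same functional form is what yields the 40% error reduction the abstract reports.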

  7. Experimental Tools to Study Molecular Recognition within the Nanoparticle Corona

    PubMed Central

    Landry, Markita P.; Kruss, Sebastian; Nelson, Justin T.; Bisker, Gili; Iverson, Nicole M.; Reuel, Nigel F.; Strano, Michael S.

    2014-01-01

    Advancements in optical nanosensor development have enabled the design of sensors using synthetic molecular recognition elements through a recently developed method called Corona Phase Molecular Recognition (CoPhMoRe). The synthetic sensors resulting from these design principles are highly selective for specific analytes, and demonstrate remarkable stability for use under a variety of conditions. An essential element of nanosensor development hinges on the ability to understand the interface between nanoparticles and the associated corona phase surrounding the nanosensor, an environment outside of the range of traditional characterization tools, such as NMR. This review discusses the need for new strategies and instrumentation to study the nanoparticle corona, operating in both in vitro and in vivo environments. Approaches to instrumentation must have the capacity to concurrently monitor nanosensor operation and the molecular changes in the corona phase. A detailed overview of new tools for the understanding of CoPhMoRe mechanisms is provided for future applications. PMID:25184487

  8. Accuracy of magnetic resonance imaging for measuring maturing cartilage: A phantom study

    PubMed Central

    McKinney, Jennifer R; Sussman, Marshall S; Moineddin, Rahim; Amirabadi, Afsaneh; Rayner, Tammy; Doria, Andrea S

    2016-01-01

    OBJECTIVES: To evaluate the accuracy of magnetic resonance imaging measurements of cartilage tissue-mimicking phantoms and to determine a combination of magnetic resonance imaging parameters to optimize accuracy while minimizing scan time. METHOD: Edge dimensions from 4 rectangular agar phantoms ranging from 10.5 to 14.5 mm in length and 1.25 to 5.5 mm in width were independently measured by two readers using a steel ruler. Coronal T1 spin echo (T1 SE), fast spoiled gradient-recalled echo (FSPGR) and multiplanar gradient-recalled echo (GRE MPGR) sequences were used to obtain phantom images on a 1.5-T scanner. RESULTS: Inter- and intra-reader reliability were high for both direct measurements and for magnetic resonance imaging measurements of phantoms. Statistically significant differences were noted between the mean direct measurements and the mean magnetic resonance imaging measurements for phantom 1 when using a GRE MPGR sequence (512x512 pixels, 1.5-mm slice thickness, 5:49 min scan time), while borderline differences were noted for T1 SE sequences with the following parameters: 320x320 pixels, 1.5-mm slice thickness, 6:11 min scan time; 320x320 pixels, 4-mm slice thickness, 6:11 min scan time; and 512x512 pixels, 1.5-mm slice thickness, 9:48 min scan time. Borderline differences were also noted when using a FSPGR sequence with 512x512 pixels, a 1.5-mm slice thickness and a 3:36 min scan time. CONCLUSIONS: FSPGR sequences, regardless of the magnetic resonance imaging parameter combination used, provided accurate measurements. The GRE MPGR sequence using 512x512 pixels, a 1.5-mm slice thickness and a 5:49 min scan time and, to a lesser degree, all tested T1 SE sequences produced suboptimal accuracy when measuring the widest phantom. PMID:27464298

  9. Surgical accuracy of three-dimensional virtual planning: a pilot study of bimaxillary orthognathic procedures including maxillary segmentation.

    PubMed

    Stokbro, K; Aagaard, E; Torkov, P; Bell, R B; Thygesen, T

    2016-01-01

    This retrospective study evaluated the precision and positional accuracy of different orthognathic procedures following virtual surgical planning in 30 patients. To date, no studies of three-dimensional virtual surgical planning have evaluated the influence of segmentation on positional accuracy and transverse expansion. Furthermore, only a few have evaluated the precision and accuracy of genioplasty in placement of the chin segment. The virtual surgical plan was compared with the postsurgical outcome using three linear and three rotational measurements. The influence of maxillary segmentation was analyzed in both superior and inferior maxillary repositioning. In addition, transverse surgical expansion was compared with the postsurgical expansion obtained. Overall, a high degree of linear accuracy between planned and postsurgical outcomes was found, but with a large standard deviation. Rotational differences showed an increase in pitch, mainly affecting the maxilla. Segmentation had no significant influence on maxillary placement. However, a posterior movement was observed in inferior maxillary repositioning. A lack of transverse expansion was observed in the segmented maxilla independent of the degree of expansion. PMID:26250603

  10. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  11. Agricultural case studies of classification accuracy, spectral resolution, and model over-fitting.

    PubMed

    Nansen, Christian; Geremias, Leandro Delalibera; Xue, Yingen; Huang, Fangneng; Parra, Jose Roberto

    2013-11-01

    This paper describes the relationship between spectral resolution and classification accuracy in analyses of hyperspectral imaging data acquired from crop leaves. The main scope is to discuss and reduce the risk of model over-fitting. Over-fitting of a classification model occurs when too many and/or irrelevant model terms are included (i.e., a large number of spectral bands), and it may lead to low robustness/repeatability when the classification model is applied to independent validation data. We outline a simple way to quantify the level of model over-fitting by comparing the observed classification accuracies with those obtained from explanatory random data. Hyperspectral imaging data were acquired from two crop-insect pest systems: (1) potato psyllid (Bactericera cockerelli) infestations of individual bell pepper plants (Capsicum annuum) with the acquisition of hyperspectral imaging data under controlled-light conditions (data set 1), and (2) sugarcane borer (Diatraea saccharalis) infestations of individual maize plants (Zea mays) with the acquisition of hyperspectral imaging data from the same plants under two markedly different image-acquisition conditions (data sets 2a and b). For each data set, reflectance data were analyzed based on seven spectral resolutions by dividing 160 spectral bands from 405 to 907 nm into 4, 16, 32, 40, 53, 80, or 160 bands. In the two data sets, similar classification results were obtained with spectral resolutions ranging from 3.1 to 12.6 nm. Thus, the size of the initial input data could be reduced fourfold with only a negligible loss of classification accuracy. In the analysis of data set 1, several validation approaches all demonstrated consistently that insect-induced stress could be accurately detected and that therefore there was little indication of model over-fitting. In the analyses of data set 2, inconsistent validation results were obtained and the observed classification accuracy (81.06%) was only a few percentage
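    The over-fitting check described above, comparing observed classification accuracy with accuracy obtained from random data, can be implemented as a permutation baseline: shuffle the class labels and re-score the same predictions. A sketch with hypothetical infestation labels and predictions (not the paper's data):

```python
import random

def permutation_baseline(y_true, y_pred, n_perm=1000, seed=0):
    """Chance-level accuracy via label shuffling. Returns the observed accuracy,
    the mean shuffled accuracy, and a permutation p-value; an observed accuracy
    that does not clearly exceed the baseline suggests model over-fitting."""
    def acc(a, b):
        return sum(x == y for x, y in zip(a, b)) / len(a)
    rng = random.Random(seed)
    observed = acc(y_true, y_pred)
    labels = list(y_true)
    baseline = []
    for _ in range(n_perm):
        rng.shuffle(labels)
        baseline.append(acc(labels, y_pred))
    p = sum(b >= observed for b in baseline) / n_perm
    return observed, sum(baseline) / n_perm, p

# Hypothetical infested/control classifications for 20 plants:
y_true = ["inf"] * 10 + ["ctl"] * 10
y_pred = ["inf"] * 8 + ["ctl"] * 2 + ["ctl"] * 9 + ["inf"]
obs, chance, p = permutation_baseline(y_true, y_pred)
```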

  12. Comparative evaluation of dimensional accuracy of different polyvinyl siloxane putty-wash impression techniques-in vitro study.

    PubMed Central

    Dugal, Ramandeep; Railkar, Bhargavi; Musani, Smita

    2013-01-01

    Background: Dimensional accuracy when making impressions is crucial to the quality of fixed prosthodontic treatment, and the impression technique is a critical factor affecting this accuracy. The purpose of this in vitro study was to compare the dimensional accuracy of the casts obtained from one-step double-mix and two-step double-mix polyvinyl siloxane putty-wash impression techniques using three different spacer thicknesses (0.5mm, 1mm and 1.5mm), in order to determine the impression technique that displays the maximum linear dimensional accuracy. Materials & Methods: A mild steel model with 2 abutment preparations was fabricated, and impressions were made 15 times with each technique. All impressions were made with an addition-reaction silicone impression material (Express, 3M ESPE) and custom-made perforated metal trays. The 1-step putty/light-body impressions were made with simultaneous use of putty and light-body materials. The 2-step putty/light-body impressions were made with 0.5-mm, 1mm and 1.5mm-thick metal-prefabricated spacer caps. The accuracy of the 4 different impression techniques was assessed by measuring 7 dimensions (intra- and inter-abutment) (20-μm accuracy) on stone casts poured from the impressions of the mild steel model. The data were analyzed by one-sample t test. Results: The stone dies obtained with all the techniques had significantly larger or smaller dimensions as compared to those of the mild steel model (P<0.05). The order for highest to lowest deviation from the mild steel model was: single-step putty/light body, 2-step putty/light body with 0.5mm spacer thickness, 2-step putty/light body with 1.5mm spacer thickness, and 2-step putty/light body with 1mm spacer thickness. Significant differences among all of the groups for both absolute dimensions of the stone dies, and their standard deviations from the master model (P<0.05), were noted. Conclusions: The 2-step putty/light-body impression technique with 1mm spacer thickness was
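    The one-sample t test used here compares each set of cast measurements against the known master-model dimension. A sketch with hypothetical inter-abutment distances; the 2.145 critical value is the standard two-tailed t value for df = 14 at α = 0.05:

```python
import math
from statistics import mean, stdev

def one_sample_t(measurements, reference):
    """One-sample t statistic for cast dimensions versus the master-model value."""
    n = len(measurements)
    return (mean(measurements) - reference) * math.sqrt(n) / stdev(measurements)

# Hypothetical inter-abutment distances (mm) from 15 casts vs a 30.00 mm master value:
dims = [30.06, 30.11, 30.04, 30.09, 30.12, 30.08, 30.05, 30.10,
        30.07, 30.09, 30.11, 30.06, 30.08, 30.10, 30.07]
t = one_sample_t(dims, 30.00)
significant = abs(t) > 2.145  # two-tailed t critical value, df = 14, alpha = 0.05
```

    A significant t with a positive sign indicates dies systematically larger than the master model, which is how the abstract's "significantly larger or smaller dimensions" finding is established per group.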

  13. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet lattice approach is taken to compute generalized forces, and a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis, including the analysis of the model in response to a 1-cos gust.

  14. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method, and a rational function approximation is computed using Roger's approximation. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  15. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method, and a rational function approximation is computed using Roger's approximation. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
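
    Once the lag roots are fixed, the Roger rational-function-approximation step described in these abstracts reduces to a linear least-squares fit of tabulated generalized aerodynamic forces Q(ik). A minimal sketch, using a scalar (single-mode) Q and manufactured coefficients rather than the tool's actual code:

    ```python
    import numpy as np

    def roger_fit(ks, Q, lags):
        """Least-squares Roger coefficients [A0, A1, A2, A_lag...] for samples
        Q(ik) at reduced frequencies ks, with fixed lag roots `lags`:
        Q(s) ~ A0 + A1*s + A2*s^2 + sum_j A_j * s/(s + b_j), s = ik."""
        s = 1j * np.asarray(ks)
        cols = [np.ones_like(s), s, s**2] + [s / (s + b) for b in lags]
        M = np.column_stack(cols)
        # Stack real and imaginary parts so the unknown coefficients stay real.
        A = np.vstack([M.real, M.imag])
        y = np.concatenate([Q.real, Q.imag])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    # Manufactured "truth": A0=1.0, A1=0.5, A2=-0.1, one lag term with b=0.3.
    ks = np.linspace(0.01, 1.0, 20)
    s = 1j * ks
    Q = 1.0 + 0.5 * s - 0.1 * s**2 + 0.8 * s / (s + 0.3)
    coef = roger_fit(ks, Q, lags=[0.3])
    print(np.round(coef, 6))   # recovers [1.0, 0.5, -0.1, 0.8]
    ```

    Because the samples lie exactly in the span of the fit columns, the least-squares solution recovers the manufactured coefficients; real generalized-force tables only fit approximately.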

  16. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (EPA/600/R-11/123A). EPA also... Assessment Tool (BASINS CAT) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT),...

  17. The effect of a low radiation CT protocol on accuracy of CT guided implant migration measurement: A cadaver study.

    PubMed

    Boettner, Friedrich; Sculco, Peter K; Lipman, Joseph; Saboeiro, Gregory; Renner, Lisa; Faschingbauer, Martin

    2016-04-01

    The current study compared the impact of low radiation CT protocols on the accuracy, repeatability, and inter- and intra-observer variability of implant migration measurement in total hip arthroplasty. Two total hip replacements were performed in two human cadavers, and six tantalum beads were inserted into the femur as in radiostereometric analysis. Six different 28 mm heads (-3 mm, 0 mm, 2.5 mm, 5.0 mm, 7.5 mm, and 10 mm) were used to simulate five reproducible translations (maximum total point migration) of the center of the head. Three CT scans with varying levels of radiation were performed for each head position. The effective dose was 3.8 mSv for Protocol A (standard protocol), 0.7 mSv for Protocol B, and 1.6 mSv for Protocol C. Implant migration was measured with 3-D analysis software (Geomagic Studio 7). The accuracy was 0.16 mm for CT Protocol A, 0.13 mm for Protocol B, and 0.14 mm for Protocol C; the repeatability was 0.22 mm for Protocol A, 0.18 mm for Protocol B, and 0.20 mm for Protocol C. The ICC was 0.89 for inter-observer reliability and 0.95 for intra-observer reliability. The difference in accuracy between standard Protocol A and the two low radiation protocols (B, C) was less than 0.05 mm. The accuracy and inter- and intra-observer reliability of all three CT protocols are comparable to radiostereometric analysis. Reducing the CT radiation exposure to a level similar to an AP pelvis radiograph (0.7 mSv, Protocol B) does not affect the accuracy of implant migration measurements. PMID:26425921
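
    One simple reading of the bead-based translation measurement above is a maximum-total-point-migration-style summary: the largest 3-D displacement over corresponding bead pairs. A hedged sketch with invented coordinates (the study's Geomagic-based workflow is not reproduced here):

    ```python
    from math import dist

    def max_total_point_motion(beads_before, beads_after):
        """Largest 3-D Euclidean displacement over corresponding bead pairs (mm)."""
        return max(dist(a, b) for a, b in zip(beads_before, beads_after))

    # Invented tantalum-bead coordinates (mm) before and after a head change.
    before = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
    after  = [(0.1, 0.0, 0.0), (10.0, 0.2, 0.0), (0.0, 10.0, 2.5)]
    mtpm = max_total_point_motion(before, after)
    print(round(mtpm, 2))   # 2.5
    ```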

  18. Accuracy Study of the Space-Time CE/SE Method for Computational Aeroacoustics Problems Involving Shock Waves

    NASA Technical Reports Server (NTRS)

    Wang, Xiao Yen; Chang, Sin-Chung; Jorgenson, Philip C. E.

    1999-01-01

    The space-time conservation element and solution element (CE/SE) method is used to study the sound-shock interaction problem, and the order of accuracy of the numerical schemes is investigated. The linear model problem, governed by the 1-D scalar convection equation, the sound-shock interaction problem, governed by the 1-D Euler equations, and the 1-D shock-tube problem, which involves moving shock waves and contact surfaces, are solved to investigate the order of accuracy of the numerical schemes. It is concluded that the accuracy of the CE/SE numerical scheme with designed 2nd-order accuracy becomes 1st order when a moving shock wave exists. However, the absolute error in the CE/SE solution downstream of the shock wave is of the same order as that obtained using a fourth-order accurate essentially nonoscillatory (ENO) scheme. No special techniques are used for either high-frequency low-amplitude waves or shock waves.
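
    The order-of-accuracy finding above rests on the standard grid-refinement estimate: comparing errors on two grids related by a known refinement ratio. A minimal sketch (the error values are invented, not CE/SE results):

    ```python
    from math import log

    def observed_order(e_coarse, e_fine, refinement=2.0):
        """Observed convergence order p from errors on two grids related by a
        uniform refinement ratio h_coarse / h_fine:  e ~ C * h**p."""
        return log(e_coarse / e_fine) / log(refinement)

    # A designed 2nd-order scheme on a smooth problem: halving h quarters the error.
    print(observed_order(4.0e-3, 1.0e-3))   # 2.0
    # The same scheme across a moving shock: halving h only halves the error.
    print(observed_order(4.0e-3, 2.0e-3))   # 1.0
    ```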

  19. Accuracy of Probabilistic Linkage Using the Enhanced Matching System for Public Health and Epidemiological Studies

    PubMed Central

    Aldridge, Robert W.; Shaji, Kunju; Hayward, Andrew C.; Abubakar, Ibrahim

    2015-01-01

    Background The Enhanced Matching System (EMS) is a probabilistic record linkage program developed by the tuberculosis section at Public Health England to match data for individuals across two datasets. This paper outlines how EMS works and investigates its accuracy for linkage across public health datasets. Methods EMS is a configurable Microsoft SQL Server database program. To examine the accuracy of EMS, two public health databases were matched using National Health Service (NHS) numbers as a gold standard unique identifier. Probabilistic linkage was then performed on the same two datasets without inclusion of NHS number. Sensitivity analyses were carried out to examine the effect of varying matching process parameters. Results Exact matching using NHS number between two datasets (containing 5931 and 1759 records) identified 1071 matched pairs. EMS probabilistic linkage identified 1068 record pairs. The sensitivity of probabilistic linkage was calculated as 99.5% (95%CI: 98.9, 99.8), specificity 100.0% (95%CI: 99.9, 100.0), positive predictive value 99.8% (95%CI: 99.3, 100.0), and negative predictive value 99.9% (95%CI: 99.8, 100.0). Probabilistic matching was most accurate when including address variables and using the automatically generated threshold for determining links with manual review. Conclusion With the establishment of national electronic datasets across health and social care, EMS enables previously unanswerable research questions to be tackled with confidence in the accuracy of the linkage process. In scenarios where a small sample is being matched into a very large database (such as national records of hospital attendance) then, compared to results presented in this analysis, the positive predictive value or sensitivity may drop according to the prevalence of matches between databases. 
Despite this possible limitation, probabilistic linkage has great potential to be used where exact matching using a common identifier is not possible, including in
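
    The accuracy metrics quoted above (sensitivity, specificity, PPV, NPV) derive from a 2x2 cross-classification of probabilistic links against the NHS-number gold standard. A minimal sketch; the cell counts below are hypothetical, chosen only to be consistent in spirit with the abstract, not the study's raw data:

    ```python
    def linkage_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion table."""
        return {
            "sensitivity": tp / (tp + fn),   # true links found / all true links
            "specificity": tn / (tn + fp),   # non-links rejected / all non-links
            "ppv": tp / (tp + fp),           # found links that are true
            "npv": tn / (tn + fn),           # rejected pairs that are non-links
        }

    # Hypothetical counts: 1071 gold-standard pairs, 1066 found, 2 false links.
    m = linkage_metrics(tp=1066, fp=2, fn=5, tn=4858)
    print({k: round(v, 4) for k, v in m.items()})
    ```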

  20. Accuracy studies with carbon clusters at the Penning trap mass spectrometer TRIGA-TRAP

    NASA Astrophysics Data System (ADS)

    Ketelaer, J.; Beyer, T.; Blaum, K.; Block, M.; Eberhardt, K.; Eibach, M.; Herfurth, F.; Smorra, C.; Nagy, Sz.

    2010-05-01

    Extensive cross-reference measurements of well-known frequency ratios using various sizes of carbon cluster ions ¹²Cₙ⁺ (10 ≤ n ≤ 23) were performed to determine the effects limiting the accuracy of mass measurements at the Penning-trap facility TRIGA-TRAP. Two major contributions to the uncertainty of a mass measurement have been identified. Fluctuations of the magnetic field cause an uncertainty in the frequency ratio, due to the required calibration by a reference ion, of u(νref)/νref = 6(2) × 10⁻¹¹/min × Δt. A mass-dependent systematic shift of the frequency ratio of ε_m(r)/r = -2.2(2) × 10⁻⁹ × (m - mref)/u has been found as well. Finally, the nuclide ¹⁹⁷Au was used as a cross-check, since its mass is already known with an uncertainty of 0.6 keV.
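
    The two uncertainty contributions quoted above are simple linear models in the calibration gap Δt and in the mass difference. A small arithmetic sketch; the 30-minute gap and the mass pair are invented for illustration:

    ```python
    def field_fluctuation_rel_unc(minutes):
        """Relative frequency-ratio uncertainty from magnetic-field drift:
        u(nu_ref)/nu_ref = 6e-11 per minute of calibration gap."""
        return 6e-11 * minutes

    def mass_dependent_shift(m, m_ref):
        """Relative systematic shift of the frequency ratio:
        eps_m(r)/r = -2.2e-9 per atomic mass unit of (m - m_ref)."""
        return -2.2e-9 * (m - m_ref)

    # Example: a 30 min calibration gap, and a 197 u ion against a 132 u reference.
    print(field_fluctuation_rel_unc(30))
    print(mass_dependent_shift(197.0, 132.0))
    ```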

  1. Comparison of accuracy of anterior and superomedial approaches to shoulder injection: an experimental study

    PubMed Central

    Chernchujit, Bancha; Zonthichai, Nutthapon

    2016-01-01

    Introduction: We aimed to compare the accuracy of the standard anterior technique of shoulder injection and a new superomedial technique modified from the Neviaser arthroscopic portal placement. Intra-articular placement, especially at the long head of biceps (LHB) tendon, and needle depth were evaluated. Methods: Fifty-eight patients (mean age 57 ± 10 years) requiring shoulder arthroscopy in the beach-chair position were recruited. Needle punctures for both techniques were performed by an experienced sports medicine orthopedist. Patients were anesthetized, and the shoulder was placed in the neutral position. A single needle was passed through the skin, with only one redirection allowed per trial. The superomedial technique was performed first, then the anterior technique. Posterior-portal arthroscopy determined whether needle placement was inside the joint. The percentage of intra-articular needle placements for each technique defined accuracy. When inside the joint, the needle's precise location was determined and its depth measured. A marginal χ² test compared results between techniques. Results: The superomedial technique was significantly more accurate than the anterior technique (84% vs. 55%, p < 0.05). For superomedial versus anterior attempts, the LHB tendon was penetrated in 4% vs. 28% of patients, respectively, and the superior labrum in 35% vs. 0% of patients, respectively; the needle depth was 42 ± 7 vs. 32 ± 7 mm, respectively (all p < 0.05). Conclusions: The superomedial technique was more accurate and penetrated the LHB tendon less frequently than the standard anterior technique. A small-diameter needle is needed to minimize superior labral injury, and the superomedial technique requires a longer needle to access the shoulder joint. PMID:27163102

  2. Toward robust deconvolution of pass-through paleomagnetic measurements: new tool to estimate magnetometer sensor response and laser interferometry of sample positioning accuracy

    NASA Astrophysics Data System (ADS)

    Oda, Hirokuni; Xuan, Chuang; Yamamoto, Yuhji

    2016-07-01

    Pass-through superconducting rock magnetometers (SRMs) offer rapid and high-precision remanence measurements for continuous samples that are essential for modern paleomagnetism studies. However, continuous SRM measurements are inevitably smoothed and distorted by the convolution effect of the SRM sensor response. Deconvolution is necessary to restore accurate magnetization from pass-through SRM data, and robust deconvolution requires a reliable estimate of the SRM sensor response as well as an understanding of the uncertainties associated with the SRM measurement system. In this paper, we use the SRM at the Kochi Core Center (KCC), Japan, as an example to introduce a new tool and procedure for accurate and efficient estimation of SRM sensor response. To quantify uncertainties associated with the SRM measurement due to track positioning errors, and to test their effects on deconvolution, we employed laser interferometry for precise monitoring of track positions both with and without a u-channel sample placed on the SRM tray. The acquired KCC SRM sensor response shows a significant cross-term of Z-axis magnetization on the X-axis pick-up coil, and full widths of ~46-54 mm at half-maximum response for the three pick-up coils, which are significantly narrower than those (~73-80 mm) for the liquid-He-free SRM at Oregon State University. Laser interferometry measurements on the KCC SRM tracking system indicate positioning uncertainties of ~0.1-0.2 and ~0.5 mm for tracking with and without a u-channel sample on the tray, respectively. Positioning errors appear to have reproducible components of up to ~0.5 mm, possibly due to patterns or damage on the tray surface or the rope used for the tracking system. Deconvolution of 50,000 simulated measurement data with realistic errors introduced based on the position uncertainties indicates that, although the SRM tracking system has recognizable positioning uncertainties, they do not significantly debilitate the use of deconvolution to accurately restore high

  3. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly on the basis of random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5 nm, it becomes crucial to also include systematic error contributions, which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections, and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1 nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to a metrology inaccuracy of ~10 nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: imaging overlay and DBO (1st-order diffraction-based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than that of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by a recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of the measurement quality metric, results in optimal overlay accuracy.

  4. Tailored selection of study individuals to be sequenced in order to improve the accuracy of genotype imputation.

    PubMed

    Peil, Barbara; Kabisch, Maria; Fischer, Christine; Hamann, Ute; Bermejo, Justo Lorenzo

    2015-02-01

    The addition of sequence data from own-study individuals to genotypes from external data repositories, for example, the HapMap, has been shown to improve the accuracy of imputed genotypes. Early approaches for reference panel selection favored individuals who best reflect recombination patterns in the study population. By contrast, a maximization of genetic diversity in the reference panel has been recently proposed. We investigate here a novel strategy to select individuals for sequencing that relies on the characterization of the ancestral kernel of the study population. The simulated study scenarios consisted of several combinations of subpopulations from HapMap. HapMap individuals who did not belong to the study population constituted an external reference panel which was complemented with the sequences of study individuals selected according to different strategies. In addition to a random choice, individuals with the largest statistical depth according to the first genetic principal components were selected. In all simulated scenarios the integration of sequences from own-study individuals increased imputation accuracy. The selection of individuals based on the statistical depth resulted in the highest imputation accuracy for European and Asian study scenarios, whereas random selection performed best for an African-study scenario. Present findings indicate that there is no universal 'best strategy' to select individuals for sequencing. We propose to use the methodology described in the manuscript to assess the advantage of focusing on the ancestral kernel under own study characteristics (study size, genetic diversity, availability and properties of external reference panels, frequency of imputed variants…). PMID:25537753
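
    The depth-based selection described above can be sketched as follows. Mahalanobis depth on the first genetic principal components stands in here for whatever depth notion the study used, and all PC scores are simulated; this is an illustration of the idea, not the study's method:

    ```python
    import numpy as np

    def deepest(pc_scores, n_select):
        """Indices of the n_select individuals with the largest Mahalanobis depth
        D(x) = 1 / (1 + (x - mean)' S^-1 (x - mean)) on the PC scores."""
        X = np.asarray(pc_scores, dtype=float)
        mu = X.mean(axis=0)
        S_inv = np.linalg.inv(np.cov(X, rowvar=False))
        d2 = np.einsum("ij,jk,ik->i", X - mu, S_inv, X - mu)  # squared Mahalanobis distances
        depth = 1.0 / (1.0 + d2)
        return np.argsort(depth)[::-1][:n_select]

    rng = np.random.default_rng(0)
    scores = rng.normal(size=(100, 2))         # 100 individuals, first 2 PCs (simulated)
    picked = deepest(scores, n_select=10)
    print(sorted(picked.tolist()))             # the 10 most central individuals
    ```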

  5. Accuracy of Continuous Glucose Monitoring During Three Closed-Loop Home Studies Under Free-Living Conditions

    PubMed Central

    Thabit, Hood; Leelarathna, Lalantha; Wilinska, Malgorzata E.; Elleri, Daniella; Allen, Janet M.; Lubina-Solomon, Alexandra; Walkinshaw, Emma; Stadler, Marietta; Choudhary, Pratik; Mader, Julia K.; Dellweg, Sibylle; Benesch, Carsten; Pieber, Thomas R.; Arnolds, Sabine; Heller, Simon R.; Amiel, Stephanie A.; Dunger, David; Evans, Mark L.

    2015-01-01

    Abstract Objectives: Closed-loop (CL) systems modulate insulin delivery based on glucose levels measured by a continuous glucose monitor (CGM). Accuracy of the CGM affects CL performance and safety. We evaluated the accuracy of the Freestyle Navigator® II CGM (Abbott Diabetes Care, Alameda, CA) during three unsupervised, randomized, open-label, crossover home CL studies. Materials and Methods: Paired CGM and capillary glucose values (10,597 pairs) were collected from 57 participants with type 1 diabetes (41 adults [mean±SD age, 39±12 years; mean±SD hemoglobin A1c, 7.9±0.8%] recruited at five centers and 16 adolescents [mean±SD age, 15.6±3.6 years; mean±SD hemoglobin A1c, 8.1±0.8%] recruited at two centers). Numerical accuracy was assessed by absolute relative difference (ARD) and International Organization for Standardization (ISO) 15197:2013 15/15% limits, and clinical accuracy was assessed by Clarke error grid analysis. Results: Total duration of sensor use was 2,002 days (48,052 h). Overall sensor accuracy for the capillary glucose range (1.1–27.8 mmol/L) showed mean±SD and median (interquartile range) ARD of 14.2±15.5% and 10.0% (4.5%, 18.4%), respectively. Lowest mean ARD was observed in the hyperglycemic range (9.8±8.8%). Over 95% of pairs were in combined Clarke error grid Zones A and B (A, 80.1%, B, 16.2%). Overall, 70.0% of the sensor readings satisfied ISO criteria. Mean ARD was consistent (12.3%; 95% of the values fall within ±3.7%) and not different between participants (P=0.06) within the euglycemic and hyperglycemic range, when CL is actively modulating insulin delivery. Conclusions: Consistent accuracy of the CGM within the euglycemic–hyperglycemic range using the Freestyle Navigator II was observed and supports its use in home CL studies. Our results may contribute toward establishing normative CGM performance criteria for unsupervised home use of CL. PMID:26241693
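
    The numerical-accuracy metrics above can be sketched directly. ARD is the absolute relative difference of each sensor/reference pair; for the ISO 15197:2013-style 15/15% criterion, the thresholds of 0.83 mmol/L below 5.55 mmol/L are assumed here from the usual 15 mg/dL / 100 mg/dL limits, and all paired values are invented:

    ```python
    def ard(sensor, reference):
        """Absolute relative difference of one CGM/reference pair, in percent."""
        return abs(sensor - reference) / reference * 100.0

    def meets_iso_15_15(sensor, reference):
        """ISO 15197:2013-style limit (assumed thresholds): within 0.83 mmol/L
        (15 mg/dL) of the reference below 5.55 mmol/L, otherwise within 15%."""
        if reference < 5.55:
            return abs(sensor - reference) <= 0.83
        return ard(sensor, reference) <= 15.0

    # Illustrative paired sensor/reference glucose values (mmol/L).
    pairs = [(6.1, 5.8), (10.4, 12.0), (4.2, 3.2), (15.2, 14.1)]
    ards = [round(ard(s, r), 1) for s, r in pairs]
    hits = [meets_iso_15_15(s, r) for s, r in pairs]
    print(ards, sum(hits), "of", len(pairs), "within ISO 15/15% limits")
    ```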

  6. Manual landmark identification and tracking during the medial rotation test of the shoulder: an accuracy study using three-dimensional ultrasound and motion analysis measures.

    PubMed

    Morrissey, D; Morrissey, M C; Driver, W; King, J B; Woledge, R C

    2008-12-01

    Palpation of movement is a common clinical tool for the assessment of movement in patients with musculoskeletal symptoms. The purpose of this study was to measure the accuracy of palpation of shoulder girdle translation during the medial rotation test (MRT) of the shoulder. The translation of the gleno-humeral and scapulo-thoracic joints was measured using both three-dimensional ultrasound and palpation in order to determine the accuracy of translation tracking during the MRT. Two movements of 11 normal subjects (mean age 24 years (SD=4), range 19-47 years) were measured. The agreement between measures was good for scapulo-thoracic translation (r=0.83). Gleno-humeral translation was systematically underestimated (p=0.03), although a moderate correlation was found (r=0.65). These results indicate that translation of the measured joints can be tracked by palpation, and further tests of the efficacy of palpation tracking during musculoskeletal assessment may be warranted. PMID:18359266

  7. Improving diagnostic accuracy using EHR in emergency departments: A simulation-based study.

    PubMed

    Ben-Assuli, Ofir; Sagi, Doron; Leshno, Moshe; Ironi, Avinoah; Ziv, Amitai

    2015-06-01

    It is widely believed that Electronic Health Records (EHR) improve medical decision-making by enabling medical staff to access medical information stored in the system. It remains unclear, however, whether EHR indeed fulfills this claim under the severe time constraints of Emergency Departments (EDs). We assessed whether accessing EHR in an ED actually improves decision-making by clinicians. A simulated ED environment was created at the Israel Center for Medical Simulation (MSR). Four actors were trained to simulate four specific complaints and behaviors, and they 'consulted' 26 volunteer ED physicians. Each physician treated half of the cases (randomly assigned) with access to EHR, and their medical decisions were compared to those made without access to EHR. Comparison of diagnostic accuracy with and without access showed that accessing the EHR led to an increase in the quality of the clinical decisions. Physicians accessing EHR were better informed and thus made more accurate decisions: the percentage of correct diagnoses was higher, and these physicians were more confident in their diagnoses and made their decisions faster. PMID:25817921

  8. A study of the accuracy of neutrally buoyant bubbles used as flow tracers in air

    NASA Technical Reports Server (NTRS)

    Kerho, Michael F.

    1993-01-01

    Research has been performed to determine the accuracy of neutrally buoyant and near neutrally buoyant bubbles used as flow tracers in air. Theoretical, computational, and experimental results are presented to evaluate the dynamics of bubble trajectories and the factors affecting their ability to trace flow-field streamlines. The equation of motion for a single bubble was obtained and evaluated using a computational scheme to determine the factors which affect a bubble's trajectory. A two-dimensional experiment was also conducted to determine bubble trajectories in the stagnation region of a NACA 0012 airfoil at 0 deg angle of attack using a commercially available helium bubble generation system. Physical properties of the experimental bubble trajectories were estimated using the computational scheme; these properties included the density ratio and diameter of the individual bubbles. The helium bubble system was then used to visualize and document the flow field about a 30 deg swept semispan wing with simulated glaze ice. Results were compared to Navier-Stokes calculations and surface oil flow visualization. The theoretical and computational analyses have shown that neutrally buoyant bubbles will trace even the most complex flow patterns. Experimental analysis revealed that the use of bubbles to trace flow patterns should be limited to qualitative measurements unless care is taken to ensure neutral buoyancy, due to the difficulty of producing neutrally buoyant bubbles.
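
    The trajectory analysis summarized above rests on integrating a bubble equation of motion. A much-simplified, hedged sketch (linear drag only, no added-mass or Basset terms, all numbers invented) illustrates why a density mismatch makes a bubble lag the flow it is meant to trace:

    ```python
    def bubble_velocity(u_flow, rho_ratio, tau, dt=1e-4, t_end=0.01):
        """Velocity of a bubble (density ratio rho_b/rho_f = rho_ratio, drag
        response time tau) released at rest into a stream of speed u_flow,
        integrated with explicit Euler."""
        v = 0.0
        for _ in range(int(round(t_end / dt))):
            # linear drag toward the flow speed, slowed by the excess inertia
            v += dt * (u_flow - v) / (tau * rho_ratio)
        return v

    u = 10.0                                            # m/s free stream
    v_matched = bubble_velocity(u, rho_ratio=1.0, tau=0.005)   # near-neutral bubble
    v_heavy = bubble_velocity(u, rho_ratio=3.0, tau=0.005)     # too-dense bubble
    print(round(v_matched, 2), round(v_heavy, 2))       # the dense bubble lags
    ```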

  9. Accuracy of needle implantation in brachytherapy using a medical AR system: a phantom study

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Firle, Evelyn A.; Schwald, Bernd; Seibert, Helmut; Zogal, Pawel; Roeddiger, Sandra

    2004-05-01

    Brachytherapy is the treatment method of choice for patients with a tumor relapse after radiation therapy with external beams, or with tumors in regions with sensitive surrounding organs-at-risk, e.g., prostate tumors. The standard needle implantation procedure in brachytherapy uses pre-operatively acquired image data displayed as slices on a monitor beneath the operation table. Since this information allows only a rough orientation for the surgeon, the position of the needles has to be verified repeatedly during the intervention. Within the Medarpa project, a transparent display, the core component of a medical Augmented Reality (AR) system, has been developed. There, pre-operatively acquired image data are displayed together with the position of the tracked instrument, allowing navigated implantation of the brachytherapy needles. The surgeon is enabled to see the anatomical information as well as the virtual instrument in front of the operation area; thus, the Medarpa system serves as a "window into the patient". This paper deals with the results of the first clinical trials of the system. Phantoms were used to evaluate the achieved accuracy of needle implantation, by comparing the output of the system (instrument positions relative to the phantom) with the real positions of the needles as measured in a verification CT scan.

  10. Accuracy of the unified approach in maternally influenced traits - illustrated by a simulation study in the honey bee (Apis mellifera)

    PubMed Central

    2013-01-01

    Background The honey bee is an economically important species. With a rapid decline of the honey bee population, it is necessary to implement an improved genetic evaluation methodology. In this study, we investigated the applicability of the unified approach and its impact on the accuracy of estimation of breeding values for maternally influenced traits on a simulated dataset for the honey bee. Due to the limitation to the number of individuals that can be genotyped in a honey bee population, the unified approach can be an efficient strategy to increase the genetic gain and to provide a more accurate estimation of breeding values. We calculated the accuracy of estimated breeding values for two evaluation approaches, the unified approach and the traditional pedigree based approach. We analyzed the effects of different heritabilities as well as genetic correlation between direct and maternal effects on the accuracy of estimation of direct, maternal and overall breeding values (sum of maternal and direct breeding values). The genetic and reproductive biology of the honey bee was accounted for by taking into consideration characteristics such as colony structure, uncertain paternity, overlapping generations and polyandry. In addition, we used a modified numerator relationship matrix and a realistic genome for the honey bee. Results For all values of heritability and correlation, the accuracy of overall estimated breeding values increased significantly with the unified approach. The increase in accuracy was always higher for the case when there was no correlation as compared to the case where a negative correlation existed between maternal and direct effects. Conclusions Our study shows that the unified approach is a useful methodology for genetic evaluation in honey bees, and can contribute immensely to the improvement of traits of apicultural interest such as resistance to Varroa or production and behavioural traits. 
In particular, the study is of great interest for

  11. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study of their academic achievement was carried out with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of that examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated to their respective record. Following this procedure, a map of the performance of each student could be drawn. 
This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to calculate the probability of success or

  12. In vivo diagnostic accuracy of high resolution microendoscopy in differentiating neoplastic from non-neoplastic colorectal polyps: a prospective study

    PubMed Central

    Parikh, Neil; Perl, Daniel; Lee, Michelle H.; Shah, Brijen; Young, Yuki; Chang, Shannon S.; Shukla, Richa; Polydorides, Alexandros D.; Moshier, Erin; Godbold, James; Zhou, Elinor; Mitchaml, Josephine; Richards-Kortum, Rebecca; Anandasabapathy, Sharmila

    2013-01-01

    High-resolution microendoscopy (HRME) is a low-cost, “optical biopsy” technology that allows for subcellular imaging. The purpose of this study was to determine the in vivo diagnostic accuracy of the HRME for the differentiation of neoplastic from non-neoplastic colorectal polyps and compare it to that of high-definition white-light endoscopy (WLE) with histopathology as the gold standard. Three endoscopists prospectively detected a total of 171 polyps from 94 patients that were then imaged by HRME and classified in real-time as neoplastic (adenomatous, cancer) or non-neoplastic (normal, hyperplastic, inflammatory). HRME had a significantly higher accuracy (94%), specificity (95%), and positive predictive value (87%) for the determination of neoplastic colorectal polyps compared to WLE (65%, 39%, and 55%, respectively). When looking at small colorectal polyps (less than 10 mm), HRME continued to significantly outperform WLE in terms of accuracy (95% vs. 64%), specificity (98% vs. 40%) and positive predictive value (92% vs. 55%). These trends continued when evaluating diminutive polyps (less than 5 mm) as HRME's accuracy (95%), specificity (98%), and positive predictive value (93%) were all significantly greater than their WLE counterparts (62%, 41%, and 53%, respectively). In conclusion, this in vivo study demonstrates that HRME can be a very effective modality in the differentiation of neoplastic and non-neoplastic colorectal polyps. A combination of standard white-light colonoscopy for polyp detection and HRME for polyp classification has the potential to truly allow the endoscopist to selectively determine which lesions can be left in situ, which lesions can simply be discarded, and which lesions need formal histopathologic analysis. PMID:24296752
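
The accuracy, specificity, and positive predictive value figures reported above all derive from a 2 × 2 cross-classification of test result against the histopathology gold standard. A minimal sketch of those calculations, using illustrative counts (not the study's data):

```python
# Diagnostic metrics from a 2x2 table of test outcome vs. gold standard.
# The counts below are illustrative only, not taken from the HRME study.
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, accuracy, and PPV as fractions."""
    sensitivity = tp / (tp + fn)   # true positives among diseased
    specificity = tn / (tn + fp)   # true negatives among non-diseased
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    ppv = tp / (tp + fp)           # positive predictive value
    return sensitivity, specificity, accuracy, ppv

sens, spec, acc, ppv = diagnostic_metrics(tp=45, fp=5, fn=3, tn=47)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"accuracy={acc:.2f} PPV={ppv:.2f}")
```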

  13. A new statistical tool for NOAA local climate studies

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Meyers, J. C.; Hollingshead, A.

    2011-12-01

    The National Weather Service (NWS) Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the ability of National Oceanic and Atmospheric Administration (NOAA) NWS field offices to efficiently access, manipulate, and interpret local climate data and to characterize the impacts of climate variability and change. LCAT will enable NOAA staff to conduct regional and local climate studies using state-of-the-art station and gridded reanalysis data and various statistical techniques for climate analysis. The analysis results will be used in climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. LCAT will augment current climate reference materials with information pertinent to the local and regional levels as it applies to diverse variables appropriate to each locality. The main emphasis of LCAT is to enable studies of extreme meteorological and hydrological events such as tornadoes, floods, droughts, and severe storms. LCAT will close a critical gap in NWS local climate services because it will make it possible to address climate variables beyond average temperature and total precipitation. NWS external partners and government agencies will benefit from LCAT outputs that can easily be incorporated into their own analysis and/or delivery systems. To date, we have identified five requirements for local climate studies: (1) local impacts of climate change; (2) local impacts of climate variability; (3) drought studies; (4) attribution of severe meteorological and hydrological events; and (5) climate studies for water resources. The methodologies for the first three requirements will be included in the first-phase implementation of LCAT. 
The local rate of climate change is defined as the slope of the mean trend estimated from an ensemble of three trend techniques: (1) hinge, (2) Optimal Climate Normals (a running mean over optimal time periods), and (3) exponentially
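
The Optimal Climate Normals technique named in this abstract is, at its core, a running mean over recent years. A minimal sketch with illustrative data (the operational OCN method also selects the averaging window length optimally, which is omitted here):

```python
# Running-mean "climate normal" over a fixed window, in the spirit of the
# Optimal Climate Normals technique; window selection is not implemented.
def running_mean(series, window):
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

annual_temps = [14.1, 14.3, 14.0, 14.6, 14.8, 15.0, 14.9]  # illustrative values
print(running_mean(annual_temps, window=3))
```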

  14. Diagnostic Accuracy of Renal Mass Biopsy: An Ex Vivo Study of 100 Nephrectomy Specimens.

    PubMed

    von Rundstedt, Friedrich-Carl; Mata, Douglas Alexander; Kryvenko, Oleksandr N; Roth, Stephan; Degener, Stephan; Dreger, Nici Markus; Goedde, Daniel; Assaid, Ahmed; Kamper, Lars; Haage, Patrick; Stoerkel, Stephan; Lazica, David A

    2016-05-01

    We investigated the diagnostic accuracy of renal mass biopsy in an ex vivo model and compared the agreement of the preoperative radiological diagnosis with the final pathologic diagnosis. Two 18-gauge needle-core and 2 vacuum-needle biopsies were performed ex vivo on the tumors of 100 consecutive patients undergoing radical nephrectomy between 2006 and 2010. The median tumor size was 5.5 cm. There was no significant difference in cylinder length or tissue quality between the sampling methods. At least 1 of 4 needle cores contained diagnostic tissue in 88% of patients. Biopsy specimens identified clear cell (54%), papillary (13%), or chromophobe (5%) renal cell carcinoma; urothelial carcinoma (6%); oncocytoma (5%); liposarcoma (1%); metastatic colorectal carcinoma (1%); squamous cell carcinoma (1%); unclassified renal cell neoplasm (1%); and no tumor sampled (12%). The sensitivity of biopsy for accurately determining the diagnosis was 88% (95% CI: 79% to 93%). The specificity was 100% (95% CI: 17% to 100%). Biopsy grade correlated strongly with final pathology (83.5% agreement). There was no difference in average tumor size between cases with the same versus a higher grade on final pathology (5.87 vs 5.97; P = .87). The radiological appraisal of tumor histology agreed with the pathologic diagnosis in 68% of cases. Provided that a biopsy samples the tumor tissue in a renal mass, pathologic analysis is of great diagnostic value with respect to grade and tumor type and correlates well with excisional pathology. This constitutes strong grounds for the increasing use of renal mass biopsy in patients considering active surveillance or ablation therapy. PMID:26811388
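
Binomial confidence intervals like the 95% CIs quoted above can be computed in several ways; a sketch using the Wilson score interval follows (the study itself may have used a different method, such as the exact Clopper-Pearson interval, so the bounds will not match its figures exactly):

```python
import math

# Wilson score 95% CI for a proportion, e.g. a sensitivity of 88/100.
def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

lo, hi = wilson_ci(88, 100)   # illustrative: 88 diagnostic results out of 100
print(f"95% CI: {lo:.3f} to {hi:.3f}")
```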

  15. Numerical Relativity as a tool for studying the Early Universe

    NASA Astrophysics Data System (ADS)

    Garrison, David

    2013-04-01

    Numerical simulations are becoming a more effective tool for conducting detailed investigations into the evolution of our universe. In this presentation, I show how the framework of numerical relativity can be used for studying cosmological models. We are working to develop a large-scale simulation of the dynamical processes in the early universe. These take into account interactions of dark matter, scalar perturbations, gravitational waves, magnetic fields and a turbulent plasma. The code described in this report is a GRMHD code based on the Cactus framework and is structured to utilize one of several different differencing methods chosen at run-time. It is being developed and tested on the Texas Learning and Computation Center's Xanadu cluster.

  16. Total Diet Studies as a Tool for Ensuring Food Safety

    PubMed Central

    Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung

    2015-01-01

    With the diversification and internationalization of the food industry and the increased focus on health among consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. A total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and to analyze exposure trends. TDSs differ from traditional food monitoring in two major respects: chemicals are analyzed in food in the form in which it will be consumed, and the analysis of composite samples, prepared by processing multiple ingredients together, is cost-effective. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881

  17. Screw Placement Accuracy for Minimally Invasive Transforaminal Lumbar Interbody Fusion Surgery: A Study on 3-D Neuronavigation-Guided Surgery

    PubMed Central

    Torres, Jorge; James, Andrew R.; Alimi, Marjan; Tsiouris, Apostolos John; Geannette, Christian; Härtl, Roger

    2012-01-01

    Purpose: The aim of this study was to assess the impact of 3-D navigation on pedicle screw placement accuracy in minimally invasive transforaminal lumbar interbody fusion (MIS-TLIF). Methods: A retrospective review of 52 patients who underwent MIS-TLIF assisted by 3-D navigation is presented. Clinical outcomes were assessed with the Oswestry Disability Index (ODI), Visual Analog Scales (VAS), and MacNab scores. Radiographic outcomes were assessed using X-rays and thin-slice computed tomography. Results: The mean age was 56.5 years, and 172 screws were implanted with 16 pedicle breaches (91.0% accuracy rate). The radiographic fusion rate at a mean follow-up of 15.6 months was 87.23%. No revision surgeries were required. The mean improvements in VAS back pain, VAS leg pain, and ODI at 11.3 months of follow-up were 4.3, 4.5, and 26.8 points, respectively. At last follow-up, the mean postoperative disc height gain was 4.92 mm and the mean postoperative disc angle gain was 2.79 degrees. At the L5–S1 level, there was a significant correlation between a greater disc space height gain and a lower VAS leg score. Conclusion: Our data support the view that the application of 3-D navigation in MIS-TLIF is associated with a high level of accuracy in pedicle screw placement. PMID:24353961

  18. Comparison of Accuracy of Uncorrected and Corrected Sagittal Tomography in Detection of Mandibular Condyle Erosions: an Ex Vivo Study

    PubMed Central

    Naser, Asieh Zamani; Shirani, Amir Mansour; Hekmatian, Ehsan; Valiani, Ali; Ardestani, Pegah; Vali, Ava

    2010-01-01

    Background: Radiographic examination of the TMJ is indicated when there are clinical signs of pathological conditions, mainly bone changes, that may influence diagnosis and treatment planning. The purpose of this study was to evaluate and compare the validity and diagnostic accuracy of uncorrected and corrected sagittal tomographic images in the detection of simulated mandibular condyle erosions. Methods: Simulated lesions were created in 10 dry mandibles using a dental round bur. Using uncorrected and corrected sagittal tomography techniques, the mandibular condyles were imaged with a Cranex Tome X-ray unit before and after creating the lesions. The uncorrected and corrected tomography images were examined by two independent observers for the absence or presence of a lesion. Accuracy in detecting mandibular condyle lesions was expressed as sensitivity, specificity, and validity values. Differences between the two radiographic modalities were tested with the Wilcoxon test for paired data. Inter-observer agreement was determined by Cohen's kappa. Results: The sensitivity, specificity, and validity were 45%, 85%, and 30%, respectively, in uncorrected sagittal tomographic images, and 70%, 92.5%, and 60%, respectively, in corrected sagittal tomographic images. There was a statistically significant difference between the accuracy of uncorrected and corrected sagittal tomography in the detection of mandibular condyle erosions (P = 0.016). Inter-observer agreement was slight for uncorrected sagittal tomography and moderate for corrected sagittal tomography. Conclusion: The accuracy of corrected sagittal tomography is significantly higher than that of uncorrected sagittal tomography. Therefore, corrected sagittal tomography appears to be the better modality for detecting mandibular condyle erosions. PMID:22013461
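
Cohen's kappa, used above for inter-observer agreement, corrects observed agreement for agreement expected by chance. A minimal sketch with illustrative counts (not the study's data):

```python
# Cohen's kappa for two observers rating presence/absence of a lesion.
# The 2x2 agreement counts below are illustrative, not from the study.
def cohens_kappa(a, b, c, d):
    """a: both say 'lesion'; d: both say 'no lesion'; b, c: discordant cells."""
    n = a + b + c + d
    po = (a + d) / n                         # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)    # chance agreement on 'lesion'
    p_no = ((c + d) / n) * ((b + d) / n)     # chance agreement on 'no lesion'
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

kappa = cohens_kappa(a=20, b=10, c=5, d=25)
print(f"kappa = {kappa:.3f}")   # values near 0.4-0.6 are "moderate" agreement
```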

  19. EMU battery/SMM power tool characterization study

    SciTech Connect

    Palandati, C.

    1982-01-01

    The power tool that will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. The results show that the EMU battery is capable of operating the power tool within a pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  20. A 3-D numerical study of pinhole diffraction to predict the accuracy of EUV point diffraction interferometry

    SciTech Connect

    Goldberg, K. A.; Tejnil, E.; Bokor, J.

    1995-12-01

    A 3-D electromagnetic field simulation is used to model the propagation of extreme ultraviolet (EUV), 13-nm, light through pinholes less than 1500 Å in diameter in a highly absorptive medium. Deviations of the diffracted wavefront phase from an ideal sphere are studied within 0.1 numerical aperture, to predict the accuracy of EUV point diffraction interferometers used in at-wavelength testing of nearly diffraction-limited EUV optical systems. Aberration magnitudes are studied for various 3-D pinhole models, including cylindrical and conical pinhole bores.

  1. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  2. Drosophila tools and assays for the study of human diseases.

    PubMed

    Ugur, Berrak; Chen, Kuchuan; Bellen, Hugo J

    2016-03-01

    Many of the internal organ systems of Drosophila melanogaster are functionally analogous to those in vertebrates, including humans. Although humans and flies differ greatly in terms of their gross morphological and cellular features, many of the molecular mechanisms that govern development and drive cellular and physiological processes are conserved between both organisms. The morphological differences are deceiving and have led researchers to undervalue the study of invertebrate organs in unraveling pathogenic mechanisms of diseases. In this review and accompanying poster, we highlight the physiological and molecular parallels between fly and human organs that validate the use of Drosophila to study the molecular pathogenesis underlying human diseases. We discuss assays that have been developed in flies to study the function of specific genes in the central nervous system, heart, liver and kidney, and provide examples of the use of these assays to address questions related to human diseases. These assays provide us with simple yet powerful tools to study the pathogenic mechanisms associated with human disease-causing genes. PMID:26935102

  3. Drosophila tools and assays for the study of human diseases

    PubMed Central

    Ugur, Berrak; Chen, Kuchuan; Bellen, Hugo J.

    2016-01-01

    ABSTRACT Many of the internal organ systems of Drosophila melanogaster are functionally analogous to those in vertebrates, including humans. Although humans and flies differ greatly in terms of their gross morphological and cellular features, many of the molecular mechanisms that govern development and drive cellular and physiological processes are conserved between both organisms. The morphological differences are deceiving and have led researchers to undervalue the study of invertebrate organs in unraveling pathogenic mechanisms of diseases. In this review and accompanying poster, we highlight the physiological and molecular parallels between fly and human organs that validate the use of Drosophila to study the molecular pathogenesis underlying human diseases. We discuss assays that have been developed in flies to study the function of specific genes in the central nervous system, heart, liver and kidney, and provide examples of the use of these assays to address questions related to human diseases. These assays provide us with simple yet powerful tools to study the pathogenic mechanisms associated with human disease-causing genes. PMID:26935102

  4. Accuracy of tablet splitting: Comparison study between hand splitting and tablet cutter

    PubMed Central

    Habib, Walid A.; Alanizi, Abdulaziz S.; Abdelhamid, Magdi M.; Alanizi, Fars K.

    2013-01-01

    Background: Tablet splitting is often used in pharmacy practice to adjust administered doses. It is also used as a method of reducing medication costs. Objective: To investigate the accuracy of tablet splitting by comparing hand splitting with a tablet cutter for a low-dose drug tablet. Methods: Salbutamol tablets (4 mg) were chosen as the low-dose tablets. A randomly selected, equal number of tablets was split by hand and by tablet cutter, and the remaining tablets were kept whole. Weight variation and drug content were analysed for salbutamol in 0.1 N HCl using a validated spectrophotometric method. The percentage by which each whole tablet's or half-tablet's drug content and weight differed from the sample mean was compared with the USP specification ranges for drug content. The %RSD was also calculated to determine whether the drugs met the USP specification for %RSD. The tablets and half-tablets were scanned using electron microscopy to show any visual differences arising from splitting. Results: 27.5% of samples differed from the sample mean by a percentage that fell outside the USP specification for weight; 15% of those split with the tablet cutter and 25% of those split by hand fell outside the specifications. All whole tablets and half-tablets met the USP specifications for drug content, but the variation in content between the two halves reached 21.3% of total content in the case of hand splitting, versus only 7.13% for the tablet cutter. The %RSDs for drug content and weight met the USP specification for whole salbutamol tablets and for the half-tablets split by tablet cutter. The halves split by hand fell outside the specification for %RSD (drug content = 6.43%, weight = 8.33%). The differences were clearly visible in the electron microscope scans. Conclusion: Drug content variation in half-tablets appeared to be attributable to weight variation occurring during the splitting process. This could have serious clinical consequences for
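
The %RSD (relative standard deviation) used above is the sample standard deviation expressed as a percentage of the mean. A minimal sketch with illustrative half-tablet weights (not the study's data):

```python
import statistics

# %RSD as used in USP-style uniformity checks: 100 * stdev / mean.
def percent_rsd(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

half_tablet_weights = [98.2, 101.5, 95.0, 104.3, 99.1, 102.7]  # illustrative, mg
print(f"%RSD = {percent_rsd(half_tablet_weights):.2f}")
```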

  5. [Arabidopsis thaliana accessions - a tool for biochemical and phylogenetic studies].

    PubMed

    Szymańska, Renata; Gabruk, Michał; Kruk, Jerzy

    2015-01-01

    Arabidopsis thaliana has been used for several decades as a model for biological and plant genetic research. The natural variation of this species is related to its geographical range, which covers different climate zones and habitats. Arabidopsis is able to occupy such a wide area thanks to its stress tolerance and adaptability. Arabidopsis accessions exhibit phenotypic and genotypic variation, which is a result of adaptation to local environmental conditions. During development, plants are subjected to various stress factors. Plants show a spectrum of reactions, processes and phenomena that determine their survival under these adverse conditions. The response of plants to stress involves signal detection and transmission. These reactions differ and depend on the stressor, its intensity, the plant species and its life strategy. It is assumed that populations of the same species from different geographical regions, acclimated to local stress conditions, develop a set of alleles that allows them to grow and reproduce. Therefore, the study of natural variation in the response to abiotic stress among Arabidopsis thaliana accessions makes it possible to find key genes or alleles, and thus the mechanisms by which plants cope with adverse physical and chemical conditions. This paper presents an overview of recent findings, tools and research directions used in the study of natural variation in Arabidopsis thaliana accessions. Additionally, we explain why accessions can be used in phylogenetic analyses and to study the demography and migration of Arabidopsis thaliana. PMID:26281359

  6. Bellis perennis: a useful tool for protein localization studies.

    PubMed

    Jaedicke, Katharina; Rösler, Jutta; Gans, Tanja; Hughes, Jon

    2011-10-01

    Fluorescent fusion proteins together with transient transformation techniques are commonly used to investigate intracellular protein localisation in vivo. Biolistic transfection is reliable, efficient and avoids experimental problems associated with producing and handling fragile protoplasts. Onion epidermis pavement cells are frequently used with this technique, their excellent properties for microscopy resulting from their easy removal from the underlying tissues and large size. They also have advantages over mesophyll cells for fluorescence microscopy, as they are devoid of chloroplasts whose autofluorescence can pose problems. The arrested plastid development is peculiar to epidermal cells, however, and stands in the way of studies on protein targeting to plastids. We have developed a system enabling studies of in vivo protein targeting to organelles including chloroplasts within a photosynthetically active plant cell with excellent optical properties using a transient transformation procedure. We established biolistic transfection in epidermal pavement cells of the lawn daisy (Bellis perennis L., cultivar "Galaxy red") which unusually contain a moderate number of functional chloroplasts. These cells are excellent objects for fluorescence microscopy using current reporters, combining the advantages of the ease of biolistic transfection, the excellent optical properties of a single cell layer and access to chloroplast protein targeting. We demonstrate chloroplast targeting of plastid-localised heme oxygenase, and two further proteins whose localisation was equivocal. We also demonstrate unambiguous targeting to mitochondria, peroxisomes and nuclei. We thus propose that the Bellis system represents a valuable tool for protein localisation studies in living plant cells. PMID:21626148

  7. Drop Your Tools: An Allegory for Organizational Studies.

    ERIC Educational Resources Information Center

    Weick, Karl E.

    1996-01-01

    Ponders the failure of 27 wildland firefighters to drop their tools and outrun an exploding fire, using explanations developed by James D. Thompson, first editor of "Administrative Science Quarterly." Organization scholars are in analogous threatened positions; they too seem to be keeping their heavy tools and falling behind. Reaffirming…

  8. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part II. Statistical Methods of Meta-Analysis.

    PubMed

    Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires the simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, instead of a single outcome. Since sensitivity and specificity are generally inversely correlated and can be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article can serve as a methodological reference for those who perform systematic reviews and meta-analyses of diagnostic test accuracy studies. PMID:26576107
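
Because sensitivities and specificities are proportions, meta-analytic pooling is typically done on the logit scale. A deliberately simplified, fixed-effect sketch follows, for illustration only: the bivariate and HSROC models discussed above additionally model between-study variance and the sensitivity-specificity correlation, which this sketch ignores.

```python
import math

# Fixed-effect inverse-variance pooling of per-study sensitivities on the
# logit scale; illustrative only, not a bivariate/HSROC implementation.
def pool_logit(events, totals):
    """events: true positives per study; totals: diseased per study."""
    num = den = 0.0
    for e, n in zip(events, totals):
        e, f = e + 0.5, (n - e) + 0.5          # continuity correction
        logit = math.log(e / f)
        w = 1.0 / (1.0 / e + 1.0 / f)          # inverse of logit variance
        num += w * logit
        den += w
    pooled = num / den
    return 1.0 / (1.0 + math.exp(-pooled))     # back-transform to a proportion

# Illustrative per-study true-positive counts and diseased totals:
print(f"pooled sensitivity = {pool_logit([45, 30, 80], [50, 40, 90]):.3f}")
```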

  9. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    PubMed

    Storey, Helen L; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275
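
One common way to construct a composite reference standard is an "any-positive" OR rule over the component tests. A minimal sketch with hypothetical test names (the paper's proposed CRS criteria may combine components differently):

```python
# Any-positive composite reference standard (CRS): a case is reference-
# positive if any component test is positive. Test names are hypothetical.
def crs_or_rule(component_results):
    """component_results: dict of test name -> bool (positive/negative)."""
    return any(component_results.values())

case = {"blood_culture": False, "pcr": True, "serology": False}
print("CRS positive" if crs_or_rule(case) else "CRS negative")
```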

  10. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference

    PubMed Central

    Storey, Helen L.; Huang, Ying; Crudder, Chris; Golden, Allison; de los Santos, Tala; Hawkins, Kenneth

    2015-01-01

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates. PMID:26566275

  11. Photogrammetric measurement and visualization of blood vessel branching casting: a tool for quantitative accuracy tests of MR, CT, and DS angiography

    NASA Astrophysics Data System (ADS)

    D'Apuzzo, Nicola

    2000-12-01

    Currently, three different angiographic techniques are used to measure and visualize major blood vessels in the human body: magnetic resonance (MR), computed tomography (CT) and digital subtraction (DS) angiography. Although these imaging systems have already been compared qualitatively, a quantitative assessment is still missing. The goal of this work is to provide a tool enabling a quantitative comparison of the three imaging techniques against an unbiased reference. MR, CT and DS angiographies are first performed on a corpse. Then, a casting of the abdominal aorta and its main branches is prepared, removed from the body and measured with photogrammetric methods. The elongated, thin cast is fixed in a 3D frame with 16 signalized small spheres used for calibration and orientation purposes. Three fixed CCD cameras acquire triplets of images of the casting, which is turned to 8 positions. In order to perform multi-image matching, an artificial random texture is projected onto the object. For each triplet of images, a semi-automated matching process based on least squares matching determines a dense set of corresponding points. Their 3D coordinates are then computed by forward intersection, with a mean standard deviation of about 0.2 mm. The results from the 8 positions are merged into a single 3D point cloud, and a filter is applied to remove noise and redundancy in the overlapping regions. The paper describes the basic design of the system and the measurement methods. Furthermore, some preliminary results are presented.

  12. Psychological autopsy studies as diagnostic tools: are they methodologically flawed?

    PubMed

    Hjelmeland, Heidi; Dieserud, Gudrun; Dyregrov, Kari; Knizek, Birthe L; Leenaars, Antoon A

    2012-08-01

    One of the most established "truths" in suicidology is that almost all (90% or more) of those who kill themselves suffer from one or more mental disorders, and a causal link between the two is implied. Psychological autopsy (PA) studies constitute one main evidence base for this conclusion. However, there has been little reflection on the reliability and validity of this method. For example, psychiatric diagnoses are assigned to people who have died by suicide by interviewing a few of their relatives and/or friends, often many years after the suicide. In this article, we scrutinize PA studies with a particular focus on the diagnostic process and demonstrate that they cannot constitute a valid evidence base for a strong relationship between mental disorders and suicide. We show that most of the questions asked in order to assign a diagnosis are impossible for proxies to answer reliably, and thus one cannot draw valid conclusions from them. Psychological autopsies should therefore now be abandoned as a diagnostic tool. Instead, we recommend qualitative approaches focusing on the understanding of suicide beyond mental disorders, in which narratives from a relatively high number of informants around each suicide are systematically analyzed in terms of the informants' relationships with the deceased. PMID:24563941

  13. Ciliobrevins as tools for studying dynein motor function

    PubMed Central

    Roossien, Douglas H.; Miller, Kyle E.; Gallo, Gianluca

    2015-01-01

    Dyneins are a small class of molecular motors that bind to microtubules and walk toward their minus ends. They are essential for the transport and distribution of organelles, signaling complexes and cytoskeletal elements. In addition, dyneins generate forces on microtubule arrays that power the beating of cilia and flagella, cell division, migration and growth cone motility. Classical approaches to the study of dynein function in axons involve the depletion of dynein, expression of mutant/truncated forms of the motor, or interference with accessory subunits. By necessity, these approaches require prolonged time periods for the expression or manipulation of cellular dynein levels. With the discovery of the ciliobrevins, a class of cell-permeable small molecule inhibitors of dynein, it is now possible to acutely disrupt dynein both globally and locally. In this review, we briefly summarize recent work using ciliobrevins to inhibit dynein and discuss the insights ciliobrevins have provided about dynein function in various cell types, with a focus on neurons. We temper this with a discussion of the need for studies that will elucidate the mechanism of action of ciliobrevins, as well as the need for experiments to further analyze the specificity of ciliobrevins for dynein. Although much remains to be learned about ciliobrevins, these small molecules are proving themselves to be valuable novel tools to assess the cellular functions of dynein. PMID:26217180

  14. Oral Fluency, Accuracy, and Complexity in Formal Instruction and Study Abroad Learning Contexts

    ERIC Educational Resources Information Center

    Mora, Joan C.; Valls-Ferrer, Margalida

    2012-01-01

    This study investigates the differential effects of two learning contexts, formal instruction (FI) at home and a study abroad period (SA), on the oral production skills of advanced-level Catalan-Spanish undergraduate learners of English. Speech samples elicited through an interview at three data collection times over a 2-year period were…

  15. Assessment of the accuracy of ABC/2 variations in traumatic epidural hematoma volume estimation: a retrospective study

    PubMed Central

    Hu, Tingting; Zhang, Zhen

    2016-01-01

    Background. The traumatic epidural hematoma (tEDH) volume is often used to assist in tEDH treatment planning and outcome prediction. ABC/2 is a well-accepted volume estimation method that can be used for tEDH volume estimation. Previous studies have proposed different variations of ABC/2; however, it is unclear which variation will provide a higher accuracy. Given the promising clinical contribution of accurate tEDH volume estimations, we sought to assess the accuracy of several ABC/2 variations in tEDH volume estimation. Methods. The study group comprised 53 patients with tEDH who had undergone non-contrast head computed tomography scans. For each patient, the tEDH volume was automatically estimated by eight ABC/2 variations (four traditional and four newly derived) with an in-house program, and results were compared to those from manual planimetry. Linear regression, the closest value, percentage deviation, and Bland-Altman plot were adopted to comprehensively assess accuracy. Results. Among all ABC/2 variations assessed, the traditional variations y = 0.5 × A1B1C1 (or A2B2C1) and the newly derived variations y = 0.65 × A1B1C1 (or A2B2C1) achieved higher accuracy than the other variations. No significant differences were observed between the estimated volume values generated by these variations and those of planimetry (p > 0.05). Comparatively, the former performed better than the latter in general, with smaller mean percentage deviations (7.28 ± 5.90% and 6.42 ± 5.74% versus 19.12 ± 6.33% and 21.28 ± 6.80%, respectively) and more values closest to planimetry (18/53 and 18/53 versus 2/53 and 0/53, respectively). In addition, deviations of most cases in the former fell below 10% (71.70% and 84.91%, respectively), whereas deviations of most cases in the latter fell in the range of 10–20% or above 20% (90.57% and 96.23%, respectively). Discussion. In the current study, we adopted an automatic approach to assess the accuracy of several ABC/2 variations
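The ABC/2 family of estimators compared above reduces to a single multiplication of the three largest orthogonal hematoma diameters by a coefficient. A minimal sketch of the formula and the percentage-deviation accuracy measure; the function names, diameters, and planimetric reference value here are hypothetical, not taken from the study:

```python
def abc2_volume(a, b, c, coeff=0.5):
    """Estimate hematoma volume from the three largest orthogonal
    diameters A, B, C (cm) as coeff * A * B * C. coeff=0.5 is the
    traditional ABC/2; coeff=0.65 is the newly derived variation
    reported in the abstract."""
    return coeff * a * b * c

def percentage_deviation(estimate, reference):
    """Absolute deviation from the planimetric reference volume, in %."""
    return abs(estimate - reference) / reference * 100.0

# Hypothetical example: diameters 4 x 3 x 2 cm, planimetry 11.5 mL.
v_trad = abc2_volume(4.0, 3.0, 2.0)             # 12.0 mL
v_new = abc2_volume(4.0, 3.0, 2.0, coeff=0.65)  # ~15.6 mL
dev = percentage_deviation(v_trad, 11.5)        # deviation vs. planimetry, %
```

Comparing each coefficient's estimate against planimetry in this way reproduces the per-case percentage-deviation statistic used in the study.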

  16. Preliminary study for improving the VIIRS DNB low light calibration accuracy with ground based active light source

    NASA Astrophysics Data System (ADS)

    Cao, Changyong; Zong, Yuqing; Bai, Yan; Shao, Xi

    2015-09-01

    There is a growing interest in the science and user community in the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) low light detection capabilities at night for quantitative applications such as airglow, geophysical retrievals under lunar illumination, light power estimation, search and rescue, energy use, urban expansion and other human activities. Given the growing interest in the use of the DNB data, a pressing need arises for improving the calibration stability and absolute accuracy of the DNB at low radiances. Currently, the low light calibration accuracy is estimated at a moderate 15%-100%, while the long-term stability has yet to be characterized. This study investigates selected existing night light point sources from Suomi NPP DNB observations and evaluates the feasibility of an SI-traceable night light source at radiance levels near 3 nW·cm-2·sr-1 that could potentially be installed at selected sites for VIIRS DNB calibration/validation. The illumination geometry, surrounding environment, and atmospheric effects are also discussed, and the uncertainties of the ground-based light source are estimated. This study will contribute to the understanding of how the Earth's atmosphere and surface variability contribute to the stability of the DNB measured radiances, and how to separate them from instrument calibration stability. It presents the need for SI-traceable active light sources to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB. Finally, it is hoped that this work will address whether active light sources can be used for detecting environmental changes, such as aerosols.

  17. Accuracy of a real-time continuous glucose monitoring system in children with septic shock: A pilot study

    PubMed Central

    Prabhudesai, Sumant; Kanjani, Amruta; Bhagat, Isha; Ravikumar, Karnam G.; Ramachandran, Bala

    2015-01-01

    Aims: The aim of this prospective, observational study was to determine the accuracy of a real-time continuous glucose monitoring system (CGMS) in children with septic shock. Subjects and Methods: Children aged 30 days to 18 years admitted to the Pediatric Intensive Care Unit with septic shock were included. A real-time CGMS sensor was used to obtain interstitial glucose readings. CGMS readings were compared statistically with simultaneous laboratory blood glucose (BG). Results: Nineteen children were included, and 235 pairs of BG-CGMS readings were obtained. BG and CGMS had a correlation coefficient of 0.61 (P < 0.001) and a median relative absolute difference of 17.29%. On Clarke's error grid analysis, 222 (94.5%) readings were in the clinically acceptable zones (A and B). When BG was < 70, 70–180, and > 180 mg/dL, 44%, 100%, and 76.9% of readings were in zones A and B, respectively (P < 0.001). The accuracy of CGMS was not affected by the presence of edema, acidosis, vasopressors, steroids, or renal replacement therapy. On receiver operating characteristics curve analysis, a CGMS reading <97 mg/dL predicted hypoglycemia (sensitivity 85.2%, specificity 75%, area under the curve [AUC] =0.85). A reading > 141 mg/dL predicted hyperglycemia (sensitivity 84.6%, specificity 89.6%, AUC = 0.87). Conclusion: CGMS provides a fairly accurate estimate of BG in children with septic shock. It is unaffected by a variety of clinical variables. The accuracy over extremes of blood sugar may be a concern. We recommend larger studies to evaluate its use for the early detection of hypoglycemia and hyperglycemia. PMID:26730114
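The median relative absolute difference (RAD) statistic used above to summarize CGMS accuracy against laboratory blood glucose can be sketched as follows; the paired readings are invented for illustration and are not the study data:

```python
import statistics

def median_rad(bg, cgms):
    """Median relative absolute difference (%) between paired
    laboratory blood glucose values (bg) and CGMS readings.
    Inputs are equal-length sequences in mg/dL."""
    rads = [abs(c - b) / b * 100.0 for b, c in zip(bg, cgms)]
    return statistics.median(rads)

# Invented example pairs (mg/dL), not the study data:
bg = [80, 120, 150, 200, 95]
cgms = [70, 110, 160, 230, 90]
rad = median_rad(bg, cgms)
```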

  18. ACCURACY AND PRECISION OF A METHOD TO STUDY KINEMATICS OF THE TEMPOROMANDIBULAR JOINT: COMBINATION OF MOTION DATA AND CT IMAGING

    PubMed Central

    Baltali, Evre; Zhao, Kristin D.; Koff, Matthew F.; Keller, Eugene E.; An, Kai-Nan

    2008-01-01

    The purpose of the study was to test the precision and accuracy of a method used to track selected landmarks during motion of the temporomandibular joint (TMJ). A precision phantom device was constructed and relative motions between two rigid bodies on the phantom device were measured using optoelectronic (OE) and electromagnetic (EM) motion tracking devices. The motion recordings were also combined with a 3D CT image for each type of motion tracking system (EM+CT and OE+CT) to mimic methods used in previous studies. In the OE and EM data collections, specific landmarks on the rigid bodies were determined using digitization. In the EM+CT and OE+CT data sets, the landmark locations were obtained from the CT images. 3D linear distances and 3D curvilinear path distances were calculated for the points. The accuracy and precision for all 4 methods were evaluated (EM, OE, EM+CT and OE+CT). In addition, results were compared with and without the CT imaging (EM vs. EM+CT, OE vs. OE+CT). All systems overestimated the actual 3D curvilinear path lengths. All systems also underestimated the actual rotation values. The accuracy of all methods was within 0.5 mm for 3D curvilinear path calculations, 0.05 mm for 3D linear distance calculations, and 0.2° for rotation calculations. In addition, Bland-Altman plots for each configuration of the systems suggest that measurements obtained from either system are repeatable and comparable. PMID:18617178

  19. CopyRighter: a rapid tool for improving the accuracy of microbial community profiles through lineage-specific gene copy number correction

    PubMed Central

    2014-01-01

    Background Culture-independent molecular surveys targeting conserved marker genes, most notably 16S rRNA, to assess microbial diversity remain semi-quantitative due to variations in the number of gene copies between species. Results Based on 2,900 sequenced reference genomes, we show that 16S rRNA gene copy number (GCN) is strongly linked to microbial phylogenetic taxonomy, potentially under-representing Archaea in amplicon microbial profiles. Using this relationship, we inferred the GCN of all bacterial and archaeal lineages in the Greengenes database within a phylogenetic framework. We created CopyRighter, a new software tool that uses these estimates to correct 16S rRNA amplicon microbial profiles and associated quantitative (q)PCR total abundance. CopyRighter parses microbial profiles and, because GCN estimates are pre-computed for all taxa in the reference taxonomy, rapidly corrects GCN bias. Software validation with in silico and in vitro mock communities indicated that GCN correction results in more accurate estimates of microbial relative abundance and improves the agreement between metagenomic and amplicon profiles. Analyses of human-associated and anaerobic digester microbiomes illustrate that correction makes tangible changes to estimates of qPCR total abundance, α and β diversity, and can significantly change biological interpretation. For example, human gut microbiomes from twins were reclassified into three rather than two enterotypes after GCN correction. Conclusions The CopyRighter bioinformatic tool permits rapid correction of GCN in microbial surveys, resulting in improved estimates of microbial abundance, α and β diversity. PMID:24708850
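The core correction CopyRighter performs, dividing each taxon's amplicon abundance by its estimated gene copy number and renormalizing, can be sketched as below. This is a simplified illustration of the principle, not the tool's actual implementation; the taxon names and GCN values are hypothetical:

```python
def correct_abundances(counts, gcn):
    """Divide each taxon's amplicon read count by its estimated 16S
    rRNA gene copy number (GCN), then renormalize so the relative
    abundances sum to 1. GCN values are assumed precomputed per
    taxon, as CopyRighter does from its reference taxonomy."""
    corrected = {taxon: counts[taxon] / gcn[taxon] for taxon in counts}
    total = sum(corrected.values())
    return {taxon: value / total for taxon, value in corrected.items()}

# Hypothetical two-taxon community: A carries 4 rRNA operons, B carries 1.
counts = {"A": 400, "B": 100}  # raw amplicon reads
gcn = {"A": 4.0, "B": 1.0}
rel = correct_abundances(counts, gcn)
# After correction both taxa have equal estimated cell-level abundance.
```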

  20. A crowdsourcing model for creating preclinical medical education study tools.

    PubMed

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning. PMID:23619061

  1. Computer vision as a tool to study plant development.

    PubMed

    Spalding, Edgar P

    2009-01-01

    Morphological phenotypes due to mutations frequently provide key information about the biological function of the affected genes. This has long been true of the plant Arabidopsis thaliana, though phenotypes are known for only a minority of this model organism's approximately 25,000 genes. One common explanation for lack of phenotype in a given mutant is that a genetic redundancy masks the effect of the missing gene. Another possibility is that a phenotype escaped detection or manifests itself only in a certain unexamined condition. Addressing this potentially nettlesome alternative requires the development of more sophisticated tools for studying morphological development. Computer vision is a technical field that holds much promise in this regard. This chapter explains in general terms how computer algorithms can extract quantitative information from images of plant structures undergoing development. Automation is a central feature of a successful computer vision application as it enables more conditions and more dependencies to be characterized. This in turn expands the concept of phenotype into a point set in multidimensional condition space. New ways of measuring and thinking about phenotypes, and therefore the functions of genes, are expected to result from expanding the role of computer vision in plant biology. PMID:19588113

  2. ent-Steroids: Novel Tools for Studies of Signaling Pathways

    PubMed Central

    Covey, Douglas F.

    2008-01-01

    Membrane receptors are often modulated by steroids and it is necessary to distinguish the effects of steroids at these receptors from effects occurring at nuclear receptors. Additionally, it may also be mechanistically important to distinguish between direct effects caused by binding of steroids to membrane receptors and indirect effects on membrane receptor function caused by steroid perturbation of the membrane containing the receptor. In this regard, ent-steroids, the mirror images of naturally occurring steroids, are novel tools for distinguishing between these various actions of steroids. The review provides a background for understanding the different actions that can be expected of steroids and ent-steroids in biological systems, references for the preparation of ent-steroids, a short discussion about relevant forms of stereoisomerism and the requirements that need to be fulfilled for the interaction between two molecules to be enantioselective. The review then summarizes results of biophysical, biochemical and pharmacological studies published since 1992 in which ent-steroids have been used to investigate the actions of steroids in membranes and/or receptor-mediated signaling pathways. PMID:19103212

  3. A Longitudinal Study of Novice-Level Changes in Fluency and Accuracy in Student Monologues

    ERIC Educational Resources Information Center

    Long, Robert W., III.

    2012-01-01

    Detailed research concerning fluency, specifically relating to pauses, mean length of runs, and fluency rates in Japanese EFL learners, is limited. Furthermore, the issue of tracking fluency gains has often been ignored, misunderstood or minimized in EFL educational research. The present study, which is based on six monologues conducted…

  4. A Longitudinal Study of Complexity, Accuracy and Fluency Variation in Second Language Development

    ERIC Educational Resources Information Center

    Ferraris, Stefania

    2012-01-01

    This chapter presents the results of a study on interlanguage variation. The production of four L2 learners of Italian, tested four times at yearly intervals while engaged in four oral tasks, is compared to that of two native speakers, and analysed with quantitative CAF measures. Thus, time, task type, nativeness, as well as group vs. individual…

  5. Improved accuracy of markerless motion tracking on bone suppression images: preliminary study for image-guided radiation therapy (IGRT)

    NASA Astrophysics Data System (ADS)

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-05-01

    The bone suppression technique based on advanced image processing can suppress the conspicuity of bones on chest radiographs, creating soft tissue images similar to those obtained by the dual-energy subtraction technique. This study was performed to evaluate the usefulness of bone suppression image processing in image-guided radiation therapy. We demonstrated the improved accuracy of markerless motion tracking on bone suppression images. Chest fluoroscopic images of nine patients with lung nodules during respiration were obtained using a flat-panel detector system (120 kV, 0.1 mAs/pulse, 5 fps). Commercial bone suppression image processing software was applied to the fluoroscopic images to create corresponding bone suppression images. Regions of interest were manually located on lung nodules and automatic target tracking was conducted based on the template matching technique. To evaluate the accuracy of target tracking, the maximum tracking error in the resulting images was compared with that of conventional fluoroscopic images. The tracking errors were reduced by half in eight of nine cases. The average maximum tracking errors in bone suppression and conventional fluoroscopic images were 1.3 ± 1.0 and 3.3 ± 3.3 mm, respectively. The bone suppression technique was especially effective in the lower lung area where pulmonary vessels, bronchi, and ribs showed complex movements. The bone suppression technique improved tracking accuracy without special equipment or implantation of fiducial markers, and with only a small additional dose to the patient. Bone suppression fluoroscopy is a potential means of measuring respiratory displacement of the target. This paper was presented at RSNA 2013 and was carried out at Kanazawa University, JAPAN.
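The automatic target tracking above rests on template matching. A brute-force normalized cross-correlation sketch is shown below; it is an illustration of the technique only (not the commercial software used in the study), assumes grayscale NumPy arrays, and omits the search windows and subpixel refinement a real IGRT pipeline would add:

```python
import numpy as np

def match_template(frame, template):
    """Exhaustive normalized cross-correlation template matching.
    Returns the (row, col) of the best-matching patch in `frame`."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = (t * t).sum()
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * t_norm)
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic frame: embed a random 4x4 "nodule" and recover its position.
rng = np.random.default_rng(0)
nodule = rng.random((4, 4))
frame = np.zeros((30, 30))
frame[10:14, 7:11] = nodule
```

Tracking a nodule across fluoroscopic frames amounts to repeating this match per frame and reporting the displacement of the best-match position.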

  6. Accuracy of Family History of Hemochromatosis or Iron Overload: The Hemochromatosis and Iron Overload Screening Study

    PubMed Central

    Acton, Ronald T.; Barton, James C.; Passmore, Leah V.; Adams, Paul C.; Mclaren, Gordon D.; Leiendecker–Foster, Catherine; Speechley, Mark R.; Harris, Emily L.; Castro, Oswaldo; Reiss, Jacob A.; Snively, Beverly M.; Harrison, Barbara W.; Mclaren, Christine E.

    2013-01-01

    Background & Aims The aim of this study was to assess the analytic validity of self-reported family history of hemochromatosis or iron overload. Methods A total of 141 probands, 549 family members, and 641 controls participated in the primary care Hemochromatosis and Iron Overload Screening Study. Participants received a postscreening clinical examination and completed questionnaires about personal and family histories of hemochromatosis or iron overload, arthritis, diabetes, liver disease, and heart disease. We evaluated sensitivities and specificities of proband-reported family history, and concordance of HFE genotype C282Y/C282Y in probands and siblings who reported having hemochromatosis or iron overload. Results The sensitivities of proband-reported family history ranged from 81.4% for hemochromatosis or iron overload to 18.4% for liver disease; specificities for diabetes, liver disease, and heart disease were greater than 94%. Hemochromatosis or iron overload was associated with a positive family history across all racial/ethnic groups in the study (odds ratio, 14.53; 95% confidence intervals, 7.41–28.49; P < .0001) and among Caucasians (odds ratio, 16.98; 95% confidence intervals, 7.53–38.32; P < .0001). There was 100% concordance of HFE genotype C282Y/C282Y in 6 probands and 8 of their siblings who reported having hemochromatosis or iron overload. Conclusions Self-reported family history of hemochromatosis or iron overload can be used to identify individuals whose risk of hemochromatosis or iron overload and associated conditions is increased. These individuals could benefit from further evaluation with iron phenotyping and HFE mutation analysis. PMID:18585964

  7. Improving Mars-GRAM: Increasing the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    Extensively utilized for numerous mission applications, the Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model. In a Monte-Carlo mode, Mars-GRAM's perturbation modeling capability is used to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). Mars-GRAM was found to be inexact when used for sensitivity studies during the Mars Science Laboratory (MSL) site selection process for MapYear=0 and large optical depth values such as tau=3. Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM) from the surface to 80 km altitude. Mars-GRAM with the MapYear parameter set to 0 utilizes results from a MGCM run with a fixed value of tau=3 at all locations for the entire year. Imprecise atmospheric density and pressure at all altitudes is a consequence of this use of MGCM with tau=3. Density factor values have been determined for tau=0.3, 1 and 3 as a preliminary fix to this pressure-density problem. These factors adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. These density factors are fixed values for all latitudes and Ls and are included in Mars-GRAM Release 1.3. Work currently being done, to derive better multipliers by including variations with latitude and/or Ls by comparison of MapYear 0 output directly against TES limb data, will be highlighted in the presentation. The TES limb data utilized in this process has been validated by a comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS). This comparison study was undertaken for locations on Mars of varying latitudes, Ls, and LTST. The more precise density factors will be included in Mars-GRAM 2005 Release 1.4 and thus improve the results of future sensitivity studies done for large

  8. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2010-01-01

    The poster provides an overview of techniques to improve the Mars Global Reference Atmospheric Model (Mars-GRAM) sensitivity. It has been discovered during the Mars Science Laboratory (MSL) site selection process that the Mars Global Reference Atmospheric Model (Mars-GRAM) when used for sensitivity studies for TES MapYear = 0 and large optical depth values such as tau = 3 is less than realistic. A preliminary fix has been made to Mars-GRAM by adding a density factor value that was determined for tau = 0.3, 1 and 3.

  9. Effect of considering the initial parameters on accuracy of experimental studies conclusions

    NASA Astrophysics Data System (ADS)

    Zagulova, D.; Nesterenko, A.; Kapilevich, L.; Popova, J.

    2015-11-01

    The present paper provides evidence of the need to take the initial level of physiological parameters into account when conducting biomedical research, as exemplified by certain indicators of the cardiorespiratory system. The analysis is based on data obtained from multiple surveys of medical and pharmaceutical college students. A negative correlation was revealed between the changes in the studied cardiorespiratory parameters on repeated measurement and their initial level. It is assumed that this dependence of the changes in physiological parameters on the baseline may be caused by the biorhythmic changes inherent in all body systems.

  10. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.; Badger, Andrew M.

    2009-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). It has been discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3 is less than realistic. A comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS) has been undertaken for locations of varying latitudes, Ls, and LTST on Mars. The preliminary results from this study have validated the Thermal Emission Spectrometer (TES) limb data. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear=0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. Unrealistic energy absorption by uniform atmospheric dust leads to an unrealistic thermal energy balance on the polar caps. The outcome is an inaccurate cycle of condensation/sublimation of the polar caps and, as a consequence, an inaccurate cycle of total atmospheric mass and global-average surface pressure. Under an assumption of unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure would produce a corresponding percentage change in density at all altitudes. Consequently, the final result of a change in surface pressure is an imprecise atmospheric density at all altitudes.
To solve this pressure-density problem, a density factor value was determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear=0 with MapYears 1 and 2 MGCM output
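The density-factor fix described above amounts to scaling the MGCM MapYear 0 pressure and density by a tau-dependent multiplier. A sketch with purely illustrative factor values (not the values shipped in Mars-GRAM Release 1.3):

```python
def apply_density_factor(density, pressure, factor):
    """Scale MGCM MapYear 0 density and pressure by a dust-dependent
    density factor. Under the stated assumption of an unchanged
    temperature profile and hydrostatic equilibrium, the same
    fractional change applies to both quantities."""
    return density * factor, pressure * factor

# Illustrative factors keyed by optical depth tau (hypothetical values,
# NOT those released in Mars-GRAM 1.3):
density_factor = {0.3: 1.00, 1.0: 0.95, 3.0: 0.80}

# Example surface-level inputs: density in kg/m^3, pressure in Pa.
rho_adj, p_adj = apply_density_factor(0.015, 610.0, density_factor[3.0])
```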

  11. Strategies to Improve the Accuracy of Mars-GRAM Sensitivity Studies at Large Optical Depths

    NASA Astrophysics Data System (ADS)

    Justh, H. L.; Justus, C. G.; Badger, A. M.

    2009-12-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM’s perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). It has been discovered during the Mars Science Laboratory (MSL) site selection process that Mars-GRAM when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3 is less than realistic. A comparison study between Mars atmospheric density estimates from Mars-GRAM and measurements by Mars Global Surveyor (MGS) has been undertaken for locations of varying latitudes, Ls, and LTST on Mars. The preliminary results from this study have validated the Thermal Emission Spectrometer (TES) limb data. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear=0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. Unrealistic energy absorption by uniform atmospheric dust leads to an unrealistic thermal energy balance on the polar caps. The outcome is an inaccurate cycle of condensation/sublimation of the polar caps and, as a consequence, an inaccurate cycle of total atmospheric mass and global-average surface pressure. Under an assumption of unchanged temperature profile and hydrostatic equilibrium, a given percentage change in surface pressure would produce a corresponding percentage change in density at all altitudes. Consequently, the final result of a change in surface pressure is an imprecise atmospheric density at all altitudes. 
To solve this pressure-density problem, a density factor value was determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with MapYears 1 and 2 MGCM output

  12. Pilot Study to Determine Accuracy of Posterior Approach Ultrasound for Shoulder Dislocation by Novice Sonographers

    PubMed Central

    Lahham, Shadi; Becker, Brent; Chiem, Alan; Joseph, Linda M.; Anderson, Craig L.; Wilson, Sean P.; Subeh, Mohammad; Trinh, Alex; Viquez, Eric; Fox, John C.

    2016-01-01

    Introduction The goal of this study was to investigate the efficacy of diagnosing shoulder dislocation using a single-view, posterior approach point-of-care ultrasound (POCUS) performed by undergraduate research students, and to establish the range of measured distance that discriminates dislocated shoulder from normal. Methods We enrolled a prospective, convenience sample of adult patients presenting to the emergency department with acute shoulder pain following injury. Patients underwent ultrasonographic evaluation of possible shoulder dislocation comprising a single transverse view of the posterior shoulder and assessment of the relative positioning of the glenoid fossa and the humeral head. The sonographic measurement of the distance between these two anatomic structures was termed the Glenohumeral Separation Distance (GhSD). A positive GhSD represented a posterior position of the glenoid rim relative to the humeral head and a negative GhSD value represented an anterior position of the glenoid rim relative to the humeral head. We compared ultrasound (US) findings to conventional radiography to determine the optimum GhSD cutoff for the diagnosis of shoulder dislocation. Sensitivity, specificity, positive predictive value, and negative predictive value of the derived US method were calculated. Results A total of 84 patients were enrolled and 19 (22.6%) demonstrated shoulder dislocation on conventional radiography, all of which were anterior. All confirmed dislocations had a negative measurement of the GhSD, while all patients with normal anatomic position had GhSD>0. This value represents an optimum GhSD cutoff of 0 for the diagnosis of (anterior) shoulder dislocation. This method demonstrated a sensitivity of 100% (95% CI [82.4–100]), specificity of 100% (95% CI [94.5–100]), positive predictive value of 100% (95% CI [82.4–100]), and negative predictive value of 100% (95% CI [94.5–100]). Conclusion Our study suggests that a single, posterior
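The derived diagnostic rule above is simply the sign of the GhSD measurement: negative flags an anterior dislocation. A sketch of the cutoff together with the sensitivity/specificity computation the study reports; the measurements and reference labels below are invented for illustration:

```python
def classify_dislocation(ghsd_mm):
    """Apply the sign-based cutoff described in the abstract: a
    negative GlenoHumeral Separation Distance (glenoid rim anterior
    to the humeral head) flags an anterior shoulder dislocation."""
    return ghsd_mm < 0

def diagnostic_metrics(predictions, truth):
    """Sensitivity and specificity from paired boolean lists, where
    `truth` is the radiographic reference standard."""
    tp = sum(p and t for p, t in zip(predictions, truth))
    tn = sum((not p) and (not t) for p, t in zip(predictions, truth))
    fp = sum(p and (not t) for p, t in zip(predictions, truth))
    fn = sum((not p) and t for p, t in zip(predictions, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical GhSD measurements (mm) and radiographic truth:
ghsd = [-5.2, 3.1, 4.0, -2.7, 6.8]
truth = [True, False, False, True, False]  # dislocated per radiography?
preds = [classify_dislocation(g) for g in ghsd]
sens, spec = diagnostic_metrics(preds, truth)
```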

  13. A pre–postintervention study to evaluate the impact of dose calculators on the accuracy of gentamicin and vancomycin initial doses

    PubMed Central

    Hamad, Anas; Cavell, Gillian; Hinton, James; Wade, Paul; Whittlesea, Cate

    2015-01-01

    Objectives Gentamicin and vancomycin are narrow-therapeutic-index antibiotics with potential for high toxicity requiring dose individualisation and continuous monitoring. Clinical decision support (CDS) tools have been effective in reducing gentamicin and vancomycin dosing errors. Online dose calculators for these drugs were implemented in a London National Health Service hospital. This study aimed to evaluate the impact of these calculators on the accuracy of gentamicin and vancomycin initial doses. Methods The study used a pre–postintervention design. Data were collected using electronic patient records and paper notes. Random samples of gentamicin and vancomycin initial doses administered during the 8 months before implementation of the calculators were assessed retrospectively against hospital guidelines. Following implementation of the calculators, doses were assessed prospectively. Any gentamicin dose not within ±10% and any vancomycin dose not within ±20% of the guideline-recommended dose were considered incorrect. Results The intranet calculator pages were visited 721 times (gentamicin=333; vancomycin=388) during the 2-month period following the calculators’ implementation. Gentamicin dose errors fell from 61.5% (120/195) to 44.2% (95/215), p<0.001. Incorrect vancomycin loading doses fell from 58.1% (90/155) to 32.4% (46/142), p<0.001. Incorrect vancomycin first maintenance doses fell from 55.5% (86/155) to 33.1% (47/142), p<0.001. Loading and first maintenance vancomycin doses were both incorrect in 37.4% (58/155) of patients before and 13.4% (19/142) after calculator implementation, p<0.001. Conclusions This study suggests that gentamicin and vancomycin dose calculators significantly improved the prescribing of initial doses of these agents. Therefore, healthcare organisations should consider using such CDS tools to support the prescribing of these high-risk drugs. PMID:26044758
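The study's error criterion (a dose counted incorrect when outside ±10% of the guideline-recommended dose for gentamicin, or ±20% for vancomycin) reduces to a simple tolerance check. A minimal sketch; the doses below are made-up illustrations, not study data:

```python
def dose_within_tolerance(administered_mg, recommended_mg, tolerance):
    """True if the administered dose falls within +/- tolerance of the guideline dose."""
    return abs(administered_mg - recommended_mg) <= tolerance * recommended_mg

# Gentamicin doses were judged against a +/-10% band, vancomycin against +/-20%.
print(dose_within_tolerance(460, 500, 0.10))   # True: 8% below recommended
print(dose_within_tolerance(440, 500, 0.10))   # False: 12% below recommended
print(dose_within_tolerance(850, 1000, 0.20))  # True under vancomycin's wider band
```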

  14. Accuracy of ED Bedside Ultrasound for Identification of Gallstones: Retrospective Analysis of 575 Studies

    PubMed Central

    Scruggs, William; Fox, J. Christian; Potts, Brian; Zlidenny, Alexander; McDonough, JoAnne; Anderson, Craig L.; Larson, Jarrod; Barajas, Graciela; Langdorf, Mark I.

    2008-01-01

    Study Objective To determine the ability of emergency department (ED) physicians to diagnose cholelithiasis with bedside ultrasound. Methods ED gallbladder ultrasounds recorded over 37 months were compared to radiology ultrasound interpretation. Results Of 1,690 ED gallbladder ultrasound scans performed during this period, radiology ultrasound was performed in 575/1690 (34%) cases. ED physician bedside interpretation was 88% sensitive [95% CI, 84–91] and 87% specific [95% CI, 82–91], while positive predictive value (PPV) was 91% [88–94%] and negative predictive value (NPV) was 83% [78–87%], using radiology interpretation as the criterion reference. Conclusion ED physician ultrasound of the gallbladder for cholelithiasis is both sensitive and specific. PMID:19561694
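The four accuracy measures above come from a standard 2×2 table against the radiology reference. A minimal sketch; the counts below are an illustrative reconstruction chosen to total 575 and reproduce the reported rounded percentages, not the paper's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # Standard 2x2 test-accuracy measures against a reference standard.
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts (NOT the study's raw data) consistent with 88%/87%/91%/83%:
m = diagnostic_metrics(tp=300, fp=31, fn=41, tn=203)
print({k: round(100 * v) for k, v in m.items()})
```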

  15. The Diagnostic Value of Surface Markers in Acute Appendicitis; A Diagnostic Accuracy Study

    PubMed Central

    Gholi Mezerji, Naser Mohammad; Rafeie, Mohammad; Shayan, Zahra; Mosayebi, Ghasem

    2015-01-01

    Objective: To determine the diagnostic value of blood cell surface markers in patients with acute appendicitis. Methods: In this cross-sectional study, 71 patients who underwent appendectomy following a diagnosis of appendicitis were recruited during a one-year period. The patients were divided into two groups: patients with histopathologically confirmed acute appendicitis and subjects with normal appendix. Blood cell surface markers of all patients were measured. Univariate and multivariate analytical methods were applied to identify the most useful markers. Receiver operating characteristics (ROC) curves were also used to find the best cut-off point, sensitivity, and specificity. Results: Overall we included 71 patients with mean age of 22.6±10.7 years. Of the 71 cases, 45 (63.4%) had acute appendicitis while 26 (36.6%) were normal. There was no significant difference between the two study groups regarding age (p=0.151) and sex (p=0.142). The initial WBC count was significantly higher in those with acute appendicitis (p=0.033). Maximum and minimum area under the ROC curve in univariate analysis was reported for CD3/RA (0.71) and CD38 (0.533), respectively. Multivariate regression models revealed the percentage of accurate diagnoses based on the combination of γ/δ TCR, CD3/RO, and CD3/RA markers to be 74.65%. Maximum area under the ROC curve (0.79) was also obtained for the same combination. Conclusion: The best blood cell surface markers in the prediction of acute appendicitis were HLA-DR+CD19, α/β TCR, and CD3/RA. The simultaneous use of γ/δ TCR, CD3/RA, and CD3/RO showed the highest diagnostic value in acute appendicitis. PMID:27162905
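A ROC-based "best cut-off point" of the kind reported above is commonly chosen by maximizing Youden's J (sensitivity + specificity − 1). A minimal sketch on synthetic marker values, assuming higher values indicate disease; the study's data are not reproduced here:

```python
def best_cutoff_youden(pos_values, neg_values, thresholds):
    # Pick the threshold maximizing Youden's J = sensitivity + specificity - 1,
    # calling a value "positive" when it is >= the threshold.
    best_t, best_j = None, -1.0
    for t in thresholds:
        sens = sum(v >= t for v in pos_values) / len(pos_values)
        spec = sum(v < t for v in neg_values) / len(neg_values)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Synthetic example: marker levels in a diseased vs. a healthy group.
pos = [3, 4, 5, 6]
neg = [1, 2, 3, 4]
print(best_cutoff_youden(pos, neg, thresholds=range(1, 7)))  # (3, 0.5)
```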

  16. Updating Mars-GRAM to Increase the Accuracy of Sensitivity Studies at Large Optical Depths

    NASA Technical Reports Server (NTRS)

    Justh, Hiliary L.; Justus, C. G.; Badger, Andrew M.

    2010-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). During the Mars Science Laboratory (MSL) site selection process, it was discovered that Mars-GRAM, when used for sensitivity studies for MapYear=0 and large optical depth values such as tau=3, is less than realistic. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). MGCM results that were used for Mars-GRAM with MapYear set to 0 were from a MGCM run with a fixed value of tau=3 for the entire year at all locations. This has resulted in an imprecise atmospheric density at all altitudes. As a preliminary fix to this pressure-density problem, density factor values were determined for tau=0.3, 1 and 3 that will adjust the input values of MGCM MapYear 0 pressure and density to achieve a better match of Mars-GRAM MapYear 0 with Thermal Emission Spectrometer (TES) observations for MapYears 1 and 2 at comparable dust loading. Currently, these density factors are fixed values for all latitudes and Ls. Results will be presented from work being done to derive better multipliers by including variation with latitude and/or Ls by comparison of MapYear 0 output directly against TES limb data. The addition of these more precise density factors to Mars-GRAM 2005 Release 1.4 will improve the results of the sensitivity studies done for large optical depths.

  17. Associations between visual perception accuracy and confidence in a dopaminergic manipulation study.

    PubMed

    Andreou, Christina; Bozikas, Vasilis P; Luedtke, Thies; Moritz, Steffen

    2015-01-01

    Delusions are defined as fixed erroneous beliefs that are based on misinterpretation of events or perception, and cannot be corrected by argumentation to the opposite. Cognitive theories of delusions regard this symptom as resulting from specific distorted thinking styles that lead to biased integration and interpretation of perceived stimuli (i.e., reasoning biases). In previous studies, we were able to show that one of these reasoning biases, overconfidence in errors, can be modulated by drugs that act on the dopamine system, a major neurotransmitter system implicated in the pathogenesis of delusions and other psychotic symptoms. Another processing domain suggested to involve the dopamine system and to be abnormal in psychotic disorders is sensory perception. The present study aimed to investigate whether (lower-order) sensory perception and (higher-order) overconfidence in errors are similarly affected by dopaminergic modulation in healthy subjects. Thirty-four healthy individuals were assessed upon administration of l-dopa, placebo, or haloperidol within a randomized, double-blind, cross-over design. Variables of interest were hits and false alarms in an illusory perception paradigm requiring speeded detection of pictures over a noisy background, and subjective confidence ratings for correct and incorrect responses. There was a significant linear increase of false alarm rates from haloperidol to placebo to l-dopa, whereas hit rates were not affected by dopaminergic manipulation. As hypothesized, confidence in error responses was significantly higher with l-dopa compared to placebo. Moreover, confidence in erroneous responses significantly correlated with false alarm rates. These findings suggest that overconfidence in errors and aberrant sensory processing might be both interdependent and related to dopaminergic transmission abnormalities in patients with psychosis. PMID:25932015

  19. Accurate Radiometry from Space: An Essential Tool for Climate Studies

    NASA Technical Reports Server (NTRS)

    Fox, Nigel; Kaiser-Weiss, Andrea; Schmutz, Werner; Thome, Kurtis; Young, Dave; Wielicki, Bruce; Winkler, Rainer; Woolliams, Emma

    2011-01-01

    The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that can allow reliable judgements to be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demand of the climate community in the solar reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainties facilitates significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a primary standard and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'. Keywords: climate change; Earth observation; satellites; radiometry; solar irradiance

  20. Primary aldosteronism caused by unilateral adrenal hyperplasia: rethinking the accuracy of imaging studies.

    PubMed

    Chen, Su-Yu; Shen, Sjen-Jung; Chou, Chien-Wen; Yang, Chwen-Yi; Cheng, Hon-Mei

    2006-03-01

    A rare type of aldosteronism, known as unilateral adrenal hyperplasia (UAH), is difficult to diagnose, not only because it fails to conform to the typical common subtypes, but also because imaging results are unreliable. We report 2 Taiwanese patients with UAH. Case 1 was a 44-year-old man with 2 episodes of hypokalemic paralysis. Hypertension and suppressed plasma renin activity (PRA) with elevated plasma aldosterone concentration (PAC) were observed. Abdominal computed tomography (CT) and magnetic resonance imaging (MRI) showed a right adrenal mass, but adrenal scintigraphy revealed no definite laterality. The patient underwent a laparoscopic right adrenalectomy. Adrenal cortical hyperplasia was discovered from results of the histologic analysis. Case 2 was a 33-year-old woman referred for hypokalemia, hypertension, and a left adrenal mass found on a CT scan. However, MRI revealed normal adrenal glands. The adrenal vein sampling for PAC showed overproduction of PAC from the left adrenal gland. A laparoscopic left adrenalectomy was done. Pathology results revealed micronodular cortical hyperplasia with central hemorrhage. Blood pressure, plasma potassium, aldosterone, and renin activity levels returned to normal after operation in both cases. Both patients have been well for 3 years and 16 months, respectively, after surgery. We review the literature and discuss the limitations of imaging studies. PMID:16599018

  1. Paramedic accuracy in using a decision support algorithm when recognising adult death: a prospective cohort study

    PubMed Central

    Jones, T; Woollard, M

    2003-01-01

    Method: This prospective 16 month cohort study evaluated 188 events of recognition of adult death (ROAD) by paramedics in the period from November 1999 to February 2001. Results: Of 188 ROAD applications, errors were made in 13 cases (6.9%, 95% CI 3.7% to 11.5%). Additionally, there was one adverse clinical incident associated with a case in which ROAD was applied (0.5%, 95% CI 0.01 to 2.9%). ECG strips were unavailable for eight cases, although ambulance records indicated a rhythm of asystole for each of these. Assuming this diagnosis was correct, ROAD was used 174 times without errors (93%, 95% CI 88 to 96%). Assuming that it was not, the ROAD protocol was applied without errors in 166 cases (88.3%, 95% CI 82.8 to 92.5%). None of the errors made appeared to be attributable to poor clinical decision making, compromised treatment, or changed patient outcome. The mean on-scene time for ambulance crews using the ROAD policy was 60 minutes. Conclusion: Paramedics can accurately apply a decision support algorithm when recognising adult death. It could be argued that the attendance of a medical practitioner to confirm death is therefore an inappropriate use of such personnel and may result in unnecessarily protracted on-scene times for ambulance crews. Further research is required to confirm this, and to determine the proportion of patients suitable for recognition of adult death who are actually identified as such by paramedics. PMID:12954697

  2. Comparing Accuracy of Cervical Pedicle Screw Placement between a Guidance System and Manual Manipulation: A Cadaver Study

    PubMed Central

    Cong, Yu; Bao, Nirong; Zhao, Jianning; Mao, Guangping

    2015-01-01

    Background The aim of this study was to compare the accuracy of cervical pedicle screw placement between a three-dimensional guidance system and manual manipulation. Material/Methods Eighteen adult cadavers were randomized into group A (n=9) and group B (n=9). Ninety pedicle screws were placed into the C3-C7 under the guidance of a three-dimensional locator in group A, and 90 screws were inserted by manual manipulation in group B. The cervical spines were scanned using computed tomography (CT). Parallel and angular offsets of the screws were compared between the two placement methods. Results In group A, 90% of the screws were within the pedicles and 10% breached the pedicle cortex. In group B, 55.6% were within the pedicle and 44.4% breached the pedicle cortex. Locator guidance showed significantly lower parallel and angular offsets in axial CT images (P<0.01), and significantly lower angular offset in sagittal CT images (P<0.01) than manual manipulation. Conclusions Locator guidance is superior to manual manipulation in accuracy of cervical screw placement. Locator guidance might provide better safety than manual manipulation in placing cervical screws. PMID:26348197

  3. The accuracy of the Neosono Ultima EZ apex locator using files of different alloys: an in vitro study.

    PubMed

    Nekoofar, M H; Sadeghi, K; Sadighi Akha, E; Namazikhah, M Sadegh

    2002-09-01

    The purpose of this study was to compare the precision of one of the new generation of root canal measuring devices, Neosono Ultima EZ, while using files manufactured of different alloys. Fifty-four root canals of extracted teeth were chosen. They were placed in special tubes with roots immersed in 2 percent agar with phosphate buffered saline. The device was used to locate the apex of each canal in wet conditions at the zero digital reading, first using a stainless steel file and then using a nickel-titanium file. These values were compared to the actual lengths obtained by measuring the distance of the coronal reference point to the apical opening with a size 10 file minus 0.5 mm. The accuracy of the device was 94 percent with nickel-titanium files and 91 percent with stainless steel. No significant difference was noted between the results for either file. The accuracy of the Neosono Ultima EZ in wet conditions exceeded 90 percent regardless of the alloy used. PMID:12365847

  4. Acute response in vivo of a fiber-optic sensor for continuous glucose monitoring from canine studies on point accuracy.

    PubMed

    Liao, Kuo-Chih; Chang, Shih-Chieh; Chiu, Cheng-Yang; Chou, Yu-Hsiang

    2010-01-01

    The objective of this study was to evaluate the acute response of Sencil™, a fiber-optic sensor, in point accuracy for glucose monitoring in vivo on healthy dogs under anesthesia. A total of four dogs with clinically normal glycemia were implanted with one sensor each in the chest region to measure the interstitial glucose concentration during the ovariohysterectomy procedure. The data was acquired every 10 seconds after initiation, and was compared to the concentration of venous plasma glucose sampled during the surgery procedures for accuracy of agreement analysis. In the four trials with a range of 71-297 mg/dL plasma glucose, the collected 21 pairs of ISF readings from the Sencil™ and the plasma reference showed a tighter dispersion of residual values than the conventional system, and a linear correlation (the Pearson correlation coefficient is 0.9288 and the y-intercept is 14.22 mg/dL). The MAD (17.6 mg/dL) and RMAD (16.16%) of Sencil™ measurements were comparable to those of the conventional system. The Clarke error grid analysis indicated that 100% of the paired points were in the clinically acceptable zones A (61.9%) and B (38.1%). PMID:22163627
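The MAD and relative MAD (RMAD) agreement measures reported above are simple averages over paired sensor/reference readings. A minimal sketch; the paired readings below are hypothetical, not the study's data:

```python
def mad(pred, ref):
    # Mean absolute difference between sensor readings and reference values.
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(pred)

def rmad(pred, ref):
    # Relative MAD: each absolute difference taken as a fraction of its reference.
    return sum(abs(p - r) / r for p, r in zip(pred, ref)) / len(pred)

# Hypothetical paired glucose readings (mg/dL), not the study's data:
sensor = [110.0, 150.0, 260.0]
plasma = [100.0, 160.0, 250.0]
print(mad(sensor, plasma))  # 10.0 mg/dL
print(round(100 * rmad(sensor, plasma), 2))  # relative MAD in percent
```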

  5. The Effect of File Size on the Accuracy of the Raypex 5 Apex Locator: An In Vitro Study

    PubMed Central

    Sadeghi, Shiva; Abolghasemi, Masoomeh

    2008-01-01

    Background and aims Determining the proper length of the root canals is essential for successful endodontic treatment. The purpose of this in vitro study was to evaluate the effect of file size on the accuracy of the Raypex 5 electronic apex locator for working length determination of uninstrumented canals. Materials and methods Twenty maxillary central incisors with single straight canals were used. Following access cavity preparation, electronic working length by means of Raypex 5 apex locator and actual working length were determined. Data were analyzed using ANOVA with repeated measurements and LSD test. Results There was no significant difference between electronic and actual working lengths when a size 15 K-file was used. Conclusion Under the conditions of the present study, a size 15 K-file is a more suitable size for determining working length. PMID:23285326

  6. High-resolution terrain and landcover mapping with a lightweight, semi-autonomous, remotely-piloted aircraft (RPA): a case study and accuracy assessment

    NASA Astrophysics Data System (ADS)

    Hugenholtz, C.; Whitehead, K.; Moorman, B.; Brown, O.; Hamilton, T.; Barchyn, T.; Riddell, K.; LeClair, A.

    2012-04-01

    Remotely-piloted aircraft (RPA) have evolved into a viable research tool for a range of Earth science applications. Significant technological advances driven by military and surveillance programs have steadily become mainstream and affordable. Thus, RPA technology has the potential to reinvigorate various aspects of geomorphological research, especially at the landform scale. In this presentation we will report results and experiences using a lightweight, semi-autonomous RPA for high-resolution terrain and landcover mapping. The goal was to test the accuracy of the photogrammetrically-derived terrain model and assess the overall performance of the RPA system for landform characterization. The test site comprised an area of semi-vegetated sand dunes in the Canadian Prairies. The RPA survey was conducted with an RQ-84Z AreoHawk (Hawkeye UAV Ltd) and a low-cost digital camera. During the survey the RPA acquired images semi-autonomously with the aid of proprietary mission planning software developed by Accuas Inc. A total of 44 GCPs were used in the block adjustment to create the terrain model, while an additional 400 independent GPS check points were used for accuracy assessment. The 1 m resolution terrain model developed with Trimble's INPHO photogrammetric software was compared to the independent check points, yielding an RMS error comparable to airborne LiDAR data. The resulting orthophoto mosaic had a resolution of 0.1 m, revealing a number of geomorphic features beyond the resolution of airborne and QuickBird imagery. Overall, this case study highlights the potential of RPA technology for resolving terrain and landcover attributes at the landform scale. We believe one of the most significant and emerging applications of RPA in geomorphology is their potential to quantify rates of landform erosion/deposition in an affordable and flexible manner, allowing investigators to reduce the gap between recorded and natural morphodynamics.
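The accuracy assessment described above compares model elevations against independent GPS check points via an RMS error. A minimal sketch with hypothetical elevations (not the survey's data):

```python
import math

def rmse(predicted, reference):
    # Root-mean-square error between model elevations and check-point elevations.
    n = len(predicted)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

# Hypothetical terrain-model vs. GPS elevations (m) at a few check points:
model = [612.1, 613.4, 615.0, 614.2]
gps   = [612.0, 613.6, 614.9, 614.5]
print(round(rmse(model, gps), 3))
```

In practice the check points must be independent of the ground control points used in the block adjustment, as the abstract notes, or the RMS error will understate the true model error.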

  7. Tools in polypharmacy. Current evidence from observational and controlled studies.

    PubMed

    Dovjak, P

    2012-08-01

    Increasing evidence on managing polypharmacy in the growing elderly population, which has a high prevalence of multiple chronic diseases, is the basis for this paper. Poor adherence, drug-drug interactions, drug-disease interactions, and inappropriate medication challenge the prescribing decisions of health care providers in this group of patients. Risk factors, the prevalence of polypharmacy, and the impact on health issues will be shown by analyzing the recent literature. Based on intervention trials, several tools in polypharmacy have emerged as practical guides for clinical practice or for the geriatric ward to solve this problem. The Medication Appropriateness Index (MAI) and national lists of potentially inappropriate medication used in clinical practice are presented, including Screening Tool to Alert Doctors to the Right Treatment (START), Screening Tool of Older Persons' Potentially Inappropriate Prescriptions (STOPP), and Assess, Comprehensive Geriatric Assessment, Adherence, Development, Emergence, Minimization, Interdisciplinarity, Alertness (ACADEMIA). PMID:22767400

  8. Developing a temperature sensitive tool for studying spin dissipation

    NASA Astrophysics Data System (ADS)

    Wickey, Kurtis Jon

    Measuring the thermodynamic properties of nanoscale structures is becoming increasingly important as heterostructures and devices shrink in size. For example, recent discoveries of spin thermal effects such as spin Seebeck and spin Peltier show that thermal gradients can manipulate spin systems and vice versa. However, the relevant interactions occur within a spin diffusion length of a spin active interface, making study of these spin thermal effects challenging. In addition, recent ferromagnetic resonance studies of spatially confined nanomagnets have shown unique magnon modes in arrays and lines which may give rise to unique magnon-phonon interactions. In this case, the small volume of magnetic material presents a challenge to measurement; as a result, the bulk of the work is done on arrays, where measurements of the magnetization of individual particles are possible through various microscopies but access to thermal properties is limited. As a result, tools capable of measuring the thermal properties of nanoscale structures are required to fully explore this emerging science. One approach to addressing this challenge is the use of microscale suspended platforms that maximize their sensitivity to these spin thermal interactions through thermal isolation from their surroundings. Combining this thermal decoupling with sensitive thermometry allows for the measurement of nanojoule heat accumulations, such as those resulting from the small heat flows associated with spin transport and spin relaxation. As these heat flows may manifest themselves in a variety of spin-thermal effects, the development of measurement platforms that can be tailored to optimize their sensitivity to specific thermal measurements is essential. To address these needs, I have fabricated thermally isolated platforms using a unique focused ion beam (FIB) machining process that allows for flexible geometries as well as a wide choice of material systems. The thermal characteristics of these platforms were

  9. Diagnostic Accuracy of 123I-Meta-Iodobenzylguanidine Myocardial Scintigraphy in Dementia with Lewy Bodies: A Multicenter Study

    PubMed Central

    Yoshita, Mitsuhiro; Arai, Heii; Arai, Hiroyuki; Arai, Tetsuaki; Asada, Takashi; Fujishiro, Hiroshige; Hanyu, Haruo; Iizuka, Osamu; Iseki, Eizo; Kashihara, Kenichi; Kosaka, Kenji; Maruno, Hirotaka; Mizukami, Katsuyoshi; Mizuno, Yoshikuni; Mori, Etsuro; Nakajima, Kenichi; Nakamura, Hiroyuki; Nakano, Seigo; Nakashima, Kenji; Nishio, Yoshiyuki; Orimo, Satoshi; Samuraki, Miharu; Takahashi, Akira; Taki, Junichi; Tokuda, Takahiko; Urakami, Katsuya; Utsumi, Kumiko; Wada, Kenji; Washimi, Yukihiko; Yamasaki, Junichi; Yamashina, Shouhei; Yamada, Masahito

    2015-01-01

    Background and Purpose Dementia with Lewy bodies (DLB) needs to be distinguished from Alzheimer’s disease (AD) because of important differences in patient management and outcome. Severe cardiac sympathetic degeneration occurs in DLB, but not in AD, offering a potential system for a biological diagnostic marker. The primary aim of this study was to investigate the diagnostic accuracy, in the ante-mortem differentiation of probable DLB from probable AD, of cardiac imaging with the ligand 123I-meta-iodobenzylguanidine (MIBG) which binds to the noradrenaline reuptake site, in the first multicenter study. Methods We performed a multicenter study in which we used 123I-MIBG scans to assess 133 patients with clinical diagnoses of probable (n = 61) or possible (n = 26) DLB or probable AD (n = 46) established by a consensus panel. Three readers, unaware of the clinical diagnosis, classified the images as either normal or abnormal by visual inspection. The heart-to-mediastinum ratios of 123I-MIBG uptake were also calculated using an automated region-of-interest based system. Results Using the heart-to-mediastinum ratio calculated with the automated system, the sensitivity was 68.9% and the specificity was 89.1% to differentiate probable DLB from probable AD in both early and delayed images. By visual assessment, the sensitivity and specificity were 68.9% and 87.0%, respectively. In a subpopulation of patients with mild dementia (MMSE ≥ 22, n = 47), the sensitivity and specificity were 77.4% and 93.8%, respectively, with the delayed heart-to-mediastinum ratio. Conclusions Our first multicenter study confirmed the high correlation between abnormal cardiac sympathetic activity evaluated with 123I-MIBG myocardial scintigraphy and a clinical diagnosis of probable DLB. The diagnostic accuracy is sufficiently high for this technique to be clinically useful in distinguishing DLB from AD, especially in patients with mild dementia. PMID:25793585

  10. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  11. Neutron Reflectivity as a Tool for Physics-Based Studies of Model Bacterial Membranes.

    PubMed

    Barker, Robert D; McKinley, Laura E; Titmuss, Simon

    2016-01-01

    The principles of neutron reflectivity and its application as a tool to provide structural information at the (sub-) molecular unit length scale from models for bacterial membranes are described. The model membranes can take the form of a monolayer for a single leaflet spread at the air/water interface, or bilayers of increasing complexity at the solid/liquid interface. Solid-supported bilayers constrain the bilayer to 2D but can be used to characterize interactions with antimicrobial peptides and benchmark high throughput lab-based techniques. Floating bilayers allow for membrane fluctuations, making the phase behaviour more representative of native membranes. Bilayers of varying levels of compositional accuracy can now be constructed, facilitating studies with aims that range from characterizing the fundamental physical interactions, through to the characterization of accurate mimetics for the inner and outer membranes of Gram-negative bacteria. Studies of the interactions of antimicrobial peptides with monolayer and bilayer models for the inner and outer membranes have revealed information about the molecular control of the outer membrane permeability, and the mode of interaction of antimicrobials with both inner and outer membranes. PMID:27193548

  12. The dimensional accuracy of polyvinyl siloxane impression materials using two different impression techniques: An in vitro study

    PubMed Central

    Kumari, Nirmala; Nandeeshwar, D. B.

    2015-01-01

    Aim of the Study: To evaluate and compare the linear dimensional changes of three representative polyvinyl siloxane (PVS) impression materials and to compare the accuracy of the single mix with the double mix impression technique. Methodology: A study mold was prepared according to revised American Dental Association specification number 19 for nonaqueous elastic dental impression materials. The three PVS impression materials selected were Elite-HD, Imprint™ II Garant, and Aquasil Ultra Heavy. The two impression techniques used were the single mix and double mix impression techniques. A total of 60 specimens were made, and after 24 h the specimens were measured using a profile projector. Statistical Analysis: The data were analyzed using one-way analysis of variance (ANOVA), and significant differences were separated using the Student-Newman-Keuls test. Results: When all three study group impression materials were compared for the double mix technique, a statistically significant difference was found only between Imprint™ II Garant and Elite-HD (P < 0.05). Similarly, using the single mix technique, statistically significant differences were found between Elite-HD and Imprint™ II Garant (P < 0.05) and also between Aquasil Ultra Heavy and Elite-HD (P < 0.05). When the linear dimensional accuracy of all three impression materials in the double mix and single mix impression techniques was compared with the control group, Imprint™ II Garant showed values nearest to those of the master die, followed by Aquasil Ultra Heavy and Elite-HD, respectively. Conclusion: Among the impression materials, Imprint™ II Garant showed the least dimensional change. Among the impression techniques, the double mix impression technique showed the better results. PMID:26929515
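    The statistical step described above, a one-way ANOVA across the three materials, can be sketched in a few lines of pure Python. The dimension measurements below are hypothetical, not the study's data; the material names come from the abstract.

```python
# A pure-Python sketch of one-way ANOVA: F = MS_between / MS_within.
# Group values are hypothetical linear-dimension measurements (mm).

def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA over the given groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k = len(groups)                      # number of groups
    n = len(all_vals)                    # total observations
    # Between-group sum of squares, weighted by group size
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (spread around each group's own mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

elite_hd   = [25.10, 25.08, 25.12, 25.09, 25.11]
imprint_ii = [25.00, 25.01, 24.99, 25.02, 25.00]
aquasil    = [25.04, 25.05, 25.03, 25.06, 25.04]

f_stat = one_way_anova_f(elite_hd, imprint_ii, aquasil)
print(f"F(2, 12) = {f_stat:.1f}")  # large F -> at least one group differs
```

    A significant F would then be followed by a post-hoc comparison (the study used the Student-Newman-Keuls test) to locate which pairs of materials differ.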

  13. Information Literacy and Office Tool Competencies: A Benchmark Study

    ERIC Educational Resources Information Center

    Heinrichs, John H.; Lim, Jeen-Su

    2010-01-01

    Present information science literature recognizes the importance of information technology to achieve information literacy. The authors report the results of a benchmarking student survey regarding perceived functional skills and competencies in word-processing and presentation tools. They used analysis of variance and regression analysis to…

  14. Study of hot hardness characteristics of tool steels

    NASA Technical Reports Server (NTRS)

    Chevalier, J. L.; Dietrich, M. W.; Zaretsky, E. V.

    1972-01-01

    Hardness measurements of tool steel materials in an electric furnace at elevated temperatures and in a low-oxygen environment are discussed. The development of an equation to predict short-term hardness as a function of the initial room-temperature hardness of the steel is reported. The types of steel involved in the process are identified.

  15. A Visualization Tool for Managing and Studying Online Communications

    ERIC Educational Resources Information Center

    Gibbs, William J.; Olexa, Vladimir; Bernas, Ronan S.

    2006-01-01

    Most colleges and universities have adopted course management systems (e.g., Blackboard, WebCT). Worldwide faculty and students use them for class communications and discussions. The discussion tools provided by course management systems, while powerful, often do not offer adequate capabilities to appraise communication patterns, online behaviors,…

  16. A Comparison of Accuracy of Matrix Impression System with Putty Reline Technique and Multiple Mix Technique: An In Vitro Study

    PubMed Central

    Kumar, M Praveen; Patil, Suneel G; Dheeraj, Bhandari; Reddy, Keshav; Goel, Dinker; Krishna, Gopi

    2015-01-01

    Background: The difficulty in obtaining an acceptable impression increases exponentially as the number of abutments increases. Accuracy of the impression material and the use of a suitable impression technique are of utmost importance in the fabrication of a fixed partial denture. This study compared the accuracy of the matrix impression system with the conventional putty reline and multiple mix techniques for individual dies by comparing the inter-abutment distance in the casts obtained from the impressions. Materials and Methods: Three groups of 10 impressions each were made of a master die with three impression techniques (matrix impression system, putty reline technique, and multiple mix technique). Typodont teeth were embedded in a maxillary frasaco model base. The left first premolar was removed to create a three-unit fixed partial denture situation, the left canine and second premolar were prepared conservatively, and hatch marks were made on the abutment teeth. The final casts obtained from the impressions were examined under a profile projector, and the inter-abutment distance was calculated for all the casts and compared. Results: The results from this study showed that in the mesiodistal dimensions the percentage deviation from the master model in Group I was 0.1 and 0.2, in Group II was 0.9 and 0.3, and in Group III was 1.6 and 1.5, respectively. In the labio-palatal dimensions the percentage deviation from the master model in Group I was 0.01 and 0.4, in Group II was 1.9 and 1.3, and in Group III was 2.2 and 2.0, respectively. In the cervico-incisal dimensions the percentage deviation from the master model in Group I was 1.1 and 0.2, in Group II was 3.9 and 1.7, and in Group III was 1.9 and 3.0, respectively. In the inter-abutment dimension of the dies, the percentage deviation from the master model in Group I was 0.1, in Group II was 0.6, and in Group III was 1.0. Conclusion: The matrix impression system showed greater accuracy of reproduction for individual dies than the putty reline and multiple mix techniques.
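    The percentage-deviation figures quoted above reduce to a simple relative-error calculation against the master model. The distances below are invented for illustration.

```python
# Percentage deviation from the master model, as used in the abstract:
# absolute difference relative to the master dimension, times 100.
# The two distances here are hypothetical.

def percent_deviation(measured_mm: float, master_mm: float) -> float:
    """Absolute deviation from the master-model dimension, as a percentage."""
    return abs(measured_mm - master_mm) / master_mm * 100.0

master_interabutment = 20.00     # hypothetical master-die inter-abutment distance
cast_interabutment = 20.02       # hypothetical measurement on the final cast
dev = percent_deviation(cast_interabutment, master_interabutment)
print(f"deviation = {dev:.1f}%")   # 0.1%, on the order of Group I above
```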

  17. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    EPA Science Inventory

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  18. HEFCE's People Management Self-Assessment Tool: Ticking Boxes or Adding Value? A Case Study

    ERIC Educational Resources Information Center

    McDonald, Claire

    2009-01-01

    This article examines one specific organisational development tool in depth and uses a case study to investigate whether using the tool is more than a tick-box exercise and really can add value and help organisations to develop and improve. The People Management Self-Assessment Tool (SAT) is used to examine higher education institutions' (HEIs)…

  19. Social Networking Tools and Teacher Education Learning Communities: A Case Study

    ERIC Educational Resources Information Center

    Poulin, Michael T.

    2014-01-01

    Social networking tools have become an integral part of a pre-service teacher's educational experience. As a result, the educational value of social networking tools in teacher preparation programs must be examined. The specific problem addressed in this study is that the role of social networking tools in teacher education learning communities…

  20. Relative accuracy of grid references derived from postcode and address in UK epidemiological studies of overhead power lines.

    PubMed

    Swanson, J; Vincent, T J; Bunch, K J

    2014-12-01

    In the UK, the location of an address, necessary for calculating the distance to overhead power lines in epidemiological studies, is available from different sources. We assess the accuracy of each. The grid reference specific to each address, provided by the Ordnance Survey product Address-Point, is generally accurate to a few metres, which will usually be sufficient for calculating magnetic fields from the power lines. The grid reference derived from the postcode rather than the individual address is generally accurate to tens of metres, and may be acceptable for assessing effects that vary in the general proximity of the power line, but is probably not acceptable for assessing magnetic-field effects. PMID:25325707
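    For studies of this kind, the distance from an address grid reference to an overhead-line span is a point-to-segment calculation in easting/northing coordinates (metres). The coordinates below are invented for illustration.

```python
# Distance (m) from an address point to a straight overhead-line span,
# given OS-style (easting, northing) coordinates in metres.
# All coordinates here are hypothetical.
import math

def point_to_segment_m(p, a, b):
    """Distance (m) from point p to the segment a-b, all (easting, northing)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the line through a-b, clamping to the segment ends
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

address = (530_120.0, 180_060.0)        # hypothetical Address-Point grid reference
pylon_a, pylon_b = (530_000.0, 180_000.0), (530_400.0, 180_000.0)
print(f"{point_to_segment_m(address, pylon_a, pylon_b):.0f} m")  # 60 m
```

    With Address-Point accuracy of a few metres, the error in such a distance is small; with postcode-derived references accurate only to tens of metres, the computed distance (and any magnetic-field estimate derived from it) inherits that uncertainty.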

  1. Dosimetric accuracy of the cone-beam CT-based treatment planning of the Vero system: a phantom study.

    PubMed

    Yohannes, Indra; Prasetio, Heru; Kallis, Karoline; Bert, Christoph

    2016-01-01

    We report an investigation on the accuracy of dose calculation based on the cone-beam computed tomography (CBCT) images of the nonbowtie-filter kV imaging system of the Vero linear accelerator. Different sets of materials and tube voltages were employed to generate the Hounsfield unit lookup tables (HLUTs) for both the CBCT and fan-beam CT (FBCT) systems. The HLUTs were then implemented for dose calculation in a treatment planning system (TPS). Dosimetric evaluation was carried out on an in-house-developed cube phantom that consists of water-equivalent slabs and inhomogeneity inserts. Two independent dosimeters positioned in the cube phantom were used in this study for point-dose and two-dimensional (2D) dose distribution measurements. The differences of HLUTs from various materials and tube voltages in both CT systems resulted in differences in dose-calculation accuracy. We found that the higher the tube voltage used to obtain CT images, the better the point-dose calculation and the gamma passing rate of the 2D dose distribution agree with the values determined in the TPS. Moreover, insert materials that are not tissue-equivalent led to greater dose-calculation inaccuracy. There were negligible differences in dosimetric evaluation between CBCT- and FBCT-based treatment planning if the HLUTs were generated using tissue-equivalent materials. In this study, the CBCT images of the Vero system from a complex inhomogeneity phantom could be applied for TPS dose calculation if the system was calibrated using tissue-equivalent materials scanned at a high tube voltage (i.e., 120 kV). PMID:27455496
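    An HLUT is applied at dose-calculation time as a piecewise-linear map from Hounsfield units to a density quantity. The sketch below uses illustrative calibration points, not the Vero system's actual table.

```python
# Piecewise-linear HU lookup: interpolate relative electron density from
# a small calibration table. The table values are illustrative only.
import bisect

HLUT = [(-1000, 0.00), (0, 1.00), (50, 1.05), (1000, 1.70), (3000, 2.50)]

def hu_to_density(hu: float) -> float:
    """Interpolate relative electron density for a given HU value."""
    hus = [h for h, _ in HLUT]
    if hu <= hus[0]:
        return HLUT[0][1]          # clamp below the table (air)
    if hu >= hus[-1]:
        return HLUT[-1][1]         # clamp above the table (dense material)
    i = bisect.bisect_right(hus, hu)
    (h0, d0), (h1, d1) = HLUT[i - 1], HLUT[i]
    return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)

print(hu_to_density(0))      # 1.0 (water)
print(hu_to_density(500))    # ~1.358, on the 50-to-1000 HU leg
```

    The study's point is that the calibration pairs themselves (material, measured HU at a given kV) differ between CBCT and FBCT and between tube voltages, so the same interpolation scheme yields different densities, and hence different doses, unless tissue-equivalent materials at a high tube voltage are used.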

  2. Postmarketing Safety Study Tool: A Web Based, Dynamic, and Interoperable System for Postmarketing Drug Surveillance Studies

    PubMed Central

    Sinaci, A. Anil; Laleci Erturkmen, Gokce B.; Gonul, Suat; Yuksel, Mustafa; Invernizzi, Paolo; Thakrar, Bharat; Pacaci, Anil; Cinar, H. Alper; Cicekli, Nihan Kesim

    2015-01-01

    Postmarketing drug surveillance is a crucial aspect of the clinical research activities in pharmacovigilance and pharmacoepidemiology. Successful utilization of available Electronic Health Record (EHR) data can complement and strengthen postmarketing safety studies. In terms of the secondary use of EHRs, access and analysis of patient data across different domains are a critical factor; we address this data interoperability problem between EHR systems and clinical research systems in this paper. We demonstrate that this problem can be solved in an upper level with the use of common data elements in a standardized fashion so that clinical researchers can work with different EHR systems independently of the underlying information model. Postmarketing Safety Study Tool lets the clinical researchers extract data from different EHR systems by designing data collection set schemas through common data elements. The tool interacts with a semantic metadata registry through IHE data element exchange profile. Postmarketing Safety Study Tool and its supporting components have been implemented and deployed on the central data warehouse of the Lombardy region, Italy, which contains anonymized records of about 16 million patients with over 10-year longitudinal data on average. Clinical researchers in Roche validate the tool with real life use cases. PMID:26543873

  4. Accuracy study of a new assistance system under the application of Navigated Control® for manual milling on a head phantom.

    PubMed

    Shi, Jiaxi; Stenzel, Roland; Wenger, Thomas; Lueth, Tim C

    2010-01-01

    In this article, a technical study of a new assistance system to support surgeons in milling on the temporal bone is presented. In particular, the overall accuracy of the new assistance system was investigated experimentally under conditions close to surgical practice. For the experiment, the assistance system was used with its associated navigation system for ear-nose-throat (ENT) surgery. A specially constructed head phantom allowed the implementation of reproducible experiments. Thereby, N = 10 specimens were milled by three test persons without medical knowledge, and the distance between points on the milled surface and the security zone around the planned nerve was calculated for each specimen. The results were as follows: none of the 10 milled specimens overlapped more than 2 mm with the security zone, the average distances to the planned surface of the security zone for each specimen were between 0.01 mm and 2.23 mm, and the corresponding standard deviations varied from 0.41 mm to 1.17 mm. However, the results also showed some variation in averages and standard deviations, and often too little material was removed. This deviation is probably caused by the patient registration and the tool calibration. PMID:21097019

  5. A comparative evaluation of the marginal accuracy of crowns fabricated from four commercially available provisional materials: An in vitro study

    PubMed Central

    Amin, Bhavya Mohandas; Aras, Meena Ajay; Chitre, Vidya

    2015-01-01

    Purpose: The purpose of this in vitro study was to evaluate and compare the primary marginal accuracy of four commercially available provisional materials (Protemp 4, Luxatemp Star, Visalys Temp and DPI tooth moulding powder and liquid) at 2 time intervals (10 and 30 min). Materials and Methods: A customized stainless steel master model containing two interchangeable dies was used for fabrication of provisional crowns. Forty crowns (n = 10) were fabricated, and each crown was evaluated under a stereomicroscope. Vertical marginal discrepancies were noted and compared at 10 min since the start of mixing and then at 30 min. Observations and Results: Protemp 4 showed the least vertical marginal discrepancy (71.59 μm), followed by Luxatemp Star (91.93 μm) at 10 min. DPI showed a marginal discrepancy of 95.94 μm, while Visalys Temp crowns had a vertical marginal discrepancy of 106.81 μm. There was a significant difference in the marginal discrepancy values of Protemp 4 and Visalys Temp. At 30 min, there was a significant difference between the marginal discrepancy of Protemp 4 crowns (83.11 μm) and Visalys Temp crowns (128.97 μm) and between Protemp 4 and DPI (118.88 μm). No significant differences were observed between Protemp 4 and Luxatemp Star. Conclusion: The vertical marginal discrepancy of temporary crowns fabricated from the four commercially available provisional materials ranged from 71 to 106 μm immediately after fabrication (at 10 min from the start of mixing) and from 83 to 128 μm at 30 min from the start of mixing. The time elapsed after mixing had a significant influence on the marginal accuracy of the crowns. PMID:26097348

  6. Accuracy and Precision of Three-Dimensional Low Dose CT Compared to Standard RSA in Acetabular Cups: An Experimental Study.

    PubMed

    Brodén, Cyrus; Olivecrona, Henrik; Maguire, Gerald Q; Noz, Marilyn E; Zeleznik, Michael P; Sköldenberg, Olof

    2016-01-01

    Background and Purpose. The gold standard for detection of implant wear and migration is currently radiostereometry (RSA). The purpose of this study is to compare a three-dimensional computed tomography technique (3D CT) to standard RSA as an alternative technique for measuring migration of acetabular cups in total hip arthroplasty. Materials and Methods. With tantalum beads, we marked one cemented and one uncemented cup and mounted these on a similarly marked pelvic model. A comparison was made between 3D CT and standard RSA for measuring migration. Twelve repeated stereoradiographs and CT scans with double examinations in each position and gradual migration of the implants were made. Precision and accuracy of the 3D CT were calculated. Results. The accuracy of the 3D CT ranged between 0.07 and 0.32 mm for translations and 0.21 and 0.82° for rotation. The precision ranged between 0.01 and 0.09 mm for translations and 0.06 and 0.29° for rotations, respectively. For standard RSA, the precision ranged between 0.04 and 0.09 mm for translations and 0.08 and 0.32° for rotations, respectively. There was no significant difference in precision between 3D CT and standard RSA. The effective radiation dose of the 3D CT method, comparable to RSA, was estimated to be 0.33 mSv. Interpretation. Low dose 3D CT is a comparable method to standard RSA in an experimental setting. PMID:27478832
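    Precision from double examinations, as used above, is commonly summarized as the standard deviation of the paired differences between repeat measurements in the same position. The translation values below are invented for illustration.

```python
# Precision from repeated ("double") examinations: the sample standard
# deviation of the paired differences. Values (mm) are hypothetical.
import statistics

first_exam  = [0.02, -0.01, 0.03, 0.00, -0.02, 0.01]
second_exam = [0.03,  0.01, 0.02, 0.02, -0.03, 0.00]

diffs = [a - b for a, b in zip(first_exam, second_exam)]
precision_sd = statistics.stdev(diffs)   # sample SD of repeated differences
print(f"precision = {precision_sd:.3f} mm")
```

    Accuracy, by contrast, compares the measured migration against the known, imposed migration of the implant on the phantom, which is why a phantom with controlled, gradual displacement is needed.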

  8. Effect of the impression margin thickness on the linear accuracy of impression and stone dies: an in vitro study.

    PubMed

    Naveen, Y G; Patil, Raghunath

    2013-03-01

    The space available for impression material in the gingival sulcus immediately after removal of the retraction cord has been found to be 0.3-0.4 mm. However, after 40 s only 0.2 mm of the retracted space remains. This is of concern when an impression of multiple abutments is to be made. Hence a study was planned to determine the minimum width of the retracted sulcus necessary to obtain a good impression. Five metal dies were machined to accurately fit a stainless steel block with a square cavity in the center, with spaces 1 mm deep and of varying widths (0.11-0.3 mm) away from the block. Polyvinyl siloxane impressions were made and poured using a high-strength stone. Using a traveling microscope, the lengths and widths of the abutment, impression, and die were measured and compared for linear accuracy and completeness of the impression. Results showed 1.5-3 times greater mean distortion and a larger coefficient of variation in the 0.11 mm group than in the wider sulcular groups. The ANOVA test for distortion also showed statistically significant differences (P < 0.05). 75% of impressions in the 0.11 mm group were defective, compared to less than 25% of impressions in the other width groups. It is not always possible to predictably obtain accurate impressions at a sulcus width of 0.11 mm or less. Dimensionally accurate and defect-free impressions were obtained at sulcus widths of 0.15 mm and wider. Hence clinicians must choose retraction methods to obtain a width greater than 0.35 mm. Further, immediate loading of the impression material after cord removal may improve accuracy. PMID:24431701

  9. A new automatic blood pressure kit auscultates for accurate reading with a smartphone: A diagnostic accuracy study.

    PubMed

    Wu, Hongjun; Wang, Bingjian; Zhu, Xinpu; Chu, Guang; Zhang, Zhi

    2016-08-01

    The widely used oscillometric automated blood pressure (BP) monitor has been continually questioned on its accuracy. A novel BP kit named Accutension, which adopted the Korotkoff auscultation method, was then devised. Accutension worked with a miniature microphone, a pressure sensor, and a smartphone. The BP values were automatically displayed on the smartphone screen through the installed app. Data recorded in the phone could be played back and reconfirmed after measurement. They could also be uploaded and saved to iCloud. The accuracy and consistency of this novel electronic auscultatory sphygmomanometer were preliminarily verified here. Thirty-two subjects were included and 82 qualified readings were obtained. The mean differences ± SD for systolic and diastolic BP readings between Accutension and the mercury sphygmomanometer were 0.87 ± 2.86 and -0.94 ± 2.93 mm Hg. Agreement between Accutension and the mercury sphygmomanometer was highly significant for systolic (ICC = 0.993, 95% confidence interval (CI): 0.989-0.995) and diastolic (ICC = 0.987, 95% CI: 0.979-0.991) readings. In conclusion, Accutension worked accurately based on our pilot study data. The difference was acceptable. ICC and Bland-Altman plots showed good agreement with manual measurements. Systolic readings of Accutension were slightly higher than those of manual measurement, while diastolic readings were slightly lower. One possible reason is that Accutension captured the first and last Korotkoff sounds more sensitively than the human ear during manual measurement and avoided missing sounds, so it may be more accurate than the traditional mercury sphygmomanometer. By documenting and analyzing trends in BP values, Accutension helps in the management of hypertension and therefore contributes to mobile health services. PMID:27512876
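    The agreement statistics reported above (mean difference ± SD, and Bland-Altman limits of agreement) can be computed as follows. The paired systolic readings below are hypothetical stand-ins for the study's 82 readings.

```python
# Mean difference (bias), SD of differences, and 95% Bland-Altman limits
# of agreement for paired device-vs-reference readings (mm Hg, invented).
import statistics

accutension = [120, 135, 118, 142, 128, 110, 151, 124]
mercury     = [119, 134, 120, 141, 126, 111, 150, 125]

diffs = [a - b for a, b in zip(accutension, mercury)]
bias = statistics.mean(diffs)            # mean device-minus-reference difference
sd = statistics.stdev(diffs)             # SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement
print(f"bias = {bias:.2f} mm Hg, limits of agreement [{loa_low:.2f}, {loa_high:.2f}]")
```

    A Bland-Altman plot is then the differences plotted against the pairwise means, with horizontal lines at the bias and the two limits of agreement.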

  10. Accuracy and stability of measuring GABA, glutamate, and glutamine by proton magnetic resonance spectroscopy: A phantom study at 4 Tesla

    NASA Astrophysics Data System (ADS)

    Henry, Michael E.; Lauriat, Tara L.; Shanahan, Meghan; Renshaw, Perry F.; Jensen, J. Eric

    2011-02-01

    Proton magnetic resonance spectroscopy has the potential to provide valuable information about alterations in gamma-aminobutyric acid (GABA), glutamate (Glu), and glutamine (Gln) in psychiatric and neurological disorders. In order to use this technique effectively, it is important to establish the accuracy and reproducibility of the methodology. In this study, phantoms with known metabolite concentrations were used to compare the accuracy of 2D J-resolved MRS, single-echo 30 ms PRESS, and GABA-edited MEGA-PRESS for measuring all three aforementioned neurochemicals simultaneously. The phantoms included metabolite concentrations above and below the physiological range and scans were performed at baseline, 1 week, and 1 month time-points. For GABA measurement, MEGA-PRESS proved optimal with a measured-to-target correlation of R2 = 0.999, with J-resolved providing R2 = 0.973 for GABA. All three methods proved effective in measuring Glu with R2 = 0.987 (30 ms PRESS), R2 = 0.996 (J-resolved) and R2 = 0.910 (MEGA-PRESS). J-resolved and MEGA-PRESS yielded good results for Gln measures with respective R2 = 0.855 (J-resolved) and R2 = 0.815 (MEGA-PRESS). The 30 ms PRESS method proved ineffective in measuring GABA and Gln. When measurement stability at in vivo concentration was assessed as a function of varying spectral quality, J-resolved proved the most stable and immune to signal-to-noise and linewidth fluctuation compared to MEGA-PRESS and 30 ms PRESS.
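    The measured-to-target R² values above are squared correlations between the known phantom concentrations and the measured ones. A sketch with hypothetical concentrations:

```python
# Squared Pearson correlation between target (known phantom) and measured
# concentrations. The concentration values (mM) are invented.

def r_squared(xs, ys):
    """Squared Pearson correlation between target and measured values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

target   = [0.5, 1.0, 1.5, 2.0, 3.0]       # known phantom concentrations (mM)
measured = [0.52, 0.97, 1.55, 1.98, 3.05]  # hypothetical fitted concentrations
print(f"R^2 = {r_squared(target, measured):.3f}")
```

    Deliberately spanning concentrations above and below the physiological range, as the phantoms here did, is what makes this correlation a meaningful accuracy check rather than a fit to a single operating point.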

  11. Zagreb Amblyopia Preschool Screening Study: near and distance visual acuity testing increase the diagnostic accuracy of screening for amblyopia

    PubMed Central

    Bušić, Mladen; Bjeloš, Mirjana; Petrovečki, Mladen; Kuzmanović Elabjer, Biljana; Bosnar, Damir; Ramić, Senad; Miletić, Daliborka; Andrijašević, Lidija; Kondža Krstonijević, Edita; Jakovljević, Vid; Bišćan Tvrdi, Ana; Predović, Jurica; Kokot, Antonio; Bišćan, Filip; Kovačević Ljubić, Mirna; Motušić Aras, Ranka

    2016-01-01

    Aim To present and evaluate a new screening protocol for amblyopia in preschool children. Methods The Zagreb Amblyopia Preschool Screening (ZAPS) study protocol performed screening for amblyopia by near and distance visual acuity (VA) testing of 15 648 children aged 48-54 months attending kindergartens in the City of Zagreb County between September 2011 and June 2014, using the Lea Symbols in lines test. If VA in either eye was >0.1 logMAR, the child was re-tested; if the child failed the re-test, the child was referred for a comprehensive eye examination at the Eye Clinic. Results 78.04% of children passed the screening test. The estimated prevalence of amblyopia was 8.08%. Testability, sensitivity, and specificity of the ZAPS study protocol were 99.19%, 100.00%, and 96.68%, respectively. Conclusion The ZAPS study used the most discriminative VA test, with optotypes in lines, as they do not underestimate amblyopia. The estimated prevalence of amblyopia was considerably higher than reported elsewhere. To the best of our knowledge, the ZAPS study protocol reached the highest sensitivity and specificity when evaluating the diagnostic accuracy of VA tests for screening. The pass level, defined at ≤0.1 logMAR for 4-year-old children using Lea Symbols in lines, missed no amblyopia cases, advocating that both near and distance VA testing should be performed when screening for amblyopia. PMID:26935612
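    Sensitivity and specificity, as reported above, are ratios from the screening 2 × 2 table against the comprehensive eye examination. The counts below are hypothetical, not the ZAPS study's.

```python
# Screening metrics from a 2x2 table: true/false positives and negatives
# relative to the reference examination. All counts are invented.

def screening_metrics(tp: int, fp: int, fn: int, tn: int):
    sensitivity = tp / (tp + fn)      # amblyopia cases correctly referred
    specificity = tn / (tn + fp)      # healthy children correctly passed
    return sensitivity, specificity

sens, spec = screening_metrics(tp=120, fp=48, fn=0, tn=1400)
print(f"sensitivity = {sens:.2%}, specificity = {spec:.2%}")
```

    With zero false negatives, sensitivity is 100% regardless of the other counts, which mirrors the abstract's claim that the ≤0.1 logMAR pass level missed no amblyopia cases.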

  12. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  13. Databases and web tools for cancer genomics study.

    PubMed

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-02-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community. PMID:25707591

  15. Interoceptive accuracy and panic.

    PubMed

    Zoellner, L A; Craske, M G

    1999-12-01

    Psychophysiological models of panic hypothesize that panickers focus attention on and become anxious about the physical sensations associated with panic. Attention to internal somatic cues has been labeled interoception. The present study examined the roles of physiological arousal and subjective anxiety in interoceptive accuracy. Infrequent panickers and nonanxious participants took part in an initial baseline phase to examine overall interoceptive accuracy. Next, participants ingested caffeine, about which they received either safety information or no safety information. Using a mental heartbeat-tracking paradigm, participants' counts of their heartbeats during specific time intervals were coded against polygraph measures. Infrequent panickers were more accurate in the perception of their heartbeats than nonanxious participants. Changes in physiological arousal were not associated with increased accuracy on the heartbeat perception task. However, higher levels of self-reported anxiety were associated with superior performance. PMID:10596462
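    Mental heartbeat-tracking is typically scored with an accuracy index that compares counted with recorded heartbeats, commonly 1 − |recorded − counted| / recorded averaged over intervals. The abstract does not give the exact coding formula, so both the formula choice and the counts below are illustrative assumptions.

```python
# A common heartbeat-perception accuracy index (an assumption here, since
# the abstract does not specify the coding): per-interval scores of
# 1 - |recorded - counted| / recorded, averaged. Counts are invented.

def heartbeat_accuracy(recorded: int, counted: int) -> float:
    """Per-interval interoceptive accuracy; 1.0 means perfect counting."""
    return 1.0 - abs(recorded - counted) / recorded

intervals = [(35, 33), (45, 41), (25, 24)]   # (polygraph, self-reported) pairs
score = sum(heartbeat_accuracy(r, c) for r, c in intervals) / len(intervals)
print(f"accuracy = {score:.3f}")
```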

  16. Millimeter-accuracy GPS landslide monitoring using Precise Point Positioning with Single Receiver Phase Ambiguity (PPP-SRPA) resolution: a case study in Puerto Rico

    NASA Astrophysics Data System (ADS)

    Wang, G. Q.

    2013-03-01

Continuous Global Positioning System (GPS) monitoring is essential for establishing the rate and pattern of superficial movements of landslides. This study demonstrates a technique that uses a stand-alone GPS station to conduct millimeter-accuracy landslide monitoring. The Precise Point Positioning with Single Receiver Phase Ambiguity (PPP-SRPA) resolution employed by the GIPSY/OASIS software package (V6.1.2) was applied in this study. Two years of continuous GPS data collected at a creeping landslide were used to evaluate the accuracy of the PPP-SRPA solutions. The criterion for accuracy was the root-mean-square (RMS) of residuals of the PPP-SRPA solutions with respect to "true" landslide displacements over the two-year period. RMS is often termed repeatability or precision in the GPS literature; however, when computed against a known "true" position or displacement it can be termed RMS accuracy, or simply accuracy. This study indicated that the PPP-SRPA resolution can provide an accuracy of 2 to 3 mm horizontally and 8 mm vertically for 24-hour sessions with few outliers (< 1%) in the Puerto Rico region. Horizontal accuracy below 5 mm can be achieved consistently with 4-hour or longer sessions, provided that data collected during extreme weather conditions are avoided. Vertical accuracy below 10 mm can be achieved with 8-hour or longer sessions. This study indicates that the PPP-SRPA resolution is competitive with the conventional carrier-phase double-difference network resolution for static (longer than 4 hours) landslide monitoring while retaining many advantages. The PPP-SRPA method could thus become an attractive alternative to the conventional carrier-phase double-difference method for landslide monitoring, notably in remote areas or developing countries.
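The RMS-accuracy criterion described above is simple to compute; the sketch below, with hypothetical residuals in millimetres (not values from the study), illustrates the definition:

```python
import math

def rms_accuracy(estimates, truth):
    """Root-mean-square of residuals against known 'true' displacements.

    Against a mean position this would be repeatability (precision);
    against a known truth, as here, it can be termed accuracy.
    """
    residuals = [e - t for e, t in zip(estimates, truth)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical 24-hour-session solutions for one component, in mm,
# after subtracting the known landslide displacement (so truth is 0)
solutions = [1.8, -2.4, 0.9, 3.1, -1.2]
print(round(rms_accuracy(solutions, [0.0] * len(solutions)), 2))
```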

  17. Pitfalls at the root of facial assessment on photographs: a quantitative study of accuracy in positioning facial landmarks.

    PubMed

    Cummaudo, M; Guerzoni, M; Marasciuolo, L; Gibelli, D; Cigada, A; Obertovà, Z; Ratnayake, M; Poppa, P; Gabriel, P; Ritz-Timme, S; Cattaneo, C

    2013-05-01

    In the last years, facial analysis has gained great interest also for forensic anthropology. The application of facial landmarks may bring about relevant advantages for the analysis of 2D images by measuring distances and extracting quantitative indices. However, this is a complex task which depends upon the variability in positioning facial landmarks. In addition, literature provides only general indications concerning the reliability in positioning facial landmarks on photographic material, and no study is available concerning the specific errors which may be encountered in such an operation. The aim of this study is to analyze the inter- and intra-observer error in defining facial landmarks on photographs by using a software specifically developed for this purpose. Twenty-four operators were requested to define 22 facial landmarks on frontal view photographs and 11 on lateral view images; in addition, three operators repeated the procedure on the same photographs 20 times (at distance of 24 h). In the frontal view, the landmarks with less dispersion were the pupil, cheilion, endocanthion, and stomion (sto), and the landmarks with the highest dispersion were gonion, zygion, frontotemporale, tragion, and selion (se). In the lateral view, the landmarks with the least dispersion were se, pronasale, subnasale, and sto, whereas landmarks with the highest dispersion were gnathion, pogonion, and tragion. Results confirm that few anatomical points can be defined with the highest accuracy and show the importance of the preliminary investigation of reliability in positioning facial landmarks. PMID:23515681

  18. A computer simulation study comparing lesion detection accuracy with digital mammography, breast tomosynthesis, and cone-beam CT breast imaging

    SciTech Connect

    Gong Xing; Glick, Stephen J.; Liu, Bob; Vedula, Aruna A.; Thacker, Samta

    2006-04-15

Although conventional mammography is currently the best modality to detect early breast cancer, it is limited in that the recorded image represents the superposition of a three-dimensional (3D) object onto a 2D plane. Recently, two promising approaches for 3D volumetric breast imaging have been proposed, breast tomosynthesis (BT) and CT breast imaging (CTBI). To investigate possible improvements in lesion detection accuracy with either breast tomosynthesis or CT breast imaging as compared to digital mammography (DM), a computer simulation study was conducted using simulated lesions embedded into a structured 3D breast model. The computer simulation realistically modeled x-ray transport through a breast model, as well as the signal and noise propagation through a CsI-based flat-panel imager. Polyenergetic x-ray spectra of Mo/Mo 28 kVp for digital mammography, Mo/Rh 28 kVp for BT, and W/Ce 50 kVp for CTBI were modeled. For the CTBI simulation, the intensity of the x-ray spectra for each projection view was determined so as to provide a total average glandular dose of 4 mGy, which is approximately equivalent to that given in conventional two-view screening mammography. The same total dose was modeled for both the DM and BT simulations. Irregular lesions were simulated by using a stochastic growth algorithm providing lesions with an effective diameter of 5 mm. Breast tissue was simulated by generating an ensemble of backgrounds with a power law spectrum, with the composition of 50% fibroglandular and 50% adipose tissue. To evaluate lesion detection accuracy, a receiver operating characteristic (ROC) study was performed with five observers reading an ensemble of images for each case. The average area under the ROC curves (A_z) was 0.76 for DM, 0.93 for BT, and 0.94 for CTBI. Results indicated that for the same dose, a 5 mm lesion embedded in a structured breast phantom was detected by the two volumetric breast imaging systems, BT and CTBI, with statistically
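The area under the ROC curve (A_z) used above as the figure of merit can be estimated nonparametrically from observer confidence ratings; the sketch below uses the Mann-Whitney formulation with hypothetical 5-point ratings, not the study's data:

```python
def auc(lesion_scores, normal_scores):
    """Nonparametric AUC: the probability that a lesion-present image
    is rated higher than a lesion-absent one (ties count 0.5) - the
    Mann-Whitney estimate of the area under the ROC curve."""
    pairs = [(l, n) for l in lesion_scores for n in normal_scores]
    wins = sum(1.0 if l > n else 0.5 if l == n else 0.0 for l, n in pairs)
    return wins / len(pairs)

# Hypothetical 5-point confidence ratings from one observer
lesion = [5, 4, 4, 3, 5, 2]   # lesion-present cases
normal = [1, 2, 3, 2, 1, 4]   # lesion-absent cases
print(round(auc(lesion, normal), 3))
```

An AUC of 0.5 corresponds to chance performance; 1.0 to perfect separation of the two case types.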

  19. A qualitative study into the difficulties experienced by healthcare decision makers when reading a Cochrane diagnostic test accuracy review

    PubMed Central

    2013-01-01

    Background Cochrane reviews are one of the best known and most trusted sources of evidence-based information in health care. While steps have been taken to make Cochrane intervention reviews accessible to a diverse readership, little is known about the accessibility of the newcomer to the Cochrane library: diagnostic test accuracy reviews (DTARs). The current qualitative study explored how healthcare decision makers, who varied in their knowledge and experience with test accuracy research and systematic reviews, read and made sense of DTARs. Methods A purposive sample of clinicians, researchers and policy makers (n = 21) took part in a series of think-aloud interviews, using as interview material the first three DTARs published in the Cochrane library. Thematic qualitative analysis of the transcripts was carried out to identify patterns in participants’ ‘reading’ and interpretation of the reviews and the difficulties they encountered. Results Participants unfamiliar with the design and methodology of DTARs found the reviews largely inaccessible and experienced a range of difficulties stemming mainly from the mismatch between background knowledge and level of explanation provided in the text. Experience with systematic reviews of interventions did not guarantee better understanding and, in some cases, led to confusion and misinterpretation. These difficulties were further exacerbated by poor layout and presentation, which affected even those with relatively good knowledge of DTARs and had a negative impact not only on their understanding of the reviews but also on their motivation to engage with the text. Comparison between the readings of the three reviews showed that more accessible presentation, such as presenting the results as natural frequencies, significantly increased participants’ understanding. Conclusions The study demonstrates that authors and editors should pay more attention to the presentation as well as the content of Cochrane DTARs

  20. An Observational Study to Evaluate the Usability and Intent to Adopt an Artificial Intelligence–Powered Medication Reconciliation Tool

    PubMed Central

    Yuan, Michael Juntao; Poonawala, Robina

    2016-01-01

Background Medication reconciliation (the process of creating an accurate list of all medications a patient is taking) is a widely practiced procedure to reduce medication errors. It is mandated by the Joint Commission and reimbursed by Medicare. Yet, in practice, medication reconciliation is often not effective owing to knowledge gaps in the team. A promising approach to improve medication reconciliation is to incorporate artificial intelligence (AI) decision support tools into the process to engage patients and bridge the knowledge gap. Objective The aim of this study was to improve the accuracy and efficiency of medication reconciliation by engaging the patient, the nurse, and the physician as a team via an iPad tool. With assistance from the AI agent, the patient will review his or her own medication list from the electronic medical record (EMR) and annotate changes, before reviewing together with the physician and making decisions on the shared iPad screen. Methods In this study, we developed iPad-based software tools, with AI decision support, to engage patients in “self-service” medication reconciliation and then share the annotated reconciled list with the physician. To evaluate the software tool’s user interface and workflow, 10 patients in a primary care clinic were recruited and observed throughout the whole process during a pilot study. The patients were surveyed about the tool’s usability afterward. Results All patients were able to complete the medication reconciliation process correctly. Every patient found at least one error or other issue with their EMR medication lists. All of them reported that the tool was easy to use, and 8 of 10 patients reported that they would use the tool in the future. However, few patients interacted with the learning modules in the tool. The physician and nurses reported the tool to be easy to use, easy to integrate into the existing workflow, and potentially time-saving. Conclusions We have

  1. Accuracy of deception judgments.

    PubMed

    Bond, Charles F; DePaulo, Bella M

    2006-01-01

    We analyze the accuracy of deception judgments, synthesizing research results from 206 documents and 24,483 judges. In relevant studies, people attempt to discriminate lies from truths in real time with no special aids or training. In these circumstances, people achieve an average of 54% correct lie-truth judgments, correctly classifying 47% of lies as deceptive and 61% of truths as nondeceptive. Relative to cross-judge differences in accuracy, mean lie-truth discrimination abilities are nontrivial, with a mean accuracy d of roughly .40. This produces an effect that is at roughly the 60th percentile in size, relative to others that have been meta-analyzed by social psychologists. Alternative indexes of lie-truth discrimination accuracy correlate highly with percentage correct, and rates of lie detection vary little from study to study. Our meta-analyses reveal that people are more accurate in judging audible than visible lies, that people appear deceptive when motivated to be believed, and that individuals regard their interaction partners as honest. We propose that people judge others' deceptions more harshly than their own and that this double standard in evaluating deceit can explain much of the accumulated literature. PMID:16859438
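The discrimination index d mentioned above comes from signal detection theory; as a rough illustration (not the meta-analytic computation, which is done study by study rather than from pooled percentages), d can be sketched from hit and false-alarm rates:

```python
from statistics import NormalDist

def dprime(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Pooled rates from the abstract: 47% of lies judged deceptive (hits)
# and 39% of truths judged deceptive (false alarms, i.e. 100% - 61%).
# Pooling across studies understates the per-study mean d of roughly .40.
print(round(dprime(0.47, 0.39), 2))
```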

  2. Assessment of the haptic robot as a new tool for the study of the neural control of reaching.

    PubMed

    Rakusa, Martin; Hribar, Ales; Koritnik, Blaz; Munih, Marko; Battaglni, Piero Paolo; Belic, Ales; Zidar, Janez

    2013-10-01

Current experimental methods for the study of reaching in the MRI environment do not exactly mimic actual reaching, due to constraints on movement imposed by the MRI machine itself. We tested a haptic robot (HR) as such a tool. Positive results would also be promising for the combined use of fMRI and EEG to study reaching. Twenty right-handed subjects performed reaching tasks with their right hand with and without the HR. Reaction time, movement time (MT), accuracy, event-related potentials (ERPs), and event-related desynchronisation/synchronisation (ERD/ERS) were studied. Reaction times and accuracies did not differ significantly between the two tasks, while the MT was significantly longer in HR reaching (959 vs. 447 ms). We identified two positive and two negative ERP peaks across all leads in both tasks. The latencies of the P1 and N2 peaks were significantly longer in HR reaching, while there were no significant differences in the P3 and N4 latencies. ERD/ERS topographies were similar between tasks and similar to those in other reaching studies. The main difference was the ERS rebound, which was observed only in actual reaching; the probable reason was the significantly longer MT. We found that reaching with the HR engages neural structures similar to those engaged in actual reaching. Although there are some constraints, its use may be superior to other techniques used for reaching studies in the MRI environment, where freedom of movement is limited. PMID:23474640

  3. The hidden KPI registration accuracy.

    PubMed

    Shorrosh, Paul

    2011-09-01

    Determining the registration accuracy rate is fundamental to improving revenue cycle key performance indicators. A registration quality assurance (QA) process allows errors to be corrected before bills are sent and helps registrars learn from their mistakes. Tools are available to help patient access staff who perform registration QA manually. PMID:21923052

  4. Genetic transformation: a tool to study protein targeting in diatoms.

    PubMed

    Kroth, Peter G

    2007-01-01

Diatoms are unicellular photoautotrophic eukaryotes that play an important role in ecology by fixing large amounts of CO2 in the oceans. Because they evolved by secondary endocytobiosis -- the uptake of a eukaryotic alga into another eukaryotic cell -- they have a rather unusual cell biology and genetic constitution. Since the preparation of organelles is rather difficult as a result of these cytosolic structures, genetic transformation and expression of preproteins fused to green fluorescent protein (GFP) has become one of the major tools for analyzing the subcellular localization of proteins in diatoms. Several groups have since successfully developed genetic transformation protocols for diatoms. These methods are based on "biolistic" DNA delivery via a particle gun and allow the introduction and expression of foreign genes in the algae. Here a protocol for the genetic transformation of the diatom Phaeodactylum tricornutum is described, as well as the subsequent characterization of the transformants. PMID:17951693

  5. Novel genetic tools for studying food borne Salmonella

    PubMed Central

    Santiviago, Carlos A; McClelland, Michael

    2009-01-01

Summary of Recent Advances: Non-typhoidal Salmonellae are highly prevalent food-borne pathogens. High-throughput sequencing of Salmonella genomes is expanding our knowledge of the evolution of serovars and epidemic isolates. Genome sequences have also allowed the creation of complete microarrays. Microarrays have improved the throughput of in vivo expression technology (IVET), used to uncover promoters active during infection. In another method, signature-tagged mutagenesis (STM), pools of mutants are subjected to selection. Changes in the population are monitored on a microarray, revealing genes under selection. Complete genome sequences permit the construction of pools of targeted in-frame deletions that have improved STM by minimizing the number of clones and the polarity of each mutant. Together, genome sequences and the continuing development of new tools for functional genomics will drive a revolution in the understanding of Salmonellae in many different niches that are critical for food safety. PMID:19285855

  6. A descriptive study of a clinical evaluation tool and process: student and faculty perspectives.

    PubMed

    Krautscheid, Lorretta; Moceri, Joane; Stragnell, Susan; Manthey, Lisa; Neal, Thea

    2014-03-01

    Clinical evaluation tools are designed to assess nursing students' knowledge, skills, and attitudes related to program and course outcomes and professional nursing standards. Students, faculty, administrators, and the public rely on the effectiveness of the tool and process to determine progression within the curriculum and validate competency. In May 2012, a revised clinical evaluation tool was implemented in a baccalaureate nursing program. This study was undertaken to evaluate the revised clinical evaluation tool by exploring the perspectives of students and faculty who use the tool and engage in the evaluation process. Findings revealed the tool was user friendly and instructions were clear, with sufficient grading criteria to determine clinical competency. Findings also revealed areas for improvement in the evaluation process, including orientation to the tool, connecting program outcomes to clinical performance, and meaningful participation in evaluation. Recommendations are made for improving the clinical evaluation process. PMID:24512331

  7. Chimpanzees create and modify probe tools functionally: A study with zoo-housed chimpanzees

    PubMed Central

    Hopper, Lydia M; Tennie, Claudio; Ross, Stephen R; Lonsdorf, Elizabeth V

    2015-01-01

    Chimpanzees (Pan troglodytes) use tools to probe for out-of-reach food, both in the wild and in captivity. Beyond gathering appropriately-sized materials to create tools, chimpanzees also perform secondary modifications in order to create an optimized tool. In this study, we recorded the behavior of a group of zoo-housed chimpanzees when presented with opportunities to use tools to probe for liquid foods in an artificial termite mound within their enclosure. Previous research with this group of chimpanzees has shown that they are proficient at gathering materials from within their environment in order to create tools to probe for the liquid food within the artificial mound. Extending beyond this basic question, we first asked whether they only made and modified probe tools when it was appropriate to do so (i.e. when the mound was baited with food). Second, by collecting continuous data on their behavior, we also asked whether the chimpanzees first (intentionally) modified their tools prior to probing for food or whether such modifications occurred after tool use, possibly as a by-product of chewing and eating the food from the tools. Following our predictions, we found that tool modification predicted tool use; the chimpanzees began using their tools within a short delay of creating and modifying them, and the chimpanzees performed more tool modifying behaviors when food was available than when they could not gain food through the use of probe tools. We also discuss our results in terms of the chimpanzees’ acquisition of the skills, and their flexibility of tool use and learning. Am. J. Primatol. 77:162–170, 2015. © 2014 The Authors. American Journal of Primatology Published by Wiley Periodicals Inc. PMID:25220050

  9. Accuracy of ceramic restorations made using an in-office optical scanning technique: an in vitro study.

    PubMed

    Tidehag, P; Ottosson, K; Sjögren, G

    2014-01-01

The present in vitro study concerns the determination of the pre-cementation gap width of all-ceramic crowns made using an in-office digital-impression technique and subsequent computer-aided design/computer-aided manufacturing (CAD/CAM) production. Two chairside video camera systems were used: the Lava Oral scanner and Cadent's iTero scanner. Digital scans were made of a first molar typodont tooth that had been suitably prepared for an all-ceramic crown. The digital impressions were sent via the Internet to commercial dental laboratories, where the crowns were made. In addition, an impression of the typodont tooth was made, poured, and scanned in order to evaluate the pre-cementation gap of crowns produced from scanning stone dies. These methods and systems were evaluated by creating replicas of the intermediate space using an addition-cured silicone, and the gap widths were determined using a measuring microscope. Hot-pressed leucite-reinforced glass-ceramic crowns were selected as a reference. The mean value for the marginal measuring points of the control was 170 μm, and the values for all the evaluated crowns ranged from 107 to 128 μm. Corresponding figures for the internal measuring points were 141-210 μm and 115-237 μm, respectively. Based on the findings of the present study, an in-office digital-impression technique can be used to fabricate CAD/CAM ceramic single crowns with a marginal and internal accuracy on the same level as that of a conventional hot-pressed glass-ceramic crown. In the present study, however, slight differences could be seen between the two types of ceramic crowns with respect to the internal fit obtained. PMID:24111810

  10. Diagnostic accuracy of endoscopic biopsies for the diagnosis of gastrointestinal follicular lymphoma: a clinicopathologic study of 48 patients.

    PubMed

    Iwamuro, Masaya; Okada, Hiroyuki; Takata, Katsuyoshi; Nose, Soichiro; Miyatani, Katsuya; Yoshino, Tadashi; Yamamoto, Kazuhide

    2014-04-01

    The purpose of this study was to reveal the diagnostic accuracy of initial pathologic assessment of biopsied samples in patients with gastrointestinal follicular lymphoma lesions. A total of 48 patients with follicular lymphoma (Lugano system stage I: n = 30; II1: n = 4; II2: n = 4; IV: n = 10) with gastrointestinal involvement who underwent endoscopic biopsy were enrolled and retrospectively reviewed. Nine (18.8%) of the 48 patients were not appropriately diagnosed as having follicular lymphoma at the initial biopsy. The initial pathological diagnosis included extranodal marginal zone lymphoma of mucosa-associated lymphoid tissue (n = 4), necrotic tissue (n = 2), duodenitis (n = 1), or suspected lymphoma of unspecified subtype (n = 2). The reasons for these inappropriate diagnoses were insufficient histopathologic analysis lacking CD10 and BCL2 staining (n = 7) and unsuitable biopsy samples taken from erosions or ulcers that contained scanty lymphoma cells or no lymphoid follicles (n = 2). In conclusion, incomplete histopathologic analysis and unsuitable biopsy samples are pitfalls in the diagnosis of gastrointestinal follicular lymphoma. PMID:24513028

  11. Evaluation of accuracy of non-linear finite element computations for surgical simulation: study using brain phantom.

    PubMed

    Ma, J; Wittek, A; Singh, S; Joldes, G; Washio, T; Chinzei, K; Miller, K

    2010-12-01

    In this paper, the accuracy of non-linear finite element computations in application to surgical simulation was evaluated by comparing the experiment and modelling of indentation of the human brain phantom. The evaluation was realised by comparing forces acting on the indenter and the deformation of the brain phantom. The deformation of the brain phantom was measured by tracking 3D motions of X-ray opaque markers, placed within the brain phantom using a custom-built bi-plane X-ray image intensifier system. The model was implemented using the ABAQUS(TM) finite element solver. Realistic geometry obtained from magnetic resonance images and specific constitutive properties determined through compression tests were used in the model. The model accurately predicted the indentation force-displacement relations and marker displacements. Good agreement between modelling and experimental results verifies the reliability of the finite element modelling techniques used in this study and confirms the predictive power of these techniques in surgical simulation. PMID:21153973

  12. Cost-Saving Early Diagnosis of Functional Pain in Nonmalignant Pain: A Noninferiority Study of Diagnostic Accuracy

    PubMed Central

    Cámara, Rafael J. A.; Merz, Christian; von Känel, Roland; Egloff, Niklaus

    2016-01-01

    Objectives. We compared two index screening tests for early diagnosis of functional pain: pressure pain measurement by electronic diagnostic equipment, which is accurate but too specialized for primary health care, versus peg testing, which is cost-saving and more easily manageable but of unknown sensitivity and specificity. Early distinction of functional (altered pain perception; nervous sensitization) from neuropathic or nociceptive pain improves pain management. Methods. Clinicians blinded for the index screening tests assessed the reference standard of this noninferiority diagnostic accuracy study, namely, comprehensive medical history taking with all previous findings and treatment outcomes. All consenting patients referred to a university hospital for nonmalignant musculoskeletal pain participated. The main analysis compared the receiver operating characteristic (ROC) curves of both index screening tests. Results. The area under the ROC curve for peg testing was not inferior to that of electronic equipment: it was at least 95% as large for finger measures (two-sided p = 0.038) and at least equally as large for ear measures (two-sided p = 0.003). Conclusions. Routine diagnostic testing by peg, which is accessible for general practitioners, is at least as accurate as specialized equipment. This may shorten time-to-treatment in general practices, thereby improving the prognosis and quality of life. PMID:27088013

  13. Retrospective study: The diagnostic accuracy of conventional forceps biopsy of gastric epithelial compared to endoscopic submucosal dissection (STROBE compliant).

    PubMed

    Lu, Chao; Lv, Xueyou; Lin, Yiming; Li, Dejian; Chen, Lihua; Ji, Feng; Li, Youming; Yu, Chaohui

    2016-07-01

Conventional forceps biopsy (CFB) is the most popular way to screen for gastric epithelial neoplasia (GEN) and adenocarcinoma of the gastric epithelium. The aim of this study was to compare the diagnostic accuracy of conventional forceps biopsy with that of endoscopic submucosal dissection (ESD). Four hundred forty-four patients who ultimately underwent ESD in our hospital were enrolled from Jan 1, 2009 to Sep 1, 2015. We retrospectively assessed the characteristics of the pathological results of CFB and ESD. The concordance rate between CFB and ESD specimens was 68.92% (306/444). Men showed a lower concordance rate (63.61% vs 79.33%; P = 0.001), and concordant patients were younger (P = 0.048). In multivariate analysis, men had a significantly lower concordance rate (coefficient -0.730, P = 0.002) and a higher rate of pathological upgrade (coefficient -0.648, P = 0.015). The location of CFB did not statistically influence the concordance rate. The concordance rate was relatively high in our hospital. According to our analysis, ESD is strongly suggested for older men with precancerous lesions found on CFB of the gastric fundus or antrum, whereas young women with low-grade intraepithelial neoplasia could opt for regular follow-up. PMID:27472723

  14. Accuracy of cut-off value by measurement of third molar index: Study of a Colombian sample.

    PubMed

    De Luca, Stefano; Aguilar, Lina; Rivera, Marcela; Palacio, Luz Andrea Velandia; Riccomi, Giulia; Bestetti, Fiorella; Cameriere, Roberto

    2016-04-01

The aim of this cross-sectional study was to test the accuracy of the cut-off value of 0.08, by measurement of the third molar index (I3M), in assessing the legal adult age of 18 years in a sample of Colombian children and young adults. Digital orthopantomographs of 288 Colombian children and young adults (163 girls and 125 boys), aged between 13 and 22 years, were analysed. The concordance correlation coefficient (ρc) and κ statistics (Cohen's kappa coefficient) showed that repeatability and reproducibility are high for both intra- and inter-observer error. The κ statistic for intra- and inter-observer agreement on the decision of adult or minor was 0.913 and 0.877, respectively. The age distribution gradually decreases as I3M increases in both girls and boys. For girls, sensitivity was 95.1% (95% CI 87.1%-95%) and specificity was 93.8% (95% CI 87.1%-98.8%). The proportion of correctly classified individuals was 95.1%. For boys, sensitivity was 91.7% (95% CI 85.1%-96.8%) and specificity was 90.6% (95% CI 82.1%-97.8%). The proportion of correctly classified individuals was 89.7%. The cut-off value of 0.08 is highly useful for determining whether a subject is 18 years of age or older. PMID:26898677
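The sensitivity, specificity, and proportion correctly classified reported above follow from a 2 × 2 tabulation of the decision rule (an I3M below the cut-off indicates an adult, since the index decreases with age); a minimal sketch with hypothetical values:

```python
def classification_stats(i3m_values, is_adult, cutoff=0.08):
    """Sensitivity, specificity and accuracy of 'I3M < cutoff => adult'."""
    tp = fn = tn = fp = 0
    for i3m, adult in zip(i3m_values, is_adult):
        predicted_adult = i3m < cutoff
        if adult and predicted_adult:
            tp += 1          # adult correctly classified
        elif adult:
            fn += 1          # adult missed
        elif predicted_adult:
            fp += 1          # minor wrongly classified as adult
        else:
            tn += 1          # minor correctly classified
    total = tp + fn + tn + fp
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / total

# Hypothetical I3M measurements and true legal status (not study data)
sens, spec, acc = classification_stats(
    [0.02, 0.05, 0.09, 0.30, 0.10, 0.06],
    [True, True, True, False, False, False])
print(round(sens, 3), round(spec, 3), round(acc, 3))
```

Specificity is the critical quantity here: misclassifying a minor as an adult (a false positive) is the costly error in a forensic age-assessment setting.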

  15. A Case Study of Using a Social Annotation Tool to Support Collaboratively Learning

    ERIC Educational Resources Information Center

    Gao, Fei

    2013-01-01

The purpose of the study was to understand student interaction and learning supported by a collaborative social annotation tool, Diigo. Through a case study, the researcher examined how students participated and interacted when learning an online text with the social annotation tool Diigo, and how they perceived their experience. The findings…

  16. SU-E-E-02: An Excel-Based Study Tool for ABR-Style Exams

    SciTech Connect

    Cline, K; Stanley, D; Defoor, D; Stathakis, S; Gutierrez, A; Papanikolaou, N; Kirby, N

    2015-06-15

    Purpose: As the landscape of learning and testing shifts toward a computer-based environment, a replacement for paper-based methods of studying is desirable. Using Microsoft Excel, a study tool was developed that allows the user to populate multiple-choice questions and then generate an interactive quiz session to answer them. Methods: The code for the tool was written using Microsoft Excel Visual Basic for Applications with the intent that this tool could be implemented by any institution with Excel. The base tool is a template with a setup macro, which builds out the structure based on user’s input. Once the framework is built, the user can input sets of multiple-choice questions, answer choices, and even add figures. The tool can be run in random-question or sequential-question mode for single or multiple courses of study. The interactive session allows the user to select answer choices and immediate feedback is provided. Once the user is finished studying, the tool records the day’s progress by reporting progress statistics useful for trending. Results: Six doctoral students at UTHSCSA have used this tool for the past two months to study for their qualifying exam, which is similar in format and content to the American Board of Radiology (ABR) Therapeutic Part II exam. The students collaborated to create a repository of questions, met weekly to go over these questions, and then used the tool to prepare for their exam. Conclusion: The study tool has provided an effective and efficient way for students to collaborate and be held accountable for exam preparation. The ease of use and familiarity of Excel are important factors for the tool’s use. There are software packages to create similar question banks, but this study tool has no additional cost for those that already have Excel. The study tool will be made openly available.
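The quiz logic described (a question bank, sequential or random-question mode, and per-session progress statistics) can be sketched outside Excel as well; the Python below is an illustration of that structure, not the authors' VBA code, and all names in it are hypothetical:

```python
import random

# Minimal sketch of the described quiz workflow (illustrative only;
# the actual tool is implemented in Excel VBA).
question_bank = [
    {"q": "Unit of absorbed dose?", "choices": ["Gy", "Sv", "Bq"], "answer": "Gy"},
    {"q": "ABR Part II covers?", "choices": ["Therapy physics", "Botany"],
     "answer": "Therapy physics"},
]

def run_quiz(bank, responses, randomize=False, seed=None):
    """Score a study session and return progress statistics for trending."""
    order = list(range(len(bank)))
    if randomize:                       # random-question mode
        random.Random(seed).shuffle(order)
    correct = sum(1 for i, r in zip(order, responses) if r == bank[i]["answer"])
    return {"asked": len(responses), "correct": correct,
            "pct": 100.0 * correct / len(responses)}

stats = run_quiz(question_bank, ["Gy", "Botany"])  # sequential mode
print(stats["pct"])
```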

  17. Dolphin echolocation strategies studied with the Biosonar Measurement Tool

    NASA Astrophysics Data System (ADS)

    Houser, Dorian S.; Martin, Steve W.; Phillips, Michael; Bauer, Eric; Moore, Patrick W.

    2003-10-01

Two free-swimming dolphins (Tt722 and Tt673) were trained to carry the Biosonar Measurement Tool (BMT) during open-water, proud target searches in order to explore echolocation behavior without the constraints of traditional experimental designs. The BMT recorded the angular motion, depth, and velocity of the dolphin as well as echolocation clicks and echoes returning from insonified targets. On target-present trials, the mean search time was 24.6 ± 7.3 s for Tt722 and 6.5 ± 3.0 s for Tt673, the former (longer) strategy resulting in the lower false-alarm rate. The majority of clicks exceeded 195 dB re 1 μPa throughout all trials for both animals, but each animal demonstrated preferences for particular frequency bands of echolocation. Considering all trials, only 3.6% of all clicks produced by Tt722 contained peak frequencies greater than 60 kHz, whereas Tt673 produced clicks with peak frequencies above 60 kHz 20.4% of the time. Distinctive frequency bands in the distribution of clicks were notable: bands for Tt673 occurred at 38, 54, and 69 kHz with less defined higher-order bands; bands for Tt722 occurred at 25, 35, and 40 kHz. These distinctive frequency bands suggest a preferential use of, or a mechanical constraint on, harmonically related click frequencies.
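The peak frequency of a click is simply the frequency bin with the largest spectral magnitude. A minimal pure-Python sketch follows; the sampling rate and the synthetic 40 kHz tone burst are illustrative, not BMT data.

```python
import cmath
import math

def peak_frequency(click, fs):
    """Peak frequency (Hz) of a click via a brute-force DFT magnitude search."""
    n = len(click)
    best_k, best_mag = 0, 0.0
    for k in range(n // 2 + 1):          # non-negative frequency bins
        coeff = sum(click[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * fs / n               # bin index -> Hz

# Synthetic 40 kHz tone burst sampled at 500 kHz (illustrative values).
fs, n = 500_000, 256
click = [math.sin(2 * math.pi * 40_000 * t / fs) *
         (0.5 - 0.5 * math.cos(2 * math.pi * t / (n - 1)))  # Hann window
         for t in range(n)]
print(peak_frequency(click, fs))
```

With 256 samples the frequency resolution is fs/256 ≈ 2 kHz, so the estimate lands on the DFT bin nearest 40 kHz; a real analysis would use an FFT and longer windows for finer banding.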

  18. The ADENOMA Study. Accuracy of Detection using Endocuff Vision™ Optimization of Mucosal Abnormalities: study protocol for randomized controlled trial

    PubMed Central

    Bevan, Roisin; Ngu, Wee Sing; Saunders, Brian P.; Tsiamoulos, Zacharias; Bassett, Paul; Hoare, Zoe; Rees, Colin J.

    2016-01-01

    Background: Colonoscopy is the gold standard investigation for the diagnosis of bowel pathology and colorectal cancer screening. Adenoma detection rate is a marker of high quality colonoscopy and a high adenoma detection rate is associated with a lower incidence of interval cancers. Several technological advancements have been explored to improve adenoma detection rate. A new device called Endocuff Vision™ has been shown to improve adenoma detection rate in pilot studies. Methods/Design: This is a prospective, multicenter, randomized controlled trial comparing the adenoma detection rate in patients undergoing Endocuff Vision™-assisted colonoscopy with standard colonoscopy. All patients above 18 years of age referred for screening, surveillance, or diagnostic colonoscopy who are able to consent are invited to the study. Patients with absolute contraindications to colonoscopy, large bowel obstruction or pseudo-obstruction, colon cancer or polyposis syndromes, colonic strictures, severe diverticular segments, active colitis, anticoagulant therapy, or pregnancy are excluded. Patients are randomized according to site, age, sex, and bowel cancer screening status to receive Endocuff Vision™-assisted colonoscopy or standard colonoscopy on the day of procedure. Baseline data, colonoscopy, and polyp data including histology are collected. Nurse assessment of patient comfort and patient comfort questionnaires are completed post procedure. Patients are followed up at 21 days and complete a patient experience questionnaire. This study will take place across seven NHS Hospital Trusts: one in London and six within the Northern Region Endoscopy Group. A maximum of 10 colonoscopists per site will recruit a total of 1772 patients, with a maximum of four bowel screening colonoscopists permitted per site. Discussion: This is the first trial to evaluate the adenoma detection rate of Endocuff Vision™ in all screening, surveillance, and diagnostic patient groups. This timely

  19. A comparative study between evaluation methods for quality control procedures for determining the accuracy of PET/CT registration

    NASA Astrophysics Data System (ADS)

    Cha, Min Kyoung; Ko, Hyun Soo; Jung, Woo Young; Ryu, Jae Kwang; Choe, Bo-Young

    2015-08-01

The accuracy of registration between positron emission tomography (PET) and computed tomography (CT) images is one of the important factors for reliable diagnosis in PET/CT examinations. Although quality control (QC) for checking the alignment of PET and CT images should be performed periodically, the procedures have not been fully established. The aim of this study is to determine optimal QC procedures that can be performed at the user level to ensure the accuracy of PET/CT registration. Two phantoms were used to carry out this study: the American College of Radiology (ACR)-approved PET phantom and the National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) body phantom, containing fillable spheres. All PET/CT images were acquired on a Biograph TruePoint 40 PET/CT scanner using routine protocols. To measure registration error, the spatial coordinates of the estimated centers of the target slice (spheres) were calculated independently for the PET and the CT images in two ways. We compared the images from the ACR-approved PET phantom to those from the NEMA IEC body phantom. Also, we measured the total time required from phantom preparation to image analysis. The first analysis method showed a total difference of 0.636 ± 0.11 mm for the largest hot sphere and 0.198 ± 0.09 mm for the largest cold sphere in the case of the ACR-approved PET phantom. In the NEMA IEC body phantom, the total difference was 3.720 ± 0.97 mm for the largest hot sphere and 4.800 ± 0.85 mm for the largest cold sphere. The second analysis method showed that the differences in the x location at the line profile of the lesion on PET and CT were (1.33, 1.33) mm for a bone lesion, (-1.26, -1.33) mm for an air lesion and (-1.67, -1.60) mm for a hot sphere lesion for the ACR-approved PET phantom. For the NEMA IEC body phantom, the differences in the x location at the line profile of the lesion on PET and CT were (-1.33, 4.00) mm for the air
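The registration error in the first analysis method reduces to per-axis offsets and a Euclidean distance between sphere centers estimated independently on the PET and CT volumes. A minimal sketch, with illustrative (not measured) coordinates:

```python
import math

def registration_error(center_pet, center_ct):
    """Per-axis offsets and Euclidean distance (mm) between sphere
    centers estimated independently on the PET and CT volumes."""
    offsets = [p - c for p, c in zip(center_pet, center_ct)]
    distance = math.sqrt(sum(o * o for o in offsets))
    return offsets, distance

# Illustrative (x, y, z) coordinates in mm for one hot sphere.
offsets, d = registration_error((10.2, -3.1, 55.0), (10.8, -2.9, 54.6))
print(offsets, round(d, 3))
```

In the study this per-sphere distance (and the derived totals) is what is tracked across QC sessions.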

  20. Evaluation of factors influencing accuracy of principal procedure coding based on ICD-9-CM: an Iranian study.

    PubMed

    Farzandipour, Mehrdad; Sheikhtaheri, Abbas

    2009-01-01

To evaluate the accuracy of procedural coding and the factors that influence it, 246 records were randomly selected from four teaching hospitals in Kashan, Iran. "Recodes" were assigned blindly and then compared to the original codes. Furthermore, the coders' professional behaviors were carefully observed during the coding process. Coding errors were classified as major or minor. The relations between coding accuracy and possible effective factors were analyzed by χ2 or Fisher exact tests as well as the odds ratio (OR) and the 95 percent confidence interval for the OR. The results showed that using a tabular index for rechecking codes reduces errors (83 percent vs. 72 percent accuracy). Further, more thorough documentation by the clinician positively affected coding accuracy, though this relation was not significant. Readability of records decreased errors overall (p = .003), including major ones (p = .012). Moreover, records with no abbreviations had fewer major errors (p = .021). In conclusion, not using abbreviations, ensuring more readable documentation, and paying more attention to available information increased coding accuracy and the quality of procedure databases. PMID:19471647
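The OR and its 95 percent confidence interval can be computed from a 2 × 2 table with the standard log (Woolf) method. A sketch with illustrative counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf/log method) for a 2x2 table:
         exposed:   a with errors, b without
         unexposed: c with errors, d without
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Illustrative counts: coding errors in records with vs without abbreviations.
or_, (lo, hi) = odds_ratio_ci(30, 70, 15, 85)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI excluding 1.0, as here, would indicate a statistically significant association between the factor and coding errors.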

  2. Comparative accuracy of different risk scores in assessing cardiovascular risk in Indians: A study in patients with first myocardial infarction

    PubMed Central

    Bansal, Manish; Kasliwal, Ravi R.; Trehan, Naresh

    2014-01-01

Background Although a number of risk assessment models are available for estimating the 10-year risk of cardiovascular (CV) events in patients requiring primary prevention of CV disease, the predictive accuracy of the contemporary risk models has not been adequately evaluated in Indians. Methods 149 patients [mean age 59.4 ± 10.6 years; 123 (82.6%) males] without prior CV disease and presenting with acute myocardial infarction (MI) were included. The four clinically most relevant risk assessment models [Framingham Risk Score (RiskFRS), World Health Organization risk prediction charts (RiskWHO), American College of Cardiology/American Heart Association pooled cohort equations (RiskACC/AHA) and the 3rd Joint British Societies' risk calculator (RiskJBS)] were applied to estimate what their predicted 10-year risk of CV events would have been had they presented just prior to suffering the acute MI. Results RiskWHO provided the lowest risk estimates, with 86.6% of patients estimated to have <20% 10-year risk. In comparison, RiskFRS and RiskACC/AHA returned higher risk estimates (61.7% and 69.8% with risk <20%, respectively; p values <0.001 for comparison with RiskWHO). However, RiskJBS identified the highest proportion of the patients as being at high risk (only 44.1% at <20% risk, p values <0.01 for comparison with all the other 3 risk scores). Conclusions This is the first study to show that in Indian patients presenting with acute MI, RiskJBS is likely to identify the largest proportion of patients as ‘high-risk’ compared to RiskWHO, RiskFRS and RiskACC/AHA. However, large-scale prospective studies are needed to confirm these findings. PMID:25634388

  3. The effect of mandibular buccal tilting on the accuracy of posterior mandibular spiral tomographic images: An in vitro study

    PubMed Central

    Sheikhi, Mahnaz; Maleki, Vida

    2011-01-01

Background: Accurate measurement of the height and buccolingual thickness of available bone plays a significant role in dental implantology. The shadow of the ramus on the mandibular second molar region disturbs the sharpness of conventional tomographic images. The aim of this study was to evaluate the effect of shifting the shadow of the ramus away from the center of the focal plane, by changing the position of the mandible, on the sharpness of images of the posterior mandibular region. Materials and Methods: In this experimental study, we used 10 dry human mandibles. Three metal balls were mounted on the midline and on the mandibular second molar regions bilaterally. Standard panoramic and tomographic images were taken. Then, the mandible was tilted buccally by 8°, compensating for the normal lingual inclination of the mandibular ridge and teeth in this region, and tomographic images were taken again. The height and thickness of bone were measured on the images and then compared with the real values measured directly on the mandibles. Also, the sharpness of the mandibular canals was compared between the two tomographic methods. Findings were analyzed with a repeated measures ANOVA test (P < 0.05). Results: The height of mandibular bone on images from the tilted tomography technique was more accurate than on standard images (P < 0.001), but standard tomography was more accurate in estimating the buccolingual thickness at the half-height point. Regarding the sharpness of the mandibular canals, we found no significant differences between the two tomographic methods. Conclusion: Buccal tilting is recommended when measuring bone height is more important, whereas routine standard tomography is preferred when thickness is requested. PMID:23372586

  4. Use of Molecular Diagnostic Tools for the Identification of Species Responsible for Snakebite in Nepal: A Pilot Study

    PubMed Central

    Sharma, Sanjib Kumar; Kuch, Ulrich; Höde, Patrick; Bruhse, Laura; Pandey, Deb P.; Ghimire, Anup; Chappuis, François; Alirol, Emilie

    2016-01-01

    Snakebite is an important medical emergency in rural Nepal. Correct identification of the biting species is crucial for clinicians to choose appropriate treatment and anticipate complications. This is particularly important for neurotoxic envenoming which, depending on the snake species involved, may not respond to available antivenoms. Adequate species identification tools are lacking. This study used a combination of morphological and molecular approaches (PCR-aided DNA sequencing from swabs of bite sites) to determine the contribution of venomous and non-venomous species to the snakebite burden in southern Nepal. Out of 749 patients admitted with a history of snakebite to one of three study centres, the biting species could be identified in 194 (25.9%). Out of these, 87 had been bitten by a venomous snake, most commonly the Indian spectacled cobra (Naja naja; n = 42) and the common krait (Bungarus caeruleus; n = 22). When both morphological identification and PCR/sequencing results were available, a 100% agreement was noted. The probability of a positive PCR result was significantly lower among patients who had used inadequate “first aid” measures (e.g. tourniquets or local application of remedies). This study is the first to report the use of forensic genetics methods for snake species identification in a prospective clinical study. If high diagnostic accuracy is confirmed in larger cohorts, this method will be a very useful reference diagnostic tool for epidemiological investigations and clinical studies. PMID:27105074

  6. A Comparative Study on Diagnostic Accuracy of Colour Coded Digital Images, Direct Digital Images and Conventional Radiographs for Periapical Lesions – An In Vitro Study

    PubMed Central

    Mubeen; K.R., Vijayalakshmi; Bhuyan, Sanat Kumar; Panigrahi, Rajat G; Priyadarshini, Smita R; Misra, Satyaranjan; Singh, Chandravir

    2014-01-01

Objectives: The identification and radiographic interpretation of periapical bone lesions is important for accurate diagnosis and treatment. The present study was undertaken to assess the feasibility and diagnostic accuracy of colour-coded digital radiographs in terms of the presence and size of lesions, and to compare the diagnostic accuracy of colour-coded digital images with direct digital images and conventional radiographs for assessing periapical lesions. Materials and Methods: Sixty dry human cadaver hemimandibles were obtained, and periapical lesions were created in the first and second premolar teeth at the junction of cancellous and cortical bone using a micromotor handpiece and carbide burs of sizes 2, 4 and 6. After each successive use of the round burs, a conventional, RVG, and colour-coded image was taken for each specimen. All the images were evaluated by three observers. The diagnostic accuracy for each bur and image mode was calculated statistically. Results: Our results showed good interobserver (kappa > 0.61) agreement for the different radiographic techniques and for the different bur sizes. Conventional radiography outperformed digital radiography in diagnosing periapical lesions made with the size 2 bur; both were equally diagnostic for lesions made with the larger bur sizes. The colour coding method was the least accurate of all the techniques. Conclusion: Conventional radiography traditionally forms the backbone of the diagnosis, treatment planning, and follow-up of periapical lesions. Direct digital imaging is an efficient technique in the diagnostic sense. Colour coding of digital radiographs was feasible but less accurate; however, this imaging technique, like any other, needs continued study, with emphasis on patient safety and the diagnostic quality of images. PMID:25584318
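The interobserver agreement reported above is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch with illustrative lesion/no-lesion calls (kappa > 0.61 is conventionally read as good agreement):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical judgements."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: fraction of cases where the raters match.
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies.
    p1, p2 = Counter(rater1), Counter(rater2)
    expected = sum(p1[c] * p2[c] for c in set(rater1) | set(rater2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Illustrative lesion/no-lesion calls from two observers (100 images).
r1 = ["lesion"] * 40 + ["none"] * 10 + ["lesion"] * 5 + ["none"] * 45
r2 = ["lesion"] * 50 + ["none"] * 50
print(round(cohens_kappa(r1, r2), 3))
```

Here observed agreement is 0.85 against 0.50 expected by chance, giving kappa = 0.70, above the 0.61 threshold cited in the abstract.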

  7. The study of opened CNC system of turning-grinding composite machine tool based on UMAC

    NASA Astrophysics Data System (ADS)

    Wang, Hongjun; Han, Qiushi; Wu, Guoxin; Ma, Chao

    2010-12-01

A general function analysis of a turning-grinding composite machine tool (TGCM) is carried out, and the structure of the TGCM, based on the 'process integration with one setup' principle, is presented. The CNC system functions of the TGCM are analyzed and its CNC framework is discussed. Finally, an open CNC system for this machine tool, comprising hardware and software subsystems, is developed based on UMAC (Universal Motion and Automation Controller). The hardware structure layout is put forward, with the hardware composed of an IPC and the UMAC, and the software is implemented using VC++ 6.0. The resulting control system meets the requirements of integrated machining and matches the hardware structure of the TGCM. Practical machining experiments showed that the system is valid, with high accuracy and high reliability.

  9. ForestPMPlot: A Flexible Tool for Visualizing Heterogeneity Between Studies in Meta-analysis

    PubMed Central

    Kang, Eun Yong; Park, Yurang; Li, Xiao; Segrè, Ayellet V.; Han, Buhm; Eskin, Eleazar

    2016-01-01

Meta-analysis has become a popular tool in genetic association studies for combining the results of different studies. A key challenge in meta-analysis is heterogeneity, or the differences in effect sizes between studies. Heterogeneity complicates the interpretation of meta-analyses. In this paper, we describe ForestPMPlot, a flexible visualization tool for analyzing the studies included in a meta-analysis. The main feature of the tool is visualizing the differences in the effect sizes of the studies to understand why the studies exhibit heterogeneity for a particular phenotype and locus pair under different conditions. We show the application of this tool to interpret a meta-analysis of 17 mouse studies and to interpret a multi-tissue eQTL study. PMID:27194809
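The between-study heterogeneity that such plots visualize is conventionally quantified with Cochran's Q and the I² statistic under a fixed-effect inverse-variance model. A minimal sketch with illustrative per-study effect sizes and standard errors (not from the cited studies):

```python
def heterogeneity(effects, ses):
    """Pooled fixed-effect estimate, Cochran's Q, and I^2 (percent)."""
    w = [1.0 / se ** 2 for se in ses]                 # inverse-variance weights
    pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    # Q: weighted squared deviations of study effects from the pooled effect.
    q = sum(wi * (ei - pooled) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, q, i2

# Illustrative effect sizes and standard errors for four studies.
pooled, q, i2 = heterogeneity([0.2, 0.5, 0.1, 0.6], [0.1, 0.1, 0.1, 0.1])
print(round(pooled, 3), round(q, 2), round(i2, 1))
```

An I² around 80%, as in this toy example, would flag substantial heterogeneity, exactly the situation in which study-by-study visualization becomes informative.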

  10. [True color accuracy in digital forensic photography].

    PubMed

    Ramsthaler, Frank; Birngruber, Christoph G; Kröll, Ann-Katrin; Kettner, Mattias; Verhoff, Marcel A

    2016-01-01

Forensic photographs must not only be unaltered and authentic, capture context-relevant content, and meet certain minimum requirements for image sharpness and information density; color accuracy also plays an important role, for instance in the assessment of injuries or taphonomic stages, or in the identification and evaluation of traces from photos. The perception of color varies subjectively from person to person, and, as a discrete property of an image, color in digital photos is also influenced to a considerable extent by technical factors such as lighting, acquisition settings, camera, and output medium (print, monitor). For these reasons, consistent color accuracy has so far been limited in digital photography. Because images usually contain a wealth of color information, especially for complex or composite colors or shades, and because wavelength-dependent sensitivity to factors such as light and shadow may vary between cameras, the usefulness of general recommendations for camera capture settings is limited. Our results indicate that true image colors can best and most realistically be captured with the SpyderCheckr technical calibration tool for digital cameras tested in this study. Apart from the simplicity and speed of the calibration procedure, a further advantage of the tool is that the results are independent of the camera used and can also be applied to the color management of output devices such as monitors and printers. The SpyderCheckr color-code patches allow true colors to be captured more realistically than with a manual white-balance tool or an automatic flash. We therefore recommend that a color management tool be considered for the acquisition of all images that demand high true-color accuracy (in particular in the setting of injury documentation). PMID:27386623

  11. Intravital microscopy as a tool to study drug delivery in preclinical studies

    PubMed Central

    Amornphimoltham, Panomwat; Masedunskas, Andrius; Weigert, Roberto

    2010-01-01

The technical developments in the field of non-linear microscopy have made intravital microscopy one of the most successful techniques for studying physiological and pathological processes in live animals. Intravital microscopy has been utilized to address many biological questions in basic research and is now a fundamental tool for preclinical studies, with enormous potential for clinical applications. The ability to dynamically image cellular and subcellular structures, combined with the possibility of performing longitudinal studies, has empowered investigators to use this discipline to study the mechanisms of action of therapeutic agents and to assess their efficacy on their targets in vivo. The goal of this review is to provide a general overview of the recent advances in intravital microscopy and to discuss some of its applications in preclinical studies. PMID:20933026

  12. Adoption of online health management tools among healthy older adults: An exploratory study.

    PubMed

    Zettel-Watson, Laura; Tsukerman, Dmitry

    2016-06-01

    As the population ages and chronic diseases abound, overburdened healthcare systems will increasingly require individuals to manage their own health. Online health management tools, quickly increasing in popularity, have the potential to diminish or even replace in-person contact with health professionals, but overall efficacy and usage trends are unknown. The current study explored perceptions and usage patterns among users of online health management tools, and identified barriers and barrier-breakers among non-users. An online survey was completed by 169 computer users (aged 50+). Analyses revealed that a sizable minority (37%) of participants use online health management tools and most users (89%) are satisfied with these tools, but a limited range of tools are being used and usage occurs in relatively limited domains. Improved awareness and education for online health management tools could enhance people's abilities to remain at home as they age, reducing the financial burden on formal assistance programs. PMID:25149210

  13. Diagnostic Accuracy Study of Intraoperative and Perioperative Serum Intact PTH Level for Successful Parathyroidectomy in 501 Secondary Hyperparathyroidism Patients

    PubMed Central

    Zhang, Lina; Xing, Changying; Shen, Chong; Zeng, Ming; Yang, Guang; Mao, Huijuan; Zhang, Bo; Yu, Xiangbao; Cui, Yiyao; Sun, Bin; Ouyang, Chun; Ge, Yifei; Jiang, Yao; Yin, Caixia; Zha, Xiaoming; Wang, Ningning

    2016-01-01

Parathyroidectomy (PTX) is an effective treatment for severe secondary hyperparathyroidism (SHPT); however, persistent SHPT may occur because of supernumerary and ectopic parathyroids. Here a diagnostic accuracy study of intraoperative and perioperative serum intact parathyroid hormone (iPTH) was performed to predict successful surgery in 501 patients who received total PTX + autotransplantation without thymectomy. Serum iPTH values before incision (io-iPTH0), 10 and 20 min after removing the last parathyroid (io-iPTH10, io-iPTH20), and on the first and fourth days after PTX (D1-iPTH, D4-iPTH) were recorded. Patients whose serum iPTH was >50 pg/mL at the first postoperative week were followed up within six months. Successful PTX was defined as iPTH <300 pg/mL; otherwise, the case was regarded as persistent SHPT. Overall, 86.4% of patients underwent successful PTX, 9.8% had persistent SHPT, and 3.8% were undetermined. Intraoperative serum iPTH demonstrated no significant differences between the two subgroups with or without chronic hepatitis. Receiver operating characteristic (ROC) curves showed that an io-iPTH decrease of >88.9% at 20 min (io-iPTH20%) could predict successful PTX (area under the curve [AUC] 0.909, sensitivity 78.6%, specificity 88.5%), thereby avoiding unnecessary exploration and reducing operative complications. D4-iPTH >147.4 pg/mL could predict persistent SHPT (AUC 0.998, sensitivity 100%, specificity 99.5%), so that medical intervention or reoperation can begin promptly. PMID:27231027
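The AUC, sensitivity, and specificity reported above can be illustrated with the Mann-Whitney formulation of the AUC and a simple threshold rule. The scores below are illustrative iPTH percentage decreases, not the study's data:

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney relation: the probability that a random
    positive case scores higher than a random negative case."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def sens_spec(pos_scores, neg_scores, threshold):
    """Sensitivity/specificity when 'score > threshold' calls a success."""
    sens = sum(p > threshold for p in pos_scores) / len(pos_scores)
    spec = sum(n <= threshold for n in neg_scores) / len(neg_scores)
    return sens, spec

# Illustrative io-iPTH % decreases: successful PTX vs persistent SHPT.
success = [95.1, 92.3, 90.4, 89.2, 85.0]
persist = [60.2, 70.5, 88.0, 75.3]
print(round(auc(success, persist), 3), sens_spec(success, persist, 88.9))
```

Sweeping the threshold over all observed scores and plotting sensitivity against 1 − specificity traces the ROC curve from which such cutoffs are chosen.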

  15. A high accuracy femto-/picosecond laser damage test facility dedicated to the study of optical thin films

    SciTech Connect

    Mangote, B.; Gallais, L.; Zerrad, M.; Lemarchand, F.; Gao, L. H.; Commandre, M.; Lequime, M.

    2012-01-15

A laser damage test facility delivering pulses from 100 fs to 3 ps and designed to operate at 1030 nm is presented, along with the details of its implementation and performance. The originality of this system lies in its online damage detection system, based on Nomarski microscopy, and in the use of a non-conventional energy detection method based on a cooled CCD, which makes it possible to obtain the laser-induced damage threshold (LIDT) with high accuracy. Applications of this instrument to the study of thin films under laser irradiation are presented. In particular, the deterministic behavior of sub-picosecond damage is investigated for fused silica and oxide films. It is demonstrated that the transition from 0 to 1 damage probability is very sharp and that the LIDT is perfectly deterministic at a few hundred femtoseconds. Because the damage process in dielectric materials is the result of electronic processes, specific information such as the material bandgap is needed for the interpretation of results and the application of scaling laws. A review of the different approaches for estimating the absorption gap of optical dielectric coatings is conducted, and the results given by the different methods are compared and discussed. The LIDT and gap of several oxide materials are then measured with the presented instrument: Al2O3, Nb2O5, HfO2, SiO2, Ta2O5, and ZrO2. The obtained relation between the LIDT and the gap at 1030 nm confirms the linear evolution of the threshold with the bandgap that exists at 800 nm, and our work expands the number of tested materials.

  16. Positional Accuracy Assessment of the Openstreetmap Buildings Layer Through Automatic Homologous Pairs Detection: the Method and a Case Study

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Molinari, M. E.; Zamboni, G.

    2016-06-01

    OpenStreetMap (OSM) is currently the largest openly licensed collection of geospatial data. As OSM is increasingly exploited in a variety of applications, research has paid great attention to the assessment of its quality. This work focuses on assessing the quality of OSM buildings. While most of the studies available in the literature are limited to evaluating OSM building completeness, this work proposes an original approach to assessing the positional accuracy of OSM buildings based on comparison with a reference dataset. The comparison relies on a quasi-automated detection of homologous pairs on the two datasets. Based on the homologous pairs found, warping algorithms such as affine transformations and multi-resolution splines can be applied to the OSM buildings to generate a new version with an optimal local match to the reference layer. A quality assessment of the OSM buildings of Milan Municipality (Northern Italy), covering an area of about 180 km2, is then presented. After computing some measures of completeness, the algorithm based on homologous points is run using the building layer of the official vector cartography of Milan Municipality as the reference dataset. Approximately 100000 homologous points are found, which show a systematic translation of about 0.4 m in both the X and Y directions and a mean distance of about 0.8 m between the datasets. Besides its efficiency and high degree of automation, the algorithm generates a warped version of the OSM buildings which, having by definition a closer match to the reference buildings, could eventually be integrated into the OSM database.
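    The core of the approach can be sketched as follows: given homologous point pairs, report the systematic offset and mean distance, then fit a 2D affine transform by least squares to warp one layer toward the other. The point counts, seed, and noise level below are invented, and the study's actual homologous-pair detection is not reproduced here.

```python
# Sketch: estimate systematic offset and a 2D affine warp from homologous
# point pairs (synthetic data standing in for OSM vs. reference buildings).
import numpy as np

rng = np.random.default_rng(0)
ref = rng.uniform(0, 1000, size=(200, 2))                # reference points (m)
true_shift = np.array([0.4, 0.4])                        # the ~0.4 m translation
osm = ref + true_shift + rng.normal(0, 0.3, ref.shape)   # noisy OSM counterparts

mean_offset = (osm - ref).mean(axis=0)                   # systematic translation
mean_dist = np.linalg.norm(osm - ref, axis=1).mean()     # mean pairwise distance

# Affine fit [x', y'] = A @ [x, y] + t, solved column-wise by least squares.
X = np.hstack([osm, np.ones((len(osm), 1))])             # rows: [x, y, 1]
params, *_ = np.linalg.lstsq(X, ref, rcond=None)         # 3x2 parameter matrix
warped = X @ params                                      # OSM warped to reference
rms_after = np.sqrt(((warped - ref) ** 2).sum(axis=1).mean())
print(mean_offset, mean_dist, rms_after)
```

    After warping, the residual distance drops to the noise floor, which is the sense in which the warped layer has "a closer match to the reference buildings".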

  17. Accuracy of Intraocular Lens Power Formulas Involving 148 Eyes with Long Axial Lengths: A Retrospective Chart-Review Study.

    PubMed

    Chen, Chong; Xu, Xian; Miao, Yuyu; Zheng, Gaoxin; Sun, Yong; Xu, Xun

    2015-01-01

    Purpose. This study aims to compare the accuracy of intraocular lens power calculation formulas in eyes with long axial lengths from Chinese patients subjected to cataract surgery. Methods. A total of 148 eyes with an axial length of >26 mm from 148 patients who underwent phacoemulsification with intraocular lens implantation were included. The Haigis, Hoffer Q, Holladay 1, and SRK/T formulas were used to calculate the refractive power of the intraocular lenses and the postoperative estimated power. Results. Overall, the Haigis formula achieved the lowest median absolute error, 1.025 D (P < 0.01 for Haigis versus each of the other formulas), followed by the SRK/T formula (1.040 D). All formulas were least accurate for eyes with axial lengths of >33 mm, and median absolute errors were significantly higher for those eyes than for eyes with axial lengths of 26.01-30.00 mm. Absolute error was correlated with axial length for the SRK/T (r = 0.212, P = 0.010) and Hoffer Q (r = 0.223, P = 0.007) formulas. For axial lengths > 33 mm, eyes exhibited a postoperative hyperopic refractive error. Conclusions. The Haigis and SRK/T formulas may be more suitable for calculating intraocular lens power for eyes with axial lengths ranging from 26 to 33 mm; for axial lengths over 33 mm, the Haigis formula may be more accurate. PMID:26793392
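    The study's headline metric, median absolute error between predicted and achieved postoperative refraction per formula, is generic and easy to sketch. The refraction values below are invented placeholders, not study data, and the formula names are used only as dictionary keys.

```python
# Median absolute error (MedAE) of predicted vs. achieved postoperative
# refraction, computed per formula on placeholder values.
from statistics import median

achieved = [-0.50, 0.25, -1.00, 0.75, -0.25, 1.50]        # dioptres, invented
predicted = {
    "Haigis": [-0.40, 0.10, -0.80, 0.60, -0.30, 1.20],
    "SRK/T":  [-0.20, 0.60, -0.60, 1.10, 0.10, 1.90],
}

medae = {
    name: median(abs(p - a) for p, a in zip(preds, achieved))
    for name, preds in predicted.items()
}
best = min(medae, key=medae.get)                          # lowest MedAE wins
print(medae, best)
```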

  18. Accuracy assessment of a marker-free method for registration of CT and stereo images applied in image-guided implantology: a phantom study.

    PubMed

    Mohagheghi, Saeed; Ahmadian, Alireza; Yaghoobee, Siamak

    2014-12-01

    To assess the accuracy of a proposed marker-free registration method against the conventional marker-based method in an image-guided dental system, and to investigate the best configurations of anatomical landmarks for various surgical fields, a phantom study was conducted using a CT-compatible dental phantom containing implanted targets. Two marker-free registration methods were evaluated: the first using dental anatomical landmarks and the second using a reference marker tool. Six implanted markers distributed in the inner space of the phantom were used as the targets; the values of target registration error (TRE) for each target were measured and compared with the marker-based method. Then, the effects of different landmark configurations on TRE values, measured using the Parsiss IV Guided Navigation system (Parsiss, Tehran, Iran), were investigated to find the landmark arrangement that minimizes registration error in each target region. It was shown that marker-free registration can be as precise as the marker-based method. This has a great impact on image-guided implantology systems, as the drawbacks of fiducial markers for patient and surgeon are removed. It was also shown that smaller TRE values could be achieved by using appropriate landmark configurations and moving the center of the landmark set closer to the surgery target. Other common factors would not necessarily decrease the TRE value, so the conventional rules accepted in the clinical community for reducing TRE should be adapted to the selected field of dental surgery. PMID:25441868
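    The TRE metric can be illustrated with a small simulation: register two point sets with a rigid fit on landmark pairs using the standard Kabsch/SVD method (not necessarily the Parsiss system's algorithm), then evaluate the error at a target excluded from the fit. All coordinates and noise levels below are invented.

```python
# Simulated point-based rigid registration and target registration error (TRE).
import numpy as np

rng = np.random.default_rng(1)
landmarks_ct = rng.uniform(0, 100, size=(6, 3))          # landmarks in CT (mm)
target_ct = np.array([50.0, 40.0, 30.0])                 # implanted target (mm)

# Ground-truth pose relating CT space to tracker space (simulation only).
theta = np.deg2rad(20)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
landmarks_tr = landmarks_ct @ R_true.T + t_true + rng.normal(0, 0.2, (6, 3))
target_tr = target_ct @ R_true.T + t_true

# Kabsch: optimal rotation between centered point sets via SVD.
ca, cb = landmarks_ct.mean(0), landmarks_tr.mean(0)
H = (landmarks_ct - ca).T @ (landmarks_tr - cb)
U, _, Vt = np.linalg.svd(H)
d = np.sign(np.linalg.det(Vt.T @ U.T))                   # guard against reflection
R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
t = cb - R @ ca

tre = np.linalg.norm((R @ target_ct + t) - target_tr)    # TRE at the target (mm)
print(f"TRE = {tre:.3f} mm")
```

    With 0.2 mm fiducial localization noise the TRE stays well below a millimetre; in the study, the point is that TRE depends on the landmark configuration relative to the target, not only on the localization noise.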

  19. Videogames, Tools for Change: A Study Based on Activity Theory

    ERIC Educational Resources Information Center

    Méndez, Laura; Lacasa, Pilar

    2015-01-01

    Introduction: The purpose of this study is to provide a framework for analysis from which to interpret the transformations that take place, as perceived by the participants, when commercial video games are used in the classroom. We will show how Activity Theory (AT) is able to explain and interpret these changes. Method: Case studies are…

  20. Accuracy in optical overlay metrology

    NASA Astrophysics Data System (ADS)

    Bringoltz, Barak; Marciano, Tal; Yaziv, Tal; DeLeeuw, Yaron; Klein, Dana; Feler, Yoel; Adam, Ido; Gurevich, Evgeni; Sella, Noga; Lindenfeld, Ze'ev; Leviant, Tom; Saltoun, Lilach; Ashwal, Eltsafon; Alumot, Dror; Lamhot, Yuval; Gao, Xindong; Manka, James; Chen, Bryan; Wagner, Mark

    2016-03-01

    In this paper we discuss the mechanism by which process variations determine the overlay accuracy of optical metrology. We start by focusing on scatterometry, and show that the underlying physics of this mechanism involves interference effects between cavity modes that travel between the upper and lower gratings in the scatterometry target. A direct result is the behavior of accuracy as a function of wavelength, and the existence of relatively well defined spectral regimes in which the overlay accuracy and process robustness degrade ('resonant regimes'). These resonances are separated by wavelength regions in which the overlay accuracy is better and independent of wavelength (we term these 'flat regions'). The combination of flat and resonant regions forms a spectral signature which is unique to each overlay alignment and carries certain universal features with respect to different types of process variations. We term this signature the 'landscape', and discuss its universality. Next, we show how to characterize overlay performance with a finite set of metrics that are available on the fly, and that are derived from the angular behavior of the signal and the way it flags resonances. These metrics are used to guarantee the selection of accurate recipes and targets for the metrology tool, and for process control with the overlay tool. We end with comments on the similarity of imaging overlay to scatterometry overlay, and on the ways that pupil overlay scatterometry and field overlay scatterometry differ from an accuracy perspective.

  1. The Effect of Delayed-JOLs and Sentence Generation on Children's Monitoring Accuracy and Regulation of Idiom Study

    ERIC Educational Resources Information Center

    van Loon, Mariëtte H.; de Bruin, Anique B. H.; van Gog, Tamara; van Merriënboer, Jeroen J. G.

    2013-01-01

    When studying verbal materials, both adults and children are often poor at accurately monitoring their level of learning and regulating their subsequent restudy of materials, which leads to suboptimal test performance. The present experiment investigated how monitoring accuracy and regulation of study could be improved when learning idiomatic…

  2. Effects of random study checks and guided notes study cards on middle school special education students' notetaking accuracy and science vocabulary quiz scores

    NASA Astrophysics Data System (ADS)

    Wood, Charles L.

    Federal legislation mandates that all students with disabilities have meaningful access to the general education curriculum and that students with and without disabilities be held equally accountable to the same academic standards (IDEIA, 2004; NCLB, 2001). Many students with disabilities, however, perform poorly in academic content courses, especially at the middle and secondary school levels. Previous research has reported increased notetaking accuracy and quiz scores over lecture content when students completed guided notes compared to taking their own notes. This study evaluated the effects of a pre-quiz review procedure and specially formatted guided notes on middle school special education students' learning of science vocabulary. It compared three experimental conditions: (a) Own Notes (ON), (b) Own Notes + Random Study Checks (ON+RSC), and (c) Guided Notes Study Cards + Random Study Checks (GNSC+RSC) on each student's accuracy of notes, next-day quiz scores, and review quiz scores. Each session, the teacher presented 12 science vocabulary terms and definitions during a lecture and students took notes. The students were given 5 minutes to study their notes at the end of each session and were reminded to study their notes at home and during study hall. In the ON condition students took notes on a sheet of paper with numbered lines from 1 to 12. Just before each next-day quiz in the ON+RSC condition, students used write-on response cards to answer two teacher-posed questions over randomly selected vocabulary terms from the previous day's lecture. If the answer on a randomly selected student's response card was correct, that student earned a lottery ticket for inexpensive prizes and a quiz bonus point for herself and each classmate. In the GNSC+RSC condition students took notes on specially formatted guided notes that, after the lecture, they cut into a set of flashcards that could be used for study. The students' mean notetaking accuracy was 75

  3. Experience of Integrating Various Technological Tools into the Study and Future Teaching of Mathematics Education Students

    ERIC Educational Resources Information Center

    Gorev, Dvora; Gurevich-Leibman, Irina

    2015-01-01

    This paper presents our experience of integrating technological tools into our mathematics teaching (in both disciplinary and didactic courses) for student-teachers. In the first cycle of our study, a variety of technological tools were used (e.g., dynamic software, hypertexts, video and applets) in teaching two disciplinary mathematics courses.…

  4. Generating Animal and Tool Names: An fMRI Study of Effective Connectivity

    ERIC Educational Resources Information Center

    Vitali, P.; Abutalebi, J.; Tettamanti, M.; Rowe, J.; Scifo, P.; Fazio, F.; Cappa, S.F.; Perani, D.

    2005-01-01

    The present fMRI study of semantic fluency for animal and tool names provides further evidence for category-specific brain activations, and reports task-related changes in effective connectivity among defined cerebral regions. Two partially segregated systems of functional integration were highlighted: the tool condition was associated with an…

  5. Dietary Adherence Monitoring Tool for Free-living, Controlled Feeding Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Objective: To devise a dietary adherence monitoring tool for use in controlled human feeding trials involving free-living study participants. Methods: A scoring tool was devised to measure and track dietary adherence for an 8-wk randomized trial evaluating the effects of two different dietary patter...

  6. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    ERIC Educational Resources Information Center

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  7. Tools to study and manage grazing behavior at multiple scales to enhance the sustainability of livestock

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Free-ranging animal behavior is a multifaceted and complex phenomenon within rangeland ecology that must be understood and ultimately managed. Improving behavioral studies requires tools appropriate for use at the landscape scale. Though tools alone do not assure research will generate accurate in...

  8. GoPro as an Ethnographic Tool: A Wayfinding Study in an Academic Library

    ERIC Educational Resources Information Center

    Kinsley, Kirsten M.; Schoonover, Dan; Spitler, Jasmine

    2016-01-01

    In this study, researchers sought to capture students' authentic experience of finding books in the main library using a GoPro camera and the think-aloud protocol. The GoPro provided a first-person perspective and was an effective ethnographic tool for observing a student's individual experience, while also demonstrating what tools they use to…

  9. Parents' and Service Providers' Perceptions of the Family Goal Setting Tool: A Pilot Study

    ERIC Educational Resources Information Center

    Rodger, Sylvia; O'Keefe, Amy; Cook, Madonna; Jones, Judy

    2012-01-01

    Background: This qualitative study describes parents' and service providers' experiences in using the Family Goal Setting Tool (FGST). This article looks specifically at the tool's perceived clinical utility during annual, collaborative goal setting. Methods: Participants included eight parents and ten service providers involved in a Family and…

  10. U.S. CASE STUDIES USING MUNICIPAL SOLID WASTE DECISION SUPPORT TOOL

    EPA Science Inventory

    The paper provides an overview of some case studies using the recently completed municipal solid waste decision support tool (MSW-DST) in communities across the U.S. The purpose of the overview is to help illustrate the variety of potential applications of the tool. The methodolo...

  11. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    ERIC Educational Resources Information Center

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  12. The Accuracy of Recidivism Risk Assessments for Sexual Offenders: A Meta-Analysis of 118 Prediction Studies

    ERIC Educational Resources Information Center

    Hanson, R. Karl; Morton-Bourgon, Kelly E.

    2009-01-01

    This review compared the accuracy of various approaches to the prediction of recidivism among sexual offenders. On the basis of a meta-analysis of 536 findings drawn from 118 distinct samples (45,398 sexual offenders, 16 countries), empirically derived actuarial measures were more accurate than unstructured professional judgment for all outcomes…

  13. The effects of relatedness and GxE interaction on prediction accuracies in genomic selection: a study in cassava

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Prior to implementation of genomic selection, an evaluation of the potential accuracy of prediction can be obtained by cross validation. In this procedure, a population with both phenotypes and genotypes is split into training and validation sets. The prediction model is fitted using the training se...

  14. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    ERIC Educational Resources Information Center

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using 2 different scoring methods, the authors examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children ages 3;11-5;8 (years;months) participated--17 children with SLI and 17 typically developing…

  15. Refining Ovarian Cancer Test accuracy Scores (ROCkeTS): protocol for a prospective longitudinal test accuracy study to validate new risk scores in women with symptoms of suspected ovarian cancer

    PubMed Central

    Sundar, Sudha; Rick, Caroline; Dowling, Francis; Au, Pui; Rai, Nirmala; Champaneria, Rita; Stobart, Hilary; Neal, Richard; Davenport, Clare; Mallett, Susan; Sutton, Andrew; Kehoe, Sean; Timmerman, Dirk; Bourne, Tom; Van Calster, Ben; Gentry-Maharaj, Aleksandra; Deeks, Jon

    2016-01-01

    Introduction Ovarian cancer (OC) is associated with non-specific symptoms such as bloating, making accurate diagnosis challenging: only 1 in 3 women with OC presents through primary care referral. National Institute for Health and Care Excellence guidelines recommend sequential testing with CA125 and routine ultrasound in primary care. However, these diagnostic tests have limited sensitivity or specificity. Improving the accuracy of triage in women with vague symptoms is likely to reduce mortality by streamlining referral and care pathways. The Refining Ovarian Cancer Test Accuracy Scores (ROCkeTS; HTA 13/13/01) project will derive and validate new tests/risk prediction models that estimate the probability of having OC in women with symptoms. This protocol refers to the prospective study only (phase III). Methods and analysis ROCkeTS comprises four parallel phases. The full ROCkeTS protocol can be found at http://www.birmingham.ac.uk/ROCKETS. Phase III is a prospective test accuracy study. The study will recruit 2450 patients from 15 UK sites. Recruited patients complete symptom and anxiety questionnaires, donate a serum sample and undergo ultrasound scored as per International Ovarian Tumour Analysis (IOTA) criteria. Recruitment is at rapid access clinics, emergency departments and elective clinics. Models to be evaluated include those based on ultrasound derived by the IOTA group and novel models derived from analysis of existing data sets. Estimates of sensitivity, specificity, c-statistic (area under the receiver operating characteristic curve), positive predictive value and negative predictive value of diagnostic tests will be evaluated, and a calibration plot for models will be presented. ROCkeTS has received ethical approval from the NHS West Midlands REC (14/WM/1241) and is registered on the controlled trials website (ISRCTN17160843) and the National Institute of Health Research Cancer and Reproductive Health portfolios. PMID:27507231
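    The accuracy measures the protocol lists are all computed from a 2x2 cross-classification of test result against the reference diagnosis. A minimal sketch, with illustrative counts rather than ROCkeTS results:

```python
# Diagnostic accuracy measures from a 2x2 table (illustrative counts).
tp, fp, fn, tn = 90, 60, 10, 840      # true/false positives and negatives

sensitivity = tp / (tp + fn)          # P(test+ | disease)
specificity = tn / (tn + fp)          # P(test- | no disease)
ppv = tp / (tp + fp)                  # P(disease | test+)
npv = tn / (tn + fn)                  # P(no disease | test-)
print(sensitivity, specificity, ppv, npv)
```

    Unlike sensitivity and specificity, PPV and NPV depend on disease prevalence in the recruited population, which is why validation in the intended symptomatic-population setting matters.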

  16. PFMS (Precision Flexible Manufacturing System) tool characterization study at Lawrence Livermore National Laboratory

    SciTech Connect

    Hansen, H.J.; Prokosch, M.W.

    1989-06-02

    A tool characterization study was conducted as part of the modeling effort at LLNL in support of the Y-12 Enhanced T-base Project. The goal of the study was to identify tool error sources and measure their influence on the finish machining process of D-38 Hemi Shells. This paper outlines the procedures and results of this study. The condition of new carbide inserts used at Y-12 is examined, including insert edge quality, geometry, and tool radius size and contour deviation. Test parts were cut on LLNL's Pneumo Precision T-Base lathe according to common Y-12 finishing procedures. Part profile and tool contour were measured before and after each cut. The effects of initial tool profile errors and tool wear during the cut are discussed as they relate to the finished part profile. Insert and part inspection procedures are covered in detail. The established tool wear patterns for both the .002'' and the .005'' depths of cut typically used at Y-12 are presented. Additional tool error source observations are also included. 22 figs.

  17. Developing a Social Autopsy Tool for Dengue Mortality: A Pilot Study

    PubMed Central

    Arauz, María José; Ridde, Valéry; Hernández, Libia Milena; Charris, Yaneth; Carabali, Mabel; Villar, Luis Ángel

    2015-01-01

    Background Dengue fever is a public health problem in the tropical and sub-tropical world. Dengue cases, as well as dengue mortality, have grown dramatically in recent years. Colombia has experienced periodic dengue outbreaks with numerous dengue-related deaths, where the Santander department has been particularly affected. Although social determinants of health (SDH) shape health outcomes, including mortality, it is not yet understood how these affect dengue mortality. The aim of this pilot study was to develop and pre-test a social autopsy (SA) tool for dengue mortality. Methods and Findings The tool was developed and pre-tested in three steps. First, definitions of dengue fatal cases and ‘near misses’ (those who recovered from dengue complications) were elaborated. Second, a conceptual framework on determinants of dengue mortality was developed to guide the construction of the tool. Lastly, the tool was designed and pre-tested among three relatives of fatal cases and six near misses in 2013 in the metropolitan zone of Bucaramanga. The tool turned out to be practical in the context of dengue mortality in Colombia after some modifications. The tool aims to study the social, individual, and health systems determinants of dengue mortality. It focuses on studying the socioeconomic position and the intermediary SDH rather than the socioeconomic and political context. Conclusions The SA tool is based on the scientific literature, a validated conceptual framework, researchers’ and health professionals’ expertise, and a pilot study. It is the first time that a SA tool has been created for the dengue mortality context. Our work furthers the study of SDH and how these apply to neglected tropical diseases, like dengue. This tool could be integrated into surveillance systems to provide complementary information on the modifiable and avoidable death-related factors and, therefore, to formulate interventions for dengue mortality reduction. PMID:25658485

  18. Psychological Autopsy Studies as Diagnostic Tools: Are They Methodologically Flawed?

    ERIC Educational Resources Information Center

    Hjelmeland, Heidi; Dieserud, Gudrun; Dyregrov, Kari; Knizek, Birthe L.; Leenaars, Antoon A.

    2012-01-01

    One of the most established "truths" in suicidology is that almost all (90% or more) of those who kill themselves suffer from one or more mental disorders, and a causal link between the two is implied. Psychological autopsy (PA) studies constitute one main evidence base for this conclusion. However, there has been little reflection on the…

  19. A generalized approach and computer tool for quantitative genetics study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative genetics is one of the most important components to provide valuable genetic information for improving production and quality of plants and animals. The research history of quantitative genetics study could be traced back more than one hundred years. Since the Analysis of Variance (ANOV...

  20. Factor Analysis: A Tool for Studying Mathematics Anxiety.

    ERIC Educational Resources Information Center

    McAuliffe, Elizabeth A.; Trueblood, Cecil R.

    Mathematics anxiety and its relationship to other constructs was studied in 138 preservice elementary and special education teachers. The students, primarily women, were enrolled in a variety of professional courses and field experiences. Five instruments were administered, their factor structures were determined, and intercorrelations among the…

  1. The Writing Workshop as an Inservice Tool: A Case Study.

    ERIC Educational Resources Information Center

    Pollock, Jeri

    1994-01-01

    Presents a case study of an inservice writing workshop (at Our Lady of Mercy School in Rio de Janeiro, Brazil) designed to give teachers hands-on experience in applying computer writing to their individual subjects. Describes how a computer culture was developed at the school. (RS)

  2. Value of systematic detection of physical child abuse at emergency rooms: a cross-sectional diagnostic accuracy study

    PubMed Central

    Sittig, Judith S; Uiterwaal, Cuno S P M; Moons, Karel G M; Russel, Ingrid M B; Nievelstein, Rutger A J; Nieuwenhuis, Edward E S; van de Putte, Elise M

    2016-01-01

    Objectives The aim of our diagnostic accuracy study Child Abuse Inventory at Emergency Rooms (CHAIN-ER) was to establish whether a widely used checklist accurately detects or excludes physical abuse among children presenting to ERs with physical injury. Design A large multicentre study with a 6-month follow-up. Setting 4 ERs in The Netherlands. Participants 4290 children aged 0–7 years attending the ER because of physical injury. All children were systematically tested with an easy-to-use child abuse checklist (index test). A national expert panel (reference standard) retrospectively assessed all children with positive screens and a 15% random sample of the children with negative screens for physical abuse, using additional information, namely, an injury history taken by a paediatrician, information provided by the general practitioner, youth doctor and social services by structured questionnaires, and 6-month follow-up information. Main outcome measure Physical child abuse. Secondary outcome measure Injury due to neglect and need for help. Results 4253/4290 (99%) parents agreed to follow-up. At a prevalence of 0.07% (3/4253) for inflicted injury by expert panel decision, the positive predictive value of the checklist was 0.03 (95% CI 0.006 to 0.085), and the negative predictive value 1.0 (0.994 to 1.0). There was 100% (93 to 100) agreement between the expert panel and child abuse experts about inflicted injury in children with positive screens. Conclusions Rare cases of inflicted injury among preschool children presenting at ERs for injury are very likely captured by easy-to-use checklists, but at very high false-positive rates. Subsequent assessment by child abuse experts can be safely restricted to children with positive screens at very low risk of missing cases of inflicted injury. Because of the high false-positive rate, we do advise careful prior consideration of cost-effectiveness and clinical and societal implications before de novo implementation.
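    The study's striking combination of PPV 0.03 and NPV 1.0 is exactly what Bayes' rule predicts at a prevalence of about 0.07%: even a test with high sensitivity and specificity produces mostly false positives. The sensitivity and specificity values below are illustrative assumptions, not CHAIN-ER estimates.

```python
# Bayes' rule: how low prevalence drives PPV down even for a good screen.
prevalence = 0.0007        # ~0.07%, as in the study
sensitivity = 0.95         # assumed for illustration
specificity = 0.97         # assumed for illustration

p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_pos
npv = (specificity * (1 - prevalence)
       / (specificity * (1 - prevalence) + (1 - sensitivity) * prevalence))
print(f"PPV = {ppv:.3f}, NPV = {npv:.6f}")
```

    Under these assumptions the PPV lands in the low single-digit percent range while the NPV remains essentially 1, matching the pattern reported in the abstract.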

  3. Case studies: low cost, high-strength, large carbon foam tooling

    SciTech Connect

    Lucas, R.; Danford, H.

    2009-01-15

    A new carbon foam tooling system has been developed that results in a low-cost, high-strength material that has proven attractive for creating tooling for composite parts. Composites are stronger, lighter, and less subject to corrosion and fatigue than materials that are currently used for fabrication of advanced structures. Tools to manufacture these composite parts must be rigid, durable and able to offer a coefficient of thermal expansion (CTE) closely matching that of the composites. Current technology makes it difficult to match the CTE of a composite part in the curing cycle with anything other than a carbon composite or a nickel-iron alloy such as Invar. Fabrication of metallic tooling requires many expensive stages of long duration with a large infrastructure investment. Carbon fiber reinforced polymer resin composite tooling has a shorter lead time but limited production use because of durability concerns. Coal-based carbon foam material has a compatible CTE and strong durability, which make it an attractive alternative for use in tooling. The use of coal-based carbon foam in tooling for carbon composites is advantageous because of its low cost, light weight, machinability, vacuum integrity and compatibility with a wide range of curing processes. Large-scale tooling case studies will be presented detailing carbon foam's potential for tooling applications.

  4. DeID - a data sharing tool for neuroimaging studies.

    PubMed

    Song, Xuebo; Wang, James; Wang, Anlin; Meng, Qingping; Prescott, Christian; Tsu, Loretta; Eckert, Mark A

    2015-01-01

    Funding institutions and researchers increasingly expect that data will be shared to increase scientific integrity and provide other scientists with the opportunity to use the data with novel methods that may advance understanding in a particular field of study. In practice, sharing human subject data can be complicated because data must be de-identified prior to sharing. Moreover, integrating varied data types collected in a study can be challenging and time consuming. For example, sharing data from structural imaging studies of a complex disorder requires the integration of imaging, demographic and/or behavioral data in such a way that no subject identifiers are included in the de-identified dataset, with new subject labels or identification values that cannot be traced back to the original ones. We have developed a Java program that users can use to remove identifying information in neuroimaging datasets, while still maintaining the association among different data types from the same subject for further studies. This software provides a series of user interaction wizards to allow users to select data variables to be de-identified, implements functions for auditing and validation of de-identified data, and enables the user to share the de-identified data in a single compressed package through various communication protocols, such as FTPS and SFTP. DeID runs on Windows, Linux, and Mac operating systems, and its open architecture allows it to be easily adapted to support a broader array of data types, with the goal of facilitating data sharing. DeID can be obtained at http://www.nitrc.org/projects/deid. PMID:26441500
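    The core de-identification step the abstract describes can be sketched briefly: replace subject identifiers with new random labels that cannot be traced back, while keeping records from the same subject linked. This is a generic sketch, not DeID's actual Java implementation; the field names are invented.

```python
# Sketch: random subject relabeling that preserves within-subject linkage.
import secrets

records = [
    {"subject_id": "SUB-001", "modality": "T1",  "age": 34},
    {"subject_id": "SUB-001", "modality": "DTI", "age": 34},
    {"subject_id": "SUB-002", "modality": "T1",  "age": 41},
]

id_map = {}  # old -> new; kept only by the honest broker, never shared

def new_label(old_id):
    """Assign each original ID one random, untraceable replacement."""
    if old_id not in id_map:
        id_map[old_id] = f"ANON-{secrets.token_hex(4)}"
    return id_map[old_id]

shared = [{**r, "subject_id": new_label(r["subject_id"])} for r in records]
assert shared[0]["subject_id"] == shared[1]["subject_id"]  # linkage preserved
assert shared[0]["subject_id"] != "SUB-001"                # original ID removed
print(shared)
```

    A real pipeline would additionally scrub identifiers from image headers and audit the output, as the abstract notes; the mapping table itself must never accompany the shared package.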

  5. Matrix isolation as a tool for studying interstellar chemical reactions

    NASA Technical Reports Server (NTRS)

    Ball, David W.; Ortman, Bryan J.; Hauge, Robert H.; Margrave, John L.

    1989-01-01

    Since the identification of the OH radical as an interstellar species, over 50 molecular species have been identified as interstellar denizens. While identification of new species appears straightforward, an explanation for their mechanisms of formation is not. Most astronomers concede that large bodies like interstellar dust grains are necessary for the adsorption of molecules and their energies of reaction, but many of the mechanistic steps are unknown and speculative. It is proposed that data from matrix isolation experiments involving the reactions of refractory materials (especially C, Si, and Fe atoms and clusters) with small molecules (mainly H2, H2O, CO, CO2) are particularly applicable to explaining mechanistic details of likely interstellar chemical reactions. In many cases, matrix isolation techniques are the sole method of studying such reactions; also in many cases, complexations and bond rearrangements yield molecules never before observed. The study of these reactions thus provides a logical basis for the mechanisms of interstellar reactions. A list of reactions is presented that would simulate interstellar chemical reactions. These reactions were studied using FTIR-matrix isolation techniques.

  6. Development of a Burn Escharotomy Assessment Tool: A Pilot Study.

    PubMed

    Ur, Rebecca; Holmes, James H; Johnson, James E; Molnar, Joseph A; Carter, Jeffrey E

    2016-01-01

    Severe burn injuries can require escharotomies, which are urgent, infrequent, and relatively high-risk procedures necessary to preserve limb perfusion and sometimes ventilation. The American Burn Association Advanced Burn Life Support© course educates surgeons and emergency providers about escharotomy incisions but lacks a biomimetic trainer on which to demonstrate, practice, or assess them. The goal was to build an affordable biomimetic trainer with discrete points of failure and to pilot a validation study. Fellowship-trained burn and plastic surgeons worked with special-effects artists and anatomists to develop a biomimetic trainer with three discrete points of failure: median or ulnar nerve injury, fasciotomy, and failure to check the distal pulse. Participants, divided into experienced and inexperienced groups, were surveyed pre- and post-procedure and performed the procedure on a biomimetic model while being timed. The trainer's total cost per participant was less than $35. Eighteen participants were involved in the study. The inexperienced participants (0-1 prior escharotomies performed) had significantly more violations at the discrete points of failure relative to more experienced participants (P = .036). Face validity was assessed, with 100% of participants agreeing that the model appeared similar to real life and was valuable in their training. Given the advancements in biomimetic models and the need to train surgeons to perform infrequent, emergent surgical procedures, an escharotomy trainer is needed today. The authors developed an affordable model, with a successful pilot study demonstrating discrimination between experienced and inexperienced surgeons. Additional research is needed to increase the reliability and assessment metrics. PMID:26594860
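    The group comparison reported above (more violations among the inexperienced, P = .036, n = 18) is the kind of small-sample 2x2 analysis handled by Fisher's exact test. A stdlib sketch follows; the per-group violation counts are hypothetical, since the abstract does not report them:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    (a, b), (c, d) = table
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):
        # probability of x "successes" in row 1, margins fixed
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# hypothetical counts: [violated a point of failure, clean run]
inexperienced = [8, 1]   # 8 of 9 novices violated at least one point
experienced = [3, 6]
p = fisher_exact_two_sided([inexperienced, experienced])
```

With these illustrative counts the two-sided p-value is about 0.0498, i.e. significant at the 0.05 level, mirroring the direction of the study's finding.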

  7. The space elevator: a new tool for space studies.

    PubMed

    Edwards, Bradley C

    2003-06-01

    The objective has been to develop a viable scenario for the construction, deployment and operation of a space elevator using current or near future technology. This effort has been primarily a paper study with several experimental tests of specific systems. Computer simulations, engineering designs, literature studies and inclusion of existing programs have been utilized to produce a design for the first space elevator. The results from this effort illustrate a viable design using current and near-term technology for the construction of the first space elevator. The timeline for possible construction is within the coming decades and estimated costs are less than $10 B. The initial elevator would have a 5 ton/day capacity and operating costs near $100/lb for payloads going to any Earth orbit or traveling to the Moon, Mars, Venus or the asteroids. An operational space elevator would allow for larger and much longer-term biological space studies at selectable gravity levels. The high-capacity and low operational cost of this system would also allow for inexpensive searches for life throughout our solar system and the first tests of environmental engineering. This work is supported by a grant from the NASA Institute for Advanced Concepts (NIAC). PMID:12959137

  8. Studying PubMed usages in the field for complex problem solving: Implications for tool design.

    PubMed

    Mirel, Barbara; Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2013-05-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists' behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists' problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  9. Studying PubMed usages in the field for complex problem solving: Implications for tool design

    PubMed Central

    Song, Jean; Tonks, Jennifer Steiner; Meng, Fan; Xuan, Weijian; Ameziane, Rafiqa

    2012-01-01

    Many recent studies on MEDLINE-based information seeking have shed light on scientists’ behaviors and associated tool innovations that may improve efficiency and effectiveness. Few if any studies, however, examine scientists’ problem-solving uses of PubMed in actual contexts of work and corresponding needs for better tool support. Addressing this gap, we conducted a field study of novice scientists (14 upper level undergraduate majors in molecular biology) as they engaged in a problem solving activity with PubMed in a laboratory setting. Findings reveal many common stages and patterns of information seeking across users as well as variations, especially variations in cognitive search styles. Based on findings, we suggest tool improvements that both confirm and qualify many results found in other recent studies. Our findings highlight the need to use results from context-rich studies to inform decisions in tool design about when to offer improved features to users. PMID:24376375

  10. In vitro study of accuracy of cervical pedicle screw insertion using an electronic conductivity device (ATPS part III)

    PubMed Central

    Hitzl, Wolfgang; Acosta, Frank; Tauber, Mark; Zenner, Juliane; Resch, Herbert; Yukawa, Yasutsugu; Meier, Oliver; Schmidt, Rene; Mayer, Michael

    2009-01-01

    Reconstruction of the highly unstable, anteriorly decompressed cervical spine poses biomechanical challenges to current stabilization strategies, including circumferential instrumented fusion, to prevent failure. To avoid secondary posterior surgery, particularly in the elderly population, while increasing the primary construct rigidity of anterior-only reconstructions, the authors introduced the concept of anterior transpedicular screw (ATPS) fixation and plating. We demonstrated its morphological feasibility, its superior biomechanical pull-out characteristics compared with vertebral body screws, and the accuracy of inserting ATPS using a manual fluoroscopically assisted technique. Although accuracy was high, showing non-critical breaches in the axial and sagittal planes in 78 and 96% of cases, respectively, further research was indicated to refine the technique and increase accuracy. In light of the first clinical case series, the authors analyzed the impact of using an electronic conductivity device (ECD, PediGuard) on the accuracy of ATPS insertion. As experience exists only in thoracolumbar surgery, the versatility of the ECD was also assessed for posterior cervical pedicle screw fixation (pCPS). 30 ATPS and 30 pCPS were inserted alternately into the C3–T1 vertebrae of five fresh-frozen specimens. Fluoroscopic assistance was used only for entry point selection; pedicle tract preparation was done using the ECD. Preoperative CT scans were assessed for sclerosis at the pedicle entrance or core, and vertebrae with dense pedicles were excluded. Pre- and postoperative reconstructed CT scans were analyzed for pedicle screw positions according to a previously established grading system. Statistical analysis revealed an astonishingly high accuracy for the ATPS group, with no critical screw position (0%) in the axial or sagittal plane. In the pCPS group, 88.9% of the screws inserted showed non-critical screw positions, while 11.1% showed critical pedicle perforations. 
The usage of an ECD for posterior and

  11. Elastohydrodynamic lubrication calculations used as a tool to study scuffing

    NASA Technical Reports Server (NTRS)

    Houpert, L. G.; Hamrock, B. J.

    1985-01-01

    A new Reynolds equation was developed that takes into account the nonlinear viscous behavior of the fluid. The new Reynolds equation considers the nonlinear viscous fluid model of Eyring, the equilibrium equation, the constant mass flow, and the kinematic boundary condition. The new Reynolds equation and the elasticity equation are solved simultaneously by using a system approach and a Newton-Raphson technique. Comparisons are made with results obtained from the classical Reynolds equation. The effects of sliding speed and introducing a bump or a groove within the conjunction are studied. Results are shown for both moderate and heavy loads.
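    The simultaneous solution of coupled nonlinear equations by the system approach with Newton-Raphson, as used above for the Reynolds and elasticity equations, can be illustrated on a toy two-equation system. The actual EHL system is far larger and its residuals are the discretized Reynolds/elasticity equations; this sketch only shows the iteration itself:

```python
def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson for a 2x2 nonlinear system: solve
    J(x) * dx = -F(x) each step and update until the residual is small."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        if max(abs(f1), abs(f2)) < tol:
            break
        a, b, c, d = J(x, y)           # Jacobian [[a, b], [c, d]]
        det = a * d - b * c
        dx = (-f1 * d + f2 * b) / det  # Cramer's rule for the 2x2 solve
        dy = (-a * f2 + c * f1) / det
        x, y = x + dx, y + dy
    return x, y

# toy coupled system standing in for the Reynolds/elasticity pair:
#   x^2 + y^2 = 4,   x*y = 1
F = lambda x, y: (x * x + y * y - 4.0, x * y - 1.0)
J = lambda x, y: (2 * x, 2 * y, y, x)
x, y = newton_system(F, J, (2.0, 0.5))
```

Solving the two residuals simultaneously, rather than alternating between them, is what gives the system approach its fast (quadratic) convergence near the solution.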

  12. Formaldehyde crosslinking: a tool for the study of chromatin complexes.

    PubMed

    Hoffman, Elizabeth A; Frey, Brian L; Smith, Lloyd M; Auble, David T

    2015-10-30

    Formaldehyde has been used for decades to probe macromolecular structure and function and to trap complexes, cells, and tissues for further analysis. Formaldehyde crosslinking is routinely employed for detection and quantification of protein-DNA interactions, interactions between chromatin proteins, and interactions between distal segments of the chromatin fiber. Despite widespread use and a rich biochemical literature, important aspects of formaldehyde behavior in cells have not been well described. Here, we highlight features of formaldehyde chemistry relevant to its use in analyses of chromatin complexes, focusing on how its properties may influence studies of chromatin structure and function. PMID:26354429

  13. NANIVID: A New Research Tool for Tissue Microenvironment Studies

    NASA Astrophysics Data System (ADS)

    Raja, Waseem K.

    Metastatic tumors are heterogeneous in nature and composed of subpopulations of cells having various metastatic potentials. The time progression of a tumor creates a unique microenvironment that improves the invasion capabilities and survivability of cancer cells in different microenvironments. In the early stages of intravasation, cancer cells establish communication with other cell types through a paracrine loop and cover long distances by sensing growth factor gradients through extracellular matrices. Cellular migration both in vitro and in vivo is a complex process, and to understand cell motility in depth, sophisticated techniques are required to document and record events in real time. This study presents the design and optimization of a new versatile chemotaxis device called the NANIVID (NANo IntraVital Imaging Device), developed using advanced nano/microfabrication techniques. The current version of this device has been demonstrated to form a stable epidermal growth factor (EGF) gradient in vitro (2D and 3D), while a miniaturized version of the NANIVID is used as an implantable device for intravital studies of chemotaxis and to collect cells in vivo. The device is fabricated using microfabrication techniques in which two substrates are bonded together using a thin polymer layer, creating a bonded device with one point-source (approximately 150 μm x 50 μm) outlet. The main structures of the device consist of two transparent substrates: one having etched chambers and a channel, while the second consists of a microelectrode system to measure real-time cell arrival inside the device. The chamber of the device is loaded with a growth factor reservoir consisting of hydrogel to sustain a steady release of growth factor into the surrounding environment for long periods of time, establishing a concentration gradient from the device. The focus of this study was to design and optimize the new device for cell chemotaxis studies in breast cancer cells in cell culture. 
Our results

  14. Oscillation mechanics of the respiratory system in never-smoking patients with silicosis: pathophysiological study and evaluation of diagnostic accuracy

    PubMed Central

    de Sá, Paula Morisco; Lopes, Agnaldo José; Jansen, José Manoel; de Melo, Pedro Lopes

    2013-01-01

    OBJECTIVES: Silicosis is a chronic and incurable occupational disease that can progress even after the cessation of exposure. Recent studies suggest that the forced oscillation technique may help to clarify the changes in lung mechanics resulting from silicosis as well as the detection of these changes. We investigated the effects of airway obstruction in silicosis on respiratory impedance and evaluated the diagnostic efficacy of the forced oscillation technique in these patients. METHODS: Spirometry was used to classify the airway obstruction, which resulted in four subject categories: controls (n = 21), patients with a normal exam (n = 12), patients with mild obstruction (n = 22), and patients with moderate-to-severe obstruction (n = 12). Resistive data were interpreted using the zero-intercept resistance (R0), the resistance at 4 Hz (Rrs4), and the mean resistance (Rm). We also analyzed the mean reactance (Xm) and the dynamic compliance (Crs,dyn). The total mechanical load was evaluated using the absolute value of the respiratory impedance (Z4Hz). The diagnostic potential was evaluated by investigating the area under the receiver operating characteristic curve. ClinicalTrials.gov: NCT01725971. RESULTS: We observed significant (p<0.0002) increases in R0, Rrs4, Rm, and Z4Hz and significant reductions in Crs,dyn (p<0.0002) and Xm (p<0.0001). R0, Rrs4, Rm, and Z4Hz performed adequately in the diagnosis of mild obstruction (area under the curve>0.80) and highly accurately in the detection of moderate-to-severe obstruction (area under the curve>0.90). CONCLUSIONS: The forced oscillation technique may contribute to the study of the pathophysiology of silicosis and may improve the treatment offered to these patients, thus representing an alternative and/or complementary tool for evaluating respiratory mechanics. PMID:23778400
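    The area under the receiver operating characteristic curve used above to grade diagnostic potential can be computed directly as a rank statistic: it equals the probability that a randomly chosen patient value exceeds a randomly chosen control value. A short sketch with hypothetical resistance values (not the study's data):

```python
def auc(controls, patients):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of (patient, control) pairs where the patient value is
    higher, counting ties as 1/2."""
    wins = 0.0
    for p in patients:
        for c in controls:
            if p > c:
                wins += 1.0
            elif p == c:
                wins += 0.5
    return wins / (len(patients) * len(controls))

# hypothetical resistance values (cmH2O/L/s): obstruction raises R0
controls = [1.8, 2.1, 2.3, 2.5, 2.7]
patients = [2.4, 3.1, 3.5, 3.9, 4.2]
a = auc(controls, patients)
```

With these illustrative values the AUC is 0.92, which on the study's scale would count as adequate-to-high diagnostic accuracy (AUC > 0.80 adequate, > 0.90 high).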

  15. Cell transfection as a tool to study growth hormone action

    SciTech Connect

    Norstedt, G.; Enberg, B.; Francis, S.

    1994-12-31

    The isolation of growth hormone receptor (GHR) cDNA clones has made possible the transfection of GHRs into cultured cells. Our aim in this minireview is to show how the application of such approaches has benefited GHR research. GH stimulation of cells expressing GHR cDNAs can cause alterations of cellular function that mimic those of the endogenous GHR. GHR cDNA-transfected cells also offer a system in which the mechanism of GH action can be studied. Such a system has been used to demonstrate that the GHR itself becomes tyrosine phosphorylated and that further phosphorylation of downstream proteins is important in GH action. The GH signals are transmitted to the nucleus, and GH-regulated genes have now begun to be characterized. The ability to use cell transfection for mechanistic studies of GH action will be instrumental in defining domains within the receptor that are of functional importance and in determining the pathways whereby GH signals are conveyed within the cell. 33 refs., 2 tabs.

  16. Exposure tool chuck flatness study and effects on lithography

    NASA Astrophysics Data System (ADS)

    Mukherjee-Roy, Moitreyee; Tan, Cher-Huan; Tan, Yong K.; Samudra, Ganesh S.

    2001-04-01

    The flatness of the chuck on the stepper or scanner is critical for obtaining good patterning performance, especially in the sub-quarter-micron regime. In this study an attempt has been made to understand the flatness signature of the chuck by measuring the flatness of a super-flat wafer in two different notch orientations and subtracting the signatures. If the chuck and the wafer were ideally flat, there would be no difference in flatness signatures between the two orientations. In practice, however, a difference was found, as neither the chuck nor the wafer is perfectly flat. This difference could be used to gain an understanding of the flatness signature of the scanner chuck itself. This signature could be used by equipment manufacturers as an additional method of measuring chuck flatness, so that only superior chucks are used in equipment being made for sub-quarter-micron lithography. The second part of this study consisted of finding out the effect of this flatness on the resulting CD on wafers. Wafers with different flatness signatures were exposed at different orientations, and the CD variations were evaluated. All wafers showed improvements in the orientation of better flatness. For some wafers the improvement was significant, but for others the result was close to the CD variation due to rework. This could be attributed to the inherent signatures of the wafers and to how abrupt the change in flatness was. The wafer deformation factor was not analyzed, for brevity, as this would make the problem far more complex.
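    The two-orientation subtraction idea can be sketched numerically. If each measurement is wafer shape plus chuck signature, then rotating the second measurement (taken with the wafer turned 180°) back and subtracting cancels the wafer term and leaves only the asymmetric part of the chuck signature. A toy sketch with 3x3 height maps in arbitrary units (the real maps are dense flatness grids):

```python
def rot180(grid):
    """Rotate a square height map by 180 degrees."""
    return [row[::-1] for row in grid[::-1]]

def add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def sub(a, b):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

# hypothetical wafer and chuck height maps
wafer = [[0, 1, 0], [1, 2, 1], [0, 1, 0]]
chuck = [[0, 0, 3], [0, 0, 0], [0, 0, 0]]

m0 = add(wafer, chuck)             # wafer measured notch-down
m180 = add(rot180(wafer), chuck)   # same wafer rotated 180 degrees

# wafer shape cancels; what remains is chuck - rot180(chuck),
# i.e. the asymmetric part of the chuck signature
diff = sub(m0, rot180(m180))
```

Note the method only exposes the chuck's asymmetric component; a chuck error that is itself 180°-symmetric cancels in the difference, which is why it complements rather than replaces direct flatness metrology.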

  17. Emerging tools to study proteoglycan function during skeletal development.

    PubMed

    Brown, D S; Eames, B F

    2016-01-01

    In the past 20 years, appreciation for the varied roles of proteoglycans (PGs), which are specific types of sugar-coated proteins, has increased dramatically. PGs in the extracellular matrix were long known to impart structural functions to many tissues, especially articular cartilage, which cushions bones and allows mobility at skeletal joints. Indeed, osteoarthritis is a debilitating disease associated with loss of PGs in articular cartilage. Today, however, PGs have a demonstrated role in cell biological processes, such as growth factor signalling, prompting new perspectives on the etiology of PG-associated diseases. Here, we review diseases associated with defects in PG synthesis and sulfation, also highlighting current understanding of the underlying genetics, biochemistry, and cell biology. Since most research has analyzed a class of PGs called heparan sulfate PGs, more attention is paid here to studies of chondroitin sulfate PGs (CSPGs), which are abundant in cartilage. Interestingly, CSPG synthesis is tightly linked to the cell biological processes of secretion and lysosomal degradation, suggesting that these systems may be linked genetically. Animal models of loss of CSPG function have revealed that CSPGs impact skeletal development. Specifically, our work from a mutagenesis screen in zebrafish led to the hypothesis that cartilage PGs normally delay the timing of endochondral ossification. Finally, we outline emerging approaches in zebrafish that may revolutionize the study of cartilage PG function, including transgenic methods and novel imaging techniques. Our recent work with X-ray fluorescent imaging, for example, enables direct correlation of PG function with PG-dependent biological processes. PMID:27312503

  18. Improvement of focus accuracy on processed wafer

    NASA Astrophysics Data System (ADS)

    Higashibata, Satomi; Komine, Nobuhiro; Fukuhara, Kazuya; Koike, Takashi; Kato, Yoshimitsu; Hashimoto, Kohji

    2013-04-01

    As feature size shrinkage in semiconductor devices progresses, process fluctuation, especially focus, strongly affects device performance. Because focus control is an ongoing challenge in optical lithography, various studies have sought to improve focus monitoring and control. Focus errors are due to wafers, exposure tools, reticles, QCs, and so on. Few studies have been performed to minimize the measurement errors of the auto focus (AF) sensors of exposure tools, especially when processed wafers are exposed. Among current focus measurement techniques, the phase shift grating (PSG) focus monitor 1) has already been proposed; its basic principle is that the intensity of the diffracted light of the mask pattern is made asymmetric by arranging a π/2 phase shift area on the reticle. The resist pattern exposed at a defocus position is shifted on the wafer, and the shifted pattern can be easily measured using an overlay inspection tool. However, it is difficult to measure the shifted pattern on processed wafers because of interruptions caused by other patterns in the underlayer. In this paper, we therefore propose the "SEM-PSG" technique, in which the shift of the PSG resist mark is measured by employing a critical-dimension scanning electron microscope (CD-SEM) to measure the focus error on processed wafers. First, we evaluate the accuracy of the SEM-PSG technique. Second, by applying the SEM-PSG technique and feeding the results back to the exposure, we evaluate the focus accuracy on processed wafers. By applying SEM-PSG feedback, the focus accuracy on the processed wafer was improved from 40 to 29 nm in 3σ.

  19. Methods and tools to enjoy and to study inaccessible Heritage

    NASA Astrophysics Data System (ADS)

    Capone, M.; Campi, M.

    2014-06-01

    Our research on a multi-purpose survey of cultural heritage located in the UNESCO Historical Centre of Naples has the following goals: to test some innovative strategies to improve public enjoyment of inaccessible sites; to explore the use of some interactive systems to study heritage remotely; and to explore how to access the information system through AR applications. In this paper we focus on a comparison between interactive systems for accessing 3D data and photogrammetric processing of panoramic images. We investigated: a. the use of 360° panoramas for 3D restitutions; b. the use of a 360° panorama as an interface to 3D data to extract real 3D coordinates and accurately measure distances; c. the use of 3D PDF to access a 3D database.

  20. Collagen matrix as a tool in studying fibroblastic cell behavior.

    PubMed

    Kanta, Jiří

    2015-01-01

    Type I collagen is a fibrillar protein, a member of a large family of collagen proteins. It is present in most body tissues, usually in combination with other collagens and other components of extracellular matrix. Its synthesis is increased in various pathological situations, in healing wounds, in fibrotic tissues and in many tumors. After extraction from collagen-rich tissues it is widely used in studies of cell behavior, especially those of fibroblasts and myofibroblasts. Cells cultured in a classical way, on planar plastic dishes, lack the third dimension that is characteristic of body tissues. Collagen I forms gel at neutral pH and may become a basis of a 3D matrix that better mimics conditions in tissue than plastic dishes. PMID:25734486

  1. Collagen matrix as a tool in studying fibroblastic cell behavior

    PubMed Central

    Kanta, Jiří

    2015-01-01

    Type I collagen is a fibrillar protein, a member of a large family of collagen proteins. It is present in most body tissues, usually in combination with other collagens and other components of extracellular matrix. Its synthesis is increased in various pathological situations, in healing wounds, in fibrotic tissues and in many tumors. After extraction from collagen-rich tissues it is widely used in studies of cell behavior, especially those of fibroblasts and myofibroblasts. Cells cultured in a classical way, on planar plastic dishes, lack the third dimension that is characteristic of body tissues. Collagen I forms gel at neutral pH and may become a basis of a 3D matrix that better mimics conditions in tissue than plastic dishes. PMID:25734486

  2. Sphingolipidomics: An Important Mechanistic Tool for Studying Fungal Pathogens

    PubMed Central

    Singh, Ashutosh; Del Poeta, Maurizio

    2016-01-01

    Sphingolipids form a unique and complex group of bioactive lipids in fungi. Structurally, the sphingolipids of fungi are quite diverse, with unique differences in the sphingoid backbone, the amide-linked fatty acyl chain, and the polar head group. Two of the most studied and conserved sphingolipid classes in fungi are the glucosyl- or galactosyl-ceramides and the phosphorylinositol-containing phytoceramides. Comprehensive structural characterization and quantification of these lipids is largely based on advanced analytical mass spectrometry-based lipidomic methods. While separation of complex lipid mixtures is achieved through high-performance liquid chromatography, soft electrospray ionization tandem mass spectrometry allows high sensitivity and selectivity of detection. Herein, we present an overview of the lipid extraction, chromatographic separation and mass spectrometry employed in qualitative and quantitative sphingolipidomics in fungi. PMID:27148190

  3. Phase segmentation of X-ray computer tomography rock images using machine learning techniques: an accuracy and performance study

    NASA Astrophysics Data System (ADS)

    Chauhan, Swarup; Rühaak, Wolfram; Anbergen, Hauke; Kabdenov, Alen; Freise, Marcus; Wille, Thorsten; Sass, Ingo

    2016-07-01

    The performance and accuracy of machine learning techniques for segmenting rock grain, matrix and pore voxels from a 3-D volume of X-ray tomographic (XCT) grayscale rock images were evaluated. The segmentation and classification capability of unsupervised (k-means, fuzzy c-means, self-organized maps), supervised (artificial neural networks, least-squares support vector machines) and ensemble classifiers (bagging and boosting) were tested using XCT images of andesite volcanic rock, Berea sandstone, Rotliegend sandstone and a synthetic sample. The averaged porosity obtained for andesite (15.8 ± 2.5 %), Berea sandstone (16.3 ± 2.6 %), Rotliegend sandstone (13.4 ± 7.4 %) and the synthetic sample (48.3 ± 13.3 %) is in very good agreement with the respective laboratory measurement data and varies by a factor of 0.2. The k-means algorithm is the fastest of all the machine learning algorithms, whereas the least-squares support vector machine is the most computationally expensive. The metrics entropy, purity, root mean square error, the receiver operating characteristic curve and 10-fold cross-validation were used to determine the accuracy of the unsupervised, supervised and ensemble classifier techniques. In general, the accuracy was found to be largely affected by the feature vector selection scheme. As there is always a trade-off between performance and accuracy, it is difficult to isolate one particular machine learning algorithm that is best suited to the complex phase segmentation problem. Therefore, our investigation provides parameters that can help in selecting the appropriate machine learning technique for phase segmentation.
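    k-means, the fastest algorithm in the comparison above, reduces to a few lines when applied to grayscale voxel values. A stdlib toy sketch (hypothetical intensities; real XCT segmentation would also use spatial feature vectors, which is exactly the feature-selection issue the study highlights):

```python
def kmeans_1d(values, k=3, iters=100):
    """Plain 1-D k-means: assign each grayscale value to the nearest
    centroid, recompute each centroid as its cluster mean, repeat
    until the centroids stop moving."""
    vals = sorted(values)
    # spread the initial centroids across the sorted intensity range
    centroids = [vals[(2 * i + 1) * len(vals) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vals:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids

# hypothetical voxels: pores (dark), matrix (mid), grains (bright)
voxels = [10, 12, 14, 11, 120, 125, 118, 122, 240, 235, 238, 242]
phases = kmeans_1d(voxels)   # sorted init keeps index 0 darkest here
porosity = sum(
    1 for v in voxels
    if min(range(3), key=lambda i: abs(v - phases[i])) == 0
) / len(voxels)
```

Porosity then falls out as the fraction of voxels assigned to the darkest (pore) cluster, which is how segmented volumes are compared against laboratory porosity measurements.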

  4. Scanning force microscopy as a tool for fracture studies

    SciTech Connect

    Thome, F.; Goeken, M.; Grosse Gehling, M.; Vehoff, H.

    1999-08-01

    Dynamic simulations of the fracture toughness as a function of orientation and temperature were carried out and compared with experimental results obtained by in-situ loading of pre-cracked NiAl single crystals inside a scanning force microscope (SFM). In order to compare the simulations with the experiments, the problem of the short crack with dislocations was solved for general loading and arbitrary slip line directions. The stress and strain fields obtained could be directly connected to FEM calculations, which allowed the examination of the stability of micro cracks at notches. The effects of different fracture conditions under biaxial loading were studied in detail. The dynamic simulation yielded predictions of K{sub IC}, slip line length and dislocation distributions as a function of loading rate, temperature and orientation. These predictions were tested by in-situ loading NiAl single crystals inside a SFM at various temperatures. The local COD, slip line length and apparent dislocation distribution at the surface were measured as a function of the applied load and the temperature. The experiments clearly demonstrated that dislocations are emitted from the crack tip before unstable crack jumps occur. The local COD could be directly related to the number of dislocations emitted from the crack tip. With increasing temperature the number of dislocations and the local COD increased before unstable crack jumps or final fracture occurred.

  5. Animal models as tools to study the pathophysiology of depression.

    PubMed

    Abelaira, Helena M; Réus, Gislaine Z; Quevedo, João

    2013-01-01

    The incidence of depressive illness is high worldwide, and the inadequacy of currently available drug treatments contributes to the significant health burden associated with depression. A basic understanding of the underlying disease processes in depression is lacking; therefore, recreating the disease in animal models is not possible. Popular current models of depression creatively merge ethologically valid behavioral assays with the latest technological advances in molecular biology. Within this context, this study aims to evaluate animal models of depression and determine which has the best face, construct, and predictive validity. These models differ in the degree to which they produce features that resemble a depressive-like state, and models that include stress exposure are widely used. Paradigms that employ acute or sub-chronic stress exposure include learned helplessness, the forced swimming test, the tail suspension test, maternal deprivation, chronic mild stress, and sleep deprivation, to name but a few, all of which employ relatively short-term exposure to inescapable or uncontrollable stress and can reliably detect antidepressant drug response. PMID:24271223

  6. Understanding FRET as a Research Tool for Cellular Studies

    PubMed Central

    Shrestha, Dilip; Jenei, Attila; Nagy, Péter; Vereb, György; Szöllősi, János

    2015-01-01

    Communication of molecular species through dynamic association and/or dissociation at various cellular sites governs biological functions. Understanding these physiological processes requires delineation of molecular events occurring at the level of individual complexes in a living cell. Among the few non-invasive approaches with nanometer resolution are methods based on Förster Resonance Energy Transfer (FRET). FRET is effective at a distance of 1–10 nm, which is equivalent to the size of macromolecules, thus providing an unprecedented level of detail on molecular interactions. The emergence of fluorescent proteins and SNAP- and CLIP-tag proteins provided FRET with the capability to monitor changes in a molecular complex in real time, making it possible to establish the functional significance of the studied molecules in a native environment. Now, FRET is widely used in the biological sciences, including the fields of proteomics, signal transduction, diagnostics and drug development, to address questions almost unimaginable with biochemical methods and conventional microscopies. However, the underlying physics of FRET often scares biologists. Therefore, in this review, our goal is to introduce FRET to non-physicists in a lucid manner. We will also discuss our contributions to various FRET methodologies based on microscopy and flow cytometry, while describing its application for determining the molecular heterogeneity of the plasma membrane in various cell types. PMID:25815593
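    The steep distance dependence that makes FRET a "molecular ruler" over 1–10 nm follows from the standard efficiency relation E = 1 / (1 + (r/R0)^6), where R0 is the Förster radius (the donor-acceptor distance at 50% transfer). It can be evaluated directly (illustrative distances only):

```python
def fret_efficiency(r, r0):
    """Förster transfer efficiency at donor-acceptor distance r,
    given the Förster radius r0 (distance of 50% efficiency)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# efficiency falls off steeply around r0 (both distances in nm)
e_close = fret_efficiency(3.0, 5.0)   # well inside r0 -> near 1
e_at_r0 = fret_efficiency(5.0, 5.0)   # exactly r0 -> 0.5
e_far   = fret_efficiency(9.0, 5.0)   # nearly out of range -> near 0
```

The sixth-power falloff is why FRET reports on direct molecular association rather than mere co-localization: moving the pair apart by a few nanometers collapses the signal.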

  7. Next generation sequencing technologies: tool to study avian virus diversity.

    PubMed

    Kapgate, S S; Barbuddhe, S B; Kumanan, K

    2015-03-01

    Increased globalisation, climatic changes and the wildlife-livestock interface have led to the emergence of novel viral pathogens or zoonoses that have become serious concerns for avian, animal and human health. High biodiversity and bird migration facilitate the spread of pathogens and provide reservoirs for emerging infectious diseases. Current classical diagnostic methods, designed to be virus-specific or limited to a group of viral agents, hinder the identification of novel viruses or viral variants. Recently developed approaches of next-generation sequencing (NGS) provide culture-independent methods that are useful for understanding viral diversity and the discovery of novel viruses, thereby enabling better diagnosis and disease control. This review discusses the different possible steps of an NGS study utilizing sequence-independent amplification, high-throughput sequencing and bioinformatics approaches to identify novel avian viruses and their diversity. NGS has led to the identification of a wide range of new viruses such as picobirnavirus, picornavirus, orthoreovirus and avian gamma coronavirus associated with fulminating disease in guinea fowl, and it is also used in describing viral diversity among avian species. The review also briefly discusses areas of virus-host interaction and disease-associated causalities with newly identified avian viruses. PMID:25790045

  8. Immediate effects of lower cervical spine manipulation on handgrip strength and free-throw accuracy of asymptomatic basketball players: a pilot study

    PubMed Central

    Humphries, Kelley M.; Ward, John; Coats, Jesse; Nobert, Jeannique; Amonette, William; Dyess, Stephen

    2013-01-01

    Objective The purpose of this pilot study was to collect preliminary information for a study to determine the immediate effects of a single unilateral chiropractic manipulation to the lower cervical spine on handgrip strength and free-throw accuracy in asymptomatic male recreational basketball players. Methods For this study, 24 asymptomatic male recreational right-handed basketball players (age = 26.3 ± 9.2 years, height = 1.81 ± 0.07 m, body mass = 82.6 ± 10.4 kg [mean ± SD]) underwent baseline dominant handgrip isometric strength and free-throw accuracy testing in an indoor basketball court. They were then equally randomized to receive either (1) diversified left lower cervical spine chiropractic manipulative therapy (CMT) at C5/C6 or (2) placebo CMT at C5/C6 using an Activator adjusting instrument on zero force setting. Participants then underwent posttesting of isometric handgrip strength and free-throw accuracy. A paired-samples t test was used to make within-group pre to post comparisons and between-group pre to post comparisons. Results No statistically significant difference was shown between either of the 2 basketball performance variables measured in either group. Isometric handgrip strength marginally improved by 0.7 kg (mean) in the CMT group (P = .710). Free-throw accuracy increased by 13.2% in the CMT group (P = .058). The placebo CMT group performed the same or more poorly during their second test session. Conclusions The results of this preliminary study showed that a single lower cervical spine manipulation did not significantly impact basketball performance for this group of healthy asymptomatic participants. A slight increase in free-throw percentage was seen, which deserves further investigation. This pilot study demonstrates that a larger study to evaluate if CMT affects handgrip strength and free-throw accuracy is feasible. PMID:24396315

  9. Prostate intrafraction motion evaluation using kV fluoroscopy during treatment delivery: A feasibility and accuracy study

    PubMed Central

    Adamson, Justus; Wu, Qiuwen

    2008-01-01

    Margin reduction for prostate radiotherapy is limited by uncertainty in prostate localization during treatment. We investigated the feasibility and accuracy of measuring prostate intrafraction motion using kV fluoroscopy performed simultaneously with radiotherapy. Three gold coils used for target localization were implanted into the patient’s prostate gland before undergoing hypofractionated online image-guided step-and-shoot intensity modulated radiation therapy (IMRT) on an Elekta Synergy linear accelerator. At each fraction, the patient was aligned using a cone-beam computed tomography (CBCT), after which the IMRT treatment delivery and fluoroscopy were performed simultaneously. In addition, a post-treatment CBCT was acquired with the patient still on the table. To measure the intrafraction motion, we developed an algorithm to register the fluoroscopy images to a reference image derived from the post-treatment CBCT, and we estimated coil motion in three-dimensional (3D) space by combining information from registrations at different gantry angles. We also detected the MV beam turning on and off using MV scatter incident in the same fluoroscopy images, and used this information to synchronize our intrafraction evaluation with the treatment delivery. In addition, we assessed the following: the method to synchronize with treatment delivery, the dose from kV imaging, the accuracy of the localization, and the error propagated into the 3D localization from motion between fluoroscopy acquisitions. With 0.16 mAs∕frame and a bowtie filter implemented, the coils could be localized with the gantry at both 0° and 270° with the MV beam off, and at 270° with the MV beam on when multiple fluoroscopy frames were averaged. The localization in two-dimensions for phantom and patient measurements was performed with submillimeter accuracy. After backprojection into 3D the patient localization error was (−0.04±0.30) mm, (0.09±0.36) mm, and (0.03±0.68) mm in the right

  10. Prostate intrafraction motion evaluation using kV fluoroscopy during treatment delivery: A feasibility and accuracy study

    SciTech Connect

    Adamson, Justus; Wu Qiuwen

    2008-05-15

    Margin reduction for prostate radiotherapy is limited by uncertainty in prostate localization during treatment. We investigated the feasibility and accuracy of measuring prostate intrafraction motion using kV fluoroscopy performed simultaneously with radiotherapy. Three gold coils used for target localization were implanted into the patient's prostate gland before undergoing hypofractionated online image-guided step-and-shoot intensity modulated radiation therapy (IMRT) on an Elekta Synergy linear accelerator. At each fraction, the patient was aligned using a cone-beam computed tomography (CBCT), after which the IMRT treatment delivery and fluoroscopy were performed simultaneously. In addition, a post-treatment CBCT was acquired with the patient still on the table. To measure the intrafraction motion, we developed an algorithm to register the fluoroscopy images to a reference image derived from the post-treatment CBCT, and we estimated coil motion in three-dimensional (3D) space by combining information from registrations at different gantry angles. We also detected the MV beam turning on and off using MV scatter incident in the same fluoroscopy images, and used this information to synchronize our intrafraction evaluation with the treatment delivery. In addition, we assessed the following: the method to synchronize with treatment delivery, the dose from kV imaging, the accuracy of the localization, and the error propagated into the 3D localization from motion between fluoroscopy acquisitions. With 0.16 mAs/frame and a bowtie filter implemented, the coils could be localized with the gantry at both 0° and 270° with the MV beam off, and at 270° with the MV beam on when multiple fluoroscopy frames were averaged. The localization in two-dimensions for phantom and patient measurements was performed with submillimeter accuracy. After backprojection into 3D the patient localization error was (−0.04±0.30) mm, (0.09±0.36) mm, and (0.03±0.68) mm in the

  11. Rapid decision tool to predict earthquake destruction in Sumatra by using first motion study

    NASA Astrophysics Data System (ADS)

    Bhakta, Shardul Sanjay

    The main idea of this project is to build an interactive and smart Geographic Information System tool that can help predict the intensity of real-time earthquakes on the island of Sumatra, Indonesia. The tool has an underlying intelligence to predict the intensity of an earthquake based on analysis of similar past earthquakes in that specific region. Whenever an earthquake takes place in Sumatra, a First Motion Study is conducted; this determines its type, depth, latitude and longitude. When the user enters this information, the tool searches for past earthquakes with a similar First Motion Survey and depth, surveys them, and predicts whether the real-time earthquake is likely to be disastrous. The tool has been developed in Java. I used MOJO (Map Objects Java Objects), a set of Java APIs created by ESRI, to show the map of Indonesia and earthquake locations as points. The Indonesia map, the earthquake location points and their co-relation were all designed using MOJO, a powerful toolkit that made the tool straightforward to build. The tool is easy to use, and the user has to input only a few parameters to obtain the end result. I hope this tool justifies its use in the prediction of earthquakes and helps save lives in Sumatra.
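    The lookup the abstract describes (retrieve past events with a similar first-motion solution and depth, then judge destructiveness from them) can be sketched as follows. This is a hypothetical Python illustration, not the tool's actual Java/MOJO implementation; the field names, the 25 km depth tolerance and the majority-vote rule are all assumptions.

```python
# Hypothetical sketch of a first-motion-based similarity lookup.
# Catalog fields, the depth tolerance and the majority vote are
# illustrative assumptions, not the tool's real design.

from dataclasses import dataclass

@dataclass
class Quake:
    mechanism: str    # first-motion solution type, e.g. "thrust"
    depth_km: float
    destructive: bool

def similar_quakes(catalog, mechanism, depth_km, depth_tol_km=25.0):
    """Past events with the same mechanism and a comparable depth."""
    return [q for q in catalog
            if q.mechanism == mechanism
            and abs(q.depth_km - depth_km) <= depth_tol_km]

def predict_destructive(catalog, mechanism, depth_km):
    """Majority vote over similar past events; None if nothing matches."""
    matches = similar_quakes(catalog, mechanism, depth_km)
    if not matches:
        return None
    return sum(q.destructive for q in matches) > len(matches) / 2

catalog = [
    Quake("thrust", 30.0, True),
    Quake("thrust", 40.0, True),
    Quake("strike-slip", 10.0, False),
]
verdict = predict_destructive(catalog, "thrust", 35.0)
```

    Returning None when no similar event exists mirrors the practical limit of any analogue-based predictor: with no precedent in the catalog, the tool has nothing to reason from.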

  12. Eating tools in hand activate the brain systems for eating action: a transcranial magnetic stimulation study.

    PubMed

    Yamaguchi, Kaori; Nakamura, Kimihiro; Oga, Tatsuhide; Nakajima, Yasoichi

    2014-07-01

    There is increasing neuroimaging evidence suggesting that visually presented tools automatically activate the human sensorimotor system coding learned motor actions relevant to the visual stimuli. Such crossmodal activation may reflect a general functional property of human motor memory and thus may also operate in other, non-limb effector organs, such as the orofacial system involved in eating. In the present study, we predicted that somatosensory signals produced by eating tools in hand covertly activate the neuromuscular systems involved in eating action. In Experiments 1 and 2, we measured the motor evoked potential (MEP) of the masseter muscle in normal humans to examine the possible impact of tools in hand (chopsticks and scissors) on the neuromuscular systems during the observation of food stimuli. We found that eating tools (chopsticks) enhanced the masseter MEPs more strongly than other tools (scissors) during the visual recognition of food, although this covert change in motor excitability was not detectable at the behavioral level. In Experiment 3, we further observed that chopsticks overall increased MEPs more strongly than scissors and that this tool-driven increase of MEPs was greater when participants viewed food stimuli than when they viewed non-food stimuli. A joint analysis of the three experiments confirmed a significant impact of eating tools on the masseter MEPs during food recognition. Taken together, these results suggest that eating tools in hand exert a category-specific impact on the neuromuscular system for eating. PMID:24835403

  13. Accuracy of autofluorescence in diagnosing oral squamous cell carcinoma and oral potentially malignant disorders: a comparative study with aero-digestive lesions

    PubMed Central

    Luo, Xiaobo; Xu, Hao; He, Mingjing; Han, Qi; Wang, Hui; Sun, Chongkui; Li, Jing; Jiang, Lu; Zhou, Yu; Dan, Hongxia; Feng, Xiaodong; Zeng, Xin; Chen, Qianming

    2016-01-01

    Various studies have investigated the accuracy of autofluorescence in diagnosing oral squamous cell carcinoma (OSCC) and oral potentially malignant disorders (OPMD), with diverse conclusions. This study aimed to assess its accuracy for OSCC and OPMD and to investigate its applicability in general dental practice. After a comprehensive literature search, a meta-analysis was conducted to calculate the pooled diagnostic indexes of autofluorescence for premalignant lesions (PML) and malignant lesions (ML) of the oral cavity, lung, esophagus, stomach and colorectum and to compute indexes regarding the detection of OSCC aided by algorithms. In addition, a u test was performed. Twenty-four studies detecting OSCC and OPMD in 2761 lesions were included. The overall accuracy of autofluorescence for OSCC and OPMD was superior to that for PML and ML of the lung, esophagus and stomach, and slightly inferior to that for the colorectum. Additionally, the sensitivity and specificity for OSCC and OPMD were 0.89 and 0.80, respectively. Furthermore, the specificity could be remarkably improved by additional algorithms. With relatively high accuracy, autofluorescence could potentially be applied as an adjunct for early diagnosis of OSCC and OPMD. Moreover, approaches such as algorithms could enhance its specificity to ensure its efficacy in primary care. PMID:27416981

  14. Accuracy of autofluorescence in diagnosing oral squamous cell carcinoma and oral potentially malignant disorders: a comparative study with aero-digestive lesions.

    PubMed

    Luo, Xiaobo; Xu, Hao; He, Mingjing; Han, Qi; Wang, Hui; Sun, Chongkui; Li, Jing; Jiang, Lu; Zhou, Yu; Dan, Hongxia; Feng, Xiaodong; Zeng, Xin; Chen, Qianming

    2016-01-01

    Various studies have investigated the accuracy of autofluorescence in diagnosing oral squamous cell carcinoma (OSCC) and oral potentially malignant disorders (OPMD), with diverse conclusions. This study aimed to assess its accuracy for OSCC and OPMD and to investigate its applicability in general dental practice. After a comprehensive literature search, a meta-analysis was conducted to calculate the pooled diagnostic indexes of autofluorescence for premalignant lesions (PML) and malignant lesions (ML) of the oral cavity, lung, esophagus, stomach and colorectum and to compute indexes regarding the detection of OSCC aided by algorithms. In addition, a u test was performed. Twenty-four studies detecting OSCC and OPMD in 2761 lesions were included. The overall accuracy of autofluorescence for OSCC and OPMD was superior to that for PML and ML of the lung, esophagus and stomach, and slightly inferior to that for the colorectum. Additionally, the sensitivity and specificity for OSCC and OPMD were 0.89 and 0.80, respectively. Furthermore, the specificity could be remarkably improved by additional algorithms. With relatively high accuracy, autofluorescence could potentially be applied as an adjunct for early diagnosis of OSCC and OPMD. Moreover, approaches such as algorithms could enhance its specificity to ensure its efficacy in primary care. PMID:27416981

  15. Study on accuracy and interobserver reliability of the assessment of odontoid fracture union using plain radiographs or CT scans

    PubMed Central

    Kolb, Klaus; Zenner, Juliane; Reynolds, Jeremy; Dvorak, Marcel; Acosta, Frank; Forstner, Rosemarie; Mayer, Michael; Tauber, Mark; Auffarth, Alexander; Kathrein, Anton; Hitzl, Wolfgang

    2009-01-01

    In odontoid fracture research, outcome can be evaluated based on validated questionnaires, on functional outcome in terms of atlantoaxial and total neck rotation, and on the treatment-related union rate. Data on clinical and functional outcome are still sparse. In contrast, there is abundant information on union rates, although the reported rates frequently differ widely. Odontoid union is the most frequently assessed outcome parameter, and it is therefore imperative to investigate the interobserver reliability of fusion assessment using radiographs compared to CT scans. Our objective was to identify the diagnostic accuracy of plain radiographs in detecting union and non-union after odontoid fractures, using CT scans as the standard of reference. Complete sets of biplanar plain radiographs and CT scans of 21 patients treated for odontoid fractures were subjected to interobserver assessment of fusion. Image sets were presented to 18 international observers with a mean experience in fusion assessment of 10.7 years. Patients selected had complete radiographic follow-up at a mean of 63.3 ± 53 months. Mean age of the patients at follow-up was 68.2 years. We calculated interobserver agreement of the diagnostic assessment using radiographs compared to using CT scans, as well as the sensitivity and specificity of the radiographic assessment. Agreement on fusion status using radiographs compared to CT scans ranged between 62 and 90%, depending on the observer. Concerning the assessment of non-union and fusion, the mean specificity was 62% and the mean sensitivity was 77%. Statistical analysis revealed agreement of 80–100% between the biplanar radiographs and the reconstructed CT scans in only 48% of cases. In 50% of patients assessed, agreement was less than 80%. The mean sensitivity and specificity values indicate that radiographs are not a reliable measure of odontoid fracture union or non-union. Regarding experience in years

  16. Astrophysics with Microarcsecond Accuracy Astrometry

    NASA Technical Reports Server (NTRS)

    Unwin, Stephen C.

    2008-01-01

    Space-based astrometry promises to provide a powerful new tool for astrophysics. At a precision level of a few microarcseconds, a wide range of phenomena are opened up for study. In this paper we discuss the capabilities of the SIM Lite mission, the first space-based long-baseline optical interferometer, which will deliver parallaxes to 4 microarcsec. A companion paper in this volume will cover the development and operation of this instrument. At the level that SIM Lite will reach, better than 1 microarcsec in a single measurement, planets as small as one Earth can be detected around many dozens of the nearest stars. Not only can planet masses be definitively measured, but also the full orbital parameters determined, allowing study of system stability in multiple-planet systems. This capability to survey our nearby stellar neighbors for terrestrial planets will be a unique contribution to our understanding of the local universe. SIM Lite will be able to tackle a wide range of interesting problems in stellar and Galactic astrophysics. By tracing the motions of stars in dwarf spheroidal galaxies orbiting our Milky Way, SIM Lite will probe the shape of the galactic potential, the history of the formation of the galaxy, and the nature of dark matter. Because it is flexibly scheduled, the instrument can dwell on faint targets, maintaining its full accuracy on objects as faint as V=19. This paper is a brief survey of the diverse problems in modern astrophysics that SIM Lite will be able to address.

  17. Validity of ICD-9-CM codes for breast, lung and colorectal cancers in three Italian administrative healthcare databases: a diagnostic accuracy study protocol

    PubMed Central

    Abraha, Iosief; Serraino, Diego; Giovannini, Gianni; Stracci, Fabrizio; Casucci, Paola; Alessandrini, Giuliana; Bidoli, Ettore; Chiari, Rita; Cirocchi, Roberto; De Giorgi, Marcello; Franchini, David; Vitale, Maria Francesca; Fusco, Mario; Montedori, Alessandro

    2016-01-01

    Introduction Administrative healthcare databases are useful tools to study healthcare outcomes and to monitor the health status of a population. Patients with cancer can be identified through disease-specific codes, prescriptions and physician claims, but prior validation is required to achieve an accurate case definition. The objective of this protocol is to assess the accuracy of International Classification of Diseases Ninth Revision—Clinical Modification (ICD-9-CM) codes for breast, lung and colorectal cancers in identifying patients diagnosed with the relative disease in three Italian administrative databases. Methods and analysis Data from the administrative databases of Umbria Region (910 000 residents), Local Health Unit 3 of Napoli (1 170 000 residents) and Friuli-Venezia Giulia Region (1 227 000 residents) will be considered. In each administrative database, patients with the first occurrence of diagnosis of breast, lung or colorectal cancer between 2012 and 2014 will be identified using the following groups of ICD-9-CM codes in primary position: (1) 233.0 and (2) 174.x for breast cancer; (3) 162.x for lung cancer; (4) 153.x for colon cancer and (5) 154.0–154.1 and 154.8 for rectal cancer. Only incident cases will be considered, that is, excluding cases that have the same diagnosis in the 5 years (2007–2011) before the period of interest. A random sample of cases and non-cases will be selected from each administrative database and the corresponding medical charts will be assessed for validation by pairs of trained, independent reviewers. Case ascertainment within the medical charts will be based on (1) the presence of a primary nodular lesion in the breast, lung or colon–rectum, documented with imaging or endoscopy and (2) a cytological or histological documentation of cancer from a primary or metastatic site. Sensitivity and specificity with 95% CIs will be calculated. Dissemination Study results will be disseminated widely through

  18. Urinary beta-2 microglobulin and alpha-1 microglobulin are useful screening markers for tenofovir-induced kidney tubulopathy in patients with HIV-1 infection: a diagnostic accuracy study.

    PubMed

    Nishijima, Takeshi; Shimbo, Takuro; Komatsu, Hirokazu; Takano, Misao; Tanuma, Junko; Tsukada, Kunihisa; Teruya, Katsuji; Gatanaga, Hiroyuki; Kikuchi, Yoshimi; Oka, Shinichi

    2013-10-01

    Kidney tubulopathy is a well-known adverse event of antiretroviral agent tenofovir. A cross-sectional study was conducted to compare the diagnostic accuracy of five tubular markers, with a collection of abnormalities in these markers as the reference standard. The study subjects were patients with HIV-1 infection on ritonavir-boosted darunavir plus tenofovir/emtricitabine with suppressed viral load. Kidney tubular dysfunction (KTD) was predefined as the presence of at least three abnormalities in the following five parameters: β2-microglobulinuria (β2M), α1-microglobulinuria (α1M), high urinary N-acetyl-β-D-glucosaminidase (NAG), fractional excretion of phosphate (FEIP), and fractional excretion of uric acid (FEUA). Receiver operating characteristic curves and areas under the curves (AUC) were estimated, and the differences between the largest AUC and each of the other AUCs were tested using a nonparametric method. The cutoff value of each tubular marker was determined using raw data of 100% sensitivity with maximal specificity. KTD was diagnosed in 19 of the 190 (10%) patients. The AUCs (95% CIs) of each tubular marker were β2M, 0.970 (0.947-0.992); α1M, 0.968 (0.944-0.992); NAG, 0.901 (0.828-0.974); FEIP, 0.757 (0.607-0.907), and FEUA, 0.762 (0.653-0.872). The AUCs of β2M and α1M were not significantly different, whereas those of the other three markers were smaller. The optimal cutoff values with 100% sensitivity were 1,123 μg/gCr (β2M, specificity 89%), 15.4 mg/gCr (α1M, specificity 87%), 3.58 U/gCr (NAG, specificity 46%), 1.02% (FEIP, specificity 0%), and 3.92% (FEUA, specificity 12%). Urinary β2M and α1M are potentially suitable screening tools for tenofovir-induced KTD. Monitoring either urinary β2M or α1M should be useful in early detection of tenofovir nephrotoxicity. PMID:23467792
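    The cutoff rule quoted in the abstract ("raw data of 100% sensitivity with maximal specificity") amounts to taking the lowest marker value observed among true cases as the threshold and reporting the specificity that results. A minimal sketch; the marker values below are made-up illustrative numbers, not the study's data.

```python
# Sketch of the 100%-sensitivity cutoff rule: with "value >= cutoff"
# flagging dysfunction, the largest cutoff that still flags every case
# is min(case_values); specificity is then the fraction of controls
# falling below that cutoff. Values are illustrative, not study data.

def cutoff_100_sensitivity(case_values, control_values):
    """Return (cutoff, specificity) at 100% sensitivity."""
    cutoff = min(case_values)
    specificity = sum(v < cutoff for v in control_values) / len(control_values)
    return cutoff, specificity

beta2m_cases = [1123, 1500, 2100]        # hypothetical values, KTD patients
beta2m_controls = [150, 300, 900, 1400]  # hypothetical values, no KTD
cutoff, specificity = cutoff_100_sensitivity(beta2m_cases, beta2m_controls)
```

    This construction explains why the reported specificities differ so much between markers: a single low-valued case drags the cutoff down and lets more controls above it be falsely flagged.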

  19. Accuracy of surface registration compared to conventional volumetric registration in patient positioning for head-and-neck radiotherapy: A simulation study using patient data

    SciTech Connect

    Kim, Youngjun; Li, Ruijiang; Na, Yong Hum; Xing, Lei; Lee, Rena

    2014-12-15

    Purpose: 3D optical surface imaging has been applied to patient positioning in radiation therapy (RT). The optical patient positioning system is advantageous over the conventional method using cone-beam computed tomography (CBCT) in that it is radiation free, frameless, and capable of real-time monitoring. While the conventional radiographic method uses volumetric registration, the optical system uses surface matching for patient alignment. The relative accuracy of these two methods has not yet been sufficiently investigated. This study aims to investigate the theoretical accuracy of surface registration based on a simulation study using patient data. Methods: This study compares the relative accuracy of surface and volumetric registration in head-and-neck RT. The authors examined 26 patient data sets, each consisting of planning CT data acquired before treatment and patient setup CBCT data acquired at the time of treatment. As input data for surface registration, the patient’s skin surfaces were created by contouring patient skin from the planning CT and treatment CBCT. Surface registration was performed using the iterative closest point (ICP) algorithm with a point-to-plane metric, which minimizes the normal distance between source points and target surfaces. Six degrees of freedom (three translations and three rotations) were used in both surface and volumetric registrations and the results were compared. The accuracy of each method was estimated by digital phantom tests. Results: Based on the results of 26 patients, the authors found that the average and maximum root-mean-square translation deviation between the surface and volumetric registrations were 2.7 and 5.2 mm, respectively. The residual error of the surface registration was calculated to have an average of 0.9 mm and a maximum of 1.7 mm. Conclusions: Surface registration may lead to results different from those of the conventional volumetric registration. Only limited accuracy can be achieved for patient
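    The point-to-plane objective mentioned in the abstract reduces, for a fixed set of point correspondences, to the RMS of the residuals obtained by projecting each source-target offset onto the target surface normal. A pure-Python sketch with illustrative data (not the study's patient surfaces):

```python
# Point-to-plane residual as used in ICP surface registration:
# project each (source - target) offset onto the target's unit normal
# and take the RMS. Data below are illustrative, not patient data.

from math import sqrt

def point_to_plane_rms(source_pts, target_pts, target_normals):
    """RMS of signed distances from source points to the tangent planes
    (point + unit normal) of their corresponding target points."""
    residuals = [
        sum((s - t) * n for s, t, n in zip(sp, tp, nv))
        for sp, tp, nv in zip(source_pts, target_pts, target_normals)
    ]
    return sqrt(sum(r * r for r in residuals) / len(residuals))

# Illustrative check: a 2 mm offset purely along the surface normals
# yields a 2 mm point-to-plane RMS.
target = [(0.0, 0.0, 0.0)] * 4
normals = [(0.0, 0.0, 1.0)] * 4
source = [(x, y, z + 2.0) for x, y, z in target]
rms = point_to_plane_rms(source, target, normals)
```

    Because the residual only measures motion along the normal, sliding tangentially across a smooth surface costs nothing, which is one intuition for why surface matching can disagree with volumetric registration.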

  20. Using checklists and algorithms to improve qualitative exposure judgment accuracy.

    PubMed

    Arnold, Susan F; Stenzel, Mark; Drolet, Daniel; Ramachandran, Gurumurthy

    2016-01-01

    Most exposure assessments are conducted without the aid of robust personal exposure data and are based instead on qualitative inputs such as education and experience, training, documentation on the process chemicals, tasks and equipment, and other information. Qualitative assessments determine whether there is any follow-up, and influence the type that occurs, such as quantitative sampling, worker training, and implementing exposure and risk management measures. Accurate qualitative exposure judgments ensure appropriate follow-up that in turn ensures appropriate exposure management. Studies suggest that qualitative judgment accuracy is low. A qualitative exposure assessment Checklist tool was developed to guide the application of a set of heuristics to aid decision making. Practicing industrial hygienists (n = 39) and novice industrial hygienists (n = 8) were recruited for a study evaluating the influence of the Checklist on exposure judgment accuracy. Participants generated 85 pre-training judgments and 195 Checklist-guided judgments. Pre-training judgment accuracy was low (33%) and not statistically significantly different from random chance. A tendency for IHs to underestimate the true exposure was observed. Exposure judgment accuracy improved significantly (p < 0.001) to 63% when aided by the Checklist. Qualitative judgments guided by the Checklist tool were categorically accurate or overestimated the true exposure by one category 70% of the time. The overall magnitude of exposure judgment precision also improved following training. Fleiss' κ, evaluating inter-rater agreement between novice assessors, was fair to moderate (κ = 0.39). Cohen's weighted and unweighted κ were good to excellent for novice (0.77 and 0.80) and practicing IHs (0.73 and 0.89), respectively. Checklist judgment accuracy was similar to quantitative exposure judgment accuracy observed in studies of similar design using personal exposure measurements, suggesting that the tool could be useful in
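    Cohen's unweighted κ, the agreement statistic reported above, corrects observed agreement between two raters for the agreement expected by chance from their marginal category frequencies. A minimal sketch; the two rating sequences are illustrative, not study data.

```python
# Cohen's unweighted kappa for two raters' categorical judgments:
#   kappa = (p_o - p_e) / (1 - p_e)
# where p_o is the observed agreement and p_e the agreement expected
# by chance from each rater's marginal category frequencies.

from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Illustrative exposure-category ratings (not study data).
r1 = ["under", "under", "accurate", "accurate"]
r2 = ["under", "under", "accurate", "under"]
kappa = cohens_kappa(r1, r2)
```

    Here the raters agree on 3 of 4 items (p_o = 0.75) but chance alone would produce p_e = 0.5, so κ = 0.5 — "moderate" agreement despite 75% raw concordance, which is exactly the correction the study relies on.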

  1. A student-oriented study tool for heterogeneous HyperCard courseware.

    PubMed

    Rathe, R; Garren, T

    1991-01-01

    Several computer-assisted instruction initiatives at our institution rely on a shared laserdisc for image storage and HyperCard for interactive programming. This paper furnishes a brief overview of problems encountered with multimedia projects and how we are attempting to overcome them using a generic, student-oriented study tool. The tool provides bookmarks, annotations, quotations, and other utilities across our entire HyperCard courseware collection. PMID:1807701

  2. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist.

    PubMed

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen K; Crane, Mark St J; Waltzek, Thomas B; Olesen, Niels J; Gallardo Lagno, Alicia

    2016-02-25

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies-paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species. PMID:26912041

  3. Recommended reporting standards for test accuracy studies of infectious diseases of finfish, amphibians, molluscs and crustaceans: the STRADAS-aquatic checklist

    USGS Publications Warehouse

    Gardner, Ian A; Whittington, Richard J; Caraguel, Charles G B; Hick, Paul; Moody, Nicholas J G; Corbeil, Serge; Garver, Kyle A.; Warg, Janet V; Arzul, Isabelle; Purcell, Maureen; St. J. Crane, Mark; Waltzek, Thomas B.; Olesen, Niels J; Lagno, Alicia Gallardo

    2016-01-01

    Complete and transparent reporting of key elements of diagnostic accuracy studies for infectious diseases in cultured and wild aquatic animals benefits end-users of these tests, enabling the rational design of surveillance programs, the assessment of test results from clinical cases and comparisons of diagnostic test performance. Based on deficiencies in the Standards for Reporting of Diagnostic Accuracy (STARD) guidelines identified in a prior finfish study (Gardner et al. 2014), we adapted the Standards for Reporting of Animal Diagnostic Accuracy Studies—paratuberculosis (STRADAS-paraTB) checklist of 25 reporting items to increase their relevance to finfish, amphibians, molluscs, and crustaceans and provided examples and explanations for each item. The checklist, known as STRADAS-aquatic, was developed and refined by an expert group of 14 transdisciplinary scientists with experience in test evaluation studies using field and experimental samples, in operation of reference laboratories for aquatic animal pathogens, and in development of international aquatic animal health policy. The main changes to the STRADAS-paraTB checklist were to nomenclature related to the species, the addition of guidelines for experimental challenge studies, and the designation of some items as relevant only to experimental studies and ante-mortem tests. We believe that adoption of these guidelines will improve reporting of primary studies of test accuracy for aquatic animal diseases and facilitate assessment of their fitness-for-purpose. Given the importance of diagnostic tests to underpin the Sanitary and Phytosanitary agreement of the World Trade Organization, the principles outlined in this paper should be applied to other World Organisation for Animal Health (OIE)-relevant species.

  4. Added value of cost-utility analysis in simple diagnostic studies of accuracy: 18F-fluoromethylcholine PET/CT in prostate cancer staging

    PubMed Central

    Gerke, Oke; Poulsen, Mads H; Høilund-Carlsen, Poul Flemming

    2015-01-01

    Diagnostic studies of accuracy targeting sensitivity and specificity are commonly done in a paired design in which all modalities are applied in each patient, whereas cost-effectiveness and cost-utility analyses are usually assessed either directly alongside larger randomized controlled trials (RCTs) or indirectly by means of stochastic modeling based on them. However, the conduct of RCTs is hampered in an environment such as ours, in which technology is rapidly evolving; as a result, relatively few RCTs are available. Therefore, we investigated to what extent paired diagnostic studies of accuracy can also be used to shed light on economic implications when considering a new diagnostic test. We propose a simple decision tree model-based cost-utility analysis of a diagnostic test compared to the current standard procedure and exemplify this approach with published data from lymph node staging of prostate cancer. Average procedure costs were taken from the Danish Diagnosis Related Groups Tariff in 2013, and life expectancy was estimated for an ideal 60-year-old patient based on prostate cancer stage and prostatectomy or radiation and chemotherapy. Quality-adjusted life-years (QALYs) were deduced from the literature, and an incremental cost-effectiveness ratio (ICER) was used to compare lymph node dissection with respective histopathological examination (reference standard) and 18F-fluoromethylcholine positron emission tomography/computed tomography (FCH-PET/CT). Lower bounds of the sensitivity and specificity of FCH-PET/CT were established at which replacing the reference standard with FCH-PET/CT comes with a trade-off between worse effectiveness and lower costs. Compared to the reference standard in a diagnostic accuracy study, any imperfection in the accuracy of a diagnostic test implies that replacing the reference standard generates a loss in effectiveness and utility. We conclude that diagnostic studies of accuracy can be put to a more extensive use
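    The ICER at the heart of this analysis is simply an incremental cost divided by an incremental effect. A minimal sketch; all numbers below are hypothetical placeholders, not the Danish tariff figures or the study's results.

```python
# Incremental cost-effectiveness ratio:
#   ICER = (cost_new - cost_ref) / (QALY_new - QALY_ref)
# All numbers below are hypothetical, not the study's data.

def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost per incremental QALY of the new test strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical trade-off of the kind the paper analyses: the imaging
# strategy is cheaper but slightly less effective, so both increments
# are negative and the ratio reads as savings per QALY forgone.
value = icer(cost_new=2000.0, cost_ref=5000.0, qaly_new=9.8, qaly_ref=10.0)
```

    When both increments are negative, the decision hinges on whether the savings per QALY forgone exceed the decision-maker's willingness-to-accept threshold, which is why the paper derives lower bounds on sensitivity and specificity rather than a single verdict.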

  5. [The study of tool use as the way for general estimation of cognitive abilities in animals].

    PubMed

    Reznikova, Zh I

    2006-01-01

    Investigation of tool use is an effective way to determine the cognitive abilities of animals. This approach generates hypotheses that delineate the limits of animals' competence in understanding object properties and interrelations, and the influence of individual and social experience on their behaviour. On the basis of a brief review of different models of object manipulation and tool manufacture (detaching, subtracting and reshaping) by various animals (from elephants to ants) under natural conditions, the experimental data concerning tool use are considered. Tool behaviour in animals is rarely observed, and its distribution among taxa is rather patchy. Recent studies have revealed that some species (for instance, bonobos and tamarins) which do not manipulate tools in the wild turn out to be advanced tool users, and even manufacturers, in the laboratory. Experimental studies of animal tool use include investigation of their ability to exploit the physical properties of objects, to categorize objects involved in tool activity by their functional properties, to take the forces affecting objects into account, as well as their capacity for planning their actions. The crucial question is whether animals can abstract general principles of the relations between objects regardless of the exact circumstances, or whether they develop specific associations between concrete things and situations. The effectiveness of laboratory methods is estimated in the review on the basis of comparative studies of tool behaviour, such as the "support problem", "stick problem", "tube- and tube-trap problem", and "reserve tube problem". Levels of social learning, the role of imprinting, and species-specific predisposition to the formation of specific domains are discussed. Experimental investigation of tool use allows estimation of individuals' intelligence in populations. A hypothesis suggesting that a strong predisposition to the formation of specific associations can serve as a driving force and at the same time as

  6. [Measurement accuracy of oscillatory and whole body plethysmography determination of airway resistance. Study of a mechanical model].

    PubMed

    Walliser, D; Lenders, H; Gleisberg, F; Schumann, K; Neuerburg, W

    1991-01-01

    The degree of accuracy of the plethysmographic and oscillatory methods in determining respiratory resistance was examined on a mechanical lung model. On this model, different levels of resistance could be reproducibly adjusted and exactly determined with sensitive measuring instruments. The plethysmographic method allows a precise estimation of the resistance: the absolute variation of the plethysmographically measured values was not greater than 5%. The Ros pointer scale of the Siregnost FD 5 yields systematically incorrect curve diagrams. In the lower range of resistance the measured values are too high, while the measured results become progressively too low with increasing resistance. The reason is that the Ros pointer scale does not show the real component of the impedance at a phase angle of 0 degrees. The values of the real component of the respiratory impedance (Rreal) yielded by the Siemens standard set agree closely with the lung model resistance (Raw). This agreement could be improved further by the use of electronic data processing. With a computer program developed by us it is possible for the first time to display and record, consecutively, individual and average values of the real component (Rrealcomp) and the reactance of the respiratory impedance, as well as the phase angle between the alternating pressure delta p and the oscillating flow (V). Thereby the accuracy of measurement is improved, and the long-winded analysis with the "phase diagram" is no longer necessary. Further experimental and clinical investigations have to show whether the oscillatory method in the form described above offers new possibilities for the assessment of pulmonary function. The phase angle and its course during the respiratory cycle are of special importance in this connection as a possible new parameter. PMID:1808869
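The separation of the respiratory impedance into a real (in-phase) component and a reactance at the oscillation frequency can be sketched as follows. The signal parameters are illustrative, not taken from the Siregnost FD 5:

```python
# Minimal sketch: extracting the real component and phase of an oscillatory
# impedance from sampled pressure and flow signals via the FFT bin at the
# oscillation frequency. All signal parameters are illustrative.
import numpy as np

fs = 200.0          # sampling rate, Hz
f0 = 10.0           # oscillation frequency, Hz
t = np.arange(0, 2.0, 1 / fs)

R_true, phase = 0.3, 0.4          # assumed resistance and phase lag (rad)
Z_mag = R_true / np.cos(phase)    # impedance magnitude consistent with R_true

flow = np.sin(2 * np.pi * f0 * t)                      # oscillating flow V'
pressure = Z_mag * np.sin(2 * np.pi * f0 * t + phase)  # alternating pressure

# Complex impedance at f0 from the ratio of the FFT bins of both signals.
k = int(round(f0 * len(t) / fs))      # FFT bin index of f0 (exact bin here)
Z = np.fft.rfft(pressure)[k] / np.fft.rfft(flow)[k]

print(f"Rreal = {Z.real:.3f}, reactance = {Z.imag:.3f}, "
      f"phase = {np.angle(Z):.3f} rad")
```

The real part recovers the assumed resistance and the argument recovers the phase angle, which is the quantity the abstract singles out as a possible new parameter.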

  7. Expected accuracy of tilt measurements on a novel hexapod-based digital zenith camera system: a Monte-Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Hirt, Christian; Papp, Gábor; Pál, András; Benedek, Judit; Szűcs, Eszter

    2014-08-01

    Digital zenith camera systems (DZCS) are dedicated astronomical-geodetic measurement systems for the observation of the direction of the plumb line. A DZCS key component is a pair of tilt meters for the determination of the instrumental tilt with respect to the plumb line. Highest accuracy (i.e., 0.1 arc-seconds or better) is achieved in practice through observation with precision tilt meters in opposite faces (180° instrumental rotation), and application of rigorous tilt reduction models. A novel concept proposes the development of a hexapod (Stewart platform)-based DZCS. However, hexapod-based total rotations are limited to about 30°-60° in azimuth (equivalent to ±15° to ±30° yaw rotation), which raises the question of the impact of the rotation angle between the two faces on the accuracy of the tilt measurement. The goal of the present study is the investigation of the expected accuracy of tilt measurements to be carried out on future hexapod-based DZCS, with special focus placed on the role of the limited rotation angle. A Monte-Carlo simulation study is carried out in order to derive accuracy estimates for the tilt determination as a function of several input parameters, and the results are validated against analytical error propagation. As the main result of the study, limitation of the instrumental rotation to 60° (30°) deteriorates the tilt accuracy by a factor of about 2 (4) compared to a 180° rotation between the faces. Nonetheless, a tilt accuracy at the 0.1 arc-second level is expected when the rotation is at least 45°, and 0.05 arc-second (about 0.25 microradian) accurate tilt meters are deployed. As such, a hexapod-based DZCS can be expected to allow sufficiently accurate determination of the instrumental tilt. This provides supporting evidence for the feasibility of such a novel instrumentation. The outcomes of our study are not only relevant to the field of DZCS, but also to all other types of instruments where the instrumental tilt
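The effect of a limited rotation angle on two-face tilt determination can be reproduced with a toy Monte-Carlo model. The instrument model, tilt values and noise level below are illustrative assumptions, not the study's simulation setup:

```python
# Monte-Carlo sketch of two-face tilt determination with a limited rotation
# angle between the faces. Instrument model and noise are made-up examples.
import numpy as np

rng = np.random.default_rng(0)

def tilt_rmse(alpha_deg, n_trials=20000, sigma=1.0):
    """RMSE of the recovered 2-axis tilt for a face-to-face rotation alpha."""
    a = np.radians(alpha_deg)
    R = np.array([[np.cos(a), np.sin(a)],
                  [-np.sin(a), np.cos(a)]])
    t = np.array([3.0, -2.0])        # true tilt components (arbitrary units)
    # Constant tilt-meter zero offsets cancel in the face-to-face difference:
    # r2 - r1 = (R - I) t + noise, so only the noise term remains.
    noise = rng.normal(0, sigma, (n_trials, 2)) - rng.normal(0, sigma, (n_trials, 2))
    d = (R - np.eye(2)) @ t + noise
    t_hat = np.linalg.solve(R - np.eye(2), d.T).T
    return np.sqrt(np.mean(np.sum((t_hat - t) ** 2, axis=1)))

rmse180 = tilt_rmse(180.0)
for alpha in (60.0, 30.0):
    print(f"alpha={alpha:5.1f} deg  accuracy loss factor = "
          f"{tilt_rmse(alpha) / rmse180:.2f}")
```

With this model the loss factors come out near 2 for a 60° rotation and near 4 for 30°, consistent with the abstract's headline result; analytically the noise amplification scales as 1/sin(α/2), the inverse of the singular values of R − I.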

  8. Predicting Out-of-Office Blood Pressure in the Clinic (PROOF-BP): Derivation and Validation of a Tool to Improve the Accuracy of Blood Pressure Measurement in Clinical Practice.

    PubMed

    Sheppard, James P; Stevens, Richard; Gill, Paramjit; Martin, Una; Godwin, Marshall; Hanley, Janet; Heneghan, Carl; Hobbs, F D Richard; Mant, Jonathan; McKinstry, Brian; Myers, Martin; Nunan, David; Ward, Alison; Williams, Bryan; McManus, Richard J

    2016-05-01

    Patients often have lower (white coat effect) or higher (masked effect) ambulatory/home blood pressure readings compared with clinic measurements, resulting in misdiagnosis of hypertension. The present study assessed whether blood pressure and patient characteristics from a single clinic visit can accurately predict the difference between ambulatory/home and clinic blood pressure readings (the home-clinic difference). A linear regression model predicting the home-clinic blood pressure difference was derived in 2 data sets measuring automated clinic and ambulatory/home blood pressure (n=991) using candidate predictors identified from a literature review. The model was validated in 4 further data sets (n=1172) using area under the receiver operating characteristic curve analysis. A masked effect was associated with male sex, a positive clinic blood pressure change (difference between consecutive measurements during a single visit), and a diagnosis of hypertension. Increasing age, clinic blood pressure level, and pulse pressure were associated with a white coat effect. The model showed good calibration across data sets (Pearson correlation, 0.48-0.80) and performed well in predicting ambulatory hypertension (area under the receiver operating characteristic curve, 0.75; 95% confidence interval, 0.72-0.79 [systolic]; 0.87; 0.85-0.89 [diastolic]). Used as a triaging tool for ambulatory monitoring, the model improved classification of a patient's blood pressure status compared with other guideline-recommended approaches (93% [92% to 95%] classified correctly; United States, 73% [70% to 75%]; Canada, 74% [71% to 77%]; United Kingdom, 78% [76% to 81%]). This study demonstrates that patient characteristics from a single clinic visit can accurately predict a patient's ambulatory blood pressure. Use of this prediction tool for triaging of ambulatory monitoring could result in more accurate diagnosis of hypertension and hence more appropriate treatment. PMID:27001299
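The derive-then-validate pipeline can be mimicked on synthetic data. Everything below (predictors, coefficients, thresholds) is invented for illustration and is not the PROOF-BP model:

```python
# Sketch of the PROOF-BP idea on simulated data: regress the home-clinic
# BP difference on clinic-visit predictors, then score how well the
# adjusted clinic reading classifies "ambulatory hypertension".
# All data and coefficients are made up, not the published model.
import numpy as np

rng = np.random.default_rng(42)
n = 2000

clinic_sbp = rng.normal(140, 15, n)
male = rng.integers(0, 2, n).astype(float)
age = rng.normal(60, 10, n)

# Simulated ground truth: a white-coat effect that grows with clinic BP.
home_clinic_diff = (20 - 0.20 * clinic_sbp - 0.10 * age + 3 * male
                    + rng.normal(0, 5, n))
ambulatory_sbp = clinic_sbp + home_clinic_diff
amb_htn = (ambulatory_sbp >= 135).astype(int)   # ambulatory hypertension label

# Derive the linear prediction model by least squares.
X = np.column_stack([np.ones(n), clinic_sbp, age, male])
beta, *_ = np.linalg.lstsq(X, home_clinic_diff, rcond=None)
predicted_amb = clinic_sbp + X @ beta           # adjusted clinic reading

# AUROC via the rank (Mann-Whitney) formulation, no extra libraries needed.
ranks = predicted_amb.argsort().argsort() + 1
n_pos = amb_htn.sum()
auc = (ranks[amb_htn == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * (n - n_pos))
print(f"AUROC of adjusted clinic BP for ambulatory hypertension: {auc:.3f}")
```

The least-squares step recovers the simulated coefficients, and the adjusted reading discriminates ambulatory hypertension well, mirroring the role the published model plays as a triage tool.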

  9. Relative accuracy evaluation.

    PubMed

    Zhang, Yan; Wang, Hongzhi; Yang, Zhongsheng; Li, Jianzhong

    2014-01-01

    The quality of data plays an important role in business analysis and decision making, and data accuracy is an important aspect of data quality. Thus one necessary task for data quality management is to evaluate the accuracy of the data. Because the accuracy of a whole data set may be low while that of a useful subset is high, it is also necessary to evaluate the accuracy of query results, called relative accuracy. However, as far as we know, neither a measure nor effective methods for such accuracy evaluation have been proposed. Motivated by this, we propose a systematic method for relative accuracy evaluation. We design a relative accuracy evaluation framework for relational databases based on a new metric that measures accuracy using statistics. We apply the methods to evaluate the precision and recall of basic queries, which show the result's relative accuracy. We also propose methods to handle data updates and to improve accuracy evaluation using functional dependencies. Extensive experimental results show the effectiveness and efficiency of our proposed framework and algorithms. PMID:25133752
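The core idea, that a query's accuracy can differ from the table's overall accuracy, is easy to illustrate with precision and recall measured against a clean reference. The table contents below are made up for the example:

```python
# Toy illustration of "relative accuracy": precision and recall of a query
# answered over a dirty table, measured against the clean reference table.
# Table contents are invented for the example.

clean = {("alice", 34), ("bob", 29), ("carol", 41), ("dave", 29)}
dirty = {("alice", 34), ("bob", 92), ("carol", 41), ("dave", 29)}  # bob corrupted

def query(rows, min_age):
    """SELECT name FROM t WHERE age >= min_age"""
    return {name for name, age in rows if age >= min_age}

truth = query(clean, 30)       # answer over the clean reference
answer = query(dirty, 30)      # answer over the dirty table

precision = len(answer & truth) / len(answer)
recall = len(answer & truth) / len(truth)
print(f"precision={precision:.2f} recall={recall:.2f}")
```

Here 3 of 4 tuples are accurate overall, yet this particular query's precision drops to 2/3 while another query (say, `min_age >= 100`) would be unaffected, which is why per-query (relative) accuracy needs its own evaluation.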

  11. Inertial Measures of Motion for Clinical Biomechanics: Comparative Assessment of Accuracy under Controlled Conditions – Changes in Accuracy over Time

    PubMed Central

    Lebel, Karina; Boissy, Patrick; Hamel, Mathieu; Duval, Christian

    2015-01-01

    Background Interest in 3D inertial motion tracking devices (AHRS) has been growing rapidly in the biomechanics community. Although the convenience of such tracking devices seems to open a whole new world of possibilities for evaluation in clinical biomechanics, their limitations have not been extensively documented. The objectives of this study are: 1) to assess the change in absolute and relative accuracy of multiple units of 3 commercially available AHRS over time; and 2) to identify different sources of error affecting AHRS accuracy and to document how they may affect the measurements over time. Methods This study used an instrumented gimbal table on which AHRS modules were carefully attached and put through a series of velocity-controlled sustained motions, including 2-minute motion trials (2MT) and 12-minute multiple dynamic phases motion trials (12MDP). Absolute accuracy was assessed by comparing the AHRS orientation measurements to those of an optical gold standard. Relative accuracy was evaluated using the variation in relative orientation between modules during the trials. Findings Both absolute and relative accuracy decreased over time during 2MT. 12MDP trials showed a significant decrease in accuracy over multiple phases, but accuracy could be enhanced significantly by resetting the reference point and/or compensating for the initial inertial-frame estimation for each phase. Interpretation The variation in AHRS accuracy observed between the different systems and over time can be attributed in part to dynamic estimation error, but also, and foremost, to the ability of the AHRS units to locate the same inertial frame. Conclusions The mean accuracies obtained under the sustained motion conditions of the gimbal table suggest that AHRS are promising tools for clinical mobility assessment under constrained conditions of use. However, improvements in magnetic compensation and alignment between AHRS modules are desirable in order for AHRS to reach their
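The distinction between absolute accuracy (against a gold standard) and relative accuracy (between modules) can be made concrete with rotation matrices. The angles below are made-up examples, not measurements from the study:

```python
# Sketch: absolute vs relative orientation error for two orientation
# estimates, using rotation matrices. Angles are invented examples.
import numpy as np

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

def angle_between(Ra, Rb):
    """Geodesic angle (deg) between two rotations."""
    c = (np.trace(Ra.T @ Rb) - 1) / 2
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

R_truth = rot_z(30.0)      # gold-standard (optical) orientation
R_mod1 = rot_z(31.2)       # AHRS module 1 estimate (1.2 deg off)
R_mod2 = rot_z(31.5)       # AHRS module 2 estimate (1.5 deg off)

abs_err1 = angle_between(R_truth, R_mod1)   # absolute accuracy, module 1
abs_err2 = angle_between(R_truth, R_mod2)   # absolute accuracy, module 2
rel_err = angle_between(R_mod1, R_mod2)     # relative accuracy between modules
print(f"absolute: {abs_err1:.2f} / {abs_err2:.2f} deg, relative: {rel_err:.2f} deg")
```

The example also shows why relative accuracy matters clinically: error components shared by both modules (here, most of the drift) cancel in the between-module angle, which is the quantity that drives joint-angle estimates.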

  12. Can Interactive Web-based CAD Tools Improve the Learning of Engineering Drawing? A Case Study

    NASA Astrophysics Data System (ADS)

    Pando Cerra, Pablo; Suárez González, Jesús M.; Busto Parra, Bernardo; Rodríguez Ortiz, Diana; Álvarez Peñín, Pedro I.

    2014-06-01

    Many current Web-based learning environments facilitate the theoretical teaching of a subject, but this may not be sufficient for disciplines that require significant use of graphic methods to solve problems. This research study looks at the use of an environment that can help students learn engineering drawing with Web-based CAD tools, including a self-correction component. A comparative study of 121 students was carried out. The students were divided into two experimental groups using Web-based interactive CAD tools and two control groups using traditional learning tools. A statistical analysis of all the samples was carried out in order to study student behavior during the research and the effectiveness of these self-study tools in the learning process. The results showed that a greater number of students in the experimental groups passed the test and improved their test scores. Therefore, the use of Web-based interactive graphic tools to learn engineering drawing can be considered a significant improvement in the teaching of this kind of academic discipline.

  13. A HTML5 open source tool to conduct studies based on Libet's clock paradigm.

    PubMed

    Garaizar, Pablo; Cubillas, Carmelo P; Matute, Helena

    2016-01-01

    Libet's clock is a well-known procedure in experiments in psychology and neuroscience. Examples of its use include experiments exploring the subjective sense of agency, action-effect binding, and the subjective timing of conscious decisions and perceptions. However, the technical details of the apparatus used to conduct these types of experiments are complex and are rarely explained in sufficient detail to guarantee an exact replication of the procedure. With this in mind, we developed Labclock Web, a web tool designed to conduct online and offline experiments using Libet's clock. After describing its technical features, we explain how to configure specific experiments using this tool. Its degree of accuracy and precision in the presentation of stimuli has been technically validated, including two cognitive experiments conducted with volunteer participants who performed the experiment both in our laboratory and via the Internet. Labclock Web is distributed without charge under a free software license (GPLv3), since one of our main objectives is to facilitate the replication of experiments and hence the advancement of knowledge in this area. PMID:27623167

  14. Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Clark, Robert Lynn

    2014-01-01

    The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization, with multiple dispersed locations located in the United States. The results of the current research study could assist leaders to develop a communication…

  15. An Entrepreneurial Learning Exercise as a Pedagogical Tool for Teaching CSR: A Peruvian Study

    ERIC Educational Resources Information Center

    Farber, Vanina A.; Prialé, María Angela; Fuchs, Rosa María

    2015-01-01

    This paper reports on an exploratory cross-sectional study of the value of an entrepreneurial learning exercise as a tool for examining the entrepreneurship dimension of corporate social responsibility (CSR). The study used grounded theory to analyse diaries kept by graduate (MBA) students during the "20 Nuevos Soles Project". From the…

  16. The Use of Economic Impact Studies as a Service Learning Tool in Undergraduate Business Programs

    ERIC Educational Resources Information Center

    Misner, John M.

    2004-01-01

    This paper examines the use of community based economic impact studies as service learning tools for undergraduate business programs. Economic impact studies are used to measure the economic benefits of a variety of activities such as community redevelopment, tourism, and expansions of existing facilities for both private and public producers.…

  17. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  18. SU-E-J-147: Monte Carlo Study of the Precision and Accuracy of Proton CT Reconstructed Relative Stopping Power Maps

    SciTech Connect

    Dedes, G; Asano, Y; Parodi, K; Arbor, N; Dauvergne, D; Testa, E; Letang, J; Rit, S

    2015-06-15

    Purpose: To quantify the intrinsic performance of proton computed tomography (pCT) as a modality for treatment planning in proton therapy. The performance of an ideal pCT scanner is studied as a function of various parameters. Methods: Using GATE/Geant4, we simulated an ideal pCT scanner and scans of several cylindrical phantoms with various tissue-equivalent inserts of different sizes. Insert materials were selected to be of clinical relevance. Tomographic images were reconstructed using a filtered backprojection algorithm taking into account the scattering of protons in the phantom. To quantify the performance of the ideal pCT scanner, we study the precision and the accuracy with respect to the theoretical relative stopping power ratio (RSP) values for different beam energies, imaging doses, insert sizes and detector positions. The planning range uncertainty resulting from the reconstructed RSP is also assessed by comparison with the range of the protons in the analytically simulated phantoms. Results: The results indicate that pCT can intrinsically achieve RSP resolution below 1% for most examined tissues at beam energies below 300 MeV and imaging doses around 1 mGy. RSP map accuracy of better than 0.5% is observed for most tissue types within the studied dose range (0.2–1.5 mGy). Finally, the uncertainty in the proton range due to the accuracy of the reconstructed RSP map is well below 1%. Conclusion: This work explores the intrinsic performance of pCT as an imaging modality for proton treatment planning. The obtained results show that under ideal conditions, 3D RSP maps can be reconstructed with an accuracy better than 1%. Hence, pCT is a promising candidate for reducing the range uncertainties introduced by the use of X-ray CT alongside a semiempirical calibration to RSP. Supported by the DFG Cluster of Excellence Munich-Centre for Advanced Photonics (MAP)
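The link between RSP accuracy and range uncertainty can be sketched with a one-dimensional water-equivalent path length (WEPL) integration. The voxel size, RSP values and bias below are illustrative, not the study's phantom data:

```python
# Back-of-envelope sketch: how a systematic RSP error propagates into a
# proton range error along the beam axis. All numbers are illustrative.
import numpy as np

depth_step = 0.1                    # cm per voxel along the beam
rsp_true = np.full(300, 1.05)       # assumed true RSP along the beam path
rsp_recon = rsp_true * 1.005        # reconstructed map with a +0.5% bias

wepl_budget = 20.0                  # planned water-equivalent path length (cm)

def range_cm(rsp, budget):
    """Depth at which the accumulated WEPL reaches the budget."""
    return np.searchsorted(np.cumsum(rsp * depth_step), budget) * depth_step

r_true = range_cm(rsp_true, wepl_budget)
r_recon = range_cm(rsp_recon, wepl_budget)
print(f"range shift: {abs(r_true - r_recon) / r_true * 100:.2f}% of range")
```

As expected, a 0.5% systematic RSP error translates into a range shift of about 0.5% of the range, which is why sub-1% RSP accuracy keeps the range uncertainty well below 1%.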

  19. Studying the Effect of Adaptive Momentum in Improving the Accuracy of Gradient Descent Back Propagation Algorithm on Classification Problems

    NASA Astrophysics Data System (ADS)

    Rehman, Muhammad Zubair; Nawi, Nazri Mohd.

    Despite being widely used on practical problems, the gradient descent back-propagation algorithm suffers from slow convergence and convergence to local minima. Previous researchers have suggested modifications to improve its convergence, such as careful selection of the input weights and biases, learning rate, momentum, network topology, activation function and the value of 'gain' in the activation function. This research proposes an algorithm for improving the performance of back-propagation, 'Gradient Descent with Adaptive Momentum (GDAM)', which keeps the gain value fixed during all network trials. The performance of GDAM is compared with 'Gradient Descent with fixed Momentum (GDM)' and 'Gradient Descent Method with Adaptive Gain (GDM-AG)'. The learning rate is fixed at 0.4, the maximum number of epochs is set to 3000, and the sigmoid activation function is used throughout the experiments. The results show that GDAM is a better approach than the previous methods, with an accuracy ratio of 1.0 on classification problems such as Wine Quality, Mushroom and Thyroid disease.
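Adaptive momentum can be illustrated on a toy problem. The abstract does not spell out the GDAM adaptation rule, so the sketch below uses one common generic heuristic (grow momentum while the loss falls, cut it after an increase), which may well differ from the authors' method:

```python
# Gradient descent with a simple adaptive-momentum heuristic on an
# ill-conditioned quadratic loss. The adaptation rule is a generic
# illustration, not necessarily the paper's GDAM rule.
import numpy as np

H = np.diag([1.0, 25.0])            # Hessian of the toy quadratic loss

def loss(w):
    return 0.5 * w @ H @ w

def grad(w):
    return H @ w

def gd_adaptive_momentum(w, lr=0.02, mu=0.5, epochs=500):
    v = np.zeros_like(w)
    prev = loss(w)
    for _ in range(epochs):
        v = mu * v - lr * grad(w)   # momentum-smoothed step
        w = w + v
        cur = loss(w)
        # Adapt momentum based on loss progress: grow while improving,
        # shrink sharply after an overshoot.
        mu = min(mu * 1.05, 0.95) if cur < prev else max(mu * 0.5, 0.1)
        prev = cur
    return w

w0 = np.array([5.0, 5.0])
w_final = gd_adaptive_momentum(w0.copy())
print(f"loss: {loss(w0):.1f} -> {loss(w_final):.2e}")
```

The adaptation lets the optimizer carry large momentum along the flat direction while damping oscillations along the steep one, which is the intuition behind momentum-based fixes for slow back-propagation convergence.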

  20. Hermite finite elements for high accuracy electromagnetic field calculations: A case study of homogeneous and inhomogeneous waveguides

    NASA Astrophysics Data System (ADS)

    Boucher, C. R.; Li, Zehao; Ahheng, C. I.; Albrecht, J. D.; Ram-Mohan, L. R.

    2016-04-01

    Maxwell's vector field equations and their numerical solution represent significant challenges for physical domains with complex geometries. There are several limitations in the presently prevalent approaches to the calculation of field distributions in physical domains, in particular with the vector finite elements. In order to quantify and resolve these issues, we consider the modeling of the field equations for the prototypical examples of waveguides. We employ the finite element method with a new set of Hermite interpolation polynomials derived recently by us using group theoretic considerations. We show that (i) the approach presented here yields accuracy better by several orders of magnitude, with a smoother representation of fields, than the vector finite elements for waveguide calculations; (ii) this method does not generate any of the spurious solutions that plague Lagrange finite elements, even though the C1-continuous Hermite polynomials are also scalar in nature; and (iii) we present solutions for propagating modes in inhomogeneous waveguides satisfying dispersion relations that can be derived directly, and investigate their behavior as the ratio of dielectric constants is varied both theoretically and numerically. Additional comparisons and advantages of the proposed method are detailed in this article. The Hermite interpolation polynomials are shown to provide a robust, accurate, and efficient means of solving Maxwell's equations in a variety of media, potentially offering a computationally inexpensive means of designing devices for optoelectronics and plasmonics of increasing complexity.

  1. Real-Word and Nonword Repetition in Italian-Speaking Children with Specific Language Impairment: A Study of Diagnostic Accuracy

    PubMed Central

    Dispaldro, Marco; Leonard, Laurence B.; Deevy, Patricia

    2013-01-01

    Purpose: Using two different scoring methods, we examined the diagnostic accuracy of both real-word and nonword repetition in identifying Italian-speaking children with and without specific language impairment (SLI). Method: A total of 34 children aged 3;11 to 5;8 participated – 17 children with SLI and 17 typically developing children matched for age (TD-A children). Children completed real-word and nonword repetition tasks. The capacity of the real-word and nonword repetition tasks to discriminate children with SLI from TD-A children was examined through binary logistic regression and receiver operating characteristic curves. Results: Both real-word and nonword repetition showed good (or excellent) sensitivity and specificity in distinguishing children with SLI from their typically developing peers. Conclusions: Nonword repetition appears to be a useful diagnostic indicator for Italian, as in other languages. In addition, real-word repetition also holds promise. The contributions of each type of measure are discussed. PMID:22761319

  2. Cognitive Abilities Underlying Reading Accuracy, Fluency and Spelling Acquisition in Korean Hangul Learners from Grades 1 to 4: A Cross-Sectional Study.

    PubMed

    Park, Hyun-Rin; Uno, Akira

    2015-08-01

    The purpose of this cross-sectional study was to examine the cognitive abilities that predict reading and spelling performance in Korean children in Grades 1 to 4, depending on expertise and reading experience. Visual cognition, phonological awareness, naming speed and receptive vocabulary significantly predicted reading accuracy in children in Grades 1 and 2, whereas visual cognition, phonological awareness and rapid naming speed did not predict reading accuracy in children in the higher grades. For reading fluency, phonological awareness, rapid naming speed and receptive vocabulary were crucial abilities in children in Grades 1 to 3, whereas phonological awareness was not a significant predictor in children in Grade 4. In spelling, reading ability and receptive vocabulary were the most important abilities for accurate Hangul spelling. The results suggest that the cognitive abilities required for reading and spelling change depending on expertise and reading experience. PMID:25997096

  3. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part I. General Guidance and Tips

    PubMed Central

    Kim, Kyung Won; Lee, Juneyoung; Choi, Sang Hyun; Huh, Jimi

    2015-01-01

    In the field of diagnostic test accuracy (DTA), the use of systematic review and meta-analyses is steadily increasing. By means of objective evaluation of all available primary studies, these two processes generate an evidence-based systematic summary regarding a specific research topic. The methodology for systematic review and meta-analysis in DTA studies differs from that in therapeutic/interventional studies, and its content is still evolving. Here we review the overall process from a practical standpoint, which may serve as a reference for those who implement these methods. PMID:26576106

  4. Systematic Review and Meta-Analysis of Studies Evaluating Diagnostic Test Accuracy: A Practical Review for Clinical Researchers-Part I. General Guidance and Tips.

    PubMed

    Kim, Kyung Won; Lee, Juneyoung; Choi, Sang Hyun; Huh, Jimi; Park, Seong Ho

    2015-01-01

    In the field of diagnostic test accuracy (DTA), the use of systematic review and meta-analyses is steadily increasing. By means of objective evaluation of all available primary studies, these two processes generate an evidence-based systematic summary regarding a specific research topic. The methodology for systematic review and meta-analysis in DTA studies differs from that in therapeutic/interventional studies, and its content is still evolving. Here we review the overall process from a practical standpoint, which may serve as a reference for those who implement these methods. PMID:26576106

  5. Development of patient decision support tools for motor neuron disease using stakeholder consultation: a study protocol

    PubMed Central

    Hogden, Anne; Greenfield, David; Caga, Jashelle; Cai, Xiongcai

    2016-01-01

    Introduction Motor neuron disease (MND) is a terminal, progressive, multisystem disorder. Well-timed decisions are key to effective symptom management. To date, there are few published decision support tools, also known as decision aids, to guide patients in making ongoing choices for symptom management and quality of life. This protocol describes the development and validation of decision support tools for patients and families to use in conjunction with health professionals in MND multidisciplinary care. The tools will inform patients and families of the benefits and risks of each option, as well as the consequences of accepting or declining treatment. Methods and analysis The study is being conducted from June 2015 to May 2016, using a modified Delphi process. A 2-stage, 7-step process will be used to develop the tools, based on existing literature and stakeholder feedback. The first stage will be to develop the decision support tools, and the second stage will be to validate both the tools and the process used to develop them. Participants will form expert panels to provide the feedback on which the development and validation of the tools will be based. Participants will be drawn from patients with MND, family carers and health professionals, support association workers, peak body representatives, and MND and patient decision-making researchers. Ethics and dissemination Ethical approval for the study has been granted by the Macquarie University Human Research Ethics Committee (HREC), approval number 5201500658. Knowledge translation will be conducted via publications, seminars and conference presentations to patients and families, health professionals and researchers. PMID:27053272

  6. Immersion defectivity study with volume production immersion lithography tool for 45 nm node and below

    NASA Astrophysics Data System (ADS)

    Nakano, Katsushi; Nagaoka, Shiro; Yoshida, Masato; Iriuchijima, Yasuhiro; Fujiwara, Tomoharu; Shiraishi, Kenichi; Owa, Soichi

    2008-03-01

    Volume production of 45 nm node devices utilizing Nikon's S610C immersion lithography tool has started. Defectivity reduction has been critical to achieving high yields in volume production with immersion lithography. In this study we evaluate several methods of defectivity reduction. The tools used in our defectivity analysis included a dedicated immersion cluster tool consisting of a Nikon S610C, a volume-production immersion exposure tool with an NA of 1.3, and a LITHIUS i+ resist coater-developer from TEL. In our initial procedure we evaluated defectivity behavior by comparing a topcoat-less resist process to a conventional topcoat process. Because of its simplicity, the topcoat-less resist shows lower defect levels than the topcoat process. In a second study we evaluated the defect reduction achieved by introducing the TEL bevel rinse and pre-immersion bevel cleaning techniques. These techniques successfully reduced defect levels by reducing the particles in the wafer bevel region. For the third defect reduction method, two types of tool cleaning processes are shown. Finally, we discuss the overall defectivity behavior at the 45 nm node. To facilitate an understanding of the root cause of the defects, defect source analysis (DSA) was applied to separate the defects into three classes according to their source. DSA revealed that more than 99% of defects relate to material and process, and less than 1% relate to the exposure tool. Material and process optimization through collaborative work between exposure tool vendors, track vendors and material vendors is a key to the success of 45 nm node device manufacturing.

  7. Accuracy of self-reported intake of signature foods in a school meal intervention study: comparison between control and intervention period.

    PubMed

    Biltoft-Jensen, Anja; Damsgaard, Camilla Trab; Andersen, Rikke; Ygil, Karin Hess; Andersen, Elisabeth Wreford; Ege, Majken; Christensen, Tue; Sørensen, Louise Bergmann; Stark, Ken D; Tetens, Inge; Thorsen, Anne-Vibeke

    2015-08-28

    Bias in self-reported dietary intake is important when evaluating the effect of dietary interventions, particularly for intervention foods. However, few have investigated this in children, and none have investigated the reporting accuracy of fish intake in children using biomarkers. In a Danish school meal study, 8- to 11-year-old children (n 834) were served the New Nordic Diet (NND) for lunch. The present study examined the accuracy of self-reported intake of signature foods (berries, cabbage, root vegetables, legumes, herbs, potatoes, wild plants, mushrooms, nuts and fish) characterising the NND. Children, assisted by parents, self-reported their diet in a Web-based Dietary Assessment Software for Children during the intervention and control (packed lunch) periods. The reported fish intake by children was compared with their ranking according to fasting whole-blood EPA and DHA concentration and weight percentage using the Spearman correlations and cross-classification. Direct observation of school lunch intake (n 193) was used to score the accuracy of food reporting as matches, intrusions, omissions and faults. The reporting of all lunch foods had a higher percentage of matches than the reporting of signature foods in both periods, and accuracy was higher during the control period than during the intervention period. Both Spearman's rank correlations and linear mixed models demonstrated positive associations between EPA+DHA and reported fish intake. The direct observations showed that both reported and real intake of signature foods did increase during the intervention period. In conclusion, the self-reported data represented a true increase in the intake of signature foods and can be used to examine dietary intervention effects. PMID:26189886
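
    The match/intrusion/omission scoring used in the observation study above can be sketched as simple set operations. A minimal, illustrative sketch (the study's fourth category, faults, and its exact scoring rules are not specified in the abstract, so only the three set-based categories are shown; function and key names are assumptions):

```python
def classify_report(observed, reported):
    """Score a self-reported food list against direct observation:
    matches (in both lists), omissions (observed but not reported),
    intrusions (reported but not observed)."""
    observed, reported = set(observed), set(reported)
    return {
        "matches": sorted(observed & reported),
        "omissions": sorted(observed - reported),
        "intrusions": sorted(reported - observed),
    }

# Hypothetical example: what a child was seen eating vs. what was reported.
score = classify_report(
    observed=["fish", "cabbage", "potatoes"],
    reported=["fish", "potatoes", "nuts"],
)
```

    A match rate can then be computed as matches divided by the number of observed foods.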

  8. Accuracy of bleeding scores for patients presenting with myocardial infarction: a meta-analysis of 9 studies and 13 759 patients

    PubMed Central

    D'Ascenzo, Fabrizio; Moretti, Claudio; Omedè, Pierluigi; Montefusco, Antonio; Bach, Richard G.; Alexander, Karen P.; Mehran, Roxana; Ariza-Solé, Albert; Zoccai, Giuseppe Biondi; Gaita, Fiorenzo

    2015-01-01

    Introduction Due to its negative impact on prognosis, a clear assessment of bleeding risk for patients presenting with acute coronary syndrome (ACS) remains crucial. Different risk scores have been proposed and compared, although with inconsistent results. Aim We performed a meta-analysis to evaluate the accuracy of different bleeding risk scores for ACS patients. Material and methods All studies externally validating risk scores for bleeding for patients presenting with ACS were included in the present review. Accuracy of risk scores for external validation cohorts to predict major bleeding in patients with ACS was the primary end point. Sensitivity analysis was performed according to clinical presentation (ST segment elevation myocardial infarction (STEMI) and non-ST segment elevation myocardial infarction (NSTEMI)). Results Nine studies and 13 759 patients were included. CRUSADE, ACUITY, ACTION and GRACE were the scores externally validated. The rate of in-hospital major bleeding was 7.80% (5.5–9.2), 2.05% (1.5–3.0) being related to access and 2.70% (1.7–4.0) needing transfusions. When evaluating all ACS patients, ACTION, CRUSADE and ACUITY performed similarly (AUC 0.75: 0.72–0.79; 0.71: 0.64–0.80 and 0.71: 0.63–0.77 respectively) when compared to GRACE (0.66; 0.64–0.67, all confidence intervals 95%). When appraising only STEMI patients, all the scores performed similarly, while CRUSADE was the only one externally validated for NSTEMI. For ACTION and ACUITY, accuracy increased for radial access patients, while no differences were found for CRUSADE. Conclusions ACTION, CRUSADE and ACUITY perform similarly to predict risk of bleeding in ACS patients. The CRUSADE score is the only one externally validated for NSTEMI, while accuracy of the scores increased with radial access. PMID:26677357

  9. A validation study concerning the effects of interview content, retention interval, and grade on children’s recall accuracy for dietary intake and/or physical activity

    PubMed Central

    Baxter, Suzanne D.; Hitchcock, David B.; Guinn, Caroline H.; Vaadi, Kate K.; Puryear, Megan P.; Royer, Julie A.; McIver, Kerry L.; Dowda, Marsha; Pate, Russell R.; Wilson, Dawn K.

    2014-01-01

    interactions mentioned. Content effects depended on other factors. Grade effects were mixed. Dietary accuracy was better with same-day than previous-day retention interval. Conclusions Results do not support integrating dietary intake and physical activity in children’s recalls, but do support using shorter rather than longer retention intervals to yield more accurate dietary recalls. Further validation studies need to clarify age effects and identify evidence-based practices to improve children’s accuracy for recalling dietary intake and/or physical activity. PMID:24767807

  10. Simulation and Experimental Studies on Substrate Temperature and Gas Density Field in Hfcvd Diamond Films Growth on WC-Co Drill Tools

    NASA Astrophysics Data System (ADS)

    Zhang, Jianguo; Zhang, Tao; Wang, Xinchang; Shen, Bin; Sun, Fanghong

    2013-04-01

    Uniform temperature and gas density fields inside the reactor play an important role in the synthesis of high-quality diamond films using the hot filament chemical vapor deposition (HFCVD) method. In the present study, the finite volume method (FVM) is adopted to simulate the temperature and gas density distribution during the deposition process. Temperature-measuring experiments are conducted to verify the simulation results. Thereafter, the deposition parameters are optimized using this model as D (filament separation) = 35 mm, H (filament-substrate distance) = -10 mm and N (number of gas inlets) = 3. Finally, experiments depositing diamond films on WC-Co drill tools are carried out with the optimal deposition parameters. Characterization by SEM and Raman spectroscopy shows that the as-fabricated diamond-coated tools present a layer of high-quality diamond film with a homogeneous surface and uniform thickness, further validating the accuracy of the parameter optimization using the simulation method.

  11. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies

    PubMed Central

    Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study viral diversity of animals. Metagenomics can be considered to have three main steps; sample collection and preparation, sequencing and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for experimental design and analysis of viral metagenomes. MetLab provides support in designing the metagenomics experiment by estimating the sequencing depth needed for the complete coverage of a species. This is achieved by applying a methodology to calculate the probability of coverage using an adaptation of Stevens’ theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including; quality control, assembly and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy to use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark towards other existing software, with emphasis on detection of viruses as well as speed of applications. This is packaged, as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078
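
    MetLab's sequencing-depth estimate uses an adaptation of Stevens' theorem (which handles complete coverage of a circle by random arcs). As a rough, hedged sketch of the underlying idea only, the simpler Lander-Waterman-style per-base calculation below estimates how many uniformly placed reads are needed before a given base is covered with a target probability. Function names are illustrative, not MetLab's API:

```python
import math

def prob_base_covered(genome_len, read_len, n_reads):
    """Probability that a fixed base is covered by at least one read,
    assuming read start positions are uniform and independent."""
    p = read_len / genome_len          # chance a single read covers the base
    return 1.0 - (1.0 - p) ** n_reads

def reads_for_coverage(genome_len, read_len, target_p):
    """Smallest read count giving per-base coverage probability >= target_p."""
    p = read_len / genome_len
    return math.ceil(math.log(1.0 - target_p) / math.log(1.0 - p))

# e.g. a 10 kb viral genome, 100 bp reads, 99% per-base coverage target
n = reads_for_coverage(10_000, 100, 0.99)
```

    Note this per-base bound understates what is needed for *complete* coverage of every base simultaneously, which is exactly where a Stevens-type result is sharper.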

  12. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool supports these repetitive processes efficiently by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.

  13. MetLab: An In Silico Experimental Design, Simulation and Analysis Tool for Viral Metagenomics Studies.

    PubMed

    Norling, Martin; Karlsson-Lindsjö, Oskar E; Gourlé, Hadrien; Bongcam-Rudloff, Erik; Hayer, Juliette

    2016-01-01

    Metagenomics, the sequence characterization of all genomes within a sample, is widely used as a virus discovery tool as well as a tool to study viral diversity of animals. Metagenomics can be considered to have three main steps; sample collection and preparation, sequencing and finally bioinformatics. Bioinformatic analysis of metagenomic datasets is in itself a complex process, involving few standardized methodologies, thereby hampering comparison of metagenomics studies between research groups. In this publication the new bioinformatics framework MetLab is presented, aimed at providing scientists with an integrated tool for experimental design and analysis of viral metagenomes. MetLab provides support in designing the metagenomics experiment by estimating the sequencing depth needed for the complete coverage of a species. This is achieved by applying a methodology to calculate the probability of coverage using an adaptation of Stevens' theorem. It also provides scientists with several pipelines aimed at simplifying the analysis of viral metagenomes, including; quality control, assembly and taxonomic binning. We also implement a tool for simulating metagenomics datasets from several sequencing platforms. The overall aim is to provide virologists with an easy to use tool for designing, simulating and analyzing viral metagenomes. The results presented here include a benchmark towards other existing software, with emphasis on detection of viruses as well as speed of applications. This is packaged, as comprehensive software, readily available for Linux and OSX users at https://github.com/norling/metlab. PMID:27479078

  14. Numerical accuracy of mean-field calculations in coordinate space

    NASA Astrophysics Data System (ADS)

    Ryssens, W.; Heenen, P.-H.; Bender, M.

    2015-12-01

    Background: Mean-field methods based on an energy density functional (EDF) are powerful tools used to describe many properties of nuclei in the entirety of the nuclear chart. The accuracy required of energies for nuclear physics and astrophysics applications is of the order of 500 keV and much effort is undertaken to build EDFs that meet this requirement. Purpose: Mean-field calculations have to be accurate enough to preserve the accuracy of the EDF. We study this numerical accuracy in detail for a specific numerical choice of representation for mean-field equations that can accommodate any kind of symmetry breaking. Method: The method that we use is a particular implementation of three-dimensional mesh calculations. Its numerical accuracy is governed by three main factors: the size of the box in which the nucleus is confined, the way numerical derivatives are calculated, and the distance between the points on the mesh. Results: We examine the dependence of the results on these three factors for spherical doubly magic nuclei, neutron-rich 34Ne, the fission barrier of 240Pu, and isotopic chains around Z = 50. Conclusions: Mesh calculations offer the user extensive control over the numerical accuracy of the solution scheme. When appropriate choices for the numerical scheme are made the achievable accuracy is well below the model uncertainties of mean-field methods.

  15. Real-time continuous glucose monitoring shows high accuracy within 6 hours after sensor calibration: a prospective study.

    PubMed

    Yue, Xiao-Yan; Zheng, Yi; Cai, Ye-Hua; Yin, Ning-Ning; Zhou, Jian-Xin

    2013-01-01

    Accurate and timely glucose monitoring is essential in intensive care units. Real-time continuous glucose monitoring systems (CGMS) have been advocated for many years to improve glycemic management in critically ill patients. To determine the effect of calibration time on the accuracy of CGMS, real-time subcutaneous CGMS was used in 18 critically ill patients. The CGMS sensor was calibrated against blood glucose measurements from a blood gas/glucose analyzer every 12 hours. Venous blood was sampled every 2 to 4 hours, and glucose concentration was measured by a standard central laboratory device (CLD) and by the blood gas/glucose analyzer. With the CLD measurement as reference, the relative absolute difference (mean±SD) of CGMS and of the blood gas/glucose analyzer was 14.4%±12.2% and 6.5%±6.2%, respectively. The percentage of matched points in Clarke error grid zone A was 74.8% for CGMS and 98.4% for the blood gas/glucose analyzer. The relative absolute difference of CGMS obtained within 6 hours after sensor calibration (8.8%±7.2%) was significantly less than that between 6 and 12 hours after calibration (20.1%±13.5%, p<0.0001). The percentage of matched points in Clarke error grid zone A was also significantly higher for data sets within 6 hours after calibration (92.4% versus 57.1%, p<0.0001). In conclusion, real-time subcutaneous CGMS is accurate for glucose monitoring in critically ill patients. The CGMS sensor should be calibrated at intervals of less than 6 hours, regardless of the interval recommended by the manufacturer. PMID:23555886
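
    The relative absolute difference (RAD) reported above is simply |measured − reference| / reference, averaged over paired samples. A minimal sketch, with hypothetical readings (the Clarke error grid zone classification is a separate, more involved procedure and is not reproduced here):

```python
def relative_absolute_difference(measured, reference):
    """RAD of one paired glucose reading, relative to the reference method."""
    return abs(measured - reference) / reference

def mean_rad(pairs):
    """Mean RAD over (measured, reference) pairs, as a fraction."""
    return sum(relative_absolute_difference(m, r) for m, r in pairs) / len(pairs)

# hypothetical CGMS readings vs. laboratory reference, mmol/L
rad = mean_rad([(6.6, 6.0), (5.7, 6.0), (8.4, 8.0)])
```

    Multiplying by 100 gives the percentage figures quoted in the abstract.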

  16. The theoretical accuracy of Runge-Kutta time discretizations for the initial boundary value problem: A careful study of the boundary error

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Gottlieb, David; Abarbanel, Saul; Don, Wai-Sun

    1993-01-01

    The conventional method of imposing time dependent boundary conditions for Runge-Kutta (RK) time advancement reduces the formal accuracy of the space-time method to first order locally, and second order globally, independently of the spatial operator. This counterintuitive result is analyzed in this paper. Two methods of eliminating this problem are proposed for the linear constant-coefficient case: (1) impose the exact boundary condition only at the end of the complete RK cycle; (2) impose consistent intermediate boundary conditions derived from the physical boundary condition and its derivatives. The first method, while retaining the RK accuracy in all cases, results in a scheme with a much reduced CFL condition, rendering the RK scheme less attractive. The second method retains the same allowable time step as the periodic problem. However, it is a general remedy only for the linear case. For non-linear hyperbolic equations the second method is effective only for RK schemes of third-order accuracy or less. Numerical studies are presented to verify the efficacy of each approach.
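
    The "formal accuracy" at issue above can be measured empirically: halving the step size of a p-th order scheme should shrink the global error by roughly 2^p. A minimal sketch verifying classical RK4's fourth-order convergence on a scalar ODE (this does not reproduce the paper's initial-boundary-value setting, where time-dependent boundary conditions cause the order reduction):

```python
import math

def rk4(f, y0, t0, t1, n):
    """Advance y' = f(t, y) from t0 to t1 with n classical RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# y' = -y, y(0) = 1, exact solution e^{-t}
f = lambda t, y: -y
exact = math.exp(-1.0)
e1 = abs(rk4(f, 1.0, 0.0, 1.0, 20) - exact)   # step h
e2 = abs(rk4(f, 1.0, 0.0, 1.0, 40) - exact)   # step h/2
order = math.log2(e1 / e2)                    # observed convergence order
```

    With a boundary treatment that is only first-order accurate locally, the same measurement would show the global order collapse described in the abstract.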

  17. The accuracy of pain drawing in identifying psychological distress in low back pain—systematic review and meta-analysis of diagnostic studies

    PubMed Central

    Bertozzi, Lucia; Rosso, Anna; Romeo, Antonio; Villafañe, Jorge Hugo; Guccione, Andrew A.; Pillastrini, Paolo; Vanti, Carla

    2015-01-01

    The aim of this systematic review and meta-analysis was to estimate the accuracy of qualitative pain drawings (PDs) in identifying psychological distress in subacute and chronic low back pain (LBP) patients. [Subjects and Methods] Data were obtained from searches of PubMed, EBSCO, Scopus, PsycINFO and ISI Web of Science from their inception to July 2014. Quality assessments of bias and applicability were conducted using the Quality of Diagnostic Accuracy Studies-2 (QUADAS-2). [Results] The summary estimates were: sensitivity=0.45 (95% CI 0.34, 0.61), specificity=0.66 (95% CI 0.53, 0.82), positive likelihood ratio=1.23 (95% CI 0.93, 1.62), negative likelihood ratio=0.84 (95% CI 0.70, 1.01), and diagnostic odds ratio=1.46 (95% CI 0.79, 2.68). The area under the curve was 78% (CI, 57 to 99%). [Conclusion] The results of this systematic review do not show broad and unqualified support for the accuracy of PDs in detecting psychological distress in subacute and chronic LBP. PMID:26644701
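
    The summary statistics above are related by standard formulas: LR+ = sensitivity/(1 − specificity), LR− = (1 − sensitivity)/specificity, and DOR = LR+/LR−. A small sketch of these definitions (note that pooled meta-analytic estimates, such as those reported above, need not exactly equal these naive point computations from the pooled sensitivity and specificity):

```python
def diagnostic_summary(sens, spec):
    """Positive/negative likelihood ratios and diagnostic odds ratio
    computed from sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)
    lr_neg = (1.0 - sens) / spec
    return lr_pos, lr_neg, lr_pos / lr_neg

# illustrative values, not the pooled estimates from the review
lr_pos, lr_neg, dor = diagnostic_summary(0.80, 0.90)
```

    An LR+ near 1 and an LR− near 1, as found for pain drawings above, indicate that a positive or negative result barely shifts the pre-test odds of psychological distress.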

  18. Remote real-time monitoring of free flaps via smartphone photography and 3G wireless Internet: a prospective study evidencing diagnostic accuracy.

    PubMed

    Engel, Holger; Huang, Jung Ju; Tsao, Chung Kan; Lin, Chia-Yu; Chou, Pan-Yu; Brey, Eric M; Henry, Steven L; Cheng, Ming Huei

    2011-11-01

    This prospective study was designed to compare the accuracy rate between remote smartphone photographic assessments and in-person examinations for free flap monitoring. One hundred and three consecutive free flaps were monitored with in-person examinations and assessed remotely by three surgeons (Team A) via photographs transmitted over smartphone. Four other surgeons used the traditional in-person examinations as Team B. The response time to re-exploration was defined as the interval between when a flap was evaluated as compromised by the nurse/house officer and when the decision was made for re-exploration. The accuracy rate was 98.7% and 94.2% for in-person and smartphone photographic assessments, respectively. The response time of 8 ± 3 min in Team A was statistically shorter than the 180 ± 104 min in Team B (P = 0.01 by the Mann-Whitney test). The remote smartphone photography assessment has a comparable accuracy rate and shorter response time compared with in-person examination for free flap monitoring. PMID:22072583

  19. A pilot study to determine whether using a lightweight, wearable micro-camera improves dietary assessment accuracy and offers information on macronutrients and eating rate.

    PubMed

    Pettitt, Claire; Liu, Jindong; Kwasnicki, Richard M; Yang, Guang-Zhong; Preston, Thomas; Frost, Gary

    2016-01-14

    A major limitation in nutritional science is the lack of understanding of the nutritional intake of free-living people. There is an inverse relationship between accuracy of reporting of energy intake by all current nutritional methodologies and body weight. In this pilot study we aim to explore whether using a novel lightweight, wearable micro-camera improves the accuracy of dietary intake assessment. Doubly labelled water (DLW) was used to estimate energy expenditure and intake over a 14-d period, over which time participants (n 6) completed a food diary and wore a micro-camera on 2 of the days. Comparisons were made between the estimated energy intake from the reported food diary alone and together with the images from the micro-camera recordings. There was an average daily deficit of 3912 kJ using food diaries to estimate energy intake compared with estimated energy expenditure from DLW (P=0·0118), representing an under-reporting rate of 34 %. Analysis of food diaries alone showed a significant deficit in estimated daily energy intake compared with estimated intake from food diary analysis with images from the micro-camera recordings (405 kJ). Use of the micro-camera images in conjunction with food diaries improves the accuracy of dietary assessment and provides valuable information on macronutrient intake and eating rate. There is a need to develop this recording technique to remove user and assessor bias. PMID:26537614
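
    The under-reporting rate quoted above is the energy deficit divided by the DLW-estimated expenditure. A trivial sketch with hypothetical values (the study's expenditure figure is not stated in the abstract, so the numbers below are illustrative only):

```python
def under_reporting_rate(reported_intake_kj, expenditure_kj):
    """Fraction of energy expenditure not captured by the reported intake,
    assuming weight stability (intake approximately equals expenditure)."""
    return (expenditure_kj - reported_intake_kj) / expenditure_kj

# hypothetical: 7 500 kJ/day reported against 11 500 kJ/day expended
rate = under_reporting_rate(7500, 11500)
```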

  20. Radiolysis of ethanol and ethanol-water solutions: A tool for studying bioradical reactions

    NASA Astrophysics Data System (ADS)

    Jore, D.; Champion, B.; Kaouadji, N.; Jay-Gerin, J.-P.; Ferradini, C.

    Radiolysis of pure ethanol and ethanol-water solutions is examined in view of its relevance to the study of biological radical mechanisms. On the basis of earlier studies, a consistent reaction scheme is adopted. New data on radical yields are obtained from the radiolysis of dilute solutions of vitamins E and C in these solvents. It is shown that the radiolysis of ethanolic solutions provides an efficient tool for studying radical reactions of water-insoluble biomolecules.

  1. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  2. Handwriting Characteristics among Secondary Students with and without Physical Disabilities: A Study with a Computerized Tool

    ERIC Educational Resources Information Center

    Li-Tsang, Cecilia W. P.; Au, Ricky K. C.; Chan, Michelle H. Y.; Chan, Lily W. L.; Lau, Gloria M. T.; Lo, T. K.; Leung, Howard W. H.

    2011-01-01

    The purpose of the present study was to investigate the handwriting characteristics of secondary school students with and without physical disabilities (PD). With the use of a computerized Chinese Handwriting Assessment Tool (CHAT), it was made possible to objectively assess and analyze in detail the handwriting characteristics of individual…

  3. The Personal Study Program as a Tool for Career Planning and Personalization of Adult Learning.

    ERIC Educational Resources Information Center

    Onnismaa, Jussi

    2003-01-01

    The personal study program (PSP) can be defined as a tool for the successful accomplishment of vocational adult training. The program defines the objectives of education and training and the best means of achieving these. Through counseling interaction, the adult learner may redefine his goals and relation to a future profession and so revise his…

  4. Factors Influencing the Use of Cognitive Tools in Web-Based Learning Environments: A Case Study

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Yildirim, Soner

    2005-01-01

    High demands on learners in Web-based learning environments and constraints of the human cognitive system cause disorientation and cognitive overload. These problems could be inhibited if appropriate cognitive tools are provided to support learners' cognitive processes. The purpose of this study was to explore the factors influencing the use of…

  5. Study Abroad Programs as Tools of Internationalization: Which Factors Influence Hungarian Business Students to Participate?

    ERIC Educational Resources Information Center

    Huják, Janka

    2015-01-01

    The internationalization of higher education has been on the agenda for decades now all over the world. Study abroad programs are undoubtedly tools of the internationalization endeavors. The ERASMUS Student Mobility Program is one of the flagships of the European Union's educational exchange programs implicitly aiming for the internationalization…

  6. Music: Artistic Performance or a Therapeutic Tool? A Study on Differences

    ERIC Educational Resources Information Center

    Petersson, Gunnar; Nystrom, Maria

    2011-01-01

    The aim of this study is to analyze and describe how musicians who are also music therapy students separate music as artistic performance from music as a therapeutic tool. The data consist of 18 written reflections from music therapy students that were analyzed according to a phenomenographic method. The findings are presented as four…

  7. A Study of Turnitin as an Educational Tool in Student Dissertations

    ERIC Educational Resources Information Center

    Biggam, John; McCann, Margaret

    2010-01-01

    Purpose: This paper explores the use of Turnitin as a learning tool (particularly in relation to citing sources and paraphrasing) and as a vehicle for reducing incidences of plagiarism. Design/methodology/approach: The research was implemented using a case study of 49 final-year "honours" undergraduate students undertaking their year-long core…

  8. iMindMap as an Innovative Tool in Teaching and Learning Accounting: An Exploratory Study

    ERIC Educational Resources Information Center

    Wan Jusoh, Wan Noor Hazlina; Ahmad, Suraya

    2016-01-01

    Purpose: The purpose of this study is to explore the use of iMindMap software as an interactive tool in the teaching and learning method and also to be able to consider iMindMap as an alternative instrument in achieving the ultimate learning outcome. Design/Methodology/Approach: Out of 268 students of the management accounting at the University of…

  9. Critical Reflection as a Learning Tool for Nurse Supervisors: A Hermeneutic Phenomenological Study

    ERIC Educational Resources Information Center

    Urbas-Llewellyn, Agnes

    2013-01-01

    Critical reflection as a learning tool for nursing supervisors is a complex and multifaceted process not completely understood by healthcare leadership, specifically nurse supervisors. Despite a multitude of research studies on critical reflection, there remains a gap in the literature regarding the perceptions of the individual, the support…

  10. Basins and Wepp Climate Assessment Tools (Cat): Case Study Guide to Potential Applications (Final Report)

    EPA Science Inventory

    This final report supports application of two recently developed...

  11. Computer-Mediated Communication as a Teaching Tool: A Case Study.

    ERIC Educational Resources Information Center

    Everett, Donna R.; Ahern, Terence C.

    1994-01-01

    Discussion of emerging educational technologies focuses on a case study of college students that was conducted to observe the effects of using computer-mediated communication and appropriate groupware as a teaching tool. Highlights include effects on the students, the structure of the classroom, and interpersonal interactions. (Contains 29…

  12. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  13. The Life Story Board: A Feasibility Study of a Visual Interview Tool for School Counsellors

    ERIC Educational Resources Information Center

    Chase, Robert M.; Medina, Maria Fernanda; Mignone, Javier

    2012-01-01

    The article describes the findings of a pilot study of the Life Story Board (LSB), a novel visual information system with a play board and sets of magnetic cards designed to be a practical clinical tool for counsellors, therapists, and researchers. The LSB is similar to a multidimensional genogram, and serves as a platform to depict personal…

  14. Social Networking as an Admission Tool: A Case Study in Success

    ERIC Educational Resources Information Center

    Hayes, Thomas J.; Ruschman, Doug; Walker, Mary M.

    2009-01-01

    The concept of social networking, the focus of this article, targets the development of online communities in higher education, and in particular, as part of the admission process. A successful case study of a university is presented on how one university has used this tool to compete for students. A discussion including suggestions on how to…

  15. Preschoolers Monitor the Relative Accuracy of Informants

    ERIC Educational Resources Information Center

    Pasquini, Elisabeth S.; Corriveau, Kathleen H.; Koenig, Melissa; Harris, Paul L.

    2007-01-01

    In 2 studies, the sensitivity of 3- and 4-year-olds to the previous accuracy of informants was assessed. Children viewed films in which 2 informants labeled familiar objects with differential accuracy (across the 2 experiments, children were exposed to the following rates of accuracy by the more and less accurate informants, respectively: 100% vs.…

  16. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  17. The use of analytical surface tools in the fundamental study of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    This paper reviews the various techniques and surface tools available for the study of the atomic nature of the wear of materials. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  18. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  19. Compiling a placement audit tool: a case study of one college's use of the QualCube model in developing an audit tool for student placement areas.

    PubMed

    Fritz, K

    1997-10-01

    Nursing students, as users of college services, can rightly expect satisfaction and even continuous improvement in their clinical experience, which is one of the cornerstones of nurse education. The North Yorkshire College of Health Studies has developed and used a multidisciplinary placement audit tool based on QualCube, a framework for standard setting, auditing and improving quality developed by Nicklin and Lankshear (in Nicklin & Kenworthy 1995 p 18). In order to facilitate a thorough audit of placement areas, a working group devised the tool based on the relevant areas of intersection of the three planes of the cube and then set written standards and specified performance criteria. This article describes the application of the QualCube framework, the design and description of the tool and feedback on its usage. The exciting and useful difference in the tool is its final page, the action plan, which is designed to realistically encourage continuous improvement in each placement area. PMID:9370633

  20. IGG: A tool to integrate GeneChips for genetic studies.

    PubMed

    Li, M-X; Jiang, L; Ho, S-L; Song, Y-Q; Sham, P-C

    2007-11-15

    To facilitate genetic studies using high-throughput genotyping technologies, we have developed an open source tool to integrate genotype data across the Affymetrix and Illumina platforms. It can efficiently integrate a large amount of data from various GeneChips, add genotypes of the HapMap Project into a specific project, flexibly trim and export the integrated data in the formats of popular genetic analysis tools, and tightly control the quality of genotype data. Furthermore, the tool simplifies usage through a user-friendly graphical interface and is independent of third-party databases. IGG has successfully been applied to a genome-wide linkage scan in a Charcot-Marie-Tooth disease pedigree by integrating three types of GeneChips and HapMap project genotypes. PMID:17872914

  1. Study of a wheel-like electrorheological finishing tool and its applications to small parts.

    PubMed

    Su, Jingshi; Cheng, Haobo; Feng, Yunpeng; Tam, Hon-Yuen

    2016-02-01

    A wheel-like electrorheological finishing (ERF) tool for small parts polishing is proposed and thoroughly studied. First, the electrorheological polishing fluid is tested, and its properties suggest usability for electrorheological fluid-assisted finishing. Then, the mathematical removal model of the ERF tool is built employing the conformal mapping method and high-order multipolar moment theory. Finally, a micropattern of trough is fabricated on a slide glass (7 mm wide and 1 mm thick). The trough is 70 nm deep, and its flat bottom is 1.5 m wide (peak to valley of 3.16 nm and root mean square of 1.27 nm); the surface roughness finally achieves 0.86 nm. The results demonstrate the stable machining capability of the ERF tool for miniature parts. PMID:26836063

  2. Experience of integrating various technological tools into the study and future teaching of mathematics education students

    NASA Astrophysics Data System (ADS)

    Gorev, Dvora; Gurevich-Leibman, Irina

    2015-07-01

    This paper presents our experience of integrating technological tools into our mathematics teaching (in both disciplinary and didactic courses) for student-teachers. In the first cycle of our study, a variety of technological tools were used (e.g., dynamic software, hypertexts, video and applets) in teaching two disciplinary mathematics courses. We found that the tool most preferred by the students was dynamic software, while the applets were almost neglected. In the next cycle, we focused on using various applets in both disciplinary and didactic mathematics courses. We found that if the assignments were applet-oriented, i.e., adjusted to the chosen applet, or vice versa - the applet was chosen appropriately to suit the given assignment - then the students were able to make use of applets in an effective way. Furthermore, the students came to see the potential of applets for improving learning.

  3. Do Proficiency and Study-Abroad Experience Affect Speech Act Production? Analysis of Appropriateness, Accuracy, and Fluency

    ERIC Educational Resources Information Center

    Taguchi, Naoko

    2011-01-01

    This cross-sectional study examined the effect of general proficiency and study-abroad experience in production of speech acts among learners of L2 English. Participants were 25 native speakers of English and 64 Japanese college students of English divided into three groups. Group 1 (n = 22) had lower proficiency and no study-abroad experience.…

  4. The accuracy of breast volume measurement methods: A systematic review.

    PubMed

    Choppin, S B; Wheat, J S; Gee, M; Goyal, A

    2016-08-01

    Breast volume is a key metric in breast surgery and there are a number of different methods which measure it. However, a lack of knowledge regarding a method's accuracy and comparability has made it difficult to establish a clinical standard. We have performed a systematic review of the literature to examine the various techniques for measurement of breast volume and to assess their accuracy and usefulness in clinical practice. Each of the fifteen studies we identified had more than ten live participants and assessed volume measurement accuracy using a gold standard based on the volume, or mass, of a mastectomy specimen. Many of the studies from this review report large (>200 ml) uncertainty in breast volume and many fail to assess measurement accuracy using appropriate statistical tools. Of the methods assessed, MRI scanning consistently demonstrated the highest accuracy, with three studies reporting errors lower than 10% for small (250 ml), medium (500 ml) and large (1000 ml) breasts. However, as a high-cost, non-routine assessment, other methods may be more appropriate. PMID:27288864

  5. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    USGS Publications Warehouse

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between
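
    The stage-discharge step of the method lends itself to a compact sketch: given a relation recovered from archived hydraulic-model output, an updated flood stage can be interpolated for a recalculated discharge. The values below are hypothetical, purely to illustrate the lookup (Python with `numpy` assumed):

```python
import numpy as np

# Archived hydraulic-model output at one cross section (hypothetical values):
# discharges in m^3/s paired with computed water-surface stages in m.
discharge = np.array([100.0, 250.0, 500.0, 900.0, 1400.0])
stage = np.array([10.2, 11.1, 12.0, 13.2, 14.1])

def stage_for_discharge(q):
    """Interpolate an updated flood stage for a recalculated flood discharge."""
    return float(np.interp(q, discharge, stage))

# A recalculated 100-year discharge of 700 m^3/s maps to an updated stage:
updated_stage = stage_for_discharge(700.0)  # about 12.6 m
```

    Repeating this lookup at each modeled cross section, then letting the GIS intersect the resulting water surface with the high-accuracy terrain model, is the automation the abstract describes.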

  6. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of that and the other procedures used. The purpose of the accuracy assessment was to allow the comparison of the cost and accuracy of various classification procedures as applied to various data types.
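
    Accuracy assessments of this kind are conventionally summarized from an error (confusion) matrix of classified pixels against ground truth. A minimal sketch of the standard summary statistics, with made-up counts, might look like:

```python
# Error (confusion) matrix for a hypothetical 3-class supervised classification.
# Rows = classified category, columns = ground-truth category (counts are invented).
matrix = [
    [50, 3, 2],   # classified as water
    [4, 60, 6],   # classified as forest
    [1, 7, 67],   # classified as urban
]

total = sum(sum(row) for row in matrix)
correct = sum(matrix[i][i] for i in range(len(matrix)))
overall_accuracy = correct / total  # fraction of pixels classified correctly

# User's accuracy (per classified row) and producer's accuracy (per truth column).
users = [matrix[i][i] / sum(matrix[i]) for i in range(len(matrix))]
producers = [matrix[i][i] / sum(row[i] for row in matrix) for i in range(len(matrix))]
```

    Comparing these per-procedure summaries against each procedure's cost is exactly the trade-off the study set out to quantify.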

  7. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  8. The pectoralis minor length test: a study of the intra-rater reliability and diagnostic accuracy in subjects with and without shoulder symptoms

    PubMed Central

    Lewis, Jeremy S; Valentine, Rachel E

    2007-01-01

    Background Postural abnormality and muscle imbalance are thought to contribute to pain and a loss of normal function in the upper body. A shortened pectoralis minor muscle is commonly identified as part of this imbalance. Clinical tests have been recommended to test for shortening of this muscle. The aim of this study was to evaluate the intra-rater reliability and diagnostic accuracy of the pectoralis minor length test. Methods Measurements were made in 45 subjects with and 45 subjects without shoulder symptoms. Measurements were made with the subjects lying in supine. In this position the linear distance from the treatment table to the posterior aspect of the acromion was measured on two occasions (separated by a minimum of 30 minutes and additional data collection on other subjects to reduce bias) by one rater. The reliability of the measurements was analyzed using intraclass correlation coefficients (ICC), 95% confidence intervals (CI) and standard error of measurement (SEM). The diagnostic accuracy of the test was investigated by determining the sensitivity, specificity, positive and negative likelihood ratios of the test against a 'gold standard' reference. The assessor remained 'blinded' to data input and the measurements were staggered to reduce examiner bias. Results The pectoralis minor length test was found to have excellent intra-rater reliability for dominant and non-dominant side of the subjects without symptoms, and for the painfree and painful side of the subjects with symptoms. The values calculated for the sensitivity, specificity, positive and negative likelihood ratios suggest this test performed in the manner investigated in this study and recommended in the literature, lacks diagnostic accuracy. Conclusion The findings of this study suggest that although the pectoralis minor length test demonstrates acceptable clinical reliability, its lack of specificity suggests that clinicians using this test to inform the clinical reasoning process with
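
    The diagnostic-accuracy quantities named above follow directly from a 2 x 2 table of test outcome against the 'gold standard' reference. A minimal sketch (the counts are hypothetical, not this study's data):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2 x 2 table of
    test outcome versus a gold-standard reference."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity)  # +LR: odds multiplier for a positive test
    lr_neg = (1 - sensitivity) / specificity  # -LR: odds multiplier for a negative test
    return sensitivity, specificity, lr_pos, lr_neg

# Hypothetical counts for illustration only:
sens, spec, lrp, lrn = diagnostic_accuracy(tp=36, fp=20, fn=9, tn=25)
```

    A test whose likelihood ratios sit near 1 barely shifts the post-test odds, which is what "lacks diagnostic accuracy" means operationally even when reliability (ICC) is excellent.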

  9. Smart tool holder

    DOEpatents

    Day, Robert Dean; Foreman, Larry R.; Hatch, Douglas J.; Meadows, Mark S.

    1998-01-01

    There is provided an apparatus for machining surfaces to accuracies within the nanometer range by use of electrical current flow through the contact of the cutting tool with the workpiece as a feedback signal to control depth of cut.
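
    The feedback idea in this patent, using the tool/workpiece contact current as an error signal driving depth of cut, can be caricatured as a single proportional-control update. Everything here (units, gain, and the sign convention that deeper engagement means more current) is a hypothetical illustration, not the patented mechanism:

```python
def control_depth(target_current, measured_current, depth, gain=1e-3):
    """One proportional-control step: nudge the depth of cut (mm) so that the
    measured tool/workpiece contact current (A) tracks a setpoint.
    Hypothetical model; a real nanometer-range controller would need
    filtering, travel limits, and a calibrated plant model."""
    error = target_current - measured_current
    return depth + gain * error

# Measured current below target (tool engaging too lightly) -> increase depth:
new_depth = control_depth(target_current=5.0, measured_current=4.0, depth=0.010)
```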

  10. Metagenomics: Tools and Insights for Analyzing Next-Generation Sequencing Data Derived from Biodiversity Studies

    PubMed Central

    Oulas, Anastasis; Pavloudi, Christina; Polymenakou, Paraskevi; Pavlopoulos, Georgios A; Papanikolaou, Nikolas; Kotoulas, Georgios; Arvanitidis, Christos; Iliopoulos, Ioannis

    2015-01-01

    Advances in next-generation sequencing (NGS) have allowed significant breakthroughs in microbial ecology studies. This has led to the rapid expansion of research in the field and the establishment of “metagenomics”, often defined as the analysis of DNA from microbial communities in environmental samples without prior need for culturing. Many metagenomics statistical/computational tools and databases have been developed in order to allow the exploitation of the huge influx of data. In this review article, we provide an overview of the sequencing technologies and how they are uniquely suited to various types of metagenomic studies. We focus on the currently available bioinformatics techniques, tools, and methodologies for performing each individual step of a typical metagenomic dataset analysis. We also provide future trends in the field with respect to tools and technologies currently under development. Moreover, we discuss data management, distribution, and integration tools that are capable of performing comparative metagenomic analyses of multiple datasets using well-established databases, as well as commonly used annotation standards. PMID:25983555

  11. Data collection and processing tools for naturalistic study of powered two-wheelers users' behaviours.

    PubMed

    Espié, Stéphane; Boubezoul, Abderrahmane; Aupetit, Samuel; Bouaziz, Samir

    2013-09-01

    Instrumented vehicles are key tools for in-depth understanding of drivers' behaviours, thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTW) has been less widely implemented than that for other vehicles, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics for a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time falls detection to the study of riding training systems, as well as studies focusing on naturalistic riding situations such as filtering and line splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data such as: PTW sensors, context-based video retrieval system, Global Positioning System (GPS) and verbal data on the riders' decision-making process. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls, (2) to illustrate the kind of results that can be gained from the conducted studies, (3) to identify the advantages and limitations of the proposed methodology to conduct large scale naturalistic riding studies, and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW-riders' behaviours and risk factors. PMID:23659861

  12. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions. PMID:22141464
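
    The RMSECV figures quoted for the salt, water, and a(w) models are root-mean-square errors of cross-validation. For a univariate linear calibration the leave-one-out version can be sketched as below; the procedure is generic, and any resemblance to the study's models is loose:

```python
import math

def rmsecv(x, y):
    """Leave-one-out root-mean-square error of cross-validation for a
    univariate linear calibration y ~ a + b*x (e.g. salt content against a
    CT-derived predictor). Generic sketch, not the study's code."""
    n = len(x)
    errors = []
    for i in range(n):
        xs = [x[j] for j in range(n) if j != i]   # training set without sample i
        ys = [y[j] for j in range(n) if j != i]
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        b = sum((a - mx) * (c - my) for a, c in zip(xs, ys)) / \
            sum((a - mx) ** 2 for a in xs)
        a0 = my - b * mx
        errors.append(y[i] - (a0 + b * x[i]))     # error on the held-out sample
    return math.sqrt(sum(e * e for e in errors) / n)
```

    A perfectly linear data set gives an RMSECV of zero; the reported nonzero values quantify how far held-out predictions of salt, water, and a(w) stray from the reference measurements.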

  13. Cost Minimization Using an Artificial Neural Network Sleep Apnea Prediction Tool for Sleep Studies

    PubMed Central

    Teferra, Rahel A.; Grant, Brydon J. B.; Mindel, Jesse W.; Siddiqi, Tauseef A.; Iftikhar, Imran H.; Ajaz, Fatima; Aliling, Jose P.; Khan, Meena S.; Hoffmann, Stephen P.

    2014-01-01

    Rationale: More than a million polysomnograms (PSGs) are performed annually in the United States to diagnose obstructive sleep apnea (OSA). Third-party payers now advocate a home sleep test (HST), rather than an in-laboratory PSG, as the diagnostic study for OSA regardless of clinical probability, but the economic benefit of this approach is not known. Objectives: We determined the diagnostic performance of OSA prediction tools including the newly developed OSUNet, based on an artificial neural network, and performed a cost-minimization analysis when the prediction tools are used to identify patients who should undergo HST. Methods: The OSUNet was trained to predict the presence of OSA in a derivation group of patients who underwent an in-laboratory PSG (n = 383). Validation group 1 consisted of in-laboratory PSG patients (n = 149). The network was trained further in 33 patients who underwent HST and then was validated in a separate group of 100 HST patients (validation group 2). Likelihood ratios (LRs) were compared with two previously published prediction tools. The total costs from the use of the three prediction tools and the third-party approach within a clinical algorithm were compared. Measurements and Main Results: The OSUNet had a higher +LR in all groups compared with the STOP-BANG and the modified neck circumference (MNC) prediction tools. The +LRs for STOP-BANG, MNC, and OSUNet in validation group 1 were 1.1 (1.0–1.2), 1.3 (1.1–1.5), and 2.1 (1.4–3.1); and in validation group 2 they were 1.4 (1.1–1.7), 1.7 (1.3–2.2), and 3.4 (1.8–6.1), respectively. With an OSA prevalence less than 52%, the use of all three clinical prediction tools resulted in cost savings compared with the third-party approach. Conclusions: The routine requirement of an HST to diagnose OSA regardless of clinical probability is more costly compared with the use of OSA clinical prediction tools that identify patients who should undergo this procedure when OSA is expected to
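
    The clinical use of a +LR is to convert a pre-test probability of OSA into a post-test probability via the odds form of Bayes' rule. A sketch using the +LR point estimates reported for validation group 2 (the 0.5 pre-test probability is an arbitrary illustration, not a value from the study):

```python
def posttest_probability(pretest_p, lr):
    """Post-test probability of disease given a pre-test probability and a
    likelihood ratio, via the odds form of Bayes' rule."""
    pre_odds = pretest_p / (1 - pretest_p)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Using the +LR point estimates reported for validation group 2:
p_after_osunet = posttest_probability(0.5, 3.4)    # about 0.77
p_after_stopbang = posttest_probability(0.5, 1.4)  # about 0.58
```

    The larger shift produced by the OSUNet +LR is what makes it a better triage gate for deciding who should undergo an HST.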

  14. Multiple Oral Rereading: A Descriptive Study of Its Effects on Reading Speed and Accuracy in Selected First-Grade Children.

    ERIC Educational Resources Information Center

    Moyer, Sandra Brown

    Multiple Oral Rereading (MOR), which involves repeated reading of the same instructional unit, has been found effective in remedial reading instruction. In this study, which was designed to provide basic information about the dynamics of such repetition, 32 first-grade children were selected as subjects on the basis of their ability to read, out…

  15. Accuracy in contouring of small and low contrast lesions: Comparison between diagnostic quality computed tomography scanner and computed tomography simulation scanner-A phantom study

    SciTech Connect

    Ho, Yick Wing; Wong, Wing Kei Rebecca; Yu, Siu Ki; Lam, Wai Wang; Geng Hui

    2012-01-01

    To evaluate the accuracy in detection of small and low-contrast regions using a high-definition diagnostic computed tomography (CT) scanner compared with a radiotherapy CT simulation scanner. A custom-made phantom with cylindrical holes of diameters ranging from 2-9 mm was filled with 9 different concentrations of contrast solution. The phantom was scanned using a 16-slice multidetector CT simulation scanner (LightSpeed RT16, General Electric Healthcare, Milwaukee, WI) and a 64-slice high-definition diagnostic CT scanner (Discovery CT750 HD, General Electric Healthcare). The low-contrast regions of interest (ROIs) were delineated automatically upon their full width at half maximum of the CT number profile in Hounsfield units on a treatment planning workstation. Two conformal indexes, CI_in and CI_out, were calculated to represent the percentage errors of underestimation and overestimation in the automated contours compared with their actual sizes. Summarizing the conformal indexes of different sizes and contrast concentrations, the means of CI_in and CI_out for the CT simulation scanner were 33.7% and 60.9%, respectively, and 10.5% and 41.5% were found for the diagnostic CT scanner. The mean differences between the 2 scanners' CI_in and CI_out were shown to be significant with p < 0.001. A descending trend of the index values was observed as the ROI size increases for both scanners, which indicates an improved accuracy when the ROI size increases, whereas no observable trend was found in the contouring accuracy with respect to the contrast levels in this study. Images acquired by the diagnostic CT scanner allow higher accuracy on size estimation compared with the CT simulation scanner in this study. We recommend using a diagnostic CT scanner to scan patients with small lesions (<1 cm in diameter) for radiotherapy treatment planning, especially for those pending for stereotactic radiosurgery in which accurate delineation of small
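
    The FWHM delineation rule used on the workstation can be illustrated on a one-dimensional CT-number profile. The sketch below uses synthetic values and a deliberately crude sample-index width, only to show the half-maximum idea:

```python
def fwhm_width(profile, spacing=1.0):
    """Width of a 1-D CT-number profile (HU samples) at half its maximum
    above baseline, taking the baseline as the profile minimum. A crude,
    sample-resolution illustration of FWHM delineation, not clinical code."""
    base = min(profile)
    half = base + (max(profile) - base) / 2.0
    above = [i for i, v in enumerate(profile) if v > half]
    return (above[-1] - above[0] + 1) * spacing
```

    Comparing such a measured width against the hole's true diameter is the under/overestimation that the CI_in and CI_out indexes summarize across sizes and contrast levels.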

  16. GEOSPATIAL DATA ACCURACY ASSESSMENT

    EPA Science Inventory

    The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue are related directly to the dramatic escalation in the developmen...

  17. Comparison of diagnostic accuracy of plain film radiographs between original film and smartphone capture: a pilot study.

    PubMed

    Licurse, Mindy Y; Kim, Sung H; Kim, Woojin; Ruutiainen, Alexander T; Cook, Tessa S

    2015-12-01

    The use of mobile devices for medical image capture has become increasingly popular given the widespread use of smartphone cameras. Prior studies have generally compared mobile phone capture images to digitized images. However, many underserved and rural areas without picture archiving and communication systems (PACS) still depend greatly on the use of film radiographs. Additionally, there is a scarcity of specialty-trained or formally licensed radiologists in many of these regions. Subsequently, there is great potential for the use of smartphone capture of plain radiograph films which would allow for increased access to economical and efficient consultation from board-certified radiologists abroad. The present study addresses the ability to diagnose a subset of radiographic findings identified on both the original film radiograph and the captured camera phone image. PMID:25840654

  18. International proficiency study of a consensus L1 PCR assay for the detection and typing of human papillomavirus DNA: evaluation of accuracy and intralaboratory and interlaboratory agreement.

    PubMed

    Kornegay, Janet R; Roger, Michel; Davies, Philip O; Shepard, Amanda P; Guerrero, Nayana A; Lloveras, Belen; Evans, Darren; Coutlée, François

    2003-03-01

    The PGMY L1 consensus primer pair combined with the line blot assay allows the detection of 27 genital human papillomavirus (HPV) genotypes. We conducted an intralaboratory and interlaboratory agreement study to assess the accuracy and reproducibility of PCR for HPV DNA detection and typing using the PGMY primers and typing amplicons with the line blot (PGMY-LB) assay. A test panel of 109 samples consisting of 29 HPV-negative (10 buffer controls and 19 genital samples) and 80 HPV-positive samples (60 genital samples and 20 controls with small or large amounts of HPV DNA plasmids) were tested blindly in triplicate by three laboratories. Intralaboratory agreement ranged from 86 to 98% for HPV DNA detection. PGMY-LB assay results for samples with a low copy number of HPV DNA were less reproducible. The rate of intralaboratory agreement excluding negative results for HPV typing ranged from 78 to 96%. Interlaboratory reliability for HPV DNA positivity and HPV typing was very good, with levels of agreement of >95% and kappa values of >0.87. Again, low-copy-number samples were more prone to generating discrepant results. The accuracy varied from 91 to 100% for HPV DNA positivity and from 90 to 100% for HPV typing. HPV testing can thus be accomplished reliably with PCR by using a standardized written protocol and quality-controlled reagents. The use of validated HPV DNA detection and typing assays demonstrating excellent interlaboratory agreement will allow investigators to better compare results between epidemiological studies. PMID:12624033
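
    The kappa values reported for interlaboratory agreement correct raw percent agreement for chance. A minimal two-rater Cohen's kappa on binary HPV DNA calls (toy data, not the panel's results) might be sketched as:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' binary calls (e.g. HPV DNA positive or
    negative, from two laboratories). Inputs are equal-length 0/1 lists."""
    n = len(r1)
    po = sum(1 for a, b in zip(r1, r2) if a == b) / n  # observed agreement
    p1 = sum(r1) / n
    p2 = sum(r2) / n
    pe = p1 * p2 + (1 - p1) * (1 - p2)                 # agreement expected by chance
    return (po - pe) / (1 - pe)
```

    Perfect agreement yields kappa = 1, while agreement no better than chance yields kappa near 0; the panel's kappa values above 0.87 therefore indicate very strong interlaboratory reliability.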

  19. Accuracy of thoracolumbar transpedicular and vertebral body percutaneous screw placement: coupling the Rosa® Spine robot with intraoperative flat-panel CT guidance--a cadaver study.

    PubMed

    Lefranc, M; Peltier, J

    2015-12-01

    The primary objective of the present study was to evaluate the accuracy of a new robotic device when coupled with intraoperative flat-panel CT guidance. Screws (D8-S1) were implanted during two separate cadaver sessions by coupling the Rosa(®) Spine robot with the flat-panel CT device. Of 38 implanted screws, 37 (97.4 %) were fully contained within the pedicle. One screw breached the lateral cortex of one pedicle by <1 mm. The mean ± SD accuracy (relative to pre-operative planning) was 2.05 ± 1.2 mm for the screw head, 1.65 ± 1.11 for the middle of the pedicle and 1.57 ± 1.01 for the screw tip. When coupled with intraoperative flat-panel CT guidance, the Rosa(®) Spine robot appears to be accurate in placing pedicle screws within both pedicles and the vertebral body. Large clinical studies are mandatory to confirm this preliminary cadaveric report. PMID:26530846

  20. Experimental and numerical study of the accuracy of flame-speed measurements for methane/air combustion in a slot burner

    SciTech Connect

    Selle, L.; Ferret, B.; Poinsot, T.

    2011-01-15

    Measuring the velocities of premixed laminar flames with precision remains a controversial issue in the combustion community. This paper studies the accuracy of such measurements in two-dimensional slot burners and shows that while methane/air flame speeds can be measured with reasonable accuracy, the method may lack precision for other mixtures such as hydrogen/air. Curvature at the flame tip, strain on the flame sides and local quenching at the flame base can modify local flame speeds and require corrections which are studied using two-dimensional DNS. Numerical simulations also provide stretch, displacement and consumption flame speeds along the flame front. For methane/air flames, DNS show that the local stretch remains small so that the local consumption speed is very close to the unstretched premixed flame speed. The only correction needed to correctly predict flame speeds in this case is due to the finite aspect ratio of the slot used to inject the premixed gases which induces a flow acceleration in the measurement region (this correction can be evaluated from velocity measurement in the slot section or from an analytical solution). The method is applied to methane/air flames with and without water addition and results are compared to experimental data found in the literature. The paper then discusses the limitations of the slot-burner method to measure flame speeds for other mixtures and shows that it is not well adapted to mixtures with a Lewis number far from unity, such as hydrogen/air flames. (author)

  1. State of the Art in Silico Tools for the Study of Signaling Pathways in Cancer

    PubMed Central

    Villaamil, Vanessa Medina; Gallego, Guadalupe Aparicio; Cainzos, Isabel Santamarina; Valladares-Ayerbes, Manuel; Antón Aparicio, Luis M.

    2012-01-01

    In the last several years, researchers have exhibited an intense interest in the evolutionarily conserved signaling pathways that have crucial roles during embryonic development. Interestingly, the malfunctioning of these signaling pathways leads to several human diseases, including cancer. The chemical and biophysical events that occur during cellular signaling, as well as the number of interactions within a signaling pathway, make these systems complex to study. In silico resources are tools used to aid the understanding of cellular signaling pathways. Systems approaches have provided a deeper knowledge of diverse biochemical processes, including individual metabolic pathways, signaling networks and genome-scale metabolic networks. In the future, these tools will be enormously valuable, if they continue to be developed in parallel with growing biological knowledge. In this study, an overview of the bioinformatics resources that are currently available for the analysis of biological networks is provided. PMID:22837650

  2. Study on the wear mechanism and tool life of coated gun drill

    NASA Astrophysics Data System (ADS)

    Wang, Yongguo; Yan, Xiangping; Chen, Xiaoguang; Sun, Changyu; Zhang, Xi

    2010-12-01

    A comprehensive investigation of the wear progress of a solid carbide gun drill coated with TiAlN while machining S48CSiV steel at a cutting speed of 12.66 m/s has been performed. Cutting torque was recorded and the tool wear mechanism was studied. The surface morphology of the tool and the chip was examined using scanning electron microscopy (SEM) and energy-dispersive spectrometry (EDS). Results show that cutting torque fluctuates between 3% and 5% when machining fewer than 130 crankshafts, but increases sharply to nearly 18% by 150 crankshafts because the coating is damaged and the wear becomes severe. The dominant wear mechanisms are adhesive wear and chemical dissolution wear.

  3. Study on the wear mechanism and tool life of coated gun drill

    NASA Astrophysics Data System (ADS)

    Wang, Yongguo; Yan, Xiangping; Chen, Xiaoguang; Sun, Changyu; Zhang, Xi

    2011-05-01

    A comprehensive investigation of the wear progress of a solid carbide gun drill coated with TiAlN while machining S48CSiV steel at a cutting speed of 12.66 m/s has been performed. Cutting torque was recorded and the tool wear mechanism was studied. The surface morphology of the tool and the chip was examined using scanning electron microscopy (SEM) and energy-dispersive spectrometry (EDS). Results show that cutting torque fluctuates between 3% and 5% when machining fewer than 130 crankshafts, but increases sharply to nearly 18% by 150 crankshafts because the coating is damaged and the wear becomes severe. The dominant wear mechanisms are adhesive wear and chemical dissolution wear.

  4. Image-Guided Localization Accuracy of Stereoscopic Planar and Volumetric Imaging Methods for Stereotactic Radiation Surgery and Stereotactic Body Radiation Therapy: A Phantom Study

    SciTech Connect

    Kim, Jinkoo; Jin, Jian-Yue; Walls, Nicole; Nurushev, Teamour; Movsas, Benjamin; Chetty, Indrin J.; Ryu, Samuel

    2011-04-01

    Purpose: To evaluate the positioning accuracies of two image-guided localization systems, ExacTrac and On-Board Imager (OBI), in a stereotactic treatment unit. Methods and Materials: An anthropomorphic pelvis phantom with eight internal metal markers (BBs) was used. The center of one BB was set as the plan isocenter. The phantom was set up on a treatment table with various initial setup errors. Then, the errors were corrected using each of the investigated systems. The residual errors were measured with respect to the radiation isocenter using orthogonal portal images with a field size of 3 × 3 cm². The angular localization discrepancies of the two systems and the correction accuracy of the robotic couch were also studied. A pair of pre- and post-cone beam computed tomography (CBCT) images was acquired for each angular correction. Then, the correction errors were estimated by using the internal BBs through fiducial marker-based registrations. Results: The isocenter localization errors (μ ± σ) in the left/right, posterior/anterior, and superior/inferior directions were, respectively, -0.2 ± 0.2 mm, -0.8 ± 0.2 mm, and -0.8 ± 0.4 mm for ExacTrac, and 0.5 ± 0.7 mm, 0.6 ± 0.5 mm, and 0.0 ± 0.5 mm for OBI CBCT. The registration angular discrepancy between the two systems was 0.1 ± 0.2°, and the maximum angle correction error of the robotic couch was 0.2° about all axes. Conclusion: Both the ExacTrac and the OBI CBCT systems showed approximately 1-mm isocenter localization accuracy. The angular discrepancy between the two systems was minimal, and the robotic couch angle correction was accurate. These positioning uncertainties should be taken as a lower bound because the results were based on a rigid dosimetry phantom.
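    The per-axis statistics reported above (μ ± σ) summarize residual errors across repeated phantom setups. A minimal sketch of how such a summary is computed from a list of per-setup residuals follows; the sample values are illustrative only, not the study's data.

    ```python
    import math

    def mean_sd(values):
        """Sample mean and standard deviation (n - 1 denominator)."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        return mean, math.sqrt(var)

    # Residual errors in mm along one axis across repeated setups (illustrative).
    lr_residuals = [-0.4, -0.2, 0.0, -0.2]
    mu, sigma = mean_sd(lr_residuals)  # reported as "mu ± sigma mm"
    ```

    Repeating this per axis and per system yields the six μ ± σ entries quoted in the results.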

  5. Infrared laser photolysis - A new tool for the study of prebiotic chemistry

    NASA Technical Reports Server (NTRS)

    Davis, D. D.; Smith, G. R.; Guillory, W. A.

    1980-01-01

    Infrared laser induced dielectric breakdown and multiphoton absorption experiments on CH4/NH3 'atmospheres' are described. It is found that HCN, a central intermediate in prebiotic chemistry, is a principal product. This, combined with the fact that dielectric breakdown appears to have much in common with ordinary electric sparks, suggests that the laser could be a useful tool in studies of prebiotic chemistry. Several possible experiments in this vein are suggested.

  6. The impact of early gut microbiota modulation on the risk of child disease: alert to accuracy in probiotic studies.

    PubMed

    Isolauri, E; Salminen, S

    2015-01-01

    The composition of the gut microbiota, and thus also the modification of the gut microbiota by specific probiotics or prebiotics early in life, may have an impact on the risk of disease in the child. Beyond its impact on gut microecology, probiotic effects have been attributed to restoration to normal of increased intestinal permeability, improvement of the intestine's immunological barrier functions, alleviation of the intestinal inflammatory response, and reduced generation of the proinflammatory cytokines characteristic of local and systemic allergic inflammation. Recent demonstrations from experimental and clinical studies suggest that the gut microbiota is also involved in the control of body weight and energy metabolism, affecting the two main causes of obesity, energy acquisition and storage, and contributing to insulin resistance and the inflammatory state characterising obesity. Current research focuses both on characterising specific probiotic strains and on how the food matrix and the dietary content interact with the most efficient probiotic strains. It is important to characterise each probiotic to species and strain level and to select strains with documented properties, as the probiotic potential is strain-specific. As any proof of causality requires clinical intervention studies in humans in different populations, rigorous and detailed documentation will enhance reproducibility and circumvent confusion. PMID:25619446

  7. Improving power and accuracy of genome-wide association studies via a multi-locus mixed linear model methodology.

    PubMed

    Wang, Shi-Bo; Feng, Jian-Ying; Ren, Wen-Long; Huang, Bo; Zhou, Ling; Wen, Yang-Jun; Zhang, Jin; Dunwell, Jim M; Xu, Shizhong; Zhang, Yuan-Ming

    2016-01-01

    Genome-wide association studies (GWAS) have been widely used in genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed model analysis (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we proposed a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Due to the multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in QTN detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering time related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS. PMID:26787347

  8. Improving power and accuracy of genome-wide association studies via a multi-locus mixed linear model methodology

    PubMed Central

    Wang, Shi-Bo; Feng, Jian-Ying; Ren, Wen-Long; Huang, Bo; Zhou, Ling; Wen, Yang-Jun; Zhang, Jin; Dunwell, Jim M.; Xu, Shizhong; Zhang, Yuan-Ming

    2016-01-01

    Genome-wide association studies (GWAS) have been widely used in genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed model analysis (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we proposed a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Due to the multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in QTN detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering time related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS. PMID:26787347

  9. Accuracy of erythrogram and serum ferritin for the maternal anemia diagnosis (AMA): a phase 3 diagnostic study on prediction of the therapeutic responsiveness to oral iron in pregnancy

    PubMed Central

    2