Sample records for simulation-based diagnostic tool

  1. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation, instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools in the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool: a tool with intensive inter-process communication introduces large overhead, whereas the additional runtime of a tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime caused by the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
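
    The abstract does not give the aggregation algorithm itself; as a rough illustration of the idea, the following minimal Python/NumPy sketch (with an invented interface, not the MESSy/ICON code) block-averages a fine regular 2-D field onto a coarser grid whose spacing is an integer multiple of the fine spacing.

      import numpy as np

      def aggregate(field, factor):
          # Average factor x factor blocks of a 2-D field onto a coarser grid.
          ny, nx = field.shape
          assert ny % factor == 0 and nx % factor == 0, "factor must divide the grid"
          return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

      fine = np.random.rand(180, 360)          # e.g. a 1-degree global field
      coarse = aggregate(fine, 4)              # 4-degree coarse grid, shape (45, 90)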

  2. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to the Shuttle Mission Simulator]

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
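
    The abstract does not name the SDB's induction algorithm; purely as a sketch of the general idea, here is inductive rule extraction from expert-classified data using a scikit-learn decision tree (the sensor features, thresholds, and labels are invented):

      from sklearn.tree import DecisionTreeClassifier, export_text

      # Expert-classified samples: [pressure, temperature] -> "nominal" / "fault"
      X = [[100, 20], [102, 21], [140, 35], [145, 38], [99, 19], [150, 40]]
      y = ["nominal", "nominal", "fault", "fault", "nominal", "fault"]

      tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
      print(export_text(tree, feature_names=["pressure", "temperature"]))
      # Prints IF/THEN-style split thresholds that approximate the expert's classification.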

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, S; Ji, Y; Kim, K

    Purpose: A diagnostic multileaf collimator (MLC) was designed for dose reduction in diagnostic radiography. Monte Carlo simulation was used to evaluate the efficiency of the shielding material used for the collimator leaves. Materials & Methods: The general radiography unit (Rex-650R, Listem, Korea) was modeled with a Monte Carlo code (MCNPX, LANL, USA), and the SRS-78 program was used to calculate the energy spectra for tube voltages of 80, 100, and 120 kVp. The shielding material was SKD 11 alloy tool steel, composed of 1.6% carbon (C), 0.4% silicon (Si), 0.6% manganese (Mn), 5% chromium (Cr), 1% molybdenum (Mo), and vanadium (V); its density is 7.89 g/cm3. We simulated the leaves of the diagnostic MLC made of SKD 11 together with the general radiography unit, and calculated the shielding efficiency of the diagnostic MLC as a function of energy using the Tally 6 card of MCNPX. Results: The diagnostic MLC consisted of 25 individual metal shielding leaves on both sides, with dimensions of 10 × 0.5 × 0.5 cm3. The leaves were driven by motors positioned on both sides of the MLC. Depending on tube voltage, the shielding efficiency of the MLC in the Monte Carlo simulation was 99% (80 kVp), 96% (100 kVp), and 93% (120 kVp). Conclusion: We verified the efficiency of a diagnostic MLC fabricated from SKD 11 alloy tool steel. Based on these results, the diagnostic MLC was designed; we will build it for dose reduction in diagnostic radiography.
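
    For orientation only, a toy Monte Carlo estimate of shielding efficiency under simple exponential attenuation; the attenuation coefficient below is invented, and the actual study used MCNPX energy-deposition tallies rather than this simplified transmission model:

      import math, random

      def shielding_efficiency(mu_cm, thickness_cm, n=100_000):
          # Fraction of photons stopped in a slab, sampling exponential free paths.
          transmitted = 0
          for _ in range(n):
              path = -math.log(1.0 - random.random()) / mu_cm
              if path > thickness_cm:
                  transmitted += 1
          return 1.0 - transmitted / n

      print(shielding_efficiency(mu_cm=10.0, thickness_cm=0.5))   # ~0.993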

  4. A Cryogenic Fluid System Simulation in Support of Integrated Systems Health Management

    NASA Technical Reports Server (NTRS)

    Barber, John P.; Johnston, Kyle B.; Daigle, Matthew

    2013-01-01

    Simulations serve as important tools throughout the design and operation of engineering systems. In the context of systems health management, simulations serve many uses. For one, the underlying physical models can be used by model-based health management tools to develop diagnostic and prognostic models. These simulations should incorporate both nominal and faulty behavior with the ability to inject various faults into the system. Such simulations can therefore be used for operator training, for both nominal and faulty situations, as well as for developing and prototyping health management algorithms. In this paper, we describe a methodology for building such simulations. We discuss the design decisions and tools used to build a simulation of a cryogenic fluid test bed, and how it serves as a core technology for systems health management development and maturation.
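
    The paper's cryogenic test-bed model is not reproduced here; as a minimal sketch of the fault-injection idea it describes, consider a component whose nominal behavior can be overridden at runtime (the component and fault model are hypothetical):

      class Valve:
          def __init__(self):
              self.stuck = None                # injected fault: None or a fixed position

          def position(self, command):
              # Nominal behavior follows the command; a stuck fault overrides it.
              return self.stuck if self.stuck is not None else command

      valve = Valve()
      print(valve.position(0.8))               # nominal: 0.8
      valve.stuck = 0.0                        # inject "valve stuck closed" mid-simulation
      print(valve.position(0.8))               # faulty: 0.0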

  5. Diagnostic tool for structural health monitoring: effect of material nonlinearity and vibro-impact process

    NASA Astrophysics Data System (ADS)

    Hiwarkar, V. R.; Babitsky, V. I.; Silberschmidt, V. V.

    2013-07-01

    Numerous techniques are available for monitoring structural health. Most of these techniques are expensive and time-consuming. In this paper, vibration-based techniques are explored together with their use as diagnostic tools for structural health monitoring. Finite-element simulations are used to study the effect of material nonlinearity on the dynamics of a cracked bar. Additionally, several experiments are performed to study the effect of the vibro-impact behavior of the crack on its dynamics. Experiments showed that crack-tip plasticity and the vibro-impact behavior linked to the interaction of crack faces change the natural frequency of the cracked bar and lead to the generation of higher harmonics; these signatures can be used as a diagnostic tool for structural health monitoring.
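
    A minimal sketch of the kind of higher-harmonic signature the authors exploit, using a synthetic vibration signal rather than their experimental data:

      import numpy as np

      fs, f0 = 4096.0, 100.0                   # sample rate and drive frequency (Hz)
      t = np.arange(0, 1.0, 1.0 / fs)
      x = np.sin(2 * np.pi * f0 * t) + 0.1 * np.sin(2 * np.pi * 2 * f0 * t)

      amp = np.abs(np.fft.rfft(x)) / len(t)
      freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
      ratio = amp[freqs == 2 * f0][0] / amp[freqs == f0][0]
      print(f"2nd-harmonic ratio: {ratio:.2f}")  # ~0.10; grows as crack faces interact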

  6. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination.

    PubMed

    Koehler, Ryan J; Nicandri, Gregg T

    2013-12-04

    Examination of arthroscopic skill requires evaluation tools that are valid and reliable, with clear criteria for passing. The Arthroscopic Surgery Skill Evaluation Tool was developed as a video-based assessment of technical skill with criteria for passing established by a panel of experts. The purpose of this study was to test the validity and reliability of the Arthroscopic Surgery Skill Evaluation Tool as a pass-fail examination of arthroscopic skill. Twenty-eight residents and two sports medicine faculty members were recorded performing diagnostic knee arthroscopy on a left and right cadaveric specimen in our arthroscopic skills laboratory. Procedure videos were evaluated with use of the Arthroscopic Surgery Skill Evaluation Tool by two raters blind to subject identity. Subjects were considered to pass the Arthroscopic Surgery Skill Evaluation Tool when they attained scores of ≥ 3 on all eight assessment domains. The raters agreed on a pass-fail rating for fifty-five of sixty videos, with an intraclass correlation coefficient of 0.83. Ten of thirty participants were assigned passing scores by both raters for both diagnostic arthroscopies performed in the laboratory. Receiver operating characteristic analysis demonstrated that logging more than eighty arthroscopic cases or performing more than thirty-five arthroscopic knee cases was predictive of attaining a passing Arthroscopic Surgery Skill Evaluation Tool score on both procedures performed in the laboratory. This study demonstrates that the Arthroscopic Surgery Skill Evaluation Tool is valid and reliable as a pass-fail examination of diagnostic arthroscopy of the knee in the simulation laboratory. Further study is necessary to determine whether the Arthroscopic Surgery Skill Evaluation Tool can be used for the assessment of multiple arthroscopic procedures and whether it can be used to evaluate arthroscopic procedures performed in the operating room.

  7. Community-based benchmarking of the CMIP DECK experiments

    NASA Astrophysics Data System (ADS)

    Gleckler, P. J.

    2015-12-01

    A diversity of community-based efforts is independently developing "diagnostic packages" with little or no coordination between them. A short list of examples includes NCAR's Climate Variability Diagnostics Package (CVDP), ORNL's International Land Model Benchmarking (ILAMB), LBNL's Toolkit for Extreme Climate Analysis (TECA), PCMDI's Metrics Package (PMP), the EU EMBRACE ESMValTool, the WGNE MJO diagnostics package, and the CFMIP diagnostics. The full value of these efforts cannot be realized without some coordination. As a first step, a WCRP effort has initiated a catalog to document candidate packages that could potentially be applied in a "repeat-use" fashion to all simulations contributed to the CMIP DECK (Diagnostic, Evaluation and Characterization of Klima) experiments. Some coordination of community-based diagnostics also has the potential to improve how CMIP modeling groups analyze their simulations during model development. The fact that most modeling groups now maintain a "CMIP compliant" data stream means that, in principle, they could readily adopt a set of well-organized diagnostic capabilities specifically designed to operate on CMIP DECK experiments. Ultimately, a detailed listing of, and access to, analysis codes that are demonstrated to work "out of the box" with CMIP data could enable model developers (and others) to select the codes they wish to implement in-house, potentially enabling more systematic evaluation during the model development process.

  8. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  9. 3-D Brownian motion simulator for high-sensitivity nanobiotechnological applications.

    PubMed

    Toth, Arpád; Banky, Dániel; Grolmusz, Vince

    2011-12-01

    A wide variety of nanobiotechnologic applications are being developed for nanoparticle-based in vitro diagnostic and imaging systems. Some of these systems enable highly sensitive detection of molecular biomarkers. Frequently, the very low concentration of the biomarkers makes the classical, partial differential equation-based mathematical simulation of the motion of the nanoparticles infeasible. We present a three-dimensional Brownian motion simulation tool for predicting the movement of nanoparticles in various thermal, viscosity, and geometric settings in a rectangular cuvette. For nonprofit users the server is freely available at http://brownian.pitgroup.org.
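
    The server's internals are not described in the abstract; a minimal sketch of a 3-D Brownian step update using the Stokes-Einstein diffusion coefficient (particle and fluid parameters are illustrative, and no cuvette-wall handling is included):

      import numpy as np

      k_B, T = 1.380649e-23, 298.0             # Boltzmann constant (J/K), temperature (K)
      eta, r = 1.0e-3, 50e-9                   # viscosity of water (Pa s), particle radius (m)
      D = k_B * T / (6 * np.pi * eta * r)      # Stokes-Einstein diffusion coefficient (m^2/s)
      dt, steps = 1e-6, 10_000                 # time step (s) and number of steps

      pos = np.zeros(3)
      for _ in range(steps):
          pos += np.sqrt(2 * D * dt) * np.random.standard_normal(3)
      print(pos)                               # net 3-D displacement after 10 ms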

  10. Consequences of Base Time for Redundant Signals Experiments

    PubMed Central

    Townsend, James T.; Honey, Christopher

    2007-01-01

    We report analytical and computational investigations into the effects of base time on the diagnosticity of two popular theoretical tools in the redundant signals literature: (1) the race model inequality and (2) the capacity coefficient. We show analytically, and without distributional assumptions, that the presence of base time decreases the sensitivity of both of these measures to model violations. We further use simulations to investigate the statistical power of model selection tools based on the race model inequality, both with and without base time. Base time decreases statistical power and biases the race model test toward conservatism. The magnitude of this biasing effect increases as we increase the proportion of total reaction time variance contributed by base time. We marshal empirical evidence to suggest that the proportion of reaction time variance contributed by base time is relatively small, and that the effects of base time on the diagnosticity of our model-selection tools are therefore likely to be minor. However, uncertainty remains concerning the magnitude and even the definition of base time. Experimentalists should continue to be alert to situations in which base time may contribute a large proportion of the total reaction time variance. PMID:18670591
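
    For reference, the two tools analyzed take standard forms (not restated in the abstract): the race model inequality bounds the redundant-condition response-time distribution, and the capacity coefficient compares integrated hazard functions,

      \[ F_{AB}(t) \le F_A(t) + F_B(t), \qquad C(t) = \frac{H_{AB}(t)}{H_A(t) + H_B(t)}, \quad H_i(t) = -\log S_i(t), \]

    where F denotes a response-time CDF and S_i(t) the corresponding survivor function. Base time adds a random offset to every response time, which dilutes violations of both measures.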

  11. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy.

    PubMed

    Fucentese, Sandro F; Rahm, Stefan; Wieser, Karl; Spillmann, Jonas; Harders, Matthias; Koch, Peter P

    2015-04-01

    The aim of this work is to determine face validity and construct validity of a new virtual-reality-based simulator for diagnostic and therapeutic knee arthroscopy. The study tests a novel arthroscopic simulator based on passive haptics. Sixty-eight participants were grouped into novices, intermediates, and experts. All participants completed two exercises. In order to establish face validity, all participants filled out a questionnaire concerning different aspects of simulator realism, training capacity, and different statements, using a seven-point Likert scale (range 1-7). Construct validity was tested by comparing various simulator metric values between novices and experts. Face validity could be established: overall realism was rated with a mean value of 5.5 points. Global training capacity scored a mean value of 5.9. Participants considered the simulator useful for procedural training of diagnostic and therapeutic arthroscopy. In the foreign body removal exercise, experts were overall significantly faster in the whole procedure (6 min 24 s vs. 8 min 24 s, p < 0.001), took less time to complete the diagnostic tour (2 min 49 s vs. 3 min 32 s, p = 0.027), and had a shorter camera path length (186 vs. 246 cm, p = 0.006). The simulator achieved high scores in terms of realism. It was regarded as a useful training tool, which is also capable of differentiating between varying levels of arthroscopic experience. Nevertheless, further improvements of the simulator, especially in the field of therapeutic arthroscopy, are desirable. In general, the findings support that virtual-reality-based simulation using passive haptics has the potential to complement conventional training of knee arthroscopy skills. Level of evidence: II.

  12. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data are captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the Shuttle Mission Simulator (SMS) can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.

  13. Simulation of light transport in arthritic- and non-arthritic human fingers

    NASA Astrophysics Data System (ADS)

    Milanic, Matija; Paluchowski, Lukasz A.; Randeberg, Lise L.

    2014-03-01

    Rheumatoid arthritis is a disease that frequently leads to joint destruction. It has high incidence rates worldwide, and it significantly reduces patients' quality of life due to pain, swelling, and stiffness of the affected joints. Early diagnosis is necessary to improve the course of the disease; therefore, sensitive and accurate diagnostic tools are required. Optical imaging techniques have the capability for early diagnosis and monitoring of arthritis. Compared to conventional diagnostic techniques, optical techniques are noninvasive, noncontact, and fast ways of collecting diagnostic information. However, a realistic model of light transport in human joints is needed for the understanding and development of such optical diagnostic tools. The aim of this study is to develop a 3D numerical model of light transport in a human finger. The model will guide the development of a hyperspectral imaging (HSI) diagnostic modality for arthritis in human fingers. The implemented human finger geometry is based on anatomical data. Optical data of finger tissues are adjusted to represent either an arthritic or an unaffected finger. The geometry and optical data serve as input to a 3D Monte Carlo method, which calculates diffuse reflectance, transmittance, and absorbed energy distributions. The parameters of the model are optimized based on HSI measurements of human fingers. The presented model serves as an important tool for the understanding and development of HSI as an arthritis diagnostic modality. It can also be applied to other optical techniques and finger diseases.

  14. Methods Developed by the Tools for Engine Diagnostics Task to Monitor and Predict Rotor Damage in Real Time

    NASA Technical Reports Server (NTRS)

    Baaklini, George Y.; Smith, Kevin; Raulerson, David; Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Brasche, Lisa

    2003-01-01

    Tools for Engine Diagnostics is a major task in the Propulsion System Health Management area of the Single Aircraft Accident Prevention project under NASA's Aviation Safety Program. The major goal of the Aviation Safety Program is to reduce fatal aircraft accidents by 80 percent within 10 years and by 90 percent within 25 years. The goal of the Propulsion System Health Management area is to eliminate propulsion system malfunctions as a primary or contributing factor in aircraft accidents. The purpose of Tools for Engine Diagnostics, a 2-yr-old task, is to establish and improve tools for engine diagnostics and prognostics that measure the deformation and damage of rotating engine components at the ground level and that perform intermittent or continuous monitoring on-wing. In this work, nondestructive evaluation (NDE)-based technology is combined with model-dependent disk spin experimental simulation systems, like finite element modeling (FEM) and modal norms, to monitor and predict rotor damage in real time. Fracture-mechanics-based time-dependent fatigue crack growth and damage-mechanics-based life estimation are being developed, and their potential use investigated. In addition, wireless eddy current and advanced acoustics are being developed for on-wing and just-in-time NDE engine inspection to provide deeper access and higher sensitivity, to extend on-wing capabilities, and to improve inspection readiness. In the long run, these methods could establish a base for prognostic sensing while an engine is running, without any overt actions, like inspections. This damage-detection strategy includes experimentally acquired vibration-, eddy-current-, and capacitance-based displacement measurements and analytically computed FEM-, modal-norms-, and conventional rotordynamics-based models of well-defined damage and critical mass imbalances in rotating disks and rotors.

  15. An architecture for the development of real-time fault diagnosis systems using model-based reasoning

    NASA Technical Reports Server (NTRS)

    Hall, Gardiner A.; Schuetzle, James; Lavallee, David; Gupta, Uday

    1992-01-01

    Presented here is an architecture for implementing real-time, telemetry-based diagnostic systems using model-based reasoning. First, we describe Paragon, a knowledge acquisition tool for offline entry and validation of physical system models. Paragon provides domain experts with a structured editing capability to capture a physical component's structure, behavior, and causal relationships. We next describe the architecture of the run-time diagnostic system. The diagnostic system, written entirely in Ada, uses the behavioral model developed offline with Paragon to simulate expected component states as reflected in the telemetry stream. The diagnostic algorithm traces causal relationships contained within the model to isolate system faults. Since the diagnostic process relies exclusively on the behavioral model and is implemented without the use of heuristic rules, it can be used to isolate unpredicted faults in a wide variety of systems. Finally, we discuss the implementation of a prototype system constructed using this technique for diagnosing faults in a science instrument. The prototype demonstrates the use of model-based reasoning to develop maintainable systems with greater diagnostic capabilities at a lower cost.

  16. Improving team information sharing with a structured call-out in anaesthetic emergencies: a randomized controlled trial.

    PubMed

    Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C

    2014-06-01

    Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and whether this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique in which nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend toward improved information-probe sharing and medical management in the intervention group and, across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis.

  17. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints, identifying optimal sensor suite solutions by means of a merit (i.e., cost) function combined with one of several available optimization approaches; a toy illustration is sketched below. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed for and applied to liquid rocket engines; it was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
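
    As a toy illustration of merit-function-based sensor selection in the spirit of S4 (sensor names, fault-coverage sets, and costs are all invented; the real tool evaluates diagnostic performance models and supports several search techniques):

      from itertools import combinations

      # Invented fault-coverage sets and costs for five gas-path sensors.
      detect = {"N1": {1, 2}, "EGT": {2, 3}, "Wf": {1, 4}, "P3": {3, 4}, "T25": {2}}
      cost = {"N1": 1.0, "EGT": 1.0, "Wf": 2.0, "P3": 1.5, "T25": 0.5}

      def merit(suite):
          # Faults covered by the suite, penalized by total sensor cost.
          covered = set().union(*(detect[s] for s in suite))
          return len(covered) - 0.2 * sum(cost[s] for s in suite)

      best = max(combinations(detect, 2), key=merit)
      print(best, round(merit(best), 2))       # ('N1', 'P3') 3.5: full coverage at low cost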

  18. Case-Deletion Diagnostics for Maximum Likelihood Multipoint Quantitative Trait Locus Linkage Analysis

    PubMed Central

    Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.

    2009-01-01

    Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model-fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect the results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity) and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed that EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the LOD score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086

  19. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real-time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper addresses the features and limitations of ADAM, with particular focus on accident simulation and management.

  20. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    PubMed

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions, and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools, such as segmentation and meshing algorithms, are involved in this workflow. In this paper we present the Medical Simulation Markup Language (MSML), a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it describes not only the final biomechanical simulation, but also the workflow by which the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness, and ease-of-use of the approach.

  1. Script-theory virtual case: A novel tool for education and research.

    PubMed

    Hayward, Jake; Cheung, Amandy; Velji, Alkarim; Altarejos, Jenny; Gill, Peter; Scarfe, Andrew; Lewis, Melanie

    2016-11-01

    Context/Setting: The script theory of diagnostic reasoning proposes that clinicians evaluate cases in the context of an "illness script," iteratively testing internal hypotheses against new information until eventually reaching a diagnosis. We present a novel tool for teaching diagnostic reasoning to undergraduate medical students based on an adaptation of script theory. We developed a virtual patient case that used clinically authentic audio and video, interactive three-dimensional (3D) body images, and a simulated electronic medical record. Next, we used interactive slide bars to record respondents' likelihood estimates of diagnostic possibilities at various stages of the case. Responses were dynamically compared to data from expert clinicians and peers. Comparative frequency distributions were presented to the learner, and final diagnostic likelihood estimates were analyzed. Detailed student feedback was collected. Over two academic years, 322 students participated. Student diagnostic likelihood estimates were similar year to year, but were consistently different from expert clinician estimates. Student feedback was overwhelmingly positive: students found the case novel, innovative, clinically authentic, and a valuable learning experience. We demonstrate the successful implementation of a novel approach to teaching diagnostic reasoning. Future study may delineate reasoning processes associated with differences between novice and expert responses.

  2. A Sensor-Based Method for Diagnostics of Machine Tool Linear Axes.

    PubMed

    Vogl, Gregory W; Weiss, Brian A; Donmez, M Alkan

    2015-01-01

    A linear axis is a vital subsystem of machine tools, which are vital systems within many manufacturing operations. When installed and operating within a manufacturing facility, a machine tool needs to stay in good condition for parts production. All machine tools degrade during operations, yet knowledge of that degradation is elusive; specifically, accurately detecting degradation of linear axes is a manual and time-consuming process. Thus, manufacturers need automated and efficient methods to diagnose the condition of their machine tool linear axes without disruptions to production. The Prognostics and Health Management for Smart Manufacturing Systems (PHM4SMS) project at the National Institute of Standards and Technology (NIST) developed a sensor-based method to quickly estimate the performance degradation of linear axes. The multi-sensor-based method uses data collected from a 'sensor box' to identify changes in linear and angular errors due to axis degradation; the sensor box contains inclinometers, accelerometers, and rate gyroscopes to capture this data. The sensors are expected to be cost effective with respect to savings in production losses and scrapped parts for a machine tool. Numerical simulations, based on sensor bandwidth and noise specifications, show that changes in straightness and angular errors could be known with acceptable test uncertainty ratios. If a sensor box resides on a machine tool and data is collected periodically, then the degradation of the linear axes can be determined and used for diagnostics and prognostics to help optimize maintenance, production schedules, and ultimately part quality.

  3. A Sensor-Based Method for Diagnostics of Machine Tool Linear Axes

    PubMed Central

    Vogl, Gregory W.; Weiss, Brian A.; Donmez, M. Alkan

    2017-01-01

    A linear axis is a vital subsystem of machine tools, which are vital systems within many manufacturing operations. When installed and operating within a manufacturing facility, a machine tool needs to stay in good condition for parts production. All machine tools degrade during operations, yet knowledge of that degradation is elusive; specifically, accurately detecting degradation of linear axes is a manual and time-consuming process. Thus, manufacturers need automated and efficient methods to diagnose the condition of their machine tool linear axes without disruptions to production. The Prognostics and Health Management for Smart Manufacturing Systems (PHM4SMS) project at the National Institute of Standards and Technology (NIST) developed a sensor-based method to quickly estimate the performance degradation of linear axes. The multi-sensor-based method uses data collected from a 'sensor box' to identify changes in linear and angular errors due to axis degradation; the sensor box contains inclinometers, accelerometers, and rate gyroscopes to capture this data. The sensors are expected to be cost effective with respect to savings in production losses and scrapped parts for a machine tool. Numerical simulations, based on sensor bandwidth and noise specifications, show that changes in straightness and angular errors could be known with acceptable test uncertainty ratios. If a sensor box resides on a machine tool and data is collected periodically, then the degradation of the linear axes can be determined and used for diagnostics and prognostics to help optimize maintenance, production schedules, and ultimately part quality. PMID:28691039

  4. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant is to capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool, intended for the Nucleus family of multichannel cochlear implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphical displays are available for detailed examination of the captured CI signals, as is an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of cochlear implant systems.

  5. Using hypermedia to develop an intelligent tutorial/diagnostic system for the Space Shuttle Main Engine Controller Lab

    NASA Technical Reports Server (NTRS)

    Oreilly, Daniel; Williams, Robert; Yarborough, Kevin

    1988-01-01

    This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capability able to isolate lab failures at least to the level of the major lab component. The system was implemented using Hypercard, a hypermedia program running on Apple Macintosh computers. Hypercard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic Hypercard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter, written in the Hypercard language (Hypertalk), to implement the backward-chaining rule-based logic commonly found in diagnostic systems written in Prolog. The advantages of Hypercard for developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its abilities as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on card size, the lack of color support, the lack of grey-scale graphics support, and the lack of selectable fonts for text fields.

  6. Simulated color: a diagnostic tool for skin lesions like port-wine stain

    NASA Astrophysics Data System (ADS)

    Randeberg, Lise L.; Svaasand, Lars O.

    2001-05-01

    A device-independent method for skin color visualization has been developed. Colors reconstructed from a reflectance spectrum are presented on a computer screen by sRGB (standard Red Green Blue) color coordinates. The colors are presented as adjacent patches surrounded by a medium grey border. CIELAB color coordinates and the CIE (International Commission on Illumination) color difference ΔE are computed. The change in skin color due to a change in average blood content or scattering properties in dermis is investigated. This is done by analytical simulations based on the diffusion approximation. It is found that an 11% change in average blood content and a 15% change in scattering properties will give a visible color change. A visibility limit for ΔE is proposed, based on experimental testing and the known properties of the human visual system. This limit value can be used as a tool to determine when to terminate laser treatment of port-wine stain due to low treatment response, i.e., low ΔE between treatments. The visualization method presented seems promising for medical applications such as port-wine stain diagnostics. Because it is device independent, the method also offers good possibilities for electronic transfer of data between clinics.
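
    The abstract does not state which color-difference formula is used; the classic CIE76 definition in CIELAB coordinates is

      \[ \Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}, \]

    so a computed ΔE below the visibility limit corresponds to a color change too small to perceive.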

  7. An Intelligent Computer-aided Training System (CAT) for Diagnosing Adult Illiterates: Integrating NASA Technology into Workplace Literacy

    NASA Technical Reports Server (NTRS)

    Yaden, David B., Jr.

    1991-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application being developed is The Adult Literacy Evaluator, a simulation-based diagnostic tool designed to assess the operant literacy abilities of adults having difficulties in learning to read and write. Using Intelligent Computer-Aided Training (ICAT) system technology in addition to speech recognition, closed-captioned television (CCTV), live video, and other state-of-the-art graphics and storage capabilities, this project attempts to overcome the negative effects of adult literacy assessment by allowing the client to interact with an intelligent computer system which simulates real-life literacy activities and materials and which measures literacy performance in the actual context of its use. The specific objectives of the project are as follows: (1) to develop a simulation-based diagnostic tool to assess adults' prior knowledge about reading and writing processes in actual contexts of application; (2) to provide a profile of readers' strengths and weaknesses; and (3) to suggest instructional strategies and materials which can be used as a beginning point for remediation. In the first, development phase of the project, descriptions of literacy events and environments are being written and functional literacy documents analyzed for their components. From these descriptions, scripts are being generated which define the interaction between the student, an on-screen guide, and the simulated literacy environment.

  8. Knee Arthroscopy Simulation: A Randomized Controlled Trial Evaluating the Effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) Tool.

    PubMed

    Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M

    2017-10-04

    Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool, which utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who had performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool, and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants rated the tool subjectively using the Likert rating scale. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); this corresponds to 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the control group, and an improvement of 23.4% (8.9 of 38). All participants agreed that the cognitive task analysis learning tool was a useful training adjunct to learning in the operating room. To our knowledge, this is the first cognitive task analysis in diagnostic knee arthroscopy that is user-friendly and inexpensive and has demonstrated significant benefits in training. The IKACTA will provide trainees with a demonstrably strong foundation in diagnostic knee arthroscopy that will flatten learning curves in both technical skills and decision-making.

  9. Modelling the impacts of new diagnostic tools for tuberculosis in developing countries to enhance policy decisions.

    PubMed

    Langley, Ivor; Doulla, Basra; Lin, Hsien-Ho; Millington, Kerry; Squire, Bertie

    2012-09-01

    The introduction and scale-up of new tools for the diagnosis of Tuberculosis (TB) in developing countries has the potential to make a huge difference to the lives of millions of people living in poverty. To achieve this, policy makers need the information to make the right decisions about which new tools to implement and where in the diagnostic algorithm to apply them most effectively. These decisions are difficult as the new tools are often expensive to implement and use, and the health system and patient impacts uncertain, particularly in developing countries where there is a high burden of TB. The authors demonstrate that a discrete event simulation model could play a significant part in improving and informing these decisions. The feasibility of linking the discrete event simulation to a dynamic epidemiology model is also explored in order to take account of longer term impacts on the incidence of TB. Results from two diagnostic districts in Tanzania are used to illustrate how the approach could be used to improve decisions.
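
    A minimal discrete-event sketch of a single diagnostic station with queueing (all rates are invented; the study's district-level model is far more detailed and is linked to an epidemiological model):

      import heapq, random

      random.seed(1)
      t, arrivals = 0.0, []
      for pid in range(100):                   # Poisson arrivals, mean gap 30 min
          t += random.expovariate(1 / 30.0)
          heapq.heappush(arrivals, (t, pid))

      free_at, waits = 0.0, []
      while arrivals:                          # single smear-microscopy station
          arrive, pid = heapq.heappop(arrivals)
          start = max(arrive, free_at)
          free_at = start + 20.0               # 20-minute test turnaround
          waits.append(start - arrive)

      print(f"mean wait: {sum(waits) / len(waits):.1f} min")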

  10. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  11. Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2012-01-09

    This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX), aimed at increasing the quality and accuracy of energy analysis tools for the building retrofit market.

  12. The Impact of Measurement Noise in GPA Diagnostic Analysis of a Gas Turbine Engine

    NASA Astrophysics Data System (ADS)

    Ntantis, Efstratios L.; Li, Y. G.

    2013-12-01

    The performance diagnostic analysis of a gas turbine is accomplished by estimating a set of internal engine health parameters from available sensor measurements. No physical measuring instrument, however, can completely eliminate measurement uncertainties. Sensor measurements are often distorted by noise and bias, leading to inaccurate estimation results. This paper explores the impact of measurement noise on gas turbine gas path analysis (GPA). The analysis is demonstrated with a test case in which the gas turbine performance simulation and diagnostics code TURBOMATCH is used to build a performance model of an engine similar to the Rolls-Royce Trent 500 turbofan, and to carry out the diagnostic analysis in the presence of different levels of measurement noise. Finally, to improve the reliability of the diagnostic results, a statistical analysis of the data scatter caused by sensor uncertainties is performed. The diagnostic tool used for this statistical analysis of measurement-noise impact is a model-based method utilizing non-linear GPA.
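
    As a minimal sketch of the noise treatment described, Gaussian scatter with per-sensor standard deviations can be superimposed on clean gas-path measurements before running the diagnosis repeatedly (the values below are illustrative, not the paper's engine data):

      import numpy as np

      clean = np.array([1.62, 780.0, 30.5])    # illustrative pressure ratio, T (K), mass flow
      sigma = np.array([0.005, 2.0, 0.15])     # assumed per-sensor noise standard deviations
      noisy = clean + sigma * np.random.standard_normal((1000, 3))
      print(noisy.mean(axis=0))                # scatter statistics over 1000 noisy samples
      print(noisy.std(axis=0))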

  13. Continued Development of Expert System Tools for NPSS Engine Diagnostics

    NASA Technical Reports Server (NTRS)

    Lewandowski, Henry

    1996-01-01

    The objectives of this grant were to work with previously developed NPSS (Numerical Propulsion System Simulation) tools and enhance their functionality; explore similar AI systems; and work with the High Performance Computing Communication (HPCC) K-12 program. Activities for this reporting period are briefly summarized and a paper addressing the implementation, monitoring and zooming in a distributed jet engine simulation is included as an attachment.

  14. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors. The performance measures include error analysis, the coefficient of determination, the Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back to time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool ...
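
    Of the goodness-of-fit measures listed, the Nash-Sutcliffe efficiency has a compact standard definition; a minimal sketch follows (the weighted rank method and the magnitude/sequence decomposition are MPESA-specific and not reproduced here):

      import numpy as np

      def nse(obs, sim):
          # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      print(nse([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))   # 0.98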

  15. ROC curve analyses of eyewitness identification decisions: An analysis of the recent debate.

    PubMed

    Rotello, Caren M; Chen, Tina

    2016-01-01

    How should the accuracy of eyewitness identification decisions be measured, so that best practices for identification can be determined? This fundamental question is under intense debate. One side advocates continued use of a traditional measure of identification accuracy, known as the diagnosticity ratio, whereas the other side argues that receiver operating characteristic curves (ROCs) should be used instead, because diagnosticity is confounded with response bias. Diagnosticity proponents have offered several criticisms of ROCs, which we show are either false or irrelevant to the assessment of eyewitness accuracy. We also show that, like diagnosticity, Bayesian measures of identification accuracy confound response bias with witnesses' ability to discriminate guilty from innocent suspects. ROCs are an essential tool for distinguishing memory-based processes from decisional aspects of a response; simulations of different possible identification tasks and response strategies show that they offer important constraints on theory development.
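
    A minimal sketch of the diagnosticity ratio and of the response-bias confound at the center of the debate (the counts are invented):

      def diagnosticity(guilty_ids, guilty_n, innocent_ids, innocent_n):
          # Correct IDs of guilty suspects over false IDs of innocent suspects.
          return (guilty_ids / guilty_n) / (innocent_ids / innocent_n)

      # A more conservative criterion raises the ratio even though fewer culprits
      # are identified, which is the confound with response bias:
      print(diagnosticity(60, 100, 10, 100))   # liberal responding:      6.0
      print(diagnosticity(30, 100, 2, 100))    # conservative responding: 15.0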

  16. Numerical simulation of the geographical sources of water for Continental Scale Experiments (CSEs) Precipitation

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Sud, Yogesh; Schubert, Siegfried D.; Walker, Gregory K.

    2003-01-01

    There are several important research questions that the Global Energy and Water Cycle Experiment (GEWEX) is actively pursuing, namely: What is the intensity of the water cycle and how does it change? And what is the sustainability of water resources? Much of the research addressing these questions is directed at understanding the atmospheric water cycle. In this paper, we have used a new diagnostic tool, called Water Vapor Tracers (WVTs), to quantify how much precipitation originated as continental or oceanic evaporation. This shows how long water can remain in the atmosphere and how far it can travel. The model-simulated data are analyzed over regions of interest to the GEWEX community, specifically the Continental Scale Experiments (CSEs) in place in the United States, Europe, Asia, Brazil, Africa, and Canada. The paper presents quantitative data on how much water each continent and ocean on Earth supplies to each CSE. Furthermore, the analysis shows the seasonal variation of the water sources. For example, in the United States, summertime precipitation is dominated by continental (land surface) sources of water, while wintertime precipitation is dominated by Pacific Ocean sources. We also analyze the residence time of water in the atmosphere. The new diagnostic shows a longer residence time for water (9.2 days) than more traditional estimates (7.5 days). We emphasize that the results are based on model simulations and depend on the model's veracity. However, there are many potential uses for the new diagnostic tool in understanding weather processes at large and small scales.
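
    The traditional estimate that the authors compare against is the bulk residence time, the ratio of column-integrated (precipitable) water W to precipitation rate P,

      \[ \tau = \frac{W}{P}, \]

    whereas the WVT diagnostic follows tagged water through the model explicitly, which here yields the longer 9.2-day estimate.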

  17. Adding Four-Dimensional Data Assimilation (aka grid ...

    EPA Pesticide Factsheets

    Adding four-dimensional data assimilation (a.k.a. grid nudging) to MPAS. The U.S. Environmental Protection Agency is investigating the use of MPAS as the meteorological driver for its next-generation air quality model. To function as such, MPAS needs to operate in a diagnostic mode in much the same manner as the current meteorological driver, the Weather Research and Forecasting (WRF) model. The WRF operates in diagnostic mode using Four-Dimensional Data Assimilation (FDDA), also known as "grid nudging". MPAS version 4.0 has been modified with the addition of an FDDA routine to the standard physics drivers to nudge the state variables for wind, temperature and water vapor towards MPAS initialization fields defined at 6-hour intervals from GFS-derived data. The results to be shown demonstrate the ability to constrain MPAS simulations to known historical conditions and thus provide the U.S. EPA with a practical meteorological driver for global-scale air quality simulations. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
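
    Grid nudging itself is a simple Newtonian relaxation added to the model tendencies. A schematic sketch follows; the relaxation coefficient and time step are illustrative, not MPAS's actual configuration:

        import numpy as np

        def nudging_tendency(x, x_analysis, g=3.0e-4):
            # Relax the model state toward the analysis; g (1/s) sets the
            # relaxation timescale (~1/g, here about an hour). Illustrative value.
            return g * (x_analysis - x)

        dt = 90.0                          # model time step, s (illustrative)
        t = np.array([285.0, 290.0])       # model temperature, K
        t_an = np.array([286.0, 289.0])    # analysis interpolated to this time
        t = t + dt * nudging_tendency(t, t_an)   # one forward-Euler update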

  18. Recent gyrokinetic turbulence insights with GENE and direct comparison with experimental measurements

    NASA Astrophysics Data System (ADS)

    Goerler, Tobias

    2017-10-01

    Throughout the last years, direct comparisons between gyrokinetic turbulence simulations and experimental measurements have intensified substantially. Such studies are largely motivated by the urgent need for reliable transport predictions for future burning plasma devices and the associated necessity of validating the numerical tools. On the other hand, they can help assess the way a particular diagnostic experiences turbulence, and can provide ideas for further optimization and access to physics that may not yet be measurable. Here, synthetic diagnostics, i.e. models that mimic the spatial and sometimes temporal response of the experimental diagnostic, play an important role. In the contribution at hand, we focus on recent gyrokinetic GENE simulations dedicated to ASDEX Upgrade L-mode plasmas and comparisons with various turbulence measurements. Particular emphasis is given to density fluctuation spectra, which are experimentally accessible via Doppler reflectometry. A sophisticated synthetic diagnostic involving a full-wave code has recently been established and resolves the long-standing question of the different spectral roll-overs in gyrokinetic and measured spectra, as well as the potentially different power laws in the O- and X-mode signals. The demonstrated agreement furthermore extends the validation database deep into spectral space and confirms a proper coverage of the turbulence cascade physics. The flux-matched GENE simulations are then used to study the sensitivity of the latter to the main microinstability drive and to investigate the energetics at the various scales. Additionally, electron-scale-turbulence-based modifications of the high-k power law spectra in such plasmas will be presented and their visibility in measurable signals discussed.

  19. Sequential simulation (SqS) of clinical pathways: a tool for public and patient engagement in point-of-care diagnostics

    PubMed Central

    Huddy, Jeremy R; Weldon, Sharon-Marie; Ralhan, Shvaita; Painter, Tim; Hanna, George B; Kneebone, Roger; Bello, Fernando

    2016-01-01

    Objectives Public and patient engagement (PPE) is fundamental to healthcare research. To facilitate effective engagement in novel point-of-care tests (POCTs), the test and the downstream consequences of the result need to be considered. Sequential simulation (SqS) is a tool to represent patient journeys and the effects of intervention at each and subsequent stages. This case study presents a process evaluation of SqS as a tool for PPE in the development of a volatile organic compound-based breath test POCT for the diagnosis of oesophagogastric (OG) cancer. Setting Three 3-hour workshops in central London. Participants 38 members of the public attended a workshop; 26 (68%) had no prior experience of the OG cancer diagnostic pathway. Interventions Clinical pathway SqS was developed from a storyboard of a patient, played by an actor, noticing symptoms of oesophageal cancer and following a typical diagnostic pathway. The proposed breath testing strategy was then introduced and incorporated into a second SqS to demonstrate pathway impact. Facilitated group discussions followed each SqS. Primary and secondary outcome measures Evaluation was conducted through pre-event and post-event questionnaires, field notes and analysis of audiovisual recordings. Results 38 participants attended a workshop. All participants agreed they were able to contribute to discussions and liked the idea of an OG cancer breath test. Five themes emerged related to the proposed new breath test, including awareness of OG cancer, barriers to testing and diagnosis, design of the new test device, the new clinical pathway and placement of the test device. Three themes emerged related to the use of SqS: participatory engagement, simulation and empathetic engagement, and why participants attended. Conclusions SqS facilitated a shared immersive experience for participants and researchers that led to the co-construction of knowledge that will guide future research activities and be of value to stakeholders concerned with the invention and adoption of POCTs. PMID:27625053

  20. Space applications of artificial intelligence; Proceedings of the Annual Goddard Conference, Greenbelt, MD, May 16, 17, 1989

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Editor); Dent, Carolyn P. (Editor)

    1989-01-01

    Theoretical and implementation aspects of AI systems for space applications are discussed in reviews and reports. Sections are devoted to planning and scheduling, fault isolation and diagnosis, data management, modeling and simulation, and development tools and methods. Particular attention is given to a situated reasoning architecture for space repair and replace tasks, parallel plan execution with self-processing networks, the electrical diagnostics expert system for Spacelab life-sciences experiments, diagnostic tolerance for missing sensor data, the integration of perception and reasoning in fast neural modules, a connectionist model for dynamic control, and applications of fuzzy sets to the development of rule-based expert systems.

  1. Virtual reality based surgical assistance and training system for long duration space missions.

    PubMed

    Montgomery, K; Thonier, G; Stephanides, M; Schendel, S

    2001-01-01

    Access to medical care during long duration space missions is extremely important. Numerous unanticipated medical problems will need to be addressed promptly and efficiently. Although telemedicine provides a convenient tool for remote diagnosis and treatment, it is impractical due to the long signal delay between the spacecraft and Earth. While a well-trained surgeon-internist-astronaut would be an essential addition to the crew, the vast number of potential medical problems necessitates instant access to computerized, skill-enhancing and diagnostic tools. A functional prototype of a virtual reality based surgical training and assistance tool was created at our center, using low-power, small, lightweight components that would be easy to transport on a space mission. The system consists of a tracked, head-mounted display, a computer system, and a number of tracked surgical instruments. The software provides a real-time surgical simulation system with integrated monitoring and information retrieval and a voice input/output subsystem. Initial medical content for the system has been created, comprising craniofacial, hand, inner ear, and general anatomy, as well as information on a number of surgical procedures and techniques. One surgical specialty in particular, microsurgery, was provided as a full simulation due to its long training requirements, the significant impact of experience on outcomes, and the likelihood of need. However, the system is easily adapted to realistically simulate a large number of other surgical procedures. By providing a general system for surgical simulation and assistance, the astronaut-surgeon can maintain their skills, acquire new specialty skills, and use tools for computer-based surgical planning and assistance to minimize overall crew and mission risk.

  2. Measuring the impact of diagnostic decision support on the quality of clinical decision making: development of a reliable and valid composite score.

    PubMed

    Ramnarayan, Padmanabhan; Kapoor, Ritika R; Coren, Michael; Nanduri, Vasantha; Tomlinson, Amanda L; Taylor, Paul M; Wyatt, Jeremy C; Britto, Joseph F

    2003-01-01

    Few previous studies evaluating the benefits of diagnostic decision support systems have simultaneously measured changes in diagnostic quality and clinical management prompted by use of the system. This report describes a reliable and valid scoring technique to measure the quality of clinical decision plans in an acute medical setting, where diagnostic decision support tools might prove most useful. Sets of differential diagnoses and clinical management plans generated by 71 clinicians for six simulated cases, before and after decision support from a Web-based pediatric differential diagnostic tool (ISABEL), were used. A composite quality score was calculated separately for each diagnostic and management plan by considering the appropriateness value of each component diagnostic or management suggestion, a weighted sum of individual suggestion ratings, relevance of the entire plan, and its comprehensiveness. The reliability and validity (face, concurrent, construct, and content) of these two final scores were examined. Two hundred fifty-two diagnostic and 350 management suggestions were included in the interrater reliability analysis. There was good agreement between raters (intraclass correlation coefficient, 0.79 for diagnoses, and 0.72 for management). No counterintuitive scores were demonstrated on visual inspection of the sets. Content validity was verified by a consultation process with pediatricians. Both scores discriminated adequately between the plans of consultants and medical students and correlated well with clinicians' subjective opinions of overall plan quality (Spearman rho 0.65, p < 0.01). The diagnostic and management scores for each episode showed moderate correlation (r = 0.51). The scores described can be used as key outcome measures in a larger study to fully assess the value of diagnostic decision aids, such as the ISABEL system.

  3. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  4. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    DOE PAGES

    Sippel, Sebastian; Lange, Holger; Mahecha, Miguel D.; ...

    2016-10-20

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. Here we demonstrate that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics.
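
    Information Theory Quantifiers of this kind are commonly built from Bandt-Pompe ordinal patterns; the minimal permutation-entropy sketch below illustrates the idea (a generic construction, assumed here for illustration rather than taken from the paper's exact implementation):

        import numpy as np
        from itertools import permutations
        from math import factorial, log

        def permutation_entropy(x, m=3, tau=1):
            # Normalized Bandt-Pompe entropy of ordinal patterns of length m.
            # Near 1 for white noise; lower for regular, predictable dynamics.
            x = np.asarray(x, float)
            counts = {p: 0 for p in permutations(range(m))}
            n = len(x) - (m - 1) * tau
            for i in range(n):
                counts[tuple(np.argsort(x[i:i + m * tau:tau]))] += 1
            p = np.array([c for c in counts.values() if c > 0]) / n
            return float(-np.sum(p * np.log(p)) / log(factorial(m)))

        rng = np.random.default_rng(0)
        print(permutation_entropy(rng.normal(size=2000)))         # ~1.0 (noise)
        print(permutation_entropy(np.sin(np.arange(2000) / 10)))  # much lower

    Because such quantifiers depend only on the ordering of values, they probe a series' dynamics rather than its absolute magnitudes, which is why they complement error-based performance metrics.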

  5. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sippel, Sebastian; Lange, Holger; Mahecha, Miguel D.

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. Here we demonstrate that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics.

  6. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    PubMed Central

    Sippel, Sebastian; Mahecha, Miguel D.; Hauhs, Michael; Bodesheim, Paul; Kaminski, Thomas; Gans, Fabian; Rosso, Osvaldo A.

    2016-01-01

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. We demonstrate here that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics. PMID:27764187

  7. Tapered Roller Bearing Damage Detection Using Decision Fusion Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Kreider, Gary; Fichter, Thomas

    2006-01-01

    A diagnostic tool was developed for detecting fatigue damage of tapered roller bearings. Tapered roller bearings are used in helicopter transmissions and have potential for use in high bypass advanced gas turbine aircraft engines. A diagnostic tool was developed and evaluated experimentally by collecting oil debris data from failure progression tests conducted using health monitoring hardware. Failure progression tests were performed with tapered roller bearings under simulated engine load conditions. Tests were performed on one healthy bearing and three pre-damaged bearings. During each test, data from an on-line, in-line, inductance type oil debris sensor and three accelerometers were monitored and recorded for the occurrence of bearing failure. The bearing was removed and inspected periodically for damage progression throughout testing. Using data fusion techniques, two different monitoring technologies, oil debris analysis and vibration, were integrated into a health monitoring system for detecting bearing surface fatigue pitting damage. The data fusion diagnostic tool was evaluated during bearing failure progression tests under simulated engine load conditions. This integrated system showed improved detection of fatigue damage and health assessment of the tapered roller bearings as compared to using individual health monitoring technologies.
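
    The fusion step can be sketched as combining two normalized damage indicators into a single index. The weighted rule and weights below are invented for illustration; the paper's system integrates the two technologies with more sophisticated data fusion:

        def fused_damage_index(oil_debris, vibration, w_debris=0.6, w_vib=0.4):
            # oil_debris: normalized cumulative debris mass in [0, 1]
            # vibration: normalized vibration feature level in [0, 1]
            # A fused index reduces single-sensor false alarms, since both
            # indicators must rise together to signal damage.
            return w_debris * oil_debris + w_vib * vibration

        print(fused_damage_index(0.8, 0.7))  # 0.76: consistent damage evidence
        print(fused_damage_index(0.1, 0.7))  # 0.34: likely a vibration false alarm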

  8. Transmission of a Viral Disease (AIDS) Detected by a Modified ELISA Reaction: A Laboratory Simulation.

    ERIC Educational Resources Information Center

    Grimes, William J.; Chambers, Linda; Kubo, Kenneth M.; Narro, Martha L.

    1998-01-01

    Describes a laboratory exercise that simulates the spread of an infectious agent among students in a classroom. Uses a modified Enzyme Linked ImmunoSorbent Assay (ELISA) to provide students with experience using an authentic diagnostic tool for detecting human infections. (DDR)

  9. Investigating the Effectiveness of Classroom Diagnostic Tools

    ERIC Educational Resources Information Center

    Schultz, Robert K.

    2012-01-01

    The primary purposes of the study are to investigate what teachers experience while using the Classroom Diagnostic Tools (CDT) and to relate those experiences to the rate of growth in students' mathematics achievement. The CDT contains three components: an online computer adaptive diagnostic test, interactive web-based student reports, and…

  10. Research Priorities in the Utilization and Interpretation of Diagnostic Imaging: Education, Assessment, and Competency.

    PubMed

    Lewiss, Resa E; Chan, Wilma; Sheng, Alexander Y; Soto, Jorge; Castro, Alexandra; Meltzer, Andrew C; Cherney, Alan; Kumaravel, Manickam; Cody, Dianna; Chen, Esther H

    2015-12-01

    The appropriate selection and accurate interpretation of diagnostic imaging is a crucial skill for emergency practitioners. To date, the majority of the published literature and research on competency assessment comes from the subspecialty of point-of-care ultrasound. A group of radiologists, physicists, and emergency physicians convened at the 2015 Academic Emergency Medicine consensus conference to discuss and prioritize a research agenda related to education, assessment, and competency in ordering and interpreting diagnostic imaging. A set of questions was delineated for the continued development of an educational curriculum on diagnostic imaging for trainees and for competency assessment using specific assessment methods based on current best practices. The research priorities were developed through an iterative consensus-driven process using a modified nominal group technique that culminated in an in-person breakout session. The four recommendations are: 1) develop a diagnostic imaging curriculum for emergency medicine (EM) residency training; 2) develop, study, and validate tools to assess competency in diagnostic imaging interpretation; 3) evaluate the role of simulation in education, assessment, and competency measures for diagnostic imaging; and 4) study the American College of Radiology Appropriateness Criteria, an evidence-based peer-reviewed resource for determining the use of diagnostic imaging, to maximize its value in EM. In this article, the authors review the supporting reliability and validity evidence and make specific recommendations for future research on the education, competency, and assessment of learning diagnostic imaging. © 2015 by the Society for Academic Emergency Medicine.

  11. Behavioral Health Program Element

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.

    2006-01-01

    The project goal is to develop a behavioral health prevention and maintenance system for continued crew health, safety, and performance for exploration missions. The basic scope includes: a) operationally relevant research related to clinical cognitive and behavioral health of crewmembers; b) ground-based studies using analog environments (Antarctic, NEEMO, simulations, and other testbeds); c) ISS studies (ISSMP) focusing on operational issues related to behavioral health outcomes and standards; d) technology development activities for monitoring and diagnostic tools; and e) cross-disciplinary research (e.g., human factors and habitability research, skeletal muscle, radiation).

  12. WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  13. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    PubMed

    Mi, Gu; Di, Yanming; Schafer, Daniel W

    2015-01-01

    This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.
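
    A generic parametric-bootstrap version of such a simulation-based goodness-of-fit test can be sketched as follows. This is a minimal illustration, not the authors' implementation; the NB is parameterized so that Var = mu + alpha*mu^2, and the toy fitted means are placeholders for a real regression fit:

        import numpy as np

        def nb_bootstrap_pvalue(y, mu, alpha, stat, n_sim=999, seed=0):
            # Parametric-bootstrap GOF test: simulate NB data under the fitted
            # model and compare the observed statistic to its null distribution.
            rng = np.random.default_rng(seed)
            size = 1.0 / alpha              # numpy's n; mean mu implies p = n/(n+mu)
            p = size / (size + mu)
            t_obs = stat(y, mu, alpha)
            t_sim = np.array([stat(rng.negative_binomial(size, p), mu, alpha)
                              for _ in range(n_sim)])
            return (1 + np.sum(t_sim >= t_obs)) / (n_sim + 1)

        def pearson_stat(y, mu, alpha):
            # Pearson chi-square against the fitted NB variance function.
            return np.sum((y - mu) ** 2 / (mu + alpha * mu ** 2))

        y = np.array([0, 3, 1, 7, 2, 4, 0, 5])
        mu = np.full_like(y, y.mean(), dtype=float)   # toy fitted means
        print(nb_bootstrap_pvalue(y, mu, alpha=0.5, stat=pearson_stat))

    A small p-value indicates that the observed dispersion pattern would be unusual under the fitted NB model, flagging a misspecified mean-variance relationship.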

  14. Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed during the period of time before the launch, including accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.
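
    IMS, the anomaly-detection component, learns clusters of nominal sensor vectors and scores new vectors by their distance to the nearest cluster. A simplified sketch follows; IMS's actual clusters are hyper-boxes, so the spherical (center, radius) clusters here are an assumption for brevity:

        import numpy as np

        def anomaly_score(x, clusters):
            # clusters: list of (center, radius) pairs learned from nominal data.
            # Score is distance beyond the nearest cluster boundary; 0 = nominal.
            return min(max(0.0, np.linalg.norm(x - c) - r) for c, r in clusters)

        nominal = [(np.array([10.0, 0.5]), 1.0), (np.array([12.0, 0.8]), 1.5)]
        print(anomaly_score(np.array([10.5, 0.6]), nominal))  # 0.0 -> nominal
        print(anomaly_score(np.array([20.0, 3.0]), nominal))  # > 0 -> anomalous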

  15. Development of “-omics” research in Schistosoma spp. and -omics-based new diagnostic tools for schistosomiasis

    PubMed Central

    Wang, Shuqi; Hu, Wei

    2014-01-01

    Schistosomiasis, caused by dioecious flatworms in the genus Schistosoma, continues to afflict people in many developing countries and frequently leads to severe morbidity and mortality. The praziquantel-based chemotherapy and morbidity control currently adopted for this disease necessitate viable and efficient diagnostic technologies. Fortunately, "-omics" research, which relies on high-throughput experimental technologies to produce massive amounts of informative data, has substantially contributed to the development and innovation of diagnostic tools for schistosomiasis. In its first section, this review provides a concise summary of progress in schistosomal "-omics" research to date, followed by a comprehensive section on the diagnostic methods for schistosomiasis, especially innovative ones based on the detection of antibodies, antigens, nucleic acids, and metabolites, with a focus on achievements inspired by "-omics" research. Finally, suggestions are proposed for the design of future diagnostic tools for schistosomiasis, in order to better harness the data produced by "-omics" studies. PMID:25018752

  16. Going glass to digital: virtual microscopy as a simulation-based revolution in pathology and laboratory science.

    PubMed

    Nelson, Danielle; Ziv, Amitai; Bandali, Karim S

    2012-10-01

    The recent technological advance of digital high resolution imaging has allowed the field of pathology and medical laboratory science to undergo a dramatic transformation with the incorporation of virtual microscopy as a simulation-based educational and diagnostic tool. This transformation has correlated with an overall increase in the use of simulation in medicine in an effort to address dwindling clinical resource availability and patient safety issues currently facing the modern healthcare system. Virtual microscopy represents one such simulation-based technology that has the potential to enhance student learning and readiness to practice while revolutionising the ability to clinically diagnose pathology collaboratively across the world. While understanding that a substantial amount of literature already exists on virtual microscopy, much more research is still required to elucidate the full capabilities of this technology. This review explores the use of virtual microscopy in medical education and disease diagnosis with a unique focus on key requirements needed to take this technology to the next level in its use in medical education and clinical practice.

  17. Republished: going glass to digital: virtual microscopy as a simulation-based revolution in pathology and laboratory science.

    PubMed

    Nelson, Danielle; Ziv, Amitai; Bandali, Karim S

    2013-10-01

    The recent technological advance of digital high resolution imaging has allowed the field of pathology and medical laboratory science to undergo a dramatic transformation with the incorporation of virtual microscopy as a simulation-based educational and diagnostic tool. This transformation has correlated with an overall increase in the use of simulation in medicine in an effort to address dwindling clinical resource availability and patient safety issues currently facing the modern healthcare system. Virtual microscopy represents one such simulation-based technology that has the potential to enhance student learning and readiness to practice while revolutionising the ability to clinically diagnose pathology collaboratively across the world. While understanding that a substantial amount of literature already exists on virtual microscopy, much more research is still required to elucidate the full capabilities of this technology. This review explores the use of virtual microscopy in medical education and disease diagnosis with a unique focus on key requirements needed to take this technology to the next level in its use in medical education and clinical practice.

  18. Synthetic neutron camera and spectrometer in JET based on AFSI-ASCOT simulations

    NASA Astrophysics Data System (ADS)

    Sirén, P.; Varje, J.; Weisen, H.; Koskela, T.; contributors, JET

    2017-09-01

    The ASCOT Fusion Source Integrator (AFSI) has been used to calculate neutron production rates and spectra corresponding to the JET 19-channel neutron camera (KN3) and the time-of-flight spectrometer (TOFOR) as ideal diagnostics, without detector-related effects. AFSI calculates fusion product distributions in 4D, based on Monte Carlo integration from arbitrary reactant distribution functions. The distribution functions were calculated by the ASCOT Monte Carlo particle orbit following code for thermal, NBI and ICRH particle reactions. Fusion cross-sections were defined based on the Bosch-Hale model and both DD and DT reactions have been included. Neutrons generated by AFSI-ASCOT simulations have already been applied as a neutron source of the Serpent neutron transport code in ITER studies. Additionally, AFSI has been selected to be a main tool as the fusion product generator in the complete analysis calculation chain: ASCOT - AFSI - SERPENT (neutron and gamma transport Monte Carlo code) - APROS (system and power plant modelling code), which encompasses the plasma as an energy source, heat deposition in plant structures as well as cooling and balance-of-plant in DEMO applications and other reactor relevant analyses. This conference paper presents the first results and validation of the AFSI DD fusion model for different auxiliary heating scenarios (NBI, ICRH) with very different fast particle distribution functions. Both calculated quantities (production rates and spectra) have been compared with experimental data from KN3 and synthetic spectrometer data from ControlRoom code. No unexplained differences have been observed. In future work, AFSI will be extended for synthetic gamma diagnostics and additionally, AFSI will be used as part of the neutron transport calculation chain to model real diagnostics instead of ideal synthetic diagnostics for quantitative benchmarking.
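
    The core AFSI operation, Monte Carlo integration of fusion reaction rates from arbitrary reactant distributions, can be sketched generically. This is a schematic of rate-coefficient estimation under stated assumptions, not AFSI's actual code; the sampler, temperature, and cross-section function are all illustrative:

        import numpy as np

        def sigma_v(sample_v1, sample_v2, sigma, n=100_000, seed=0):
            # Monte Carlo estimate of the rate coefficient <sigma * v_rel>:
            # draw velocity samples (m/s, shape (n, 3)) from each reactant
            # distribution and average cross-section times relative speed.
            rng = np.random.default_rng(seed)
            v_rel = np.linalg.norm(sample_v1(rng, n) - sample_v2(rng, n), axis=1)
            return np.mean(sigma(v_rel) * v_rel)

        def maxwellian(T_eV, mass_kg):
            # Isotropic thermal velocity sampler (illustrative).
            v_th = np.sqrt(1.602e-19 * T_eV / mass_kg)
            return lambda rng, n: rng.normal(0.0, v_th, size=(n, 3))

        d_mass = 3.344e-27                        # deuteron mass, kg
        thermal_d = maxwellian(5000.0, d_mass)    # 5 keV ions, illustrative
        # Sanity check with unit cross-section: result is the mean relative speed.
        print(sigma_v(thermal_d, thermal_d, lambda v: 1.0))
        # The volumetric rate then follows as R = n1 * n2 * <sigma * v_rel>
        # (with the usual 1/2 factor for identical reactants such as DD).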

  19. System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann

    2003-01-01

    A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System), do not attempt to isolate the cause of the failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code. The quantitative model also requires processing code, though less extensive. Examples of fault diagnoses are given.

  20. Flexible Simulation E-Learning Environment for Studying Digital Circuits and Possibilities for It Deployment as Semantic Web Service

    ERIC Educational Resources Information Center

    Radoyska, P.; Ivanova, T.; Spasova, N.

    2011-01-01

    In this article we present a partially realized project for building a distributed learning environment for studying digital circuit test and diagnostics at TU-Sofia. We describe the main requirements for this environment, substantiate the choice of developer platform, and present our simulation and circuit parameter calculation tools.…

  1. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  2. Sequential simulation (SqS) of clinical pathways: a tool for public and patient engagement in point-of-care diagnostics.

    PubMed

    Huddy, Jeremy R; Weldon, Sharon-Marie; Ralhan, Shvaita; Painter, Tim; Hanna, George B; Kneebone, Roger; Bello, Fernando

    2016-09-13

    Public and patient engagement (PPE) is fundamental to healthcare research. To facilitate effective engagement in novel point-of-care tests (POCTs), the test and the downstream consequences of the result need to be considered. Sequential simulation (SqS) is a tool to represent patient journeys and the effects of intervention at each and subsequent stages. This case study presents a process evaluation of SqS as a tool for PPE in the development of a volatile organic compound-based breath test POCT for the diagnosis of oesophagogastric (OG) cancer. Three 3-hour workshops in central London. 38 members of the public attended a workshop; 26 (68%) had no prior experience of the OG cancer diagnostic pathway. Clinical pathway SqS was developed from a storyboard of a patient, played by an actor, noticing symptoms of oesophageal cancer and following a typical diagnostic pathway. The proposed breath testing strategy was then introduced and incorporated into a second SqS to demonstrate pathway impact. Facilitated group discussions followed each SqS. Evaluation was conducted through pre-event and post-event questionnaires, field notes and analysis of audiovisual recordings. 38 participants attended a workshop. All participants agreed they were able to contribute to discussions and liked the idea of an OG cancer breath test. Five themes emerged related to the proposed new breath test, including awareness of OG cancer, barriers to testing and diagnosis, design of the new test device, the new clinical pathway and placement of the test device. Three themes emerged related to the use of SqS: participatory engagement, simulation and empathetic engagement, and why participants attended. SqS facilitated a shared immersive experience for participants and researchers that led to the co-construction of knowledge that will guide future research activities and be of value to stakeholders concerned with the invention and adoption of POCTs. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  3. A defect-driven diagnostic method for machine tool spindles

    PubMed Central

    Vogl, Gregory W.; Donmez, M. Alkan

    2016-01-01

    Simple vibration-based metrics are, in many cases, insufficient to diagnose machine tool spindle condition. These metrics couple defect-based motion with spindle dynamics; diagnostics should be defect-driven. A new method and spindle condition estimation device (SCED) were developed to acquire data and to separate system dynamics from defect geometry. Based on this method, a spindle condition metric relying only on defect geometry is proposed. Application of the SCED on various milling and turning spindles shows that the new approach is robust for diagnosing the machine tool spindle condition. PMID:28065985

  4. Dynamic Evaluation of Two Decades of CMAQ Simulations ...

    EPA Pesticide Factsheets

    This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States, using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time series using the Kolmogorov-Zurbenko (KZ) filter to assess the variations in the strengths of the synoptic (weather-induced variations) and baseline (long-term variation) forcings embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than the earlier years) and the synoptic forcing in accordance with what the observations show. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
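
    The KZ filter is simply an iterated moving average; a minimal sketch of the decomposition with KZ(15, 5), a common choice for daily ozone (the synthetic series below is illustrative):

        import numpy as np

        def kz_filter(x, m=15, k=5):
            # Kolmogorov-Zurbenko filter: k passes of a centered length-m moving
            # average. The filtered series is the baseline (long-term plus
            # seasonal) component; the residual is the synoptic component.
            x = np.asarray(x, float)
            for _ in range(k):
                x = np.convolve(x, np.ones(m) / m, mode="same")
            return x

        days = np.arange(730)
        ozone = (40 + 10 * np.sin(2 * np.pi * days / 365)
                 + np.random.default_rng(1).normal(0, 5, days.size))
        baseline = kz_filter(ozone)
        synoptic = ozone - baseline   # weather-induced variations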

  5. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  6. Investigations on the structure of the extracted ion beam from an electron cyclotron resonance ion source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spaedtke, P.; Lang, R.; Maeder, J.

    2012-02-15

    Using improved beam diagnostic tools, the structure of an ion beam extracted from an electron cyclotron resonance ion source (ECRIS) becomes visible. Especially viewing targets to display the beam profile and pepper pot devices for emittance measurements turned out to be very useful. On the contrary, diagnostic tools integrating over one space coordinate, like wire harps for profile measurements or slit-slit devices, respectively slit-grid devices, to measure the emittance might be applicable for beam transport investigations in a quadrupole channel, but are not very meaningful for investigations regarding the given ECRIS symmetry. Here we try to reproduce the experimentally found structure of the ion beam by simulation. For the simulation, a certain model has to be used to reproduce the experimental results. The model is also described in this paper.

  7. Use of a Process Analysis Tool for Diagnostic Study on Fine Particulate Matter Predictions in the U.S.-Part II: Analysis and Sensitivity Simulations

    EPA Science Inventory

    Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...

  8. Comparison of Numerically Simulated and Experimentally Measured Performance of a Rotating Detonation Engine

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Fotia, Matthew L.; Hoke, John; Schauer, Fred

    2015-01-01

    A quasi-two-dimensional, computational fluid dynamic (CFD) simulation of a rotating detonation engine (RDE) is described. The simulation operates in the detonation frame of reference and utilizes a relatively coarse grid such that only the essential primary flow field structure is captured. This construction and other simplifications yield rapidly converging, steady solutions. Viscous effects, and heat transfer effects are modeled using source terms. The effects of potential inlet flow reversals are modeled using boundary conditions. Results from the simulation are compared to measured data from an experimental RDE rig with a converging-diverging nozzle added. The comparison is favorable for the two operating points examined. The utility of the code as a performance optimization tool and a diagnostic tool are discussed.

  9. NONMEMory: a run management tool for NONMEM.

    PubMed

    Wilkins, Justin J

    2005-06-01

    NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.

  10. Phytophthora database 2.0: update and future direction.

    PubMed

    Park, Bongsoo; Martin, Frank; Geiser, David M; Kim, Hye-Seon; Mansfield, Michele A; Nikolaeva, Ekaterina; Park, Sook-Young; Coffey, Michael D; Russo, Joseph; Kim, Seong H; Balci, Yilmaz; Abad, Gloria; Burgess, Treena; Grünwald, Niklaus J; Cheong, Kyeongchae; Choi, Jaeyoung; Lee, Yong-Hwan; Kang, Seogchan

    2013-12-01

    The online community resource Phytophthora database (PD) was developed to support accurate and rapid identification of Phytophthora and to help characterize and catalog the diversity and evolutionary relationships within the genus. Since its release in 2008, the sequence database has grown to cover 1 to 12 loci for ≈2,600 isolates (representing 138 described and provisional species). Sequences of multiple mitochondrial loci were added to complement nuclear loci-based phylogenetic analyses and diagnostic tool development. Key characteristics of most newly described and provisional species have been summarized. Other additions to improve the PD functionality include: (i) geographic information system tools that enable users to visualize the geographic origins of chosen isolates on a global-scale map, (ii) a tool for comparing genetic similarity between isolates via microsatellite markers to support population genetic studies, (iii) a comprehensive review of molecular diagnostics tools and relevant references, (iv) sequence alignments used to develop polymerase chain reaction-based diagnostics tools to support their utilization and new diagnostic tool development, and (v) an online community forum for sharing and preserving experience and knowledge accumulated in the global Phytophthora community. Here we present how these improvements can support users and discuss the PD's future direction.

  11. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  12. A fault injection experiment using the AIRLAB Diagnostic Emulation Facility

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Mangum, Scott; Scheper, Charlotte

    1988-01-01

    The preparation for, conduct of, and results of a simulation-based fault injection experiment conducted using the AIRLAB Diagnostic Emulation facilities are described. An objective of this experiment was to determine the effectiveness of the diagnostic self-test sequences used to uncover latent faults in a logic network providing the key fault tolerance features for a flight control computer. Another objective was to develop methods, tools, and techniques for conducting the experiment. More than 1600 faults were injected into a logic gate level model of the Data Communicator/Interstage (C/I). For each fault injected, diagnostic self-test sequences consisting of over 300 test vectors were supplied to the C/I model as inputs. For each test vector within a test sequence, the outputs from the C/I model were compared to the outputs of a fault-free C/I. If the outputs differed, the fault was considered detectable for the given test vector. These results were then analyzed to determine the effectiveness of some test sequences. The results established the coverage of the self-test diagnostics, identified areas in the C/I logic where the tests did not locate faults, and suggested fault latency reduction opportunities.
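
    The detectability bookkeeping described above reduces to a simple coverage computation. A sketch follows, where simulate and reference stand in for the faulty and fault-free C/I models (the names and the toy demo are illustrative):

        def fault_coverage(faults, test_vectors, simulate, reference):
            # A fault counts as detected if any test vector drives the faulty
            # model's outputs away from the fault-free outputs.
            detected = sum(
                any(simulate(f, v) != reference(v) for v in test_vectors)
                for f in faults)
            return detected / len(faults)

        # Toy demo with a 1-bit "model": the fault forces the output to 0.
        ref = lambda v: v
        sim = lambda fault, v: 0 if fault == "stuck0" else v
        print(fault_coverage(["stuck0", "benign"], [0, 1], sim, ref))  # 0.5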

  13. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer-based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  14. Simulation and characterization of silicon-based 0.5-MHz ultrasonic nozzles

    NASA Astrophysics Data System (ADS)

    Song, Y. L.; Tsai, S. C.; Chen, W. J.; Chou, Y. F.; Tseng, T. K.; Tsai, C. S.

    2004-01-01

    This paper compares simulation results with experimental results from impedance analysis and longitudinal vibration measurements of micro-fabricated 0.5 MHz silicon-based ultrasonic nozzles. Impedance analysis serves as a good diagnostic tool for evaluating the longitudinal vibration of the nozzles. Each nozzle is made of a piezoelectric drive section and a silicon resonator consisting of multiple Fourier horns, each designed to be half a wavelength long and to provide a twofold amplitude magnification. The experimental results verified the simulation prediction of one pure longitudinal vibration mode at the resonant frequency, in excellent agreement with the design value. Furthermore, at the resonant frequency, the measured longitudinal vibration amplitude gain at the nozzle tip increases with the number of Fourier horns (n), in good agreement with the theoretical value of 2^n. Using this design, very high vibration amplitude can be achieved at the nozzle tip with no reduction in the tip cross-sectional area. Therefore, the required electric drive power should be drastically reduced, decreasing the likelihood of transducer failure during ultrasonic atomization.
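
    As a back-of-envelope illustration of the design rule above, the sketch below computes the half-wavelength horn length at 0.5 MHz and the 2^n tip gain; the silicon sound speed is an assumed nominal value, not a figure from the paper.

```python
# Half-wavelength Fourier-horn design rule: each horn is lambda/2 long and
# doubles the amplitude, so n horns give a theoretical tip gain of 2^n.
F_DRIVE = 0.5e6      # drive frequency in Hz (0.5 MHz, as in the paper)
C_SILICON = 8433.0   # assumed longitudinal sound speed in silicon, m/s

wavelength = C_SILICON / F_DRIVE          # acoustic wavelength at resonance
horn_length = wavelength / 2.0            # each Fourier horn is a half wavelength

for n in range(1, 5):
    gain = 2 ** n                         # theoretical tip amplitude gain
    print(f"{n} horn(s): resonator length {n * horn_length * 1e3:.2f} mm, "
          f"tip amplitude gain {gain}x")
```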

  15. Tools for surveying and improving the quality of life: people with special needs in focus.

    PubMed

    Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs

    2012-01-01

    This article seeks to describe online tools for surveying and improving the quality of life of people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during a research project at the Institute for Education, University of Zurich. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability have not yet been tested according to standard statistical procedures; this will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it against best practice standards. The tools support staff in assisted living centers and special education service organizations, and offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.

  16. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threaded computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As the CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences each year since 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools, and each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA was developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described in terms of its scientific goal, the datasets and analysis tools used, the scientific results discovered, analysis outputs such as plots and data files, and a link to the exact analysis service call with all input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model against MODIS total cloud fraction. The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job of simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
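
    As a loose illustration of how such web-service-based analyses are typically invoked, the sketch below issues a hypothetical difference-plot request; the endpoint URL and parameter names are invented placeholders, not the real CMDA API.

```python
# Hypothetical sketch of calling a CMDA-style analysis web service; the endpoint
# URL and parameter names are invented placeholders, not the real CMDA API.
import requests

BASE_URL = "https://cmda.example.org/svc"   # placeholder host

params = {
    "service": "diff_plot_two_variables",   # e.g., Difference Plot Service of Two Variables
    "model_var": "NCAR_CAM5:total_cloud_fraction",
    "obs_var": "MODIS:total_cloud_fraction",
    "start": "2004-01", "end": "2005-12",
}

resp = requests.get(f"{BASE_URL}/analysis", params=params, timeout=60)
resp.raise_for_status()
result = resp.json()
# A provenance-aware service would also return the exact call used, so a student
# can revisit or share the analysis later.
print(result.get("plot_url"), result.get("provenance_id"))
```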

  17. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    PubMed

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested for assessing these computed tools theoretically: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encompasses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
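
    For case study (i), the kind of index being benchmarked can be illustrated with a minimal foot-to-foot pulse wave velocity estimate; the waveforms, path length, and foot-detection rule below are simplified assumptions, not the authors' implementation.

```python
# Minimal sketch of foot-to-foot pulse wave velocity (PWV) between two pressure
# waveforms a known distance apart. Waveforms here are synthetic; a real
# assessment would use the 1D-0D model output.
import numpy as np

FS = 1000.0        # sampling rate, Hz
DISTANCE = 0.5     # assumed carotid-femoral path length, m

t = np.arange(0, 1.0, 1 / FS)
proximal = np.exp(-((t - 0.20) / 0.05) ** 2)        # synthetic pressure pulses
distal = np.exp(-((t - 0.27) / 0.06) ** 2)          # arrives ~70 ms later

def foot_time(wave, t):
    """Locate the waveform 'foot' as the steepest point of the upstroke, a
    simple surrogate for the intersecting-tangent method used clinically."""
    d = np.gradient(wave, t)
    return t[np.argmax(d)]

transit_time = foot_time(distal, t) - foot_time(proximal, t)
pwv = DISTANCE / transit_time
print(f"transit time {transit_time*1e3:.1f} ms -> PWV {pwv:.2f} m/s")
```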

  18. An evidence-based diagnostic classification system for low back pain

    PubMed Central

    Vining, Robert; Potocki, Eric; Seidman, Michael; Morgenthal, A. Paige

    2013-01-01

    Introduction: While clinicians generally accept that musculoskeletal low back pain (LBP) can arise from specific tissues, it remains difficult to confirm specific sources. Methods: Based on evidence from diagnostic utility studies, doctors of chiropractic functioning as members of a research clinic created a diagnostic classification system, with a corresponding exam and checklist, based on strength of evidence and in-office efficiency. Results: The diagnostic classification system contains one screening category, two pain categories (nociceptive and neuropathic), one functional evaluation category, and one category for unknown or poorly defined diagnoses. The nociceptive and neuropathic pain categories are each divided into 4 subcategories. Conclusion: This article describes and discusses the strength of evidence surrounding diagnostic categories for an in-office clinical exam and checklist tool for LBP diagnosis. The use of a standardized tool for diagnosing low back pain in clinical and research settings is encouraged. PMID:23997245

  19. Evaluating the Diagnostic Validity of a Facet-Based Formative Assessment System

    ERIC Educational Resources Information Center

    DeBarger, Angela Haydel; DiBello, Louis; Minstrell, Jim; Feng, Mingyu; Stout, William; Pellegrino, James; Haertel, Geneva; Harris, Christopher; Ructinger, Liliana

    2011-01-01

    This paper describes methods for an alignment study and psychometric analyses of a formative assessment system, Diagnoser Tools for physics. Diagnoser Tools begin with facet clusters as the interpretive framework for designing questions and instructional activities. Thus each question in the diagnostic assessments includes distractors that…

  20. Synthetic Diagnostics Platform for Fusion Plasma and a Two-Dimensional Synthetic Electron Cyclotron Emission Imaging Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Lei

    Magnetic confinement fusion is one of the most promising approaches to achieving fusion energy. With the rapid increase of computational power over the past decades, numerical simulations have become an important tool for studying fusion plasmas. Eventually, numerical models will be used to predict the performance of future devices such as the International Thermonuclear Experimental Reactor (ITER) or DEMO. However, the reliability of these models needs to be carefully validated against experiments before the results can be trusted. Validation between simulations and measurements is particularly hard because the quantities directly available on the two sides are different. While simulations calculate the plasma quantities explicitly, the measurements usually take the form of diagnostic signals. The traditional way of making the comparison relies on diagnosticians to interpret the measured signals as plasma quantities. This interpretation is in general very complicated and sometimes not even unique. In contrast, given the plasma quantities from plasma simulations, we can unambiguously calculate the generation and propagation of the diagnostic signals. These calculations are called synthetic diagnostics, and they enable an alternative way to compare simulation results with measurements. In this dissertation, we present a platform for developing and applying synthetic diagnostic codes. Three diagnostics on the platform are introduced. The reflectometry and beam emission spectroscopy diagnostics measure the electron density, and the electron cyclotron emission (ECE) diagnostic measures the electron temperature. The theoretical derivation and numerical implementation of a new two-dimensional electron cyclotron emission imaging code are discussed in detail. This new code has shown the potential to address many challenging aspects of present ECE measurements, such as runaway electron effects and detection of the cross phase between electron temperature and density fluctuations.
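
    The idea of a synthetic diagnostic can be illustrated with a toy electron cyclotron emission forward model: in an optically thick plasma, each channel frequency maps to a resonance radius, where the radiation temperature approximates the local electron temperature. The sketch below uses invented machine parameters and profiles and ignores the full-wave and relativistic physics the dissertation treats.

```python
# Toy synthetic ECE sketch: for an optically thick plasma the measured radiation
# temperature at frequency f approximates Te at the radius where f matches the
# n-th harmonic of the electron cyclotron frequency, f_ce = eB/(2*pi*m_e), with
# B ~ B0*R0/R on the midplane. Profiles and machine parameters are invented.
import numpy as np

E, M_E = 1.602e-19, 9.109e-31
B0, R0 = 2.0, 1.7                  # assumed on-axis field (T) and major radius (m)
HARMONIC = 2                       # second-harmonic X-mode, a common ECE choice

def te_profile(R):                 # assumed parabolic Te profile, keV
    r = np.clip(np.abs(R - R0) / 0.5, 0.0, 1.0)
    return 3.0 * (1.0 - r**2)

def synthetic_ece(freqs_ghz):
    """Map each channel frequency to its resonance radius, then sample Te there."""
    f_hz = np.asarray(freqs_ghz) * 1e9
    R_res = HARMONIC * E * B0 * R0 / (2.0 * np.pi * M_E * f_hz)  # from f = n*f_ce(R)
    return R_res, te_profile(R_res)

channels = np.linspace(95, 140, 6)          # channel frequencies, GHz
for f, (R, te) in zip(channels, zip(*synthetic_ece(channels))):
    print(f"{f:6.1f} GHz -> R = {R:.2f} m, T_rad ~ {te:.2f} keV")
```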

  1. Quantum cascade laser based monitoring of CF₂ radical concentration as a diagnostic tool of dielectric etching plasma processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hübner, M.; Lang, N.; Röpcke, J.

    2015-01-19

    Dielectric etching plasma processes for modern interlevel dielectrics are becoming more and more complex with the introduction of new ultra-low-k dielectrics. One challenge is the minimization of sidewall damage while etching ultra-low-k porous SiCOH with fluorocarbon plasmas. Optimizing this process requires a deeper understanding of the concentration of the CF₂ radical, which acts as a precursor in the polymerization of the etch sample surfaces. In an industrial dielectric etching plasma reactor, the CF₂ radical was measured in situ using a continuous-wave quantum cascade laser (cw-QCL) around 1106.2 cm⁻¹. We measured Doppler-resolved ro-vibrational absorption lines and determined absolute densities using transitions in the ν₃ fundamental band of CF₂, with the aid of an improved simulation of the line strengths. We found that the CF₂ radical concentration during the etching plasma process correlates directly with the layer structure of the etched wafer. Hence, this correlation can serve as a diagnostic tool for dielectric etching plasma processes. QCL-based absorption spectroscopy thus opens the way for advanced process monitoring and etch control in semiconductor manufacturing.
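
    The density retrieval step rests on the standard relation between integrated absorbance, line strength, and path length; the sketch below applies it to a synthetic line, with all numerical values invented for illustration.

```python
# Standard absorption-spectroscopy relation for turning a measured ro-vibrational
# line into an absolute density: N = A_int / (S * L), where A_int is the
# integrated absorbance, S the line strength, and L the absorption path length.
# Numbers below are made-up illustrative values, not data from the paper.
import numpy as np

L_PATH = 1.0          # assumed absorption path length through the plasma, cm
S_LINE = 1.0e-20      # assumed line strength, cm^-1 / (molecule cm^-2)

# Synthetic absorbance spectrum around a single line (wavenumber axis in cm^-1)
nu = np.linspace(1106.15, 1106.25, 500)
absorbance = 0.02 * np.exp(-((nu - 1106.20) / 0.004) ** 2)  # Doppler-like profile

A_int = np.trapz(absorbance, nu)          # integrated absorbance, cm^-1
density = A_int / (S_LINE * L_PATH)       # molecules per cm^3
print(f"integrated absorbance {A_int:.3e} cm^-1 -> N = {density:.3e} cm^-3")
```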

  2. Impacts of Lateral Boundary Conditions on US Ozone ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  3. Dosimetry in MARS spectral CT: TOPAS Monte Carlo simulations and ion chamber measurements.

    PubMed

    Lu, Gray; Marsh, Steven; Damet, Jerome; Carbonez, Pierre; Laban, John; Bateman, Christopher; Butler, Anthony; Butler, Phil

    2017-06-01

    Spectral computed tomography (CT) is an emerging imaging modality that shows great promise in revealing unique diagnostic information. Because this imaging modality is based on X-ray CT, it is of utmost importance to study the radiation dose aspects of its use. This study reports on the implementation and evaluation of a Monte Carlo simulation tool using TOPAS for estimating dose in a pre-clinical spectral CT scanner known as the MARS scanner. Simulated estimates were compared with measurements from an ionization chamber. For a typical MARS scan of a 30 mm diameter cylindrical phantom, TOPAS estimated a CT dose index (CTDI) of 29.7 mGy; ion chamber measurements agreed with the TOPAS estimates to within 3%. Although further development is required, our investigation of TOPAS for estimating MARS scan dosimetry has shown its potential for the further study of spectral scanning protocols and of dose to scanned objects.
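
    The comparison hinges on the textbook CTDI definition, which the following sketch evaluates for a synthetic axial dose profile; the collimation and profile parameters are assumed, not MARS or TOPAS values.

```python
# Textbook CTDI definition: CTDI = (1 / (n*T)) * integral of D(z) dz over a
# -50..50 mm dose profile (CTDI100). The Gaussian-plus-tails profile below is
# synthetic, not a MARS or TOPAS result.
import numpy as np

N_SLICES, SLICE_T = 1, 10.0            # assumed collimation: 1 slice x 10 mm
z = np.linspace(-50.0, 50.0, 1001)     # integration range, mm (100 mm chamber)
dose = 30.0 * np.exp(-0.5 * (z / 6.0) ** 2) + 1.5 * np.exp(-np.abs(z) / 30.0)

ctdi = np.trapz(dose, z) / (N_SLICES * SLICE_T)   # mGy, per the definition above
print(f"CTDI100 = {ctdi:.1f} mGy")
```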

  4. Computer Simulation Of An In-Process Surface Finish Sensor.

    NASA Astrophysics Data System (ADS)

    Rakels, Jan H.

    1987-01-01

    It is generally accepted that optical methods are the most promising for the in-process measurement of surface finish. These methods have the advantages of non-contact operation and fast data acquisition, and optical instruments can easily be retrofitted to existing machine tools. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been developed which can measure the rms roughness, slope and wavelength of turned and precision-ground surfaces during machining. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned and ground surfaces is straightforward, and indeed the calculated diffraction patterns are in close agreement with patterns produced by an actual optical instrument. Since it is mathematically difficult to introduce real machine-tool behaviour into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor. The program produces a diffraction pattern as graphical output. Comparisons between computer-generated and actual diffraction patterns of the same surfaces show a high correlation. The main aim of this program is to construct an atlas mapping known machine-tool errors to optical diffraction patterns. This atlas can then be used for machine-tool condition diagnostics. It has been found that optical monitoring is very sensitive to minor defects; therefore, machine-tool deterioration can be detected before it becomes detrimental.
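
    The core of such a simulation can be illustrated by treating an idealized turned surface as a reflective phase screen and taking an FFT to obtain the far-field pattern; the sketch below does this for a sinusoidal profile with assumed illumination and feed values, as a simplified stand-in for the Kirchhoff-Fresnel integral.

```python
# Far-field (Fraunhofer) diffraction of an idealized turned surface, modelled as
# a reflective phase screen and evaluated with an FFT.
import numpy as np

WAVELENGTH = 0.633e-6            # assumed HeNe illumination, m
FEED = 50e-6                     # assumed tool feed (surface wavelength), m
H_AMP = 0.05e-6                  # assumed surface height amplitude, m
N, DX = 4000, 1e-6               # samples and spatial step (80 exact periods)

x = np.arange(N) * DX
height = 0.5 * H_AMP * np.sin(2.0 * np.pi * x / FEED)   # idealized turning marks

k = 2.0 * np.pi / WAVELENGTH
field = np.exp(1j * 2.0 * k * height)    # reflection doubles the path difference
pattern = np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2
angles = np.fft.fftshift(np.fft.fftfreq(N, DX)) * WAVELENGTH   # sin(theta) axis

# Diffraction orders appear at sin(theta) = m * wavelength / FEED;
# machine-tool errors (e.g., vibration) would smear or add sidebands to them.
half = N // 2
idx = np.argmax(pattern[half + 1:]) + half + 1   # strongest non-DC order
print(f"strongest order at sin(theta) = {angles[idx]:.4f} "
      f"(m=1 grating prediction {WAVELENGTH / FEED:.4f})")
```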

  5. Mass spectrometry based proteomics profiling as diagnostic tool in oncology: current status and future perspective.

    PubMed

    Findeisen, Peter; Neumaier, Michael

    2009-01-01

    Proteomics analysis has been heralded as a novel tool for identifying new and specific biomarkers that may improve the diagnosis and monitoring of various disease states. Recent years have brought a number of proteomics profiling technologies. Although proteomics profiling has resulted in the detection of disease-associated differences and modifications of proteins, current proteomics technologies display certain limitations that are hampering the introduction of these new technologies into clinical laboratory diagnostics and routine applications. In this review, we summarize current advances in mass spectrometry-based biomarker discovery. The promises and challenges of this new technology are discussed, with particular emphasis on the diagnostic perspectives of mass spectrometry-based proteomics profiling for malignant diseases.

  6. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates the quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies the impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  7. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate the substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.
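
    The exploration strategy can be pictured with a toy version (not the actual LPF code): enumerate every execution a nondeterministic scenario allows, run a plant simulator and a diagnosis engine side by side, and record any step where the diagnosis disagrees with the injected ground truth.

```python
# Toy illustration of LPF's core idea: drive a simulator and a diagnosis engine
# through every execution a nondeterministic scenario allows, checking the
# diagnosis against the injected truth. All components here are invented.
from itertools import product

# Nondeterministic scenario: at each step the driver picks one of several options.
SCENARIO = [
    ("command", ["open_valve", "close_valve"]),
    ("fault", [None, "valve_stuck_closed"]),
]

def simulate(events):
    """Toy plant model: the ground truth is simply the injected fault, if any."""
    fault = None
    for kind, value in events:
        if kind == "fault" and value:
            fault = value
    return fault

def diagnose(events):
    """Toy diagnosis engine: it only notices a fault once a later command
    exercises the valve, so a fault injected last goes undetected."""
    fault, diagnosed = None, None
    for kind, value in events:
        if kind == "fault" and value:
            fault = value
        elif kind == "command" and fault:
            diagnosed = fault
    return diagnosed

failures = []
for choices in product(*[options for _, options in SCENARIO]):
    events = list(zip([kind for kind, _ in SCENARIO], choices))
    if diagnose(events) != simulate(events):      # selectable error condition
        failures.append(events)

total = 1
for _, options in SCENARIO:
    total *= len(options)
print(f"explored {total} executions, {len(failures)} diagnosis mismatches")
```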

  8. Digital Tools to Enhance Clinical Reasoning.

    PubMed

    Manesh, Reza; Dhaliwal, Gurpreet

    2018-05-01

    Physicians can improve their diagnostic acumen by adopting a simulation-based approach to analyzing published cases. The tight coupling of clinical problems and their solutions affords physicians the opportunity to efficiently upgrade their illness scripts (structured knowledge of a specific disease) and schemas (structured frameworks for common problems). The more times clinicians practice accessing and applying those knowledge structures through published cases, the greater the odds that they will have an enhanced approach to similar patient cases in the future. This article highlights digital resources that increase the number of cases a clinician experiences and learns from. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. A novel modification of the Turing test for artificial intelligence and robotics in healthcare.

    PubMed

    Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos

    2015-03-01

    The increasing demands of delivering higher quality global healthcare have resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology. It remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. We describe the development of quantifiable, meta-analytical techniques for evaluating diagnostic accuracy within the Turing test paradigm, and a modification of the Turing test that offers quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Modifying the Turing test to yield robust diagnostic scores for AI can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.
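
    The flavor of quantification being proposed can be illustrated by treating judges' verdicts as a diagnostic test and summarizing them with standard accuracy metrics; the confusion-matrix counts below are invented for illustration.

```python
# Treat each judge's "human vs. machine" verdict as a diagnostic test result and
# summarize it with sensitivity, specificity, and the diagnostic odds ratio (DOR).
# The counts below are invented for illustration.

tp = 42   # machine correctly identified as machine
fn = 8    # machine mistaken for human
tn = 45   # human correctly identified as human
fp = 5    # human mistaken for machine

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
dor = (tp * tn) / (fp * fn)     # effect-size style summary of discrimination

print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, DOR {dor:.1f}")
```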

  10. Advanced studies of electromagnetic scattering

    NASA Technical Reports Server (NTRS)

    Ling, Hao

    1994-01-01

    In radar signature applications it is often desirable to generate the range profiles and inverse synthetic aperture radar (ISAR) images of a target. They can be used either as identification tools to distinguish and classify the target from a collection of possible targets, or as diagnostic/design tools to pinpoint the key scattering centers on the target. The simulation of synthetic range profiles and ISAR images is usually a time intensive task and computation time is of prime importance. Our research has been focused on the development of fast simulation algorithms for range profiles and ISAR images using the shooting and bouncing ray (SBR) method, a high frequency electromagnetic simulation technique for predicting the radar returns from realistic aerospace vehicles and the scattering by complex media.

  11. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST.

    PubMed

    Xiao, Shumei; Zang, Qing; Han, Xiaofeng; Wang, Tengfei; Yu, Jin; Zhao, Junyu

    2016-07-01

    The Thomson scattering (TS) diagnostic is an important tool for measuring electron temperature and density during plasma discharges. However, the measurement of the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed with a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce more than 60% of the stray light in the diagnostic system, and the influence of stray light on the error of the measured density is reduced.

  12. Laser speckle and skin cancer: skin roughness assessment

    NASA Astrophysics Data System (ADS)

    Lee, Tim K.; Tchvialeva, Lioudmila; Zeng, Haishan; McLean, David I.; Lui, Harvey

    2009-10-01

    The incidence of skin cancer has been increasing rapidly over the last few decades. Non-invasive optical diagnostic tools may improve diagnostic accuracy. In this paper, skin structure, skin cancer statistics and the subtypes of skin cancer are briefly reviewed. Among the subtypes, malignant melanoma is the most aggressive and dangerous; early detection dramatically improves the prognosis. Therefore, a non-invasive diagnostic tool for malignant melanoma is especially needed. In addition, in order for the diagnostic tool to be useful, it must be able to differentiate melanoma from common skin conditions such as seborrheic keratosis, a benign skin disease that resembles melanoma according to the well-known ABCD clinical-assessment rule. The key feature distinguishing these two diseases is surface roughness. Based on laser speckle contrast, our research team has recently developed a portable, optical, non-invasive, in-vivo diagnostic device for quantifying skin surface roughness. The methodology of our technique is described in detail. Examining the preliminary data collected in a pilot clinical study of the prototype, we found a difference in roughness between melanoma and seborrheic keratosis; in fact, there was a cutoff value that perfectly separated the two diseases in our initial data.
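
    The underlying measurand is the laser speckle contrast, C = σ/mean of the intensity pattern; the sketch below computes it for synthetic rough- and smooth-surface patterns, not clinical images.

```python
# Speckle contrast C = sigma / mean of the intensity pattern. Rougher surfaces
# (relative to the wavelength) tend to produce more fully developed speckle and
# hence higher contrast. The "images" below are synthetic noise, not clinical data.
import numpy as np

rng = np.random.default_rng(0)

def speckle_contrast(intensity):
    return intensity.std() / intensity.mean()

# Fully developed speckle has exponentially distributed intensity (C -> 1),
# while a smoother surface gives a more uniform pattern (C << 1).
rough = rng.exponential(scale=1.0, size=(256, 256))
smooth = 1.0 + 0.1 * rng.standard_normal((256, 256))

print(f"rough-surface contrast  C = {speckle_contrast(rough):.2f}")
print(f"smooth-surface contrast C = {speckle_contrast(smooth):.2f}")
```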

  13. Improvement of AEP Predictions Using Diurnal CFD Modelling with Site-Specific Stability Weightings Provided from Mesoscale Simulation

    NASA Astrophysics Data System (ADS)

    Hristov, Y.; Oxley, G.; Žagar, M.

    2014-06-01

    The Bolund measurement campaign, performed by the Danish Technical University (DTU) Wind Energy Department (also known as RISØ), provided significant insight into wind flow modelling over complex terrain. In the blind comparison study, several modelling solutions were submitted, the vast majority being steady-state Computational Fluid Dynamics (CFD) approaches with two-equation k-epsilon turbulence closure. This approach yielded the most accurate results and was identified as the state-of-the-art tool for wind turbine generator (WTG) micro-siting. Based on the findings from Bolund, further comparison between CFD and field measurement data has been deemed essential in order to improve simulation accuracy for turbine load and long-term Annual Energy Production (AEP) estimations. Vestas Wind Systems A/S is a major WTG original equipment manufacturer (OEM) with an installed base of over 60 GW in over 70 countries, accounting for 19% of the global installed base. The Vestas Performance and Diagnostic Centre (VPDC) provides live online data for more than 47 GW of these turbines, allowing a comprehensive comparison between modelled and real-world energy production data. In previous studies, multiple sites were simulated with a steady neutral CFD formulation for the atmospheric surface layer (ASL), and wind resource (RSF) files were generated as a basis for long-term AEP predictions, showing significant improvement over predictions performed with the industry-standard linear WAsP tool. In this study, further improvements to wind resource file generation with CFD are examined using an unsteady diurnal-cycle approach with a full atmospheric boundary layer (ABL) formulation, with the unique stratifications throughout the cycle weighted according to mesoscale-simulated sector-wise stability frequencies.
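
    The weighting step described above amounts to a frequency-weighted combination of per-stability-class energy estimates; the sketch below shows the arithmetic with invented placeholder numbers.

```python
# Combine CFD-derived energy estimates for each atmospheric stability class using
# stability frequencies from mesoscale simulation. All numbers are placeholders.

# Hypothetical annual energy (GWh) a turbine would produce if the atmosphere
# stayed permanently in each stability class, from separate CFD stratifications
energy_by_class = {"stable": 6.1, "neutral": 7.4, "unstable": 8.0}

# Hypothetical mesoscale-simulated frequency of each class (must sum to 1)
stability_weights = {"stable": 0.35, "neutral": 0.45, "unstable": 0.20}

aep = sum(energy_by_class[c] * stability_weights[c] for c in energy_by_class)
print(f"stability-weighted AEP estimate: {aep:.2f} GWh")
```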

  14. Acquaintance to Artificial Neural Networks and use of artificial intelligence as a diagnostic tool for tuberculosis: A review.

    PubMed

    Dande, Payal; Samant, Purva

    2018-01-01

    Tuberculosis [TB] has afflicted numerous nations in the world. As per a report by the World Health Organization [WHO], an estimated 1.4 million TB deaths occurred in 2015, with an additional 0.4 million deaths resulting from TB disease among people living with HIV. Most TB deaths can be prevented if the disease is detected at an early stage. The existing diagnostic processes, such as blood tests or sputum tests, are not only tedious but also take a long time for analysis and cannot differentiate between different drug-resistant stages of TB. The search for faster methods of disease detection has been aided by the latest Artificial Intelligence [AI] tools. The Artificial Neural Network [ANN] is one of the important tools that is being used widely in the diagnosis and evaluation of medical conditions. This review aims at providing a brief introduction to the various AI tools used in TB detection and gives a detailed description of the utilization of ANNs as an efficient diagnostic technique. The paper also provides a critical assessment of ANNs and the existing techniques for the diagnosis of TB. Researchers and practitioners in the field are looking forward to using ANNs and other upcoming AI tools, such as fuzzy logic, genetic algorithms and artificial intelligence simulation, as promising current and future tools for tackling the global menace of tuberculosis. Latest advancements in the diagnostic field include the combined use of ANNs with various other AI tools like fuzzy logic, which has led to an increase in the efficacy and specificity of the diagnostic techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. The rapid evolution of molecular genetic diagnostics in neuromuscular diseases.

    PubMed

    Volk, Alexander E; Kubisch, Christian

    2017-10-01

    The development of massively parallel sequencing (MPS) has revolutionized molecular genetic diagnostics in monogenic disorders. The present review gives a brief overview of different MPS-based approaches used in the clinical diagnostics of neuromuscular disorders (NMDs) and highlights their advantages and limitations. MPS-based approaches such as gene panel sequencing, (whole) exome sequencing, (whole) genome sequencing, and RNA sequencing have been used to identify the genetic cause in NMDs. Although gene panel sequencing has evolved into a standard test for heterogeneous diseases, it is still debated, mainly because of financial issues and unsolved problems of variant interpretation, whether genome sequencing (and to a lesser extent also exome sequencing) of single patients can already be regarded as routine diagnostics. However, it has been shown that the inclusion of parents and additional family members often leads to a substantial increase in the diagnostic yield of exome-wide/genome-wide MPS approaches. In addition, MPS-based RNA sequencing is just entering the research and diagnostic scene. Next-generation sequencing increasingly enables the detection of the genetic cause of highly heterogeneous diseases like NMDs in an efficient and affordable way. Gene panel sequencing and family-based exome sequencing have been proven to be potent and cost-efficient diagnostic tools. Although the clinical validation and interpretation of genome sequencing are still challenging, diagnostic RNA sequencing represents a promising tool for bypassing some hurdles of diagnostics using genomic DNA.

  16. Damage tolerance modeling and validation of a wireless sensory composite panel for a structural health monitoring system

    NASA Astrophysics Data System (ADS)

    Talagani, Mohamad R.; Abdi, Frank; Saravanos, Dimitris; Chrysohoidis, Nikos; Nikbin, Kamran; Ragalini, Rose; Rodov, Irena

    2013-05-01

    The paper proposes the diagnostic and prognostic modeling and test validation of a Wireless Integrated Strain Monitoring and Simulation System (WISMOS). The effort verifies a hardware and web-based software tool that is able to evaluate and optimize sensorized aerospace composite structures for the purpose of Structural Health Monitoring (SHM). The tool is an extension of an existing suite of an SHM system based on a diagnostic-prognostic system (DPS) methodology. The goal of the extended SHM-DPS is to apply multi-scale nonlinear physics-based progressive failure analyses to the "as-is" structural configuration to determine residual strength, remaining service life, and future inspection intervals and maintenance procedures. The DPS solution meets the JTI Green Regional Aircraft (GRA) goals towards low-weight, durable and reliable commercial aircraft. It takes advantage of the methodologies developed within the European Clean Sky JTI project WISMOS, with the capability to transmit, store and process strain data from a network of wireless sensors (e.g., strain gages, FBGA) and utilizes a DPS-based methodology, based on multi-scale progressive failure analysis (MS-PFA), to determine structural health and to advise with respect to condition-based inspection and maintenance. As part of the validation of the diagnostic and prognostic system, carbon/epoxy ASTM coupons were fabricated and tested to extract the mechanical properties. Subsequently, two composite stiffened panels were manufactured, instrumented and tested under compressive loading: 1) an undamaged stiffened buckling panel; and 2) a damaged stiffened buckling panel including an initial diamond cut. Next, numerical finite element models of the two panels were developed and analyzed under test conditions using multi-scale progressive failure analysis (an extension of FEM) to evaluate the damage/fracture evolution process and to identify the contributing failure modes. The comparisons between predictions and test results were within 10% accuracy.

  17. The IDEA Assessment Tool: Assessing the Reporting, Diagnostic Reasoning, and Decision-Making Skills Demonstrated in Medical Students' Hospital Admission Notes.

    PubMed

    Baker, Elizabeth A; Ledford, Cynthia H; Fogg, Louis; Way, David P; Park, Yoon Soo

    2015-01-01

    Construct: Clinical skills are used in the care of patients, including reporting, diagnostic reasoning, and decision-making skills. Written comprehensive new patient admission notes (H&Ps) are a ubiquitous part of student education but are underutilized in the assessment of clinical skills. The interpretive summary, differential diagnosis, explanation of reasoning, and alternatives (IDEA) assessment tool was developed to assess students' clinical skills using written comprehensive new patient admission notes. The validity evidence for assessment of clinical skills using clinical documentation following authentic patient encounters has not been well documented. Diagnostic justification tools and postencounter notes are described in the literature (1,2) but are based on standardized patient encounters. To our knowledge, the IDEA assessment tool is the first published tool that uses medical students' H&Ps to rate students' clinical skills. The IDEA assessment tool is a 15-item instrument that asks evaluators to rate students' reporting, diagnostic reasoning, and decision-making skills based on medical students' new patient admission notes. This study presents validity evidence in support of the IDEA assessment tool using Messick's unified framework, including content (theoretical framework), response process (interrater reliability), internal structure (factor analysis and internal-consistency reliability), and relationship to other variables. Validity evidence is based on results from four studies conducted between 2010 and 2013. First, the factor analysis (2010, n = 216) yielded a three-factor solution, measuring patient story, IDEA, and completeness, with reliabilities of .79, .88, and .79, respectively. Second, an initial interrater reliability study (2010) involving two raters demonstrated fair to moderate consensus (κ = .21-.56, ρ =.42-.79). Third, a second interrater reliability study (2011) with 22 trained raters also demonstrated fair to moderate agreement (intraclass correlations [ICCs] = .29-.67). There was moderate reliability for all three skill domains, including reporting skills (ICC = .53), diagnostic reasoning skills (ICC = .64), and decision-making skills (ICC = .63). Fourth, there was a significant correlation between IDEA rating scores (2010-2013) and final Internal Medicine clerkship grades (r = .24), 95% confidence interval (CI) [.15, .33]. The IDEA assessment tool is a novel tool with validity evidence to support its use in the assessment of students' reporting, diagnostic reasoning, and decision-making skills. The moderate reliability achieved supports formative or lower stakes summative uses rather than high-stakes summative judgments.

  18. A new dump system design for stray light reduction of Thomson scattering diagnostic system on EAST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Shumei; Zang, Qing, E-mail: zangq@ipp.ac.cn; Han, Xiaofeng

    The Thomson scattering (TS) diagnostic is an important tool for measuring electron temperature and density during plasma discharges. However, the measurement of the Thomson scattering signal is easily disturbed by stray light. The stray light sources in the Experimental Advanced Superconducting Tokamak (EAST) TS diagnostic system were analyzed with a simulation model of the diagnostic system, and the simulation results show that the dump system is the primary stray light source. Based on optics theory and the simulation analysis, a novel dump system including an improved beam trap was proposed and installed. The measurement results indicate that the new dump system can reduce more than 60% of the stray light in the diagnostic system, and the influence of stray light on the error of the measured density is reduced.

  19. First Steps Toward Incorporating Image Based Diagnostics Into Particle Accelerator Control Systems Using Convolutional Neural Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelen, A. L.; Biedron, S. G.; Milton, S. V.

    At present, a variety of image-based diagnostics are used in particle accelerator systems. Oftentimes, these are viewed by a human operator who then makes appropriate adjustments to the machine. Given recent advances in using convolutional neural networks (CNNs) for image processing, it should be possible to use image diagnostics directly in control routines (NN-based or otherwise). This is especially appealing for non-intercepting diagnostics that could run continuously during beam operation. Here, we show the results of a first step toward implementing such a controller: our trained CNN can predict multiple simulated downstream beam parameters at the Fermilab Accelerator Science and Technology (FAST) facility's low-energy beamline using simulated virtual cathode laser images, gun phases, and solenoid strengths.
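
    A minimal sketch of such a network is shown below; the architecture, image size, and output count are assumptions for illustration, not the authors' model.

```python
# Assumed-architecture sketch of a CNN that maps a virtual-cathode laser image
# plus two scalar settings (gun phase, solenoid strength) to several downstream
# beam parameters, treated as a regression problem.
import torch
import torch.nn as nn

class BeamParamCNN(nn.Module):
    def __init__(self, n_outputs=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),                    # -> 32 x 4 x 4
        )
        self.head = nn.Sequential(
            nn.Linear(32 * 4 * 4 + 2, 64), nn.ReLU(),   # +2 for the scalar settings
            nn.Linear(64, n_outputs),
        )

    def forward(self, image, settings):
        x = self.features(image).flatten(1)
        return self.head(torch.cat([x, settings], dim=1))

model = BeamParamCNN()
images = torch.randn(8, 1, 64, 64)        # batch of simulated laser images
settings = torch.randn(8, 2)              # gun phase and solenoid strength
preds = model(images, settings)           # e.g., emittances and beam sizes
print(preds.shape)                        # torch.Size([8, 4])
```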

  20. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    PubMed

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training, and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and performance assessment. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs (knowledge, skills, and attitudes) were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and thus enhances training effectiveness. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  1. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This MATLAB-based (The MathWorks, Inc.) software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is distributed as part of the software, enabling the side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES and provides instructions for the installation and operation of the tool.
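
    The kind of side-by-side metric such a benchmark enables can be sketched as follows; the blind-case labels and method outputs are invented, and the real ProDiMES metrics are defined in the guide itself.

```python
# Illustrative sketch of per-method true positive and false positive detection
# rates computed over blind test cases. The case labels and outputs are invented.

cases = [  # (ground truth fault present?, method A detected?, method B detected?)
    (True, True, True), (True, True, False), (True, False, False),
    (False, False, False), (False, True, False), (False, False, False),
]

def detection_rates(outcomes):
    tp = sum(1 for truth, det in outcomes if truth and det)
    fp = sum(1 for truth, det in outcomes if not truth and det)
    pos = sum(1 for truth, _ in outcomes if truth)
    neg = len(outcomes) - pos
    return tp / pos, fp / neg

for name, col in (("method A", 1), ("method B", 2)):
    tpr, fpr = detection_rates([(c[0], c[col]) for c in cases])
    print(f"{name}: true positive rate {tpr:.2f}, false alarm rate {fpr:.2f}")
```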

  2. The adult literacy evaluator: An intelligent computer-aided training system for diagnosing adult illiterates

    NASA Technical Reports Server (NTRS)

    Yaden, David B., Jr.

    1992-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application being developed is the Adult Literacy Evaluator, a simulation-based diagnostic tool designed to assess the operant literacy abilities of adults having difficulties in learning to read and write. Using ICAT system technology in addition to speech recognition, closed-captioned television (CCTV), live video and other state-of-the-art graphics and storage capabilities, this project attempts to overcome the negative effects of adult literacy assessment by allowing the client to interact with an intelligent computer system which simulates real-life literacy activities and materials and which measures literacy performance in the actual context of its use. The specific objectives of the project are as follows: (1) to develop a simulation-based diagnostic tool to assess adults' prior knowledge about reading and writing processes in actual contexts of application; (2) to provide a profile of readers' strengths and weaknesses; and (3) to suggest instructional strategies and materials which can be used as a beginning point for remediation. In the first and developmental phase of the project, descriptions of literacy events and environments are being written and functional literacy documents analyzed for their components. Examples of literacy events and situations being considered include interactions with environmental print (e.g., billboards, street signs, commercial marquees, storefront logos, etc.), functional literacy materials (e.g., newspapers, magazines, telephone books, bills, receipts, etc.) and employment-related communication (i.e., job descriptions, application forms, technical manuals, memorandums, newsletters, etc.). Each of these situations and materials is being analyzed for its literacy requirements in terms of written display (i.e., knowledge of printed forms and conventions), meaning demands (i.e., comprehension and word knowledge) and social situation. From these descriptions, scripts are being generated which define the interaction between the student, an on-screen guide and the simulated literacy environment. The proposed outcome of the Evaluator is a diagnostic profile which will present broad classifications of literacy behaviors across the major areas of metacognitive abilities, word recognition, vocabulary knowledge, comprehension and writing. From these classifications, suggestions will be made for materials and strategies for instruction with which to begin corrective action. The focus of the Literacy Evaluator will be essentially to provide an expert diagnosis and an interpretation of that assessment, which can then be used by a human tutor to further design and individualize a remedial program as needed through the use of an authoring system.

  3. TH-A-BRF-11: Image Intensity Non-Uniformities Between MRI Simulation and Diagnostic MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, E

    2014-06-15

    Purpose: MRI simulation for MRI-based radiotherapy demands that patients be set up in treatment position, which frequently involves the use of alternative radiofrequency (RF) coil configurations to accommodate immobilized patients. However, alternative RF coil geometries may exacerbate image intensity non-uniformities (IINU) beyond those observed in diagnostic MRI, which may challenge image segmentation and registration accuracy as well as confound studies assessing radiotherapy response when MR simulation images are used as baselines for evaluation. The goal of this work was to determine whether differences in IINU exist between MR simulation and diagnostic MR images. Methods: ACR-MRI phantom images were acquired at 3T using a spin-echo sequence (TE/TR: 20/500 ms, rBW: 62.5 kHz, TH/skip: 5/5 mm). MR simulation images were obtained by wrapping two flexible phased-array RF coils around the phantom. Diagnostic MR images were obtained by placing the phantom into a commercial phased-array head coil. Pre-scan normalization was enabled in both cases. Images were transferred offline and corrected for IINU using the MNI N3 algorithm. Coefficients of variation (CV = σ/μ) were calculated for each slice. Wilcoxon matched-pairs and Mann-Whitney tests compared CV values between original and N3 images and between MR simulation and diagnostic MR images. Results: Significant differences in CV were detected between original and N3 images in both the MRI simulation and diagnostic MRI groups (p=0.010, p=0.010). In addition, significant differences in CV were detected between original MR simulation images and original and N3 diagnostic MR images (p=0.0256, p=0.0016). However, no significant differences in CV were detected between N3 MR simulation images and original or N3 diagnostic MR images, demonstrating the importance of correcting MR simulation images beyond pre-scan normalization prior to use in radiotherapy. Conclusions: Alternative RF coil configurations used in MRI simulation can result in significant IINU differences compared to diagnostic MR images. The MNI N3 algorithm reduced MR simulation IINU to levels observed in diagnostic MR images. Funding provided by Advancing a Healthier Wisconsin.
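
    The uniformity metric and correction step can be sketched as follows. Note the substitution: the study used the MNI N3 algorithm, while the sketch uses N4 (N3's successor, available in SimpleITK) purely for illustration, and the file path is a placeholder.

```python
# Slice-wise coefficient of variation (CV = sigma/mu) before and after bias-field
# correction. N4 is substituted for the study's N3 algorithm for illustration.
import SimpleITK as sitk
import numpy as np

# Placeholder path to a phantom volume; any 3D MR image would do
image = sitk.Cast(sitk.ReadImage("mri_sim_phantom.nii.gz"), sitk.sitkFloat32)
mask = sitk.OtsuThreshold(image, 0, 1, 200)       # foreground mask for correction

corrected = sitk.N4BiasFieldCorrection(image, mask)

def slice_cv(img, msk):
    arr, m = sitk.GetArrayFromImage(img), sitk.GetArrayFromImage(msk) > 0
    return [arr[z][m[z]].std() / arr[z][m[z]].mean()
            for z in range(arr.shape[0]) if m[z].any()]

for name, img in (("original", image), ("N4-corrected", corrected)):
    cvs = slice_cv(img, mask)
    print(f"{name}: median slice CV = {np.median(cvs):.4f}")
```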

  4. Projected 2050 Model Simulations for the Chesapeake Bay ...

    EPA Pesticide Factsheets

    The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the watershed, in concurrence with land use changes. EPA's Office of Research and Development will be conducting historic and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program's assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will present the timeline and research updates.

  5. A Five- Year CMAQ Model Performance for Wildfires and ...

    EPA Pesticide Factsheets

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5-year set of CMAQ model simulations (2008-2012) in which we simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires.

  6. Development and Assessment of a Diagnostic Tool to Identify Organic Chemistry Students' Alternative Conceptions Related to Acid Strength

    ERIC Educational Resources Information Center

    McClary, LaKeisha M.; Bretz, Stacey Lowery

    2012-01-01

    The central goal of this study was to create a new diagnostic tool to identify organic chemistry students' alternative conceptions related to acid strength. Twenty years of research on secondary and college students' conceptions about acids and bases has shown that these important concepts are difficult for students to apply to qualitative problem…

  7. Novel graphene-based biosensor for early detection of Zika virus infection.

    PubMed

    Afsahi, Savannah; Lerner, Mitchell B; Goldstein, Jason M; Lee, Joo; Tang, Xiaoling; Bagarozzi, Dennis A; Pan, Deng; Locascio, Lauren; Walker, Amy; Barron, Francie; Goldsmith, Brett R

    2018-02-15

    We have developed a cost-effective and portable graphene-enabled biosensor to detect Zika virus with a highly specific immobilized monoclonal antibody. Field Effect Biosensing (FEB) with monoclonal antibodies covalently linked to graphene enables real-time, quantitative detection of native Zika viral (ZIKV) antigens. The percent change in capacitance in response to doses of antigen (ZIKV NS1) coincides with clinically significant levels, with detection of antigen in buffer at concentrations as low as 450 pM. Potential diagnostic applications were demonstrated by measuring Zika antigen in simulated human serum. Selectivity was validated using Japanese encephalitis NS1, a homologous and potentially cross-reactive viral antigen. Further, the graphene platform can simultaneously provide the advanced quantitative data of nonclinical biophysical kinetics tools, making it adaptable to both clinical research and possible diagnostic applications. The speed, sensitivity, and selectivity of this first-of-its-kind graphene-enabled Zika biosensor make it an ideal candidate for development as a medical diagnostic test. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
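
    The capacitance dose-response described above is commonly summarized by fitting a simple binding model; the sketch below fits a Langmuir curve to invented data, as an illustration rather than the authors' analysis.

```python
# Fit a simple Langmuir binding model dC = dC_max * c / (Kd + c) to percent
# capacitance change vs. antigen concentration. All data points are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, dc_max, kd):
    return dc_max * c / (kd + c)

conc = np.array([0.45, 1.5, 5.0, 15.0, 50.0])        # antigen concentration, nM
dcap = np.array([0.8, 2.3, 5.1, 7.6, 9.2])           # % capacitance change (made up)

(dc_max, kd), _ = curve_fit(langmuir, conc, dcap, p0=[10.0, 5.0])
print(f"fitted dC_max = {dc_max:.1f} %, Kd = {kd:.1f} nM")
```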

  8. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…

  9. Propulsion IVHM Technology Experiment

    NASA Technical Reports Server (NTRS)

    Chicatelli, Amy K.; Maul, William A.; Fulton, Christopher E.

    2006-01-01

    The Propulsion IVHM Technology Experiment (PITEX) successfully demonstrated real-time fault detection and isolation for a virtual reusable launch vehicle (RLV) main propulsion system (MPS). Specifically, the PITEX research project developed and applied a model-based diagnostic system for the MPS of the X-34 RLV, a space-launch technology demonstrator. The demonstration was simulation-based, using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 MPS was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real time on flight-like hardware. In an attempt to expose potential performance problems, the PITEX diagnostic system was subjected to numerous realistic effects in the simulated data, including noise, sensor resolution, command/valve talkback information, and nominal build variations. In all cases, the PITEX system performed as required. The research demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
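
    The model-based diagnosis pattern PITEX demonstrates can be reduced to a residual check between sensor readings and model predictions; the component, threshold, and values below are invented for illustration.

```python
# Generic residual-based fault detection: compare sensor readings against model
# predictions and flag a fault candidate when the residual exceeds a noise-aware
# threshold. Components and values are invented, not from the PITEX system.
import numpy as np

rng = np.random.default_rng(1)

def model_prediction(valve_cmd):
    """Nominal model: downstream pressure expected when the valve is commanded."""
    return 300.0 if valve_cmd == "open" else 14.7    # psia, illustrative

THRESHOLD = 3.0 * 5.0    # 3 sigma with assumed 5 psia sensor noise

for step, (cmd, fault) in enumerate([("open", False), ("open", True)]):
    truth = model_prediction(cmd) if not fault else 14.7   # stuck-closed valve
    measured = truth + rng.normal(0.0, 5.0)                # noisy sensor reading
    residual = measured - model_prediction(cmd)
    status = "FAULT: valve failed closed?" if abs(residual) > THRESHOLD else "nominal"
    print(f"step {step}: cmd={cmd}, residual={residual:+.1f} psia -> {status}")
```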

  10. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  11. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time- and cost-prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation based tools in the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)

  12. Expert Systems Based Clinical Assessment and Tutorial Project.

    ERIC Educational Resources Information Center

    Papa, Frank; Shores, Jay

    This project at the Texas College of Osteopathic Medicine (Fort Worth) evaluated the use of an artificial-intelligence-derived measure, "Knowledge-Based Inference Tool" (KBIT), as the basis for assessing medical students' diagnostic capabilities and designing instruction to improve diagnostic skills. The instrument was designed to…

  13. Deletion Diagnostics for Alternating Logistic Regressions

    PubMed Central

    Preisser, John S.; By, Kunthel; Perin, Jamie; Qaqish, Bahjat F.

    2013-01-01

    Deletion diagnostics are introduced for the regression analysis of clustered binary outcomes estimated with alternating logistic regressions, an implementation of generalized estimating equations (GEE) that estimates regression coefficients in a marginal mean model and in a model for the intracluster association given by the log odds ratio. The diagnostics are developed within an estimating equations framework that recasts the estimating functions for association parameters based upon conditional residuals into equivalent functions based upon marginal residuals. Extensions of earlier work on GEE diagnostics follow directly, including computational formulae for one-step deletion diagnostics that measure the influence of a cluster of observations on the estimated regression parameters and on the overall marginal mean or association model fit. The diagnostic formulae are evaluated with simulation studies and with an application concerning an assessment of factors associated with health maintenance visits in primary care medical practices. The application and the simulations demonstrate that the proposed cluster-deletion diagnostics for alternating logistic regressions are good approximations of their exact fully iterated counterparts. PMID:22777960
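
    A rough illustration of the cluster-deletion idea: alternating logistic regressions are not available in common Python libraries, so the sketch below uses an exchangeable-correlation GEE from statsmodels as a stand-in and computes fully iterated deletion shifts by refitting, rather than the paper's one-step formulas. The data frame and column names are invented.

    ```python
    # Sketch: cluster-deletion influence (DFBETA-style) for a marginal-mean
    # GEE fit; exchangeable GEE stands in for alternating logistic regressions.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def cluster_deletion_shifts(df, outcome, covariates, cluster):
        """Refit the GEE without each cluster; report the parameter shift."""
        def fit(data):
            X = sm.add_constant(data[covariates])
            return sm.GEE(data[outcome], X, groups=data[cluster],
                          family=sm.families.Binomial(),
                          cov_struct=sm.cov_struct.Exchangeable()).fit()

        full = fit(df)
        shifts = {g: full.params - fit(df[df[cluster] != g]).params
                  for g in df[cluster].unique()}
        return pd.DataFrame(shifts).T          # one row of shifts per cluster

    # Invented data: 10 practices ("clusters") with 20 binary visit outcomes.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({"practice": np.repeat(np.arange(10), 20),
                       "age": rng.normal(50, 10, 200),
                       "visit": rng.integers(0, 2, 200)})
    print(cluster_deletion_shifts(df, "visit", ["age"], "practice").head())
    ```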

  14. Using three-dimensional-computerized tomography as a diagnostic tool for temporo-mandibular joint ankylosis: a case report.

    PubMed

    Kao, S Y; Chou, J; Lo, J; Yang, J; Chou, A P; Joe, C J; Chang, R C

    1999-04-01

    Roentgenographic examination has long been a useful diagnostic tool for temporo-mandibular joint (TMJ) disease. The methods include TMJ tomography, panoramic radiography and computerized tomography (CT) scan with or without injection of contrast media. Recently, three-dimensional CT (3D-CT), reconstructed from the two-dimensional image of a CT scan to simulate the soft tissue or bony structure of the real target, was proposed. In this report, a case of TMJ ankylosis due to traumatic injury is presented. 3D-CT was employed as one of the presurgical roentgenographic diagnostic tools. The conventional radiographic examination including panoramic radiography and tomography showed lesions in both sides of the mandible. CT scanning further suggested that the right-sided lesion was more severe than that on the left. With 3D-CT image reconstruction the size and extent of the lesions were clearly observable. The decision was made to proceed with an initial surgical approach on the right side. With condylectomy and condylar replacement using an autogenous costochondral graft on the right side, the range of mouth opening improved significantly. In this case report, 3D-CT demonstrates its advantages as a tool for the correct and precise diagnosis of TMJ ankylosis.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapuscinski, A.R.; Hallerman, E.M.

    Among the many methodologies encompassing biotechnology in aquaculture, this report addresses: the production of genetically modified aquatic organisms (aquatic GMOs) by gene transfer, chromosome set manipulation, or hybridization or protoplast fusion between species; new health management tools, including DNA-based diagnostics and recombinant DNA vaccines; marker-assisted selection; cryopreservation; and stock marking. These methodologies offer a wide range of potential economic benefits for aquaculture by providing improved or new means to affect the mix of necessary material inputs, enhance production efficiency, or improve product quality. Advances in aquaculture through biotechnology could stimulate growth of the aquaculture industry to provide a larger proportion of consumer demand, and thereby reduce pressure on natural stocks from over-harvest. Judicious application of gamete cryopreservation and chromosome set manipulations to achieve sterilization could reduce environmental risks of some aquaculture operations. Given the significant losses to disease in many aquaculture enterprises, the potential benefits of DNA-based health management tools are very high and appear to pose no major environmental risks or social concerns.

  16. Diagnosis demystified: CT as diagnostic tool in endodontics

    PubMed Central

    Shruthi, Nagaraja; Sreenivasa Murthy, B V; Sundaresh, K J; Mallikarjuna, Rachappa

    2013-01-01

    Diagnosis in endodontics is usually based on clinical and radiographic presentations, which are only empirical methods. The role of the healing profession is to apply knowledge and skills towards maintaining and restoring the patient's health. Recent advances in imaging technologies have contributed to correct interpretation and diagnosis. CT is proving to be an effective tool in solving endodontic mysteries through its three-dimensional visualisation. CT imaging offers many diagnostic advantages: it produces reconstructed images in selected projections, with low-contrast resolution far superior to that of all other X-ray imaging modalities. This case report is an endeavour towards effective treatment planning of cases with root fracture and root resorption, using spiral CT as an adjuvant diagnostic tool. PMID:23814212

  17. Evaluating online diagnostic decision support tools for the clinical setting.

    PubMed

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct that supports clinicians in making clinical decisions and improving patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included 6 categories: general information, content, quality control, search, clinical results and other features. We developed diagnostically challenging clinical case scenarios based on real patient experience that were commonly missed by junior medical staff. The evaluation was divided into 2 phases: an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team, and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the eleven tools that were assessed against the evaluation instrument, only 4 tools completely met the DDS definition that was adopted for this evaluation and were able to produce a differential diagnosis. From the initial phase of the evaluation, 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall, the differentiating factors among the top 3 DDS tools were diagnostic accuracy ranking, ease of use, and the confidence and credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  18. Development of RAD-Score: A Tool to Assess the Procedural Competence of Diagnostic Radiology Residents.

    PubMed

    Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M

    2017-04-01

    The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in both the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains. Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.

  19. Comparison of the Effectiveness of Interactive Didactic Lecture Versus Online Simulation-Based CME Programs Directed at Improving the Diagnostic Capabilities of Primary Care Practitioners.

    PubMed

    McFadden, Pam; Crim, Andrew

    2016-01-01

    Diagnostic errors in primary care contribute to increased morbidity and mortality, and billions in costs each year. Improvements in the way practicing physicians are taught so as to optimally perform differential diagnosis can increase patient safety and lower the costs of care. This study represents a comparison of the effectiveness of two approaches to CME training directed at improving the primary care practitioner's diagnostic capabilities against seven common and important causes of joint pain. Using a convenience sampling methodology, one group of primary care practitioners was trained by a traditional live, expert-led, multimedia-based training activity supplemented with interactive practice opportunities and feedback (control group). The second group was trained online with a multimedia-based training activity supplemented with interactive practice opportunities and feedback delivered by an artificial intelligence-driven simulation/tutor (treatment group). Before their respective instructional intervention, there were no significant differences in the diagnostic performance of the two groups against a battery of case vignettes presenting with joint pain. Using the same battery of case vignettes to assess postintervention diagnostic performance, there was a slight but not statistically significant improvement in the control group's diagnostic accuracy (P = .13). The treatment group, however, demonstrated a significant improvement in accuracy (P < .02; Cohen d, effect size = 0.79). These data indicate that within the context of a CME activity, a significant improvement in diagnostic accuracy can be achieved by the use of a web-delivered, multimedia-based instructional activity supplemented by practice opportunities and feedback delivered by an artificial intelligence-driven simulation/tutor.
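
    For reference, the reported effect size is Cohen's d with a pooled standard deviation; a minimal sketch of that computation, with invented case-vignette scores, is:

    ```python
    # Sketch: Cohen's d for pre- vs post-intervention diagnostic scores.
    import numpy as np

    def cohens_d(pre, post):
        n1, n2 = len(pre), len(post)
        pooled_sd = np.sqrt(((n1 - 1) * np.var(pre, ddof=1) +
                             (n2 - 1) * np.var(post, ddof=1)) / (n1 + n2 - 2))
        return (np.mean(post) - np.mean(pre)) / pooled_sd

    pre = np.array([4.0, 5.0, 6.0, 5.5, 4.5])    # invented scores
    post = np.array([6.0, 7.0, 7.5, 6.5, 6.0])
    print(cohens_d(pre, post))    # by convention, d around 0.8 is "large"
    ```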

  20. Are we ready for Taenia solium cysticercosis elimination in sub-Saharan Africa?

    PubMed

    Johansen, Maria Vang; Trevisan, Chiara; Gabriël, Sarah; Magnussen, Pascal; Braae, Uffe Christian

    2017-01-01

    The World Health Organization announced in November 2014 at the fourth international meeting on 'the control of neglected zoonotic diseases - from advocacy to action', that intervention tools for eliminating Taenia solium taeniosis/cysticercosis (TSTC) are in place. The aim of this work was to elucidate theoretical outcomes of various control options suggested for TSTC elimination in sub-Saharan Africa (SSA) over a 4-year period. Our current knowledge regarding T. solium epidemiology and control primarily builds on studies from Latin America. A simple transmission model - built on data from Latin America - has been used to predict the effect of various interventions such as mass treatment of humans, vaccination and treatment of pigs, and health education of communities, potentially leading to change in bad practices and reducing transmission risks. Based on simulations of the transmission model, even a 4-year integrated One Health approach fails to eliminate TSTC from a small community and in all simulations, the prevalence of human taeniosis and porcine cysticercosis start to rise as soon as the programmes end. Our current knowledge regarding transmission and burden of TSTC in SSA is scarce and while claiming to be tool ready, the selection of diagnostic and surveillance tools, as well as the algorithms and stepwise approaches for control and elimination of TSTC remain major challenges.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitrani, J

    Bayesian networks (BNs) are an excellent tool for modeling uncertainties in systems with several interdependent variables. A BN is a directed acyclic graph, and consists of a structure, or the set of directional links between variables that depend on other variables, and conditional probabilities (CPs) for each variable. In this project, we apply BNs to understand uncertainties in National Ignition Facility (NIF) ignition experiments. One can represent various physical properties of NIF capsule implosions as variables in a BN. A dataset containing simulations of NIF capsule implosions was provided. The dataset was generated from a radiation hydrodynamics code, and it contained 120 simulations of 16 variables. Relevant knowledge about the physics of NIF capsule implosions and greedy search algorithms were used to search for hypothetical structures for a BN. Our preliminary results found 6 links between variables in the dataset. However, we expected more links between the dataset variables based on the physics of NIF capsule implosions. Important reasons for the paucity of links are the relatively small size of the dataset and the sampling of the values of the dataset variables. Another possible factor is that in the dataset, 20% of the simulations represented successful fusion and 80% did not (simulations of unsuccessful fusion are useful for measuring certain diagnostics), which skewed the distributions of several variables and possibly reduced the number of links. Nevertheless, by illustrating the interdependencies and conditional probabilities of several parameters and diagnostics, an accurate and complete BN built from an appropriate simulation set would provide uncertainty quantification for NIF capsule implosions.
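
    As a sketch of the greedy score-based structure search described here, the snippet below uses pgmpy as an assumed stand-in for the authors' unspecified tooling; the three variables are invented stand-ins for the real 120-run, 16-variable dataset, discretized because score-based search here expects categorical data.

    ```python
    # Sketch: hill-climbing structure search over simulation variables.
    import numpy as np
    import pandas as pd
    from pgmpy.estimators import HillClimbSearch, BicScore

    rng = np.random.default_rng(0)
    energy = rng.normal(size=120)                         # invented variables
    yield_ = 0.8 * energy + rng.normal(scale=0.5, size=120)
    fill = rng.normal(size=120)
    df = pd.DataFrame({"laser_energy": energy,
                       "neutron_yield": yield_,
                       "fill_pressure": fill})
    df = df.apply(lambda s: pd.cut(s, 3, labels=["lo", "mid", "hi"]))

    dag = HillClimbSearch(df).estimate(scoring_method=BicScore(df))
    print(sorted(dag.edges()))                            # candidate links
    ```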

  2. Three-dimensional virtual bronchoscopy using a tablet computer to guide real-time transbronchial needle aspiration.

    PubMed

    Fiorelli, Alfonso; Raucci, Antonio; Cascone, Roberto; Reginelli, Alfonso; Di Natale, Davide; Santoriello, Carlo; Capuozzo, Antonio; Grassi, Roberto; Serra, Nicola; Polverino, Mario; Santini, Mario

    2017-04-01

    We proposed a new virtual bronchoscopy tool to improve the accuracy of traditional transbronchial needle aspiration for mediastinal staging. Chest computed tomographic images (1 mm thickness) were reconstructed with OsiriX software to produce a virtual bronchoscopic simulation. The target adenopathy was identified by measuring its distance from the carina on multiplanar reconstruction images. The static images were uploaded into iMovie software, which produced a virtual bronchoscopic movie from the images; the movie was then transferred to a tablet computer to provide real-time guidance during a biopsy. To test the validity of our tool, we retrospectively divided all consecutive patients undergoing transbronchial needle aspiration into two groups based on whether the biopsy was guided by virtual bronchoscopy (virtual bronchoscopy group) or not (traditional group). The intergroup diagnostic yields were statistically compared. Our analysis included 53 patients in the traditional group and 53 in the virtual bronchoscopy group. The sensitivity, specificity, positive predictive value, negative predictive value and diagnostic accuracy for the traditional group were 66.6%, 100%, 100%, 10.53% and 67.92%, respectively, and for the virtual bronchoscopy group were 84.31%, 100%, 100%, 20% and 84.91%, respectively. The sensitivity (P = 0.011) and diagnostic accuracy (P = 0.011) of sampling the paratracheal station were better for the virtual bronchoscopy group than for the traditional group; no significant differences were found for the subcarinal lymph node. Our tool is simple, economical and available in all centres. It guided needle insertion in real time, thereby improving the accuracy of traditional transbronchial needle aspiration, especially when target lesions are located in a difficult site like the paratracheal station. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  3. The road map towards providing a robust Raman spectroscopy-based cancer diagnostic platform and integration into clinic

    NASA Astrophysics Data System (ADS)

    Lau, Katherine; Isabelle, Martin; Lloyd, Gavin R.; Old, Oliver; Shepherd, Neil; Bell, Ian M.; Dorney, Jennifer; Lewis, Aaran; Gaifulina, Riana; Rodriguez-Justo, Manuel; Kendall, Catherine; Stone, Nicolas; Thomas, Geraint; Reece, David

    2016-03-01

    Despite its demonstrated potential as an accurate cancer diagnostic tool, Raman spectroscopy (RS) is yet to be adopted by the clinic for histopathology reviews. The Stratified Medicine through Advanced Raman Technologies (SMART) consortium has begun to address some of the hurdles in its adoption for cancer diagnosis. These hurdles include awareness and acceptance of the technology, practicality of integration into the histopathology workflow, data reproducibility and availability of transferable models. We have formed a consortium to develop, in a joint effort, optimised protocols for tissue sample preparation, data collection and analysis. These protocols will be supported by the provision of suitable hardware and software tools to allow statistically sound classification models to be built and transferred for use on different systems. In addition, we are building a validated gastrointestinal (GI) cancers model, which can be trialled as part of the histopathology workflow at hospitals, and a classification tool. At the end of the project, we aim to deliver a robust Raman-based diagnostic platform to enable clinical researchers to stage cancer, define tumour margins, build cancer diagnostic models and discover novel disease biomarkers.

  4. Diagnosis and therapy with CRISPR advanced CRISPR based tools for point of care diagnostics and early therapies.

    PubMed

    Uppada, Vanita; Gokara, Mahesh; Rasineni, Girish Kumar

    2018-05-20

    Molecular diagnostics is of critical importance to public health worldwide. It facilitates not only the detection and characterization of diseases, but also the monitoring of drug responses and the identification of genetic modifiers and disease susceptibility. Based upon DNA variation, a wide range of molecular-based tests are available to assess and diagnose diseases. The CRISPR-Cas9 system has recently emerged as a versatile tool for biological and medical research. In this system, a single guide RNA (sgRNA) directs the endonuclease Cas9 to a targeted DNA sequence for site-specific manipulation. As designing CRISPR-guided nucleases can be done easily and relatively quickly, the CRISPR/Cas9 system has evolved into a widely used DNA-editing tool. This technique has led to a large number of gene-editing studies in a variety of organisms. CRISPR/Cas9-mediated diagnosis and therapy has picked up pace due to the specificity and accuracy of CRISPR. The aim is not only to identify specific pathogens, especially viruses, but also to repair disease-causing alleles by changing the DNA sequence at the exact location on the chromosome. At present, PCR-based molecular diagnostic testing predominates; however, alternative technologies aimed at reducing genome complexity without PCR are anticipated to gain momentum in the coming years. Furthermore, the development of integrated chip devices should allow point-of-care testing and facilitate genetic readouts from single cells and molecules. Together with molecular-based therapy, CRISPR-based diagnostic testing will be a revolution in modern health care settings. In this review, we emphasize currently developing diagnostic techniques based upon the CRISPR-Cas approach, along with short insights into its therapeutic usage. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Technical Performance Measurement, Earned Value, and Risk Management: An Integrated Diagnostic Tool for Program Management

    DTIC Science & Technology

    2002-06-01

    …time, the monkey would eventually produce the collected works of Shakespeare. Unfortunately for the analogist, systems, even live ones, do not work… limited his simulated computer monkey to producing, in a single random step, the sentence uttered by Polonius in the play Hamlet: "Methinks it is…

  6. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for the evaluation of RF/microwave-emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution and of high accuracy.

  7. Investigation of Tapered Roller Bearing Damage Detection Using Oil Debris Analysis

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.; Krieder, Gary; Fichter, Thomas

    2006-01-01

    A diagnostic tool was developed for detecting fatigue damage to tapered roller bearings. Tapered roller bearings are used in helicopter transmissions and have potential for use in high-bypass advanced gas turbine aircraft engines. This diagnostic tool was developed and evaluated experimentally by collecting oil debris data from failure progression tests performed by The Timken Company in their Tapered Roller Bearing Health Monitoring Test Rig. Failure progression tests were performed under simulated engine load conditions. Tests were performed on one healthy bearing and three predamaged bearings. During each test, data from an on-line, in-line, inductance-type oil debris sensor were monitored and recorded for the occurrence of debris generated during failure of the bearing. The bearing was removed periodically for inspection throughout the failure progression tests. Results indicate the accumulated oil debris mass is a good predictor of damage on tapered roller bearings. The use of a fuzzy logic model to enable an easily interpreted diagnostic metric was proposed and demonstrated.
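
    A toy version of such a fuzzy-logic damage metric, with invented mass breakpoints (in mg) and rule weights, might look like this:

    ```python
    # Sketch: map accumulated oil-debris mass to an easily interpreted
    # damage metric via triangular fuzzy memberships and defuzzification.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def damage_metric(mass_mg):
        m = np.array([tri(mass_mg, -1.0, 0.0, 40.0),      # healthy
                      tri(mass_mg, 20.0, 60.0, 100.0),    # inspect
                      tri(mass_mg, 80.0, 150.0, 220.0)])  # failing
        w = np.array([0.0, 0.5, 1.0])                     # rule outputs
        return float((w * m).sum() / max(m.sum(), 1e-9))  # weighted average

    print(damage_metric(75.0))   # mid-range mass -> warning-level metric 0.5
    ```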

  8. Real-time diagnostics for a reusable rocket engine

    NASA Technical Reports Server (NTRS)

    Guo, T. H.; Merrill, W.; Duyar, A.

    1992-01-01

    A hierarchical, decentralized diagnostic system is proposed for the Real-Time Diagnostic System component of the Intelligent Control System (ICS) for reusable rocket engines. The proposed diagnostic system has three layers of information processing: condition monitoring, fault mode detection, and expert system diagnostics. The condition monitoring layer is the first level of signal processing. Here, important features of the sensor data are extracted. These processed data are then used by the higher level fault mode detection layer to do preliminary diagnosis on potential faults at the component level. Because of the closely coupled nature of the rocket engine propulsion system components, it is expected that a given engine condition may trigger more than one fault mode detector. Expert knowledge is needed to resolve the conflicting reports from the various failure mode detectors. This is the function of the diagnostic expert layer. Here, the heuristic nature of this decision process makes it desirable to use an expert system approach. Implementation of the real-time diagnostic system described above requires a wide spectrum of information processing capability. Generally, in the condition monitoring layer, fast data processing is often needed for feature extraction and signal conditioning. This is usually followed by some detection logic to determine the selected faults on the component level. Three different techniques are used to attack different fault detection problems in the NASA LeRC ICS testbed simulation. The first technique employed is the neural network application for real-time sensor validation which includes failure detection, isolation, and accommodation. The second approach demonstrated is the model-based fault diagnosis system using on-line parameter identification. Besides these model based diagnostic schemes, there are still many failure modes which need to be diagnosed by the heuristic expert knowledge. The heuristic expert knowledge is implemented using a real-time expert system tool called G2 by Gensym Corp. Finally, the distributed diagnostic system requires another level of intelligence to oversee the fault mode reports generated by component fault detectors. The decision making at this level can best be done using a rule-based expert system. This level of expert knowledge is also implemented using G2.
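
    The condition-monitoring layer's simplest building block is a model-based residual check; a minimal sketch of that step follows, where the model prediction, noise level, and threshold are all invented for illustration:

    ```python
    # Sketch: flag sensor samples whose residual against a model prediction
    # exceeds k standard deviations -- the first step of fault detection.
    import numpy as np

    def out_of_family(measured, predicted, sigma, k=3.0):
        return np.abs(measured - predicted) > k * sigma

    meas = np.array([101.2, 99.8, 140.5])   # e.g., chamber-pressure samples
    pred = np.full(3, 100.0)                # engine-model prediction
    print(out_of_family(meas, pred, sigma=1.5))   # -> [False False  True]
    ```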

  9. The Design of an ITS-Based Business Simulation: A New Epistemology for Learning.

    ERIC Educational Resources Information Center

    Gold, Steven C.

    1998-01-01

    Discusses the design and use of intelligent tutoring systems (ITS) for computerized business simulations. Reviews the use of ITS as an instructional technology; presents a model for ITS-based business simulations; examines the user interface and link between the ITS and simulation; and recommends expert-consultant diagnostic testing, and…

  10. Prostate cancer diagnostics: Clinical challenges and the ongoing need for disruptive and effective diagnostic tools.

    PubMed

    Sharma, Shikha; Zapatero-Rodríguez, Julia; O'Kennedy, Richard

    The increased incidence and the significant health burden associated with carcinoma of the prostate have led to substantial changes in its diagnosis over the past century. Despite technological advancements, the management of prostate cancer has become progressively more complex and controversial for both early and late-stage disease. The limitations and potential harms associated with the use of prostate-specific antigen (PSA) as a diagnostic marker have stimulated significant investigation of numerous novel biomarkers that demonstrate varying capacities to detect prostate cancer and can decrease unnecessary biopsies. However, only a few of these markers have been approved for specific clinical settings while the others have not been adequately validated for use. This review systematically and critically assesses ongoing issues and emerging challenges in the current state of prostate cancer diagnostic tools and the need for disruptive next generation tools based on analysis of combinations of these biomarkers to enhance predictive accuracy which will benefit clinical diagnostics and patient welfare. Copyright © 2016. Published by Elsevier Inc.

  11. Meta-analysis diagnostic accuracy of SNP-based pathogenicity detection tools: a case of UGT1A1 gene mutations.

    PubMed

    Galehdari, Hamid; Saki, Najmaldin; Mohammadi-Asl, Javad; Rahim, Fakher

    2013-01-01

    Crigler-Najjar syndrome (CNS) type I and type II are usually inherited as autosomal recessive conditions that result from mutations in the UGT1A1 gene. The main objective of the present review is to summarize all available evidence on the accuracy of SNP-based pathogenicity detection tools, compared against published clinical results, for predicting which nsSNPs lead to disease, using prediction performance methods. A comprehensive search was performed to find all mutations related to CNS. Database searches included dbSNP, SNPdbe, HGMD, Swissvar, Ensembl, and OMIM. All mutations related to CNS were extracted. The pathogenicity prediction was done using SNP-based pathogenicity detection tools including SIFT, PHD-SNP, PolyPhen2, fathmm, Provean, and Mutpred. Overall, 59 different SNPs related to missense mutations in the UGT1A1 gene were reviewed. Comparing the diagnostic OR, PolyPhen2 and Mutpred had the highest value, 4.983 (95% CI: 1.24 - 20.02) in both, followed by SIFT (diagnostic OR: 3.25, 95% CI: 1.07 - 9.83). The highest MCC among the SNP-based pathogenicity detection tools belonged to SIFT (34.19%), followed by Provean, PolyPhen2, and Mutpred (29.99%, 29.89%, and 29.89%, respectively). Likewise, the highest ACC belonged to SIFT (62.71%), followed by PolyPhen2 and Mutpred (61.02% in both). Our results suggest that some of the well-established SNP-based pathogenicity detection tools can appropriately reflect the role of a disease-associated SNP in both local and global structures.
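
    For clarity, the three summary statistics compared above are simple functions of a 2x2 confusion matrix; a sketch with invented tool-versus-clinic counts:

    ```python
    # Sketch: diagnostic odds ratio (DOR), accuracy (ACC), and Matthews
    # correlation coefficient (MCC) from true/false positives/negatives.
    import math

    def summarize(tp, fp, fn, tn):
        dor = (tp * tn) / max(fp * fn, 1e-9)          # diagnostic odds ratio
        acc = (tp + tn) / (tp + fp + fn + tn)         # accuracy
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        mcc = (tp * tn - fp * fn) / max(denom, 1e-9)  # Matthews correlation
        return dor, acc, mcc

    print(summarize(tp=25, fp=8, fn=14, tn=12))       # invented counts
    ```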

  12. A computational framework for converting textual clinical diagnostic criteria into the quality data model.

    PubMed

    Hong, Na; Li, Dingcheng; Yu, Yue; Xiu, Qiongying; Liu, Hongfang; Jiang, Guoqian

    2016-10-01

    Constructing standard and computable clinical diagnostic criteria is an important but challenging research field in the clinical informatics community. The Quality Data Model (QDM) is emerging as a promising information model for standardizing clinical diagnostic criteria. To develop and evaluate automated methods for converting textual clinical diagnostic criteria into a structured format using QDM. We used a clinical Natural Language Processing (NLP) tool known as cTAKES to detect sentences and annotate events in diagnostic criteria. We developed a rule-based approach for assigning the QDM datatype(s) to an individual criterion, whereas we invoked a machine learning algorithm based on Conditional Random Fields (CRFs) for annotating attributes belonging to each particular QDM datatype. We manually developed an annotated corpus as the gold standard and used standard measures (precision, recall and f-measure) for the performance evaluation. We harvested 267 individual criteria with the datatypes of Symptom and Laboratory Test from 63 textual diagnostic criteria. We manually annotated attributes and values in 142 individual Laboratory Test criteria. The average performance of our rule-based approach was 0.84 precision, 0.86 recall, and 0.85 f-measure; the performance of the CRFs-based classification was 0.95 precision, 0.88 recall and 0.91 f-measure. We also implemented a web-based tool that automatically translates textual Laboratory Test criteria into the QDM XML template format. The results indicated that our approaches leveraging cTAKES and CRFs are effective in facilitating diagnostic criteria annotation and classification. Our NLP-based computational framework is a feasible and useful solution for developing diagnostic criteria representation and computerization. Copyright © 2016 Elsevier Inc. All rights reserved.
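
    A minimal sketch of the CRF component, using sklearn-crfsuite as an assumed stand-in for the authors' implementation; the toy tokens, features, and label set are invented:

    ```python
    # Sketch: tag attribute/value spans in a Laboratory Test criterion.
    import sklearn_crfsuite

    def featurize(tokens):
        return [{"word": t.lower(), "is_digit": t.isdigit()} for t in tokens]

    X_train = [featurize("serum sodium < 135 mmol/L".split())]
    y_train = [["attribute", "attribute", "comparator", "value", "unit"]]

    crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                               max_iterations=50)
    crf.fit(X_train, y_train)
    print(crf.predict([featurize("serum potassium > 5 mmol/L".split())]))
    ```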

  13. Diagnostic methods for atmospheric inversions of long-lived greenhouse gases

    NASA Astrophysics Data System (ADS)

    Michalak, Anna M.; Randazzo, Nina A.; Chevallier, Frédéric

    2017-06-01

    The ability to predict the trajectory of climate change requires a clear understanding of the emissions and uptake (i.e., surface fluxes) of long-lived greenhouse gases (GHGs). Furthermore, the development of climate policies is driving a need to constrain the budgets of anthropogenic GHG emissions. Inverse problems that couple atmospheric observations of GHG concentrations with an atmospheric chemistry and transport model have increasingly been used to gain insights into surface fluxes. Given the inherent technical challenges associated with their solution, it is imperative that objective approaches exist for the evaluation of such inverse problems. Because direct observation of fluxes at compatible spatiotemporal scales is rarely possible, diagnostics tools must rely on indirect measures. Here we review diagnostics that have been implemented in recent studies and discuss their use in informing adjustments to model setup. We group the diagnostics along a continuum starting with those that are most closely related to the scientific question being targeted, and ending with those most closely tied to the statistical and computational setup of the inversion. We thus begin with diagnostics based on assessments against independent information (e.g., unused atmospheric observations, large-scale scientific constraints), followed by statistical diagnostics of inversion results, diagnostics based on sensitivity tests, and analyses of robustness (e.g., tests focusing on the chemistry and transport model, the atmospheric observations, or the statistical and computational framework), and close with the use of synthetic data experiments (i.e., observing system simulation experiments, OSSEs). We find that existing diagnostics provide a crucial toolbox for evaluating and improving flux estimates but, not surprisingly, cannot overcome the fundamental challenges associated with limited atmospheric observations or the lack of direct flux measurements at compatible scales. As atmospheric inversions are increasingly expected to contribute to national reporting of GHG emissions, the need for developing and implementing robust and transparent evaluation approaches will only grow.

  14. Competency-Based Training and Simulation: Making a "Valid" Argument.

    PubMed

    Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M

    2018-02-01

    The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to the absence of awareness rather than an absence of comprehension. The following review article provides the urologic community an updated taxonomy on validity theory as it relates to simulation-based training and assessments and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, for which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.

  15. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    PubMed

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
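
    For intuition, the simplest ancestor of all three models is inverse-variance pooling of logit-transformed sensitivities at a single threshold; the sketch below (with invented study counts) shows that univariate simplification, not the full bivariate or multivariate machinery:

    ```python
    # Sketch: pool logit(sensitivity) across studies at one threshold.
    import numpy as np

    def pooled_sensitivity(true_pos, diseased):
        p = true_pos / diseased
        logit = np.log(p / (1.0 - p))
        var = 1.0 / true_pos + 1.0 / (diseased - true_pos)  # delta method
        w = 1.0 / var                                       # inverse-variance weights
        pooled_logit = (w * logit).sum() / w.sum()
        return 1.0 / (1.0 + np.exp(-pooled_logit))          # back-transform

    tp = np.array([45, 30, 60])        # invented per-study true positives
    n_dis = np.array([50, 40, 75])     # invented diseased counts
    print(pooled_sensitivity(tp, n_dis))
    ```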

  16. Particle Laden Turbulence in a Radiation Environment Using a Portable High Performance Solver Based on the Legion Runtime System

    NASA Astrophysics Data System (ADS)

    Torres, Hilario; Iaccarino, Gianluca

    2017-11-01

    Soleil-X is a multi-physics solver being developed at Stanford University as a part of the Predictive Science Academic Alliance Program II. Our goal is to conduct high fidelity simulations of particle laden turbulent flows in a radiation environment for solar energy receiver applications, as well as to demonstrate our readiness to effectively utilize next generation Exascale machines. The novel aspect of Soleil-X is that it is built upon the Legion runtime system to enable easy portability to different parallel distributed heterogeneous architectures, while also being written entirely in high-level/high-productivity languages (Ebb and Regent). An overview of the Soleil-X software architecture will be given. Results from coupled fluid flow, Lagrangian point particle tracking, and thermal radiation simulations will be presented. Performance diagnostic tools and metrics corresponding to the same cases will also be discussed. US Department of Energy, National Nuclear Security Administration.

  17. The next organizational challenge: finding and addressing diagnostic error.

    PubMed

    Graber, Mark L; Trowbridge, Robert; Myers, Jennifer S; Umscheid, Craig A; Strull, William; Kanter, Michael H

    2014-03-01

    Although health care organizations (HCOs) are intensely focused on improving the safety of health care, efforts to date have almost exclusively targeted treatment-related issues. The literature confirms that the approaches HCOs use to identify adverse medical events are not effective in finding diagnostic errors, so the initial challenge is to identify cases of diagnostic error. WHY HEALTH CARE ORGANIZATIONS NEED TO GET INVOLVED: HCOs are preoccupied with many quality- and safety-related operational and clinical issues, including performance measures. The case for paying attention to diagnostic errors, however, is based on the following four points: (1) diagnostic errors are common and harmful, (2) high-quality health care requires high-quality diagnosis, (3) diagnostic errors are costly, and (4) HCOs are well positioned to lead the way in reducing diagnostic error. FINDING DIAGNOSTIC ERRORS: Current approaches to identifying diagnostic errors, such as occurrence screens, incident reports, autopsy, and peer review, were not designed to detect diagnostic issues (or problems of omission in general) and/or rely on voluntary reporting. The realization that the existing tools are inadequate has spurred efforts to identify novel tools that could be used to discover diagnostic errors or breakdowns in the diagnostic process that are associated with errors. Two new approaches are described in case studies: Maine Medical Center's case-finding of diagnostic errors by facilitating direct reports from physicians, and Kaiser Permanente's electronic health record-based reports that detect process breakdowns in the follow-up of abnormal findings. By raising awareness and implementing targeted programs that address diagnostic error, HCOs may begin to play an important role in addressing the problem of diagnostic error.

  18. A novel approach for baseline correction in 1H-MRS signals based on ensemble empirical mode decomposition.

    PubMed

    Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh

    2014-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals; however, most of them are not able to estimate the baseline in complex, overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). This investigation was applied to both simulated data and in-vivo (1)H-MRS signals of the human brain. The results justify the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
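
    A sketch of the idea using the PyEMD package as an assumed stand-in for the authors' implementation; treating the slowest components as the baseline is a heuristic, and the synthetic spectrum is invented:

    ```python
    # Sketch: estimate a wideband baseline as the lowest-frequency EEMD
    # components and subtract it from the spectrum.
    import numpy as np
    from PyEMD import EEMD

    x = np.linspace(0.0, 1.0, 1024)
    spectrum = np.exp(-((x - 0.4) / 0.01) ** 2) + 0.5 * x ** 2  # peak + baseline

    imfs = EEMD().eemd(spectrum, x)      # ensemble empirical mode decomposition
    baseline = imfs[-2:].sum(axis=0)     # slowest modes ~ macromolecule baseline
    corrected = spectrum - baseline
    ```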

  19. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
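
    As a toy illustration of what a φ(ρz)-based analytical model computes (the Gaussian functional form and all constants below are illustrative, not DTSA-II's actual coefficients):

    ```python
    # Sketch: a toy surface-modified Gaussian phi(rho z) depth distribution
    # and the resulting emitted/generated X-ray absorption factor f(chi).
    import numpy as np

    def phi_rhoz(rz, gamma0=2.0, alpha=4.5, beta=0.4):
        return gamma0 * np.exp(-(alpha * rz) ** 2) * (1.0 - beta * np.exp(-10.0 * rz))

    rz = np.linspace(0.0, 1.0, 200)      # mass depth, toy units
    chi = 2.0                            # absorption parameter, toy value
    dz = rz[1] - rz[0]
    emitted = (phi_rhoz(rz) * np.exp(-chi * rz)).sum() * dz
    generated = phi_rhoz(rz).sum() * dz
    print(f"f(chi) = {emitted / generated:.3f}")
    ```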

  20. Novel diagnostic techniques for celiac disease.

    PubMed

    Kurppa, Kalle; Taavela, Juha; Saavalainen, Päivi; Kaukinen, Katri; Lindfors, Katri

    2016-07-01

    The diagnosis of celiac disease has long been based on the demonstration of gluten-induced small-bowel mucosal damage. However, due to the constantly increasing disease prevalence and limitations in the histology-based criteria there is a pressure towards more serology-based diagnostics. The serological tools are being improved and new non-invasive methods are being developed, but the constantly refined endoscopic and histologic techniques may still prove helpful. Moreover, growing understanding of the disease pathogenesis has led researchers to suggest completely novel approaches to celiac disease diagnostics regardless of disease activity. In this review, we will elucidate the most recent development and possible future innovations in the diagnostic techniques for celiac disease.

  1. Diagnostic accuracy at several reduced radiation dose levels for CT imaging in the diagnosis of appendicitis

    NASA Astrophysics Data System (ADS)

    Zhang, Di; Khatonabadi, Maryam; Kim, Hyun; Jude, Matilda; Zaragoza, Edward; Lee, Margaret; Patel, Maitraya; Poon, Cheryce; Douek, Michael; Andrews-Tang, Denise; Doepke, Laura; McNitt-Gray, Shawn; Cagnon, Chris; DeMarco, John; McNitt-Gray, Michael

    2012-03-01

    Purpose: While several studies have investigated the tradeoffs between radiation dose and image quality (noise) in CT imaging, the purpose of this study was to take this analysis a step further by investigating the tradeoffs between patient radiation dose (including organ dose) and diagnostic accuracy in the diagnosis of appendicitis using CT. Methods: This study was IRB approved and utilized data from 20 patients who underwent clinical CT exams for indications of appendicitis. Medical record review established the true diagnosis of appendicitis, with 10 positives and 10 negatives. A validated software tool used raw projection data from each scan to create simulated images at lower dose levels (70%, 50%, 30%, 20% of original). An observer study was performed with 6 radiologists reviewing each case at each dose level in random order over several sessions. Readers assessed image quality and provided confidence in their diagnosis of appendicitis, each on a 5-point scale. Liver doses for each case and each dose level were estimated using Monte Carlo simulation based methods. Results: Overall diagnostic accuracy was 92%, 93%, 91%, 90% and 90% across the 100%, 70%, 50%, 30% and 20% dose levels, respectively, and 93%, 95%, 88%, 90% and 90% across the 13.5-22 mGy, 9.6-13.5 mGy, 6.4-9.6 mGy, 4-6.4 mGy, and 2-4 mGy liver dose ranges, respectively. Only 4 out of 600 observations were rated "unacceptable" for image quality. Conclusion: The results from this pilot study indicate that diagnostic accuracy does not change dramatically even at significantly reduced radiation dose levels.
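
    The general idea behind such simulated dose reduction is to inject extra quantum noise into the raw projections; a toy sketch follows (the validated tool in the study models scanner-specific noise, which this deliberately does not):

    ```python
    # Sketch: approximate a scan at dose fraction f by adding zero-mean
    # Gaussian noise with the extra quantum variance counts*(1/f - 1),
    # valid where Poisson noise is approximately Gaussian.
    import numpy as np

    def simulate_low_dose(counts, f, rng):
        extra_sd = np.sqrt(counts * (1.0 / f - 1.0))
        return counts + rng.normal(0.0, extra_sd)

    rng = np.random.default_rng(0)
    sino = rng.poisson(1.0e4, size=(64, 128)).astype(float)  # toy sinogram
    for f in (0.7, 0.5, 0.3, 0.2):       # the study's reduced-dose levels
        low = simulate_low_dose(sino, f, rng)
        print(f, round(low.std() / sino.std(), 2))           # noise inflation
    ```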

  2. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  3. An ARM data-oriented diagnostics package to evaluate the climate model simulation

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Xie, S.

    2016-12-01

    A set of diagnostics that utilize long-term high frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation and precipitation in climate models. The diagnostics results are computed and visualized automatically in a python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of mean state and variability of simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, IM release number is: LLNL-ABS-698645.
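
    The mean-state metrics such a package computes reduce to simple model-minus-observation statistics; a minimal sketch with synthetic stand-ins for the simulated and ARM-observed time series (the real package works on site datasets and many variables):

    ```python
    # Sketch: bias, RMSE, and correlation of simulated vs. observed
    # precipitation at a single site, using synthetic daily series.
    import numpy as np
    import xarray as xr

    t = np.arange(365)
    rng = np.random.default_rng(0)
    obs = xr.DataArray(3.0 + np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size),
                       dims="time", name="pr")          # stand-in for ARM obs
    model = obs + 0.4 + rng.normal(0, 0.3, t.size)      # biased model series

    bias = float((model - obs).mean())
    rmse = float(np.sqrt(((model - obs) ** 2).mean()))
    corr = float(xr.corr(model, obs))
    print(f"bias={bias:.3f} rmse={rmse:.3f} r={corr:.3f}")
    ```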

  4. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of the Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment (DME), describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  5. Can time-dependent density functional theory predict intersystem crossing in organic chromophores? A case study on benzo(bis)-X-diazole based donor-acceptor-donor type molecules.

    PubMed

    Tam, Teck Lip Dexter; Lin, Ting Ting; Chua, Ming Hui

    2017-06-21

    Here we utilized new diagnostic tools in time-dependent density functional theory to explain the trend of intersystem crossing in benzo(bis)-X-diazole based donor-acceptor-donor type molecules. These molecules display a wide range of fluorescence quantum yields and triplet yields, making them excellent candidates for testing the validity of these diagnostic tools. We believe that these tools are cost-effective and can be applied to structurally similar organic chromophores to predict/explain the trends of intersystem crossing, and thus fluorescence quantum yields and triplet yields, without the use of complex and expensive multireference configuration interaction or multireference perturbation theory methods.

  6. Addressing the Real-World Challenges in the Development of Propulsion IVHM Technology Experiment (PITEX)

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Chicatelli, Amy; Fulton, Christopher E.; Balaban, Edward; Sweet, Adam; Hayden, Sandra Claire; Bajwa, Anupa

    2005-01-01

    The Propulsion IVHM Technology Experiment (PITEX) has been an on-going research effort conducted over several years. PITEX has developed and applied a model-based diagnostic system for the main propulsion system of the X-34 reusable launch vehicle, a space-launch technology demonstrator. The application was simulation-based, using detailed models of the propulsion subsystem to generate nominal and failure scenarios during captive carry, which is the most safety-critical portion of the X-34 flight. Since no system-level testing of the X-34 Main Propulsion System (MPS) was performed, these simulated data were used to verify and validate the software system. Advanced diagnostic and signal processing algorithms were developed and tested in real time on flight-like hardware. In an attempt to expose potential performance problems, the PITEX algorithms were subjected to numerous real-world effects in the simulated data, including noise, sensor resolution, command/valve talkback information, and nominal build variations. The current research has demonstrated the potential benefits of model-based diagnostics, defined the performance metrics required to evaluate the diagnostic system, and studied the impact of real-world challenges encountered when monitoring propulsion subsystems.
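    The kind of degradation applied to the simulated data can be sketched as follows (a hedged illustration; the signal, noise level, and resolution below are hypothetical, not PITEX's actual models):

        import numpy as np

        rng = np.random.default_rng(42)

        def add_real_world_effects(clean, noise_std=0.5, resolution=0.25, bias=0.0):
            """Degrade an ideal simulated sensor trace with additive noise,
            finite sensor resolution, and a constant nominal build variation."""
            noisy = clean + bias + rng.normal(0.0, noise_std, clean.shape)
            # Quantize to the sensor's least-significant step (resolution)
            return np.round(noisy / resolution) * resolution

        # Hypothetical tank-pressure trace (psia) during captive carry
        t = np.linspace(0.0, 60.0, 601)
        clean_pressure = 45.0 - 0.05 * t
        measured = add_real_world_effects(clean_pressure, bias=0.3)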

  7. Creating an automated chiller fault detection and diagnostics tool using a data fault library.

    PubMed

    Bailey, Margaret B; Kreider, Jan F

    2003-07-01

    Reliable, automated detection and diagnosis of abnormal behavior within vapor compression refrigeration cycle (VCRC) equipment is extremely desirable for equipment owners and operators. The specific type of VCRC equipment studied in this paper is a 70-ton helical rotary, air-cooled chiller. The fault detection and diagnostic (FDD) tool developed as part of this research analyzes chiller operating data and detects faults through recognizing trends or patterns existing within the data. The FDD method incorporates a neural network (NN) classifier to infer the current state given a vector of observables. The method therefore relies upon the availability of empirical data from both normal and faulty operation for training, so a fault library of empirical data was assembled. This paper presents procedures for conducting sophisticated fault experiments on chillers that simulate air-cooled condenser, refrigerant, and oil-related faults. The experimental processes described here are not well documented in the literature and therefore provide the interested reader with a useful guide. In addition, the authors provide evidence, based on both thermodynamics and empirical data analysis, that chiller performance is significantly degraded during fault operation. The chiller's performance degradation is successfully detected and classified by the NN FDD classifier, as discussed in the paper's final section.
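    A minimal sketch of the classifier idea, assuming synthetic stand-in features rather than the actual chiller fault library (scikit-learn's MLPClassifier serves as the NN):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)

        # Hypothetical fault library: rows are vectors of chiller observables
        # (condenser approach temp, suction superheat, oil pressure, COP);
        # labels: 0 normal, 1 condenser fault, 2 refrigerant fault, 3 oil fault.
        X = np.vstack([rng.normal([5.0, 6.0, 30.0, 4.5], 0.5, (200, 4)),
                       rng.normal([9.0, 6.2, 30.0, 3.8], 0.5, (200, 4)),
                       rng.normal([5.2, 10.0, 30.0, 3.9], 0.5, (200, 4)),
                       rng.normal([5.1, 6.1, 22.0, 4.3], 0.5, (200, 4))])
        y = np.repeat([0, 1, 2, 3], 200)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(16,),
                                          max_iter=2000, random_state=0))
        clf.fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))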

  8. Validation of a virtual reality-based simulator for shoulder arthroscopy.

    PubMed

    Rahm, Stefan; Germann, Marco; Hingsammer, Andreas; Wieser, Karl; Gerber, Christian

    2016-05-01

    This study was designed to determine the face and construct validity of a new virtual reality-based shoulder arthroscopy simulator which uses passive haptic feedback. Fifty-one participants, including 25 novices (<20 shoulder arthroscopies) and 26 experts (>100 shoulder arthroscopies), completed two tests: for assessment of face validity, a questionnaire was filled out concerning the quality of the simulated reality and the training potential, using a 7-point Likert scale (range 1-7). Construct validity was tested by comparing simulator metrics (operation time in seconds, camera and grasper path length in centimetres, and grasper openings) between novices' and experts' test results. Overall simulated reality was rated high, with a median value of 5.5 (range 2.8-7) points. Training capacity scored a median value of 5.8 (range 3-7) points. Experts were significantly faster in the diagnostic test, with a median of 91 (range 37-208) s versus 177 (range 81-383) s for novices (p < 0.0001), and in the therapeutic test, with 102 (range 58-283) s versus 229 (range 114-399) s (p < 0.0001). Similar results were seen in the other metric values except the camera pathway in the therapeutic test. The tested simulator achieved high scores in terms of realism and training capability, and it reliably discriminated between novices and experts. Further improvements of the simulator, especially in the field of therapeutic arthroscopy, might improve its value as a training and assessment tool for shoulder arthroscopy skills. Level of evidence: II.

  9. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key element of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high-level, pattern-based, phenomenological approach to bringing rapid simulation…

  10. Polar bear encephalitis: establishment of a comprehensive next-generation pathogen analysis pipeline for captive and free-living wildlife.

    PubMed

    Szentiks, C A; Tsangaras, K; Abendroth, B; Scheuch, M; Stenglein, M D; Wohlsein, P; Heeger, F; Höveler, R; Chen, W; Sun, W; Damiani, A; Nikolin, V; Gruber, A D; Grobbel, M; Kalthoff, D; Höper, D; Czirják, G Á; Derisi, J; Mazzoni, C J; Schüle, A; Aue, A; East, M L; Hofer, H; Beer, M; Osterrieder, N; Greenwood, A D

    2014-05-01

    This report describes three possibly related cases of encephalitis, two of them lethal, in captive polar bears (Ursus maritimus). Standard diagnostic methods failed to identify pathogens in any of these cases. A comprehensive, three-stage diagnostic 'pipeline' employing both standard serological methods and new DNA microarray and next-generation sequencing-based diagnostics was developed, in part as a consequence of this initial failure. This pipeline approach illustrates the strengths, weaknesses and limitations of these tools in determining pathogen-caused deaths in non-model organisms such as wildlife species, and why the use of a limited number of diagnostic tools may fail to uncover important wildlife pathogens.

  11. A spectral Poisson solver for kinetic plasma simulation

    NASA Astrophysics Data System (ADS)

    Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf

    2011-10-01

    Plasma resonance spectroscopy is a well-established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized, geometrically simplified version it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need of introducing a spatial discretization.
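    The expansion step can be sketched briefly, assuming Gaussian units and the standard exterior multipole expansion (scipy's sph_harm supplies the spherical harmonics; the actual solver's truncation and interior handling will differ):

        import numpy as np
        from scipy.special import sph_harm  # Y_lm(m, l, azimuth, polar) in scipy's convention

        def multipole_coeffs(q, r, theta, phi, l_max):
            """q_lm = sum_i q_i * r_i**l * conj(Y_lm(theta_i, phi_i))."""
            return {(l, m): np.sum(q * r**l * np.conj(sph_harm(m, l, phi, theta)))
                    for l in range(l_max + 1) for m in range(-l, l + 1)}

        def potential_exterior(c, r, theta, phi, l_max):
            """Phi(r) = sum_lm 4*pi/(2l+1) * q_lm * Y_lm / r**(l+1)."""
            return sum(4 * np.pi / (2 * l + 1) * c[(l, m)]
                       * sph_harm(m, l, phi, theta) / r**(l + 1)
                       for l in range(l_max + 1) for m in range(-l, l + 1)).real

        rng = np.random.default_rng(1)
        n = 500
        q = rng.choice([-1.0, 1.0], n)              # point "super-particles"
        r = rng.uniform(0.1, 1.0, n)
        theta = np.arccos(rng.uniform(-1, 1, n))    # polar angles
        phi = rng.uniform(0, 2 * np.pi, n)          # azimuthal angles

        c = multipole_coeffs(q, r, theta, phi, l_max=8)
        print("truncated potential:", potential_exterior(c, 2.0, 0.3, 1.0, 8))

        # Check against the direct Coulomb sum at the same exterior point
        def cart(r, th, ph):
            return np.array([r * np.sin(th) * np.cos(ph),
                             r * np.sin(th) * np.sin(ph), r * np.cos(th)])
        x_obs, xs = cart(2.0, 0.3, 1.0), cart(r, theta, phi)
        print("direct sum:", np.sum(q / np.linalg.norm(x_obs[:, None] - xs, axis=0)))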

  12. Case studies on design, simulation and visualization of control and measurement applications using REX control system

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the functionality of the application before deployment, through implementation on a real-time target, to analysis, diagnostics and visualization. Basically it consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and the way a control algorithm is implemented is very similar. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems are supported by REX Control System, for example Windows Embedded, Linux, and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi and others, with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions concerning data archiving, visualization based on HTML5, and communication standards. The paper sums up possibilities for its use in the educational process, focusing on the control of case-study physical models with classical and advanced control algorithms.

  13. Secure web-based invocation of large-scale plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.

    2004-12-01

    We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.

  14. Diagnostic Tests to Support Late-Stage Control Programs for Schistosomiasis and Soil-Transmitted Helminthiases.

    PubMed

    Hawkins, Kenneth R; Cantera, Jason L; Storey, Helen L; Leader, Brandon T; de Los Santos, Tala

    2016-12-01

    Global efforts to address schistosomiasis and soil-transmitted helminthiases (STH) include deworming programs for school-aged children that are made possible by large-scale drug donations. Decisions on these mass drug administration (MDA) programs currently rely on microscopic examination of clinical specimens to determine the presence of parasite eggs. However, microscopy-based methods are not sensitive to the low-intensity infections that characterize populations that have undergone MDA. Thus, there has been increasing recognition within the schistosomiasis and STH communities of the need for improved diagnostic tools to support late-stage control program decisions, such as when to stop or reduce MDA. Failure to adequately address the need for new diagnostics could jeopardize achievement of the 2020 London Declaration goals. In this report, we assess diagnostic needs, survey potential solutions, and determine appropriate strategies to improve diagnostic testing to support control and elimination programs. Based upon literature reviews and previous input from experts in the schistosomiasis and STH communities, we prioritized two diagnostic use cases for further exploration: to inform MDA-stopping decisions and post-MDA surveillance. To this end, PATH has refined target product profiles (TPPs) for schistosomiasis and STH diagnostics that are applicable to these use cases. We evaluated the limitations of current diagnostic methods with regards to these use cases and identified candidate biomarkers and diagnostics with potential application as new tools. Based on this analysis, there is a need to develop antigen-detecting rapid diagnostic tests (RDTs) with simplified, field-deployable sample preparation for schistosomiasis. Additionally, there is a need for diagnostic tests that are more sensitive than the current methods for STH, which may include either a field-deployable molecular test or a simple, low-cost, rapid antigen-detecting test.

  15. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    NASA Astrophysics Data System (ADS)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale, 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
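    The sampling workflow can be sketched with a toy stand-in for the forward model (the bounds, parameters, and response below are hypothetical; the real diagnostic quantities come from full ISSM runs):

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical uncertainty bounds: ice-shelf basal melt scaling (unitless)
        # and a regional bedrock topography error (m)
        lo, hi = [0.5, -100.0], [2.0, 100.0]
        samples = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(n=64), lo, hi)

        def toy_forward_model(melt_scale, bed_err):
            """Stand-in for a 100-year forward run; returns a sea-level
            contribution (mm). The real response comes from ISSM."""
            return 5.0 * melt_scale + 0.01 * abs(bed_err) + 1.0

        slc = np.array([toy_forward_model(m, b) for m, b in samples])
        print(f"spread: {slc.min():.2f}-{slc.max():.2f} mm, std {slc.std():.2f} mm")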

  16. Novel calibration tools and validation concepts for microarray-based platforms used in molecular diagnostics and food safety control.

    PubMed

    Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U

    2015-04-01

    Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes present "Cy3" and "Cy5" analogues with improved photostability, chosen because their spectroscopic properties closely match those of common labels for the green and red channels of microarray scanners. This simple tool allows the performance of the microarray scanner provided with the biochip platform to be assessed and controlled efficiently and regularly, and different scanners to be compared. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.

  17. Circulating microRNA-based screening tool for breast cancer

    PubMed Central

    Boukerroucha, Meriem; Fasquelle, Corinne; Thiry, Jérôme; Bovy, Nicolas; Struman, Ingrid; Geurts, Pierre; Collignon, Joëlle; Schroeder, Hélène; Kridelka, Frédéric; Lifrange, Eric; Jossa, Véronique

    2016-01-01

    Circulating microRNAs (miRNAs) are increasingly recognized as powerful biomarkers in several pathologies, including breast cancer. Here, their plasma levels were measured for use as an alternative screening procedure to mammography for breast cancer diagnosis. A plasma miRNA profile was determined by RT-qPCR in a cohort of 378 women. A diagnostic model was designed based on the expression of 8 miRNAs, measured first in a profiling cohort composed of 41 primary breast cancers and 45 controls, and further validated in diverse cohorts composed of 108 primary breast cancers, 88 controls, 35 breast cancers in remission, 31 metastatic breast cancers and 30 gynecologic tumors. A receiver operating characteristic curve derived from the 8-miRNA random forest-based diagnostic tool exhibited an area under the curve of 0.81. The accuracy of the diagnostic tool remained unchanged considering age and tumor stage. The miRNA signature correctly identified patients with metastatic breast cancer. The use of the classification model on cohorts of patients with breast cancers in remission and with gynecologic cancers yielded prediction distributions similar to that of the control group. Using a multivariate supervised learning method and a set of 8 circulating miRNAs, we designed an accurate, minimally invasive screening tool for breast cancer. PMID:26734993
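    A hedged sketch of the classification step, using synthetic stand-in expression values rather than the study cohort (the 8-feature random forest and the AUC computation mirror the described design):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(3)

        # Stand-in data: expression levels of 8 circulating miRNAs in
        # 45 controls (label 0) and 41 primary breast cancers (label 1)
        X = np.vstack([rng.normal(0.0, 1.0, (45, 8)),
                       rng.normal(0.6, 1.0, (41, 8))])
        y = np.array([0] * 45 + [1] * 41)

        clf = RandomForestClassifier(n_estimators=500, random_state=0)
        proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
        print("cross-validated AUC:", round(roc_auc_score(y, proba), 2))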

  18. A translator and simulator for the Burroughs D machine

    NASA Technical Reports Server (NTRS)

    Roberts, J.

    1972-01-01

    The D Machine is described as a small user microprogrammable computer designed to be a versatile building block for such diverse functions as: disk file controllers, I/O controllers, and emulators. TRANSLANG is an ALGOL-like language, which allows D Machine users to write microprograms in an English-like format as opposed to creating binary bit pattern maps. The TRANSLANG translator parses TRANSLANG programs into D Machine microinstruction bit patterns which can be executed on the D Machine simulator. In addition to simulation and translation, the two programs also offer several debugging tools, such as: a full set of diagnostic error messages, register dumps, simulated memory dumps, traces on instructions and groups of instructions, and breakpoints.

  19. Enhancing 4D PC-MRI in an aortic phantom considering numerical simulations

    NASA Astrophysics Data System (ADS)

    Kratzke, Jonas; Schoch, Nicolai; Weis, Christian; Müller-Eschner, Matthias; Speidel, Stefanie; Farag, Mina; Beller, Carsten J.; Heuveline, Vincent

    2015-03-01

    To date, cardiovascular surgery enables the treatment of a wide range of aortic pathologies. One of the current challenges in this field is the detection of high-risk patients for adverse aortic events, who should be treated electively. Reliable diagnostic parameters, which indicate the urgency of treatment, have to be determined. Functional imaging by means of 4D phase contrast-magnetic resonance imaging (PC-MRI) enables the time-resolved measurement of blood flow velocity in 3D. Applied to aortic phantoms, three-dimensional blood flow properties and their relation to adverse dynamics can be investigated in vitro. Emerging "in silico" methods of numerical simulation can supplement these measurements by computing additional information on crucial parameters. We propose a framework that complements 4D PC-MRI imaging by means of numerical simulation based on the Finite Element Method (FEM). The framework is developed on the basis of a prototypic aortic phantom and validated by 4D PC-MRI measurements of the phantom. Based on physical principles of biomechanics, the derived simulation depicts aortic blood flow properties and characteristics. The framework might help identify factors that induce aortic pathologies such as aortic dilatation or aortic dissection. Critical thresholds of parameters such as the wall shear stress distribution can be evaluated. The combined techniques of 4D PC-MRI and numerical simulation can be used as complementary tools for risk stratification of aortic pathology.

  20. Simulation of computed tomography dose based on voxel phantom

    NASA Astrophysics Data System (ADS)

    Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun

    2017-01-01

    Computed Tomography (CT) is one of the preferred and most valuable imaging tools used in diagnostic radiology, providing high-quality cross-sectional images of the body. It nevertheless delivers higher radiation doses to patients than other radiological procedures. The Monte Carlo method is appropriate for estimating the radiation dose during CT examinations. A simulation of the Computed Tomography Dose Index (CTDI) phantom was developed in this paper. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against the reported measured values. The results demonstrate good agreement between the calculated and measured doses. Across the different CT exam simulations using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, whereas the effective dose values for abdomen and pelvis scans are very close; the lowest effective dose resulted from the head scan. Although the dose in CT is related to various parameters, such as the tube current, exposure time, beam energy, slice thickness and patient size, this study demonstrates that MC simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and it can also be a valuable technique for the design and optimization of the CT x-ray source.

  1. Quality Assurance Assessment of Diagnostic and Radiation Therapy–Simulation CT Image Registration for Head and Neck Radiation Therapy: Anatomic Region of Interest–based Comparison of Rigid and Deformable Algorithms

    PubMed Central

    Mohamed, Abdallah S. R.; Ruangskul, Manee-Naad; Awan, Musaddiq J.; Baron, Charles A.; Kalpathy-Cramer, Jayashree; Castillo, Richard; Castillo, Edward; Guerrero, Thomas M.; Kocak-Uzel, Esengul; Yang, Jinzhong; Court, Laurence E.; Kantor, Michael E.; Gunn, G. Brandon; Colen, Rivka R.; Frank, Steven J.; Garden, Adam S.; Rosenthal, David I.

    2015-01-01

    Purpose To develop a quality assurance (QA) workflow by using a robust, curated, manually segmented anatomic region-of-interest (ROI) library as a benchmark for quantitative assessment of different image registration techniques used for head and neck radiation therapy–simulation computed tomography (CT) with diagnostic CT coregistration. Materials and Methods Radiation therapy–simulation CT images and diagnostic CT images in 20 patients with head and neck squamous cell carcinoma treated with curative-intent intensity-modulated radiation therapy between August 2011 and May 2012 were retrospectively retrieved with institutional review board approval. Sixty-eight reference anatomic ROIs with gross tumor and nodal targets were then manually contoured on images from each examination. Diagnostic CT images were registered with simulation CT images rigidly and by using four deformable image registration (DIR) algorithms: atlas based, B-spline, demons, and optical flow. The resultant deformed ROIs were compared with manually contoured reference ROIs by using similarity coefficient metrics (ie, Dice similarity coefficient) and surface distance metrics (ie, 95% maximum Hausdorff distance). The nonparametric Steel test with control was used to compare different DIR algorithms with rigid image registration (RIR) by using the post hoc Wilcoxon signed-rank test for stratified metric comparison. Results A total of 2720 anatomic and 50 tumor and nodal ROIs were delineated. All DIR algorithms showed improved performance over RIR for anatomic and target ROI conformance, as shown for most comparison metrics (Steel test, P < .008 after Bonferroni correction). The performance of different algorithms varied substantially with stratification by specific anatomic structures or category and simulation CT section thickness. Conclusion Development of a formal ROI-based QA workflow for registration assessment demonstrated improved performance with DIR techniques over RIR. After QA, DIR implementation should be the standard for head and neck diagnostic CT and simulation CT alignment, especially for target delineation. © RSNA, 2014 Online supplemental material is available for this article. PMID:25380454
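    The two benchmark metrics are straightforward to compute once the ROIs are rasterized; a sketch assuming binary masks on a common voxel grid:

        import numpy as np
        from scipy.spatial import cKDTree

        def dice(a, b):
            """Dice similarity coefficient between two binary masks."""
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def hd95(a, b, spacing=(1.0, 1.0, 1.0)):
            """95th-percentile symmetric Hausdorff distance between the voxel
            sets of two binary masks (in mm, given the voxel spacing)."""
            pa, pb = np.argwhere(a) * spacing, np.argwhere(b) * spacing
            d_ab = cKDTree(pb).query(pa)[0]
            d_ba = cKDTree(pa).query(pb)[0]
            return max(np.percentile(d_ab, 95), np.percentile(d_ba, 95))

        # Hypothetical example: a deformed ROI versus the manual reference
        ref = np.zeros((40, 40, 40), bool); ref[10:30, 10:30, 10:30] = True
        dfm = np.zeros_like(ref);           dfm[12:31, 10:30, 11:30] = True
        print("DSC:", round(dice(ref, dfm), 3), "HD95:", round(hd95(ref, dfm), 1))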

  2. Energetic particle instabilities in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Sharapov, S. E.; Alper, B.; Berk, H. L.; Borba, D. N.; Breizman, B. N.; Challis, C. D.; Classen, I. G. J.; Edlund, E. M.; Eriksson, J.; Fasoli, A.; Fredrickson, E. D.; Fu, G. Y.; Garcia-Munoz, M.; Gassner, T.; Ghantous, K.; Goloborodko, V.; Gorelenkov, N. N.; Gryaznevich, M. P.; Hacquin, S.; Heidbrink, W. W.; Hellesen, C.; Kiptily, V. G.; Kramer, G. J.; Lauber, P.; Lilley, M. K.; Lisak, M.; Nabais, F.; Nazikian, R.; Nyqvist, R.; Osakabe, M.; Perez von Thun, C.; Pinches, S. D.; Podesta, M.; Porkolab, M.; Shinohara, K.; Schoepf, K.; Todo, Y.; Toi, K.; Van Zeeland, M. A.; Voitsekhovich, I.; White, R. B.; Yavorskij, V.; TG, ITPA EP; Contributors, JET-EFDA

    2013-10-01

    Remarkable progress has been made in diagnosing energetic particle instabilities on present-day machines and in establishing a theoretical framework for describing them. This overview describes the much improved diagnostics of Alfvén instabilities and modelling tools developed world-wide, and discusses progress in interpreting the observed phenomena. A multi-machine comparison is presented giving information on the performance of both diagnostics and modelling tools for different plasma conditions outlining expectations for ITER based on our present knowledge.

  3. Virtual Reality Compared with Bench-Top Simulation in the Acquisition of Arthroscopic Skill: A Randomized Controlled Trial.

    PubMed

    Banaszek, Daniel; You, Daniel; Chang, Justues; Pickell, Michael; Hesse, Daniel; Hopman, Wilma M; Borschneck, Daniel; Bardana, Davide

    2017-04-05

    Work-hour restrictions as set forth by the Accreditation Council for Graduate Medical Education (ACGME) and other governing bodies have forced training programs to seek out new learning tools to accelerate acquisition of both medical skills and knowledge. As a result, competency-based training has become an important part of residency training. The purpose of this study was to directly compare arthroscopic skill acquisition in high-fidelity and low-fidelity simulator models and to assess skill transfer from either modality to a cadaveric specimen, simulating intraoperative conditions. Forty surgical novices (pre-clerkship-level medical students) voluntarily participated in this trial. Baseline demographic data, as well as data on arthroscopic knowledge and skill, were collected prior to training. Subjects were randomized to 5-week independent training sessions on a high-fidelity virtual reality arthroscopic simulator or on a bench-top arthroscopic setup, or to an untrained control group. Post-training, subjects were asked to perform a diagnostic arthroscopy on both simulators and in a simulated intraoperative environment on a cadaveric knee. A more difficult surprise task was also incorporated to evaluate skill transfer. Subjects were evaluated using the Global Rating Scale (GRS), the 14-point arthroscopic checklist, and a timer to determine procedural efficiency (time per task). Secondary outcomes focused on objective measures of virtual reality simulator motion analysis. Trainees on both simulators demonstrated a significant improvement (p < 0.05) in arthroscopic skills compared with baseline scores and untrained controls, both in and ex vivo. The virtual reality simulation group consistently outperformed the bench-top model group in the diagnostic arthroscopy crossover tests and in the simulated cadaveric setup. Furthermore, the virtual reality group demonstrated superior skill transfer in the surprise skill transfer task. Both high-fidelity and low-fidelity simulation training were effective in arthroscopic skill acquisition. High-fidelity virtual reality simulation was superior to bench-top simulation in the acquisition of arthroscopic skills, both in the laboratory and in vivo. Further clinical investigation is needed to interpret the importance of these results.

  4. Process-Oriented Diagnostics of Tropical Cyclones in Global Climate Models

    NASA Astrophysics Data System (ADS)

    Moon, Y.; Kim, D.; Camargo, S. J.; Wing, A. A.; Sobel, A. H.; Bosilovich, M. G.; Murakami, H.; Reed, K. A.; Vecchi, G. A.; Wehner, M. F.; Zarzycki, C. M.; Zhao, M.

    2017-12-01

    Simulating tropical cyclone (TC) activity with global climate models (GCMs) remains a challenging problem. While some GCMs are able to simulate TC activity that is in good agreement with the observations, many other models exhibit strong biases. Decreasing the horizontal grid spacing of the GCM simulations tends to improve the characteristics of simulated TCs, but this enhancement alone does not necessarily lead to greater skill in simulating TC activity. This study uses process-based diagnostics to identify model characteristics that could explain why some GCM simulations are able to produce more realistic TC activity than others. The diagnostics examine how convection, moisture, clouds and related processes are coupled at individual grid points, which yields useful insight into how convective parameterizations interact with resolved model dynamics. These diagnostics share similarities with those originally developed to examine the Madden-Julian Oscillation in climate models. This study will examine TCs in eight different GCM simulations performed at NOAA/GFDL, NCAR and NASA that have different horizontal resolutions and ocean coupling. Preliminary results suggest that stronger TCs are closely associated with greater rainfall, and thus greater diabatic heating, in the inner-core regions of the storms, which is consistent with previous theoretical studies. Other storm characteristics that can be used to infer why GCM simulations with comparable horizontal grid spacings produce different TC activity will be examined.

  5. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  6. Computer-Based Simulations for Maintenance Training: Current ARI Research. Technical Report 544.

    ERIC Educational Resources Information Center

    Knerr, Bruce W.; And Others

    Three research efforts that used computer-based simulations for maintenance training were in progress when this report was written: Game-Based Learning, which investigated the use of computer-based games to train electronics diagnostic skills; Human Performance in Fault Diagnosis Tasks, which evaluated the use of context-free tasks to train…

  7. On the effect of experimental noise on the classification of biological samples using Raman micro-spectroscopy

    NASA Astrophysics Data System (ADS)

    Barton, Sinead J.; Kerr, Laura T.; Domijan, Katarina; Hennelly, Bryan M.

    2016-04-01

    Raman micro-spectroscopy is an optoelectronic technique that can be used to evaluate the chemical composition of biological samples and has been shown to be a powerful diagnostic tool for the investigation of various cancer related diseases including bladder, breast, and cervical cancer. Raman scattering is an inherently weak process, with approximately 1 in 10^7 photons undergoing scattering, and for this reason noise from the recording system can have a significant impact on the quality of the signal and its suitability for diagnostic classification. The main sources of noise in the recorded signal are shot noise, CCD dark current, and CCD readout noise. Shot noise results from the low signal photon count, while dark current results from thermally generated electrons in the semiconductor pixels. Both of these noise sources are time dependent; readout noise is time independent but is inherent in each individual recording and results in the fundamental limit of measurement, arising from the internal electronics of the camera. In this paper, each of the aforementioned noise sources is analysed in isolation and used to experimentally validate a mathematical model. This model is then used to simulate spectra that might be acquired under various experimental conditions, including the use of different cameras, different source wavelengths, powers, etc. Simulated noisy datasets of T24 and RT112 cell line spectra are generated based on true cell Raman spectrum irradiance values (recorded using very long exposure times) and the addition of simulated noise. These datasets are then input to multivariate classification using Principal Components Analysis and Linear Discriminant Analysis. This method enables an investigation into the effect of noise on the sensitivity and specificity of Raman-based classification under various experimental conditions and using different equipment.
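    A minimal sketch of such a noise model, with a synthetic 'true' spectrum and hypothetical camera parameters:

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_raman_noise(irradiance, t_exp, dark_rate=0.1, read_sigma=6.0):
            """Simulate a recorded spectrum from a 'true' irradiance
            (photoelectrons/s per pixel): shot noise and dark current are
            Poisson processes scaling with exposure time; readout noise
            is a Gaussian added once per readout."""
            shot = rng.poisson(irradiance * t_exp)                    # shot noise
            dark = rng.poisson(dark_rate * t_exp, irradiance.shape)   # dark current
            read = rng.normal(0.0, read_sigma, irradiance.shape)      # readout noise
            return shot + dark + read

        # Synthetic 'true' spectrum: two Lorentzian Raman bands on a baseline
        wn = np.linspace(600, 1800, 1024)   # wavenumber axis (cm^-1)
        true = (40 + 300 / (1 + ((wn - 1004) / 6) ** 2)
                   + 150 / (1 + ((wn - 1450) / 8) ** 2))
        noisy = simulate_raman_noise(true, t_exp=1.0)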

  8. Simulation of 20-channel, 50-GHz, Si3N4-based arrayed waveguide grating applying three different photonics tools

    NASA Astrophysics Data System (ADS)

    Gajdošová, Lenka; Seyringer, Dana

    2017-02-01

    We present the design and simulation of a 20-channel, 50-GHz Si3N4-based AWG using three different commercial photonics tools, namely PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc. and RSoft from Synopsys Inc. For this purpose we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations under the same calculation conditions. These AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The transmission characteristics produced by each simulation were used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results show very good correlation between the tools and are comparable to the design parameters in the AWG-Parameters tool.

  9. Individualized adjustments to reference phantom internal organ dosimetry—scaling factors given knowledge of patient internal anatomy

    NASA Astrophysics Data System (ADS)

    Wayson, Michael B.; Bolch, Wesley E.

    2018-04-01

    Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational phantom based, provides straightforward tools to adjust reference phantom organ dose for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with data validation provided from patient-specific voxel phantom simulations, as well as via comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate for lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology showed gains in accuracy of up to 9% as well when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.
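    For orientation, the classical first-order scaling rules referenced above can be sketched as follows (the exponents are the textbook approximations, electron self-SAF proportional to 1/m and photon self-SAF roughly proportional to m^(-2/3); they are not the sphere-fitted factors derived in this study):

        def scale_self_dose(S_ref, m_ref, m_patient, radiation="photon"):
            """First-order scaling of a reference-phantom self-dose S value
            to a patient-specific organ mass (kg)."""
            ratio = m_ref / m_patient
            if radiation == "electron":
                return S_ref * ratio              # electron SAF ~ 1/m
            return S_ref * ratio ** (2.0 / 3.0)   # photon self-SAF ~ m**(-2/3)

        # Hypothetical example: patient liver 20% heavier than reference
        print(scale_self_dose(1.0e-5, m_ref=1.8, m_patient=2.16))
        print(scale_self_dose(1.0e-5, m_ref=1.8, m_patient=2.16,
                              radiation="electron"))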

  11. Model-based diagnostics for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.

    1991-01-01

    An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online or in a post-analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid and a complement to state estimation.
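    Constraint suspension is easy to illustrate on a toy component network (hypothetical, not the SSF power system): a candidate component is suspended, its output left unconstrained, and it remains a fault suspect only if the rest of the model stays consistent with the observations:

        # Components: input names, output name, behavioral constraint
        COMPONENTS = {
            "ADD1": (("a", "b"), "x", lambda a, b: a + b),
            "ADD2": (("b", "c"), "y", lambda b, c: b + c),
            "MUL1": (("x", "y"), "z", lambda x, y: x * y),
        }

        def consistent(observations, suspended):
            """Propagate values through all non-suspended components and
            check each prediction against the observed values."""
            values = dict(observations)
            changed = True
            while changed:
                changed = False
                for name, (ins, out, fn) in COMPONENTS.items():
                    if name == suspended or not all(i in values for i in ins):
                        continue
                    pred = fn(*(values[i] for i in ins))
                    if out in values:
                        if values[out] != pred:   # contradicts an observation
                            return False
                    else:
                        values[out] = pred
                        changed = True
            return True

        # Observed inputs plus monitored test points; 'x' disagrees with a + b
        obs = {"a": 1, "b": 2, "c": 3, "x": 4, "y": 5, "z": 20}
        print([c for c in COMPONENTS if consistent(obs, suspended=c)])  # ['ADD1']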

  12. Testing the Efficacy of an Education-Based Training Tool to Improve Diagnostic Accuracy of Obsessive-Compulsive Disorder

    ERIC Educational Resources Information Center

    Glazier, Kimberly

    2014-01-01

    Objective: The study aimed to increase awareness of OCD symptomatology among doctoral students in clinical, counseling and school psychology through the implementation of a comprehensive OCD education-based training tool. Method: The program directors across all APA-accredited clinical, counseling, and school psychology doctoral graduate programs…

  13. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  14. Diagnostic Machine Learning Models for Acute Abdominal Pain: Towards an e-Learning Tool for Medical Students.

    PubMed

    Khumrin, Piyapong; Ryan, Anna; Judd, Terry; Verspoor, Karin

    2017-01-01

    Computer-aided learning systems (e-learning systems) can help medical students gain more experience with diagnostic reasoning and decision making. Within this context, providing feedback that matches students' needs (i.e. personalised feedback) is both critical and challenging. In this paper, we describe the development of a machine learning model to support medical students' diagnostic decisions. Machine learning models were trained on 208 clinical cases presenting with abdominal pain, to predict five diagnoses. We assessed which of these models are likely to be most effective for use in an e-learning tool that allows students to interact with a virtual patient. The broader goal is to utilise these models to generate personalised feedback based on the specific patient information requested by students and their active diagnostic hypotheses.

  15. Interactive NMR: A Simulation Based Teaching Tool for Fundamentals to Applications with Tangible Analogies

    NASA Astrophysics Data System (ADS)

    Griesse-Nascimento, Sarah; Bridger, Joshua; Brown, Keith; Westervelt, Robert

    2011-03-01

    Interactive computer simulations increase students' understanding of difficult concepts and their ability to explain complex ideas. We created a module of eight interactive programs and accompanying lesson plans for teaching the fundamental concepts of Nuclear Magnetic Resonance (NMR) and Magnetic Resonance Imaging (MRI) that we call interactive NMR (iNMR). We begin with an analogy between nuclear spins and metronomes to start to build intuition about the dynamics of spins in a magnetic field. We continue to explain T1, T2, and pulse sequences with the metronome analogy. The final three programs are used to introduce and explain the Magnetic Resonance Switch, a recent diagnostic technique based on NMR. A modern, relevant application is useful for generating interest in the topic and confidence in the students' ability to apply their knowledge. The iNMR module was incorporated into a high school AP physics class. In a preliminary evaluation of implementation, students expressed enthusiasm and demonstrated enhanced understanding of the material relative to the previous year. Funded by NSF grant PHY-0646094.
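    The physics behind the metronome analogy reduces to two simple relaxation curves, easily plotted or animated; a sketch with hypothetical tissue constants:

        import numpy as np

        def t1_recovery(t, T1, M0=1.0):
            """Longitudinal recovery after a 90-degree pulse:
            Mz(t) = M0 * (1 - exp(-t/T1))."""
            return M0 * (1.0 - np.exp(-t / T1))

        def t2_decay(t, T2, Mxy0=1.0):
            """Transverse decay: Mxy(t) = Mxy0 * exp(-t/T2)."""
            return Mxy0 * np.exp(-t / T2)

        t = np.linspace(0.0, 3.0, 300)   # seconds
        mz = t1_recovery(t, T1=0.9)      # hypothetical T1 = 0.9 s
        mxy = t2_decay(t, T2=0.1)        # hypothetical T2 = 0.1 s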

  16. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    NASA Astrophysics Data System (ADS)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  17. 3D-Printed Tissue-Mimicking Phantoms for Medical Imaging and Computational Validation Applications

    PubMed Central

    Shahmirzadi, Danial; Li, Ronny X.; Doyle, Barry J.; Konofagou, Elisa E.; McGloughlin, Tim M.

    2014-01-01

    Abdominal aortic aneurysm (AAA) is a permanent, irreversible dilation of the distal region of the aorta. Recent efforts have focused on improved AAA screening and biomechanics-based failure prediction. Idealized and patient-specific AAA phantoms are often employed to validate numerical models and imaging modalities. To produce such phantoms, the investment casting process is frequently used, reconstructing the 3D vessel geometry from computed tomography patient scans. In this study the alternative use of 3D printing to produce phantoms is investigated. The mechanical properties of flexible 3D-printed materials are benchmarked against proven elastomers. We demonstrate the utility of this process with particular application to the emerging imaging modality of ultrasound-based pulse wave imaging, a noninvasive diagnostic methodology being developed to obtain regional vascular wall stiffness properties, differentiating normal and pathologic tissue in vivo. Phantom wall displacements under pulsatile loading conditions were observed, showing good correlation to fluid–structure interaction simulations and regions of peak wall stress predicted by finite element analysis. 3D-printed phantoms show a strong potential to improve medical imaging and computational analysis, potentially helping bridge the gap between experimental and clinical diagnostic tools. PMID:28804733

  18. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of reduced variations limited to cost and customer service or to cost and technical stability was lower (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed.

  19. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  20. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10%-scale transparent model of a space platform such as the International Space Station that operates with water or a matched-index-of-refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full scale, and thereby the time scale of the model to represent 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model, with its complex internal geometry, transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environmental control parameters such as core flows (axial flows) and cross flows (from registers and diffusers), investigate potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments, and assess the overall system performance. The system allows quantitative measurements of contaminants introduced into the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration changes and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next-generation life support systems and configurations. Preliminary experimental and modeling work in this area will be presented. This involves testing of a single inlet-exit model with detailed 3-D flow visualization and quantitative diagnostics, and computational modeling of the system.
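    The quoted 67% and 15% figures follow directly from Reynolds-number matching between the air-filled full-scale volume and the water-filled model; a quick check with approximate room-temperature fluid properties:

        # Reynolds matching: V_model/V_full = (L_full/L_model) * (nu_water/nu_air)
        nu_air = 1.5e-5    # kinematic viscosity of air, m^2/s (approx., 20 C)
        nu_water = 1.0e-6  # kinematic viscosity of water, m^2/s (approx., 20 C)
        scale = 0.1        # L_model / L_full

        v_ratio = (1.0 / scale) * (nu_water / nu_air)   # ~0.67
        t_ratio = scale / v_ratio                       # (L/V)_model / (L/V)_full
        print(f"model velocity:   {v_ratio:.0%} of full scale")   # ~67%
        print(f"model time scale: {t_ratio:.0%} of full scale")   # ~15%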

  1. Assessment of the Asian Neurogastroenterology and Motility Association Chronic Constipation Criteria: An Asian Multicenter Cross-sectional Study

    PubMed Central

    Gwee, Kok-Ann; Bergmans, Paul; Kim, JinYong; Coudsy, Bogdana; Sim, Angelia; Chen, Minhu; Lin, Lin; Hou, Xiaohua; Wang, Huahong; Goh, Khean-Lee; Pangilinan, John A; Kim, Nayoung; des Varannes, Stanislas Bruley

    2017-01-01

    Background/Aims There is a need for a simple and practical tool adapted for the diagnosis of chronic constipation (CC) in the Asian population. This study compared the Asian Neurogastroenterology and Motility Association (ANMA) CC tool and the Rome III criteria for the diagnosis of CC in Asian subjects. Methods This multicenter, cross-sectional study included subjects presenting at outpatient gastrointestinal clinics across Asia. Subjects with CC alert symptoms completed a combination Diagnosis Questionnaire to obtain a diagnosis based on 4 different diagnostic methods: self-defined, investigator's judgment, ANMA CC tool, and Rome III criteria. The primary endpoint was the level of agreement/disagreement between the ANMA CC diagnostic tool and the Rome III criteria for the diagnosis of CC. Results The primary analysis comprised 449 subjects, 414 of whom had a positive diagnosis according to the ANMA CC tool. Rome III positive/ANMA positive and Rome III negative/ANMA negative diagnoses were reported in 76.8% and 7.8% of subjects, respectively, resulting in an overall percentage agreement of 84.6% between the 2 diagnostic methods. The overall percentage disagreement between these 2 diagnostic methods was 15.4%. A higher level of agreement was seen between the ANMA CC tool and the self-defined (374 subjects [90.3%]) or investigator's judgment criteria (388 subjects [93.7%]) compared with the Rome III criteria. Conclusion This study demonstrates that the ANMA CC tool can be a useful tool for Asian patients with CC. PMID:27764907

  2. An Evaluation of the Performance Diagnostic Checklist-Human Services to Assess an Employee Performance Problem in a Center-Based Autism Treatment Facility

    ERIC Educational Resources Information Center

    Ditzian, Kyle; Wilder, David A.; King, Allison; Tanz, Jeanine

    2015-01-01

    The Performance Diagnostic Checklist-Human Services (PDC-HS) is an informant-based tool designed to assess the environmental variables that contribute to poor employee performance in human services settings. We administered the PDC-HS to 3 supervisors to assess the variables that contributed to poor performance by 4 staff members when securing…

  3. CNC machine tool's wear diagnostic and prognostic by using dynamic Bayesian networks

    NASA Astrophysics Data System (ADS)

    Tobon-Mejia, D. A.; Medjaher, K.; Zerhouni, N.

    2012-04-01

    The failure of critical components in industrial systems may have negative consequences for availability, productivity, safety and the environment. To avoid such situations, the health condition of the physical system, and particularly of its critical components, can be continuously assessed by using monitoring data to perform on-line diagnostics and prognostics. The present paper is a contribution to the assessment of the health condition of a computer numerical control (CNC) machine tool and the estimation of its remaining useful life (RUL). The proposed method comprises two main phases: an off-line phase and an on-line phase. During the first phase, the raw data provided by the sensors are processed to extract reliable features, which are used as inputs to learning algorithms that generate models representing the wear behavior of the cutting tool. In the second phase, an assessment phase, the constructed models are exploited to identify the tool's current health state and to predict its RUL together with the associated confidence bounds. The proposed method is applied to a benchmark of condition monitoring data gathered during several cuts of a CNC tool. Simulation results are presented and discussed at the end of the paper.
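    The on-line phase amounts to recursive state estimation over wear states. A dynamic Bayesian network over a single discrete wear variable reduces to a hidden Markov model; the sketch below illustrates filtering plus a crude RUL estimate. The states, matrices and observation sequence are invented, and the paper's actual model is richer:

        import numpy as np

        # A DBN over discrete wear states reduces, in its simplest form, to an
        # HMM forward filter. Everything numeric here is an illustrative placeholder.

        states = ["new", "worn", "failed"]
        T = np.array([[0.95, 0.05, 0.00],   # transition probabilities per cut
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        # Likelihood of a binary "high vibration feature" observation in each state.
        p_obs = np.array([0.05, 0.60, 0.95])

        belief = np.array([1.0, 0.0, 0.0])  # start from a new tool
        for z in [0, 0, 1, 1]:              # hypothetical feature sequence
            belief = belief @ T             # predict
            like = p_obs if z else 1 - p_obs
            belief = belief * like
            belief /= belief.sum()          # update and normalize

        print(dict(zip(states, belief.round(3))))

        # Crude RUL estimate: expected number of cuts before absorption in "failed",
        # from the fundamental matrix of the transient states.
        Q = T[:2, :2]
        N = np.linalg.inv(np.eye(2) - Q)    # expected visits to transient states
        rul = (belief[:2] @ N.sum(axis=1)) / max(belief[:2].sum(), 1e-12)
        print(f"expected cuts to failure: {rul:.1f}")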

  4. Preoperative planning of thoracic surgery with use of three-dimensional reconstruction, rapid prototyping, simulation and virtual navigation.

    PubMed

    Heuts, Samuel; Sardari Nia, Peyman; Maessen, Jos G

    2016-01-01

    Over the past decades, surgery has become more complex due to the increasing age of the patient population referred for thoracic surgery, more complex pathology, and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Preoperative marking and staining of lesions are already widely accepted methods of preoperative planning in thoracic surgery, whereas three-dimensional (3D) image reconstruction, virtual simulation and rapid prototyping (RP) are still in the development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims at graphically presenting and summarizing these new diagnostic and therapeutic tools.

  5. Malaria rapid diagnostic tests in elimination settings—can they find the last parasite?

    PubMed Central

    McMorrow, M. L.; Aidoo, M.; Kachur, S. P.

    2016-01-01

    Rapid diagnostic tests (RDTs) for malaria have improved the availability of parasite-based diagnosis throughout the malaria-endemic world. Accurate malaria diagnosis is essential for malaria case management, surveillance, and elimination. RDTs are inexpensive, simple to perform, and provide results in 15–20 min. Despite high sensitivity and specificity for Plasmodium falciparum infections, RDTs have several limitations that may reduce their utility in low-transmission settings: they do not reliably detect low-density parasitaemia (≤200 parasites/μL), many are less sensitive for Plasmodium vivax infections, and their ability to detect Plasmodium ovale and Plasmodium malariae is unknown. Therefore, in elimination settings, alternative tools with higher sensitivity for low-density infections (e.g. nucleic acid-based tests) are required to complement field diagnostics, and new highly sensitive and specific field-appropriate tests must be developed to ensure accurate diagnosis of symptomatic and asymptomatic carriers. As malaria transmission declines, the proportion of low-density infections among symptomatic and asymptomatic persons is likely to increase, which may limit the utility of RDTs. Monitoring malaria in elimination settings will probably depend on the use of more than one diagnostic tool in clinical-care and surveillance activities, and the combination of tools utilized will need to be informed by regular monitoring of test performance through effective quality assurance. PMID:21910780

  6. Impact of Diagnosticity on the Adequacy of Models for Cognitive Diagnosis under a Linear Attribute Structure: A Simulation Study

    ERIC Educational Resources Information Center

    de La Torre, Jimmy; Karelitz, Tzur M.

    2009-01-01

    Compared to unidimensional item response models (IRMs), cognitive diagnostic models (CDMs) based on latent classes represent examinees' knowledge and item requirements using discrete structures. This study systematically examines the viability of retrofitting CDMs to IRM-based data with a linear attribute structure. The study utilizes a procedure…

  7. Exploring Undergraduates' Understanding of Photosynthesis Using Diagnostic Question Clusters

    ERIC Educational Resources Information Center

    Parker, Joyce M.; Anderson, Charles W.; Heidemann, Merle; Merrill, John; Merritt, Brett; Richmond, Gail; Urban-Lurain, Mark

    2012-01-01

    We present a diagnostic question cluster (DQC) that assesses undergraduates' thinking about photosynthesis. This assessment tool is not designed to identify individual misconceptions. Rather, it is focused on students' abilities to apply basic concepts about photosynthesis by reasoning with a coordinated set of practices based on a few scientific…

  8. Screening and structure-based modeling of T-cell epitopes of Nipah virus proteome: an immunoinformatic approach for designing peptide-based vaccine.

    PubMed

    Kamthania, Mohit; Sharma, D K

    2015-12-01

    Identification of Nipah virus (NiV) T-cell-specific antigens is urgently needed for appropriate diagnostics and vaccination. In the present study, T-cell epitopes of the Nipah virus antigenic proteins (nucleocapsid, phosphoprotein, matrix, fusion, glycoprotein, L, W, V and C proteins) were predicted and modeled, followed by binding simulation studies of the highest-scoring epitopes with their corresponding MHC class I alleles. The immunoinformatic tool ProPred1 was used to predict promiscuous MHC class I epitopes of the viral antigenic proteins. Molecular modeling of the epitopes was performed with the PEPstr server, and allele structures were predicted with MODELLER 9.10. Molecular dynamics (MD) simulation studies were performed through the NAMD graphical user interface embedded in Visual Molecular Dynamics (VMD). The epitopes VPATNSPEL, NPTAVPFTL and LLFVFGPNL of the nucleocapsid, V protein and fusion protein have considerable binding energies and scores with the HLA-B7, HLA-B*2705 and HLA-A2 MHC class I alleles, respectively. These three predicted peptides have high potential to induce T-cell-mediated immune responses and are expected to be useful in designing epitope-based vaccines against Nipah virus after further validation by wet-laboratory studies.

  9. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data pass through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  10. A dynamic model of Flo-Tron flowmeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cichy, M.; Bossio, R.B.

    1984-08-01

    The optimization of diagnostic equipment for reciprocating engines, both internal and external combustion, is strongly affected by the suitability of simulation models. One of the most attractive and difficult diagnostic aspects concerns the measurement of instantaneous fuel mass flow rate. A new dynamic simulation model of the Flo-Tron flowmeter, whose working principle is based on the hydraulic Wheatstone bridge, is presented, including the state-space equations and the bond-graph method.
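    The abstract mentions state-space equations; as a generic illustration (not the paper's actual Flo-Tron model), a lumped-parameter flowmeter described by x' = Ax + Bu can be integrated as follows. The matrices here are second-order placeholders:

        import numpy as np

        # Generic linear state-space integration, x' = A x + B u, y = C x, as one
        # would use for a lumped-parameter flowmeter model. A, B, C below are
        # illustrative placeholders, not the Flo-Tron parameters.

        A = np.array([[0.0, 1.0], [-40.0, -4.0]])  # stable second-order dynamics
        B = np.array([[0.0], [40.0]])
        C = np.array([[1.0, 0.0]])

        dt, t_end = 1e-3, 2.0
        x = np.zeros((2, 1))
        ys = []
        for k in range(int(t_end / dt)):
            u = np.array([[1.0]])          # step input (e.g. imposed flow demand)
            x = x + dt * (A @ x + B @ u)   # forward-Euler step
            ys.append(float(C @ x))

        print(f"output approaches steady state: {ys[-1]:.3f}")  # ~1.0 for this A, B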

  11. Statistical Methods for Assessments in Simulations and Serious Games. Research Report. ETS RR-14-12

    ERIC Educational Resources Information Center

    Fu, Jianbin; Zapata, Diego; Mavronikolas, Elia

    2014-01-01

    Simulation or game-based assessments produce outcome data and process data. In this article, some statistical models that can potentially be used to analyze data from simulation or game-based assessments are introduced. Specifically, cognitive diagnostic models that can be used to estimate latent skills from outcome data so as to scale these…

  12. Visible Human Project

    MedlinePlus

    [Web-page excerpt] The Visible Human data sets are "... used for teaching, modeling radiation absorption and therapy, equipment design, surgical simulation, and simulation of diagnostic procedures ..." (from "The Visible Human Project" by Michael J. Ackerman, Ph.D.). The page also lists projects and applications based on the Visible Human Data Set.

  13. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  14. A Research Agenda for Malaria Eradication: Diagnoses and Diagnostics

    PubMed Central

    2011-01-01

    Many of malaria's signs and symptoms are indistinguishable from those of other febrile diseases. Detection of the presence of Plasmodium parasites is essential, therefore, to guide case management. Improved diagnostic tools are required to enable targeted treatment of infected individuals. In addition, field-ready diagnostic tools for mass screening and surveillance that can detect asymptomatic infections of very low parasite densities are needed to monitor transmission reduction and ensure elimination. Antibody-based tests for infection and novel methods based on biomarkers need further development and validation, as do methods for the detection and treatment of Plasmodium vivax. Current rapid diagnostic tests targeting P. vivax are generally less effective than those targeting Plasmodium falciparum. Moreover, because current drugs for radical cure may cause serious side effects in patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency, more information is needed on the distribution of G6PD-deficiency variants as well as tests to identify at-risk individuals. Finally, in an environment of very low or absent malaria transmission, sustaining interest in elimination and maintaining resources will become increasingly important. Thus, research is required into the context in which malaria diagnostic tests are used, into diagnostics for other febrile diseases, and into the integration of these tests into health systems. PMID:21311583

  15. Paper-based sample-to-answer molecular diagnostic platform for point-of-care diagnostics.

    PubMed

    Choi, Jane Ru; Tang, Ruihua; Wang, ShuQi; Wan Abas, Wan Abu Bakar; Pingguan-Murphy, Belinda; Xu, Feng

    2015-12-15

    Nucleic acid testing (NAT), as a molecular diagnostic technique including nucleic acid extraction, amplification and detection, plays a fundamental role in medical diagnosis and timely medical treatment. However, current NAT technologies require relatively high-end instrumentation and skilled personnel, and are time-consuming. These drawbacks make conventional NAT impractical in many resource-limited, disease-endemic settings, leading to an urgent need for a fast and portable NAT diagnostic tool. Paper-based devices are typically robust, cost-effective and user-friendly, holding great potential for NAT at the point of care. In view of the escalating demand for low-cost diagnostic devices, we highlight the beneficial use of paper as a platform for NAT, the current state of its development, and the existing challenges preventing its widespread use. We suggest a strategy of integrating all three steps of NAT into one single paper-based sample-to-answer diagnostic device for rapid medical diagnostics in the near future.

  16. Developing a Multi-Dimensional Early Elementary Mathematics Screener and Diagnostic Tool: The Primary Mathematics Assessment.

    PubMed

    Brendefur, Jonathan L; Johnson, Evelyn S; Thiede, Keith W; Strother, Sam; Severson, Herb H

    2018-01-01

    There is a critical need to identify primary level students experiencing difficulties in mathematics to provide immediate and targeted instruction that remediates their deficits. However, most early math screening instruments focus only on the concept of number, resulting in inadequate and incomplete information for teachers to design intervention efforts. We propose a mathematics assessment that screens and provides diagnostic information in six domains that are important to building a strong foundation in mathematics. This article describes the conceptual framework and psychometric qualities of a web-based assessment tool, the Primary Math Assessment (PMA). The PMA includes a screener to identify students at risk for poor math outcomes and a diagnostic tool to provide a more in-depth profile of children's specific strengths and weaknesses in mathematics. The PMA allows teachers and school personnel to make better instructional decisions by providing more targeted analyses.

  17. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases that lead a radiologist to an incorrect diagnosis despite correctly recognized abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process: they modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based, open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen, serving both as a checklist and as input controls. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions that closely couple the mathematics of conditional probability in Bayesian networks with clinical practice.
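    The core update such a tool performs is Bayes' rule over candidate diagnoses as features are entered. A minimal naive-Bayes sketch, with conditional independence assumed and all diagnoses, priors and likelihoods invented for illustration:

        # Minimal Bayesian update over candidate diagnoses as imaging features are
        # entered. Priors and conditional probabilities below are illustrative only.

        priors = {"glioma": 0.30, "metastasis": 0.50, "abscess": 0.20}
        # P(feature present | diagnosis), assuming conditional independence.
        likelihood = {
            "ring_enhancement": {"glioma": 0.60, "metastasis": 0.70, "abscess": 0.90},
            "restricted_diffusion": {"glioma": 0.20, "metastasis": 0.15, "abscess": 0.85},
        }

        def update(post, feature, present=True):
            """Multiply in one feature's likelihood and renormalize."""
            for dx in post:
                p = likelihood[feature][dx]
                post[dx] *= p if present else (1.0 - p)
            z = sum(post.values())
            return {dx: v / z for dx, v in post.items()}

        post = dict(priors)
        for feat in ["ring_enhancement", "restricted_diffusion"]:
            post = update(post, feat, present=True)
            print(feat, {dx: round(v, 3) for dx, v in post.items()})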

  18. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    NASA Astrophysics Data System (ADS)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities, thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  19. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  20. Modeling experimental plasma diagnostics in the FLASH code: Thomson scattering

    NASA Astrophysics Data System (ADS)

    Weide, Klaus; Flocke, Norbert; Feister, Scott; Tzeferacos, Petros; Lamb, Donald

    2017-10-01

    Spectral analysis of the Thomson scattering of laser light sent into a plasma provides an experimental method to quantify plasma properties in laser-driven plasma experiments. We have implemented such a synthetic Thomson scattering diagnostic unit in the FLASH code, to emulate the probe-laser propagation, scattering and spectral detection. User-defined laser rays propagate into the FLASH simulation region and experience scattering (change in direction and frequency) based on plasma parameters. After scattering, the rays propagate out of the interaction region and are spectrally characterized. The diagnostic unit can be used either during a physics simulation or in post-processing of simulation results. FLASH is publicly available at flash.uchicago.edu. U.S. DOE NNSA, U.S. DOE NNSA ASC, U.S. DOE Office of Science and NSF.
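    In the non-collective regime, the scattered spectrum from a Maxwellian plasma is approximately a Doppler-broadened Gaussian around the probe wavelength; a toy version of what a synthetic diagnostic computes per ray (the scattering geometry and plasma values below are invented, and the FLASH unit evaluates this along traced rays with local plasma parameters):

        import numpy as np

        # Toy non-collective Thomson scattering spectrum: Doppler-broadened
        # Gaussian around the probe wavelength. Parameters are invented.

        me_c2_eV = 511e3          # electron rest energy, eV
        lam0 = 532e-9             # probe laser wavelength, m
        Te_eV = 200.0             # local electron temperature, eV
        theta = np.deg2rad(90.0)  # scattering angle

        # Gaussian sigma: 2 sin(theta/2) * lam0 * sqrt(Te / (me c^2))
        sigma = 2 * np.sin(theta / 2) * lam0 * np.sqrt(Te_eV / me_c2_eV)

        lam = np.linspace(lam0 - 5 * sigma, lam0 + 5 * sigma, 501)
        spectrum = np.exp(-((lam - lam0) ** 2) / (2 * sigma ** 2))
        spectrum /= spectrum.sum()  # normalized synthetic spectrum

        print(f"Doppler width (sigma): {sigma*1e9:.2f} nm")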

  1. Integrating Oil Debris and Vibration Measurements for Intelligent Machine Health Monitoring. Degree awarded by Toledo Univ., May 2002

    NASA Technical Reports Server (NTRS)

    Dempsey, Paula J.

    2003-01-01

    A diagnostic tool for detecting damage to gears was developed. Two different measurement technologies, oil debris analysis and vibration, were integrated into a health monitoring system for detecting surface fatigue pitting damage on gears. This integrated system showed improved detection and decision-making capabilities as compared with the individual measurement technologies. The diagnostic tool was developed and evaluated experimentally by collecting vibration and oil debris data from fatigue tests performed in the NASA Glenn Spur Gear Fatigue Rig. An inductance-type oil debris sensor was selected for the oil analysis measurement technology. Because gear damage data for this type of sensor were limited to data collected in the NASA Glenn test rigs, the analysis included development of a parameter for detecting gear pitting damage with this sensor. The vibration data were used to compute two previously published gear vibration diagnostic algorithms, selected for their maturity and documented success in detecting gear damage. Oil debris and vibration features were then developed using fuzzy-logic analysis techniques and input into a multi-sensor data fusion process. Results show that combining the vibration and oil debris measurement technologies improves the detection of pitting damage on spur gears. As a result of this research, the new diagnostic tool has significantly improved detection of gear damage in the NASA Glenn Spur Gear Fatigue Rigs. This research also produced several other findings that will improve the development of future health monitoring systems: oil debris analysis was found to be more reliable than vibration analysis for detecting pitting fatigue failure of gears and is capable of indicating damage progression; some vibration algorithms are as sensitive to operational effects as they are to damage; and clear threshold limits must be established for diagnostic tools. Based on additional experimental data obtained from the NASA Glenn Spiral Bevel Gear Fatigue Rig, the methodology developed in this study can be successfully implemented on other geared systems.
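    The fusion step described, fuzzy membership per sensor followed by combination, can be sketched as follows. The membership shapes, breakpoints and combination rule are invented placeholders, not the study's actual features:

        # Sketch of the described fusion idea: map each measurement to a fuzzy
        # "damage" membership, then fuse. All breakpoints are invented.

        def ramp(x, lo, hi):
            """Piecewise-linear membership: 0 below lo, 1 above hi."""
            if x <= lo:
                return 0.0
            if x >= hi:
                return 1.0
            return (x - lo) / (hi - lo)

        def fuse(oil_debris_mg, vib_metric):
            m_oil = ramp(oil_debris_mg, 20.0, 80.0)   # accumulated debris mass, mg
            m_vib = ramp(vib_metric, 1.5, 4.0)        # normalized vibration index
            # Probabilistic OR (algebraic sum): flag damage when either source
            # is confident, with agreement reinforcing the decision.
            return m_oil + m_vib - m_oil * m_vib

        for oil, vib in [(10, 1.0), (50, 2.0), (90, 3.8)]:
            print(oil, vib, round(fuse(oil, vib), 2))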

  2. Imperfect practice makes perfect: error management training improves transfer of learning.

    PubMed

    Dyre, Liv; Tabor, Ann; Ringsted, Charlotte; Tolsgaard, Martin G

    2017-02-01

    Traditionally, trainees are instructed to practise with as few errors as possible during simulation-based training. However, transfer of learning may improve if trainees are encouraged to commit errors. The aim of this study was to assess the effects of error management instructions compared with error avoidance instructions during simulation-based ultrasound training. Medical students (n = 60) with no prior ultrasound experience were randomised to error management training (EMT) (n = 32) or error avoidance training (EAT) (n = 28). The EMT group was instructed to deliberately make errors during training. The EAT group was instructed to follow the simulator instructions and to commit as few errors as possible. Training consisted of 3 hours of simulation-based ultrasound training focusing on fetal weight estimation. Simulation-based tests were administered before and after training. Transfer tests were performed on real patients 7-10 days after the completion of training. Primary outcomes were transfer test performance scores and diagnostic accuracy. Secondary outcomes included performance scores and diagnostic accuracy during the simulation-based pre- and post-tests. A total of 56 participants completed the study. On the transfer test, EMT group participants attained higher performance scores (mean score: 67.7%, 95% confidence interval [CI]: 62.4-72.9%) than EAT group members (mean score: 51.7%, 95% CI: 45.8-57.6%) (p < 0.001; Cohen's d = 1.1, 95% CI: 0.5-1.7). There was a moderate improvement in diagnostic accuracy in the EMT group compared with the EAT group (16.7%, 95% CI: 10.2-23.3% weight deviation versus 26.6%, 95% CI: 16.5-36.7% weight deviation [p = 0.082; Cohen's d = 0.46, 95% CI: -0.06 to 1.0]). No significant interaction effects between group and performance improvements between the pre- and post-tests were found in either performance scores (p = 0.25) or diagnostic accuracy (p = 0.09). The provision of error management instructions during simulation-based training improves the transfer of learning to the clinical setting compared with error avoidance instructions. Rather than teaching to avoid errors, the use of errors for learning should be explored further in medical education theory and practice.

  3. A simulation-optimization-based decision support tool for mitigating traffic congestion.

    DOT National Transportation Integrated Search

    2009-12-01

    "Traffic congestion has grown considerably in the United States over the past twenty years. In this paper, we develop : a robust decision support tool based on simulation optimization to evaluate and recommend congestion-mitigation : strategies to tr...

  4. Model-based development of a fault signature matrix to improve solid oxide fuel cell systems on-site diagnosis

    NASA Astrophysics Data System (ADS)

    Polverino, Pierpaolo; Pianese, Cesare; Sorrentino, Marco; Marra, Dario

    2015-04-01

    The paper focuses on the design of a procedure for the development of an on-field diagnostic algorithm for solid oxide fuel cell (SOFC) systems. The diagnosis design phase relies on an in-depth analysis of the mutual interactions among all system components, exploiting physical knowledge of the SOFC system as a whole. This phase consists of a Fault Tree Analysis (FTA), which identifies the correlations among possible faults and their corresponding symptoms at the system component level. The main outcome of the FTA is an inferential isolation tool, the Fault Signature Matrix (FSM), which uniquely links faults to the symptoms detected during system monitoring. In this work the FTA is taken as a starting point for developing an improved FSM. A fault-to-symptom dependency study is performed by means of a model-based investigation: a dynamic model, previously developed by the authors, is exploited to simulate the system under faulty conditions. Five faults are simulated, one occurring in the stack and four at the balance-of-plant (BOP) level. Moreover, the robustness of the FSM design is increased by exploiting symptom thresholds defined for the investigation of the quantitative effects of the simulated faults on the affected variables.
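    A fault signature matrix is essentially a boolean table mapping each fault to an expected symptom pattern; isolation then reduces to matching the observed symptom vector against the stored signatures. A minimal sketch with invented faults, symptoms and signatures (not the paper's actual FSM):

        # Minimal fault signature matrix (FSM) isolation. Faults, symptoms and
        # signature patterns are invented placeholders.

        symptoms = ["stack_T_high", "dP_air_high", "V_stack_low", "fuel_flow_low"]
        fsm = {
            "air_blower_degradation": [1, 1, 0, 0],
            "fuel_leakage":           [0, 0, 1, 1],
            "stack_degradation":      [1, 0, 1, 0],
        }

        def isolate(observed):
            """Return faults whose signature matches the observed symptom vector."""
            return [f for f, sig in fsm.items() if sig == observed]

        # Symptom vectors from monitoring: thresholds turn residuals into 0/1 flags.
        print(isolate([1, 1, 0, 0]))  # -> ['air_blower_degradation']
        print(isolate([1, 0, 1, 0]))  # -> ['stack_degradation']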

  5. A gamma beam profile imager for ELI-NP Gamma Beam System

    NASA Astrophysics Data System (ADS)

    Cardarelli, P.; Paternò, G.; Di Domenico, G.; Consoli, E.; Marziani, M.; Andreotti, M.; Evangelisti, F.; Squerzanti, S.; Gambaccini, M.; Albergo, S.; Cappello, G.; Tricomi, A.; Veltri, M.; Adriani, O.; Borgheresi, R.; Graziani, G.; Passaleva, G.; Serban, A.; Starodubtsev, O.; Variola, A.; Palumbo, L.

    2018-06-01

    The Gamma Beam System of ELI-Nuclear Physics is a high brilliance monochromatic gamma source based on the inverse Compton interaction between an intense high power laser and a bright electron beam with tunable energy. The source, currently being assembled in Magurele (Romania), is designed to provide a beam with tunable average energy ranging from 0.2 to 19.5 MeV, rms energy bandwidth down to 0.5% and flux of about 10^8 photons/s. The system includes a set of detectors for the diagnostic and complete characterization of the gamma beam. To evaluate the spatial distribution of the beam a gamma beam profile imager is required. For this purpose, a detector based on a scintillator target coupled to a CCD camera was designed and a prototype was tested at INFN-Ferrara laboratories. A set of analytical calculations and Monte Carlo simulations were carried out to optimize the imager design and evaluate the performance expected with ELI-NP gamma beam. In this work the design of the imager is described in detail, as well as the simulation tools used and the results obtained. The simulation parameters were tuned and cross-checked with the experimental measurements carried out on the assembled prototype using the beam from an x-ray tube.

  6. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acun, Bilge; Jain, Nikhil; Bhatele, Abhinav

    2015-03-23

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  7. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Nikhil; Bhatele, Abhinav; Acun, Bilge

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  8. Diagnostic performance of a Lattice Boltzmann-based method for CT-based fractional flow reserve.

    PubMed

    Giannopoulos, Andreas A; Tang, Anji; Ge, Yin; Cheezum, Michael K; Steigner, Michael L; Fujimoto, Shinichiro; Kumamaru, Kanako K; Chiappino, Dante; Della Latta, Daniele; Berti, Sergio; Chiappino, Sara; Rybicki, Frank J; Melchionna, Simone; Mitsouras, Dimitrios

    2018-02-20

    Fractional flow reserve (FFR) estimated from coronary computed tomography angiography (CT-FFR) offers non-invasive detection of lesion-specific ischaemia. We aimed to develop and validate a fast CT-FFR algorithm utilising the Lattice Boltzmann method for blood flow simulation (LBM CT-FFR). Sixty-four patients with clinically indicated CTA and invasive FFR measurement from three institutions were retrospectively analysed. CT-FFR was performed using an onsite tool interfacing with a commercial Lattice Boltzmann fluid dynamics cloud-based platform. Diagnostic accuracy of LBM CT-FFR ≤0.8 and percent diameter stenosis >50% by CTA to detect invasive FFR ≤0.8 were compared using area under the receiver operating characteristic curve (AUC). Sixty patients successfully underwent LBM CT-FFR analysis; 29 of 73 lesions in 69 vessels had invasive FFR ≤0.8. Total time to perform LBM CT-FFR was 40±10 min. Compared to invasive FFR, LBM CT-FFR had good correlation (r=0.64), small bias (0.009) and good limits of agreement (-0.223 to 0.206). The AUC of LBM CT-FFR (AUC=0.894, 95% confidence interval [CI]: 0.792-0.996) was significantly higher than CTA (AUC=0.685, 95% CI: 0.576-0.794) to detect FFR ≤0.8 (p=0.0021). Per-lesion specificity, sensitivity, and accuracy of LBM CT-FFR were 97.7%, 79.3%, and 90.4%, respectively. LBM CT-FFR has very good diagnostic accuracy to detect lesion-specific ischaemia (FFR ≤0.8) and can be performed in less than one hour.
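    The per-lesion metrics quoted above derive from a 2x2 confusion matrix against the invasive FFR ≤0.8 reference. A minimal sketch; the counts are back-calculated from the reported 73 lesions (29 ischaemic) and the quoted metrics, so they are an inference rather than data taken directly from the paper:

        # Sensitivity / specificity / accuracy from a 2x2 confusion matrix, with
        # invasive FFR <= 0.8 as the reference standard. Counts back-calculated
        # from the abstract's totals and percentages (an assumption).

        tp, fn = 23, 6    # ischaemic lesions: detected / missed by CT-FFR
        tn, fp = 43, 1    # non-ischaemic lesions: ruled out / false alarms

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + tn + fp + fn)

        print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
              f"accuracy {accuracy:.1%}")   # ~79.3%, ~97.7%, ~90.4%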

  9. Study of Kapton Degradation under Simulated Shuttle Environment

    NASA Technical Reports Server (NTRS)

    Eck, T. G.; Hoffman, R. W.

    1985-01-01

    Weight loss and severe degradation of the surface of Kapton that occurs in low Earth orbit is studied. Atomic oxygen, the major ambient species at low Earth altitude and incident with approximately 5 eV energy in ram conditions, is the primary suspect, but a thorough study of oxygen-Kapton interactions has not yet been carried out. A low-energy ion source is used to simulate the shuttle low Earth orbit environment. This source, together with diagnostic tools including surface analysis and mass spectroscopic capability, is being used to carry out experiments from which quantum yields may be obtained.

  10. Aperture tolerances for neutron-imaging systems in inertial confinement fusion.

    PubMed

    Ghilea, M C; Sangster, T C; Meyerhofer, D D; Lerche, R A; Disdier, L

    2008-02-01

    Neutron-imaging systems are being considered as an ignition diagnostic for the National Ignition Facility (NIF) [Hogan et al., Nucl. Fusion 41, 567 (2001)]. Given the importance of these systems, a neutron-imaging design tool is being used to quantify the effects of aperture fabrication and alignment tolerances on reconstructed neutron images for inertial confinement fusion. The simulations indicate that alignment tolerances of more than 1 mrad would introduce measurable features in a reconstructed image for both pinholes and penumbral aperture systems. These simulations further show that penumbral apertures are several times less sensitive to fabrication errors than pinhole apertures.

  11. Observing system simulations using synthetic radiances and atmospheric retrievals derived for the AMSU and HIRS in a mesoscale model. [Advanced Microwave Sounding Unit

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Huang, Hung-Lung; Kim, Dongsoo

    1990-01-01

    The paper addresses the concept of synthetic satellite imagery as a visualization and diagnostic tool for understanding future satellite sensors, and details preliminary results on the quality of soundings from current sensors. Preliminary results are presented on the quality of soundings from the combination of the High-Resolution Infrared Radiometer Sounder and the Advanced Microwave Sounding Unit. Results are also presented on the first Observing System Simulation Experiment using these data in a mesoscale numerical prediction model.

  12. Case studies on design, simulation and visualization of control and measurement applications using REX control system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozana, Stepan, E-mail: stepan.ozana@vsb.cz; Pies, Martin, E-mail: martin.pies@vsb.cz; Docekal, Tomas, E-mail: docekalt@email.cz

    REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the functionality of the application before deployment, through implementation on a real-time target, to analysis, diagnostics and visualization. It basically consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and control algorithms are implemented in a very similar way. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems are supported by REX Control System, for example Windows Embedded, Linux and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi and other targets with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions concerning data archiving, visualization based on HTML5, and communication standards. The paper sums up the possibilities of its use in the educational process, focused on control of case-study physical models with classical and advanced control algorithms.

  13. Mechanical properties of growing melanocytic nevi and the progression to melanoma

    NASA Astrophysics Data System (ADS)

    Taloni, Alessandro; Alemi, Alexander; Ciusani, Emilio; Sethna, James P.; Zapperi, Stefano; La Porta, Caterina A. M.; National Research Council Of Italy Team; Lassp, Department Of Physics, Cornell University Team; Istituto Neurologico Carlo Besta Collaboration; Department Of Biosciences, University Of Milano Team

    2015-03-01

    Melanocytic nevi are benign proliferations that sometimes turn into malignant melanoma in a way that is still unclear from the biochemical and genetic point of view. Diagnostic and prognostic tools are then mostly based on dermoscopic examination and morphological analysis of histological tissues. To investigate the role of mechanics and geometry in the morphological dynamics of melanocytic nevi, we present a computational model for cell proliferation in a layered non-linear elastic tissue. Our simulations show that the morphology of the nevus is correlated to the initial location of the proliferating cell starting the growth process and to the mechanical properties of the tissue. We also demonstrate that melanocytes are subject to compressive stresses that fluctuate widely in the nevus and depend on the growth stage. Numerical simulations of cells in the epidermis releasing matrix metalloproteinases display an accelerated invasion of the dermis by destroying the basal membrane. Moreover, we show experimentally that osmotic stress and collagen inhibit growth in primary melanoma cells while the effect is much weaker in metastatic cells.

  14. Augmented Reality-Based Simulators as Discovery Learning Tools: An Empirical Study

    ERIC Educational Resources Information Center

    Ibáñez, María-Blanca; Di-Serio, Ángela; Villarán-Molina, Diego; Delgado-Kloos, Carlos

    2015-01-01

    This paper reports empirical evidence on having students use AR-SaBEr, a simulation tool based on augmented reality (AR), to discover the basic principles of electricity through a series of experiments. AR-SaBEr was enhanced with knowledge-based support and inquiry-based scaffolding mechanisms, which proved useful for discovery learning in…

  15. A comparison of three approaches for simulating fine-scale surface winds in support of wildland fire management. Part II. An exploratory study of the effect of simulated winds on fire growth simulations

    Treesearch

    Jason M. Forthofer; Bret W. Butler; Charles W. McHugh; Mark A. Finney; Larry S. Bradshaw; Richard D. Stratton; Kyle S. Shannon; Natalie S. Wagenbrenner

    2014-01-01

    The effect of fine-resolution wind simulations on fire growth simulations is explored. The wind models are (1) a wind field consisting of constant speed and direction applied everywhere over the area of interest; (2) a tool based on the solution of the conservation of mass only (termed mass-conserving model) and (3) a tool based on a solution of conservation of mass...

  16. DNA barcodes and molecular diagnostics to distinguish an introduced and native Laricobius (Coleoptera: Derodontidae) species in eastern North America

    Treesearch

    G.A. Davis; N.P. Havill; Z.N. Adelman; A. Caccone; L.T. Kok; S.M. Salom

    2011-01-01

    Molecular diagnostics based on DNA barcodes can be powerful identification tools in the absence of distinctive morphological characters for distinguishing between closely related species. A specific example is distinguishing the endemic species Laricobius rubidus from Laricobius nigrinus, a biological control agent of hemlock...

  17. Multiple Monochromatic Imaging (MMI) Status and Plans for LANL Campaigns on Omega and NIF

    NASA Astrophysics Data System (ADS)

    Wysocki, F. J.; Hsu, S. C.; Tregillis, I. L.; Schmitt, M. J.; Kyrala, G. A.; Martinson, D. D.; Murphy, T. J.; Mancini, R. C.; Nagayama, T.

    2011-10-01

    LANL's DIME (Defect Implosion Experiment) campaigns on Omega and NIF are aimed at obtaining improved understanding of defect-induced mix via experiments and simulations of directly driven high-Z-doped plastic capsules with DD or DT gas fill. To this end, the MMI diagnostic has been identified as a key diagnostic for providing space- and time-resolved density, temperature, and mix profiles. The high-Z shell dopants used on Omega are Ti and V; those to be used on NIF are Ge and Se. This poster will discuss the following four areas of MMI-related work at LANL, in collaboration with UNR: (1) data and preliminary analysis of MMI data from FY11 Omega campaigns, (2) development of a capability to generate simulated MMI data from radiation-hydrodynamic simulations of ICF implosions, (3) design of an MMI instrument for NIF that will cover the photon energy range 9.5-16.9 keV, which includes the Ge/Se H-like/He-like α/β lines, and (4) the development of MMI data post-processing and spectroscopic analysis tools. Supported by DOE NNSA.

  18. SOFT: a synthetic synchrotron diagnostic for runaway electrons

    NASA Astrophysics Data System (ADS)

    Hoppe, M.; Embréus, O.; Tinguely, R. A.; Granetz, R. S.; Stahl, A.; Fülöp, T.

    2018-02-01

    Improved understanding of the dynamics of runaway electrons can be obtained by measurement and interpretation of their synchrotron radiation emission. Models for synchrotron radiation emitted by relativistic electrons are well established, but the question of how various geometric effects—such as magnetic field inhomogeneity and camera placement—influence the synchrotron measurements and their interpretation remains open. In this paper we address this issue by simulating synchrotron images and spectra using the new synthetic synchrotron diagnostic tool SOFT (Synchrotron-detecting Orbit Following Toolkit). We identify the key parameters influencing the synchrotron radiation spot and present scans in those parameters. Using a runaway electron distribution function obtained by Fokker-Planck simulations for parameters from an Alcator C-Mod discharge, we demonstrate that the corresponding synchrotron image is well-reproduced by SOFT simulations, and we explain how it can be understood in terms of the parameter scans. Geometric effects are shown to significantly influence the synchrotron spectrum, and we show that inherent inconsistencies in a simple emission model (i.e. not modeling detection) can lead to incorrect interpretation of the images.

  19. Electrochemistry-based Approaches to Low Cost, High Sensitivity, Automated, Multiplexed Protein Immunoassays for Cancer Diagnostics

    PubMed Central

    Dixit, Chandra K.; Kadimisetty, Karteek; Otieno, Brunah A.; Tang, Chi; Malla, Spundana; Krause, Colleen E.; Rusling, James F.

    2015-01-01

    Early detection and reliable diagnostics are keys to effectively design cancer therapies with better prognoses. Simultaneous detection of panels of biomarker proteins holds great promise as a general tool for reliable cancer diagnostics. A major challenge in designing such a panel is to decide upon a coherent group of biomarkers which have higher specificity for a given type of cancer. The second big challenge is to develop test devices to measure these biomarkers quantitatively with high sensitivity and specificity, such that there are no interferences from the complex serum or tissue matrices. Lastly, integrating all these tests into a technology that doesn’t require exclusive training to operate, and can be used at point-of-care (POC) is another potential bottleneck in futuristic cancer diagnostics. In this article, we review electrochemistry-based tools and technologies developed and/or used in our laboratories to construct low-cost microfluidic protein arrays for highly sensitive detection of the panel of cancer-specific biomarkers with high specificity and at the same time have the potential to be translated into a POC. PMID:26525998

  20. Electrochemistry-based approaches to low cost, high sensitivity, automated, multiplexed protein immunoassays for cancer diagnostics.

    PubMed

    Dixit, Chandra K; Kadimisetty, Karteek; Otieno, Brunah A; Tang, Chi; Malla, Spundana; Krause, Colleen E; Rusling, James F

    2016-01-21

    Early detection and reliable diagnostics are keys to effectively design cancer therapies with better prognoses. The simultaneous detection of panels of biomarker proteins holds great promise as a general tool for reliable cancer diagnostics. A major challenge in designing such a panel is to decide upon a coherent group of biomarkers which have higher specificity for a given type of cancer. The second big challenge is to develop test devices to measure these biomarkers quantitatively with high sensitivity and specificity, such that there are no interferences from the complex serum or tissue matrices. Lastly, integrating all these tests into a technology that does not require exclusive training to operate, and can be used at point-of-care (POC) is another potential bottleneck in futuristic cancer diagnostics. In this article, we review electrochemistry-based tools and technologies developed and/or used in our laboratories to construct low-cost microfluidic protein arrays for the highly sensitive detection of a panel of cancer-specific biomarkers with high specificity which at the same time has the potential to be translated into POC applications.

  1. The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis.

    PubMed

    Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J

    2016-08-05

    Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure the professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital- and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1- and 2-week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items were scored as 'not assessed' by clinical educators in more than 25% of cases, which affected the suitability of the APP tool in this format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.
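    Rasch analysis models the probability of an item response as a function of person ability and item difficulty on a common logit scale. The APP uses polytomous ratings, so the dichotomous form below is a deliberate simplification for illustration:

        import math

        # Dichotomous Rasch model: P(success) = exp(theta - b) / (1 + exp(theta - b)),
        # where theta is person ability and b is item difficulty (both in logits).
        # The APP itself uses polytomous ratings; this is a simplified sketch.

        def rasch_p(theta, b):
            return 1.0 / (1.0 + math.exp(-(theta - b)))

        for theta in [-1.0, 0.0, 1.0, 2.0]:       # example student abilities
            row = [round(rasch_p(theta, b), 2) for b in (-0.5, 0.0, 1.5)]
            print(f"ability {theta:+.1f}: item success probabilities {row}")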

  2. Web-Based Architecture to Enable Compute-Intensive CAD Tools and Multi-user Synchronization in Teleradiology

    NASA Astrophysics Data System (ADS)

    Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin

    Teleradiology is the electronic transmission of radiological patient images, such as x-rays, CT, or MR, across multiple locations. The goal could be interpretation, consultation, or medical record keeping. Information technology solutions have enabled electronic records, and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces and computer-assisted diagnostic (CAD) tools are yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications, and case studies are presented here.

  3. Measurement of Electron Density Using the Multipole Resonance Probe, Langmuir Probe and Optical Emission Spectroscopy in Low Pressure Plasmas with Different Electron Energy Distribution Functions

    NASA Astrophysics Data System (ADS)

    Oberberg, Moritz; Bibinov, Nikita; Ries, Stefan; Awakowicz, Peter; Institute of Electrical Engineering; Plasma Technology Team

    2016-09-01

    In a recent publication, the Multipole Resonance Probe (MRP), a novel diagnostic tool for electron density measurements, was introduced. It is based on active plasma resonance spectroscopy (APRS). The probe has been simulated and evaluated for different devices. Its geometrical and electrical symmetry simplifies the APRS model, so that the electron density can be easily calculated from the measured resonance. In this work, low-pressure nitrogen-mixture plasmas with different electron energy distribution functions (EEDF) are investigated. The results of the MRP measurements are compared with measurements from a Langmuir probe (LP) and optical emission spectroscopy (OES). Probes and OES sample different regimes of kinetic electron energy: both probes measure electrons with low kinetic energy (<10 eV), whereas OES is influenced by electrons with high kinetic energy, which are needed for the transitions of the molecular bands. From the absolute intensities of N2(C-B) and N2+(B-X), electron temperature and density can be calculated. In a non-Maxwellian plasma, all of these plasma diagnostics need to be combined.
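    In the idealized cold-plasma picture of APRS, the MRP's dipole resonance sits near ω_pe/√2, so the electron density follows directly from the measured resonance frequency. A sketch under that assumption (both the √2 factor and the example frequency are simplifying assumptions, not values from the abstract):

        import math

        # Electron density from an MRP resonance frequency, assuming the idealized
        # cold-plasma dipole resonance w_res ~ w_pe / sqrt(2).

        e = 1.602176634e-19       # elementary charge, C
        me = 9.1093837015e-31     # electron mass, kg
        eps0 = 8.8541878128e-12   # vacuum permittivity, F/m

        f_res = 1.2e9             # measured resonance frequency, Hz (example value)
        w_pe = math.sqrt(2.0) * 2.0 * math.pi * f_res

        n_e = w_pe**2 * eps0 * me / e**2
        print(f"n_e = {n_e:.2e} m^-3")   # ~3.6e16 m^-3 for this example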

  4. Evaluation of skill level between trainees and community orthopaedic surgeons using a virtual reality arthroscopic knee simulator.

    PubMed

    Cannon, W Dilworth; Nicandri, Gregg T; Reinig, Karl; Mevis, Howard; Wittstein, Jocelyn

    2014-04-02

    Several virtual reality simulators have been developed to assist orthopaedic surgeons in acquiring the skills necessary to perform arthroscopic surgery. The purpose of this study was to assess the construct validity of the ArthroSim virtual reality arthroscopy simulator by evaluating whether skills acquired through increased experience in the operating room lead to improved performance on the simulator. Using the simulator, six postgraduate year-1 orthopaedic residents were compared with six postgraduate year-5 residents and with six community-based orthopaedic surgeons when performing diagnostic arthroscopy. The time to perform the procedure was recorded. To ensure that subjects did not sacrifice the quality of the procedure to complete the task in a shorter time, the simulator was programmed to provide a completeness score that indicated whether the surgeon accurately performed all of the steps of diagnostic arthroscopy in the correct sequence. The mean time to perform the procedure by each group was 610 seconds for community-based orthopaedic surgeons, 745 seconds for postgraduate year-5 residents, and 1028 seconds for postgraduate year-1 residents. Both the postgraduate year-5 residents and the community-based orthopaedic surgeons performed the procedure in significantly less time (p = 0.006) than the postgraduate year-1 residents. There was a trend toward significance (p = 0.055) in time to complete the procedure when the postgraduate year-5 residents were compared with the community-based orthopaedic surgeons. The mean level of completeness as assigned by the simulator for each group was 85% for the community-based orthopaedic surgeons, 79% for the postgraduate year-5 residents, and 71% for the postgraduate year-1 residents. As expected, these differences were not significant, indicating that the three groups had achieved an acceptable level of consistency in their performance of the procedure. Higher levels of surgeon experience resulted in improved efficiency when performing diagnostic knee arthroscopy on the simulator. Further validation studies utilizing the simulator are currently under way and the additional simulated tasks of arthroscopic meniscectomy, meniscal repair, microfracture, and loose body removal are being developed.

  5. Portable Diagnostics Technology Assessment for Space Missions. Part 1; General Technology Capabilities for NASA Exploration Missions

    NASA Technical Reports Server (NTRS)

    Nelson, Emily S.; Chait, Arnon

    2010-01-01

    The changes in the scope of NASA's mission in the coming decade are profound and demand nimble, yet insightful, responses. On-board clinical and environmental diagnostics must be available for both mid-term lunar and long-term Mars exploration missions in an environment marked by scarce resources. Miniaturization has become an obvious focus. Despite solid achievements in lab-based devices, broad-based, robust tools for application in the field are not yet on the market. The confluence of rapid, wide-ranging technology evolution and internal planning needs are the impetus behind this work. This report presents an analytical tool for the ongoing evaluation of promising technology platforms based on mission- and application-specific attributes. It is not meant to assess specific devices, but rather to provide objective guidelines for a rational down-select of general categories of technology platforms. In this study, we have employed our expertise in the microgravity operation of fluidic devices, laboratory diagnostics for space applications, and terrestrial research in biochip development. A rating of the current state of technology development is presented using the present tool. Two mission scenarios are also investigated: a 30-day lunar mission using proven, tested technology in 5 years; and a 2- to 3-year mission to Mars in 10 to 15 years.

  6. [Clinical Application of Non-invasive Diagnostic Tests for Liver Fibrosis].

    PubMed

    Shin, Jung Woo; Park, Neung Hwa

    2016-07-25

    The diagnostic assessment of liver fibrosis is an important step in the management of patients with chronic liver diseases. Liver biopsy is considered the gold standard for assessing necroinflammation and fibrosis. However, recent technical advances have introduced numerous serum biomarkers and elastography-based imaging tools as noninvasive alternatives to biopsy. Serum markers can be direct or indirect markers of the fibrosis process. Elastography-based approaches include transient elastography, acoustic radiation force imaging, supersonic shear wave imaging and magnetic resonance elastography. As accumulating clinical data show that noninvasive tests provide prognostic information of clinical relevance, noninvasive diagnostic tools have been incorporated into clinical guidelines and practice. Here, the authors review noninvasive tests for the diagnosis of liver fibrosis.

  7. Insect E-probe Diagnostic Nucleic acid Analysis (EDNA): the application of a novel bioinformatic tool to detection of vectors and pathogens in individual insect and simulated insect trap metagenomes

    USDA-ARS?s Scientific Manuscript database

    Plant pathogen detection takes many forms. In simple cases, researchers are attempting to detect a known pathogen from a known host utilizing targeted nucleic acid or antigenic assays. However, in more complex scenarios researchers may not know the identity of a pathogen, or they may need to screen ...

  8. Optical diagnostics in the oral cavity: an overview.

    PubMed

    Wilder-Smith, P; Holtzman, J; Epstein, J; Le, A

    2010-11-01

    As the emphasis shifts from damage mitigation to disease prevention or reversal of early disease in the oral cavity, the need for sensitive and accurate detection and diagnostic tools becomes more important. Many novel and emergent optical diagnostic modalities for the oral cavity are becoming available to clinicians, with a variety of desirable attributes including: (i) non-invasiveness, (ii) absence of ionizing radiation, (iii) patient-friendliness, (iv) real-time information, (v) repeatability, and (vi) high-resolution surface and subsurface images. In this article, the principles behind optical diagnostic approaches, their feasibility and applicability for imaging soft and hard tissues, and their potential usefulness as a tool in the diagnosis of oral mucosal lesions, dental pathologies, and other dental applications will be reviewed. The clinical applications of light-based imaging technologies in the oral cavity and of their derivative devices will be discussed to provide the reader with a comprehensive understanding of emergent diagnostic modalities.

  9. A Comparison of Optical, Electrochemical, Magnetic, and Colorimetric Point-of-Care Biosensors for Infectious Disease Diagnosis.

    PubMed

    Pashchenko, Oleksandra; Shelby, Tyler; Banerjee, Tuhina; Santra, Santimukul

    2018-06-18

    Each year, infectious diseases are responsible for millions of deaths, most of which occur in the rural areas of developing countries. Many of the infectious disease diagnostic tools used today require a great deal of time, a laboratory setting, and trained personnel. Due to this, the need for effective point-of-care (POC) diagnostic tools is greatly increasing, with an emphasis on affordability, portability, sensitivity, specificity, timeliness, and ease of use. In this Review, we discuss the various diagnostic modalities that have been utilized toward this end and are being further developed to create POC diagnostic technologies, and we focus on potential effectiveness in resource-limited settings. The main modalities discussed herein are optical-, electrochemical-, magnetic-, and colorimetric-based modalities utilized in diagnostic technologies for infectious diseases. Each of these modalities features pros and cons when considering application in POC settings but, overall, reveals a promising outlook for the future of this field of technological development.

  10. Demise of Polymerase Chain Reaction/Electrospray Ionization-Mass Spectrometry as an Infectious Diseases Diagnostic Tool.

    PubMed

    Özenci, Volkan; Patel, Robin; Ullberg, Måns; Strålin, Kristoffer

    2018-01-18

    Although there are several US Food and Drug Administration (FDA)-approved/cleared molecular microbiology diagnostics for direct analysis of patient samples, all are single-target or panel-based tests. There is no FDA-approved/cleared diagnostic for broad microbial detection. Polymerase chain reaction (PCR)/electrospray ionization-mass spectrometry (PCR/ESI-MS), commercialized as the IRIDICA system (Abbott) and formerly PLEX-ID, had been under development for over a decade and had become CE-marked and commercially available in Europe in 2014. Capable of detecting a large number of microorganisms, it was under review at the FDA when, in April 2017, Abbott discontinued it. This turn of events represents not only the loss of a potential diagnostic tool for infectious diseases but may also be a harbinger of similar situations with other emerging and expensive microbial diagnostics, especially genomic tests. © The Author(s) 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  11. DDS: The Dental Diagnostic Simulation System.

    ERIC Educational Resources Information Center

    Tira, Daniel E.

    The Dental Diagnostic Simulation (DDS) System provides an alternative to simulation systems which represent diagnostic case studies of relatively limited scope. It may be used to generate simulated case studies in all of the dental specialty areas with case materials progressing through the gamut of the diagnostic process. The generation of a…

  12. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  13. Diagnostics aid for mass spectrometer trouble-shooting

    NASA Astrophysics Data System (ADS)

    Filby, E. E.; Rankin, R. A.; Webb, G. W.

    The MS Expert system provides problem diagnostics for instruments used in the Mass Spectrometry Laboratory (MSL). The most critical results generated on these mass spectrometers are the uranium concentration and isotopic content data used for process control and materials accountability at the Idaho Chemical Processing Plant. The two purposes of the system are: (1) to minimize instrument downtime and thereby provide the best possible support to the Plant, and (2) to improve long-term data quality. This system combines the knowledge of several experts on mass spectrometry to provide a diagnostic tool, and can make these skills available on a more timely basis. It integrates code written in the Pascal language with a knowledge base entered into a commercial expert system shell. The user performs some preliminary status checks, and then selects from among several broad diagnostic categories. These initial steps provide input to the rule base. The overall analysis provides the user with a set of possible solutions to the observed problems, graded as to their probabilities. Besides the trouble-shooting benefits expected from this system, it will also provide structured diagnostic training for lab personnel. In addition, development of the system knowledge base has already produced a better understanding of instrument behavior. Two key findings are that a good user interface is necessary for full acceptance of the tool, and a development system should include standard programming capabilities as well as the expert system shell.
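
    The abstract describes rules that map observed instrument symptoms to candidate faults graded by probability. A minimal sketch of that pattern follows; the rules, symptoms, and weights are invented for illustration and are not the MS Expert knowledge base.

    ```python
    # Toy rule base in the spirit of MS Expert: each rule maps a set of
    # observed symptoms to a candidate fault with a heuristic weight.
    RULES = [
        ({"low_ion_current", "normal_filament"}, ("dirty source", 0.7)),
        ({"low_ion_current", "high_filament_drift"}, ("failing filament", 0.8)),
        ({"unstable_peaks"}, ("vacuum leak", 0.6)),
    ]

    def diagnose(observations):
        """Return candidate faults whose conditions are all observed,
        graded by weight (most likely first)."""
        matches = [(fault, w) for cond, (fault, w) in RULES
                   if cond <= observations]
        return sorted(matches, key=lambda m: m[1], reverse=True)

    print(diagnose({"low_ion_current", "high_filament_drift", "unstable_peaks"}))
    ```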

  14. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    PubMed

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  16. Designing a training tool for imaging mental models

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and links among them are visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  17. Diagnostic Tools for Acute Anterior Cruciate Ligament Injury: GNRB, Lachman Test, and Telos.

    PubMed

    Ryu, Seung Min; Na, Ho Dong; Shon, Oog Jin

    2018-06-01

    The purpose of this study is to compare the accuracy of the GNRB arthrometer (Genourob), Lachman test, and Telos device (GmbH) in acute anterior cruciate ligament (ACL) injuries and to evaluate the accuracy of each diagnostic tool according to the length of time from injury to examination. From September 2015 to September 2016, 40 cases of complete ACL rupture were reviewed. We divided the time from injury to examination into three periods of 10 days each and analyzed the diagnostic tools according to the time frame. An analysis of the area under the curve (AUC) of a receiver operating characteristic curve showed that all diagnostic tools were fairly informative. The GNRB showed a higher AUC than the other diagnostic tools. In 10 cases assessed within 10 days after injury, the GNRB showed a statistically significant side-to-side difference in laxity (p<0.001), whereas the Telos test and Lachman test did not show significantly different laxity (p=0.541 and p=0.413, respectively). All diagnostic values of the GNRB were better than those of the other diagnostic tools in acute ACL injuries. The GNRB was more effective in acute ACL injuries examined within 10 days of injury. The GNRB arthrometer can be a useful diagnostic tool for acute ACL injuries.

  18. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  19. Treatment Options to Manage Wound Biofilm

    PubMed Central

    Jones, Curtis E.; Kennedy, John P.

    2012-01-01

    Background Bioburden is an accepted barrier to chronic wound healing. Definitions of its significance, phenotype, clinical classification, and treatment guidelines have historically lacked evidence and been based on paradigms that do not represent the scientific or clinical reality. The Problem Chronic wound bioburden is typically abundant, polymicrobial, and extremely diverse. These microbes naturally adopt biofilm phenotypes, which are quite often viable but not culturable, thereby going undetected. The failures of culture-based detection have led to abandonment of routine bioburden evaluation and aggressive treatment or, worse, to the assumption that bioburden is not a significant barrier. Predictably, treatment regimens to address biofilm phenotypes lagged behind our diagnostic tools and understanding. Basic/Clinical Science Advances Microbial DNA-based diagnostic tools and treatment regimens have emerged, which provide and leverage objective information, resulting in a dramatic impact on outcomes. Relevance to Clinical Care Modern medicine demands decisions based on objective evidence. The diagnostic and treatment protocols reviewed herein empower clinicians to practice modern medicine with regard to bioburden, with DNA-level certainty. Conclusion Bioburden is a significant barrier to healing for all chronic wounds. Molecular diagnostics provide the first objective means of assessing wound bioburden. The accuracy and comprehensive data from such diagnostic methodologies provide clinicians with the ability to employ patient-specific treatment options, targeted to each patient's microbial wound census. Based on current outcomes data, the most effective therapeutic options are topical (TPL) antibiofilm agents (ABF) combined with TPL antibiotics (ABX). In specific patients, systemic ABX and selective biocides are also appropriate, but not exclusive of ABF combined with TPL ABX. PMID:24527291

  20. Diagnostic tools in ocular allergy.

    PubMed

    Leonardi, A; Doan, S; Fauquert, J L; Bozkurt, B; Allegri, P; Marmouz, F; Rondon, C; Jedrzejczak, M; Hellings, P; Delgado, L; Calder, V

    2017-10-01

    Ocular allergy (OA) includes a group of common and less frequent hypersensitivity disorders frequently misdiagnosed and not properly managed. The diagnosis of OA is usually based on clinical history and signs and symptoms, with the support of in vivo and in vitro tests when identification of the specific allergen is required. To date, no specific test is available for the diagnosis of the whole spectrum of the different forms of OA. The lack of recommendations on diagnosis of OA is considered a medical need not only for allergists but also for ophthalmologists. This position paper aims to provide a comprehensive overview of the currently available tools for diagnosing OA to promote a common nomenclature and procedures to be used by different specialists. Questionnaires, sign and symptom grading scales, tests, and potential biomarkers for OA are reviewed. We also identified several unmet needs in the diagnostic tools to generate interest, increase understanding, and inspire further investigations. Tools, recommendations, and algorithms for the diagnosis of OA are proposed for use by both allergists and ophthalmologists. Several unmet needs in the diagnostic tools should be further improved by specific clinical research in OA. © 2017 EAACI and John Wiley and Sons A/S. Published by John Wiley and Sons Ltd.

  1. Development of a tool for calculating early internal doses in the Fukushima Daiichi nuclear power plant accident based on atmospheric dispersion simulation

    NASA Astrophysics Data System (ADS)

    Kurihara, Osamu; Kim, Eunjoo; Kunishima, Naoaki; Tani, Kotaro; Ishikawa, Tetsuo; Furuyama, Kazuo; Hashimoto, Shozo; Akashi, Makoto

    2017-09-01

    A tool was developed to facilitate the calculation of the early internal doses to residents involved in the Fukushima Nuclear Disaster based on atmospheric transport and dispersion model (ATDM) simulations performed using the Worldwide version of the System for Prediction of Environmental Emergency Information, 2nd version (WSPEEDI-II), together with personal behavior data containing the history of the whereabouts of individuals after the accident. The tool generates hourly-averaged air concentration data for the simulation grids nearest to an individual's whereabouts using WSPEEDI-II datasets for the subsequent calculation of internal doses due to inhalation. This paper presents an overview of the developed tool and provides tentative comparisons between direct measurement-based and ATDM-based results regarding the internal doses received by 421 persons for whom personal behavior data were available.
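
    The dose bookkeeping described above reduces to summing, over each hour of a person's whereabouts history, the local air concentration times a breathing rate and an inhalation dose coefficient. A minimal sketch with placeholder numbers follows; the breathing rate and dose coefficient are assumptions for illustration, not values from the paper.

    ```python
    # Hourly air concentrations at the grid cell nearest each whereabouts
    # entry (placeholder values, not WSPEEDI-II output).
    hourly_conc_bq_m3 = [120.0, 300.0, 80.0]   # e.g. I-131, Bq/m^3 per hour
    breathing_rate_m3_h = 0.93                  # adult light activity (assumption)
    dose_coeff_sv_bq = 7.4e-9                   # inhalation coefficient (assumption)

    # Bq/m^3 * m^3/h * Sv/Bq, summed over hourly samples, gives Sv.
    committed_dose_sv = sum(c * breathing_rate_m3_h * dose_coeff_sv_bq
                            for c in hourly_conc_bq_m3)
    print(f"committed dose: {committed_dose_sv * 1e6:.3f} microSv")
    ```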

  2. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    DOT National Transportation Integrated Search

    2012-06-01

    Visual simulation tools have become useful for conveying complex concepts in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering studen...

  3. TRIP-ID: A tool for a smart and interactive identification of Magic Formula tyre model parameters from experimental data acquired on track or test rig

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco

    2018-03-01

    Tyres play a key role in ground vehicles' dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify, with a high degree of accuracy and reliability, MF micro-parameters from experimental data deriving from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data made evident by the tool itself. A motorsport application of the tool is shown as a case study.
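
    For reference, the sine-arctangent form of the Magic Formula and the kind of least-squares identification such a tool automates can be sketched as follows; the synthetic data and starting parameters are illustrative assumptions, not TRIP-ID's procedure.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def magic_formula(x, B, C, D, E):
        """Pacejka MF: y = D*sin(C*arctan(B*x - E*(B*x - arctan(B*x))))."""
        bx = B * x
        return D * np.sin(C * np.arctan(bx - E * (bx - np.arctan(bx))))

    # Synthetic "telemetry": slip values and noisy normalized lateral force.
    slip = np.linspace(-0.3, 0.3, 200)
    rng = np.random.default_rng(0)
    force = magic_formula(slip, 10.0, 1.9, 1.0, 0.97) + rng.normal(0, 0.02, slip.size)

    # Least-squares identification of the four macro-coefficients.
    popt, _ = curve_fit(magic_formula, slip, force, p0=[8.0, 1.5, 1.0, 0.9])
    print("identified B, C, D, E:", popt)
    ```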

  4. How does study quality affect the results of a diagnostic meta-analysis?

    PubMed Central

    Westwood, Marie E; Whiting, Penny F; Kleijnen, Jos

    2005-01-01

    Background The use of systematic literature review to inform evidence-based practice in diagnostics is rapidly expanding. Although the primary diagnostic literature is extensive, studies are often of low methodological quality or poorly reported. There has been no rigorously evaluated, evidence-based tool to assess the methodological quality of diagnostic studies. The primary objective of this study was to determine the extent to which variations in the quality of primary studies impact the results of a diagnostic meta-analysis and whether this differs with diagnostic test type. A secondary objective was to contribute to the evaluation of QUADAS, an evidence-based tool for the assessment of quality in diagnostic accuracy studies. Methods This study was conducted as part of a large systematic review of tests used in the diagnosis and further investigation of urinary tract infection (UTI) in children. All studies included in this review were assessed using QUADAS, an evidence-based tool for the assessment of quality in systematic reviews of diagnostic accuracy studies. The impact of individual components of QUADAS on a summary measure of diagnostic accuracy was investigated using regression analysis. The review divided the diagnosis and further investigation of UTI into the following three clinical stages: diagnosis of UTI, localisation of infection, and further investigation of the UTI. Each stage used different types of diagnostic test, which were considered to involve different quality concerns. Results Many of the studies included in our review were poorly reported. The proportion of QUADAS items fulfilled was similar for studies in different sections of the review. However, as might be expected, the individual items fulfilled differed between the three clinical stages. Regression analysis found that different items showed a strong association with test performance for the different tests evaluated. These differences were observed both within and between the three clinical stages assessed by the review. The results of regression analyses were also affected by whether or not a weighting (by sample size) was applied. Our analysis was severely limited by the completeness of reporting and the differences between the index tests evaluated and the reference standards used to confirm diagnoses in the primary studies. Few tests were evaluated by sufficient studies to allow meaningful use of meta-analytic pooling and investigation of heterogeneity. This meant that further analysis to investigate heterogeneity could only be undertaken using a subset of studies, and that the findings are open to various interpretations. Conclusion Further work is needed to investigate the influence of methodological quality on the results of diagnostic meta-analyses. Large data sets of well-reported primary studies are needed to address this question. Without significant improvements in the completeness of reporting of primary studies, progress in this area will be limited. PMID:15943861

  5. The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).

    PubMed

    Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai

    2010-08-01

    In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval among repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  6. Modeling languages for biochemical network simulation: reaction vs equation based approaches.

    PubMed

    Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya

    2010-01-01

    Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating and analysing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools occurred much earlier. Several general modeling languages like Modelica were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, and hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
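
    The contrast between the two approaches can be made concrete: a reaction-based specification lists stoichiometries and rate laws (as SBML does), from which the balance equations are assembled mechanically, whereas an equation-based tool such as Modelica states the differential equations directly. A minimal sketch of that assembly step, using an invented one-reaction toy network:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Reaction-based spec (SBML style): stoichiometry plus rate law.
    # Toy first-order conversion S -> P with rate 0.5*[S] (assumption).
    reactions = [
        {"stoich": {"S": -1, "P": +1}, "rate": lambda c: 0.5 * c["S"]},
    ]

    def rhs(t, y):
        """Assemble the balance equations d[c]/dt from the reaction list;
        an equation-based tool would state these ODEs directly."""
        c = {"S": y[0], "P": y[1]}
        dy = {"S": 0.0, "P": 0.0}
        for r in reactions:
            v = r["rate"](c)
            for species, n in r["stoich"].items():
                dy[species] += n * v
        return [dy["S"], dy["P"]]

    sol = solve_ivp(rhs, (0, 10), [1.0, 0.0])
    print(sol.y[:, -1])  # concentrations of S and P at t = 10
    ```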

  7. The computerized adaptive diagnostic test for major depressive disorder (CAD-MDD): a screening tool for depression.

    PubMed

    Gibbons, Robert D; Hooker, Giles; Finkelman, Matthew D; Weiss, David J; Pilkonis, Paul A; Frank, Ellen; Moore, Tara; Kupfer, David J

    2013-07-01

    To develop a computerized adaptive diagnostic screening tool for depression that decreases patient and clinician burden and increases sensitivity and specificity for clinician-based DSM-IV diagnosis of major depressive disorder (MDD). 656 individuals with and without minor and major depression were recruited from a psychiatric clinic and a community mental health center and through public announcements (controls without depression). The focus of the study was the development of the Computerized Adaptive Diagnostic Test for Major Depressive Disorder (CAD-MDD) diagnostic screening tool based on a decision-theoretical approach (random forests and decision trees). The item bank consisted of 88 depression scale items drawn from 73 depression measures. Sensitivity and specificity for predicting clinician-based Structured Clinical Interview for DSM-IV Axis I Disorders diagnoses of MDD were the primary outcomes. Diagnostic screening accuracy was then compared to that of the Patient Health Questionnaire-9 (PHQ-9). An average of 4 items per participant was required (maximum of 6 items). Overall sensitivity and specificity were 0.95 and 0.87, respectively. For the PHQ-9, sensitivity was 0.70 and specificity was 0.91. High sensitivity and reasonable specificity for a clinician-based DSM-IV diagnosis of depression can be obtained using an average of 4 adaptively administered self-report items in less than 1 minute. Relative to the currently used PHQ-9, the CAD-MDD dramatically increased sensitivity while maintaining similar specificity. As such, the CAD-MDD will identify more true positives (lower false-negative rate) than the PHQ-9 using half the number of items. Inexpensive (relative to clinical assessment), efficient, and accurate screening of depression in primary care, psychiatric epidemiology, molecular genetics, and global health settings is a direct application of the current system. © Copyright 2013 Physicians Postgraduate Press, Inc.
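
    Mechanically, adaptive administration with a decision tree means asking only the items on the path the respondent's answers select. A minimal sketch with synthetic items and labels (not the CAD-MDD item bank or its fitted model):

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic illustration: a 6-item bank with 0-4 severities and a
    # latent "depression" label (invented data, not the study's).
    rng = np.random.default_rng(1)
    X = rng.integers(0, 5, size=(500, 6))
    y = (X[:, :3].sum(axis=1) + rng.normal(0, 1, 500) > 6).astype(int)

    tree = DecisionTreeClassifier(max_depth=4).fit(X, y)

    # Adaptive administration: walk the fitted tree, asking only the
    # items that lie on the path taken by this respondent's answers.
    node, answers = 0, rng.integers(0, 5, size=6)
    while tree.tree_.children_left[node] != -1:          # -1 marks a leaf
        item = tree.tree_.feature[node]                  # "ask" this item
        node = (tree.tree_.children_left[node]
                if answers[item] <= tree.tree_.threshold[node]
                else tree.tree_.children_right[node])
    counts = tree.tree_.value[node][0]
    print("screen positive" if counts[1] > counts[0] else "screen negative")
    ```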

  8. Frontiers of beam diagnostics in plasma accelerators: Measuring the ultra-fast and ultra-cold

    NASA Astrophysics Data System (ADS)

    Cianchi, A.; Anania, M. P.; Bisesto, F.; Chiadroni, E.; Curcio, A.; Ferrario, M.; Giribono, A.; Marocchino, A.; Pompili, R.; Scifo, J.; Shpakov, V.; Vaccarezza, C.; Villa, F.; Mostacci, A.; Bacci, A.; Rossi, A. R.; Serafini, L.; Zigler, A.

    2018-05-01

    Advanced diagnostics are essential tools in the development of plasma-based accelerators. The accurate measurement of the quality of beams at the exit of the plasma channel is crucial to optimize the parameters of the plasma accelerator. 6D electron beam diagnostics will be reviewed with emphasis on emittance measurement, which is particularly complex due to large energy spread and divergence of the emerging beams, and on femtosecond bunch length measurements.
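
    One of the quantities at issue, the statistical (rms) emittance, is computed from second moments of the measured particle coordinates. A minimal sketch with synthetic particles (the numbers are illustrative, not plasma-accelerator data):

    ```python
    import numpy as np

    def rms_emittance(x, xp):
        """Statistical emittance sqrt(<x^2><x'^2> - <x x'>^2),
        with the means removed, from positions x and divergences x'."""
        x = x - x.mean()
        xp = xp - xp.mean()
        return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

    rng = np.random.default_rng(2)
    x = rng.normal(0, 1e-3, 10_000)                  # positions (m)
    xp = 0.5 * x + rng.normal(0, 0.5e-3, 10_000)     # toy correlated divergence (rad)
    print(f"emittance ~ {rms_emittance(x, xp):.2e} m rad")
    ```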

  9. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the control systems' performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
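
    A typical use of such a tool is tuning a feedback loop against a plant model before connecting it to the SCADA layer. The sketch below shows a discrete PI loop of that kind; the toy oxygen balance and gains are illustrative assumptions, not the DSC implementation.

    ```python
    # Discrete PI control of dissolved oxygen (DO) against a toy plant model.
    kp, ki, dt = 2.0, 0.5, 1.0               # gains and sample time (assumptions)
    setpoint, do, integral = 2.0, 0.5, 0.0   # DO setpoint and state (mg/L)

    for _ in range(60):                      # one simulated hour
        error = setpoint - do
        integral += error * dt
        airflow = max(0.0, kp * error + ki * integral)   # actuator, arbitrary units
        do += dt * (0.3 * airflow - 0.4 * do)            # toy oxygen balance
    print(f"DO after 1 h: {do:.2f} mg/L")
    ```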

  10. Interrelation of Evaluation and Self-Evaluation in the Diagnostic Procedures to Assess Teachers' Readiness for Innovation

    ERIC Educational Resources Information Center

    Tyunnikov, Yurii S.

    2016-01-01

    The paper addresses the relationship between external diagnosis and self-diagnosis of teachers' readiness for innovative activity. It highlights major disadvantages of the measurement tools used in this process. The author demonstrates an alternative approach to harmonizing the diagnosis, based on a modular diagnostic model, general…

  11. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    PubMed

    Kwasa, Judith; Cettomai, Deanna; Lwanya, Edwin; Osiemo, Dennis; Oyaro, Patrick; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire L

    2012-01-01

    To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
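
    The kappa statistic used above to quantify HCW-expert agreement corrects observed agreement for the agreement expected by chance. A minimal sketch for binary ratings; the rating vectors are toy data, not the study's.

    ```python
    def cohens_kappa(a, b):
        """Cohen's kappa for two raters' binary judgments (lists of 0/1):
        (p_observed - p_chance) / (1 - p_chance)."""
        n = len(a)
        po = sum(x == y for x, y in zip(a, b)) / n              # observed agreement
        pa = (sum(a) / n) * (sum(b) / n) \
             + (1 - sum(a) / n) * (1 - sum(b) / n)              # chance agreement
        return (po - pa) / (1 - pa)

    hcw    = [1, 0, 1, 1, 0, 0, 1, 0]   # toy health-care-worker ratings
    expert = [1, 0, 0, 1, 0, 1, 1, 0]   # toy expert ratings
    print(round(cohens_kappa(hcw, expert), 2))  # 0.5 here
    ```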

  12. Synchronous behaviour in network model based on human cortico-cortical connections.

    PubMed

    Protachevicz, Paulo Ricardo; Borges, Rafael Ribaski; Reis, Adriane da Silva; Borges, Fernando da Silva; Iarosz, Kelly Cristina; Caldas, Ibere Luiz; Lameu, Ewandson Luiz; Macau, Elbert Einstein Nehrer; Viana, Ricardo Luiz; Sokolov, Igor M; Ferrari, Fabiano A S; Kurths, Jürgen; Batista, Antonio Marcos

    2018-06-22

    We consider a network topology according to the cortico-cortical connection network of the human brain, where each cortical area is composed of a random network of adaptive exponential integrate-and-fire neurons. Depending on the parameters, this neuron model can exhibit spike or burst patterns. As a diagnostic tool to identify spike and burst patterns we utilise the coefficient of variation of the neuronal inter-spike interval. In our neuronal network, we verify the existence of spike and burst synchronisation in different cortical areas. Our simulations show that the network arrangement, i.e., its rich-club organisation, plays an important role in the transition of the areas from desynchronous to synchronous behaviours. © 2018 Institute of Physics and Engineering in Medicine.
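
    The diagnostic named above reduces to the coefficient of variation of inter-spike intervals, CV = std(ISI)/mean(ISI): near zero for regular (tonic) spiking, high for bursting. A minimal sketch on synthetic spike trains:

    ```python
    import numpy as np

    def isi_cv(spike_times):
        """Coefficient of variation of inter-spike intervals."""
        isi = np.diff(np.sort(spike_times))
        return isi.std() / isi.mean()

    regular = np.arange(0, 1, 0.01)                  # tonic: one spike every 10 ms
    bursty = np.concatenate([b + np.arange(0, 0.01, 0.002)
                             for b in np.arange(0, 1, 0.2)])  # 5-spike bursts
    print(f"regular CV = {isi_cv(regular):.2f}, bursty CV = {isi_cv(bursty):.2f}")
    ```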

  13. Innovative applications of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Schorr, Herbert; Rappaport, Alain

    Papers concerning applications of artificial intelligence are presented, covering applications in aerospace technology, banking and finance, biotechnology, emergency services, law, media planning, music, the military, operations management, personnel management, retail packaging, and manufacturing assembly and design. Specific topics include Space Shuttle telemetry monitoring, an intelligent training system for Space Shuttle flight controllers, an expert system for the diagnostics of manufacturing equipment, a logistics management system, a cooling systems design assistant, and a knowledge-based integrated circuit design critic. Additional topics include a hydraulic circuit design assistant, the use of a connector assembly specification expert system to harness detailed assembly process knowledge, a mixed initiative approach to airlift planning, naval battle management decision aids, an inventory simulation tool, a peptide synthesis expert system, and a system for planning the discharging and loading of container ships.

  14. Transverse beam stability measurement and analysis for the SNS accumulator ring

    DOE PAGES

    Xie, Zaipeng; Deibele, Craig; Schulte, Michael J.; ...

    2015-07-01

    A field-programmable gate array (FPGA) based transverse feedback damper system was implemented in the Spallation Neutron Source (SNS) accumulator ring with the intention of stabilizing the electron-proton (e-p) instability in a frequency range from 1 MHz to 300 MHz. The transverse damper can also be used as a diagnostic tool by measuring the beam transfer function (BTF). An analysis of the BTF measurement provides the stability diagram for the production beam at SNS. Our paper describes the feedback damper system and its set-up as the BTF diagnostic tool. Experimental BTF results are presented, and beam stability analysis is performed based on the BTF measurements for the SNS accumulator ring.
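
    In the simplest view, a transfer function like the BTF is estimated as the ratio of cross- to auto-spectra between the excitation driven into the beam and its measured response. A toy sketch with a stand-in one-pole system (not the SNS damper electronics):

    ```python
    import numpy as np

    # Band-limited noise excitation and a toy first-order "beam" response.
    rng = np.random.default_rng(3)
    x = rng.normal(size=2**14)                  # excitation from the damper
    y = np.zeros_like(x)
    for n in range(1, x.size):
        y[n] = 0.9 * y[n - 1] + 0.1 * x[n]      # stand-in dynamics

    # Single-record transfer function estimate H(f) = Sxy / Sxx.
    X, Y = np.fft.rfft(x), np.fft.rfft(y)
    btf = (Y * np.conj(X)) / (X * np.conj(X))
    print(abs(btf[:5]))                          # low-frequency magnitude
    ```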

  15. Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark; Martin, Rodney; Waterman, Robert; Oostdyk, Rebecca; Ossenfort, John; Matthews, Bryan

    2010-01-01

    Automating prelaunch diagnostics for launch vehicles offers three potential benefits. First, it potentially improves safety by detecting faults that might otherwise have been missed so that they can be corrected before launch. Second, it potentially reduces launch delays by more quickly diagnosing the cause of anomalies that occur during prelaunch processing. Reducing launch delays will be critical to the success of NASA's planned future missions that require in-orbit rendezvous. Third, it potentially reduces costs by reducing both launch delays and the number of people needed to monitor the prelaunch process. NASA is currently developing the Ares I launch vehicle to bring the Orion capsule and its crew of four astronauts to low-earth orbit on their way to the moon. Ares I-X will be the first unmanned test flight of Ares I. It is scheduled to launch on October 27, 2009. The Ares I-X Ground Diagnostic Prototype is a prototype ground diagnostic system that will provide anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage thrust vector control (TVC) and for the associated ground hydraulics while it is in the Vehicle Assembly Building (VAB) at John F. Kennedy Space Center (KSC) and on the launch pad. It will serve as a prototype for a future operational ground diagnostic system for Ares I. The prototype combines three existing diagnostic tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool that is commercially produced by Qualtech Systems, Inc. It uses a qualitative model of failure propagation to perform fault isolation and diagnostics. We adapted an existing TEAMS model of the TVC to use for diagnostics and developed a TEAMS model of the ground hydraulics. The second tool, Spacecraft Health Inference Engine (SHINE), is a rule-based expert system developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification. The prototype uses the outputs of SHINE as inputs to TEAMS. The third tool, the Inductive Monitoring System (IMS), is an anomaly detection tool developed at NASA Ames Research Center and is currently used to monitor the International Space Station Control Moment Gyroscopes. IMS automatically "learns" a model of historical nominal data in the form of a set of clusters and signals an alarm when new data fails to match this model. IMS offers the potential to detect faults that have not been modeled. The three tools have been integrated and deployed to Hangar AE at KSC where they interface with live data from the Ares I-X vehicle and from the ground hydraulics. The outputs of the tools are displayed on a console in Hangar AE, one of the locations from which the Ares I-X launch will be monitored. The full paper will describe how the prototype performed before the launch. It will include an analysis of the prototype's accuracy, including false-positive rates, false-negative rates, and receiver operating characteristics (ROC) curves. It will also include a description of the prototype's computational requirements, including CPU usage, main memory usage, and disk usage. If the prototype detects any faults during the prelaunch period then the paper will include a description of those faults. Similarly, if the prototype has any false alarms then the paper will describe them and will attempt to explain their causes.
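
    Of the three tools, IMS lends itself most readily to a compact illustration: cluster nominal historical data, then alarm when a new sample is far from every cluster centre. The sketch below uses invented sensor channels and an invented threshold rule, not the Ares I-X model.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # "Historical nominal" data: two toy channels, e.g. pressure and flow.
    rng = np.random.default_rng(4)
    nominal = rng.normal([50.0, 1.2], [2.0, 0.05], size=(1000, 2))

    # Learn a cluster model of nominal behaviour.
    km = KMeans(n_clusters=4, n_init=10).fit(nominal)

    # Alarm threshold: 99.5th percentile of nominal distances to the
    # nearest cluster centre (an assumed rule, for illustration).
    threshold = np.percentile(np.min(km.transform(nominal), axis=1), 99.5)

    def is_anomalous(sample):
        return np.min(km.transform([sample]), axis=1)[0] > threshold

    print(is_anomalous([50.5, 1.21]), is_anomalous([65.0, 0.4]))
    ```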

  16. Error tolerance analysis of wave diagnostic based on coherent modulation imaging in high power laser system

    NASA Astrophysics Data System (ADS)

    Pan, Xingchen; Liu, Cheng; Zhu, Jianqiang

    2018-02-01

    Coherent modulation imaging, providing fast convergence and high resolution from a single diffraction pattern, is a promising technique to satisfy the urgent demand for on-line multi-parameter diagnostics with a single setup in high power laser facilities (HPLF). However, the influence of noise on the final calculated parameters of interest has not yet been investigated. Based on a series of simulations with twenty different sampling beams generated from the practical parameters and performance of the HPLF, a quantitative statistical analysis was performed for five different error sources. We found that detector background noise and high quantization error seriously affect the final accuracy, and that different parameters have different sensitivities to different noise sources. The simulation results and the corresponding analysis point to directions for further improving the final accuracy of parameter diagnostics, which is critically important to its formal application in the daily routines of the HPLF.

  17. A Data-Driven Diagnostic Framework for Wind Turbine Structures: A Holistic Approach

    PubMed Central

    Bogoevska, Simona; Spiridonakos, Minas; Chatzi, Eleni; Dumova-Jovanoska, Elena; Höffer, Rudiger

    2017-01-01

    The complex dynamics of operational wind turbine (WT) structures challenges the applicability of existing structural health monitoring (SHM) strategies for condition assessment. At the center of Europe’s renewable energy strategic planning, WT systems call for implementation of strategies that may describe the WT behavior in its complete operational spectrum. The framework proposed in this paper relies on the symbiotic treatment of acting environmental/operational variables and the monitored vibration response of the structure. The approach aims at accurate simulation of the temporal variability characterizing the WT dynamics, and subsequently at the tracking of the evolution of this variability in a longer-term horizon. The bi-component analysis tool is applied to long-term data, collected as part of continuous monitoring campaigns on two actual operating WT structures located in different sites in Germany. The obtained data-driven structural models verify the potential of the proposed strategy for development of an automated SHM diagnostic tool. PMID:28358346

  18. Modeling and simulation of a beam emission spectroscopy diagnostic for the ITER prototype neutral beam injector.

    PubMed

    Barbisan, M; Zaniol, B; Pasqualotto, R

    2014-11-01

    A test facility for the development of the neutral beam injection system for ITER is under construction at Consorzio RFX. It will host two experiments: SPIDER, a 100 keV H(-)/D(-) RF ion source, and MITICA, a prototype of the full-performance ITER injector (1 MV, 17 MW beam). A set of diagnostics will monitor operation and allow optimization of the performance of the two prototypes. In particular, beam emission spectroscopy will measure the uniformity and the divergence of the fast-particle beam exiting the ion source and travelling through the beam line components. This type of measurement is based on the collection of the Hα/Dα emission resulting from the interaction of the energetic particles with the background gas. A numerical model has been developed to simulate the spectrum of the collected emissions in order to design this diagnostic and to study its performance. The paper describes the model at the base of the simulations and presents the modeled Hα spectra for the MITICA experiment.

  19. Development of GEM detector for plasma diagnostics application: simulations addressing optimization of its performance

    NASA Astrophysics Data System (ADS)

    Chernyshova, M.; Malinowski, K.; Kowalska-Strzęciwilk, E.; Czarski, T.; Linczuk, P.; Wojeński, A.; Krawczyk, R. D.

    2017-12-01

    The advanced soft X-ray (SXR) diagnostics setup devoted to studies of SXR plasma emissivity is currently highly relevant and important for ITER/DEMO applications, especially in the energy range of tungsten emission lines, as plasma contamination by W and its transport in the plasma must be understood and monitored for W plasma-facing material. The Gas Electron Multiplier (GEM) based SXR detection system under development by our group, with a spatially and energy-resolved photon-detecting chamber, may become such a diagnostic setup once many physical, technical and technological aspects are considered and solved. This work presents the results of simulations aimed at optimizing the design of the detector's internal chamber and its performance. The study of the effect of electrode alignment allowed choosing the gap distances that maximize electron transmission and the optimal magnitudes of the applied electric fields. Finally, the optimal readout structure design was identified as suitable to collect the total formed charge effectively, based on the range of the simulated electron cloud at the readout plane, which was on the order of ~2 mm.

  20. A probabilistic method to diagnose faults of air handling units

    NASA Astrophysics Data System (ADS)

    Dey, Debashis

    The air handling unit (AHU) is one of the most extensively used pieces of equipment in large commercial buildings. This device is typically customized and lacks quality system integration, which can result in hardwire failures and controller errors. Air handling unit Performance Assessment Rules (APAR) is a fault detection tool that uses a set of expert rules derived from mass and energy balances to detect faults in air handling units. APAR is computationally simple enough that it can be embedded in commercial building automation and control systems, and it relies only upon sensor data and control signals that are commonly available in these systems. Although APAR has many advantages over other methods, for example requiring no training data and being easy to implement commercially, most of the time it is unable to provide a diagnosis of the faults. For instance, a fault in a temperature sensor could be a fixed bias, a drifting bias, an inappropriate location, or complete failure. Likewise, a fault in the mixing box can be a leaking or stuck return or outdoor-air damper. In addition, when multiple rules are satisfied the list of candidate faults grows. There is no proper way to obtain the correct diagnosis from a rule-based fault detection system alone. To overcome this limitation we proposed the Bayesian Belief Network (BBN) as a diagnostic tool. A BBN can simulate the diagnostic thinking of FDD experts in a probabilistic way. In this study we developed a new way to detect and diagnose faults in AHUs by combining APAR rules with a Bayesian Belief Network. The BBN is used as a decision support tool for the rule-based expert system. It is highly capable of prioritizing faults when multiple rules are satisfied simultaneously, and it can incorporate information from previous AHU operating conditions and maintenance records to provide a proper diagnosis. The proposed model is validated with real-time measured data from a campus building at the University of Texas at San Antonio (UTSA). The results show that the BBN is able to correctly prioritize faults, which can be verified by manual investigation.
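
    The core of the proposed diagnosis step is Bayesian: when an APAR rule fires, candidate faults are ranked by posterior probability. A deliberately simplified single-evidence sketch follows; the priors and likelihoods are invented, and a real BBN would also encode false-alarm rates and dependencies between faults.

    ```python
    # Toy Bayesian prioritization of AHU faults given one fired APAR rule:
    # P(fault | rule fired) is proportional to P(rule fired | fault) * P(fault).
    priors = {"stuck_damper": 0.02, "sensor_bias": 0.05, "valve_leak": 0.01}
    likelihood = {  # P(rule fires | fault), invented values
        "stuck_damper": 0.9, "sensor_bias": 0.6, "valve_leak": 0.3,
    }

    posterior = {f: priors[f] * likelihood[f] for f in priors}
    z = sum(posterior.values())                 # normalize over candidates
    for fault, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"{fault}: {p / z:.2f}")
    ```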

  1. Diagnostic test accuracy of nutritional tools used to identify undernutrition in patients with colorectal cancer: a systematic review.

    PubMed

    Håkonsen, Sasja Jul; Pedersen, Preben Ulrich; Bath-Hextall, Fiona; Kirkpatrick, Pamela

    2015-05-15

    Effective nutritional screening, nutritional care planning and nutritional support are essential in all settings, and there is no doubt that a health service seeking to increase safety and clinical effectiveness must take nutritional care seriously. Screening and early detection of malnutrition is crucial in identifying patients at nutritional risk. There is a high prevalence of malnutrition in hospitalized patients undergoing treatment for colorectal cancer. To synthesize the best available evidence regarding the diagnostic test accuracy of nutritional tools (sensitivity and specificity) used to identify malnutrition (specifically undernutrition) in patients with colorectal cancer (such as the Malnutrition Screening Tool and Nutritional Risk Index) compared to reference tests (such as the Subjective Global Assessment or Patient-Generated Subjective Global Assessment). Patients with colorectal cancer requiring either (or all) surgery, chemotherapy and/or radiotherapy in secondary care. Focus of the review: The diagnostic test accuracy of validated assessment tools/instruments (such as the Malnutrition Screening Tool and Nutritional Risk Index) in the diagnosis of malnutrition (specifically undernutrition) in patients with colorectal cancer, relative to reference tests (Subjective Global Assessment or Patient-Generated Subjective Global Assessment). Types of studies: Diagnostic test accuracy studies regardless of study design. Studies published in English, German, Danish, Swedish and Norwegian were considered for inclusion in this review. Databases were searched from their inception to April 2014. Methodological quality was determined using the Quality Assessment of Diagnostic Accuracy Studies checklist. Data were collected using the data extraction form of the Standards for Reporting Studies of Diagnostic Accuracy checklist. The accuracy of diagnostic tests is presented in terms of sensitivity, specificity, and positive and negative predictive values. In addition, the positive likelihood ratio (sensitivity/(1 - specificity)) and negative likelihood ratio ((1 - sensitivity)/specificity) were also calculated and presented in this review to provide information about the likelihood that a given test result would be expected when the target condition is present compared with the likelihood that the same result would be expected when the condition is absent. Not all trials reported true positive, true negative, false positive and false negative rates; therefore these rates were calculated based on the data in the published papers. A two-by-two truth table was reconstructed for each study, and sensitivity, specificity, positive predictive value, negative predictive value, positive likelihood ratio and negative likelihood ratio were calculated for each study. A summary receiver operator characteristics curve was constructed to determine the relationship between sensitivity and specificity, and the area under the summary receiver operator characteristics curve, which measures the usefulness of a test, was calculated. Meta-analysis was not considered appropriate; therefore data were synthesized in a narrative summary. One study evaluated the Malnutrition Screening Tool against the reference standard Patient-Generated Subjective Global Assessment. The sensitivity was 56% and the specificity 84%. The positive likelihood ratio was 3.100, the negative likelihood ratio was 0.59, the diagnostic odds ratio (95% CI) was 5.20 (1.09-24.90), and the area under the curve (AUC) represented only poor to fair diagnostic test accuracy. A total of two studies evaluated the diagnostic accuracy of the Malnutrition Universal Screening Tool (MUST) (index test) compared to both the Subjective Global Assessment (SGA) (reference standard) and the PG-SGA (reference standard) in patients with colorectal cancer. For MUST vs SGA, the sensitivity of the tool was 96%, specificity was 75%, LR+ 3.826, LR- 0.058, diagnostic OR (95% CI) 66.00 (6.61-659.24), and the AUC represented excellent diagnostic accuracy. For MUST vs PG-SGA, the sensitivity of the tool was 72%, specificity 48.9%, LR+ 1.382, LR- 0.579, diagnostic OR (95% CI) 2.39 (0.87-6.58), and the AUC indicated that the tool failed as a diagnostic test to identify patients with colorectal cancer at nutritional risk. The Nutrition Risk Index (NRI) was compared to the SGA, with a sensitivity of 95.2%, specificity of 62.5%, LR+ 2.521, LR- 0.087, diagnostic OR (95% CI) 28.89 (6.93-120.40), and an AUC representing good diagnostic accuracy. For NRI vs PG-SGA, the sensitivity of the tool was 68%, specificity 64%, LR+ 1.947, LR- 0.487, diagnostic OR (95% CI) 4.00 (1.23-13.01), and the AUC indicated poor diagnostic test accuracy. There are no single, specific tools used to screen or assess the nutritional status of colorectal cancer patients. All tools showed varied diagnostic accuracies when compared to the reference standards SGA and PG-SGA. Hence clinical judgment, combined perhaps with the SGA or PG-SGA, should play a major role. The PG-SGA offers several advantages over the SGA tool: 1) the patient completes the medical history component, thereby decreasing the amount of time involved; 2) it contains more nutrition impact symptoms, which are important to the patient with cancer; and 3) it has a scoring system that allows patients to be triaged for nutritional intervention. Therefore, the PG-SGA could be used as a nutrition assessment tool as it allows quick identification and prioritization of colorectal cancer patients with malnutrition in combination with other parameters. This systematic review highlights the need for the following: further studies need to investigate the diagnostic accuracy of existing nutritional screening tools in the context of colorectal cancer patients; if new screening tools are developed, they should be developed and validated in the specific clinical context within the same patient population (colorectal cancer patients). The Joanna Briggs Institute.
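
    All of the measures quoted above derive from a two-by-two truth table. A minimal sketch of the calculations the review describes, with illustrative counts only (not reconstructed from the included studies):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard 2x2 diagnostic accuracy measures."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return {
            "sensitivity": sens,
            "specificity": spec,
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
            "LR+": sens / (1 - spec),
            "LR-": (1 - sens) / spec,
            "diagnostic_OR": (tp * tn) / (fp * fn),
        }

    # Illustrative counts for one hypothetical index test vs reference.
    for name, value in diagnostic_metrics(tp=27, fp=12, fn=2, tn=36).items():
        print(f"{name}: {value:.2f}")
    ```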

  2. Radiation-Induced Chemical Dynamics in Ar Clusters Exposed to Strong X-Ray Pulses.

    PubMed

    Kumagai, Yoshiaki; Jurek, Zoltan; Xu, Weiqing; Fukuzawa, Hironobu; Motomura, Koji; Iablonskyi, Denys; Nagaya, Kiyonobu; Wada, Shin-Ichi; Mondal, Subhendu; Tachibana, Tetsuya; Ito, Yuta; Sakai, Tsukasa; Matsunami, Kenji; Nishiyama, Toshiyuki; Umemoto, Takayuki; Nicolas, Christophe; Miron, Catalin; Togashi, Tadashi; Ogawa, Kanade; Owada, Shigeki; Tono, Kensuke; Yabashi, Makina; Son, Sang-Kil; Ziaja, Beata; Santra, Robin; Ueda, Kiyoshi

    2018-06-01

    We show that electron and ion spectroscopy reveals the details of the oligomer formation in Ar clusters exposed to an x-ray free electron laser (XFEL) pulse, i.e., chemical dynamics triggered by x rays. With guidance from a dedicated molecular dynamics simulation tool, we find that van der Waals bonding, the oligomer formation mechanism, and charge transfer among the cluster constituents significantly affect ionization dynamics induced by an XFEL pulse of moderate fluence. Our results clearly demonstrate that XFEL pulses can be used not only to "damage and destroy" molecular assemblies but also to modify and transform their molecular structure. The accuracy of the predictions obtained makes it possible to apply the cluster spectroscopy, in connection with the respective simulations, for estimation of the XFEL pulse fluence in the fluence regime below single-atom multiple-photon absorption, which is hardly accessible with other diagnostic tools.

  4. SU-F-J-110: MRI-Guided Single-Session Simulation, Online Adaptation, and Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Geurts, M; Mittauer, K

    Purpose: To develop a combined simulation and treatment workflow for MRI-guided radiation therapy using the ViewRay treatment planning and delivery system. Methods: Several features of the ViewRay MRIdian planning and treatment workflows are used to simulate and treat patients who require emergent radiotherapy. A simple “pre-plan” is created on diagnostic imaging retrieved from radiology PACS, where conformal fields are created to target a volume defined by a physician based on review of the diagnostic images and chart notes. After initial consult in radiation oncology, the patient is brought to the treatment room, immobilized, and imaged in treatment position with a volumetric MR. While the patient rests on the table, the pre-plan is applied to the treatment planning MR and dose is calculated in the treatment geometry. After physician review, modification of the plan may include updating the target definition, redefining fields, or re-balancing beam weights. Once an acceptable treatment plan is finalized and approved, the patient is treated. Results: Careful preparation and judicious choices in the online planning process allow conformal treatment plans to be created and delivered in a single, thirty-minute session. Several advantages have been identified using this process as compared to conventional urgent CT simulation and delivery. Efficiency gains are notable, as physicians appreciate the predictable time commitment and patient waiting time for treatment is decreased. MR guidance in a treatment position offers both enhanced contrast for target delineation and reduction of setup uncertainties. The MRIdian system tools designed for adaptive radiotherapy are particularly useful, enabling plan changes to be made in minutes. Finally, the resulting plans, typically 6 conformal beams, are delivered as quickly as more conventional AP/PA beam arrangements with comparatively superior dose distributions. Conclusion: The ViewRay treatment planning software and delivery system can accommodate a fast simulation and treatment workflow.

  5. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    NASA Astrophysics Data System (ADS)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    The applications of quantum information science are moving towards bigger and better heights for next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields of quantum mechanics, and products are already available in the market. The current state of quantum cryptography is still under active research in order to reach the heights of digital cryptography. The complexity of quantum cryptography is high due to the combination of hardware and software. The lack of an effective simulation tool for designing and analyzing quantum cryptography experiments delays progress in the field. In this paper, we propose a framework to achieve an effective non-entanglement-based quantum cryptography simulation tool. We apply a hybrid simulation technique, i.e., discrete event, continuous event and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.

  6. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacPhee, A. G., E-mail: macphee2@llnl.gov; Hatch, B. W.; Bell, P. M.

    2016-11-15

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  7. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited).

    PubMed

    MacPhee, A G; Dymoke-Bradshaw, A K L; Hares, J D; Hassett, J; Hatch, B W; Meadowcroft, A L; Bell, P M; Bradley, D K; Datte, P S; Landen, O L; Palmer, N E; Piston, K W; Rekow, V V; Hilsabeck, T J; Kilkenny, J D

    2016-11-01

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  8. Synthetic reconstruction of recycling on the limiter during startup phase of W7-X based on EMC3-EIRENE simulations

    NASA Astrophysics Data System (ADS)

    Frerichs, Heinke; Effenberg, Florian; Schmitz, Oliver; Stephey, Laurie; W7-X Team

    2016-10-01

    Interpretation of spectroscopic measurements in the edge region of high-temperature plasmas can be a challenge due to line of sight integration effects. The EMC3-EIRENE code - a 3D fluid edge plasma and kinetic neutral gas transport code - is a suitable tool for full 3D reconstruction of such signals. A versatile synthetic diagnostic module has been developed recently which allows the realistic three dimensional setup of various plasma edge diagnostics to be captured. We present an analysis of recycling on the inboard limiter of W7-X during its startup phase in terms of a synthetic camera for Hα light observations and reconstruct the particle flux from these synthetic images based on ionization per photon coefficients (S/XB). We find that line of sight integration effects can lead to misinterpretation of data (redistribution of particle flux due to neutral gas diffusion), and that local plasma effects are important for the correct treatment of photon emissions. This work was supported by the U.S. Department of Energy (DOE) under Grant DE-SC0014210, by startup funds of the Department of Engineering Physics at the University of Wisconsin - Madison, and by the EUROfusion Consortium under Euratom Grant No 633053.

  9. Structure and Computation in Immunoreagent Design: From Diagnostics to Vaccines.

    PubMed

    Gourlay, Louise; Peri, Claudio; Bolognesi, Martino; Colombo, Giorgio

    2017-12-01

    Novel immunological tools for efficient diagnosis and treatment of emerging infections are urgently required. Advances in the diagnostic and vaccine development fields are continuously progressing, with reverse vaccinology and structural vaccinology (SV) methods for antigen identification and structure-based antigen (re)design playing increasingly relevant roles. SV, in particular, is predicted to be the front-runner in the future development of diagnostics and vaccines targeting challenging diseases such as AIDS and cancer. We review state-of-the-art methodologies for structure-based epitope identification and antigen design, with specific applicative examples. We highlight the implications of such methods for the engineering of biomolecules with improved immunological properties, potential diagnostic and/or therapeutic uses, and discuss the perspectives of structure-based rational design for the production of advanced immunoreagents. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components make up SMART-Ada (Systems fault Management with ART-Ada): the rule-based expert system, a graphics user interface, and communications software. The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  11. Computer-Simulated Arthroscopic Knee Surgery: Effects of Distraction on Resident Performance.

    PubMed

    Cowan, James B; Seeley, Mark A; Irwin, Todd A; Caird, Michelle S

    2016-01-01

    Orthopedic surgeons cite "full focus" and "distraction control" as important factors for achieving excellent outcomes. Surgical simulation is a safe and cost-effective way for residents to practice surgical skills, and it is a suitable tool to study the effects of distraction on resident surgical performance. This study investigated the effects of distraction on arthroscopic knee simulator performance among residents at various levels of experience. The authors hypothesized that environmental distractions would negatively affect performance. Twenty-five orthopedic surgery residents performed a diagnostic knee arthroscopy computer simulation according to a checklist of structures to identify and tasks to complete. Participants were evaluated on arthroscopy time, number of chondral injuries, instances of looking down at their hands, and completion of checklist items. Residents repeated this task at least 2 weeks later while simultaneously answering distracting questions. During distracted simulation, the residents had significantly fewer completed checklist items (P<.02) compared with the initial simulation. Senior residents completed the initial simulation in less time (P<.001), with fewer chondral injuries (P<.005) and fewer instances of looking down at their hands (P<.012), compared with junior residents. Senior residents also completed 97% of the diagnostic checklist, whereas junior residents completed 89% (P<.019). During distracted simulation, senior residents continued to complete tasks more quickly (P<.006) and with fewer instances of looking down at their hands (P<.042). Residents at all levels appear to be susceptible to the detrimental effects of distraction when performing arthroscopic simulation. Addressing even straightforward questions intraoperatively may affect surgeon performance. Copyright 2016, SLACK Incorporated.

  12. A Model-Based Expert System for Space Power Distribution Diagnostics

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-01-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.

  13. Simulation-based learning: Just like the real thing

    PubMed Central

    Lateef, Fatimah

    2010-01-01

    Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often “immersive” in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals’ knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice until one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors. PMID:21063557

  14. Simulation-based learning: Just like the real thing.

    PubMed

    Lateef, Fatimah

    2010-10-01

    Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals' knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice until one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors.

  15. Effect of station examination item sampling on generalizability of student performance.

    PubMed

    Stratford, P W; Thomson, M A; Sanford, J; Saarinen, H; Dilworth, P; Solomon, P; Nixon, P; Fraser-MacDougall, V; Pierce-Fenn, H

    1990-01-01

    This article may be of interest to physical therapy educators who are responsible for structuring station or practical examinations used to evaluate physical therapy students. The global intent of the article is to provide information that may be useful in selecting test items. Specifically, the purposes of this study were 1) to examine how two item-sampling strategies (one based on different diagnostic concepts, or diagnostic probes, and the other based on different anatomical sites) influenced the generalizability of a station examination, 2) to determine the interrater reliability during the station examination, and 3) to determine whether the status of the rater (that of observer or simulated patient) influenced the rating. Using a nested study design, 24 physical therapy students were assessed by eight raters. The raters were randomly and equally assigned to four teams. Each team assessed six students. One rater acted as the simulated patient for the first three students in each group, and the other rater acted as observer. This order was reversed for the last three students. Each student performed nine mini-diagnostic patient cases consisting of three diagnostic probes reproduced at three different anatomical sites. The results demonstrate that 1) similar diagnostic concepts can be generalized across anatomical sites, although different concepts or skills cannot be generalized at a given anatomical site or across sites; 2) interrater reliability was excellent; and 3) the status of the raters (ie, simulated patient or observer) did not bias the ratings.(ABSTRACT TRUNCATED AT 250 WORDS)

  16. Radiometry simulation within the end-to-end simulation tool SENSOR

    NASA Astrophysics Data System (ADS)

    Wiest, Lorenz; Boerner, Anko

    2001-02-01

    An end-to-end simulation is a valuable tool for sensor system design, development, optimization, testing, and calibration. This contribution describes the radiometry module of the end-to-end simulation tool SENSOR. It features MODTRAN 4.0-based look-up tables in conjunction with a cache-based multilinear interpolation algorithm to speed up radiometry calculations. It employs a linear reflectance parameterization to reduce look-up table size, considers effects due to the topology of a digital elevation model (surface slope, sky view factor) and uses a reflectance class feature map to assign Lambertian and BRDF reflectance properties to the digital elevation model. The overall consistency of the radiometry part is demonstrated by good agreement between ATCOR 4-retrieved reflectance spectra of a simulated digital image cube and the original reflectance spectra used to simulate this image data cube.
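
    The abstract does not give implementation details for the interpolation step, so the following is only a generic sketch of the idea, assuming a regular grid: a look-up table queried through cache-backed bilinear (two-dimensional multilinear) interpolation. It is not SENSOR's actual code.

      import numpy as np

      # Illustrative LUT with a per-cell cache of corner values; the cached
      # 2x2 block is reused whenever a query falls into the same grid cell.
      class CachedLUT:
          def __init__(self, xs, ys, table):
              self.xs, self.ys, self.table = xs, ys, table
              self.cache = {}                          # (i, j) -> 2x2 corner block

          def __call__(self, x, y):
              i = int(np.clip(np.searchsorted(self.xs, x) - 1, 0, len(self.xs) - 2))
              j = int(np.clip(np.searchsorted(self.ys, y) - 1, 0, len(self.ys) - 2))
              c = self.cache.setdefault((i, j), self.table[i:i + 2, j:j + 2])
              tx = (x - self.xs[i]) / (self.xs[i + 1] - self.xs[i])
              ty = (y - self.ys[j]) / (self.ys[j + 1] - self.ys[j])
              lo = c[0, 0] * (1 - tx) + c[1, 0] * tx   # interpolate along x
              hi = c[0, 1] * (1 - tx) + c[1, 1] * tx
              return lo * (1 - ty) + hi * ty           # then along y

      xs = ys = np.linspace(0.0, 1.0, 11)
      lut = CachedLUT(xs, ys, np.add.outer(xs, ys))    # f(x, y) = x + y stand-in
      print(lut(0.25, 0.33))                           # ~0.58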

  17. Modelling the transport of optical photons in scintillation detectors for diagnostic and radiotherapy imaging

    NASA Astrophysics Data System (ADS)

    Roncali, Emilie; Mosleh-Shirazi, Mohammad Amin; Badano, Aldo

    2017-10-01

    Computational modelling of radiation transport can enhance the understanding of the relative importance of individual processes involved in imaging systems. Modelling is a powerful tool for improving detector designs in ways that are impractical or impossible to achieve through experimental measurements. Modelling of light transport in scintillation detectors used in radiology and radiotherapy imaging that rely on the detection of visible light plays an increasingly important role in detector design. Historically, researchers have invested heavily in modelling the transport of ionizing radiation while light transport is often ignored or coarsely modelled. Due to the complexity of existing light transport simulation tools and the breadth of custom codes developed by users, light transport studies are seldom fully exploited and have not reached their full potential. This topical review aims at providing an overview of the methods employed in freely available and other described optical Monte Carlo packages and analytical models and discussing their respective advantages and limitations. In particular, applications of optical transport modelling in nuclear medicine, diagnostic and radiotherapy imaging are described. A discussion on the evolution of these modelling tools into future developments and applications is presented. The authors declare equal leadership and contribution regarding this review.

  18. Stress Inoculation through Cognitive and Biofeedback Training

    DTIC Science & Technology

    2010-12-01

    based on Heart Rate Variability (HRV) with innovative simulation game-based training tools. The training system described here will be implemented on a mobile device...and studies (e.g. Fletcher & Tobias, 2006; Thayer, 2009). HRV Coherence Training for Stress Resilience: satisfactory performance in stressful...

  19. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Treesearch

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  20. Nanotechnology-Based Surface Plasmon Resonance Affinity Biosensors for In Vitro Diagnostics

    PubMed Central

    Antiochia, Riccarda; Bollella, Paolo; Favero, Gabriele

    2016-01-01

    In the last decades, in vitro diagnostic devices (IVDDs) became a very important tool in medicine for an early and correct diagnosis, a proper screening of targeted population, and also assessing the efficiency of a specific therapy. In this review, the most recent developments regarding different configurations of surface plasmon resonance affinity biosensors modified by using several nanostructured materials for in vitro diagnostics are critically discussed. Both assembly and performances of the IVDDs tested in biological samples are reported and compared. PMID:27594884

  1. Problem Representation, Background Evidence, Analysis, Recommendation: An Oral Case Presentation Tool to Promote Diagnostic Reasoning.

    PubMed

    Carter, Cristina; Akar-Ghibril, Nicole; Sestokas, Jeff; Dixon, Gabrina; Bradford, Wilhelmina; Ottolini, Mary

    2018-03-01

    Oral case presentations provide an opportunity for trainees to communicate diagnostic reasoning at the bedside. However, few tools exist to enable faculty to provide effective feedback. We developed a tool to assess diagnostic reasoning and communication during oral case presentations. Published by Elsevier Inc.

  2. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  3. [Complexity level simulation in the German diagnosis-related groups system: the financial effect of coding of comorbidity diagnostics in urology].

    PubMed

    Wenke, A; Gaber, A; Hertle, L; Roeder, N; Pühse, G

    2012-07-01

    Precise and complete coding of diagnoses and procedures is of value for optimizing revenues within the German diagnosis-related groups (G-DRG) system. The implementation of effective structures for coding is cost-intensive. The aim of this study was to determine whether the higher costs can be refunded by complete acquisition of comorbidities and complications. Calculations were based on DRG data of the Department of Urology, University Hospital of Münster, Germany, covering all patients treated in 2009. The data were regrouped and subjected to a process of simulation (increase and decrease of patient clinical complexity levels, PCCL) with the help of recently developed software. In urology, a strong dependency of the PCCL, and hence of profits, on the quantity and quality of coding of secondary diagnoses was found. Departmental budgetary procedures can be optimized when coding is effective. The new simulation tool can be a valuable aid to improve profits available for distribution. Nevertheless, the calculation of time use and financial needs by this procedure is subject to specific departmental terms and conditions. Completeness of coding of (secondary) diagnoses must be the ultimate administrative goal of patient case documentation in urology.

  4. A survey of simulators for palpation training.

    PubMed

    Zhang, Yan; Phillips, Roger; Ward, James; Pisharody, Sandhya

    2009-01-01

    Palpation is a widely used diagnostic method in medical practice. The sensitivity of palpation is highly dependent upon the skill of clinicians, which is often difficult to master. There is a need for simulators in palpation training. This paper summarizes important work and the latest achievements in simulation for palpation training. Three types of simulators are surveyed: physical models, Virtual Reality (VR) based simulations, and hybrid (computerized and physical) simulators. Comparisons among the different kinds of simulators are presented.

  5. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods of manual, semi-automatic and automatic tools and equipment are discussed. Finally, an application in the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.

  6. Preoperative planning of thoracic surgery with use of three-dimensional reconstruction, rapid prototyping, simulation and virtual navigation

    PubMed Central

    Heuts, Samuel; Maessen, Jos G.

    2016-01-01

    For the past decades, surgeries have become more complex, due to the increasing age of the patient population referred for thoracic surgery, more complex pathology and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Pre-operative marking and staining of lesions are already a widely accepted method of preoperative planning in thoracic surgery. However, three-dimensional (3D) image reconstructions, virtual simulation and rapid prototyping (RP) are still in the development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims at graphically presenting and summarizing these new diagnostic and therapeutic tools. PMID:29078505

  7. [Virtual reality simulation training in gynecology: review and perspectives].

    PubMed

    Ricard-Gauthier, Dominique; Popescu, Silvia; Benmohamed, Naida; Petignat, Patrick; Dubuisson, Jean

    2016-10-26

    Laparoscopic simulation has rapidly become an important tool for learning and acquiring technical skills in surgery. It is based on two complementary pedagogic tools: the box model trainer and the virtual reality simulator. The virtual reality simulator has shown its efficiency by improving surgical skills, decreasing operating time, improving economy of movements and improving self-confidence. The main advantage of this tool is the opportunity to easily organize a regular, structured and uniform training program enabling automated, individualized feedback.

  8. FDTD simulation tools for UWB antenna analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
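
    As a generic illustration of the FDTD method only (the report derives spherical-coordinate equations for the conical antenna, which are not reproduced here), a minimal one-dimensional Cartesian update loop with a soft Gaussian source might look like this:

      import numpy as np

      # Normalized 1D Yee update (Courant number 0.5); purely illustrative.
      nz, nt, src = 200, 400, 100
      ez, hy = np.zeros(nz), np.zeros(nz)
      for t in range(nt):
          hy[:-1] += 0.5 * (ez[1:] - ez[:-1])           # magnetic field update
          ez[1:] += 0.5 * (hy[1:] - hy[:-1])            # electric field update
          ez[src] += np.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian pulse source
      print("peak |Ez|:", np.abs(ez).max())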

  9. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

  10. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited)

    DOE PAGES

    MacPhee, A. G.; Dymoke-Bradshaw, A. K. L.; Hares, J. D.; ...

    2016-08-08

    Here, we report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute most to space charge blurring. Our simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  11. Low-order nonlinear dynamic model of IC engine-variable pitch propeller system for general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Richard, Jacques C.

    1995-01-01

    This paper presents a dynamic model of an internal combustion engine coupled to a variable pitch propeller. The low-order, nonlinear time-dependent model is useful for simulating the propulsion system of general aviation single-engine light aircraft. This model is suitable for investigating engine diagnostics and monitoring and for control design and development. Furthermore, the model may be extended to provide a tool for the study of engine emissions, fuel economy, component effects, alternative fuels, alternative engine cycles, flight simulators, sensors, and actuators. Results show that the model provides a reasonable representation of the propulsion system dynamics from zero to 10 Hertz.

  12. Battery Storage Evaluation Tool, version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-02

    The battery storage evaluation tool developed at Pacific Northwest National Laboratory is used to run a one-year simulation to evaluate the benefits of battery storage for multiple grid applications, including energy arbitrage, balancing service, capacity value, distribution system equipment deferral, and outage mitigation. The tool is based on optimal control strategies that capture multiple services from a single energy storage device. In this control strategy, at each hour a lookahead optimization is first formulated and solved to determine the battery base operating point. The minute-by-minute simulation is then performed to simulate the actual battery operation.
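
    The control strategy described above can be sketched as a rolling-horizon loop: an hourly look-ahead sets the battery base operating point, then a minute-by-minute simulation tracks the state of charge. Everything below (the greedy price rule standing in for the real look-ahead optimization, parameter values, prices) is a toy assumption, not the PNNL tool:

      def base_point(prices_ahead, soc, cap_kwh, p_max_kw):
          # Greedy stand-in for the hourly look-ahead optimization: charge when
          # the current price is below the look-ahead average, else discharge.
          avg = sum(prices_ahead) / len(prices_ahead)
          if prices_ahead[0] < avg and soc < cap_kwh:
              return p_max_kw                          # charge
          if prices_ahead[0] > avg and soc > 0.0:
              return -p_max_kw                         # discharge
          return 0.0

      def simulate(prices, cap_kwh=100.0, p_max_kw=25.0, horizon=4):
          soc, revenue = 0.5 * cap_kwh, 0.0
          for h in range(len(prices) - horizon):
              p = base_point(prices[h:h + horizon], soc, cap_kwh, p_max_kw)
              for _ in range(60):                      # minute-by-minute step
                  e = p / 60.0                         # kWh moved this minute
                  e = min(e, cap_kwh - soc) if e > 0 else max(e, -soc)
                  soc += e
                  revenue -= e * prices[h]             # pay to charge, earn to discharge
          return revenue

      toy_prices = [20, 18, 15, 22, 35, 40, 38, 30, 25, 22, 20, 19]  # per kWh
      print(f"toy arbitrage revenue: {simulate(toy_prices):.1f}")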

  13. A study on directional resistivity logging-while-drilling based on self-adaptive hp-FEM

    NASA Astrophysics Data System (ADS)

    Liu, Dejun; Li, Hui; Zhang, Yingying; Zhu, Gengxue; Ai, Qinghui

    2014-12-01

    Numerical simulation of resistivity logging-while-drilling (LWD) tool response provides guidance for designing novel logging instruments and interpreting real-time logging data. In this paper, based on a self-adaptive hp-finite element method (hp-FEM) algorithm, we analyze LWD tool response against model parameters and briefly illustrate the geosteering capabilities of directional resistivity LWD. Numerical simulation results indicate that the source spacing has an obvious influence on the investigation depth and detection precision of the resistivity LWD tool, and that changing the frequency can improve the resolution of low-resistivity and high-resistivity formations. The simulation results also indicate that the self-adaptive hp-FEM algorithm has good convergence speed and calculation accuracy to guide geosteering drilling, and that it is suitable for simulating the response of resistivity LWD tools.

  14. Simulation Tools for Power Electronics Courses Based on Java Technologies

    ERIC Educational Resources Information Center

    Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.

    2010-01-01

    This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…

  15. Toward a 3D dynamic model of a faulty duplex ball bearing

    NASA Astrophysics Data System (ADS)

    Kogan, Gideon; Klein, Renata; Kushnirsky, Alex; Bortman, Jacob

    2015-03-01

    Bearings are vital components for safe and proper operation of machinery. Increasing the efficiency of bearing diagnostics usually requires training of health and usage monitoring systems via expensive and time-consuming ground calibration tests. The main goal of this research, therefore, is to improve bearing dynamics modeling tools in order to reduce the time and budget needed to implement the health and usage monitoring approach. The proposed three-dimensional ball bearing dynamic model is based on the classic dynamic and kinematic equations. Interactions between the bodies are simulated using non-linear springs combined with dampers described by a Hertz-type contact relation. The friction force is simulated using the hyperbolic-tangent function. The model allows simulation of a wide range of mechanical faults. It is validated by comparison to known bearing behavior and to experimental results. The model results are verified by demonstrating numerical convergence. The model results for the two cases of single and duplex angular ball bearings with axial deformation in the outer ring are presented. The qualitative investigation provides insight into bearing dynamics, the sensitivity study generalizes the qualitative findings for similar cases, and the comparison to the test results validates model reliability. The article demonstrates the variety of cases that the 3D bearing model can simulate and the findings to which it may lead. The research allowed the identification of new patterns generated by single and duplex bearings with an axially deformed outer race. It also highlighted the difference between single and duplex bearing manifestations. In the current research the dynamic model enabled better understanding of the physical behavior of the faulted bearings. Therefore, it is expected that the modeling approach has the potential to simplify and improve the development process of diagnostic algorithms. • A deformed outer race of a single axially loaded bearing is simulated. • The model results are subjected to a sensitivity study. • A duplex bearing with a deformed outer race is simulated as well as tested. • The simulation results are in good agreement with the experimental results.
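
    The contact and friction laws named in this abstract have standard textbook forms; a minimal sketch with placeholder constants (the stiffness, damping and smoothing parameters below are not the paper's values):

      import math

      def hertz_contact_force(delta, delta_dot, k=1.0e9, c=1.0e3):
          # Nonlinear spring plus damper, Hertz-type contact for a ball:
          # F = k*delta^1.5 + c*delta_dot, active only while the bodies
          # interpenetrate (delta > 0).
          if delta <= 0.0:
              return 0.0
          return k * delta ** 1.5 + c * delta_dot

      def tanh_friction(normal_force, slip_velocity, mu=0.1, v_eps=1.0e-3):
          # Coulomb friction smoothed with tanh to avoid the discontinuity
          # at zero slip velocity.
          return -mu * normal_force * math.tanh(slip_velocity / v_eps)

      fn = hertz_contact_force(1.0e-6, 0.0)            # 1 micron interpenetration
      print(fn, tanh_friction(fn, 0.01))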

  16. Target prioritization and strategy selection for active case-finding of pulmonary tuberculosis: a tool to support country-level project planning.

    PubMed

    Nishikiori, Nobuyuki; Van Weezenbeek, Catharina

    2013-02-02

    Despite the progress made in the past decade, tuberculosis (TB) control still faces significant challenges. In many countries with declining TB incidence, the disease tends to concentrate in vulnerable populations that often have limited access to health care. In light of the limitations of the current case-finding approach and the global urgency to improve case detection, active case-finding (ACF) has been suggested as an important complementary strategy to accelerate tuberculosis control, especially among high-risk populations. The present exercise aims to develop a model that can be used for country-level project planning. A simple deterministic model was developed to calculate the number of estimated TB cases diagnosed and the associated costs of diagnosis. The model was designed to compare cost-effectiveness parameters, such as the cost per case detected, for different diagnostic algorithms when they are applied to different risk populations. The model was transformed into a web-based tool that can support national TB programmes and civil society partners in designing ACF activities. According to the model output, tuberculosis active case-finding can be a costly endeavor, depending on the target population and the diagnostic strategy. The analysis suggests the following: (1) Active case-finding activities are cost-effective only if the tuberculosis prevalence among the target population is high. (2) Extensive diagnostic methods (e.g. X-ray screening for the entire group, use of sputum culture or molecular diagnostics) can be applied only to very high-risk groups such as TB contacts, prisoners or people living with human immunodeficiency virus (HIV) infection. (3) Basic diagnostic approaches such as TB symptom screening are always applicable although the diagnostic yield is very limited. The cost-effectiveness parameter was sensitive to local diagnostic costs and the tuberculosis prevalence of target populations. The prioritization of appropriate target populations and careful selection of cost-effective diagnostic strategies are critical prerequisites for rational active case-finding activities. A decision to conduct such activities should be based on a setting-specific cost-effectiveness analysis and programmatic assessment. A web-based tool was developed and is available to support national tuberculosis programmes and partners in the formulation of cost-effective active case-finding activities at the national and subnational levels.
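
    A deterministic yield-and-cost calculation of the kind described above can be sketched in a few lines; all parameter names and values here are illustrative assumptions, not the web tool's actual inputs. The toy numbers reproduce the abstract's point that the cost per case detected falls sharply as prevalence in the target group rises:

      def acf_yield_and_cost(population, prevalence, sensitivity,
                             cost_per_screen, cost_per_confirmation,
                             screen_positive_rate):
          detected = population * prevalence * sensitivity
          cost = (population * cost_per_screen
                  + population * screen_positive_rate * cost_per_confirmation)
          return detected, (cost / detected if detected else float("inf"))

      # High-prevalence contacts vs the general population, same algorithm:
      for name, prev in [("contacts", 0.02), ("general", 0.001)]:
          n, cpc = acf_yield_and_cost(10_000, prev, 0.8, 2.0, 15.0, 0.05)
          print(f"{name}: {n:.0f} cases detected, {cpc:.0f} per case")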

  17. Immersed smoothed finite element method for fluid-structure interaction simulation of aortic valves

    NASA Astrophysics Data System (ADS)

    Yao, Jianyao; Liu, G. R.; Narmoneva, Daria A.; Hinton, Robert B.; Zhang, Zhi-Qian

    2012-12-01

    This paper presents a novel numerical method for simulating the fluid-structure interaction (FSI) problems that arise when blood flows over aortic valves. The method uses the immersed boundary/element method and the smoothed finite element method, and hence it is termed IS-FEM. The IS-FEM is a partitioned approach and does not need a body-fitted mesh for FSI simulations. It consists of three main modules: the fluid solver, the solid solver and the FSI force solver. In this work, the blood is modeled as incompressible viscous flow and solved using the characteristic-based-split scheme with FEM for spatial discretization. The leaflets of the aortic valve are modeled as Mooney-Rivlin hyperelastic materials and solved using the smoothed finite element method (or S-FEM). The FSI force is calculated on the Lagrangian fictitious fluid mesh that is identical to the moving solid mesh. The octree search and neighbor-to-neighbor schemes are used to efficiently detect the FSI pairs of fluid and solid cells. As an example, a 3D idealized model of the aortic valve is modeled, and the opening process of the valve is simulated using the proposed IS-FEM. Numerical results indicate that the IS-FEM can serve as an efficient tool in the study of aortic valve dynamics to reveal the details of stresses in the aortic valves, the flow velocities in the blood, and the shear forces on the interfaces. This tool can also be applied to animal models studying disease processes and may ultimately translate to new adaptive methods working with magnetic resonance images, leading to improvements in diagnostic and prognostic paradigms, as well as surgical planning, in the care of patients.

  18. Residual gas analyzer mass spectrometry for human breath analysis: a new tool for the non-invasive diagnosis of Helicobacter pylori infection.

    PubMed

    Maity, Abhijit; Banik, Gourab D; Ghosh, Chiranjit; Som, Suman; Chaudhuri, Sujit; Daschakraborty, Sunil B; Ghosh, Shibendu; Ghosh, Barnali; Raychaudhuri, Arup K; Pradhan, Manik

    2014-03-01

    A residual gas analyzer (RGA) coupled with a high vacuum chamber is described for the non-invasive diagnosis of the Helicobacter pylori (H. pylori) infection through ¹³C-urea breath analysis. The present RGA-based mass spectrometry (MS) method is capable of measuring high-precision ¹³CO₂ isotope enrichments in exhaled breath samples from individuals harboring the H. pylori infection. The system exhibited 100% diagnostic sensitivity, and 93% specificity alongside positive and negative predictive values of 95% and 100%, respectively, compared with invasive endoscopy-based biopsy tests. A statistically sound diagnostic cut-off value for the presence of H. pylori was determined to be 3.0‰ using a receiver operating characteristic curve analysis. The diagnostic accuracy and validity of the results are also supported by optical off-axis integrated cavity output spectroscopy measurements. The δ¹³(DOB)C‰ values of both methods correlated well (R² = 0.9973 at 30 min). The RGA-based instrumental setup described here is simple, robust, easy-to-use and more portable and cost-effective compared to all other currently available detection methods, thus making it a new point-of-care medical diagnostic tool for the purpose of large-scale screening of the H. pylori infection in real time. The RGA-MS technique should have broad applicability for ¹³C-breath tests in a wide range of biomedical research and clinical diagnostics for many other diseases and metabolic disorders.

  19. Role and challenges of simulation in undergraduate curriculum.

    PubMed

    Nuzhat, Ayesha; Salem, Raneem Osama; Al Shehri, Fatimah Nasser; Al Hamdan, Nasser

    2014-04-01

    Medical simulation is a relatively novel technology widely utilized for teaching and assessing students' clinical skills. Students and faculty face many challenges when simulation sessions are introduced into an undergraduate curriculum. The aim of this study is to obtain the opinion of undergraduate medical students and our faculty regarding the role of simulation in the undergraduate curriculum, the simulation modalities used, and the perceived barriers in implementing simulation sessions. A self-administered, pilot-tested questionnaire with 18 items using a 5-point Likert scale was distributed to undergraduate male (n = 125) and female (n = 70) students as well as to the faculty members (n = 14) at King Fahad Medical City, King Saud Bin Abdul Aziz University of Health Sciences, Saudi Arabia. Survey elements addressed the role of simulation, the simulation modalities used, and perceived challenges to implementation of simulation sessions. Various learning outcomes are achieved and improved through the technology-enhanced simulation sessions, such as communication skills, diagnostic skills, procedural skills, self-confidence, and integration of basic and clinical sciences. The use of high-fidelity simulators, simulated patients and task trainers was more desirable by our students and faculty for teaching and learning as well as an evaluation tool. According to most of the students, institutional support in terms of resources, staff and duration of sessions was adequate. However, motivation to participate in the sessions and provision of adequate feedback by the staff was a constraint. The use of the simulation laboratory is of great benefit to the students and a great teaching tool for the staff to ensure students learn various skills.

  20. [Diagnostic difficulties in Graves' orbitopathy--case report].

    PubMed

    Jedrzejowski, Maciej; Grzesiuk, Wiesław; Szwejda, Elzbieta; Bar-Andziak, Ewa

    2004-03-01

    Graves' orbitopathy is caused by an intraorbital inflammatory reaction due to autoimmune thyroid disease. In most cases the diagnosis is based on the coexistence of typical eye signs and hyperthyroidism symptoms. In the presented case, the absence of thyroid dysfunction necessitated a differential diagnosis. Among the many available diagnostic tools, nuclear magnetic resonance seems to be the most accurate for confirming the diagnosis of Graves' orbitopathy.

  1. Diagnostics Tools Identify Faults Prior to Failure

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Through the SBIR program, Rochester, New York-based Impact Technologies LLC collaborated with Ames Research Center to commercialize the Center's Hybrid Diagnostic Engine, or HyDE, software. The fault-detecting program is now incorporated into a software suite that identifies potential faults early in the design phase of systems ranging from printers to vehicles and robots, saving time and money.

  2. High Pressure Particulate Physics Facility

    DTIC Science & Technology

    2011-03-26

    controlled loading conditions, nanosecond time resolution diagnostics are required. Therefore, state of the art diagnostic tools such as Velocity...front end plate. The Data Acquisition System (DAS) is based on the state of the art National Instruments PXI system. The architecture provides...obtained by copper wire. In the future x-ray cinematography, line VISAR and time indexed spectroscopy are planned. SECTION III SUMMARY We are...

  3. Going DEEP: guidelines for building simulation-based team assessments.

    PubMed

    Grand, James A; Pearce, Marina; Rench, Tara A; Chao, Georgia T; Fernandez, Rosemarie; Kozlowski, Steve W J

    2013-05-01

    Whether for team training, research or evaluation, making effective use of simulation-based technologies requires robust, reliable and accurate assessment tools. Extant literature on simulation-based assessment practices has primarily focused on scenario and instructional design; however, relatively little direct guidance has been provided regarding the challenging decisions and fundamental principles related to assessment development and implementation. The objective of this manuscript is to introduce a generalisable assessment framework supplemented by specific guidance on how to construct and ensure valid and reliable simulation-based team assessment tools. The recommendations reflect best practices in assessment and are designed to empower healthcare educators, professionals and researchers with the knowledge to design and employ valid and reliable simulation-based team assessments. Information and actionable recommendations associated with creating assessments of team processes (non-technical 'teamwork' activities) and performance (demonstration of technical proficiency) are presented which provide direct guidance on how to Distinguish the underlying competencies one aims to assess, Elaborate the measures used to capture team member behaviours during simulation activities, Establish the content validity of these measures and Proceduralise the measurement tools in a way that is systematically aligned with the goals of the simulation activity while maintaining methodological rigour (DEEP). The DEEP framework targets fundamental principles and critical activities that are important for effective assessment, and should benefit healthcare educators, professionals and researchers seeking to design or enhance any simulation-based assessment effort.

  4. Artificial intelligence in hematology.

    PubMed

    Zini, Gina

    2005-10-01

    Artificial intelligence (AI) is a computer-based science which aims to simulate human brain faculties using a computational system. A brief history of this new science goes from the creation of the first artificial neuron in 1943 to the first artificial neural network application to genetic algorithms. The potential for a similar technology in medicine was immediately identified by scientists and researchers. The possibility to store and process all medical knowledge has made this technology very attractive to assist or even surpass clinicians in reaching a diagnosis. Applications of AI in medicine include devices applied to clinical diagnosis in neurology and cardiopulmonary diseases, as well as the use of expert or knowledge-based systems in routine clinical use for diagnosis, therapeutic management and prognostic evaluation. Biological applications include genome sequencing or DNA gene expression microarrays, modeling gene networks, analysis and clustering of gene expression data, pattern recognition in DNA and proteins, and protein structure prediction. In the field of hematology the first devices based on AI were applied to routine laboratory data management. New tools concern the differential diagnosis of specific diseases such as anemias, thalassemias and leukemias, based on neural networks trained with data from peripheral blood analysis. A revolution in cancer diagnosis, including the diagnosis of hematological malignancies, has been the introduction of the first microarray-based and bioinformatic approach for molecular diagnosis: a systematic approach based on the monitoring of the simultaneous expression of thousands of genes using DNA microarrays, independently of previous biological knowledge, analysed using AI devices. Using gene profiling, the traditional diagnostic pathways move from clinical to molecular-based diagnostic systems.

  5. Reinterpreting the cardiovascular system as a mechanical model

    NASA Astrophysics Data System (ADS)

    Lemos, Diogo; Machado, José; Minas, Graça; Soares, Filomena; Barros, Carla; Leão, Celina Pinto

    2013-10-01

    The simulation of the different physiological systems is very useful as a pedagogical tool, allowing a better understanding of the mechanisms and functions of the processes. The observation of physiological phenomena through mechanical simulators represents a great asset. Furthermore, the development of these simulators allows reinterpreting physiological systems, with the advantage of using the same transducers and sensors that are commonly used in diagnostic and therapeutic cardiovascular procedures for the monitoring of system parameters. The cardiovascular system is one of the most important systems of the human body and has been the target of several biomedical studies. The present work describes a mechanical simulation of the cardiovascular system, in particular the systemic circulation, which can be described in terms of its hemodynamic variables. From the mechanical process and parameters, the physiological system's behavior was reproduced as accurately as possible.

  6. A grid matrix-based Raman spectroscopic method to characterize different cell milieu in biopsied axillary sentinel lymph nodes of breast cancer patients.

    PubMed

    Som, Dipasree; Tak, Megha; Setia, Mohit; Patil, Asawari; Sengupta, Amit; Chilakapati, C Murali Krishna; Srivastava, Anurag; Parmar, Vani; Nair, Nita; Sarin, Rajiv; Badwe, R

    2016-01-01

    Raman spectroscopy, which is based upon inelastic scattering of photons, has the potential to emerge as a noninvasive bedside in vivo or ex vivo molecular diagnostic tool. There is a need to improve the sensitivity and predictability of Raman spectroscopy. We developed a grid matrix-based tissue mapping protocol to acquire cellular-specific spectra that also involved digital microscopy for localizing malignant and lymphocytic cells in sentinel lymph node biopsy samples. Biosignals acquired from specific cellular milieus were subjected to advanced supervised analytical methods, i.e., cross-correlation and peak-to-peak ratio, in addition to PCA and PC-LDA. We observed decreased spectral intensity as well as shifts in the spectral peaks of the amide and lipid bands in the completely metastatic (cancer cell) lymph nodes with high cellular density. A spectral library of normal lymphocytes and metastatic cancer cells created using the cellular-specific mapping technique can be utilized to develop an automated smart diagnostic tool for bench-side screening of sampled lymph nodes, supported by ongoing global research in developing better technology and signal and big data processing algorithms.
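
    The PC-LDA step mentioned above (principal component analysis for dimensionality reduction, followed by linear discriminant analysis on the scores) can be sketched with synthetic spectra; this is a generic scikit-learn pipeline, not the authors' implementation:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      n, n_wavenumbers = 60, 500
      spectra = rng.normal(size=(n, n_wavenumbers))      # fake Raman spectra
      labels = rng.integers(0, 2, size=n)                # 0 lymphocyte, 1 metastatic
      spectra[labels == 1, 100:120] += 0.5               # fake intensity shift in one band

      pc_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      pc_lda.fit(spectra, labels)
      print("training accuracy:", pc_lda.score(spectra, labels))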

  7. Rasch Model Based Analysis of the Force Concept Inventory

    ERIC Educational Resources Information Center

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-01-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear…
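
    For reference, the dichotomous Rasch model assigns a success probability from the difference between person ability and item difficulty; a minimal sketch in the standard textbook form, not tied to the FCI data themselves:

      # Sketch: dichotomous Rasch model -- probability that a person with
      # ability theta answers an item of difficulty b correctly.
      import math

      def rasch_prob(theta: float, b: float) -> float:
          """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      # A more able person has a higher success probability on the same item:
      print(rasch_prob(1.0, 0.0))   # ~0.73
      print(rasch_prob(-1.0, 0.0))  # ~0.27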

  8. Effects of Pattern Matching, Pattern Discrimination, and Experience in the Development of Diagnostic Expertise.

    ERIC Educational Resources Information Center

    Papa, Frank; And Others

    1990-01-01

    In this study an artificial intelligence assessment tool used disease-by-feature frequency estimates to create disease prototypes for nine common causes of acute chest pain. The tool then used each subject's prototypes and a pattern-recognition-based decision-making mechanism to diagnose 18 myocardial infarction cases. (MLW)

  9. Machine learning for the meta-analyses of microbial pathogens' volatile signatures.

    PubMed

    Palma, Susana I C J; Traguedo, Ana P; Porteira, Ana R; Frias, Maria J; Gamboa, Hugo; Roque, Ana C A

    2018-02-20

    Non-invasive and fast diagnostic tools based on volatolomics hold great promise for the control of infectious diseases. However, the tools to identify microbial volatile organic compounds (VOCs) discriminating between human pathogens are still missing. Artificial intelligence is increasingly recognised as an essential tool in health sciences. Machine learning algorithms based on support vector machines and feature selection tools were applied here to find sets of microbial VOCs with pathogen-discrimination power. Studies reporting VOCs emitted by human microbial pathogens published between 1977 and 2016 were used as source data. A set of 18 VOCs is sufficient to predict the identity of 11 microbial pathogens with high accuracy (77%) and precision (62-100%). For each of the 11 pathogens there is one set of VOCs that can predict the presence of that pathogen in a sample with high accuracy and precision (86-90%). The implemented pathogen classification methodology supports future database updates to include new pathogen-VOC data, which will enrich the classifiers. The sets of VOCs identified enable improved selectivity of non-invasive infection diagnostics using artificial olfaction devices.
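
    A hedged sketch of the general approach named in the abstract, pairing a linear support vector machine with recursive feature elimination on a synthetic VOC abundance table; the data and the informative-feature layout are invented for illustration.

      # Sketch: SVM with recursive feature elimination, in the spirit of
      # selecting a small VOC panel with discrimination power.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.feature_selection import RFE

      rng = np.random.default_rng(2)
      n_samples, n_vocs = 120, 60
      X = rng.normal(0, 1, (n_samples, n_vocs))   # synthetic VOC abundances
      y = rng.integers(0, 2, n_samples)           # two pathogen classes
      X[y == 1, :5] += 1.5                        # only 5 "VOCs" are informative

      selector = RFE(SVC(kernel="linear"), n_features_to_select=5).fit(X, y)
      print("selected VOC indices:", np.flatnonzero(selector.support_))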

  10. Analytical Utility of Mass Spectral Binning in Proteomic Experiments by SPectral Immonium Ion Detection (SPIID)*

    PubMed Central

    Kelstrup, Christian D.; Frese, Christian; Heck, Albert J. R.; Olsen, Jesper V.; Nielsen, Michael L.

    2014-01-01

    Unambiguous identification of tandem mass spectra is a cornerstone in mass-spectrometry-based proteomics. As the study of post-translational modifications (PTMs) by means of shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry, the so-called diagnostic ions, which unequivocally identify a given mass spectrum as related to a specific PTM. Although such ions offer tremendous analytical advantages, algorithms to decipher MS/MS spectra for the presence of diagnostic ions in an unbiased manner are currently lacking. Here, we present a systematic spectral-pattern-based approach for the discovery of diagnostic ions and new fragmentation mechanisms in shotgun proteomics datasets. The developed software tool is designed to analyze large sets of high-resolution peptide fragmentation spectra independent of the fragmentation method, instrument type, or protease employed. To benchmark the software tool, we analyzed large higher-energy collisional activation dissociation datasets of samples containing phosphorylation, ubiquitylation, SUMOylation, formylation, and lysine acetylation. Using the developed software tool, we were able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Because the investigated tandem mass spectra data were acquired with high mass accuracy, unambiguous interpretation and determination of the chemical composition for the majority of detected fragment ions was feasible. Collectively we present a freely available software tool that allows for comprehensive and automatic analysis of analogous product ions in tandem mass spectra and systematic mapping of fragmentation mechanisms related to common amino acids. PMID:24895383
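
    One plausible reading of the histogram comparison described above, as a sketch: bin fragment m/z values across many spectra and flag bins occupied far more often in modified than in unmodified spectra. The data below are synthetic, and the 126.13 m/z marker is only a stand-in for a PTM-related fragment ion.

      # Sketch: candidate diagnostic ions from binned m/z occupancy histograms.
      import numpy as np

      rng = np.random.default_rng(3)

      def binned_frequency(spectra, bin_width=0.01, mz_max=500.0):
          """Fraction of spectra in which each m/z bin is occupied."""
          edges = np.arange(0.0, mz_max + bin_width, bin_width)
          occupancy = np.zeros(len(edges) - 1)
          for mz in spectra:                 # one fragment m/z array per spectrum
              hist, _ = np.histogram(mz, bins=edges)
              occupancy += hist > 0
          return occupancy / len(spectra), edges

      # Synthetic data: modified spectra share one recurring fragment.
      unmod = [rng.uniform(0, 500, 80) for _ in range(100)]
      mod = [np.append(rng.uniform(0, 500, 80), 126.13) for _ in range(100)]

      freq_mod, edges = binned_frequency(mod)
      freq_unm, _ = binned_frequency(unmod)
      idx = np.argmax(freq_mod - freq_unm)
      print(f"candidate diagnostic ion near m/z {edges[idx]:.2f}")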

  11. Chaos in plasma simulation and experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watts, C.; Newman, D.E.; Sprott, J.C.

    1993-09-01

    We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas, using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low-dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents and short-term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low-dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional, with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
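
    Of the nonlinear tools listed, the correlation dimension is the most compact to illustrate; below is a minimal Grassberger-Procaccia correlation-sum sketch on synthetic data (not the DEBS or DTEM outputs).

      # Sketch: correlation sum C(r); the slope of log C(r) vs log r
      # estimates the correlation dimension of the underlying attractor.
      import numpy as np

      def correlation_sum(points, r):
          """Fraction of distinct point pairs closer than r."""
          n = len(points)
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          return (np.sum(d < r) - n) / (n * (n - 1))   # exclude self-pairs

      # Example: points on a circle should give dimension near 1.
      theta = np.random.default_rng(4).uniform(0, 2 * np.pi, 400)
      pts = np.column_stack([np.cos(theta), np.sin(theta)])
      radii = np.array([0.05, 0.1, 0.2, 0.4])
      C = np.array([correlation_sum(pts, r) for r in radii])
      slopes = np.diff(np.log(C)) / np.diff(np.log(radii))
      print("local slope estimates (correlation dimension):", slopes)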

  12. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  13. Evaluation of observed blast loading effects on NIF x-ray diagnostic collimators.

    PubMed

    Masters, N D; Fisher, A; Kalantar, D; Prasad, R; Stölken, J S; Wlodarczyk, C

    2014-11-01

    We present the "debris wind" models used to estimate the impulsive load to which x-ray diagnostics and other structures are subject during National Ignition Facility experiments. These models are used as part of the engineering design process. Isotropic models, based on simulations or simplified "expanding shell" models, are augmented by debris wind multipliers to account for directional anisotropy. We present improvements to these multipliers based on measurements of the permanent deflections of diagnostic components: 4× for the polar direction and 2× within the equatorial plane, the latter relaxing the previous heuristic debris wind multiplier.

  14. Allelic Variation of Cytochrome P450s Drives Resistance to Bednet Insecticides in a Major Malaria Vector.

    PubMed

    Ibrahim, Sulaiman S; Riveron, Jacob M; Bibby, Jaclyn; Irving, Helen; Yunta, Cristina; Paine, Mark J I; Wondji, Charles S

    2015-10-01

    Scale up of Long Lasting Insecticide Nets (LLINs) has massively contributed to reduce malaria mortality across Africa. However, resistance to pyrethroid insecticides in malaria vectors threatens its continued effectiveness. Deciphering the detailed molecular basis of such resistance and designing diagnostic tools is critical to implement suitable resistance management strategies. Here, we demonstrated that allelic variation in two cytochrome P450 genes is the most important driver of pyrethroid resistance in the major African malaria vector Anopheles funestus and detected key mutations controlling this resistance. An Africa-wide polymorphism analysis of the duplicated genes CYP6P9a and CYP6P9b revealed that both genes are directionally selected with alleles segregating according to resistance phenotypes. Modelling and docking simulations predicted that resistant alleles were better metabolizers of pyrethroids than susceptible alleles. Metabolism assays performed with recombinant enzymes of various alleles confirmed that alleles from resistant mosquitoes had significantly higher activities toward pyrethroids. Additionally, transgenic expression in Drosophila showed that flies expressing resistant alleles of both genes were significantly more resistant to pyrethroids compared with those expressing the susceptible alleles, indicating that allelic variation is the key resistance mechanism. Furthermore, site-directed mutagenesis and functional analyses demonstrated that three amino acid changes (Val109Ile, Asp335Glu and Asn384Ser) from the resistant allele of CYP6P9b were key pyrethroid resistance mutations inducing high metabolic efficiency. The detection of these first DNA markers of metabolic resistance to pyrethroids allows the design of DNA-based diagnostic tools to detect and track resistance associated with bednets scale up, which will improve the design of evidence-based resistance management strategies.

  15. Allelic Variation of Cytochrome P450s Drives Resistance to Bednet Insecticides in a Major Malaria Vector

    PubMed Central

    Ibrahim, Sulaiman S.; Riveron, Jacob M.; Bibby, Jaclyn; Irving, Helen; Yunta, Cristina; Paine, Mark J. I.; Wondji, Charles S.

    2015-01-01

    Scale up of Long Lasting Insecticide Nets (LLINs) has massively contributed to reduce malaria mortality across Africa. However, resistance to pyrethroid insecticides in malaria vectors threatens its continued effectiveness. Deciphering the detailed molecular basis of such resistance and designing diagnostic tools is critical to implement suitable resistance management strategies. Here, we demonstrated that allelic variation in two cytochrome P450 genes is the most important driver of pyrethroid resistance in the major African malaria vector Anopheles funestus and detected key mutations controlling this resistance. An Africa-wide polymorphism analysis of the duplicated genes CYP6P9a and CYP6P9b revealed that both genes are directionally selected with alleles segregating according to resistance phenotypes. Modelling and docking simulations predicted that resistant alleles were better metabolizers of pyrethroids than susceptible alleles. Metabolism assays performed with recombinant enzymes of various alleles confirmed that alleles from resistant mosquitoes had significantly higher activities toward pyrethroids. Additionally, transgenic expression in Drosophila showed that flies expressing resistant alleles of both genes were significantly more resistant to pyrethroids compared with those expressing the susceptible alleles, indicating that allelic variation is the key resistance mechanism. Furthermore, site-directed mutagenesis and functional analyses demonstrated that three amino acid changes (Val109Ile, Asp335Glu and Asn384Ser) from the resistant allele of CYP6P9b were key pyrethroid resistance mutations inducing high metabolic efficiency. The detection of these first DNA markers of metabolic resistance to pyrethroids allows the design of DNA-based diagnostic tools to detect and track resistance associated with bednets scale up, which will improve the design of evidence-based resistance management strategies. PMID:26517127

  16. Internet-based system for simulation-based medical planning for cardiovascular disease.

    PubMed

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  17. Custom oligonucleotide array-based CGH: a reliable diagnostic tool for detection of exonic copy-number changes in multiple targeted genes

    PubMed Central

    Vasson, Aurélie; Leroux, Céline; Orhant, Lucie; Boimard, Mathieu; Toussaint, Aurélie; Leroy, Chrystel; Commere, Virginie; Ghiotti, Tiffany; Deburgrave, Nathalie; Saillour, Yoann; Atlan, Isabelle; Fouveaut, Corinne; Beldjord, Cherif; Valleix, Sophie; Leturcq, France; Dodé, Catherine; Bienvenu, Thierry; Chelly, Jamel; Cossée, Mireille

    2013-01-01

    The frequency of disease-related large rearrangements (referred to as copy-number mutations, CNMs) varies among genes, and the search for these mutations has an important place in diagnostic strategies. In recent years, the CGH method using custom-designed high-density oligonucleotide-based arrays has allowed the development of a powerful tool for the detection of alterations at the level of exons, and made it possible to provide flexibility through customizable chip design. The aim of our study was to test a custom-designed oligonucleotide CGH array in a diagnostic laboratory setting that analyses several genes involved in various genetic diseases, and to compare it with conventional strategies. To this end, we designed a 12-plex CGH array (135k; 135 000 probes/subarray) (Roche NimbleGen) with exonic and intronic oligonucleotide probes covering 26 genes routinely analyzed in the laboratory. We tested control samples with known CNMs, and patients for whom the genetic causes underlying their disorders were unknown. The contribution of this technique is undeniable. Indeed, it proved reproducible, reliable and sensitive enough to detect heterozygous single-exon deletions or duplications, complex rearrangements and somatic mosaicism. In addition, it improves the reliability of CNM detection and allows determination of boundaries precisely enough to direct targeted sequencing of breakpoints. All of these points, together with the possibility of simultaneous analysis of several genes and its 'homemade' scalability, make it a valuable tool as a new diagnostic approach to CNMs. PMID:23340513

  18. Simulation of networks of spiking neurons: A review of tools and strategies

    PubMed Central

    Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami

    2009-01-01

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type, integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models are implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
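
    As a toy instance of the clock-driven strategy discussed in the review, here is a minimal leaky integrate-and-fire network with current-based synapses; all constants are arbitrary illustrative values, and spike times are quantized to the fixed time grid.

      # Sketch: clock-driven simulation of leaky integrate-and-fire neurons
      # with current-based synapses (fixed Euler time step).
      import numpy as np

      rng = np.random.default_rng(5)
      n, dt, t_end = 100, 0.1e-3, 0.5          # 100 neurons, 0.1 ms step, 0.5 s
      tau, v_th, v_reset = 20e-3, 1.0, 0.0     # membrane constants (arb. units)
      w = rng.normal(0, 0.02, (n, n))          # random synaptic weights
      v = rng.uniform(0, 1, n)                 # initial membrane potentials
      i_ext = 1.05                             # constant suprathreshold drive
      spike_count = 0

      for _ in range(int(t_end / dt)):
          spiked = v >= v_th
          spike_count += spiked.sum()
          v[spiked] = v_reset
          syn = w @ spiked.astype(float)       # current-based synaptic input
          v += dt / tau * (-v + i_ext) + syn   # Euler step of leaky dynamics
      print("mean firing rate (Hz):", spike_count / n / t_end)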

  19. Lymph Node Metastases Optical Molecular Diagnostic and Radiation Therapy

    DTIC Science & Technology

    2017-03-01

    structures and not molecular functions. The one tool commonly used for metastases imaging is nuclear medicine. Positron emission tomography (PET) is... ...be visualized at a relevant stage, largely because most imaging is based upon structures and not molecular functions. But there are no tools to... ...system suitable for imaging signals from small animals on the standard radiation therapy tools. (3) To evaluate the limits on structural, metabolic...

  20. Evaluation of the Community Multi-scale Air Quality (CMAQ) ...

    EPA Pesticide Factsheets

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version of CMAQ contains important bug fixes for several issues that were identified in CMAQv5.0.2 and additionally includes updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against available surface and upper-air measurements available during the time period simulated. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, proces…

  1. Clinical Validity of the ADI-R in a US-Based Latino Population.

    PubMed

    Vanegas, Sandra B; Magaña, Sandra; Morales, Miguel; McNamara, Ellyn

    2016-05-01

    The Autism Diagnostic Interview-Revised (ADI-R) has been validated as a tool to aid in the diagnosis of Autism; however, given the growing diversity in the United States, the ADI-R must be validated for different languages and cultures. This study evaluates the validity of the ADI-R in a US-based Latino, Spanish-speaking population of 50 children and adolescents with ASD and developmental disability. Sensitivity and specificity of the ADI-R as a diagnostic tool were moderate, but lower than previously reported values. Validity of the social reciprocity and restrictive and repetitive behaviors domains was high, but low in the communication domain. Findings suggest that language discordance between caregiver and child may influence reporting of communication symptoms and contribute to lower sensitivity and specificity.

  2. Model-based diagnostics of gas turbine engine lubrication systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byington, C.S.

    1998-09-01

    The objective of the current research was to develop improved methodology for diagnosing anomalies and maintaining oil lubrication systems for gas turbine engines. The effort focused on the development of reasoning modules that utilize the existing, inexpensive sensors and are applicable to on-line monitoring within the full-authority digital engine controller (FADEC) of the engine. The target application is the Enhanced TF-40B gas turbine engine that powers the Landing Craft Air Cushion (LCAC) platform. To accomplish the development of the requisite data fusion algorithms and automated reasoning for the diagnostic modules, Penn State ARL produced a generic Turbine Engine Lubrication System Simulator (TELSS) and Data Fusion Workbench (DFW). TELSS is a portable simulator code that calculates lubrication system parameters based upon one-dimensional fluid flow resistance network equations. Validation of the TF-40B modules was performed using engineering and limited test data. The simulation model was used to analyze operational data from the LCAC fleet. The TELSS, as an integral portion of the DFW, provides the capability to experiment with combinations of variables and feature vectors that characterize normal and abnormal operation of the engine lubrication system. The model-based diagnostics approach is applicable to all gas turbine engines and mechanical transmissions with similar pressure-fed lubrication systems.
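
    A sketch of the one-dimensional flow-resistance-network idea attributed to TELSS, treated by analogy with a resistor network: flow conservation at each internal node yields a linear system for node pressures. The topology and all values below are hypothetical.

      # Sketch: solving a 1-D flow-resistance network for node pressures.
      # Hypothetical network: pump (fixed pressure) -> node 1 -> node 2 -> sump.
      import numpy as np

      p_pump, p_sump = 500.0, 0.0              # kPa, illustrative values
      R = np.array([2.0, 3.0, 5.0])            # kPa per (L/min), illustrative
      g = 1.0 / R                              # hydraulic conductances

      # Flow balance at internal nodes 1 and 2 gives a linear system A p = b.
      A = np.array([[g[0] + g[1], -g[1]],
                    [-g[1], g[1] + g[2]]])
      b = np.array([g[0] * p_pump, g[2] * p_sump])
      p1, p2 = np.linalg.solve(A, b)
      print(f"node pressures: {p1:.1f}, {p2:.1f} kPa")
      print(f"through-flow: {(p_pump - p1) * g[0]:.1f} L/min")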

  3. Current ante-mortem techniques for diagnosis of bovine tuberculosis.

    PubMed

    Bezos, Javier; Casal, Carmen; Romero, Beatriz; Schroeder, Bjoern; Hardegger, Roland; Raeber, Alex J; López, Lissette; Rueda, Paloma; Domínguez, Lucas

    2014-10-01

    Bovine tuberculosis (TB), mainly caused by Mycobacterium bovis, is a zoonotic disease with implications for public health and an economic impact due to decreased production and limitations on trade. Bovine TB is subject to official eradication campaigns based mainly on a test-and-slaughter policy using diagnostic assays based on the cell-mediated immune response, such as the intradermal tuberculin test and the gamma-interferon (IFN-γ) assay. Moreover, several diagnostic assays based on the detection of specific antibodies (Abs) have been developed in the last few years with the aim of complementing the current diagnostic techniques in the near future. This review provides an overview of the current ante-mortem diagnostic tools for the diagnosis of bovine TB regarding their historical background, methodologies, and the sensitivity (Se) and specificity (Sp) obtained in previous studies under different epidemiological situations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    PubMed Central

    Baldi, Alfonso; Quartulli, Marco; Murace, Raffaele; Dragonetti, Emanuele; Manganaro, Mario; Guerra, Oscar; Bizzi, Stefano

    2010-01-01

    Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning-curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR). PMID:24281070

  5. Development of a smartphone-based pulse oximeter with adaptive SNR/power balancing.

    PubMed

    Phelps, Tom; Haowei Jiang; Hall, Drew A

    2017-07-01

    Millions worldwide suffer from diseases that exhibit early warning signs detectable by standard clinical-grade diagnostic tools. Unfortunately, such tools are often prohibitively expensive in the developing world, leading to inadequate healthcare and high mortality rates. To address this problem, a smartphone-based pulse oximeter is presented that interfaces with the phone through the audio jack, enabling point-of-care measurements of heart rate (HR) and oxygen saturation (SpO2). The device is designed to utilize existing phone resources (e.g., the processor, battery, and memory), resulting in a more portable and inexpensive diagnostic tool than standalone equivalents. By adaptively tuning the LED driving signal, the device is less dependent on phone-specific audio jack properties than prior audio-jack-based work, making it universally compatible with all smartphones. We demonstrate that the pulse oximeter can adaptively optimize the signal-to-noise ratio (SNR) within the power constraints of a mobile phone (< 10 mW) while maintaining high accuracy (HR error < 3.4% and SpO2 error < 3.7%) against a clinical-grade instrument.
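
    For context, pulse oximeters conventionally estimate SpO2 from the "ratio of ratios" of the red and infrared photoplethysmogram AC/DC components; below is a sketch using a generic textbook calibration line, not the calibration of the device described in the record.

      # Sketch: SpO2 from the classical ratio-of-ratios of two PPG traces.
      import numpy as np

      def spo2_from_ppg(red: np.ndarray, infrared: np.ndarray) -> float:
          """Estimate SpO2 (%) from red and infrared PPG traces."""
          def ac_dc(x):
              return (x.max() - x.min()), x.mean()   # crude AC and DC parts
          ac_r, dc_r = ac_dc(red)
          ac_ir, dc_ir = ac_dc(infrared)
          r = (ac_r / dc_r) / (ac_ir / dc_ir)        # ratio of ratios
          return 110.0 - 25.0 * r                    # generic empirical line

      t = np.linspace(0, 5, 1000)                     # 5 s of synthetic PPG
      red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)  # 72 bpm pulsation
      ir = 1.0 + 0.04 * np.sin(2 * np.pi * 1.2 * t)
      print(f"SpO2 estimate: {spo2_from_ppg(red, ir):.1f} %")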

  6. Tuberculosis Diagnostics in 2015: Landscape, Priorities, Needs, and Prospects

    PubMed Central

    Pai, Madhukar; Schito, Marco

    2015-01-01

    In 2015, tuberculosis remains a major global health problem, and drug-resistant tuberculosis is a growing threat. Although tuberculosis diagnosis in many countries is still reliant on older tools, new diagnostics are changing the landscape. Stimulated, in part, by the success and roll out of Xpert MTB/RIF, there is now considerable interest in new technologies. The landscape looks promising, with a robust pipeline of new tools, particularly molecular diagnostics, and well over 50 companies actively engaged in product development. However, new diagnostics are yet to reach scale, and there needs to be greater convergence between diagnostics development and development of shorter-duration tuberculosis drug regimens. Another concern is the relative absence of non–sputum-based diagnostics in the pipeline for children and of biomarker tests for triage, cure, and progression of latent Mycobacterium tuberculosis infection. Several initiatives, described in this supplement, have been launched to further stimulate product development and policy, including assessment of needs and priorities, development of target product profiles, compilation of data on resistance-associated mutations, and assessment of market size and potential for new diagnostics. Advocacy is needed to increase funding for tuberculosis research and development, and governments in high-burden countries must invest more in tuberculosis control to meet post-2015 targets for care, control, and prevention. PMID:25765103

  7. Diagnostic budgets of analyzed and modelled tropical plumes

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.; Vest, Gerry W.

    1993-01-01

    Blackwell et al. successfully simulated tropical plumes in a global barotropic model valid at 200 mb. The plume evolved in response to strong equatorial convergence which simulated a surge in the Walker Circulation. The defining characteristics of simulated plumes are: a subtropical jet with southerlies emanating from the deep tropics; a tropical/mid-latitude trough to the west; a convergence/divergence dipole straddling the trough; and strong cross contour flow at the tropical base of the jet. Diagnostic budgets of vorticity, divergence, and kinetic energy are calculated to explain the evolution of the modelled plumes. Budgets describe the unforced (basic) state, forced plumes, forced cases with no plumes, and ECMWF analyzed plumes.
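
    For orientation, the vorticity budget evaluated in such diagnostics is conventionally written in the standard barotropic form below (a textbook statement, not necessarily the authors' exact formulation):

      \frac{\partial \zeta}{\partial t}
        = -\,\mathbf{V}\cdot\nabla(\zeta + f)
          - (\zeta + f)\,\nabla\cdot\mathbf{V} + R ,

    where \zeta is the relative vorticity, f the planetary vorticity, \mathbf{V} the horizontal wind, and R a residual collecting forcing and dissipation.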

  8. GEM detectors development for radiation environment: neutron tests and simulations

    NASA Astrophysics Data System (ADS)

    Chernyshova, Maryna; Jednoróg, Sławomir; Malinowski, Karol; Czarski, Tomasz; Ziółkowski, Adam; Bieńkowska, Barbara; Prokopowicz, Rafał; Łaszyńska, Ewa; Kowalska-Strzeciwilk, Ewa; Poźniak, Krzysztof T.; Kasprowicz, Grzegorz; Zabołotny, Wojciech; Wojeński, Andrzej; Krawczyk, Rafał D.; Linczuk, Paweł; Potrykus, Paweł; Bajdel, Barcel

    2016-09-01

    One of the requests from the ongoing ITER-Like Wall Project is to have diagnostics for Soft X-Ray (SXR) monitoring in tokamaks. Such diagnostics should be focused on tungsten emission measurements, as increased attention is currently being paid to tungsten, which has become the main candidate for the plasma-facing material in ITER and future fusion reactors. In addition, such diagnostics should be able to withstand the harsh radiation environment in a tokamak during its operation. The presented work is related to the development of such diagnostics based on Gas Electron Multiplier (GEM) technology. More specifically, the influence of neutron radiation on the performance of GEM detectors is studied both experimentally and through computer simulations. The neutron-induced radioactivity (after neutron source exposure) was found to be not pronounced compared with the impact of other secondary neutron reaction products (during the exposure).

  9. NASA Tech Briefs, September 2011

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Topics covered include: Fused Reality for Enhanced Flight Test Capabilities; Thermography to Inspect Insulation of Large Cryogenic Tanks; Crush Test Abuse Stand; Test Generator for MATLAB Simulations; Dynamic Monitoring of Cleanroom Fallout Using an Air Particle Counter; Enhancement to Non-Contacting Stress Measurement of Blade Vibration Frequency; Positively Verifying Mating of Previously Unverifiable Flight Connectors; Radiation-Tolerant Intelligent Memory Stack - RTIMS; Ultra-Low-Dropout Linear Regulator; Excitation of a Parallel Plate Waveguide by an Array of Rectangular Waveguides; FPGA for Power Control of MSL Avionics; UAVSAR Active Electronically Scanned Array; Lockout/Tagout (LOTO) Simulator; Silicon Carbide Mounts for Fabry-Perot Interferometers; Measuring the In-Process Figure, Final Prescription, and System Alignment of Large Optics and Segmented Mirrors Using Lidar Metrology; Fiber-Reinforced Reactive Nano-Epoxy Composites; Polymerization Initiated at the Sidewalls of Carbon Nanotubes; Metal-Matrix/Hollow-Ceramic-Sphere Composites; Piezoelectrically Enhanced Photocathodes; Iridium-Doped Ruthenium Oxide Catalyst for Oxygen Evolution; Improved Mo-Re VPS Alloys for High-Temperature Uses; Data Service Provider Cost Estimation Tool; Hybrid Power Management-Based Vehicle Architecture; Force Limit System; Levitated Duct Fan (LDF) Aircraft Auxiliary Generator; Compact, Two-Sided Structural Cold Plate Configuration; AN Fitting Reconditioning Tool; Active Response Gravity Offload System; Method and Apparatus for Forming Nanodroplets; Rapid Detection of the Varicella Zoster Virus in Saliva; Improved Devices for Collecting Sweat for Chemical Analysis; Phase-Controlled Magnetic Mirror for Wavefront Correction; and Frame-Transfer Gating Raman Spectroscopy for Time-Resolved Multiscalar Combustion Diagnostics.

  10. Measuring the Diagnostic Features of Social (Pragmatic) Communication Disorder: An Exploratory Study.

    PubMed

    Yuan, Haiying; Dollaghan, Christine

    2018-03-27

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition introduced a new neurodevelopmental disorder, social (pragmatic) communication disorder (SPCD), that is characterized by deficits in 4 areas of communication. Although descriptions of these areas are provided, no assessment tools for SPCD are recommended. The purpose of this study was to examine the extent to which items from measurement tools commonly used in assessing pragmatic language impairment and related disorders might be useful in assessing the characteristics of social communication that define SPCD in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Based on a literature search, 594 items from assessment tools commonly used to measure social communication abilities in people with pragmatic language impairment were identified. The first author judged whether each item reflected 1, more than 1, or none of the 4 SPCD diagnostic characteristics. After a brief training process, five second raters independently mapped subsets of items to the 6 categories. We calculated the percentage of agreement and Cohen's kappa for each pair of raters in assigning items to categories. Percentages of agreement ranged from 76% to 82%, and Cohen's kappa values ranged from .69 to .76, indicating substantial agreement. Sources and item numbers for the 206 items that both raters assigned to the same SPCD feature are provided. These items may provide guidance in assessing SPCD and in designing standardized screening and diagnostic measures for SPCD.
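
    A compact sketch of the agreement statistics reported above (percent agreement and Cohen's kappa), computed on invented rating data:

      # Sketch: percent agreement and Cohen's kappa for two raters.
      import numpy as np

      def cohens_kappa(r1, r2):
          cats = np.unique(np.concatenate([r1, r2]))
          p_o = np.mean(r1 == r2)                          # observed agreement
          p_e = sum((np.mean(r1 == c) * np.mean(r2 == c))  # chance agreement
                    for c in cats)
          return p_o, (p_o - p_e) / (1.0 - p_e)

      rater1 = np.array(["A", "B", "A", "C", "B", "A", "C", "C", "B", "A"])
      rater2 = np.array(["A", "B", "A", "C", "A", "A", "C", "B", "B", "A"])
      p_o, kappa = cohens_kappa(rater1, rater2)
      print(f"agreement = {p_o:.0%}, kappa = {kappa:.2f}")   # 80%, 0.69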

  11. Model diagnostics in reduced-rank estimation

    PubMed Central

    Chen, Kun

    2016-01-01

    Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which lead to exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with hand-writing digital images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches. PMID:28003860

  12. Model diagnostics in reduced-rank estimation.

    PubMed

    Chen, Kun

    2016-01-01

    Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which lead to exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with hand-writing digital images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches.
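
    In the classical full-rank regression special case, the leverage scores mentioned above reduce to the diagonal of the hat matrix; the sketch below shows only that special case (the paper's reduced-rank and Stein-based generalizations are not reproduced here).

      # Sketch: hat-matrix leverage scores flagging a high-leverage point.
      import numpy as np

      rng = np.random.default_rng(6)
      X = rng.normal(0, 1, (50, 3))
      X[0] = [8.0, 8.0, 8.0]                     # one high-leverage observation

      H = X @ np.linalg.solve(X.T @ X, X.T)      # hat matrix X (X'X)^-1 X'
      leverage = np.diag(H)                      # sums to p = 3
      print("most influential observation:", int(np.argmax(leverage)))
      print("its leverage vs the average p/n:",
            leverage.max(), X.shape[1] / X.shape[0])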

  13. Regression Models for Identifying Noise Sources in Magnetic Resonance Images

    PubMed Central

    Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.

    2009-01-01

    Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models including a Rician regression model and two associated normal models to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The tools for graphical display permit graphical visualization of the values for the goodness-of-fit statistic and influence measures. Finally, we conduct simulation studies to evaluate performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
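
    A short sketch of the Rician magnitude model that motivates this regression framework: magnitude data formed from two Gaussian quadrature channels acquire an upward bias at low signal-to-noise ratio. All values are illustrative.

      # Sketch: simulating Rician-distributed MR magnitude data.
      import numpy as np

      rng = np.random.default_rng(7)
      true_signal, sigma, n = 2.0, 1.0, 100_000
      real = true_signal + rng.normal(0, sigma, n)   # in-phase channel
      imag = rng.normal(0, sigma, n)                 # quadrature channel
      magnitude = np.hypot(real, imag)               # Rician-distributed

      print("true signal:", true_signal)
      print("mean magnitude (biased upward):", magnitude.mean().round(3))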

  14. Quantifying Solar Cell Cracks in Photovoltaic Modules by Electroluminescence Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spataru, Sergiu; Hacke, Peter; Sera, Dezso

    2015-06-14

    This article proposes a method for quantifying the percentage of partially and totally disconnected solar cell cracks by analyzing electroluminescence images of the photovoltaic module taken under high- and low-current forward bias. The method is based on the analysis of the module's electroluminescence intensity distribution, applied at module and cell level. These concepts are demonstrated on a crystalline silicon photovoltaic module that was subjected to several rounds of mechanical loading and humidity-freeze cycling, causing increasing levels of solar cell cracks. The proposed method can be used as a diagnostic tool to rate cell damage or the quality of modules after transportation. Moreover, the method can be automated and used in quality control by module manufacturers and installers, or as a diagnostic tool by plant operators and diagnostic service providers.
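
    A toy version of the intensity-distribution idea: threshold an electroluminescence image and report the dark-area fraction as a crack-damage estimate. The synthetic image and the median-based threshold are assumptions for illustration, not the calibrated procedure of the article.

      # Sketch: dark-area fraction of a synthetic EL image as a damage rating.
      import numpy as np

      rng = np.random.default_rng(8)
      el = rng.normal(1000, 50, (100, 100))    # synthetic EL image of one cell
      el[60:100, 70:100] = rng.normal(200, 30, (40, 30))   # dark (cracked) region

      threshold = 0.5 * np.median(el)          # dark pixels emit well below median
      dark_fraction = np.mean(el < threshold)
      print(f"disconnected area estimate: {dark_fraction:.1%}")   # ~12%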

  15. HIDRA-MAT: A Material Analysis Tool for Fusion Devices

    NASA Astrophysics Data System (ADS)

    Andruczyk, Daniel; Rizkallah, Rabel; Bedoya, Felipe; Kapat, Aveek; Schamis, Hanna; Allain, Jean Paul

    2017-10-01

    The former WEGA stellarator, now operating as HIDRA at the University of Illinois, will be used almost exclusively to study the intimate relationship between the plasma and the surfaces of the different materials it interacts with. A Material Analysis Tool (HIDRA-MAT) is being designed and will be built based on the successful Material Analysis and Particle Probe (MAPP) currently used on NSTX-U at PPPL. This will be an in-situ material diagnostic probe, meaning that all analysis can be done without breaking vacuum, which allows surface changes to be studied in real time. HIDRA-MAT will consist of several in-situ diagnostics, including Langmuir probes (LP), Thermal Desorption Spectroscopy (TDS), X-ray Photoelectron Spectroscopy (XPS) and Ion Scattering Spectroscopy (ISS). This presentation will outline the HIDRA-MAT diagnostic and initial design, as well as its integration into the HIDRA system.

  16. Supply Chain Simulator: A Scenario-Based Educational Tool to Enhance Student Learning

    ERIC Educational Resources Information Center

    Siddiqui, Atiq; Khan, Mehmood; Akhtar, Sohail

    2008-01-01

    Simulation-based educational products are excellent set of illustrative tools that proffer features like visualization of the dynamic behavior of a real system, etc. Such products have great efficacy in education and are known to be one of the first-rate student centered learning methodologies. These products allow students to practice skills such…

  17. Recent advances in salivary cancer diagnostics enabled by biosensors and bioelectronics.

    PubMed

    Mishra, Saswat; Saadat, Darius; Kwon, Ohjin; Lee, Yongkuk; Choi, Woon-Seop; Kim, Jong-Hoon; Yeo, Woon-Hong

    2016-07-15

    There is a high demand for a non-invasive, rapid, and highly accurate tool for disease diagnostics. Recently, saliva-based diagnostics for the detection of specific biomarkers have drawn significant attention, since sample extraction is simple, cost-effective, and precise. Compared to blood, saliva contains a similar variety of DNA, RNA, proteins, metabolites, and microbiota that can be compiled into a multiplex of cancer detection markers. The salivary diagnostic method holds great potential for early-stage cancer diagnostics without any complicated and expensive procedures. Here, we review various cancer biomarkers in saliva and compare the biomarkers' efficacy with traditional diagnostics and state-of-the-art bioelectronics. We summarize biomarkers in four major groups: genomics, transcriptomics, proteomics, and metabolomics/microbiota. Representative bioelectronic systems for each group are summarized based on various stages of a cancer. A systematic study of oxidative stress establishes the relationship between macromolecules and cancer biomarkers in saliva. We also introduce the most recent examples of salivary diagnostic electronics based on nanotechnologies that can offer rapid, yet highly accurate detection of biomarkers. A concluding section highlights areas of opportunity in the further development and applications of these technologies. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Eco-Logic: Logic-Based Approaches to Ecological Modelling

    Treesearch

    Daniel L. Schmoldt

    1991-01-01

    This paper summarizes the simulation research carried out during 1984-1989 at the University of Edinburgh. Two primary objectives of their research are 1) to provide tools for manipulating simulation models (i.e., implementation tools) and 2) to provide advice on conceptualizing real-world phenomena into an idealized representation for simulation (i.e., model design...

  19. SolarTherm: A flexible Modelica-based simulator for CSP systems

    NASA Astrophysics Data System (ADS)

    Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John

    2017-06-01

    Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.

  20. Betatron radiation based diagnostics for plasma wakefield accelerated electron beams at the SPARC_LAB test facility

    NASA Astrophysics Data System (ADS)

    Shpakov, V.; Anania, M. P.; Biagioni, A.; Chiadroni, E.; Cianchi, A.; Curcio, A.; Dabagov, S.; Ferrario, M.; Filippi, F.; Marocchino, A.; Paroli, B.; Pompili, R.; Rossi, A. R.; Zigler, A.

    2016-09-01

    Recent progress in wakefield acceleration has shown great potential for providing high-gradient accelerating fields, while the quality of the beams remains relatively poor. Precise knowledge of the beam size at the exit from the plasma, and of the matching conditions for externally injected beams, is key to improving beam quality. Betatron radiation emitted by the beam during acceleration in the plasma is a powerful tool for transverse beam size measurement, while also being non-intercepting. In this work we report on the technical solutions chosen at SPARC_LAB for such a diagnostic tool, along with the expected parameters of the betatron radiation.

  1. Current molecular and emerging nanobiotechnology approaches for the detection of microbial pathogens.

    PubMed

    Theron, Jacques; Eugene Cloete, Thomas; de Kwaadsteniet, Michele

    2010-11-01

    Waterborne microbial diseases are escalating worldwide, increasing the need for powerful and sensitive diagnostic tools. Molecular methodologies, including immunological and nucleic acid-based methods, have only recently been applied in the water sector. Advances in nanotechnology and nanomaterials have opened the door for the development of new diagnostic tools with increased sensitivity and speed, and reduced cost and labor. Quantum dots, FloDots, gold nanoparticles, magnetic nanoparticles, carbon nanotubes, nanowires, and nanocantilevers, with their unique optical and physical properties, have already been applied in nanodiagnostics. Nanobiotechnology, once the remaining technical and practical problems have been addressed, will play an important role in the detection of microbial pathogens.

  2. Design and performance evaluation of a 20-aperture multipinhole collimator for myocardial perfusion imaging applications.

    PubMed

    Bowen, Jason D; Huang, Qiu; Ellin, Justin R; Lee, Tzu-Cheng; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2013-10-21

    Single photon emission computed tomography (SPECT) myocardial perfusion imaging remains a critical tool in the diagnosis of coronary artery disease. However, after more than three decades of use, photon detection efficiency remains poor and unchanged. This is due to the continued reliance on parallel-hole collimators first introduced in 1964. These collimators possess poor geometric efficiency. Here we present the performance evaluation results of a newly designed multipinhole collimator with 20 pinhole apertures (PH20) for commercial SPECT systems. Computer simulations and numerical observer studies were used to assess the noise, bias and diagnostic imaging performance of a PH20 collimator in comparison with those of a low energy high resolution (LEHR) parallel-hole collimator. Ray-driven projector/backprojector pairs were used to model SPECT imaging acquisitions, including simulation of noiseless projection data and performing MLEM/OSEM image reconstructions. Poisson noise was added to noiseless projections for realistic projection data. Noise and bias performance were investigated for five mathematical cardiac and torso (MCAT) phantom anatomies imaged at two gantry orbit positions (19.5 and 25.0 cm). PH20 and LEHR images were reconstructed with 300 MLEM iterations and 30 OSEM iterations (ten subsets), respectively. Diagnostic imaging performance was assessed by a receiver operating characteristic (ROC) analysis performed on a single MCAT phantom; however, in this case PH20 images were reconstructed with 75 pixel-based OSEM iterations (four subsets). Four PH20 projection views from two positions of a dual-head camera acquisition and 60 LEHR projections were simulated for all studies. At uniformly-imposed resolution of 12.5 mm, significant improvements in SNR and diagnostic sensitivity (represented by the area under the ROC curve, or AUC) were realized when PH20 collimators are substituted for LEHR parallel-hole collimators. SNR improves by factors of 1.94-2.34 for the five patient anatomies and two orbital positions studied. For the ROC analysis the PH20 AUC is larger than the LEHR AUC with a p-value of 0.0067. Bias performance, however, decreases with the use of PH20 collimators. Systematic analyses showed PH20 collimators present improved diagnostic imaging performance over LEHR collimators, requiring only collimator exchange on existing SPECT cameras for their use.

  3. Design and performance evaluation of a 20-aperture multipinhole collimator for myocardial perfusion imaging applications

    NASA Astrophysics Data System (ADS)

    Bowen, Jason D.; Huang, Qiu; Ellin, Justin R.; Lee, Tzu-Cheng; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2013-10-01

    Single photon emission computed tomography (SPECT) myocardial perfusion imaging remains a critical tool in the diagnosis of coronary artery disease. However, after more than three decades of use, photon detection efficiency remains poor and unchanged. This is due to the continued reliance on parallel-hole collimators first introduced in 1964. These collimators possess poor geometric efficiency. Here we present the performance evaluation results of a newly designed multipinhole collimator with 20 pinhole apertures (PH20) for commercial SPECT systems. Computer simulations and numerical observer studies were used to assess the noise, bias and diagnostic imaging performance of a PH20 collimator in comparison with those of a low energy high resolution (LEHR) parallel-hole collimator. Ray-driven projector/backprojector pairs were used to model SPECT imaging acquisitions, including simulation of noiseless projection data and performing MLEM/OSEM image reconstructions. Poisson noise was added to noiseless projections for realistic projection data. Noise and bias performance were investigated for five mathematical cardiac and torso (MCAT) phantom anatomies imaged at two gantry orbit positions (19.5 and 25.0 cm). PH20 and LEHR images were reconstructed with 300 MLEM iterations and 30 OSEM iterations (ten subsets), respectively. Diagnostic imaging performance was assessed by a receiver operating characteristic (ROC) analysis performed on a single MCAT phantom; however, in this case PH20 images were reconstructed with 75 pixel-based OSEM iterations (four subsets). Four PH20 projection views from two positions of a dual-head camera acquisition and 60 LEHR projections were simulated for all studies. At uniformly-imposed resolution of 12.5 mm, significant improvements in SNR and diagnostic sensitivity (represented by the area under the ROC curve, or AUC) were realized when PH20 collimators are substituted for LEHR parallel-hole collimators. SNR improves by factors of 1.94-2.34 for the five patient anatomies and two orbital positions studied. For the ROC analysis the PH20 AUC is larger than the LEHR AUC with a p-value of 0.0067. Bias performance, however, decreases with the use of PH20 collimators. Systematic analyses showed PH20 collimators present improved diagnostic imaging performance over LEHR collimators, requiring only collimator exchange on existing SPECT cameras for their use.

  4. Preliminary validation of a new methodology for estimating dose reduction protocols in neonatal chest computed radiographs

    NASA Astrophysics Data System (ADS)

    Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.

    2006-03-01

    The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively-noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference in grey-scale-value means or standard deviations between simulated and original entire-chest ROI regions. The observer performance suggests that an exposure >= 0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the use of the simulation tool is promising for achieving ALARA exposures in children.
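
    A simplified sketch of dose-reduction simulation by noise injection, using a pure Poisson quantum-noise model; this is not the validated CR simulation tool of the study, and detector blur and electronic noise are ignored.

      # Sketch: simulating a lower-dose radiograph from a high-dose one by
      # rescaling expected quanta and resampling Poisson noise.
      import numpy as np

      def simulate_low_dose(counts_high: np.ndarray, dose_ratio: float,
                            rng=np.random.default_rng(9)) -> np.ndarray:
          """Scale detected quanta by dose_ratio and resample Poisson noise."""
          scaled = counts_high * dose_ratio          # expected quanta at low dose
          return rng.poisson(scaled) / dose_ratio    # renormalize to same scale

      high = np.full((64, 64), 10_000.0)             # uniform high-dose exposure
      low = simulate_low_dose(high, dose_ratio=0.01) # e.g. 10 mR -> 0.1 mR
      print("relative noise, high vs simulated low:",
            1 / np.sqrt(10_000), low.std() / low.mean())   # ~0.01 vs ~0.1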

  5. 2D imaging X-ray diagnostic for measuring the current density distribution in a wide-area electron beam produced in a multiaperture diode with plasma cathode

    NASA Astrophysics Data System (ADS)

    Kurkuchekov, V.; Kandaurov, I.; Trunev, Y.

    2018-05-01

    A simple and inexpensive X-ray diagnostic tool was designed for measuring the cross-sectional current density distribution in a low-relativistic pulsed electron beam produced in a source based on an arc-discharge plasma cathode and multiaperture diode-type electron optical system. The beam parameters were as follows: Uacc = 50–110 kV, Ibeam = 20–100 A, τbeam = 0.1–0.3 ms. The beam effective diameter was ca. 7 cm. Based on a pinhole camera, the diagnostic allows one to obtain a 2D profile of electron beam flux distribution on a flat metal target in a single shot. The linearity of the diagnostic system response to the electron flux density was established experimentally. Spatial resolution of the diagnostic was also estimated in special test experiments. The optimal choice of the main components of the diagnostic technique is discussed.

  6. Acquiring, Representing, and Evaluating a Competence Model of Diagnostic Strategy.

    ERIC Educational Resources Information Center

    Clancey, William J.

    This paper describes NEOMYCIN, a computer program that models one physician's diagnostic reasoning within a limited area of medicine. NEOMYCIN's knowledge base and reasoning procedure constitute a model of how human knowledge is organized and how it is used in diagnosis. The hypothesis is tested that such a procedure can be used to simulate both…

  7. Collision detection and modeling of rigid and deformable objects in laparoscopic simulator

    NASA Astrophysics Data System (ADS)

    Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru

    2015-03-01

    Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at a minimum of 30 fps. Our current laparoscopic simulator detects collisions between a point on the tool tip and the organ surfaces; haptic devices are attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects, and to perform the subsequent collision response, with haptic feedback, in real time.
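
    The abstract does not spell out the authors' scheme; the sketch below only illustrates the basic idea of multi-point collision detection, testing sample points along the tool shaft rather than just the tip against the organ surface (here reduced to a point cloud with brute-force distance tests; a production simulator would use a spatial hierarchy such as a BVH). All names and parameters are illustrative:

        import numpy as np

        def rod_collisions(p0, p1, surface_points, radius, n_samples=16):
            # Sample points along the rod segment p0 -> p1 and flag those
            # closer than `radius` to any surface point.
            t = np.linspace(0.0, 1.0, n_samples)[:, None]
            samples = (1 - t) * p0 + t * p1
            d = np.linalg.norm(samples[:, None, :] - surface_points[None, :, :], axis=2)
            return np.where(d.min(axis=1) < radius)[0]

        rng = np.random.default_rng(0)
        organ = rng.random((500, 3))                        # stand-in organ surface
        hits = rod_collisions(np.array([0.0, 0.0, -0.5]),   # rod starting outside ...
                              np.array([0.5, 0.5, 0.5]),    # ... and entering the cloud
                              organ, radius=0.05)
        print(hits)                                         # indices of colliding samples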

  8. PREFACE: Diagnostics for electrical discharge light sources: pushing the limits

    NASA Astrophysics Data System (ADS)

    Zissis, Georges; Haverlag, Marco

    2010-06-01

    Light sources play an indispensable role in the daily life of any human being. Quality of life, health and urban security related to traffic and crime prevention depend on light and on its quality. In fact, every day approximately 30 billion electric light sources operate worldwide. These electric light sources consume almost 19% of worldwide electricity production. Finding new ways to light lamps is a challenge where the stakes are scientific, technological, economic and environmental. The production of more efficient light sources is a sustainable solution for humanity. There are many opportunities for not only enhancing the efficiency and reliability of lighting systems but also for improving the quality of light as seen by the end user. This is possible through intelligent use of new technologies, deep scientific understanding of the operating principles of light sources and knowledge of the varied human requirements for different types of lighting in different settings. A revolution in the domain of light source technology is on the way: high-brightness light-emitting diodes arriving in the general lighting market, together with organic LEDs (OLEDs), are producing spectacular advances. However, unlike incandescent lamps, electrical discharge lamps are far from disappearing from the market. In addition, new generations of discharge lamps based on molecular radiators are becoming a reality. There are still many scientific and technological challenges to be addressed in this direction. Diagnostics are important for understanding the fundamental mechanisms taking place in the discharge plasma. This understanding is an absolute necessity for system optimization leading to more efficient and high-quality light sources. The studied medium is rather complex, but new diagnostic techniques coupled to innovative ideas and powerful tools have been developed in recent years. This cluster issue of seven papers illustrates these efforts. The selected papers cover all domains, from high to low pressure and dielectric barrier lamps, from breakdown to acoustic resonance. In the domain of high pressure lamps, J J Curry shows how coherent and incoherent x-ray scattering can be used as an imaging technique adapted to lamps. J Hirsch et al treat the acoustic resonance phenomenon that seriously limits the frequency domain for high pressure lamp operation. M Jinno et al illustrate a method that allows for measuring Xe buffer gas pressure in Hg-free metal halide lamps for automotive applications. In the domain of low pressure lamps, M Gendre et al investigate the breakdown phase by means of optical and electrical diagnostic tools. The similarity rules used a long time ago for simulating plasma behaviour based on invariants are now serving as diagnostic tools, as shown in the paper by D Michael et al. N Dagang et al show how impurities can be detected in Hg-free electrodeless lamps and more particularly in dielectric barrier discharges emitting excimer radiation. The quality of light is illustrated by a final example by R Kozakov et al on how to qualify the light output from the lamp with respect to biological effects on humans.

  9. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools in the optimization of business processes, especially in solving the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of visual models of production and distribution processes.

  10. From diagnostics to metagenomics: Applications of DNA-based tools in forest pathology

    Treesearch

    Amy L. Ross-Davis; Mee-Sook Kim; Jane E. Stewart; John W. Hanna; John D. Shaw; Ned B. Klopfenstein

    2013-01-01

    Advances in molecular technology provide an accessible set of tools to 1) help forest pathologists detect, identify, and monitor forest pathogens, 2) examine the evolutionary relationships and global distributions of forest pathogens and their hosts, 3) assess the diversity and structure of host and pathogen populations, and 4) evaluate the structure and function of...

  11. Simulation tools for analyzer-based x-ray phase contrast imaging system with a conventional x-ray source

    NASA Astrophysics Data System (ADS)

    Caudevilla, Oriol; Zhou, Wei; Stoupin, Stanislav; Verman, Boris; Brankov, J. G.

    2016-09-01

    Analyzer-based X-ray phase contrast imaging (ABI) belongs to a broader family of phase-contrast (PC) X-ray imaging modalities. Unlike conventional X-ray radiography, which measures only X-ray absorption, PC imaging also measures the deflection of X-rays induced by the object's refractive properties. It has been shown that refraction imaging provides better contrast when imaging soft tissue, which is of great interest in medical imaging applications. In this paper, we introduce a simulation tool specifically designed to simulate an analyzer-based X-ray phase contrast imaging system with a conventional polychromatic X-ray source. By utilizing ray tracing and basic physical principles of diffraction theory, our simulation tool can predict the X-ray beam profile shape, the energy content, and the total throughput (photon count) at the detector. In addition, we can evaluate the imaging system point-spread function for various system configurations.
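
    The key mechanism in ABI is that the analyzer crystal converts small refraction angles into intensity changes through its rocking curve. A toy illustration of that mapping, assuming a Gaussian rocking curve (real ABI simulations use dynamical-diffraction reflectivity curves, and the width chosen here is arbitrary):

        import numpy as np

        def analyzer_transmission(delta_theta, fwhm=10e-6):
            # Gaussian stand-in for the analyzer rocking curve; delta_theta is
            # the ray's angular deviation from the Bragg angle, in radians.
            sigma = fwhm / 2.355
            return np.exp(-0.5 * (delta_theta / sigma) ** 2)

        # rays refracted by the object arrive with small angular deviations
        deviations = np.array([0.0, 2e-6, 5e-6, 10e-6, 20e-6])
        print(analyzer_transmission(deviations))   # intensity falls off with angle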

  12. A Tabletop Tool for Modeling Life Support Systems

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.; Majumdar, A.; McDaniels, D.; Stewart, E.

    2003-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool, based on a 10%-scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched-index-of-refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full scale, and thereby the time scale of the model to represent 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (a fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environment control parameters such as core flows (axial flows) and cross flows (from registers and diffusers), examine potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments, and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next-generation life support systems and configurations.
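
    The quoted scaling factors follow from Reynolds-number matching between air at full scale and water in the model. A quick arithmetic check, assuming standard kinematic viscosities for air and water:

        # Reynolds-number matching: V_m * L_m / nu_m = V_f * L_f / nu_f
        L_ratio = 0.1                      # 10% scale model
        nu_air = 1.5e-5                    # kinematic viscosity of air, m^2/s
        nu_water = 1.0e-6                  # kinematic viscosity of water, m^2/s

        V_ratio = (1 / L_ratio) * (nu_water / nu_air)   # model/full-scale velocity
        T_ratio = L_ratio / V_ratio                     # model/full-scale time scale
        print(round(V_ratio, 2), round(T_ratio, 2))     # 0.67 and 0.15

    This reproduces the stated 67% velocity ratio and 15% time scale.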

  13. Statistical physics of medical diagnostics: Study of a probabilistic model.

    PubMed

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
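
    For intuition, posterior disease probabilities in such a sign-disease model can be computed exactly on a toy instance (two diseases, noisy-OR signs) by enumeration; the paper's point is that for realistically large models this becomes intractable, motivating the mean-field and Monte Carlo machinery. All numbers below are illustrative:

        import itertools
        import numpy as np

        priors = {'flu': 0.10, 'cold': 0.20}             # P(disease), illustrative
        signs = {'fever': {'leak': 0.01, 'flu': 0.9, 'cold': 0.2},
                 'cough': {'leak': 0.05, 'flu': 0.4, 'cold': 0.7}}

        def p_sign(s, d_state):
            # noisy-OR: the sign is absent only if the leak and every active
            # disease all fail to produce it
            q = 1 - signs[s]['leak']
            for d, on in d_state.items():
                if on:
                    q *= 1 - signs[s][d]
            return 1 - q

        def posterior(observed):
            post, z = {d: 0.0 for d in priors}, 0.0
            for bits in itertools.product([0, 1], repeat=len(priors)):
                d_state = dict(zip(priors, bits))
                w = np.prod([priors[d] if on else 1 - priors[d]
                             for d, on in d_state.items()])
                for s, present in observed.items():
                    p = p_sign(s, d_state)
                    w *= p if present else 1 - p
                z += w
                for d, on in d_state.items():
                    if on:
                        post[d] += w
            return {d: v / z for d, v in post.items()}

        print(posterior({'fever': True}))   # fever alone already favors flu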

  14. Statistical physics of medical diagnostics: Study of a probabilistic model

    NASA Astrophysics Data System (ADS)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  15. An agent based simulation tool for scheduling emergency department physicians.

    PubMed

    Jones, Spencer S; Evans, R Scott

    2008-11-06

    Emergency department overcrowding is a problem that threatens the public health of communities and compromises the quality of care given to individual patients. The Institute of Medicine recommends that hospitals employ information technology and operations research methods to reduce overcrowding. This paper describes the development of an agent-based simulation tool that has been designed to evaluate the impact of various physician staffing configurations on patient waiting times in the emergency department. We evaluate the feasibility of this tool at a single hospital emergency department.
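
    The abstract does not detail the agents or their rules; the simplest quantitative backbone of such a tool is an event-driven queueing model in which mean waiting time is measured as physician staffing varies. A self-contained sketch (Poisson arrivals, exponential service; all rates are hypothetical, and a real agent-based model would add patient acuity, shift schedules, and behavior rules):

        import heapq
        import random

        def simulate_ed(n_physicians, arrival_rate, service_mean,
                        horizon=10000.0, seed=0):
            # Event-driven M/M/c queue: returns the mean patient waiting time.
            rng = random.Random(seed)
            free, queue, waits = n_physicians, [], []
            events = [(rng.expovariate(arrival_rate), 'arrive')]
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == 'arrive':
                    queue.append(t)
                    heapq.heappush(events,
                                   (t + rng.expovariate(arrival_rate), 'arrive'))
                else:
                    free += 1
                if free and queue:                 # start the next service
                    free -= 1
                    waits.append(t - queue.pop(0))
                    heapq.heappush(events,
                                   (t + rng.expovariate(1 / service_mean), 'depart'))
            return sum(waits) / len(waits) if waits else 0.0

        for n in (4, 5, 6):                        # compare staffing levels
            print(n, round(simulate_ed(n, arrival_rate=0.5, service_mean=7.0), 2))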

  16. Modeling and simulation of a beam emission spectroscopy diagnostic for the ITER prototype neutral beam injector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbisan, M., E-mail: marco.barbisan@igi.cnr.it; Zaniol, B.; Pasqualotto, R.

    2014-11-15

    A test facility for the development of the neutral beam injection system for ITER is under construction at Consorzio RFX. It will host two experiments: SPIDER, a 100 keV H−/D− ion RF source, and MITICA, a prototype of the full-performance ITER injector (1 MV, 17 MW beam). A set of diagnostics will monitor the operation and allow optimization of the performance of the two prototypes. In particular, beam emission spectroscopy will measure the uniformity and the divergence of the fast-particle beam exiting the ion source and travelling through the beam line components. This type of measurement is based on the collection of the Hα/Dα emission resulting from the interaction of the energetic particles with the background gas. A numerical model has been developed to simulate the spectrum of the collected emissions in order to design this diagnostic and to study its performance. The paper describes the model at the base of the simulations and presents the modeled Hα spectra in the case of the MITICA experiment.
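
    The dominant feature of a beam-emission spectrum is the Doppler shift of the Hα line with beam energy and viewing angle; the sketch below computes just that first-order shift (the full model would add Stark splitting, beam divergence, and instrument broadening). The beam energy and viewing angles here are illustrative:

        import numpy as np

        E = 100e3 * 1.602e-19      # 100 keV beam particle energy, J (illustrative)
        m_H = 1.674e-27            # hydrogen atom mass, kg
        c = 2.998e8                # speed of light, m/s
        lambda0 = 656.28e-9        # rest H-alpha wavelength, m

        v = np.sqrt(2 * E / m_H)   # non-relativistic beam speed
        for theta_deg in (30, 60, 90):
            theta = np.radians(theta_deg)
            shift = lambda0 * (v / c) * np.cos(theta)   # first-order Doppler shift
            print(theta_deg, round(shift * 1e9, 2), 'nm')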

  17. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    PubMed

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology to evaluate the binary rating items. Reliability was assessed by comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen's kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items, and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
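
    As a reminder of the reliability statistic used here, Cohen's kappa corrects the raw agreement between two raters for agreement expected by chance. A minimal computation on made-up binary checklist ratings (not the study's data):

        import numpy as np

        def cohens_kappa(r1, r2):
            # Cohen's kappa for two raters' binary item ratings (1 = performed)
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = (r1 == r2).mean()                      # observed agreement
            p_yes = r1.mean() * r2.mean()               # chance both rate "yes"
            p_no = (1 - r1.mean()) * (1 - r2.mean())    # chance both rate "no"
            pe = p_yes + p_no
            return (po - pe) / (1 - pe)

        live  = [1, 1, 0, 1, 1, 0, 1, 1]   # hypothetical real-time ratings
        video = [1, 1, 0, 1, 0, 0, 1, 1]   # hypothetical video-review ratings
        print(round(cohens_kappa(live, video), 3))   # ~0.71, "near-perfect" is >0.8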

  18. Direct extraction of electron parameters from magnetoconductance analysis in mesoscopic ring array structures

    NASA Astrophysics Data System (ADS)

    Sawada, A.; Faniel, S.; Mineshige, S.; Kawabata, S.; Saito, K.; Kobayashi, K.; Sekine, Y.; Sugiyama, H.; Koga, T.

    2018-05-01

    We report an approach for examining electron properties using information about the shape and size of a nanostructure as a measurement reference. This approach quantifies the spin precession angles per unit length directly by considering the time-reversal interferences on chaotic return trajectories within mesoscopic ring arrays (MRAs). Experimentally, we fabricated MRAs using nanolithography in InGaAs quantum wells which had a gate-controllable spin-orbit interaction (SOI). As a result, we observed an Onsager symmetry related to relativistic magnetic fields, which provided us with indispensable information for the semiclassical billiard ball simulation. Our simulations, developed based on the real-space formalism of the weak localization/antilocalization effect including the degree of freedom for electronic spin, reproduced the experimental magnetoconductivity (MC) curves with high fidelity. The values of five distinct electron parameters (Fermi wavelength, spin precession angles per unit length for two different SOIs, impurity scattering length, and phase coherence length) were thereby extracted from a single MC curve. The methodology developed here is applicable to wide ranges of nanomaterials and devices, providing a diagnostic tool for exotic properties of two-dimensional electron systems.

  19. SHIELDS Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordanova, Vania Koleva

    Predicting variations in the near-Earth space environment that can lead to spacecraft damage and failure, i.e. "space weather", remains a big space physics challenge. A new capability was developed at Los Alamos National Laboratory (LANL) to understand, model, and predict Space Hazards Induced near Earth by Large Dynamic Storms, the SHIELDS framework. This framework simulates the dynamics of the Surface Charging Environment (SCE), the hot (keV) electrons representing the source and seed populations for the radiation belts, on both macro- and micro-scale. In addition to using physics-based models (like RAM-SCB, BATS-R-US, and iPIC3D), new data assimilation techniques employing data from LANL instruments on the Van Allen Probes and geosynchronous satellites were developed. An order of magnitude improvement in the accuracy of the simulation of the spacecraft surface charging environment was thus obtained. SHIELDS also includes a post-processing tool designed to calculate the surface charging for specific spacecraft geometry using the Curvilinear Particle-In-Cell (CPIC) code and to evaluate anomalies' relation to SCE dynamics. Such diagnostics are critically important when performing forensic analyses of space-system failures.

  20. A fast ultrasonic simulation tool based on massively parallel implementations

    NASA Astrophysics Data System (ADS)

    Lambert, Jason; Rougeron, Gilles; Lacassagne, Lionel; Chatillon, Sylvain

    2014-02-01

    This paper presents an optimized CIVA ultrasonic inspection simulation tool, which takes advantage of the power of massively parallel architectures: graphical processing units (GPU) and multi-core general-purpose processors (GPP). This tool is based on the classical approach used in CIVA: the interaction model is based on Kirchhoff, and the ultrasonic field around the defect is computed by the pencil method. The model has been adapted and parallelized for both architectures. At this stage, the configurations addressed by the tool are: multi- and mono-element probes, planar specimens made of simple isotropic materials, and planar rectangular defects or side-drilled holes of small diameter. Validations of the model accuracy and performance measurements are presented.

  1. New Tooling System for Forming Aluminum Beverage Can End Shell

    NASA Astrophysics Data System (ADS)

    Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo

    2011-08-01

    This paper proposes a new tooling system for forming shells of aluminum beverage can ends. First, the forming process of a conventional tooling system was simulated using three-dimensional finite element models. Simulation results were confirmed to be consistent with those of axisymmetric models, so simulations for further study were performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system is improved by about 3.6%. Influences of the tool uppermost surface profiles and tool initial positions in the new tooling system were investigated, and a design optimization method based on the numerical simulations was then applied to search for optimum design points, in order to minimize thinning subject to constraints on the geometrical dimensions of the shell. Finally, the performance of the shell subjected to internal pressure was confirmed to meet design requirements.

  2. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  3. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  4. Computational analysis of the effectiveness of blood flushing with saline injection from an intravascular diagnostic catheter

    PubMed Central

    Ghata, Narugopal; Aldredge, Ralph C.; Bec, Julien; Marcu, Laura

    2015-01-01

    Optical techniques including fluorescence lifetime spectroscopy have demonstrated potential as a tool for study and diagnosis of arterial vessel pathologies. However, their application in intravascular diagnostic procedures has been hampered by the presence of blood hemoglobin, which affects the light delivery to and the collection from the vessel wall. We report a computational fluid dynamics model that allows for the optimization of blood-flushing parameters in a manner that minimizes the amount of saline needed to clear the optical field of view and reduces any adverse effects caused by the external saline jet. A 3D turbulence (k−ω) model was employed for Eulerian–Eulerian two-phase flow to simulate the flow inside and around a side-viewing fiber-optic catheter. The current analysis demonstrates the effects of various parameters, including infusion and blood flow rates, vessel diameters, and the pulsatile nature of blood flow, on the flow structure around the catheter tip. The results from this study can be utilized in determining the optimal flushing rate for a given vessel diameter, blood flow rate, and maximum wall shear stress that the vessel wall can sustain, and subsequently in optimizing the design parameters of optical-based intravascular catheters. PMID:24953876

  5. Nonlinear data-driven identification of polymer electrolyte membrane fuel cells for diagnostic purposes: A Volterra series approach

    NASA Astrophysics Data System (ADS)

    Ritzberger, D.; Jakubek, S.

    2017-09-01

    In this work, a data-driven identification method, based on polynomial nonlinear autoregressive models with exogenous inputs (NARX) and the Volterra series, is proposed to describe the dynamic and nonlinear voltage and current characteristics of polymer electrolyte membrane fuel cells (PEMFCs). The structure selection and parameter estimation of the NARX model are performed on broad-band voltage/current data. By transforming the time-domain NARX model into a Volterra series representation using the harmonic probing algorithm, a frequency-domain description of the linear and nonlinear dynamics is obtained. With the Volterra kernels corresponding to different operating conditions, information from existing diagnostic tools in the frequency domain, such as electrochemical impedance spectroscopy (EIS) and total harmonic distortion analysis (THDA), is effectively combined. Additionally, the time-domain NARX model can be utilized for fault detection by evaluating the difference between measured and simulated output. To increase the fault detectability, an optimization problem is introduced which maximizes this output residual to obtain proper excitation frequencies. As a possible extension, it is shown that by optimizing the periodic signal shape itself, the fault detectability is further increased.
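
    To make the identification step concrete, the sketch below fits a toy polynomial NARX model by ordinary least squares on simulated broad-band data; the regressor set and the "true" system are invented for illustration and are much simpler than the structures selected in the paper:

        import numpy as np

        def fit_narx(u, y):
            # Fit y[k] ~ a*y[k-1] + b*u[k-1] + c*u[k-1]^2 + d*y[k-1]*u[k-1]
            # by ordinary least squares (a toy regressor set).
            Y = y[1:]
            phi = np.column_stack([y[:-1], u[:-1], u[:-1] ** 2, y[:-1] * u[:-1]])
            theta, *_ = np.linalg.lstsq(phi, Y, rcond=None)
            return theta

        rng = np.random.default_rng(0)
        u = rng.normal(size=500)                 # broad-band excitation
        y = np.zeros(500)
        for k in range(1, 500):                  # "true" weakly nonlinear system
            y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.1 * u[k - 1] ** 2
        print(fit_narx(u, y).round(3))           # recovers ~[0.8, 0.5, 0.1, 0.0]

    In the paper's setting, the estimated coefficients would then be mapped to Volterra kernels via harmonic probing to obtain the frequency-domain description.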

  6. Multiple comorbidities of 21 psychological disorders and relationships with psychosocial variables: a study of the online assessment and diagnostic system within a web-based population.

    PubMed

    Al-Asadi, Ali M; Klein, Britt; Meyer, Denny

    2015-02-26

    While research in the area of e-mental health has received considerable attention over the last decade, there are still many areas that have not been addressed. One such area is the comorbidity of psychological disorders in a Web-based sample using online assessment and diagnostic tools, and the relationships between comorbidities and psychosocial variables. We aimed to identify comorbidities of psychological disorders of an online sample using an online diagnostic tool. Based on diagnoses made by an automated online assessment and diagnostic system administered to a large group of online participants, multiple comorbidities (co-occurrences) of 21 psychological disorders for males and females were identified. We examined the relationships between dyadic comorbidities of anxiety and depressive disorders and the psychosocial variables sex, age, suicidal ideation, social support, and quality of life. An online complex algorithm based on the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition, Text Revision, was used to assign primary and secondary diagnoses of 21 psychological disorders to 12,665 online participants. The frequency of co-occurrences of psychological disorders for males and females was calculated for all disorders. A series of hierarchical loglinear analyses were performed to examine the relationships between the dyadic comorbidities of depression and various anxiety disorders and the variables suicidal ideation, social support, quality of life, sex, and age. A 21-by-21 frequency matrix of co-occurrences of psychological disorders revealed the presence of multiple significant dyadic comorbidities for males and females. Also, for those with some of the dyadic depression and anxiety disorders, the odds of having suicidal ideation, reporting inadequate social support, and reporting poorer quality of life were higher for those with two-disorder comorbidity than for those with only one of the same two disorders. Comorbidities of several psychological disorders using an online assessment tool within a Web-based population were similar to those found in face-to-face clinics using traditional assessment tools. The results provided support for transdiagnostic approaches and confirmed the positive relationship between comorbidity and suicidal ideation, the negative relationship between comorbidity and social support, and the negative relationship between comorbidity and quality of life. Australian and New Zealand Clinical Trials Registry ACTRN121611000704998; http://www.anzctr.org.au/trial_view.aspx?ID=336143 (Archived by WebCite at http://www.webcitation.org/618r3wvOG).
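
    The reported relationships rest on comparing odds between comorbid and single-disorder groups; the elementary computation behind such comparisons is the 2x2 odds ratio. A minimal example with made-up counts (not the study's data):

        import numpy as np

        def odds_ratio(table):
            # Odds ratio from a 2x2 table [[a, b], [c, d]]:
            # rows = comorbid vs single-disorder, cols = ideation yes / no.
            (a, b), (c, d) = table
            return (a * d) / (b * c)

        table = np.array([[120, 280],    # comorbid: ideation yes / no (hypothetical)
                          [ 90, 510]])   # single disorder: ideation yes / no
        print(round(odds_ratio(table), 2))   # > 1 => ideation more likely with comorbidity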

  7. Multiple Comorbidities of 21 Psychological Disorders and Relationships With Psychosocial Variables: A Study of the Online Assessment and Diagnostic System Within a Web-Based Population

    PubMed Central

    Klein, Britt; Meyer, Denny

    2015-01-01

    Background While research in the area of e-mental health has received considerable attention over the last decade, there are still many areas that have not been addressed. One such area is the comorbidity of psychological disorders in a Web-based sample using online assessment and diagnostic tools, and the relationships between comorbidities and psychosocial variables. Objective We aimed to identify comorbidities of psychological disorders of an online sample using an online diagnostic tool. Based on diagnoses made by an automated online assessment and diagnostic system administered to a large group of online participants, multiple comorbidities (co-occurrences) of 21 psychological disorders for males and females were identified. We examined the relationships between dyadic comorbidities of anxiety and depressive disorders and the psychosocial variables sex, age, suicidal ideation, social support, and quality of life. Methods An online complex algorithm based on the criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition, Text Revision, was used to assign primary and secondary diagnoses of 21 psychological disorders to 12,665 online participants. The frequency of co-occurrences of psychological disorders for males and females was calculated for all disorders. A series of hierarchical loglinear analyses were performed to examine the relationships between the dyadic comorbidities of depression and various anxiety disorders and the variables suicidal ideation, social support, quality of life, sex, and age. Results A 21-by-21 frequency matrix of co-occurrences of psychological disorders revealed the presence of multiple significant dyadic comorbidities for males and females. Also, for those with some of the dyadic depression and anxiety disorders, the odds of having suicidal ideation, reporting inadequate social support, and reporting poorer quality of life were higher for those with two-disorder comorbidity than for those with only one of the same two disorders. Conclusions Comorbidities of several psychological disorders using an online assessment tool within a Web-based population were similar to those found in face-to-face clinics using traditional assessment tools. The results provided support for transdiagnostic approaches and confirmed the positive relationship between comorbidity and suicidal ideation, the negative relationship between comorbidity and social support, and the negative relationship between comorbidity and quality of life. Trial Registration Australian and New Zealand Clinical Trials Registry ACTRN121611000704998; http://www.anzctr.org.au/trial_view.aspx?ID=336143 (Archived by WebCite at http://www.webcitation.org/618r3wvOG) PMID:25803420

  8. Simulation of Clinical Diagnosis: A Comparative Study

    PubMed Central

    de Dombal, F. T.; Horrocks, Jane C.; Staniland, J. R.; Gill, P. W.

    1971-01-01

    This paper presents a comparison between three different modes of simulation of the diagnostic process—a computer-based system, a verbal mode, and a further mode in which cards were selected from a large board. A total of 34 subjects worked through a series of 444 diagnostic simulations. The verbal mode was found to be most enjoyable and realistic. At the board, considerable amounts of extra irrelevant data were selected. At the computer, the users asked the same questions every time, whether or not they were relevant to the particular diagnosis. They also found the teletype distracting, noisy, and slow. The need for an acceptable simulation system remains, and at present our Minisim and verbal modes are proving useful in training junior clinical students. Future simulators should be flexible, economical, and acceptably realistic—and to us this latter criterion implies the two-way use of speech. We are currently developing and testing such a system. PMID:5579197

  9. Quality Assessment of Comparative Diagnostic Accuracy Studies: Our Experience Using a Modified Version of the QUADAS-2 Tool

    ERIC Educational Resources Information Center

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-01-01

    Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As…

  10. MITT writer and MITT writer advanced development: Developing authoring and training systems for complex technical domains

    NASA Technical Reports Server (NTRS)

    Wiederholt, Bradley J.; Browning, Elica J.; Norton, Jeffrey E.; Johnson, William B.

    1991-01-01

    MITT Writer is a software system for developing computer based training for complex technical domains. A training system produced by MITT Writer allows a student to learn and practice troubleshooting and diagnostic skills. The MITT (Microcomputer Intelligence for Technical Training) architecture is a reasonable approach to simulation based diagnostic training. MITT delivers training on available computing equipment, delivers challenging training and simulation scenarios, and has economical development and maintenance costs. A 15 month effort was undertaken in which the MITT Writer system was developed. A workshop was also conducted to train instructors in how to use MITT Writer. Earlier versions were used to develop an Intelligent Tutoring System for troubleshooting the Minuteman Missile Message Processing System.

  11. Airborne Turbulence Detection System Certification Tool Set

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2006-01-01

    A methodology and a corresponding set of simulation tools for testing and evaluating turbulence detection sensors have been presented. The tool set is available to industry and the FAA for certification of radar-based airborne turbulence detection systems. The tool set consists of simulated data sets representing convectively induced turbulence, an airborne radar simulation system, hazard tables to convert the radar observable to an aircraft load, documentation, a hazard metric "truth" algorithm, and criteria for scoring the predictions. Analysis indicates that flight test data support spatial buffers for scoring detections. Also, flight data and demonstrations with the tool set suggest the need for a magnitude buffer.

  12. Plasma Diagnostics: Use and Justification in an Industrial Environment

    NASA Astrophysics Data System (ADS)

    Loewenhardt, Peter

    1998-10-01

    The usefulness and importance of plasma diagnostics have played a major role in the development of plasma processing tools in the semiconductor industry. As can be seen through marketing materials from semiconductor equipment manufacturers, results from plasma diagnostic equipment can be a powerful tool in selling the technological leadership of tool design. Some diagnostics have long been used for simple process control such as optical emission for endpoint determination, but in recent years more sophisticated and involved diagnostic tools have been utilized in chamber and plasma source development and optimization. It is now common to find an assortment of tools at semiconductor equipment companies such as Langmuir probes, mass spectrometers, spatial optical emission probes, impedance, ion energy and ion flux probes. An outline of how the importance of plasma diagnostics has grown at an equipment manufacturer over the last decade will be given, with examples of significant and useful results obtained. Examples will include the development and optimization of an inductive plasma source, trends and hardware effects on ion energy distributions, mass spectrometry influences on process development and investigations of plasma-wall interactions. Plasma diagnostic focus, in-house development and proliferation in an environment where financial justification requirements are both strong and necessary will be discussed.

  13. Importance of inlet boundary conditions for numerical simulation of combustor flows

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.; Syed, S. A.; Mcmanus, K. R.

    1983-01-01

    Fluid dynamic computer codes for the mathematical simulation of problems in gas turbine engine combustion systems are required as design and diagnostic tools. To eventually achieve more than qualitative accuracy with these codes, it is desirable to use benchmark experiments for validation studies. Typical of the fluid dynamic computer codes being developed for combustor simulations is the TEACH (Teaching Elliptic Axisymmetric Characteristics Heuristically) solution procedure. It is difficult to find suitable experiments which satisfy the present definition of benchmark quality; for the majority of the available experiments there is a lack of information concerning the boundary conditions. A standard TEACH-type numerical technique is applied to a number of test-case experiments. It is found that numerical simulations of gas turbine combustor-relevant flows can be sensitive to the plane at which the calculations start and to the spatial distributions of inlet quantities for swirling flows.

  14. Turbofan Engine Simulated in a Graphical Simulation Environment

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2004-01-01

    Recently, there has been an increase in the development of intelligent engine technology with advanced active component control. The computer engine models used in these control studies are component-level models (CLM): models that link individual component models of state-space and nonlinear algebraic equations, written in a computer language such as Fortran. The difficulty faced in performing control studies on Fortran-based models is that Fortran is not supported with control design and analysis tools, so there is no means for implementing real-time control. It is desirable to have a simulation environment that is straightforward, has modular graphical components, and allows easy access to health, control, and engine parameters through a graphical user interface. Such a tool should also provide the ability to convert a control design into real-time code, helping to make it an extremely powerful tool in control and diagnostic system development. [Figure: simulation time management, showing Mach number, power lever angle, altitude, ambient temperature change, and afterburner fuel flow versus time, together with the controller and actuator dynamics and component-level model inputs and outputs.] The Controls and Dynamics Technologies Branch at the NASA Glenn Research Center has developed and demonstrated a flexible, generic turbofan engine simulation platform that can meet these objectives, known as the Modular Aero-Propulsion System Simulation (MAPSS). MAPSS is a Simulink-based implementation of a Fortran-based, modern, high-pressure-ratio, dual-spool, low-bypass, military-type variable-cycle engine with a digital controller. Simulink (The MathWorks, Natick, MA) is a computer-aided control design and simulation package that allows the graphical representation of dynamic systems in block-diagram form. MAPSS is a nonlinear, non-real-time system composed of controller and actuator dynamics (CAD) and component-level model (CLM) modules. The controller in the CAD module emulates the functionality of a digital controller, which has a typical update rate of 50 Hz. The CLM module simulates the dynamics of the engine components and uses an update rate of 2500 Hz, which is needed to iterate to balance mass and energy among system components. The actuators in the CAD module use the same sampling rate as those in the CLM. [Figure: two graphs of normalized spool speed versus time and one graph of normalized average metal temperature versus time.] MAPSS was validated via open-loop and closed-loop comparisons with the Fortran simulation. The plots show the normalized results of a closed-loop comparison looking at three states of the model: low-pressure spool speed, high-pressure spool speed, and the average metal temperature measured from the combustor to the high-pressure turbine. In steady state, the error between the simulations is less than 1 percent. During a transient, the difference between the simulations is due to a correction in MAPSS that prevents the gas flow in the bypass duct inlet from flowing forward instead of toward the aft end, which occurs in the Fortran simulation. [Figure: comparison between MAPSS and the Fortran model of the bypass duct inlet flow for power lever angles greater than 35 degrees.]
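
    The two update rates quoted above (a 50 Hz controller closed around a 2500 Hz component-level model) are the defining structural feature of the simulation. A stripped-down sketch of such a multirate loop, with a toy first-order plant and a proportional law standing in for the engine and its controller:

        # multirate loop: controller at 50 Hz, plant model at 2500 Hz
        dt_fast = 1.0 / 2500.0       # CLM integration step
        ratio = 2500 // 50           # CLM steps per controller update
        x, u = 1.0, 0.0              # toy plant state and control input
        for k in range(2500):        # one second of simulated time
            if k % ratio == 0:       # controller runs every 50th plant step
                u = -0.5 * x         # toy proportional control law
            x += dt_fast * (-x + u)  # toy first-order plant dynamics
        print(round(x, 4))           # state decays toward zero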

  15. Mobile in Situ Simulation as a Tool for Evaluation and Improvement of Trauma Treatment in the Emergency Department.

    PubMed

    Amiel, Imri; Simon, Daniel; Merin, Ofer; Ziv, Amitai

    2016-01-01

    Medical simulation is an increasingly recognized tool for teaching, coaching, training, and examining practitioners in the medical field. For many years, simulation has been used to improve trauma care and teamwork. Despite technological advances in trauma simulators, including better means of mobilization and control, most reported simulation-based trauma training has been conducted inside simulation centers, and the practice of mobile simulation in hospitals' trauma rooms has not been investigated fully. The emergency department personnel of a second-level trauma center in Israel were evaluated. Divided into randomly formed trauma teams, they were reviewed twice using in situ mobile simulation training at the hospital's trauma bay. In all, 4 simulations were held before and 4 simulations were held after a structured learning intervention. The intervention included a 1-day simulation-based training conducted at the Israel Center for Medical Simulation (MSR), which included video-based debriefing facilitated by the hospital's 4 trauma team leaders, who completed a 2-day simulation-based instructors' course before the start of the study. The instructors were also trained on performance rating and thus were responsible for the assessment of their respective teams in real time as well as through review of the recorded videos, enabling a comparison of the performances in the mobile simulation exercise before and after the educational intervention. The internal reliability of the experts' evaluation, calculated as Cronbach's α, was 0.786. Statistically significant improvement was observed in 4 of 10 parameters, among which were teamwork (29.64%) and communication (24.48%) (p = 0.00005). The mobile in situ simulation-based training demonstrated efficacy both as an assessment tool for trauma teams' function and as an educational intervention when coupled with in vitro simulation-based training, resulting in a significant improvement of the teams' function in various aspects of treatment. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  16. Approaches to incorporating climate change effects in state and transition simulation models of vegetation

    Treesearch

    Becky K. Kerns; Miles A. Hemstrom; David Conklin; Gabriel I. Yospin; Bart Johnson; Dominique Bachelet; Scott Bridgham

    2012-01-01

    Understanding landscape vegetation dynamics often involves the use of scientifically-based modeling tools that are capable of testing alternative management scenarios given complex ecological, management, and social conditions. State-and-transition simulation model (STSM) frameworks and software such as PATH and VDDT are commonly used tools that simulate how landscapes...

  17. Identification of facilitators and barriers to residents' use of a clinical reasoning tool.

    PubMed

    DiNardo, Deborah; Tilstra, Sarah; McNeil, Melissa; Follansbee, William; Zimmer, Shanta; Farris, Coreen; Barnato, Amber E

    2018-03-28

    While there is some experimental evidence to support the use of cognitive forcing strategies to reduce diagnostic error in residents, the potential usability of such strategies in the clinical setting has not been explored. We sought to test the effect of a clinical reasoning tool on diagnostic accuracy and to obtain feedback on its usability and acceptability. We conducted a randomized behavioral experiment testing the effect of this tool on diagnostic accuracy on written cases among post-graduate year 3 (PGY-3) residents at a single internal medicine residency program in 2014. Residents completed written clinical cases in a proctored setting with and without prompts to use the tool. The tool encouraged reflection on concordant and discordant aspects of each case. We used random-effects regression to assess the effect of the tool on diagnostic accuracy of the independent case sets, controlling for case complexity. We then conducted audiotaped structured focus group debriefing sessions and reviewed the tapes for facilitators of and barriers to use of the tool. Of 51 eligible PGY-3 residents, 34 (67%) participated in the study. The average diagnostic accuracy increased from 52% to 60% with the tool, a difference that just met the test for statistical significance in adjusted analyses (p=0.05). Residents reported that the tool was generally acceptable and understandable but did not recognize its utility for use with simple cases, suggesting the presence of overconfidence bias. A clinical reasoning tool improved residents' diagnostic accuracy on written cases. Overconfidence bias is a potential barrier to its use in the clinical setting.

  18. An ontology-driven, diagnostic modeling system.

    PubMed

    Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason

    2013-06-01

    To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems, and to illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.
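
    The comparison between the automatically and manually built tools is made on the area under the ROC curve; the AUC can be computed directly from classifier scores via the rank-sum (Mann-Whitney) identity, as in this small self-contained example (scores and labels are made up):

        import numpy as np

        def auc(labels, scores):
            # AUC = probability that a random positive outranks a random
            # negative, with ties counted as one half.
            labels, scores = np.asarray(labels), np.asarray(scores)
            pos, neg = scores[labels == 1], scores[labels == 0]
            greater = (pos[:, None] > neg[None, :]).sum()
            ties = (pos[:, None] == neg[None, :]).sum()
            return (greater + 0.5 * ties) / (len(pos) * len(neg))

        y = [1, 0, 1, 1, 0, 0, 1, 0]
        scores = [0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3]   # hypothetical screener
        print(auc(y, scores))   # 1.0 here; the paper's models score 0.92-0.94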

  19. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system has been created. The qualitative model describes the effects of seal failures on the system steady-state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.
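
    Qualitative models of this kind map each fault to a signature of anomalous sensor directions rather than to precise numerical values. A minimal sketch of the idea, with an invented seal system and sensor names (not the actual NASA model):

        # Each fault maps to a qualitative signature of sensor deviations
        # ('hi' / 'lo' / 'ok'); diagnosis is signature matching.
        SIGNATURES = {
            'primary_seal_leak':   {'drain_pressure': 'hi', 'purge_temp': 'hi'},
            'secondary_seal_leak': {'drain_pressure': 'hi', 'purge_temp': 'ok'},
            'purge_blockage':      {'drain_pressure': 'lo', 'purge_temp': 'hi'},
        }

        def diagnose(observations):
            # Return every fault whose full signature matches the observations;
            # multiple matches correspond to multiple-failure hypotheses.
            return [fault for fault, sig in SIGNATURES.items()
                    if all(observations.get(sensor) == value
                           for sensor, value in sig.items())]

        print(diagnose({'drain_pressure': 'hi', 'purge_temp': 'hi'}))
        # ['primary_seal_leak']

    In the work described above, the anomalous values fed to such a model come from numerical simulations rather than hand-entered observations.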

  20. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system was created. The qualitative model describes the effects of seal failures on the system steady state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  1. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
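
    In the Monte Carlo step, each trial perturbs device parameters and re-simulates the gate, and the fraction of failing trials measures fault tolerance. A toy version for a QCA majority gate (the canonical QCA logic primitive), modelling manufacturing variation crudely as independent input flips; the probabilities are illustrative:

        import numpy as np

        def mc_majority_gate(p_flip, n_trials=100000, seed=1):
            # Estimate a 3-input majority gate's error rate when each input is
            # independently flipped with probability p_flip (a crude stand-in
            # for manufacturing variation and device defects).
            rng = np.random.default_rng(seed)
            inputs = rng.integers(0, 2, size=(n_trials, 3))
            flips = (rng.random((n_trials, 3)) < p_flip).astype(int)
            observed = inputs ^ flips
            correct = (inputs.sum(axis=1) > 1) == (observed.sum(axis=1) > 1)
            return 1.0 - correct.mean()

        for p in (0.01, 0.05, 0.10):
            print(p, round(mc_majority_gate(p), 4))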

  2. The effect of a simulation training package on skill acquisition for duplex arterial stenosis detection.

    PubMed

    Jaffer, Usman; Normahani, Pasha; Singh, Prashant; Aslam, Mohammed; Standfield, Nigel J

    2015-01-01

    In vascular surgery, duplex ultrasonography is a valuable diagnostic tool in patients with peripheral vascular disease, and there is increasing demand for vascular surgeons to be able to perform duplex scanning. This study evaluates the role of a novel simulation training package on vascular ultrasound (US) skill acquisition. A total of 19 novices measured predefined stenoses in a simulated pulsatile vessel using both peak systolic velocity ratio (PSVR) and diameter reduction (DR) methods before and after a short period of training using the simulation training package. The training package consisted of a simulated pulsatile vessel phantom, a set of instructional videos, the duplex ultrasound objective structured assessment of technical skills (DUOSATS) tool, and a portable US scanner. Quantitative metrics (procedure time, percentage error using PSVR and DR methods, DUOSATS scores, and global rating scores) before and after training were compared. Subjects spent a median time of 144 mins (IQR: 60-195) training with the simulation package. Subjects exhibited statistically significant improvements when comparing pretraining and posttraining DUOSATS scores (pretraining = 17 [16-19.3] vs posttraining = 30 [27.8-31.8]; p < 0.01), global rating scores (pretraining = 1 [1-2] vs posttraining = 4 [3.8-4]; p < 0.01), and percentage error using both the DR (pretraining = 12.6% [9-29.6] vs posttraining = 10.3% [8.9-11.1]; p = 0.03) and PSVR (pretraining = 60% [40-60] vs posttraining = 20% [6.7-20]; p < 0.01) methods. In this study, subjects with no previous practical US experience developed the ability to both acquire and interpret arterial duplex images in a pulsatile simulated phantom following a short period of goal-directed training using a simulation training package. A simulation training package may be a valuable tool for integration into a vascular training program. However, further work is needed to explore whether these newly attained skills translate into clinical assessment. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  3. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises two components: the "Population Generator" and the "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as optical or radar sensors and simulate observation campaigns. The tool outputs field-of-view crossings and actual detections of the selected NEO population objects. Using the Observation Analysis, users are able to process and plot the results of the Observation Simulation. In order to enable end users to handle the tool in an intuitive and comfortable way, a GUI has been created based on the modular Eclipse Rich Client Platform (RCP) technology. Through the GUI, users can easily enter input data for the tool, execute it and view its output data in a clear way. Additionally, the GUI runs gnuplot to create plot pictures and presents them to the user. Furthermore, users can create projects to organise executions of the tool.
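
    As a flavour of the "fictitious population" mode, the sketch below draws random NEO-like orbital elements and a power-law size distribution, then applies a perihelion cut; the bounds and exponent are invented for illustration and do not correspond to the Bottke or Granvik models:

        import numpy as np

        def random_neo_population(n, seed=42):
            # Toy fictitious NEO population: uniform orbital elements within
            # NEO-like bounds plus a crude cumulative power-law diameter
            # distribution, keeping only objects with perihelion q < 1.3 au.
            rng = np.random.default_rng(seed)
            a = rng.uniform(0.5, 3.0, n)                 # semi-major axis [au]
            e = rng.uniform(0.0, 0.9, n)                 # eccentricity
            i = rng.uniform(0.0, 40.0, n)                # inclination [deg]
            d = 0.01 * (1 - rng.random(n)) ** (-1 / 2.0) # diameters [km]
            mask = a * (1 - e) < 1.3                     # NEO perihelion cut
            return a[mask], e[mask], i[mask], d[mask]

        a, e, i, d = random_neo_population(10000)
        print(len(a), round(d.max(), 3))   # population size and largest object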

  4. [Diagnostic tools for canine parvovirus infection].

    PubMed

    Proksch, A L; Hartmann, K

    2015-01-01

    Canine parvovirus (CPV) infection is one of the most important and common infectious diseases in dogs, in particular affecting young puppies when maternal antibodies have waned and vaccine-induced antibodies have not yet developed. The mortality rate remains high. Therefore, a rapid and reliable diagnostic tool is essential to diagnose the disease, in order to 1) provide intensive care treatment and 2) identify virus-shedding animals and thus prevent virus spread. Whilst the detection of antibodies against CPV is considered unsuitable for diagnosing the disease, there are several different methods to directly detect complete virus, virus antigen or viral DNA. In addition to testing in commercial laboratories, rapid in-house tests based on ELISA are available worldwide. The specificity of the ELISA rapid in-house tests is reported to be excellent. However, results on sensitivity vary, and high numbers of false-negative results are commonly reported, which potentially leads to misdiagnosis. Polymerase chain reaction (PCR) is a very sensitive and specific diagnostic tool. It also provides the opportunity to differentiate vaccine strains from natural infection when sequencing is performed after PCR.

  5. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls using different imaging diagnostic modalities. Medical information systems are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis of this project is based on skeletal X-ray imaging. It involves a computerised image method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The Web-based application is implemented with ASP.NET 3.5 and C# technologies. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins were necessary, and access to patient data was logged for auditing. For security, all data transmissions were carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report form) and images, and lets each user know precisely what their task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis, which enables us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.
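
    A minimal sketch of the audit-logging requirement described above (the actual system is ASP.NET/C# with MySQL; this Python version, its function names, and the user/patient identifiers are all hypothetical):

      import functools
      import logging

      logging.basicConfig(filename="audit.log", level=logging.INFO,
                          format="%(asctime)s %(message)s")

      def audited(func):
          """Log every access to patient data together with the acting user."""
          @functools.wraps(func)
          def wrapper(user, patient_id, *args, **kwargs):
              logging.info("user=%s action=%s patient=%s", user, func.__name__, patient_id)
              return func(user, patient_id, *args, **kwargs)
          return wrapper

      @audited
      def fetch_case_report_form(user, patient_id):
          # Placeholder for the actual database query.
          return {"patient_id": patient_id, "form": "..."}

      fetch_case_report_form("a.baltasar", "P-0042")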

  6. State of the art and future needs in S.I. engine combustion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maly, R.R.

    1994-12-31

    The paper briefly reviews the state of the art in SI engine combustion by addressing its main features: mixture formation, ignition, homogeneous combustion, pollutant formation, knock, and engine modeling. Necessary links between fundamental and practical work are clarified and discussed along with advanced diagnostics and simulation tools. The needs for further work are identified, the most important one being the integration of all fundamental and practical resources to meet R and D requirements for future engines.

  7. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data have been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single-run data sets growing into the tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity for analyzing and comparing climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs on modern HPC systems such as ORNL's Jaguar.
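
    A minimal serial sketch of the kind of reduction ParCAT parallelizes: a temporal mean, a difference between two runs, and a histogram, read from NetCDF. The file names and the variable name "tas" are hypothetical, and this uses the Python netCDF4 package rather than ParCAT's C/parallel-netcdf implementation:

      import numpy as np
      from netCDF4 import Dataset

      with Dataset("run1.nc") as f1, Dataset("run2.nc") as f2:
          tas1 = f1.variables["tas"][:]   # assumed dims: (time, lat, lon)
          tas2 = f2.variables["tas"][:]

      mean1 = tas1.mean(axis=0)           # temporal mean map for run 1
      diff = mean1 - tas2.mean(axis=0)    # difference between the two runs
      hist, edges = np.histogram(tas1, bins=50)

      np.savetxt("diff.csv", diff, delimiter=",")  # CSV output, one of the formats ParCAT emits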

  8. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  9. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This wealth of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the subsequent MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronizing and displaying pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype has been completed and will be clinically evaluated.

  10. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  11. Diagnostic Analyzer for Gearboxes (DAG): User's Guide. Version 3.1 for Microsoft Windows 3.1

    NASA Technical Reports Server (NTRS)

    Jammu, Vinay B.; Kourosh, Danai

    1997-01-01

    This documentation describes the Diagnostic Analyzer for Gearboxes (DAG) software for performing fault diagnosis of gearboxes. First, the user constructs a graphical representation of the gearbox using the gear, bearing, shaft, and sensor tools contained in the DAG software. Next, a set of vibration features, obtained by processing the vibration signals recorded from the gearbox with a signal analyzer, is required. Given this information, the DAG software uses an unsupervised neural network referred to as the Fault Detection Network (FDN) to identify the occurrence of faults, and a pattern classifier called the Single Category-Based Classifier (SCBC) for abnormality scaling of individual vibration features. The abnormality-scaled vibration features are then used as inputs to a Structure-Based Connectionist Network (SBCN) for identifying faults in gearbox subsystems and components. The weights of the SBCN represent its diagnostic knowledge and are derived from the structure of the gearbox graphically represented in DAG. The outputs of the SBCN are fault possibility values between 0 and 1 for individual subsystems and components in the gearbox, with 1 representing a definite fault and 0 representing normality. This manual describes the steps involved in creating the diagnostic gearbox model, along with the options and analysis tools of the DAG software.
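
    An illustrative sketch of the structure-based fault-isolation idea: each vibration feature carries an abnormality score in [0, 1], and a weight matrix derived from the gearbox structure maps features onto components. The weights, feature names, and values below are hypothetical, and this is a heavy simplification of the actual SBCN, whose weights come from the graphical gearbox model built in DAG:

      import numpy as np

      features = ["FM0", "FM4", "kurtosis", "RMS"]
      components = ["gear_1", "gear_2", "bearing_1"]

      # structure[i, j] = relevance of feature j to component i (hypothetical)
      structure = np.array([
          [0.9, 0.8, 0.1, 0.3],   # gear_1
          [0.7, 0.9, 0.2, 0.3],   # gear_2
          [0.1, 0.2, 0.9, 0.6],   # bearing_1
      ])

      abnormality = np.array([0.85, 0.90, 0.15, 0.40])  # scaled feature values

      # Fault possibility per component: weighted evidence, clipped to [0, 1].
      possibility = np.clip(structure @ abnormality / structure.sum(axis=1), 0, 1)
      for c, p in zip(components, possibility):
          print(f"{c}: fault possibility = {p:.2f}")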

  12. Visual Predictive Check in Models with Time-Varying Input Function.

    PubMed

    Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio

    2015-11-01

    Nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure their correct use, it is fundamental to provide powerful diagnostic tools that can evaluate the predictive performance of the models. The visual predictive check (VPC) is a commonly used tool that helps the user check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example, when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC by taking into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) that helps associate the correct IF with the individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC performs better than the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
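
    A minimal sketch of the matching step described above: for each simulated parameter set, pick the individual input function whose subject parameters are closest in Mahalanobis distance. Data shapes and values are hypothetical:

      import numpy as np
      from scipy.spatial.distance import mahalanobis

      rng = np.random.default_rng(0)
      subject_params = rng.normal(size=(40, 3))    # parameters of the 40 observed IFs
      simulated = rng.normal(size=(1000, 3))       # simulated parameter sets

      VI = np.linalg.inv(np.cov(subject_params, rowvar=False))  # inverse covariance

      def best_if(theta):
          """Index of the observed IF closest to the simulated parameter set."""
          d = [mahalanobis(theta, s, VI) for s in subject_params]
          return int(np.argmin(d))

      matched = [best_if(theta) for theta in simulated]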

  13. Diagnostics for Confounding of Time-varying and Other Joint Exposures.

    PubMed

    Jackson, John W

    2016-11-01

    The effects of joint exposures (or exposure regimes) include those of adhering to assigned treatment versus placebo in a randomized controlled trial, duration of exposure in a cohort study, interactions between exposures, and direct effects of exposure, among others. Unlike the setting of a single point exposure (e.g., propensity score matching), there are few tools to describe confounding for joint exposures or how well a method resolves it. Investigators need tools that describe confounding in ways that are conceptually grounded and intuitive for those who read, review, and use applied research to guide policy. We revisit the implications of exchangeability conditions that hold in sequentially randomized trials, and the bias structure that motivates the use of g-methods, such as marginal structural models. From these, we develop covariate balance diagnostics for joint exposures that can (1) describe time-varying confounding, (2) assess whether covariates are predicted by prior exposures given their past, the indication for g-methods, and (3) describe residual confounding after inverse probability weighting. For each diagnostic, we present time-specific metrics that encompass a wide class of joint exposures, including regimes of multivariate time-varying exposures in censored data, with multivariate point exposures as a special case. We outline how to estimate these directly or with regression and how to average them over person-time. Using a simulated example, we show how these metrics can be presented graphically. This conceptually grounded framework can potentially aid the transparent design, analysis, and reporting of studies that examine joint exposures. We provide easy-to-use tools to implement it.
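
    A minimal sketch of one balance diagnostic in this spirit: a standardized mean difference (SMD) for a covariate across exposure groups, before and after inverse probability weighting. The data are simulated, and this is a generic point-exposure illustration rather than the paper's time-specific metrics:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 5000
      L = rng.normal(size=n)                          # confounder
      p = 1 / (1 + np.exp(-1.2 * L))                  # treatment probability
      A = rng.binomial(1, p)                          # exposure
      w = A / p + (1 - A) / (1 - p)                   # inverse probability weights

      def smd(x, a, weights=None):
          """Weighted standardized mean difference of x across groups of a."""
          weights = np.ones_like(x) if weights is None else weights
          m1 = np.average(x[a == 1], weights=weights[a == 1])
          m0 = np.average(x[a == 0], weights=weights[a == 0])
          v1 = np.average((x[a == 1] - m1) ** 2, weights=weights[a == 1])
          v0 = np.average((x[a == 0] - m0) ** 2, weights=weights[a == 0])
          return (m1 - m0) / np.sqrt((v1 + v0) / 2)

      print(f"SMD unweighted: {smd(L, A):+.3f}")
      print(f"SMD weighted:   {smd(L, A, w):+.3f}")   # should be near 0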

  14. Histology Verification Demonstrates That Biospectroscopy Analysis of Cervical Cytology Identifies Underlying Disease More Accurately than Conventional Screening: Removing the Confounder of Discordance

    PubMed Central

    Gajjar, Ketan; Ahmadzai, Abdullah A.; Valasoulis, George; Trevisan, Júlio; Founta, Christina; Nasioutziki, Maria; Loufopoulos, Aristotelis; Kyrgiou, Maria; Stasinou, Sofia Melina; Karakitsos, Petros; Paraskevaidis, Evangelos; Da Gama-Rose, Bianca; Martin-Hirsch, Pierre L.; Martin, Francis L.

    2014-01-01

    Background Subjective visual assessment of cervical cytology is flawed, and this can manifest itself in inter- and intra-observer variability, resulting ultimately in discordance between the grading categorisation of samples in screening vs. representative histology. Biospectroscopy methods have been suggested as sensor-based tools that can deliver objective assessments of cytology. However, studies to date have apparently been flawed by a corresponding lack of diagnostic efficiency when samples have previously been classed using cytology screening. This raises the question as to whether categorisation of cervical cytology based on imperfect conventional screening reduces the diagnostic accuracy of biospectroscopy approaches; are these latter methods more accurate at diagnosing underlying disease? The purpose of this study was to compare the objective accuracy of infrared (IR) spectroscopy of cervical cytology samples using conventional cytology vs. histology-based categorisation. Methods Within a typical clinical setting, a total of n = 322 liquid-based cytology samples were collected immediately before biopsy. Of these, it was possible to acquire subsequent histology for n = 154. Cytology samples were categorised according to conventional screening methods and subsequently interrogated employing attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. IR spectra were pre-processed and analysed using linear discriminant analysis. Dunn’s test was applied to identify the differences in spectra. Within the diagnostic categories, histology allowed us to determine the comparative efficiency of conventional screening vs. biospectroscopy in correctly identifying either true atypia or underlying disease. Results Conventional cytology-based screening results in poor sensitivity and specificity. IR spectra derived from cervical cytology do not appear to discriminate in a diagnostic fashion when categories are based on conventional screening. Scores plots of IR spectra exhibit marked crossover of spectral points between different cytological categories. Although significant differences between spectral bands in different categories are noted, crossover samples point to the potential for poor specificity and hamper the development of biospectroscopy as a diagnostic tool. However, when histology-based categories are used to conduct the analyses, the scores plots of IR spectra exhibit markedly better segregation. Conclusions Histology demonstrates that ATR-FTIR spectroscopy of liquid-based cytology identifies the presence of underlying atypia or disease missed in conventional cytology screening. This study points to an urgent need for a future biospectroscopy study in which categories are based on such histology. This will allow for the validation of this approach as a screening tool. PMID:24404130
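
    A minimal sketch of the analysis pipeline described above: linear discriminant analysis of pre-processed IR spectra against histology-based labels. The spectra below are simulated stand-ins; the real study used ATR-FTIR spectra of liquid-based cytology samples:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(7)
      n_per_class, n_wavenumbers = 50, 200
      normal = rng.normal(0.0, 1.0, (n_per_class, n_wavenumbers))
      atypia = rng.normal(0.3, 1.0, (n_per_class, n_wavenumbers))  # shifted bands

      X = np.vstack([normal, atypia])
      y = np.array([0] * n_per_class + [1] * n_per_class)   # histology labels

      lda = LinearDiscriminantAnalysis(n_components=1)
      scores = lda.fit_transform(X, y)   # 1-D scores for a segregation plot
      print("class means on LD1:", scores[y == 0].mean(), scores[y == 1].mean())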

  15. Designing and Evaluating an Interactive Multimedia Web-Based Simulation for Developing Nurses’ Competencies in Acute Nursing Care: Randomized Controlled Trial

    PubMed Central

    Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-01

    Background Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. Objective This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses’ competencies in acute nursing care. Methods Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants’ clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. Results The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Conclusions Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses’ competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency. PMID:25583029

  16. Designing and evaluating an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care: randomized controlled trial.

    PubMed

    Liaw, Sok Ying; Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim

    2015-01-12

    Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses' competencies in acute nursing care. Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants' clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses' competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency.

  17. Developments in label-free microfluidic methods for single-cell analysis and sorting.

    PubMed

    Carey, Thomas R; Cotner, Kristen L; Li, Brian; Sohn, Lydia L

    2018-04-24

    Advancements in microfluidic technologies have led to the development of many new tools for both the characterization and sorting of single cells without the need for exogenous labels. Label-free microfluidics reduce the preparation time, reagents needed, and cost of conventional methods based on fluorescent or magnetic labels. Furthermore, these devices enable analysis of cell properties such as mechanical phenotype and dielectric parameters that cannot be characterized with traditional labels. Some of the most promising technologies for current and future development toward label-free, single-cell analysis and sorting include electronic sensors such as Coulter counters and electrical impedance cytometry; deformation analysis using optical traps and deformation cytometry; hydrodynamic sorting such as deterministic lateral displacement, inertial focusing, and microvortex trapping; and acoustic sorting using traveling or standing surface acoustic waves. These label-free microfluidic methods have been used to screen, sort, and analyze cells for a wide range of biomedical and clinical applications, including cell cycle monitoring, rapid complete blood counts, cancer diagnosis, metastatic progression monitoring, HIV and parasite detection, circulating tumor cell isolation, and point-of-care diagnostics. Because of the versatility of label-free methods for characterization and sorting, the low-cost nature of microfluidics, and the rapid prototyping capabilities of modern microfabrication, we expect this class of technology to continue to be an area of high research interest going forward. New developments in this field will contribute to the ongoing paradigm shift in cell analysis and sorting technologies toward label-free microfluidic devices, enabling new capabilities in biomedical research tools as well as clinical diagnostics. This article is categorized under: Diagnostic Tools > Biosensing; Diagnostic Tools > Diagnostic Nanodevices. © 2018 Wiley Periodicals, Inc.

  18. Development of an interpretive simulation tool for the proton radiography technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, M. C., E-mail: levymc@stanford.edu; Lawrence Livermore National Laboratory, Livermore, California 94551; Ryutov, D. D.

    2015-03-15

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool’s numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field “primitives” is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10^8 particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm^3. Insights derived from this application show that the tool can support understanding of HED plasmas.
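
    A toy sketch of the synthetic-radiograph idea: trace protons from a point-like source past a localized magnetic-field "primitive" and histogram their arrival positions on a detector plane. The single angular kick of order qBL/(mv) and the nonrelativistic speed are drastic simplifications of the tool's full field integration, and all geometry and field values are hypothetical:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000
      E = 14.7e6 * 1.602e-19                 # 14.7 MeV proton kinetic energy [J]
      m, q = 1.673e-27, 1.602e-19            # proton mass [kg] and charge [C]
      v = np.sqrt(2 * E / m)                 # nonrelativistic speed (toy-level approximation)

      # Point-like source: positions at the object plane set by random emission angles.
      theta_x = rng.normal(0.0, 0.05, n)
      theta_y = rng.normal(0.0, 0.05, n)
      z_obj, z_det = 8e-3, 80e-3             # object and detector distances [m]
      x, y = theta_x * z_obj, theta_y * z_obj

      # One field "primitive": a Gaussian B-field blob giving a radial kick ~ qBL/(mv).
      r = np.hypot(x, y) + 1e-12
      kick = (q * 10.0 * 1e-3 / (m * v)) * np.exp(-(r / 0.5e-3) ** 2)
      theta_x += kick * x / r
      theta_y += kick * y / r

      # Free propagation to the detector, then histogram into a radiograph.
      X = x + theta_x * (z_det - z_obj)
      Y = y + theta_y * (z_det - z_obj)
      radiograph, _, _ = np.histogram2d(X, Y, bins=256)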

  19. Comparison of presenting features, diagnostic tools, hospital outcomes, and quality of care indicators in older (>65 years) to younger, men to women, and diabetics to nondiabetics with acute chest pain triaged in the emergency department.

    PubMed

    Pelliccia, Francesco; Cartoni, Domenico; Verde, Monica; Salvini, Paolo; Petrolati, Sandro; Mercuro, Giuseppe; Tanzi, Pietro

    2004-07-15

    In a total of 4,843 consecutive patients admitted to an emergency department (ED) with acute chest pain over a 1-year period, presenting features, diagnostic tools, hospital outcomes, and quality-of-care indicators were compared between older (n = 1,781) and younger (n = 3,062) patients, men (n = 3,095) and women (n = 1,748), and diabetics (n = 856) and nondiabetics (n = 3,987). The results showed that after critical pathway implementation, there was an increase in the use of evidence-based treatment strategies in the ED and improved outcomes in older patients, women, and diabetics, with no remaining differences in length of ED stay, diagnostic accuracy for myocardial infarction in the ED, door-to-thrombolysis time, or door-to-balloon time compared with younger patients, men, and nondiabetics.

  20. Development and validation of risk models and molecular diagnostics to permit personalized management of cancer.

    PubMed

    Pu, Xia; Ye, Yuanqing; Wu, Xifeng

    2014-01-01

    Despite the advances made in cancer management over the past few decades, improvements in cancer diagnosis and prognosis remain modest, highlighting the need for individualized strategies. Toward this goal, risk prediction models and molecular diagnostic tools have been developed, tailoring each step of risk assessment from diagnosis to treatment and clinical outcomes based on the individual's clinical, epidemiological, and molecular profiles. These approaches hold increasing promise for delivering a new paradigm to maximize the efficiency of cancer surveillance and the efficacy of treatment. However, they require stringent study design, methodology development, comprehensive assessment of biomarkers and risk factors, and extensive validation to ensure their overall usefulness for clinical translation. In the current study, the authors conducted a systematic review using breast cancer as an example and provide general guidelines for risk prediction models and molecular diagnostic tools, covering development, assessment, and validation. © 2013 American Cancer Society.

  1. Biochip for Real-Time Monitoring of Hepatitis B Virus (HBV) by Combined Loop-Mediated Isothermal Amplification and Solution-Phase Electrochemical Detection

    NASA Astrophysics Data System (ADS)

    Tien, Bui Quang; Ngoc, Nguyen Thy; Loc, Nguyen Thai; Thu, Vu Thi; Lam, Tran Dai

    2017-06-01

    Accurate in situ diagnostic tests play a key role in patient management and control of most infectious diseases. To achieve this, handheld biochips that integrate sample handling, sample analysis, and result readout are an ideal approach. We present herein a fluid-handling biochip for real-time electrochemical monitoring of nucleic acid amplification, based on loop-mediated isothermal amplification and real-time electrochemical detection on a microfluidic platform. Intercalation between the amplifying DNA and a free redox probe in solution phase was used to monitor the number of DNA copies. The whole diagnostic process is completed within 70 min. Our platform offers a fast and easy tool for quantification of viral pathogens in a shorter time and with limited risk of cross-contamination in all its potential forms. Such diagnostic tools have the potential to make a huge difference to the lives of millions of people worldwide.

  2. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X nm technology node uses SMO-ILT, NTD, or even more complex patterns. In mask defect inspection, defect verification therefore becomes more difficult, because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is impractical when a hundred or more defects must be verified. We previously reported the capability of defect verification based on lithography simulation with a SEM system, whose architecture and software showed excellent correlation for simple line-and-space patterns [1]. In this paper, we combine a next-generation SEM system with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM (MVM-SEM®) system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  3. Contextual information influences diagnosis accuracy and decision making in simulated emergency medicine emergencies.

    PubMed

    McRobert, Allistair Paul; Causer, Joe; Vassiliadis, John; Watterson, Leonie; Kwan, James; Williams, Mark A

    2013-06-01

    It is well documented that adaptations in cognitive processes with increasing skill levels support decision making in multiple domains. We examined skill-based differences in cognitive processes in emergency medicine physicians, and whether performance was significantly influenced by the removal of contextual information related to a patient's medical history. Skilled (n=9) and less skilled (n=9) emergency medicine physicians responded to high-fidelity simulated scenarios under high- and low-context information conditions. Skilled physicians demonstrated higher diagnostic accuracy irrespective of condition, and were less affected by the removal of context-specific information compared with less skilled physicians. The skilled physicians generated more options, and selected better quality options during diagnostic reasoning compared with less skilled counterparts. These cognitive processes were active irrespective of the level of context-specific information presented, although high-context information enhanced understanding of the patients' symptoms resulting in higher diagnostic accuracy. Our findings have implications for scenario design and the manipulation of contextual information during simulation training.

  4. Developing Simulations in Multi-User Virtual Environments to Enhance Healthcare Education

    ERIC Educational Resources Information Center

    Rogers, Luke

    2011-01-01

    Computer-based clinical simulations are a powerful teaching and learning tool because of their ability to expand healthcare students' clinical experience by providing practice-based learning. Despite the benefits of traditional computer-based clinical simulations, there are significant issues that arise when incorporating them into a flexible,…

  5. Finite element simulation of cutting grey iron HT250 by self-prepared Si3N4 ceramic insert

    NASA Astrophysics Data System (ADS)

    Wang, Bo; Wang, Li; Zhang, Enguang

    2017-04-01

    The finite element method can simulate and solve practical machining problems with the required accuracy and high reliability. In this paper, simulation models based on the material properties of the self-prepared Si3N4 insert and grey iron HT250 were created. Using these models, results for cutting force, cutting temperature, and tool wear rate were obtained, and the tool wear mode was predicted from the cutting simulation. These approaches may develop into a new method for testing new cutting-tool materials, shortening the development cycle and reducing cost.

  6. Numerical modelling of orthogonal cutting: application to woodworking with a bench plane.

    PubMed

    Nairn, John A

    2016-06-06

    A numerical model for orthogonal cutting using the material point method was applied to woodcutting using a bench plane. The cutting process was modelled by accounting for surface energy associated with wood fracture toughness for crack growth parallel to the grain. By using damping to deal with dynamic crack propagation and modelling all contact between wood and the plane, simulations could initiate chip formation and proceed into steady-state chip propagation including chip curling. Once steady-state conditions were achieved, the cutting forces became constant and could be determined as a function of various simulation variables. The modelling details included a cutting tool, the tool's rake and grinding angles, a chip breaker, a base plate and a mouth opening between the base plate and the tool. The wood was modelled as an anisotropic elastic-plastic material. The simulations were verified by comparison to an analytical model and then used to conduct virtual experiments on wood planing. The virtual experiments showed interactions between depth of cut, chip breaker location and mouth opening. Additional simulations investigated the role of tool grinding angle, tool sharpness and friction.

  7. Modular Analytical Multicomponent Analysis in Gas Sensor Arrays

    PubMed Central

    Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor

    2006-01-01

    A multi-sensor system is a chemical sensor system which quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation: a modular software package for the acquisition of data from different sensors. A signal evaluation algorithm referred to as the matrix method was developed specifically for the software tool; it computes the gas concentrations from the signals of a sensor array. The software tool was used for the simulation of an array of five sensors to determine the gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of the gas, even if the system was not previously exposed to this concentration; (c) telling when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database, one can create, generate and manage scenarios and source files for the simulation. Based on the gas sensor database and the simulation software, an on-line Web-based version was developed, with which users can configure and simulate sensor arrays on-line.
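
    A minimal sketch of the matrix-method idea: if each sensor responds approximately linearly to each gas, the array signal is s = M·c, and the concentration vector c can be recovered by least squares. The sensitivity matrix and concentrations below are hypothetical:

      import numpy as np

      gases = ["CH4", "NH3", "H2", "CO", "C2H5OH"]

      # M[i, j] = response of sensor i per unit concentration of gas j (hypothetical)
      M = np.array([
          [0.9, 0.1, 0.3, 0.2, 0.1],
          [0.2, 0.8, 0.1, 0.1, 0.3],
          [0.3, 0.2, 0.9, 0.4, 0.1],
          [0.1, 0.1, 0.4, 0.9, 0.2],
          [0.1, 0.3, 0.1, 0.2, 0.8],
      ])

      c_true = np.array([2.0, 0.5, 1.0, 0.0, 1.5])   # hypothetical concentrations
      s = M @ c_true + np.random.default_rng(5).normal(0, 0.01, 5)  # noisy signals

      c_est, *_ = np.linalg.lstsq(M, s, rcond=None)  # invert the sensor model
      for g, c in zip(gases, c_est):
          print(f"{g}: {c:.2f}")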

  8. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in the open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
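
    A minimal sketch of the comparison step described above: interpolate two tools' trajectories onto a common time base and compute the maximum deviation, as one would for a difference plot. The file names and the two-column layout (time [s], altitude [m]) are hypothetical:

      import numpy as np

      t1, h1 = np.loadtxt("toolA_case01.csv", delimiter=",", unpack=True)
      t2, h2 = np.loadtxt("toolB_case01.csv", delimiter=",", unpack=True)

      # Restrict to the overlapping time span and resample both trajectories.
      t = np.linspace(max(t1[0], t2[0]), min(t1[-1], t2[-1]), 1000)
      dh = np.interp(t, t1, h1) - np.interp(t, t2, h2)

      print(f"max |dh| = {np.abs(dh).max():.3f} m at t = {t[np.argmax(np.abs(dh))]:.2f} s")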

  9. [Innovative education: simulation-based training at the Institute of Health Sciences, Semmelweis University, Hungary].

    PubMed

    Csóka, Mária; Deutsch, Tibor

    2011-01-02

    In Hungary, the Institute of Health Sciences at Semmelweis University was the first institution to introduce patient simulation-based practical training of non-physician professionals. Before the introduction of this novel educational methodology, students could only practice particular examinations and interventions on demonstration tools. Using the simulator, they can also follow and analyze the effects of the interventions that have been made. The high-fidelity, adult Human Patients Emergency Care Simulator (HPS-ECS, Medical Education Technologies Incorporation, Sarasota, Florida, USA) is particularly suitable for acquiring skills related to the management of various emergency situations. The 180 cm, 34 kg mannequin, which can operate in lying and sitting positions, has both respiration and circulation, which can be examined the same way as in a living person. It is capable of producing several physical and clinical signs such as respiration with chest movement, electric cardiac activity, palpable pulse, and measurable blood pressure. In addition, it can also exhibit blinking, swelling of the tongue and whole-body trembling, while intestinal, cardiac and pulmonary sounds can equally be examined. The high-fidelity simulator allows various interventions, including monitoring, oxygen therapy, bladder catheterization, gastric tube insertion, injection, infusion and transfusion therapy, to be practiced as part of complex patient management. Diagnostic instruments such as an ECG recorder, sphygmomanometer and pulse-oximeter can be attached to the simulator, which can also respond to different medical interventions such as intubation, defibrillation, pacing, fluid replacement, and blood transfusion. The mannequin's physiological response can be followed and monitored over time to assess whether the selected intervention has proven adequate to achieve the desired outcome. The authors provide a short overview of the possible applications of clinical simulation for education and training in the health sciences, and present how the patient simulator has been embedded in various practical courses as part of the curricula designed for different health care specialties.

  10. Diagnosis of edge condition based on force measurement during milling of composites

    NASA Astrophysics Data System (ADS)

    Felusiak, Agata; Twardowski, Paweł

    2018-04-01

    The present paper presents comparative results of forecasting cutting tool wear with different methods of diagnostic inference based on the measurement of cutting force components. The research was carried out during milling of the Duralcan F3S.10S aluminum-ceramic composite. Prediction of the tool wear was based on one-variable and two-variable regression, and on Multilayer Perceptron (MLP) and Radial Basis Function (RBF) neural networks. Forecasting the condition of the cutting tool on the basis of cutting forces has yielded very satisfactory results.
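
    A minimal sketch of one of the approaches named above: an MLP regression from cutting-force components to flank wear. The synthetic data below merely mimic wear increasing with force; the actual study used forces measured while milling Duralcan F3S.10S:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)
      n = 400
      # Hypothetical force components Fx, Fy, Fz [N] and flank wear VB [mm].
      F = rng.uniform([100, 50, 20], [400, 200, 80], size=(n, 3))
      VB = 0.001 * F @ np.array([0.4, 0.3, 0.3]) + rng.normal(0, 0.01, n)

      X_train, X_test, y_train, y_test = train_test_split(F, VB, random_state=0)
      model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
      model.fit(X_train, y_train)
      print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")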

  11. The National Ignition Facility Diagnostic Set at the Completion of the National Ignition Campaign, September 2012

    DOE PAGES

    Kilkenny, J. D.; Bell, P. M.; Bradley, D. K.; ...

    2016-01-06

    At the completion of the National Ignition Campaign (NIC), the National Ignition Facility (NIF) had about 36 different types of diagnostics. These were based on several decades of development on Nova and OMEGA and involved the whole U.S. inertial confinement fusion community. In 1994, the Joint Central Diagnostic Team documented a plan for a limited set of NIF diagnostics in the NIF Conceptual Design Report. Two decades later, these diagnostics, and many others, were installed as workhorse tools for all users of NIF. We give a short description of each of the 36 different types of NIC diagnostics grouped by the function of the diagnostics, namely, target drive, target response and target assembly, stagnation, and burn. A comparison of NIF diagnostics with the Nova diagnostics shows that the NIF diagnostic capability is broadly equivalent to that of Nova in 1999. Although NIF diagnostics have a much greater degree of automation and rigor than Nova’s, new diagnostics, such as a higher-speed X-ray imager, are still limited in number. Lastly, recommendations for future diagnostics on the NIF are discussed.

  12. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins

    PubMed Central

    Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi

    2013-01-01

    Purpose To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Materials and Methods Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Results Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. Conclusion The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue. PMID:23824589
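
    An illustrative toy version of nuclei detection and density quantification in the spirit of the approach above. The actual algorithm combines sparse component analysis with a circle transform; the stand-in below only detects bright, roughly circular blobs in a synthetic image via smoothing plus local maxima, and all sizes and thresholds are hypothetical:

      import numpy as np
      from scipy.ndimage import gaussian_filter, maximum_filter

      rng = np.random.default_rng(4)
      img = np.zeros((256, 256))
      yy, xx = np.mgrid[0:256, 0:256]
      for cy, cx in rng.integers(10, 246, size=(60, 2)):     # 60 synthetic "nuclei"
          img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 3.0 ** 2))
      img += rng.normal(0, 0.05, img.shape)                  # imaging noise

      smoothed = gaussian_filter(img, sigma=2)
      # A pixel is a detection if it is the local maximum in a 9x9 window
      # and sufficiently bright.
      peaks = (smoothed == maximum_filter(smoothed, size=9)) & (smoothed > 0.3)
      n_nuclei = int(peaks.sum())
      density = n_nuclei / (img.shape[0] * img.shape[1])     # nuclei per pixel
      print(f"detected {n_nuclei} nuclei, density = {density:.5f} per pixel")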

  13. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins.

    PubMed

    Mueller, Jenna L; Harmany, Zachary T; Mito, Jeffrey K; Kennedy, Stephanie A; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G; Willett, Rebecca M; Brown, J Quincy; Ramanujam, Nimmi

    2013-01-01

    To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular resolution microendoscope after topical application of a fluorescent anatomical contrast agent: acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.

  14. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
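
    A minimal sketch of the kind of comparison SEE IT automates: over-plot a measured and a simulated time series, then scatter one against the other. The file names and the column layout (hourly values in kWh in the second column) are hypothetical:

      import numpy as np
      import matplotlib.pyplot as plt

      measured = np.loadtxt("measured_hvac.csv", delimiter=",", usecols=1)
      simulated = np.loadtxt("simulated_hvac.csv", delimiter=",", usecols=1)
      hours = np.arange(len(measured))

      fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
      ax1.plot(hours, measured, label="measured")      # time-series over-plot
      ax1.plot(hours, simulated, label="simulated")
      ax1.set(xlabel="hour", ylabel="energy [kWh]")
      ax1.legend()

      ax2.scatter(measured, simulated, s=4)            # points off the diagonal
      ax2.set(xlabel="measured [kWh]",                 # flag discrepancies
              ylabel="simulated [kWh]")
      fig.savefig("comparison.png", dpi=150)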

  15. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.

  16. Simulation studies for operating electron beam ion trap at very low energy for disentangling edge plasma spectra

    NASA Astrophysics Data System (ADS)

    Jin, Xuelong; Fei, Zejie; Xiao, Jun; Lu, Di; Hutton, Roger; Zou, Yaming

    2012-07-01

    Electron beam ion traps (EBITs) are very useful tools for disentanglement studies of atomic processes in plasmas. In order to assist studies on edge plasma spectroscopic diagnostics, a very low energy EBIT, SH-PermEBIT, has been set up at the Shanghai EBIT lab. In this work, simulation studies of the factors which hinder an EBIT from operating at very low electron energies were performed with the Tricomp (Field Precision) codes. Longitudinal, transversal, and total kinetic energy distributions were analyzed for all the electron trajectories. Influences of the electron current and electron energy on the energy depression caused by the space charge are discussed. The simulation results show that although the energy depression is most serious along the center of the electron beam, the electrons in the outer part of the beam are more likely to be lost when an EBIT is running at very low energy. Guided by the simulation results, we successfully reached a minimum electron beam energy of 60 eV with a beam transmission above 57% for the SH-PermEBIT. Ar and W spectra were measured from the SH-PermEBIT at apparent electron beam energies (read from the voltage difference between the electron gun cathode and the central drift tube) of 60 eV and 1200 eV, respectively. The spectra are shown in this paper.
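
    A back-of-the-envelope estimate, not taken from the paper, of the space-charge potential depression on the axis of a uniform cylindrical electron beam inside a grounded drift tube, using the standard analytic formula dV = I/(4*pi*eps0*v) * (1 + 2*ln(R/r_b)); the beam parameters below are hypothetical:

      import math

      eps0 = 8.854e-12            # vacuum permittivity [F/m]
      e, m_e = 1.602e-19, 9.109e-31

      def depression(I, E_eV, R, r_b):
          """Axis potential depression [V] for beam current I [A], beam energy
          E_eV [eV], drift-tube radius R [m] and beam radius r_b [m]
          (nonrelativistic, uniform current density)."""
          v = math.sqrt(2 * e * E_eV / m_e)
          return I / (4 * math.pi * eps0 * v) * (1 + 2 * math.log(R / r_b))

      # At 60 eV even ~1 mA depresses the axis potential by roughly 15 V, a
      # large fraction of the beam energy, which is one reason very-low-energy
      # operation is difficult.
      print(f"{depression(I=1e-3, E_eV=60.0, R=5e-3, r_b=0.2e-3):.1f} V")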

  17. Blended learning in surgery using the Inmedea Simulator.

    PubMed

    Funke, Katrin; Bonrath, Esther; Mardin, Wolf Arif; Becker, Jan Carl; Haier, Joerg; Senninger, Norbert; Vowinkel, Thorsten; Hoelzen, Jens Peter; Mees, Soeren Torge

    2013-02-01

    Recently, medical education in surgery has undergone several modifications. We have implemented a blended learning module in our teaching curriculum to evaluate its effectiveness, applicability, and acceptance in surgical education. In this prospective study, the traditional face-to-face learning of our teaching curriculum for fourth-year medical students (n = 116) was augmented by the Inmedea Simulator, a web-based E-learning system, with six virtual patient cases. Student results were documented by the system, and learning success was determined by comparing patient cases with comparable diseases (second and sixth case). Acceptance among the students was evaluated with a questionnaire. After using the Inmedea Simulator, correct diagnoses were found significantly (P < 0.05) more often, while incomplete diagnostic work-ups were seen significantly (P < 0.05) less often. A significant overall improvement (P < 0.05) was seen in the sixth case (62.3 ± 5.6%) vs. the second case (53.9 ± 5.6%). The questionnaire revealed that our students enjoyed the surgical seminar (score 2.1 ± 1.5) and preferred blended learning (score 2.5 ± 1.2) to conventional teaching. The blended learning approach using the Inmedea Simulator was highly appreciated by our medical students and resulted in a significant learning success. Blended learning appears to be a suitable tool to complement traditional teaching in surgery.

  18. Simulation-based ongoing professional practice evaluation in psychiatry: a novel tool for performance assessment.

    PubMed

    Gorrindo, Tristan; Goldfarb, Elizabeth; Birnbaum, Robert J; Chevalier, Lydia; Meller, Benjamin; Alpert, Jonathan; Herman, John; Weiss, Anthony

    2013-07-01

    Ongoing professional practice evaluation (OPPE) activities consist of a quantitative, competency-based evaluation of clinical performance. Hospitals must design assessments that measure clinical competencies, are scalable, and minimize impact on the clinician's daily routines. A psychiatry department at a large academic medical center designed and implemented an interactive Web-based psychiatric simulation focusing on violence risk assessment as a tool for a departmentwide OPPE. Of 412 invited clinicians in a large psychiatry department, 410 completed an online simulation in April-May 2012. Participants received scheduled e-mail reminders with instructions describing how to access the simulation. Using the Computer Simulation Assessment Tool, participants viewed an introductory video and were then asked to conduct a risk assessment, acting as a clinician in the encounter by selecting actions from a series of drop-down menus. Each action was paired with a corresponding video segment of a clinical encounter with a standardized patient. Participants were scored on the basis of their actions within the simulation (Measure 1) and by their responses to open-ended questions in which they were asked to integrate the information from the simulation in a summative manner (Measure 2). Of the 410 clinicians, 381 (92.9%) passed Measure 1, 359 (87.6%) passed Measure 2, and 5 (1.2%) failed both measures. Seventy-five (18.3%) participants were referred for focused professional practice evaluation (FPPE) after failing Measure 1, Measure 2, or both. Overall, Web-based simulation and e-mail engagement tools were a scalable and efficient way to assess a large number of clinicians in OPPE and to identify those who required FPPE.

  19. Combining medical informatics and bioinformatics toward tools for personalized medicine.

    PubMed

    Sarachan, B D; Simmons, M K; Subramanian, P; Temkin, J M

    2003-01-01

    Key bioinformatics and medical informatics research areas need to be identified to advance knowledge and understanding of disease risk factors and molecular disease pathology in the 21st century, leading toward new diagnoses, prognoses, and treatments. Three high-impact informatics areas are identified: predictive medicine (to identify significant correlations within clinical data using statistical and artificial intelligence methods), along with pathway informatics and cellular simulations (which combine biological knowledge with advanced informatics to elucidate molecular disease pathology). Initial predictive models have been developed for a pilot study in Huntington's disease. An initial bioinformatics platform has been developed for the reconstruction and analysis of pathways, and work has begun on pathway simulation. A bioinformatics research program has been established at GE Global Research Center as an important technology toward next-generation medical diagnostics. We anticipate that 21st-century medical research will combine informatics tools with traditional wet-lab biology research, and that this will translate to increased use of informatics techniques in the clinic.

  20. The diagnostic value of pepsin detection in saliva for gastro-esophageal reflux disease: a preliminary study from China.

    PubMed

    Du, Xing; Wang, Feng; Hu, Zhiwei; Wu, Jimin; Wang, Zhonggao; Yan, Chao; Zhang, Chao; Tang, Juan

    2017-10-17

    None of the current diagnostic methods has been proven to be a reliable tool for gastro-esophageal reflux disease (GERD). Pepsin in saliva has been proposed as a promising diagnostic biomarker for gastro-esophageal reflux. We aimed to determine the diagnostic value of salivary pepsin detection for GERD. Two hundred and fifty patients with symptoms suggestive of GERD and 35 asymptomatic healthy volunteers provided saliva on waking in the morning and after lunch and dinner for pepsin determination using the Peptest lateral flow device. All patients underwent 24-h multichannel intraluminal impedance pH (24-h MII-pH) monitoring and upper gastrointestinal endoscopy. Based on the 24-h MII-pH and endoscopy studies, patients were classified as GERD (abnormal MII-pH results and/or reflux esophagitis) or otherwise as non-GERD. Patients with GERD had a higher prevalence of pepsin in saliva and higher pepsin concentrations than non-GERD patients and healthy controls (P < 0.001 for all). The pepsin test had a sensitivity of 73% and a specificity of 88.3% for diagnosing GERD using the optimal cut-off value of 76 ng/mL. Postprandial saliva samples collected when symptoms occurred had a more powerful ability to identify GERD. The salivary pepsin test had moderate diagnostic value for GERD. It may be a promising replacement for currently invasive tools, with the advantages of being non-invasive, easy to perform, and cost-effective. ChiCTR-DDD-16009506 (date of registration: October 20, 2016).

  1. Can surgical simulation be used to train detection and classification of neural networks?

    PubMed

    Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail

    2017-10-01

    Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools are key information inputs for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation of tool classes commonly encountered during cataract surgery. A commercially available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising results in generalising to real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.
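
    The simulate-then-transfer workflow described above can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' code: the toy fully convolutional segmenter, the random stand-in tensors, and the class count are assumptions made purely for illustration.

```python
# Minimal sketch of the simulate-then-transfer workflow (all names and the
# toy data are hypothetical; the Letter's actual networks and datasets differ).
import torch
import torch.nn as nn

# Tiny fully convolutional segmenter: 3-channel frames -> per-pixel logits
# over N_CLASSES tool/background classes.
N_CLASSES = 3
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, N_CLASSES, 1),
)
loss_fn = nn.CrossEntropyLoss()

def train(frames, masks, lr, epochs):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(frames), masks)
        loss.backward()
        opt.step()

# Stage 1: pre-train on (stand-in) simulated frames and labels.
sim_x = torch.randn(8, 3, 64, 64)
sim_y = torch.randint(0, N_CLASSES, (8, 64, 64))
train(sim_x, sim_y, lr=1e-3, epochs=5)

# Stage 2: fine-tune on a smaller set of (stand-in) real surgical frames,
# typically with a lower learning rate so simulation-learned features persist.
real_x = torch.randn(2, 3, 64, 64)
real_y = torch.randint(0, N_CLASSES, (2, 64, 64))
train(real_x, real_y, lr=1e-4, epochs=5)
```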

  2. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial, and additionally very useful, to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on finite difference time domain/particle in cell (FDTD/PIC) methods in a Lorentz-boosted coordinate system, which is able to carry out a full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength and the undulator period to the same order of magnitude. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible using the available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
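
    The scale-equalizing effect of the boost can be made explicit with the standard FEL resonance relations (textbook results, not taken from the MITHRA paper itself):

```latex
% Lab frame: the resonant radiation wavelength is far shorter than the
% undulator period \lambda_u (K is the undulator parameter):
\lambda_r \simeq \frac{\lambda_u}{2\gamma^2}\left(1 + \frac{K^2}{2}\right).
% In the bunch rest frame (Lorentz factor \gamma, \beta \to 1) the undulator
% period contracts while the radiation wavelength is Doppler-expanded:
\lambda_u' = \frac{\lambda_u}{\gamma}, \qquad
\lambda_r' = \gamma(1+\beta)\,\lambda_r
           \simeq \frac{\lambda_u}{\gamma}\left(1 + \frac{K^2}{2}\right),
% so both scales become comparable and a single FDTD/PIC grid can resolve
% the full electromagnetic problem.
```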

  3. Reporting completeness and transparency of meta-analyses of depression screening tool accuracy: A comparison of meta-analyses published before and after the PRISMA statement.

    PubMed

    Rice, Danielle B; Kloda, Lorie A; Shrier, Ian; Thombs, Brett D

    2016-08-01

    Meta-analyses that are conducted rigorously and reported completely and transparently can provide accurate evidence to inform the best possible healthcare decisions. Guideline makers have raised concerns about the utility of existing evidence on the diagnostic accuracy of depression screening tools. The objective of our study was to evaluate the transparency and completeness of reporting in meta-analyses of the diagnostic accuracy of depression screening tools using the PRISMA tool adapted for diagnostic test accuracy meta-analyses. We searched MEDLINE and PsycINFO from January 1, 2005 through March 13, 2016 for recent meta-analyses in any language on the diagnostic accuracy of depression screening tools. Two reviewers independently assessed transparency in reporting using the PRISMA tool with appropriate adaptations made for studies of diagnostic test accuracy. We identified 21 eligible meta-analyses. Twelve of 21 meta-analyses complied with at least 50% of adapted PRISMA items. Of 30 adapted PRISMA items, 11 were fulfilled by ≥80% of included meta-analyses, 3 by 50-79% of meta-analyses, 7 by 25-45% of meta-analyses, and 9 by <25%. On average, post-PRISMA meta-analyses complied with 17 of 30 items compared to 13 of 30 items pre-PRISMA. Deficiencies were identified in the transparency of reporting in meta-analyses of the diagnostic test accuracy of depression screening tools. Authors, reviewers, and editors should adhere to the PRISMA statement to improve the reporting of meta-analyses of the diagnostic accuracy of depression screening tools. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Face, Content, and Construct Validations of Endoscopic Needle Injection Simulator for Transurethral Bulking Agent in Treatment of Stress Urinary Incontinence.

    PubMed

    Farhan, Bilal; Soltani, Tandis; Do, Rebecca; Perez, Claudia; Choi, Hanul; Ghoniem, Gamal

    2018-05-02

    Endoscopic injection of urethral bulking agents is an office procedure used to treat stress urinary incontinence secondary to internal sphincteric deficiency. Validation studies are an important part of simulator evaluation and are considered an important step in establishing the effectiveness of simulation-based training. The endoscopic needle injection (ENI) simulator has not been formally validated, although it has been used widely at the University of California, Irvine. We aimed to assess the face, content, and construct validity of the UC Irvine ENI simulator. Dissected female porcine bladders were mounted in a modified Hysteroscopy Diagnostic Trainer. Using routine endoscopic equipment for this procedure with video monitoring, 6 urologists (expert group) and 6 urology trainees (novice group) completed urethral bulking agent injections on a total of 12 bladders using the ENI simulator. Face and content validity were assessed using a structured quantitative survey rating realism. Construct validity was assessed by comparing performance, procedure time, and the occlusive (anatomical and functional) evaluations between the experts and novices. Trainees also completed a postprocedure feedback survey. Effective injections were evaluated by measuring the retrograde urethral opening pressure, visual cystoscopic coaptation, and postprocedure gross anatomic examination. All 12 participants felt the simulator was a good training tool and should be used as an essential part of urology training (face validity). The ENI simulator showed good face and content validity, with average scores of 3.9/5 for the experts and 3.8/5 for the novices. Content validity evaluation showed that most aspects of the simulator were adequately realistic (mean Likert scores 3.8-3.9/5). However, the bladder does not bleed and is sometimes thin. Experts significantly outperformed novices (p < 0.001) across all measures of performance, thereby establishing construct validity. The ENI simulator shows face, content and construct validity, although a few aspects of the simulator were not very realistic (e.g., bleeding). This study provides a basis for future formal validation of this simulator and for its continued use in endourology training. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  5. A usability evaluation of medical software at an expert conference setting.

    PubMed

    Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel

    2014-01-01

    A usability test was employed to evaluate two medical software applications at an expert conference setting. One application is a medical diagnostic tool (an electrocardiogram [ECG] viewer) and the other is a medical research tool (an electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain; thus, (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the predefined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire, respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min were dedicated to the pre-test and post-test questionnaires, the average time dedicated to user interaction with the medical software was 24.69 min (SD=10.28). Given that we have shown that usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given that delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts, which would otherwise be very expensive to arrange. A conference-based approach also allows data to be collected over a few days as opposed to months, by avoiding the administration duties normally involved in a laboratory-based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating by two independent usability experts. On a rating scale of 1-4 (where 1=cosmetic and 4=critical), the average severity rating was 2.24 (SD=1.09) for the ECG viewer and 2.34 (SD=0.97) for the EMS application. We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications. This statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were also able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain. It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time consuming and less expensive. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Validation of Quantitative Multimodality Analysis of Telomerase Activity in Urine Cells as a Noninvasive Diagnostic and Prognostic Tool for Prostate Cancer

    DTIC Science & Technology

    2005-08-01

    A patient in the present study, previously misdiagnosed with BPH and inflammation, was eventually found to have prostate cancer with a Gleason score of 7. (Fragmentary DTIC record; grant number W81XWH-04-1-0774.)

  7. IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.

    This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price-responsive load scenarios.
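
    To make the hierarchical co-simulation pattern concrete, here is a minimal, hypothetical mpi4py sketch of one coordinator rank exchanging signals with many feeder ranks each time step. It illustrates the MPI message-passing structure only; the price and load models, names, and numbers are invented and bear no relation to FESTIV, MATPOWER, or GridLAB-D.

```python
# Illustrative two-level co-simulation handshake (hypothetical, not IGMS code):
# rank 0 plays the ISO/transmission coordinator, all other ranks play
# distribution-feeder simulators. Run with e.g. `mpirun -n 4 python cosim.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

N_STEPS = 24  # hourly steps for one simulated day

if rank == 0:
    price = 30.0  # $/MWh, arbitrary starting point
    for t in range(N_STEPS):
        comm.bcast(price, root=0)                        # publish price signal
        loads = [comm.recv(source=r, tag=t) for r in range(1, size)]
        total = sum(loads)
        # toy market response: raise the price when aggregate load is high
        price = 30.0 + 0.05 * total
        print(f"t={t:02d} total load={total:.1f} MW, next price={price:.2f}")
else:
    for t in range(N_STEPS):
        price = comm.bcast(None, root=0)
        # toy price-responsive feeder: baseline load minus a price elasticity
        load = 100.0 - 0.5 * price + 5.0 * rank
        comm.send(load, dest=0, tag=t)
```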

  8. Analysis of laparoscopy in trauma.

    PubMed

    Villavicencio, R T; Aucar, J A

    1999-07-01

    The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.

  9. Simulation-Based Training Platforms for Arthroscopy: A Randomized Comparison of Virtual Reality Learning to Benchtop Learning.

    PubMed

    Middleton, Robert M; Alvand, Abtin; Garfjeld Roberts, Patrick; Hargrove, Caroline; Kirby, Georgina; Rees, Jonathan L

    2017-05-01

    To determine whether a virtual reality (VR) arthroscopy simulator or benchtop (BT) arthroscopy simulator showed superiority as a training tool. Arthroscopic novices were randomized to a training program on a BT or a VR knee arthroscopy simulator. The VR simulator provided user performance feedback. Individuals performed a diagnostic arthroscopy on both simulators before and after the training program. Performance was assessed using wireless objective motion analysis and a global rating scale. The groups (8 in the VR group, 9 in the BT group) were well matched at baseline across all parameters (P > .05). Training on each simulator resulted in significant performance improvements across all parameters (P < .05). BT training conferred a significant improvement in all parameters when trainees were reassessed on the VR simulator (P < .05). In contrast, VR training did not confer improvement in performance when trainees were reassessed on the BT simulator (P > .05). BT-trained subjects outperformed VR-trained subjects in all parameters during final assessments on the BT simulator (P < .05). There was no difference in objective performance between VR-trained and BT-trained subjects on final VR simulator wireless objective motion analysis assessment (P > .05). Both simulators delivered improvements in arthroscopic skills. BT training led to skills that readily transferred to the VR simulator. Skills acquired after VR training did not transfer as readily to the BT simulator. Despite trainees receiving automated metric feedback from the VR simulator, the results suggest a greater gain in psychomotor skills for BT training. Further work is required to determine if this finding persists in the operating room. This study suggests that there are differences in skills acquired on different simulators and skills learnt on some simulators may be more transferable. Further work in identifying user feedback metrics that enhance learning is also required. Copyright © 2016 Arthroscopy Association of North America. All rights reserved.

  10. Fast-ion D(alpha) measurements and simulations in DIII-D

    NASA Astrophysics Data System (ADS)

    Luo, Yadong

    The fast-ion Dα diagnostic measures the Doppler-shifted Dα light emitted by neutralized fast ions. For a favorable viewing geometry, the bright interferences from beam neutrals, halo neutrals, and edge neutrals span a small wavelength range around the Dα rest wavelength and are blocked by a vertical bar at the exit focal plane of the spectrometer. Background subtraction and fitting techniques eliminate various contaminants in the spectrum. Fast-ion data are acquired with a time resolution of ~1 ms, spatial resolution of ~5 cm, and energy resolution of ~10 keV. A weighted Monte Carlo simulation code models the fast-ion Dα spectra based on the fast-ion distribution function from other sources. In quiet plasmas, the spectral shape is in excellent agreement and the absolute magnitude is also in reasonable agreement. The fast-ion Dα signal has the expected dependencies on plasma and neutral beam parameters. The neutral particle diagnostic and neutron diagnostic corroborate the fast-ion Dα measurements. The relative spatial profile is in agreement with the simulated profile based on the fast-ion distribution function from the TRANSP analysis code. During ion cyclotron heating, fast ions with high perpendicular energy are accelerated, while those with low perpendicular energy are barely affected. The spatial profile is compared with the simulated profiles based on the fast-ion distribution functions from the CQL Fokker-Planck code. In discharges with Alfven instabilities, both the spatial profile and the spectral shape suggest that fast ions are redistributed. The flattened fast-ion Dα profile is in agreement with the fast-ion pressure profile.
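
    The weighted Monte Carlo idea can be sketched in a few lines: sample fast-ion markers from some distribution function, weight each by its probability of contributing light, and histogram the Doppler-shifted wavelengths. The following toy Python example assumes a uniform energy distribution, isotropic pitch, and an invented weighting; it shows only the structure of such a calculation, not the code used in this work.

```python
# Toy weighted Monte Carlo spectrum in the spirit of the approach described
# above (distribution function, weights, and geometry are all stand-ins).
import numpy as np

LAMBDA_0 = 656.1   # nm, D-alpha rest wavelength
C = 3.0e8          # m/s, speed of light
M_D = 3.344e-27    # kg, deuteron mass
EV = 1.602e-19     # J per eV

rng = np.random.default_rng(0)
n = 100_000
# Sample fast-ion energies (keV) and isotropic pitch; weight each marker by
# an assumed neutralization probability that decays with energy.
energy_keV = rng.uniform(10.0, 80.0, n)
pitch = rng.uniform(-1.0, 1.0, n)                 # v_parallel / v
speed = np.sqrt(2.0 * energy_keV * 1e3 * EV / M_D)
v_los = speed * pitch                             # line-of-sight projection
weight = np.exp(-energy_keV / 40.0)

# First-order Doppler shift of each marker's emitted D-alpha photon.
wavelength = LAMBDA_0 * (1.0 + v_los / C)

spectrum, edges = np.histogram(wavelength, bins=200, weights=weight)
print("spectral span: %.2f-%.2f nm" % (edges[0], edges[-1]))
```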

  11. Comparison of strategies for substantiating freedom from scrapie in a sheep flock.

    PubMed

    Durand, Benoit; Martinez, Marie-José; Calavas, Didier; Ducrot, Christian

    2009-04-30

    The public health threat represented by a potential circulation of the bovine spongiform encephalopathy agent in the sheep population has led European animal health authorities to launch large screening and genetic selection programmes. If demonstrated, such a circulation would have dramatic economic consequences for the sheep breeding sector. In this context, it is important to evaluate the feasibility of qualification procedures that would allow sheep breeders to demonstrate that their flock is free from scrapie. Classical approaches, based on surveys designed to detect disease presence, do not account for the specificities of scrapie: the genetic variations in susceptibility and the absence of a live diagnostic test routinely available. Adapting these approaches leads to a paradoxical situation in which a greater amount of testing is needed to substantiate disease freedom in genetically resistant flocks than in susceptible flocks, whereas the probability of disease freedom is a priori higher in the former than in the latter. The goal of this study was to propose, evaluate and compare several qualification strategies for demonstrating that a flock is free from scrapie. A probabilistic framework was defined that accounts for the specificities of scrapie and allows the preceding paradox to be solved. Six qualification strategies were defined that combine genotyping data, diagnostic test results and flock pedigree. These were compared in two types of simulated flocks: resistant and susceptible flocks. Two strategies allowed disease freedom to be demonstrated in several years for the majority of simulated flocks: a strategy in which all the flock animals are genotyped, and a strategy in which only founder animals are genotyped, the flock pedigree being known. In both cases, diagnostic tests are performed on culled animals. The least costly strategy varied according to the genetic context (resistant or susceptible) and to the relative costs of a genotyping exam and of a diagnostic test. This work demonstrates that combining data sources allows substantiating that a flock is free from scrapie within a reasonable time frame. Qualification schemes could thus be a useful tool for voluntary or mandatory scrapie control programmes. However, there is no general strategy that would always minimize the costs, and the choice of strategy should be adapted to local genetic conditions.
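
    The survey logic underlying such qualification schemes can be illustrated with the standard freedom-from-disease calculation. The Python sketch below computes the posterior probability that a flock is free after n negative tests, given a design prevalence, a test sensitivity, and a prior; all numbers are illustrative, and the paper's actual framework additionally stratifies risk by genotype and uses the flock pedigree.

```python
# Minimal sketch of the standard "freedom from disease" survey calculation
# (not the paper's genotype-stratified framework; numbers are illustrative).
P_DESIGN = 0.02   # design prevalence: fraction infected if disease is present
SE = 0.9          # diagnostic test sensitivity on culled animals
PRIOR_FREE = 0.5  # prior probability that the flock is free

def prob_free_given_negatives(n_tested: int) -> float:
    """Posterior P(flock free | n negative tests), assuming perfect specificity."""
    # Probability of observing all-negative results if the flock is infected:
    p_neg_if_infected = (1.0 - SE * P_DESIGN) ** n_tested
    p_neg_if_free = 1.0
    num = p_neg_if_free * PRIOR_FREE
    den = num + p_neg_if_infected * (1.0 - PRIOR_FREE)
    return num / den

for n in (0, 50, 100, 200):
    print(n, round(prob_free_given_negatives(n), 3))
```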

  12. Neutral helium beam probe

    NASA Astrophysics Data System (ADS)

    Karim, Rezwanul

    1999-10-01

    This article discusses the development of a code with which a diagnostic neutral helium beam can be used as a probe. The code numerically solves for the evolution of the population densities of helium atoms in several different energy levels as the beam propagates through the plasma. The collisional-radiative model is utilized in this numerical calculation. The spatial dependence of the metastable states of the neutral helium atom, as obtained in this numerical analysis, offers a possible diagnostic tool for tokamak plasmas. The spatial evolution was tested for several hypothetical plasma conditions. Simulation routines were also run with plasma parameters (density and temperature profiles) similar to a shot in the Princeton Beta Experiment-Modified (PBX-M) tokamak and a shot in the Tokamak Fusion Test Reactor. A comparison between the simulation results and the experimentally obtained data (for each of these two shots) is presented. A good correlation in such comparisons for a number of shots can establish the accuracy and usefulness of this probe. The result can possibly be extended to other plasma machines and to various plasma conditions in those machines.
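
    The core of such a collisional-radiative beam code is a small system of linear rate equations integrated along the beam path. The sketch below evolves a toy two-level (ground/metastable) population vector with an invented, constant rate matrix; real helium rate coefficients depend on the local electron density and temperature and involve many more levels.

```python
# Toy beam-attenuation model in the spirit of a collisional-radiative code:
# evolve level populations along the beam path, dn/dz = R n / v_beam.
# The 2-level rate matrix below is a stand-in, not real helium rate data.
import numpy as np
from scipy.integrate import solve_ivp

V_BEAM = 1.0e6          # m/s, beam atom speed (illustrative)

def rates(z):
    """Electron-impact rate matrix [1/s] along the path; here held constant."""
    exc, deexc, ion_g, ion_m = 2.0e5, 5.0e4, 1.0e4, 4.0e4
    return np.array([[-(exc + ion_g), deexc],
                     [exc,           -(deexc + ion_m)]])

def rhs(z, n):
    return rates(z) @ n / V_BEAM

# n = [ground, metastable] populations; start purely in the ground state.
sol = solve_ivp(rhs, (0.0, 0.5), [1.0, 0.0], max_step=0.01)
print("metastable fraction at exit: %.3f" % (sol.y[1, -1] / sol.y[:, -1].sum()))
```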

  13. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize the transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. The user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
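
    As an illustration of the discrete-event approach described above, the following sketch uses the generic simpy library to model one sender and one receiver that must pair each quantum signal with its classical companion before "measuring". Everything here (delays, message contents, names) is hypothetical and unrelated to the package's actual interfaces.

```python
# Illustrative discrete-event sketch of synchronizing classical and quantum
# channels between two nodes (generic simpy; not this package's API).
import simpy

CLASSICAL_DELAY = 2.0   # e.g. microseconds
QUANTUM_DELAY = 5.0

def sender(env, classical, quantum):
    for i in range(3):
        yield env.timeout(10.0)              # prepare the next entangled pair
        yield env.timeout(QUANTUM_DELAY)     # photon flight time
        quantum.put((env.now, f"qubit-{i}"))
        yield env.timeout(CLASSICAL_DELAY)   # classical message lags behind
        classical.put((env.now, f"basis-{i}"))

def receiver(env, classical, quantum):
    while True:
        t_q, qubit = yield quantum.get()
        t_c, basis = yield classical.get()
        # both signals must be in hand before a measurement is meaningful
        print(f"t={env.now:5.1f}: measured {qubit} using {basis} "
              f"(arrivals at {t_q:.1f}/{t_c:.1f})")

env = simpy.Environment()
classical = simpy.Store(env)
quantum = simpy.Store(env)
env.process(sender(env, classical, quantum))
env.process(receiver(env, classical, quantum))
env.run(until=60.0)
```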

  14. Improving the Efficiency and Quality of the Value Assessment Process for Companion Diagnostic Tests: The Companion test Assessment Tool (CAT).

    PubMed

    Canestaro, William J; Pritchard, Daryl E; Garrison, Louis P; Dubois, Robert; Veenstra, David L

    2015-08-01

    Companion diagnostic tests (CDTs) have emerged as a vital technology in the effective use of an increasing number of targeted drug therapies. Although CDTs can offer a multitude of potential benefits, assessing their value within a health technology appraisal process can be challenging because of a complex array of factors that influence clinical and economic outcomes. To develop a user-friendly tool to assist managed care and other health care decision makers in screening companion tests and determining whether an intensive technology review is necessary and, if so, where the review should be focused to improve efficiency. First, we conducted a systematic literature review of CDT cost-effectiveness studies to identify value drivers. Second, we conducted key informant interviews with a diverse group of stakeholders to elicit feedback, solicit any additional value drivers, and identify desirable attributes for an evidence review tool. Based on this information, a draft tool was developed that captured the value drivers, incorporated usability features, and focused in particular on practical use by nonexperts. Finally, the tool was pilot tested with test developers and managed care evidence evaluators to assess face validity and usability. The tool was also evaluated using several diverse examples of existing companion diagnostics and refined accordingly. We identified 65 cost-effectiveness studies of companion diagnostic technologies. The following factors were most commonly identified as value drivers in our literature review: clinical validity of testing; efficacy, safety, and cost of baseline and alternative treatments; cost and mortality of health states; and biomarker prevalence and testing cost. Stakeholders identified the following additional factors that they believed influenced the overall value of a companion test: regulatory status, actionability, utility, and market penetration. These factors were used to maximize the efficiency of the evidence review process. Stakeholders also stated that a tool should be easy to use and time efficient. Cognitive interviews with stakeholders led to minor changes in the draft tool to improve usability and relevance. The final tool consisted of 4 sections: (1) eligibility for review (2 questions), (2) prioritization of review (3 questions), (3) clinical review (3 questions), and (4) economic review (5 questions). Although the evaluation of CDTs can be challenging because of limited evidence and the added complexity of incorporating a diagnostic test into drug treatment decisions, using a pragmatic tool to identify tests that do not need extensive evaluation may improve the efficiency and effectiveness of CDT value assessments.

  15. Developing a modular architecture for creation of rule-based clinical diagnostic criteria.

    PubMed

    Hong, Na; Pathak, Jyotishman; Chute, Christopher G; Jiang, Guoqian

    2016-01-01

    With recent advances in computerized patient record systems, there is an urgent need for producing computable and standards-based clinical diagnostic criteria. Notably, constructing rule-based clinical diagnostic criteria has become one of the goals of the International Classification of Diseases (ICD)-11 revision. However, few studies have been done on building a unified architecture to support the need for diagnostic criteria computerization. In this study, we present a modular architecture for enabling the creation of rule-based clinical diagnostic criteria leveraging Semantic Web technologies. The architecture consists of two modules: an authoring module that utilizes a standards-based information model and a translation module that leverages the Semantic Web Rule Language (SWRL). In a prototype implementation, we created a diagnostic criteria upper ontology (DCUO) that integrates the ICD-11 content model with the Quality Data Model (QDM). Using the DCUO, we developed a transformation tool that converts QDM-based diagnostic criteria into SWRL representation. We evaluated the domain coverage of the upper ontology model using randomly selected diagnostic criteria from broad domains (n = 20). We also tested the transformation algorithms using 6 QDM templates for ontology population and 15 QDM-based criteria for rule generation. As a result, the first draft of the DCUO contains 14 root classes, 21 subclasses, 6 object properties and 1 data property. Investigation Findings, and Signs and Symptoms are the two most commonly used element types. All 6 templates were successfully parsed and populated into their corresponding domain-specific ontologies, and 14 of the 15 rules (93.3%) passed rule validation. Our efforts in developing and prototyping a modular architecture provide useful insight into how to build a scalable solution to support diagnostic criteria representation and computerization.

  16. Laser diagnostics for combustion temperature and species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckbreth, A.C.

    1987-01-01

    Laser approaches to combustion diagnostics are of considerable interest due to their remote, nonintrusive and in-situ character, unlimited temperature capability, and potential for simultaneous temporal and spatial resolution. This book aims to make these powerful and important new tools in combustion research understandable. The focus of this text is on spectroscopically based, spatially precise laser techniques for temperature and chemical composition measurements in reacting and nonreacting flows. After introductory chapters providing a fundamental theoretical and experimental background, attention is directed to diagnostics based upon spontaneous Raman and Rayleigh scattering, coherent anti-Stokes Raman spectroscopy (CARS) and laser-induced fluorescence (LIFS). The book concludes with a treatment of techniques which permit spatially resolved measurements over an entire two-dimensional field simultaneously.

  17. Effect of Worked Examples on Mental Model Progression in a Computer-Based Simulation Learning Environment

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma

    2010-01-01

    In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…

  18. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA on the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing, including exhaust waveform analysis.

  19. A proposed computer diagnostic system for malignant melanoma (CDSMM).

    PubMed

    Shao, S; Grams, R R

    1994-04-01

    This paper describes a computer diagnostic system for malignant melanoma. The diagnostic system is a rule-based system built on image analysis and works in the PC Windows environment. It consists of seven modules, including an I/O module, a patient/clinic database, an image processing module, a classification module, a rule base module and a system control module. In the system, the image analyses are carried out automatically, and database management is efficient and fast. Both final clinical results and intermediate results from the various modules, such as measured features, feature pictures and history records of the disease lesion, can be presented on screen or printed out from each corresponding module or from the I/O module. The system can also work as a doctor's office-based tool to aid dermatologists with details not perceivable by the human eye. Since the system operates on a general-purpose PC, it can be made portable if the I/O module is disconnected.

  20. Structured syncope care pathways based on lean six sigma methodology optimises resource use with shorter time to diagnosis and increased diagnostic yield.

    PubMed

    Martens, Leon; Goode, Grahame; Wold, Johan F H; Beck, Lionel; Martin, Georgina; Perings, Christian; Stolt, Pelle; Baggerman, Lucas

    2014-01-01

    To conduct a pilot study on the potential to optimise care pathways in syncope/Transient Loss of Consciousness management by using Lean Six Sigma methodology while maintaining compliance with ESC and/or NICE guidelines. Five hospitals in four European countries took part. The Lean Six Sigma methodology consisted of 3 phases: 1) Assessment phase, in which baseline performance was mapped in each centre, processes were evaluated and a new operational model was developed with an improvement plan that included best practices and change management; 2) Improvement phase, in which optimisation pathways and standardised best practice tools and forms were developed and implemented. Staff were trained on new processes and change-management support provided; 3) Sustaining phase, which included support, refinement of tools and metrics. The impact of the implementation of new pathways was evaluated on number of tests performed, diagnostic yield, time to diagnosis and compliance with guidelines. One hospital with focus on geriatric populations was analysed separately from the other four. With the new pathways, there was a 59% reduction in the average time to diagnosis (p = 0.048) and a 75% increase in diagnostic yield (p = 0.007). There was a marked reduction in repetitions of diagnostic tests and improved prioritisation of indicated tests. Applying a structured Lean Six Sigma based methodology to pathways for syncope management has the potential to improve time to diagnosis and diagnostic yield.

  2. Teaching of diagnostic skills in equine gynecology: simulator-based training versus schooling on live horses.

    PubMed

    Nagel, Christina; Ille, Natascha; Aurich, Jörg; Aurich, Christine

    2015-10-15

    Transrectal palpation and ultrasonography of the genital tract in mares are first-day skills for equine veterinarians. In this study, the learning outcome in equine gynecology after training four times on horses (group H4, n = 8), training once on horses (group H1, n = 9), and training four times on a simulator (group Sim, n = 8) was assessed in third-year veterinary students with two tests in live mares 14 days apart. The students of group H4 always scored better for transrectal palpation than students of groups Sim and H1 (P < 0.05). Overall, the students achieved better results when palpating the left versus the right ovary (P < 0.001), but group H1 students were least successful in obtaining correct ovarian findings (P < 0.05 vs. both other groups). The students' self-assessment reflected the test results, with palpation of the right ovary experienced as most difficult by group H1 students (P < 0.01 vs. both other groups). The groups did not differ in their scores for the ultrasound examinations. Sim students were nearly as successful in transrectal palpation of the genital tract in mares as H4 students, and for most parameters assessed they performed better than H1 students. After training four times on horses, students scored best, but nevertheless the overall effect of intensive training was limited. Repeated simulator-based training is a useful tool to prepare veterinary students for transrectal palpation of the genital tract in mares and is more effective than one training session on horses. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Improved tympanic thermometer based on a fiber optic infrared radiometer and an otoscope and its use as a new diagnostic tool for acute otitis media

    NASA Astrophysics Data System (ADS)

    Fishman, Gadi; DeRowe, Ari; Ophir, Eyal; Scharf, Vered; Shabtai, Abraham; Ophir, Dov; Katzir, Abraham

    1999-06-01

    Clinical diagnosis of acute otitis media (AOM) in children is not easy. It was assumed that there is a difference ΔT between the tympanic membrane (TM) temperatures in the two ears in unilateral AOM and that an accurate measurement of ΔT may improve diagnostic accuracy. An IR-transmitting fiber, made of AgClBr, was coupled into a hand-held otoscope and used for non-contact (radiometric) measurements of TT, the TM temperature. Experiments were carried out, first, on a laboratory model that simulated the human ear, including an artificial tympanic membrane and an artificial ear canal. Measurements carried out using commercially available tympanic thermometers showed that the temperature Tc of the ear canal affected the results. Tc did not affect the fiberoptic radiometer, and this device accurately measured the true temperature TT of the tympanic membrane. A prospective blinded sampling of the TM temperature was then performed on 48 children with suspected AOM. The mean temperature difference between the ears for children with unilateral AOM was ΔT = (0.68 +/- 0.27)°C. For children with bilateral AOM it was ΔT = (0.14 +/- 0.10)°C (p < 0.001). It was demonstrated that for unilateral AOM the difference ΔT was proportional to the systemic temperature. In conclusion, fiberoptic radiometric measurement of the TM can be a useful non-invasive diagnostic tool for AOM, when combined with other data.

  4. Event-based soil loss models for construction sites

    NASA Astrophysics Data System (ADS)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediment entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
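
    The two model families mentioned above can be sketched on synthetic event data. In the hypothetical Python example below, an MLR is fit to log-transformed USLE-style event factors (so the multiplicative structure becomes linear) and a small ANN is fit to the raw factors; the feature set, data, and hyperparameters are invented for illustration and are not the paper's calibrated models.

```python
# Hedged sketch of the two model families on synthetic event data: an MLR on
# log-transformed USLE-style event factors, and a small ANN on raw factors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 200
# Event-scale stand-ins: rainfall erosivity R, soil erodibility K,
# slope factor LS, cover factor C.
X = rng.uniform(0.1, 1.0, size=(n, 4))
# Synthetic "true" multiplicative response, USLE-like: A = R*K*LS*C (+ noise)
A = X.prod(axis=1) * np.exp(rng.normal(0.0, 0.1, n))

# MLR on log-transformed variables turns the product into a sum.
mlr = LinearRegression().fit(np.log(X), np.log(A))
print("MLR coefficients (expect ~1 each):", mlr.coef_.round(2))

# A small neural network fit directly on the raw factors.
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                   random_state=0).fit(X, A)
print("ANN R^2 on training data:", round(ann.score(X, A), 3))
```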

  5. Cadaver-based Necrotizing Fasciitis Model for Medical Training.

    PubMed

    Mohty, Kurt M; Cravens, Matthew G; Adamas-Rappaport, William J; Amini-Shervin, Bahareh; Irving, Steven C; Stea, Nicholas; Adhikari, Srikar; Amini, Richard

    2017-04-14

    Necrotizing fasciitis is a devastating infectious disease process that is characterized by extensive soft tissue necrosis along deep fascial planes, systemic toxicity, and high mortality. Ultrasound imaging is a rapid and non-invasive tool that can be used to help make the diagnosis of necrotizing fasciitis by identifying several distinctive sonographic findings. The purpose of this study is to describe the construction of a realistic diagnostic training model for necrotizing fasciitis using fresh frozen cadavers and common, affordable materials. Fresh non-embalmed cadavers are presently used at medical institutions for a variety of educational purposes, including cadaver-based ultrasound training. Details for the preparation and construction of a necrotizing fasciitis cadaver model are presented here. This paper shows that the images obtained from the cadaver model closely imitate the ultrasound appearance of fluid and gas seen in actual clinical cases of necrotizing fasciitis. Therefore, it can be concluded that this cadaver-based model produces high-quality sonographic images that simulate those found in true cases of necrotizing fasciitis and is ideal for demonstrating the sonographic findings of necrotizing fasciitis.

  6. Stabilized FE simulation of prototype thermal-hydraulics problems with integrated adjoint-based capabilities

    NASA Astrophysics Data System (ADS)

    Shadid, J. N.; Smith, T. M.; Cyr, E. C.; Wildey, T. M.; Pawlowski, R. P.

    2016-09-01

    A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. In this respect, the understanding of numerical error, and of the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and the mathematical models, is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds-averaged Navier-Stokes approximation to turbulent fluid flow and heat transfer, using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. Initial results are presented that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.

  8. A compact fiber optics-based heterodyne combined normal and transverse displacement interferometer.

    PubMed

    Zuanetti, Bryan; Wang, Tianxue; Prakash, Vikas

    2017-03-01

    While Photonic Doppler Velocimetry (PDV) has become a common diagnostic tool for the measurement of the normal component of particle motion in shock wave experiments, this technique has not yet been modified for the measurement of combined normal and transverse motion, as needed in oblique plate impact experiments. In this paper, we discuss the design and implementation of a compact fiber-optics-based heterodyne combined normal and transverse displacement interferometer. Like the standard PDV, this diagnostic tool is assembled using commercially available telecommunications hardware and uses a 1550 nm wavelength, 2 W fiber-coupled laser, an optical focuser, and single-mode fibers to transport light to and from the target. Two additional optical probes capture first-order beams diffracted from a reflective grating at the target free surface and deliver the beams past circulators to a coupler where the signals are combined to form a beat frequency. The combined signal is then digitized and analyzed to determine the transverse component of the particle motion. The maximum normal velocity that can be measured by this system is limited by the equivalent transmission bandwidth (3.795 GHz) of the combined detector, amplifier, and digitizer and is estimated to be ∼2.9 km/s. Sample symmetric oblique plate-impact experiments are performed to demonstrate the capability of this diagnostic tool in the measurement of combined normal and transverse particle motion.
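
    The quoted velocity limit follows directly from the standard heterodyne PDV relation between surface velocity and beat frequency; the algebra below is a generic consistency check, with the wavelength and bandwidth taken from the abstract:

```latex
% A surface moving at normal velocity v produces a beat frequency
f_b = \frac{2 v}{\lambda},
% so the largest resolvable velocity for detection bandwidth f_{\max} is
v_{\max} = \frac{\lambda f_{\max}}{2}
         = \frac{(1550\,\mathrm{nm})(3.795\,\mathrm{GHz})}{2}
         \approx 2.9\ \mathrm{km/s}.
```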

  9. Application of the denaturing gradient gel electrophoresis (DGGE) technique as an efficient diagnostic tool for ciliate communities in soil.

    PubMed

    Jousset, Alexandre; Lara, Enrique; Nikolausz, Marcell; Harms, Hauke; Chatzinotas, Antonis

    2010-02-01

    Ciliates (or Ciliophora) are ubiquitous organisms which can be widely used as bioindicators in ecosystems exposed to anthropogenic and industrial influences. The evaluation of environmental impacts on soil ciliate communities with methods relying on morphology-based identification may be hampered by the large number of samples usually required for a statistically supported, reliable conclusion. Cultivation-independent molecular-biological diagnostic tools are a promising alternative that could greatly simplify and accelerate such studies. In the present work, a ciliate-specific fingerprint method based on the amplification of a phylogenetic marker gene (i.e. the 18S ribosomal RNA gene) with subsequent analysis by denaturing gradient gel electrophoresis (DGGE) was developed and used to monitor community shifts in a polycyclic aromatic hydrocarbon (PAH) polluted soil. The semi-nested approach generated ciliate-specific amplification products from all soil samples and made it possible to distinguish community profiles from a PAH-polluted and a non-polluted control soil. Subsequent sequence analysis of excised bands provided evidence that polluted soil samples are dominated by organisms belonging to the class Colpodea. The general DGGE approach presented in this study might thus in principle serve as a fast and reproducible diagnostic tool, complementing and facilitating future ecological and ecotoxicological monitoring of ciliates in polluted habitats. Copyright 2009 Elsevier B.V. All rights reserved.

  10. Feasibility of Self-Reflection as a Tool to Balance Clinical Reasoning Strategies

    ERIC Educational Resources Information Center

    Sibbald, Matthew; de Bruin, Anique B. H.

    2012-01-01

    Clinicians are believed to use two predominant reasoning strategies: System 1-based pattern recognition and System 2-based analytical reasoning. Balancing these cognitive reasoning strategies is widely believed to reduce diagnostic error. However, clinicians approach different problems with different reasoning strategies. This study explores…

  11. Simulation of the Physics of Flight

    ERIC Educational Resources Information Center

    Lane, W. Brian

    2013-01-01

    Computer simulations continue to prove to be a valuable tool in physics education. Based on the needs of an Aviation Physics course, we developed the PHYSics of FLIght Simulator (PhysFliS), which numerically solves Newton's second law for an airplane in flight based on standard aerodynamics relationships. The simulation can be used to pique…
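
    As a sketch of what such a simulator computes, the hypothetical Python snippet below integrates Newton's second law for a 2-D point-mass airplane under lift, drag, thrust, and gravity; it is not PhysFliS itself, and all parameter values are illustrative.

```python
# Minimal 2-D point-mass flight integrator (illustrative, not PhysFliS).
import numpy as np

G = 9.81          # m/s^2, gravitational acceleration
RHO = 1.225       # kg/m^3, sea-level air density
MASS = 1000.0     # kg
S = 16.0          # m^2, wing area
CL, CD = 0.6, 0.05
THRUST = 2500.0   # N, fixed along the velocity vector

def step(pos, vel, dt):
    v = np.linalg.norm(vel)
    v_hat = vel / v
    q = 0.5 * RHO * v**2 * S            # dynamic pressure times wing area
    drag = -q * CD * v_hat
    # lift acts perpendicular to the velocity, in the vertical plane
    perp = np.array([-v_hat[1], v_hat[0]])
    lift = q * CL * perp
    thrust = THRUST * v_hat
    gravity = np.array([0.0, -MASS * G])
    acc = (lift + drag + thrust + gravity) / MASS   # Newton's second law
    return pos + vel * dt, vel + acc * dt           # forward Euler step

pos, vel = np.array([0.0, 100.0]), np.array([50.0, 0.0])
for _ in range(1000):
    pos, vel = step(pos, vel, 0.01)
print("after 10 s: x=%.0f m, altitude=%.0f m" % (pos[0], pos[1]))
```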

  12. A Computer Model for Red Blood Cell Chemistry

    DTIC Science & Technology

    1996-10-01

    There is a growing need for interactive computational tools for medical education and research. The most exciting paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at… to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important…

  13. pyPcazip: A PCA-based toolkit for compression and analysis of molecular simulation data

    NASA Astrophysics Data System (ADS)

    Shkurti, Ardita; Goni, Ramon; Andrio, Pau; Breitmoser, Elena; Bethune, Iain; Orozco, Modesto; Laughton, Charles A.

    The biomolecular simulation community is currently in need of novel and optimised software tools that can analyse and process, in reasonable timescales, the large amounts of data generated by molecular simulations. In light of this, we have developed and present here pyPcazip: a suite of software tools for compression and analysis of molecular dynamics (MD) simulation data. The software is compatible with trajectory file formats generated by most contemporary MD engines such as AMBER, CHARMM, GROMACS and NAMD, and is MPI-parallelised to permit the efficient processing of very large datasets. pyPcazip is Unix-based, open-source software (BSD licence) written in Python.
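
    The compression idea behind PCA-based tools of this kind can be sketched in a few lines: flatten each frame's Cartesian coordinates into a vector, find the principal components of the centred trajectory, and store only the projections onto the top few components. A toy illustration with synthetic data (this is not pyPcazip's API):

      import numpy as np

      # Toy trajectory: n_frames snapshots of n_atoms in 3-D (synthetic, not a real MD run).
      rng = np.random.default_rng(0)
      n_frames, n_atoms = 500, 100
      traj = rng.normal(size=(n_frames, n_atoms * 3))   # each frame flattened to one row

      mean = traj.mean(axis=0)
      centered = traj - mean
      # SVD of the centred data gives principal components without forming the covariance matrix.
      _, s, vt = np.linalg.svd(centered, full_matrices=False)
      k = 20                                      # number of principal components kept
      scores = centered @ vt[:k].T                # compressed representation (n_frames x k)
      recon = scores @ vt[:k] + mean              # approximate reconstruction
      var_kept = (s[:k] ** 2).sum() / (s ** 2).sum()
      rms_err = np.sqrt(((traj - recon) ** 2).mean())
      print(f"kept {k} PCs ({var_kept:.1%} of variance), RMS reconstruction error = {rms_err:.3f}")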

  14. The role of simulation in continuing medical education for acute care physicians: a systematic review.

    PubMed

    Khanduja, P Kristina; Bould, M Dylan; Naik, Viren N; Hladkowicz, Emily; Boet, Sylvain

    2015-01-01

    We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy, with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence. Simulation is perceived as a positive learning experience, with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, the quality of assessment tools, and the impact of simulation-based education beyond the individual toward improved patient care.

  15. Model-based surgical planning and simulation of cranial base surgery.

    PubMed

    Abe, M; Tabuchi, K; Goto, M; Uchino, A

    1998-11-01

    Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.

  16. FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics

    NASA Technical Reports Server (NTRS)

    Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg

    1993-01-01

    FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics, in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high-resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 is developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.

  17. A multimedia patient simulation for teaching and assessing endodontic diagnosis.

    PubMed

    Littlefield, John H; Demps, Elaine L; Keiser, Karl; Chatterjee, Lipika; Yuan, Cheng H; Hargreaves, Kenneth M

    2003-06-01

    Teaching and assessing diagnostic skills are difficult due to relatively small numbers of total clinical experiences and a shortage of clinical faculty. Patient simulations could help teach and assess diagnosis by displaying a well-defined diagnostic task, then providing informative feedback and opportunities for repetition and correction of errors. This report describes the development and initial evaluation of SimEndo I, a multimedia patient simulation program that could be used for teaching or assessing endodontic diagnosis. Students interact with a graphical interface that has four pull-down menus and related submenus. In response to student requests, the program presents patient information. Scoring is based on diagnosis of each case by endodontists. Pilot testing with seventy-four junior dental students identified numerous needed improvements to the user interface program. A multi-school field test of the interface program using three patient cases addressed three research questions: 1) How did the field test students evaluate SimEndo I? Overall mean evaluation was 8.1 on a 0 to 10 scale; 2) How many cases are needed to generate a reproducible diagnostic proficiency score for an individual student using the Rimoldi scoring procedure? Mean diagnostic proficiency scores by case ranged from .27 to .40 on a 0 to 1 scale; five cases would produce a score with a 0.80 reliability coefficient; and 3) Did students accurately diagnose each case? Mean correct diagnosis scores by case ranged from .54 to .78 on a 0 to 1 scale. We conclude that multimedia patient simulations offer a promising alternative for teaching and assessing student diagnostic skills.
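
    The projection quoted above (five cases for a 0.80 reliability coefficient) is the kind of estimate the Spearman-Brown prophecy formula produces. A sketch, assuming a single-case reliability of roughly 0.44 (a value chosen here only to reproduce the quoted figure; the report does not state it):

      # Spearman-Brown prophecy: reliability of a k-case score from a single-case reliability r1.
      def spearman_brown(r1, k):
          return k * r1 / (1 + (k - 1) * r1)

      # Illustrative only: a single-case reliability near 0.44 makes five cases
      # reach the 0.80 reliability coefficient quoted in the abstract.
      r1 = 0.444
      for k in (1, 3, 5, 10):
          print(f"{k} cases -> projected reliability {spearman_brown(r1, k):.2f}")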

  18. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright-Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file formats in a common platform. Results from a successful low-speed wind tunnel test at WPAFB, in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements, are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moiré interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  19. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    NASA Astrophysics Data System (ADS)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostic tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this study establish the feasibility and importance of including influential point detection diagnostics as a standard tool in hydrological model calibration. They provide the hydrologist with important information on whether model calibration is susceptible to a small number of highly influential data points. This enables the hydrologist to make a more informed decision of whether to (1) remove/retain the calibration data; (2) adjust the calibration strategy and/or hydrological model to reduce the susceptibility of model predictions to a small number of influential observations.
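
    For a linear model such as a stage/discharge rating curve, the analytical diagnostic can be computed from a single fit, with no refitting per point. A minimal sketch of Cook's distance for ordinary least squares (synthetic data, not the study's):

      import numpy as np

      # Synthetic rating-curve-like data: y depends linearly on x.
      rng = np.random.default_rng(1)
      n = 50
      x = rng.uniform(0, 10, n)
      y = 2.0 + 1.5 * x + rng.normal(0, 1.0, n)
      X = np.column_stack([np.ones(n), x])              # design matrix with intercept

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta
      p = X.shape[1]
      s2 = resid @ resid / (n - p)                      # residual variance estimate
      h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)     # leverages (hat-matrix diagonal)
      cooks = resid ** 2 * h / (s2 * p * (1 - h) ** 2)  # analytical Cook's distance
      print(f"most influential point: index {cooks.argmax()}, D = {cooks.max():.3f}")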

  20. Development of terahertz otoscope for diagnosing otitis media (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jeon, Tae-In; Ji, Young Bin; Bark, Hyeon Sang; Noh, Sam Kyu; Oh, Seung Jae

    2017-03-01

    A novel terahertz (THz) otoscope is designed and fabricated to help physicians diagnose otitis media (OM) with both THz and conventional optical diagnostics. The inclusion of indium tin oxide (ITO) glass in the THz otoscope is what allows both modalities to be used through the same instrument. To establish THz diagnostics for OM, we observed reflection signals from samples behind a thin dielectric film and found that the presence of water behind the membrane could be distinguished based on THz pulse shape. We verified the potential of this tool for diagnosing OM using mouse skin tissue and human tympanic membrane samples prior to clinical application. The presence of water absorbed by the human membrane was easily distinguished based on differences in pulse shapes and peak-to-peak amplitudes of reflected THz pulses. The potential for early OM diagnosis using the THz otoscope was confirmed by the alteration of the THz pulse depending on water absorption level.

  1. Parametric Testing of Launch Vehicle FDDR Models

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Bajwa, Anupa; Berg, Peter; Thirumalainambi, Rajkumar

    2011-01-01

    For the safe operation of a complex system like a (manned) launch vehicle, real-time information about the state of the system and potential faults is extremely important. The on-board FDDR (Failure Detection, Diagnostics, and Response) system is a software system to detect and identify failures, provide real-time diagnostics, and initiate fault recovery and mitigation. The ERIS (Evaluation of Rocket Integrated Subsystems) failure simulation is a unified Matlab/Simulink model of the Ares I Launch Vehicle with modular, hierarchical subsystems and components. With this model, the nominal flight performance characteristics can be studied. Additionally, failures can be injected to see their effects on vehicle state and on vehicle behavior. A comprehensive test and analysis of such a complicated model is virtually impossible. In this paper, we describe how parametric testing (PT) can be used to support testing and analysis of the ERIS failure simulation. PT uses a combination of Monte Carlo techniques with n-factor combinatorial exploration to generate a small, yet comprehensive set of parameters for the test runs. For the analysis of the high-dimensional simulation data, we are using multivariate clustering to automatically find structure in this high-dimensional data space. Our tools can generate detailed HTML reports that facilitate the analysis.
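
    The combination of Monte Carlo sampling with n-factor combinatorial exploration can be sketched simply: guarantee that every pair of levels for every pair of factors is covered, and fill the remaining factors with random draws. A naive illustration (the parameter names are invented, not the real ERIS set, and practical tools compress the resulting set to a near-minimal covering array):

      import itertools, random

      random.seed(0)
      # Hypothetical fault-injection parameters and their levels.
      params = {
          "engine_fault": [0, 1, 2],
          "sensor_bias": [-1.0, 0.0, 1.0],
          "valve_delay": [0.0, 0.5],
      }
      names = list(params)
      tests = []
      # 2-factor combinatorial core: every pair of levels for every pair of factors
      # appears in at least one test; remaining factors are sampled Monte Carlo style.
      for a, b in itertools.combinations(names, 2):
          for va, vb in itertools.product(params[a], params[b]):
              test = {name: random.choice(params[name]) for name in names}
              test[a], test[b] = va, vb
              tests.append(test)
      print(len(tests), "test cases, e.g.", tests[0])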

  2. Examinations of electron temperature calculation methods in Thomson scattering diagnostics.

    PubMed

    Oh, Seungtae; Lee, Jong Ha; Wi, Hanmin

    2012-10-01

    The electron temperature from a Thomson scattering diagnostic is derived through indirect calculation based on a theoretical model. The χ-square test is commonly used in the calculation, and the reliability of the calculation method depends strongly on the noise level of the input signals. In the simulations, noise effects on the χ-square test are examined, and a scale-factor test is proposed as an alternative method.
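
    The noise sensitivity of a χ-square fit can be illustrated generically: model the scattered spectrum as a Gaussian whose width grows with the square root of Te, add noise of increasing level, and fit by least squares. Everything below (the width scaling, the noise levels) is a toy assumption, not the actual diagnostic model:

      import numpy as np
      from scipy.optimize import curve_fit

      # Toy model: scattered spectrum as a Gaussian whose width scales with sqrt(Te).
      def spectrum(shift, amp, te_ev):
          width = 2.0 * np.sqrt(te_ev)          # arbitrary width scaling for illustration
          return amp * np.exp(-0.5 * (shift / width) ** 2)

      shift = np.linspace(-50, 50, 40)          # wavelength shift (arbitrary units)
      truth = spectrum(shift, 1.0, 100.0)       # "true" electron temperature: 100 eV
      rng = np.random.default_rng(2)
      for noise in (0.01, 0.05, 0.10):          # increasing detector noise levels
          measured = truth + rng.normal(0, noise, shift.size)
          (amp, te), _ = curve_fit(spectrum, shift, measured, p0=(1.0, 50.0))
          print(f"noise sigma {noise:.2f}: fitted Te = {te:.1f} eV")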

  3. Diagnostic Tools for Performance Evaluation of Innovative In-Situ Remediation Technologies at Chlorinated Solvent-Contaminated Sites

    DTIC Science & Technology

    2011-07-01

    …these innovative methods with conventional diagnostic tools that are currently used for assessing bioremediation performance. DEMONSTRATION RESULTS: 3-D multi-level systems…

  4. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    NASA Astrophysics Data System (ADS)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts, they also provide the opportunity to obtain a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology) and the study site is the Des Anglais watershed, a 690 km² river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were also used in a calibration exercise to determine the standard deviations of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observation during rainfall events. Assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observation at both outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.
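
    The three post-assimilation diagnostics named above are simple to compute once the forecast and analysis ensembles are mapped into observation space. A schematic sketch with synthetic arrays (shapes and values are illustrative only, not the CATHY setup):

      import numpy as np

      # Schematic post-assimilation diagnostics in observation space.
      rng = np.random.default_rng(3)
      n_ens, n_obs = 50, 10
      background = rng.normal(5.0, 1.0, (n_ens, n_obs))   # forecast ensemble
      analysis = background + 0.4                          # ensemble after a toy update
      obs = rng.normal(5.5, 0.5, n_obs)                    # observations

      innovation = obs - background.mean(axis=0)           # observation minus background
      residual = obs - analysis.mean(axis=0)               # observation minus analysis
      increment = analysis.mean(axis=0) - background.mean(axis=0)  # analysis minus background
      # For an unbiased, near-optimal assimilation the innovations should have near-zero mean.
      print(f"mean innovation {innovation.mean():+.2f}, mean residual {residual.mean():+.2f}, "
            f"mean increment {increment.mean():+.2f}")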

  5. The Use of Fluoroproline in MUC1 Antigen Enables Efficient Detection of Antibodies in Patients with Prostate Cancer.

    PubMed

    Somovilla, Víctor J; Bermejo, Iris A; Albuquerque, Inês S; Martínez-Sáez, Nuria; Castro-López, Jorge; García-Martín, Fayna; Compañón, Ismael; Hinou, Hiroshi; Nishimura, Shin-Ichiro; Jiménez-Barbero, Jesús; Asensio, Juan L; Avenoza, Alberto; Busto, Jesús H; Hurtado-Guerrero, Ramón; Peregrina, Jesús M; Bernardes, Gonçalo J L; Corzana, Francisco

    2017-12-20

    A structure-based design of a new generation of tumor-associated glycopeptides with improved affinity against two anti-MUC1 antibodies is described. These unique antigens feature a fluorinated proline residue, such as a (4S)-4-fluoro-l-proline or 4,4-difluoro-l-proline, at the most immunogenic domain. Binding assays using biolayer interferometry reveal 3-fold to 10-fold affinity improvement with respect to the natural (glyco)peptides. According to X-ray crystallography and MD simulations, the fluorinated residues stabilize the antigen-antibody complex by enhancing key CH/π interactions. Interestingly, a notable improvement in detection of cancer-associated anti-MUC1 antibodies from serum of patients with prostate cancer is achieved with the non-natural antigens, which proves that these derivatives can be considered better diagnostic tools than the natural antigen for prostate cancer.

  6. Direct digital conversion detector technology

    NASA Astrophysics Data System (ADS)

    Mandl, William J.; Fedors, Richard

    1995-06-01

    Future imaging sensors for the aerospace and commercial video markets will depend on low cost, high speed analog-to-digital (A/D) conversion to efficiently process optical detector signals. Current A/D methods place a heavy burden on system resources, increase noise, and limit the throughput. This paper describes a unique method for incorporating A/D conversion right on the focal plane array. This concept is based on Sigma-Delta sampling, and makes optimum use of the active detector real estate. Combined with modern digital signal processors, such devices will significantly increase data rates off the focal plane. Early conversion to digital format will also decrease the signal susceptibility to noise, lowering the communications bit error rate. Computer modeling of this concept is described, along with results from several simulation runs. A potential application for direct digital conversion is also reviewed. Future uses for this technology could range from scientific instruments to remote sensors, telecommunications gear, medical diagnostic tools, and consumer products.

  7. WFUMB Position Paper. Learning Gastrointestinal Ultrasound: Theory and Practice.

    PubMed

    Atkinson, Nathan S S; Bryant, Robert V; Dong, Yi; Maaser, Christian; Kucharzik, Torsten; Maconi, Giovanni; Asthana, Anil K; Blaivas, Michael; Goudie, Adrian; Gilja, Odd Helge; Nolsøe, Christian; Nürnberg, Dieter; Dietrich, Christoph F

    2016-12-01

    Gastrointestinal ultrasound (GIUS) is an ultrasound application that has been practiced for more than 30 years. Recently, GIUS has enjoyed a resurgence of interest, and there is now strong evidence of its utility and accuracy as a diagnostic tool for multiple indications. The method of learning GIUS is not standardised and may incorporate mentorship, didactic teaching and e-learning. Simulation, using either low- or high-fidelity models, can also play a key role in practicing and honing novice GIUS skills. A course for training as well as establishing and evaluating competency in GIUS is proposed in the manuscript, based on established learning theory practice. We describe the broad utility of GIUS in clinical medicine, including a review of the literature and existing meta-analyses. Further, the manuscript calls for agreement on international standards regarding education, training and indications. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  8. Role-playing simulation as an educational tool for health care personnel: developing an embedded assessment framework.

    PubMed

    Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne

    2010-04-01

    Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.

  9. Games for Traffic Education: An Experimental Study of a Game-Based Driving Simulator

    ERIC Educational Resources Information Center

    Backlund, Per; Engstrom, Henrik; Johannesson, Mikael; Lebram, Mikael

    2010-01-01

    In this article, the authors report on the construction and evaluation of a game-based driving simulator using a real car as a joystick. The simulator is constructed from off-the-shelf hardware and the simulation runs on open-source software. The feasibility of the simulator as a learning tool has been experimentally evaluated. Results are…

  10. Validation of Magnetic Resonance Thermometry by Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Rydquist, Grant; Owkes, Mark; Verhulst, Claire M.; Benson, Michael J.; Vanpoppel, Bret P.; Burton, Sascha; Eaton, John K.; Elkins, Christopher P.

    2016-11-01

    Magnetic Resonance Thermometry (MRT) is a new experimental technique that can create fully three-dimensional temperature fields in a noninvasive manner. However, validation is still required to determine the accuracy of measured results. One method of examination is to compare data gathered experimentally to data computed with computational fluid dynamics (CFD). In this study, large-eddy simulations have been performed with the NGA computational platform to generate data for comparison with previously run MRT experiments. The experimental setup consisted of a heated jet inclined at 30° injected into a larger channel. In the simulations, viscosity and density were scaled according to the local temperature to account for differences in buoyant and viscous forces. A mesh-independence study was performed with 5-million-, 15-million- and 45-million-cell meshes. The program Star-CCM+ was used to simulate the complete experimental geometry, and its results were compared to data generated from NGA. Overall, both programs show good agreement with the experimental data gathered with MRT. With these data, the validity of MRT as a diagnostic tool has been shown, and the tool can be used to further our understanding of a range of flows with non-trivial temperature distributions.

  11. A Framework to Debug Diagnostic Matrices

    NASA Technical Reports Server (NTRS)

    Kodal, Anuradha; Robinson, Peter; Patterson-Hine, Ann

    2013-01-01

    Diagnostics is an important concept in system health and monitoring of space operations. Many existing diagnostic algorithms utilize system knowledge in the form of a diagnostic matrix (D-matrix, also popularly known as a diagnostic dictionary, fault signature matrix or reachability matrix) gleaned from physical models. Sometimes, however, this matrix is not coherent enough to obtain high diagnostic performance. In such a case, it is important to modify the D-matrix based on knowledge obtained from other sources, such as time-series data streams (simulated or maintenance data), within the context of a framework that includes the diagnostic/inference algorithm. A systematic and sequential update procedure, the diagnostic modeling evaluator (DME), is proposed to modify the D-matrix and wrapper logic, considering the least expensive solution first. This iterative procedure covers conditions ranging from modifying 0s and 1s in the matrix to adding/removing rows (failure sources) and columns (tests). We evaluate this framework on datasets from the DX Challenge 2009.

  12. Diagnosis of TIA (DOT) score--design and validation of a new clinical diagnostic tool for transient ischaemic attack.

    PubMed

    Dutta, Dipankar

    2016-02-09

    The diagnosis of Transient Ischaemic Attack (TIA) can be difficult, and 50-60% of patients seen in TIA clinics turn out to be mimics. Many of these mimics have high ABCD2 scores and fill urgent TIA clinic slots inappropriately. A TIA diagnostic tool may help non-specialists make the diagnosis with greater accuracy and improve TIA clinic triage. The only available diagnostic score (Dawson et al.) is limited in scope and not widely used. The Diagnosis of TIA (DOT) score is a new and internally validated web- and mobile-app-based diagnostic tool which encompasses both brain and retinal TIA. The score was derived retrospectively from a single-centre TIA clinic database using stepwise logistic regression by backward elimination to find the best model. An optimum cutpoint was obtained for the score. The derivation and validation cohorts were separate samples drawn from the years 2010/12 and 2013, respectively. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were calculated, and the diagnostic accuracy of DOT was compared with that of the Dawson score. Web and smartphone calculators were designed subsequently. The derivation cohort had 879 patients and the validation cohort 525. The final model had seventeen predictors and an AUC of 0.91 (95% CI: 0.89-0.93). When tested on the validation cohort, the AUC for DOT was 0.89 (0.86-0.92), while that of the Dawson score was 0.77 (0.73-0.81). The sensitivity and specificity of the DOT score were 89% (CI: 84%-93%) and 76% (70%-81%) respectively, while those of the Dawson score were 83% (78%-88%) and 51% (45%-57%). Other diagnostic accuracy measures (DOT vs. Dawson) include positive predictive values (75% vs. 58%), negative predictive values (89% vs. 79%), positive likelihood ratios (3.67 vs. 1.70) and negative likelihood ratios (0.15 vs. 0.32). The DOT score shows promise as a diagnostic tool for TIA and requires independent external validation before it can be widely used. It could potentially improve the triage of patients assessed for suspected TIA.
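
    The derivation-and-evaluation pattern described here (logistic model, ROC/AUC, cutpoint-based sensitivity and specificity) can be sketched with synthetic data; this is not the DOT dataset, and the predictors and coefficients below are invented:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      # Synthetic cohort: five invented predictors, outcome drawn from a known logistic model.
      rng = np.random.default_rng(4)
      n = 800
      X = rng.normal(size=(n, 5))
      logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
      y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression().fit(X, y)
      prob = model.predict_proba(X)[:, 1]
      print(f"AUC = {roc_auc_score(y, prob):.2f}")
      cut = 0.5                                  # an arbitrary cutpoint for illustration
      pred = prob >= cut
      sens = (pred & (y == 1)).sum() / (y == 1).sum()
      spec = (~pred & (y == 0)).sum() / (y == 0).sum()
      print(f"sensitivity {sens:.2f}, specificity {spec:.2f} at cutpoint {cut}")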

  13. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    PubMed

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than lifetime prevalence. This study was designed to assess the usability of the DSM-IV-based Berlin Inventory of Gambling Behavior Screening tool in a clinical sample and to adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of the DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated a concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.

  14. Optimal Combination of Non-Invasive Tools for the Early Detection of Potentially Life-Threatening Emergencies in Gynecology

    PubMed Central

    Varas, Catalina; Ravit, Marion; Mimoun, Camille; Panel, Pierre; Huchon, Cyrille; Fauconnier, Arnaud

    2016-01-01

    Objectives: Potentially life-threatening gynecological emergencies (G-PLEs) are acute pelvic conditions that may spontaneously evolve into a life-threatening situation, or those for which there is a risk of sequelae or death in the absence of prompt diagnosis and treatment. The objective of this study was to identify the best combination of non-invasive diagnostic tools to ensure an accurate diagnosis and timely response when faced with G-PLEs for patients arriving with acute pelvic pain at the Gynecological Emergency Department (ED). Methods: The data on non-invasive diagnostic tools were sourced from the records of patients presenting at the ED of two hospitals in the Parisian suburbs (France) with acute pelvic pain between September 2006 and April 2008. The medical history of the patients was obtained through a standardized questionnaire completed for a prospective observational study, and missing information was completed with data sourced from the medical forms. Diagnostic tool categories were predefined as collections of signs or symptoms. We analyzed the association of each sign/symptom with G-PLEs using Pearson's chi-square or Fisher's exact tests. Symptoms and signs associated with G-PLEs (p-value < 0.20) were subjected to logistic regression to evaluate the diagnostic value of each of the predefined diagnostic tools, alone and in various combinations. Results: The data of 365 patients with acute pelvic pain were analyzed, of whom 103 were confirmed to have a PLE. We analyzed five diagnostic tools by logistic regression: Triage Process, History-Taking, Physical Examination, Ultrasonography, and Biological Exams. The combination of History-Taking and Ultrasonography had a C-index of 0.83, the highest for a model combining two tools. Conclusions: The use of a standardized self-assessment questionnaire for history-taking and focal ultrasound examination were found to be the most successful tool combination for the diagnosis of gynecological emergencies in a Gynecological ED. Additional tools, such as physical examination, do not add substantial diagnostic value. PMID:27583697

  15. Incidence of vertebral hemangioma on spinal magnetic resonance imaging in Northern Iran.

    PubMed

    Barzin, M; Maleki, I

    2009-03-15

    The incidence of vertebral hemangiomas, the most common benign spinal neoplasms, has been variously reported at 10 to 27% based on autopsy series, plain X-rays and MRI reviews. In this study, we reviewed 782 consecutive standard spinal MRI examinations with axial and sagittal T1-weighted and T2-weighted images, looking for hemangiomas. The incidence of hemangioma was 26.9%; hemangiomas were more common in females (30%) than males (23%), in older age groups, and in the lumbar spine. Most hemangiomas (65%) were less than 10 mm in diameter. Multiple hemangiomas were seen in 33% of cases. The results of this study are similar to another Mediterranean study based on MRI findings, but differ from reports using X-ray or autopsy as the diagnostic tool, suggesting the influence of either race or the sensitivity of the diagnostic tool on the reported incidence of vertebral hemangioma.

  16. The Environmental Self-Audit for Campus-Based Organizations: A Quick and Easy Guide to Environmental Compliance.

    ERIC Educational Resources Information Center

    New York State Dept. of Environmental Conservation, Albany.

    This guide is intended to help public and not-for-profit campus-based organizations in New York State to comply with local, state, and federal environmental regulations. The environmental self-audit serves as a basic diagnostic tool for campus-based organizations (centralized schools, colleges/universities, correctional facilities, mental health…

  17. Diagnostic ultrasound at MACH 20: retroperitoneal and pelvic imaging in space.

    PubMed

    Jones, J A; Sargsyan, A E; Barr, Y R; Melton, S; Hamilton, D R; Dulchavsky, S A; Whitson, P A

    2009-07-01

    An operationally available diagnostic imaging capability augments spaceflight medical support by facilitating the diagnosis, monitoring and treatment of medical or surgical conditions, improving medical outcomes and thereby lowering medical mission impacts and the probability of crew evacuation due to medical causes. Microgravity-related physiological changes occurring during spaceflight can affect the genitourinary system and potentially cause conditions, such as urinary retention or nephrolithiasis, for which ultrasonography (U/S) would be a useful diagnostic tool. This study describes the first genitourinary ultrasound examination conducted in space, and evaluates image quality, frame rate, resolution requirements, real-time remote guidance of nonphysician crew medical officers, and on-orbit tools that can augment image acquisition. A nonphysician crew medical officer (CMO) astronaut, with minimal training in U/S, performed a self-examination of the genitourinary system onboard the International Space Station, using a Philips/ATL Model HDI-5000 ultrasound imaging unit located in the International Space Station Human Research Facility. The CMO was remotely guided by voice commands from experienced, Earth-based sonographers stationed in the Mission Control Center in Houston. The crewmember, with guidance, was able to acquire all of the target images. Real-time and still U/S images received at the Mission Control Center in Houston were of sufficient quality to be diagnostic for multiple potential genitourinary applications. Microgravity-based ultrasound imaging can provide diagnostic-quality images of the retroperitoneum and pelvis, offering improved diagnosis and treatment for onboard medical contingencies. Successful completion of complex sonographic examinations can be achieved even with minimally trained nonphysician ultrasound operators, with the assistance of ground-based real-time guidance.

  18. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

    An Electron Gamma Shower version 4 (EGS4)-based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10⁷) than the calculation of dose profiles (1 × 10⁹). The EGS4 code was able to satisfactorily predict, and thereby provide an instrument for reducing, patient and staff effective dose imparted during radiological investigations.

  19. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

    An Electron Gamma Shower version 4 (EGS4)-based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors less than 10% required fewer photon histories (1 × 10⁷) than the calculation of dose profiles (1 × 10⁹). The EGS4 code was able to satisfactorily predict, and thereby provide an instrument for reducing, patient and staff effective dose imparted during radiological investigations.

  20. WFIRST: Data/Instrument Simulation Support at IPAC

    NASA Astrophysics Data System (ADS)

    Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

    As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies with WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations and sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

  1. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    NASA Astrophysics Data System (ADS)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2015-03-01

    We perform a land surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies between six modern stand-alone land surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99-135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the best current observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1-128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and the surface frost index), while permafrost distribution using the air temperature based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations, likely related to soil texture specification and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperature and its seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in permafrost distribution can be made for the Tibetan Plateau.
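
    As an illustration of the air-temperature-based diagnostics, an air frost number in the spirit of Nelson and Outcalt (1987) can be computed from monthly climatology, with values above 0.5 conventionally taken to indicate permafrost. A sketch with invented monthly temperatures (not data from the study):

      import numpy as np

      # Air frost number: F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)); F > 0.5 suggests permafrost.
      t_monthly = np.array([-18, -15, -9, -2, 3, 7, 9, 8, 4, -4, -12, -16], float)  # invented deg C
      days = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31], float)

      freezing = t_monthly < 0
      ddf = -(t_monthly[freezing] * days[freezing]).sum()    # freezing degree-days
      ddt = (t_monthly[~freezing] * days[~freezing]).sum()   # thawing degree-days
      f = np.sqrt(ddf) / (np.sqrt(ddf) + np.sqrt(ddt))
      print(f"frost number F = {f:.2f} ->", "permafrost likely" if f > 0.5 else "no permafrost")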

  2. Linking 1D coastal ocean modelling to environmental management: an ensemble approach

    NASA Astrophysics Data System (ADS)

    Mussap, Giulia; Zavatarelli, Marco; Pinardi, Nadia

    2017-12-01

    The use of a one-dimensional interdisciplinary numerical model of the coastal ocean as a tool contributing to the formulation of ecosystem-based management (EBM) is explored. The focus is on the definition of an experimental design based on ensemble simulations, integrating variability linked to scenarios (characterised by changes in the system forcing) and to the concurrent variation of selected, and poorly constrained, model parameters. The modelling system used was specifically designed for use in "data-rich" areas, so that horizontal dynamics can be resolved by a diagnostic approach and external inputs can be parameterised by properly calibrated nudging schemes. Ensembles determined by changes in the simulated environmental (physical and biogeochemical) dynamics, under joint variation of forcing and parameterisation, highlight the uncertainties associated with the application of specific scenarios that are relevant to EBM, providing an assessment of the reliability of the predicted changes. The work was carried out by implementing the coupled modelling system BFM-POM1D in an area of the Gulf of Trieste (northern Adriatic Sea) considered homogeneous from the point of view of hydrological properties, and forcing it with changing climatic (warming) and anthropogenic (reduction of the land-based nutrient input) pressures. Model parameters affected by considerable uncertainty (due to the lack of relevant observations) were varied jointly with the scenarios of change. The resulting large set of ensemble simulations provided a general estimate of the model uncertainties related to the joint variation of pressures and model parameters. The variability of the model results is intended to convey, efficiently and comprehensibly, information on the uncertainty and reliability of the model results to non-technical EBM planners and stakeholders, so that model-based information can contribute effectively to EBM.

  3. Mast cell activation test in the diagnosis of allergic disease and anaphylaxis.

    PubMed

    Bahri, Rajia; Custovic, Adnan; Korosec, Peter; Tsoumani, Marina; Barron, Martin; Wu, Jiakai; Sayers, Rebekah; Weimann, Alf; Ruiz-Garcia, Monica; Patel, Nandinee; Robb, Abigail; Shamji, Mohamed H; Fontanella, Sara; Silar, Mira; Mills, E N Clare; Simpson, Angela; Turner, Paul J; Bulfone-Paus, Silvia

    2018-03-05

    Food allergy is an increasing public health issue and the most common cause of life-threatening anaphylactic reactions. Conventional allergy tests assess for the presence of allergen-specific IgE, significantly overestimating the rate of true clinical allergy and resulting in overdiagnosis and adverse effects on health-related quality of life. We undertook initial validation and assessment of a novel diagnostic tool, the mast cell activation test (MAT). Primary human blood-derived mast cells (MCs) were generated from peripheral blood precursors, sensitized with patients' sera, and then incubated with allergen. MC degranulation was assessed by means of flow cytometry and mediator release. We compared the diagnostic performance of the MAT with that of existing diagnostic tools in a cohort of peanut-sensitized subjects undergoing double-blind, placebo-controlled challenge. Human blood-derived MCs sensitized with sera from patients with peanut, grass pollen, and Hymenoptera (wasp venom) allergy demonstrated allergen-specific and dose-dependent degranulation, as determined based on both expression of surface activation markers (CD63 and CD107a) and functional assays (prostaglandin D2 and β-hexosaminidase release). In this cohort of peanut-sensitized subjects, the MAT was found to have superior discrimination performance compared with other testing modalities, including component-resolved diagnostics and basophil activation tests. Using functional principal component analysis, we identified 5 clusters or patterns of reactivity in the resulting dose-response curves, which at preliminary analysis corresponded to the reaction phenotypes seen at challenge. The MAT is a robust tool that can confer superior diagnostic performance compared with existing allergy diagnostics and might be useful to explore differences in effector cell function between basophils and MCs during allergic reactions. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Methodology for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1990-01-01

    Applying ITS technology to shuttle diagnostics would not require the rigor of the Petri net representation; however, to provide the animated simulation portion of the interface and to meet the demands placed on the system by its training aspects, it is important to have a homogeneous and consistent underlying knowledge representation. By keeping the diagnostic rule base, the hardware description, the software description, user profiles, desired behavioral knowledge, and the user interface in the same notation, it is possible to reason about all of the properties of Petri nets on any selected portion of the simulation. This reasoning provides the foundation for utilization of intelligent tutoring systems technology.

  5. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high-resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, and computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for the basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time, physically based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically based modeling tools (Arachi Inc.) provide real-time collision detection, rigid-body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time, physically based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications, including CAD development, procedure design and simulation of human-system interaction in a desktop-sized work volume.

  6. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into Java-based software tools. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
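
    The core bootstrap idea behind such e-tools fits in a few lines: resample the observed data with replacement and recompute the statistic of interest many times. A minimal sketch with synthetic measurements (the original tools are Java-based; Python is used here for consistency with the rest of this document):

      import numpy as np

      rng = np.random.default_rng(5)
      sample = rng.normal(10.0, 2.0, 30)        # one simulated set of lab measurements

      # Bootstrap: resample with replacement and recompute the statistic many times.
      boot_means = np.array([rng.choice(sample, sample.size, replace=True).mean()
                             for _ in range(5000)])
      lo, hi = np.percentile(boot_means, [2.5, 97.5])
      print(f"sample mean = {sample.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")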

  7. Use of the Chemical Transformation Simulator as a Parameterization Tool for Modeling the Environmental Fate of Organic Chemicals and their Transformation Products

    EPA Science Inventory

    A Chemical Transformation Simulator is a web-based system for predicting transformation pathways and physicochemical properties of organic chemicals. Role in environmental modeling:
    • Screening tool for identifying likely transformation products in the environment
    • Parameteri…

  8. A simulation based method to assess inversion algorithms for transverse relaxation data

    NASA Astrophysics Data System (ADS)

    Ghosh, Supriyo; Keener, Kevin M.; Pan, Yong

    2008-04-01

    NMR relaxometry is a very useful tool for understanding various chemical and physical phenomena in complex multiphase systems. A Carr-Purcell-Meiboom-Gill (CPMG) [P.T. Callaghan, Principles of Nuclear Magnetic Resonance Microscopy, Clarendon Press, Oxford, 1991] experiment is an easy and quick way to obtain the transverse relaxation constant (T2) at low field. Most samples have a distribution of T2 values. Extraction of this distribution from the noisy decay data is essentially an ill-posed inverse problem. Various inversion approaches have been used to solve this problem to date. A major issue in using an inversion algorithm is determining how accurate the computed distribution is. A systematic analysis of an inversion algorithm, UPEN [G.C. Borgia, R.J.S. Brown, P. Fantazzini, Uniform-penalty inversion of multiexponential decay data, Journal of Magnetic Resonance 132 (1998) 65-77; G.C. Borgia, R.J.S. Brown, P. Fantazzini, Uniform-penalty inversion of multiexponential decay data II. Data spacing, T2 data, systematic data errors, and diagnostics, Journal of Magnetic Resonance 147 (2000) 273-285], was performed by means of simulated CPMG data generation. Through our simulation technique and statistical analyses, the effects of various experimental parameters on the computed distribution were evaluated. We converged to the true distribution by matching the inversion results from a series of true decay data and noisy simulated data. In addition to simulation studies, the same approach was also applied to real experimental data to support the simulation results.
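
    The simulation-based assessment itself is straightforward to sketch: synthesise a decay from a known T2 distribution through the exponential kernel, add noise, invert, and compare with the known truth. The sketch below uses plain non-negative least squares as a stand-in for UPEN's uniform-penalty inversion, and all numbers are invented:

      import numpy as np
      from scipy.optimize import nnls

      # Synthesise a noisy CPMG decay from a known bimodal T2 distribution.
      t = np.linspace(0.001, 1.0, 200)              # echo times (s)
      t2_grid = np.logspace(-3, 0, 60)              # candidate T2 values (s)
      true_f = (np.exp(-0.5 * ((np.log10(t2_grid) + 2.0) / 0.1) ** 2) +
                np.exp(-0.5 * ((np.log10(t2_grid) + 0.7) / 0.1) ** 2))
      K = np.exp(-t[:, None] / t2_grid[None, :])    # multiexponential kernel
      decay = K @ true_f
      rng = np.random.default_rng(6)
      noisy = decay + rng.normal(0, 0.01 * decay[0], t.size)

      # Plain NNLS inversion: without a smoothness penalty the recovered
      # distribution is spiky, which is exactly what UPEN's uniform penalty addresses.
      f_hat, rnorm = nnls(K, noisy)
      print(f"residual norm = {rnorm:.3f}, nonzero T2 bins = {np.count_nonzero(f_hat)} of {t2_grid.size}")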

  9. Diagnostic Tools for the Systemic Reform of Schools.

    ERIC Educational Resources Information Center

    Amsler, Mary; Kirsch, Kayla

    This paper presents three interrelated diagnostic tools that can be used by school staff as they begin to plan a systemic reform effort. These tools are designed to help educators reflect on their experiences in creating changes in their school and to examine the current barriers to and supports for the change process. The tools help school design…

  10. Measurement properties of screening and diagnostic tools for autism spectrum adults of mean normal intelligence: A systematic review.

    PubMed

    Baghdadli, A; Russet, F; Mottron, L

    2017-07-01

    The autism spectrum (AS) is a multifaceted neurodevelopmental variant associated with lifelong challenges. Despite the relevant importance of identifying AS in adults for epidemiological, public health, and quality of life issues, the measurement properties of the tools currently used to screen and diagnose adults without intellectual disabilities (ID) have not been assessed. This systematic review addresses the accuracy, reliability, and validity of the reported AS screening and diagnostic tools used in adults without ID. Electronic databases and bibliographies were searched, and identified papers evaluated against inclusion criteria. The PRISMA statement was used for reporting the review. We evaluated the quality of the papers using the COSMIN Checklist for psychometric data, and QUADAS-2 for diagnostic data. For the COSMIN assessment, evidence was considered to be strong when several methodologically good articles, or one excellent article, reported consistent evidence for or against a measurement property. For the QUADAS ratings, evidence was considered to be "satisfactory" if at least one study was rated with a low risk of bias and low concern about applicability. We included 38 articles comprising 32 studies, five reviews, and one book chapter and assessed nine tools (three diagnostic and six screening, including eight of their short versions). Among screening tools, only AQ-50, AQ-S, and RAADS-R and RAADS-14 were found to provide satisfactory or intermediate values for their psychometric properties, supported by strong or moderate evidence. Nevertheless, risks of bias and concerns on the applicability of these tools limit the evidence on their diagnostic properties. We found that none of the gold standard diagnostic tools used for children had satisfactory measurement properties. There is limited evidence for the measurement properties of the screening and diagnostic tools used for AS adults with a mean normal range of measured intelligence. This may lessen the validity of conclusions and public health decisions on an important fraction of the adult autistic population. This not only justifies further validation studies of screening and diagnostic tools for autistic adults, but also supports the parallel use of self-reported information and clinical expertise with these instruments during the diagnostic process. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  11. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs intended for a wide range of applications, from automotive to marine and airplane APUs.
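
    A control-oriented cell model of the kind surveyed is typically a lumped (0-D) voltage map: an open-circuit Nernst potential minus ohmic and activation losses as functions of current density, temperature, and gas composition. Below is a minimal sketch of that standard form; every numerical parameter is an assumption for illustration, not a value from the paper.

    ```python
    # Lumped (0-D) control-oriented SOFC cell-voltage model: Nernst potential
    # minus ohmic and activation losses. All parameter values are assumed.
    import numpy as np

    F, R = 96485.0, 8.314   # Faraday constant (C/mol), gas constant (J/mol/K)

    def cell_voltage(j, T=1073.0, pH2=0.95, pO2=0.21, pH2O=0.05,
                     E0=1.0, asr=3e-5, j0=2000.0):
        """j: current density (A/m^2); T (K); partial pressures (atm);
        E0: reference potential (V); asr: area-specific resistance (ohm*m^2);
        j0: exchange current density (A/m^2). All defaults are illustrative."""
        nernst = E0 + (R * T / (2 * F)) * np.log(pH2 * np.sqrt(pO2) / pH2O)
        ohmic = asr * j
        activation = (R * T / F) * np.arcsinh(j / (2 * j0))  # Butler-Volmer approx.
        return nernst - ohmic - activation

    # Polarization curve of the kind used for component sizing and controller maps
    j = np.linspace(0.0, 8000.0, 50)
    V = cell_voltage(j)
    P = V * j   # areal power density (W/m^2)
    ```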

  12. The Managing Emergencies in Paediatric Anaesthesia global rating scale is a reliable tool for simulation-based assessment in pediatric anesthesia crisis management.

    PubMed

    Everett, Tobias C; Ng, Elaine; Power, Daniel; Marsh, Christopher; Tolchard, Stephen; Shadrina, Anna; Bould, Matthew D

    2013-12-01

    The use of simulation-based assessments for high-stakes physician examinations remains controversial. The Managing Emergencies in Paediatric Anaesthesia course uses simulation to teach evidence-based management of anesthesia crises to trainee anesthetists in the United Kingdom (UK) and Canada. In this study, we investigated the feasibility and reliability of custom-designed scenario-specific performance checklists and a global rating scale (GRS) assessing readiness for independent practice. After research ethics board approval, subjects were video-recorded managing simulated pediatric anesthesia crises in a single Canadian teaching hospital. Each subject was randomized to two of six different scenarios. All 60 scenarios were subsequently rated by four blinded raters (two in the UK, two in Canada) using the checklists and the GRS. The actual and predicted reliability of the tools was calculated for different numbers of raters using the intraclass correlation coefficient (ICC) and the Spearman-Brown prophecy formula. Average-measures ICCs ranged from 'substantial' to 'near perfect' (P ≤ 0.001). The reliability of the checklists and the GRS was similar. Single-measures ICCs showed more variability than average-measures ICCs. At least two raters would be required to achieve acceptable reliability. We have established the reliability of a GRS to assess the management of simulated crisis scenarios in pediatric anesthesia, and this tool is feasible within the setting of a research study. The global rating scale allows raters to make a judgement regarding a participant's readiness for independent practice. These tools may be used in future research examining simulation-based assessment. © 2013 John Wiley & Sons Ltd.
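
    The two reliability statistics named above are standard and easy to reproduce: a two-way random-effects ICC estimated from a subjects-by-raters score matrix (Shrout-Fleiss ICC(2,1) and ICC(2,k)), and the Spearman-Brown prophecy formula rho_m = m*rho / (1 + (m - 1)*rho) for the predicted reliability with m raters. A sketch with a fabricated score matrix:

    ```python
    # Two-way random-effects ICC (single and average measures) and the
    # Spearman-Brown prophecy formula. The score matrix is fabricated.
    import numpy as np

    def icc_two_way(scores):
        """scores: (n_subjects, k_raters). Returns ICC(2,1) and ICC(2,k)."""
        n, k = scores.shape
        grand = scores.mean()
        ms_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_rater = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)
        resid = (scores - scores.mean(axis=1, keepdims=True)
                        - scores.mean(axis=0, keepdims=True) + grand)
        ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
        single = (ms_subj - ms_err) / (
            ms_subj + (k - 1) * ms_err + k * (ms_rater - ms_err) / n)
        average = (ms_subj - ms_err) / (ms_subj + (ms_rater - ms_err) / n)
        return single, average

    def spearman_brown(rho, m):
        """Predicted reliability when scores from m raters are averaged."""
        return m * rho / (1 + (m - 1) * rho)

    scores = np.array([[7, 8, 6, 7], [5, 5, 4, 6], [9, 9, 8, 9],
                       [4, 5, 5, 4], [6, 7, 6, 6]], dtype=float)
    single, average = icc_two_way(scores)               # 5 scenarios x 4 raters
    print(single, average, spearman_brown(single, 2))   # predicted for 2 raters
    ```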

  13. A web-based rapid assessment tool for production publishing solutions

    NASA Astrophysics Data System (ADS)

    Sun, Tong

    2010-02-01

    Solution assessment is a critical first step in understanding and measuring the business-process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually an expensive and time-consuming task that requires substantial domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report of estimated performance metrics (e.g., throughput, turnaround time, resource utilization) and operational cost. By integrating a digital publishing workflow ontology and an activity-based costing model with a Petri-net-based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side by side within their desired operational contexts, and provides a low-cost and rapid assessment for organizations before they commit to any purchase. The tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with proven quantitative simulation and estimation technology.
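
    At its core, a Petri-net simulation engine is places holding tokens plus transitions that fire when all their input places are marked. The sketch below is a deliberately minimal executor running a made-up two-step print-then-bind job; it is not the paper's engine, which layers timing, costing, and the workflow ontology on top.

    ```python
    # Minimal Petri-net workflow executor: a transition fires when every one
    # of its input places holds a token. The workflow itself is invented.
    import random

    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)      # place -> token count
            self.transitions = []             # (name, inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions.append((name, inputs, outputs))

        def enabled(self):
            return [t for t in self.transitions
                    if all(self.marking.get(p, 0) >= 1 for p in t[1])]

        def step(self):
            ready = self.enabled()
            if not ready:
                return None                   # deadlock or workflow finished
            name, inputs, outputs = random.choice(ready)  # nondeterministic firing
            for p in inputs:
                self.marking[p] -= 1
            for p in outputs:
                self.marking[p] = self.marking.get(p, 0) + 1
            return name

    # Three queued jobs contend for one press, then move to binding.
    net = PetriNet({"queued": 3, "press_free": 1})
    net.add_transition("print", ["queued", "press_free"], ["printed", "press_free"])
    net.add_transition("bind", ["printed"], ["done"])
    while (fired := net.step()):
        print(fired, net.marking)
    ```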

  14. Fundamentals of Arthroscopic Surgery Training Program Improves Knee Arthroscopy Simulator Performance in Arthroscopic Trainees.

    PubMed

    Cychosz, Chris C; Tofte, Josef N; Johnson, Alyssa; Gao, Yubo; Phisitkul, Phinit

    2018-05-01

    To determine the effectiveness of a nonanatomic simulator in developing basic arthroscopy motor skills transferable to an anatomic model. Forty-three arthroscopy-novice medical students were recruited to perform a diagnostic knee arthroscopy using a high-fidelity virtual reality arthroscopic simulator providing haptic feedback, after viewing a video of an expert performing an identical procedure. Students were then randomized into an experimental or control group. The experimental group completed a series of self-guided training modules using the Fundamentals of Arthroscopic Surgery Training nonanatomic modules, including camera centering, tracking, periscoping, palpation, and collecting stars in a three-dimensional space. Both groups completed another diagnostic knee arthroscopy between 1 and 2 weeks later. Camera path length, time, tibial and femoral cartilage damage, and a composite score were recorded by the simulator on each attempt. The experimental group (n = 22) showed superior performance in composite score (30.09 vs 24, P = .046) and camera path length (71.51 cm vs 109.07 cm, P = .0274) on the second diagnostic knee arthroscopy compared with the control group (n = 21). The experimental group also showed significantly greater improvement in composite score between the first and second arthroscopies than the control group (14.27 vs 4.95, P < .01). Femoral and tibial cartilage damage were not significantly improved between arthroscopy attempts (-0.86% vs -1.45%, P = .40, and -1.10% vs -1.27%, P = .83, respectively). The virtual reality-based nonanatomic simulator training is beneficial in developing basic motor skills in arthroscopy novices, resulting in significantly greater composite performance in an anatomic knee model. Based on the results of this study, nonanatomic simulators in general may have a place in an arthroscopy training program. Level II, randomized trial. Published by Elsevier Inc.

  15. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. APEX will then interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information access through the PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described, along with a corresponding troubleshooting technique.
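
    Diagnostic expert systems of this kind typically forward-chain IF-THEN rules over telemetry-derived facts until no new conclusion fires. A minimal sketch of that inference loop; the fact symbols and rules below are invented for illustration and are not drawn from the APEX knowledge base.

    ```python
    # Tiny forward-chaining rule engine for fault detection and isolation.
    # Facts and rules are hypothetical placeholders, not APEX content.
    facts = {"bus_voltage_low", "switchgear_temp_normal"}

    rules = [
        ({"bus_voltage_low", "switchgear_temp_normal"}, "suspect_source_converter"),
        ({"bus_voltage_low", "switchgear_temp_high"}, "suspect_switchgear_fault"),
        ({"suspect_source_converter"}, "isolate_and_reroute_load"),
    ]

    changed = True
    while changed:                        # chain rules to a fixed point
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)     # fire the rule, assert its conclusion
                changed = True
    print(facts)
    ```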

  16. Intracranial hemorrhage alters scalp potential distribution in bioimpedance cerebral monitoring: Preliminary results from FEM simulation on a realistic head model and human subjects

    PubMed Central

    Atefi, Seyed Reza; Seoane, Fernando; Kamalian, Shervin; Rosenthal, Eric S.; Lev, Michael H.; Bonmassar, Giorgio

    2016-01-01

    Purpose: Current diagnostic neuroimaging for detection of intracranial hemorrhage (ICH) is limited to fixed scanners requiring patient transport and extensive infrastructure support. ICH diagnosis would therefore benefit from a portable diagnostic technology, such as electrical bioimpedance (EBI). Through simulations and patient observation, the authors assessed the influence of unilateral ICH hematomas on quasisymmetric scalp potential distributions in order to establish the feasibility of EBI technology as a potential tool for early diagnosis. Methods: Finite element method (FEM) simulations and experimental left–right hemispheric scalp potential differences of healthy and damaged brains were compared with respect to the asymmetry caused by ICH lesions on quasisymmetric scalp potential distributions. In the numerical simulations, this asymmetry was measured at 25 kHz and visualized on the scalp as the normalized potential difference between the healthy and ICH-damaged models. Proof-of-concept simulations were extended in a pilot study of experimental scalp potential measurements recorded between 0 and 50 kHz with the authors’ custom-made bioimpedance spectrometer. Mean left–right scalp potential differences recorded from the frontal, central, and parietal brain regions of ten healthy controls and six patients with acute/subacute ICH were compared. The observed differences were tested at the 5% level of significance using the two-sample Welch t-test. Results: The 3D anatomically accurate FEM simulations showed that the normalized scalp potential difference between the damaged and healthy brain models is zero everywhere on the head surface, except in the vicinity of the lesion, where it can vary up to 5%. The authors’ preliminary experimental results also confirmed that the left–right scalp potential difference in patients with ICH (e.g., 64 mV) is significantly larger than in healthy subjects (e.g., 20.8 mV; P < 0.05). Conclusions: Realistic, proof-of-concept simulations confirmed that ICH affects quasisymmetric scalp potential distributions. Pilot clinical observations with the authors’ custom-made bioimpedance spectrometer also showed higher left–right potential differences in the presence of ICH, similar to those of the simulations, which may help to distinguish healthy subjects from ICH patients. Although these pilot clinical observations agree with the computer simulations, the small sample size of this study lacks the statistical power to exclude the influence of other possible confounders such as age, sex, and electrode positioning. The agreement with previously published simulation-based and clinical results, however, suggests that EBI technology may be useful for ICH detection. PMID:26843231
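
    Welch's t-test is the unequal-variance form of the two-sample t-test, a reasonable choice for the small, unbalanced groups here. A sketch with fabricated measurement vectors, chosen only so the group means echo the reported 20.8 and 64 mV:

    ```python
    # Two-sample Welch t-test (unequal variances) on left-right scalp
    # potential differences. Both data vectors are fabricated; only their
    # means mirror the values reported in the abstract.
    import numpy as np
    from scipy import stats

    healthy = np.array([18.5, 22.0, 19.4, 21.2, 20.9,
                        23.1, 19.8, 20.3, 21.5, 21.3])     # mV, made up
    ich = np.array([61.0, 70.2, 58.7, 66.4, 63.9, 63.8])   # mV, made up

    t_stat, p_value = stats.ttest_ind(healthy, ich, equal_var=False)  # Welch
    print(t_stat, p_value, p_value < 0.05)
    ```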

  17. DNA Barcoding of Recently Diverged Species: Relative Performance of Matching Methods

    PubMed Central

    van Velzen, Robin; Weitschek, Emanuel; Felici, Giovanni; Bakker, Freek T.

    2012-01-01

    Recently diverged species are challenging to identify, yet they are frequently of special interest scientifically as well as from a regulatory perspective. DNA barcoding has proven instrumental in species identification, especially in insects and vertebrates, but for recently diverged species it has been reported to be problematic in some cases. Problems are mostly due to incomplete lineage sorting or simply lack of a ‘barcode gap’, and are probably related to large effective population size and/or low mutation rate. Our objective was to compare six methods in their ability to correctly identify recently diverged species with DNA barcodes: neighbor joining and parsimony (both tree-based), nearest neighbor and BLAST (similarity-based), and the diagnostic methods DNA-BAR and BLOG. We analyzed simulated data assuming three different effective population sizes, as well as three selected empirical data sets from published studies. Results show, as expected, that success rates are significantly lower for recently diverged species (∼75%) than for older species (∼97%) (P<0.00001). Similarity-based and diagnostic methods significantly outperform tree-based methods when applied to simulated DNA barcode data (P<0.00001). The diagnostic method BLOG had the highest correct query identification rate based on simulated (86.2%) as well as empirical data (93.1%), indicating that it is a consistently better method overall. Another advantage of BLOG is that it offers species-level information that can be used outside the realm of DNA barcoding, for instance in species descriptions or molecular detection assays. Even though we can confirm that identification success based on DNA barcoding is generally high in our data, recently diverged species remain difficult to identify. Nevertheless, our results contribute to improved solutions for their accurate identification. PMID:22272356
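
    Of the methods compared, nearest-neighbor matching is the simplest to state: assign a query to the species of its closest reference barcode under a sequence distance such as uncorrected p-distance. A toy sketch with invented, pre-aligned sequences (real barcodes are hundreds of bases and require alignment first):

    ```python
    # Nearest-neighbor barcode matching under uncorrected p-distance.
    # Reference sequences and species labels are toy examples.
    def p_distance(a, b):
        """Proportion of differing sites between two aligned sequences."""
        return sum(x != y for x, y in zip(a, b)) / len(a)

    references = {
        "ACGTACGTGG": "species_A",
        "ACGTACGTGA": "species_A",
        "ACGTTCGAGA": "species_B",
    }

    def identify(query):
        best = min(references, key=lambda ref: p_distance(query, ref))
        return references[best], p_distance(query, best)

    print(identify("ACGTACGAGG"))   # -> ('species_A', 0.1)
    ```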

  18. Rapid and sensitive detection of Yersinia pestis using amplification of plague diagnostic bacteriophages monitored by real-time PCR.

    PubMed

    Sergueev, Kirill V; He, Yunxiu; Borschel, Richard H; Nikolich, Mikeljon P; Filippov, Andrey A

    2010-06-28

    Yersinia pestis, the agent of plague, has caused many millions of human deaths and still poses a serious threat to global public health. Timely and reliable detection of such a dangerous pathogen is of critical importance. Lysis by specific bacteriophages remains an essential method of Y. pestis detection and plague diagnostics. The objective of this work was to develop an alternative to conventional phage lysis tests: a rapid and highly sensitive method of indirect detection of live Y. pestis cells based on quantitative real-time PCR (qPCR) monitoring of the amplification of reporter Y. pestis-specific bacteriophages. The plague diagnostic phages phiA1122 and L-413C were shown to be highly effective diagnostic tools for the detection and identification of Y. pestis by qPCR with primers specific for phage DNA. The template DNA extraction step that usually precedes qPCR was omitted. phiA1122-specific qPCR enabled the detection of an initial bacterial concentration of 10^3 CFU/ml (equivalent to as few as one Y. pestis cell per 1-μl sample) in four hours. L-413C-mediated detection of Y. pestis was less sensitive (up to 100 bacteria per sample) but more specific, and thus we propose parallel qPCR for the two phages as a rapid and reliable method of Y. pestis identification. Importantly, phiA1122 propagated in simulated clinical blood specimens containing EDTA, and its titer rise was detected by both a standard plating test and qPCR. Thus, we developed a novel assay for the detection and identification of Y. pestis using amplification of specific phages monitored by qPCR. The method is simple, rapid, highly sensitive, and specific, and allows the detection of only live bacteria.
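
    The readout of such an assay is a falling cycle threshold (Ct) as phage DNA multiplies in the presence of live host cells. Converting Ct values to copy numbers via a log-linear standard curve is generic qPCR arithmetic, sketched below with made-up Ct values; the paper's primers and thermal profile are not reproduced.

    ```python
    # Quantify reporter-phage amplification from qPCR Ct values using a
    # log-linear standard curve. All Ct numbers are invented.
    import numpy as np

    log10_copies = np.array([3, 4, 5, 6, 7], dtype=float)     # standards
    ct_standards = np.array([30.1, 26.8, 23.4, 20.0, 16.7])   # made-up Ct values

    slope, intercept = np.polyfit(log10_copies, ct_standards, 1)
    efficiency = 10 ** (-1 / slope) - 1       # per-cycle amplification efficiency

    def copies_from_ct(ct):
        return 10 ** ((ct - intercept) / slope)

    # A falling Ct between time 0 and 4 h indicates phage amplification,
    # i.e., live Y. pestis in the sample.
    fold_rise = copies_from_ct(22.0) / copies_from_ct(28.0)
    print(efficiency, fold_rise)
    ```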

  19. Diagnostic accuracy of an identification tool for localized neuropathic pain based on the IASP criteria.

    PubMed

    Mayoral, Víctor; Pérez-Hernández, Concepción; Muro, Inmaculada; Leal, Ana; Villoria, Jesús; Esquivias, Ana

    2018-04-27

    Based on the clear neuroanatomical delineation of many neuropathic pain (NP) symptoms, a simple tool for performing a short structured clinical encounter based on the IASP diagnostic criteria was developed to identify NP. This study evaluated its accuracy and usefulness. A case-control study was performed in 19 pain clinics in Spain. A pain clinician used the experimental screening tool (the index test, IT) to assign the labels non-neuropathic (nNP), non-localized neuropathic (nLNP), and localized neuropathic (LNP) to the patients' pain conditions. The reference standard was a formal clinical diagnosis provided by another pain clinician. The accuracy of the IT was compared with that of the Douleur Neuropathique en 4 questions (DN4) and the Leeds Assessment of Neuropathic Symptoms and Signs (LANSS). Six hundred and sixty-six patients were analyzed. There was good agreement between the IT and the reference standard (kappa = 0.722). The IT was accurate in distinguishing between LNP and nLNP (83.2% sensitivity, 88.2% specificity), between LNP and the other pain categories (nLNP + nNP) (80.0% sensitivity, 90.7% specificity), and between NP and nNP (95.5% sensitivity, 89.1% specificity). The accuracy in distinguishing between NP and nNP was comparable with that of the DN4 and the LANSS. The IT took a median of 10 min to complete. A novel instrument based on an operationalization of the IASP criteria can not only discern between LNP and nLNP, but also provide a high level of diagnostic certainty about the presence of NP after a short clinical encounter.
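
    The accuracy figures quoted above reduce to 2x2-table arithmetic: sensitivity, specificity, and Cohen's kappa for agreement beyond chance. A sketch with fabricated counts; only the formulas, not the numbers, correspond to the study.

    ```python
    # Sensitivity, specificity, and Cohen's kappa from a 2x2 confusion table.
    # The counts passed in at the bottom are fabricated for illustration.
    def diagnostic_metrics(tp, fp, fn, tn):
        n = tp + fp + fn + tn
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        po = (tp + tn) / n                                  # observed agreement
        pe = ((tp + fp) * (tp + fn)
              + (fn + tn) * (fp + tn)) / n ** 2             # chance agreement
        kappa = (po - pe) / (1 - pe)
        return sensitivity, specificity, kappa

    print(diagnostic_metrics(tp=120, fp=18, fn=30, tn=140))
    ```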
