Sample records for diagnostic software tools

  1. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
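
    The sensor-loss studies described above can be illustrated with a minimal sketch: given a mapping from failure modes to the tests that detect them, and from tests to the sensors they rely on, detection coverage can be recomputed with each sensor removed in turn. The data structures and names below are hypothetical illustrations, not the ETA Tool's actual interfaces.

      # Hypothetical sketch of a sensor-loss sensitivity study; example data,
      # not the ETA Tool's actual model format.
      detects = {            # failure mode -> tests that detect it
          "FM1": {"t1", "t2"},
          "FM2": {"t2"},
          "FM3": {"t3"},
      }
      test_sensor = {"t1": "s1", "t2": "s2", "t3": "s2"}  # test -> sensor it needs

      def coverage(lost_sensor=None):
          """Fraction of failure modes still detected by at least one test."""
          alive = {t for t, s in test_sensor.items() if s != lost_sensor}
          return sum(bool(tests & alive) for tests in detects.values()) / len(detects)

      print("baseline detection coverage:", coverage())
      for sensor in sorted(set(test_sensor.values())):
          print(f"coverage without {sensor}:", round(coverage(sensor), 2))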

  2. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) Tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. The tool identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
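
    A rough sketch of the merit-function idea behind S4 (not its actual algorithms or interfaces): score candidate sensor subsets with a cost-penalized coverage merit and keep the best suite that fits a budget. An exhaustive search stands in here for the several optimization approaches the tool offers; all data are invented.

      # Hypothetical merit-function-based sensor suite selection, S4-style.
      from itertools import combinations

      faults_seen = {   # sensor -> fault conditions it helps diagnose (invented)
          "s1": {"f1", "f2"}, "s2": {"f2", "f3"}, "s3": {"f4"}, "s4": {"f1", "f4"},
      }
      cost = {"s1": 3.0, "s2": 2.0, "s3": 1.0, "s4": 2.5}
      budget = 5.0

      def merit(suite):
          """Toy merit function: fault coverage minus a small cost penalty."""
          covered = set().union(*(faults_seen[s] for s in suite))
          return len(covered) - 0.1 * sum(cost[s] for s in suite)

      candidates = [suite for r in range(1, len(cost) + 1)
                    for suite in combinations(sorted(cost), r)
                    if sum(cost[s] for s in suite) <= budget]
      best = max(candidates, key=merit)
      print("best sensor suite under budget:", best)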

  3. A Structural Health Monitoring Software Tool for Optimization, Diagnostics and Prognostics

    DTIC Science & Technology

    2011-01-01

    A Structural Health Monitoring Software Tool for Optimization, Diagnostics and Prognostics. Seth S. Kessler, Eric B. Flynn, Christopher T. ... technology more accessible, and commercially practical. 1. INTRODUCTION Currently successful laboratory non-destructive testing and monitoring...

  4. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics to enable the comparison of candidate aircraft engine gas path diagnostic methods. This MATLAB (The MathWorks, Inc.) based software tool enables users to independently develop and evaluate diagnostic methods. A set of blind test case data is also distributed as part of the software, enabling the side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES, and provides instructions for the installation and operation of the tool.
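
    The value of a common benchmark is that every method is scored on the same labeled cases with the same metrics, making results directly comparable. A minimal sketch of such side-by-side scoring follows; the labels and data are invented, not ProDiMES's actual metrics code.

      # Hypothetical common-metric scoring of one diagnostic method.
      truth    = ["fault", "nominal", "fault", "fault", "nominal", "fault"]
      method_a = ["fault", "nominal", "nominal", "fault", "fault", "fault"]

      def rates(predicted, actual):
          tp = sum(p == a == "fault" for p, a in zip(predicted, actual))
          fp = sum(p == "fault" and a == "nominal"
                   for p, a in zip(predicted, actual))
          # detection rate over true faults, false-alarm rate over nominals
          return tp / actual.count("fault"), fp / actual.count("nominal")

      det, fa = rates(method_a, truth)
      print(f"detection rate {det:.2f}, false-alarm rate {fa:.2f}")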

  5. A Genuine TEAM Player

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Qualtech Systems, Inc. developed a complete software system with capabilities for multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over where TEAMS-RT leaves off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.
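
    The good/bad/suspect labeling that TEAMS-RT performs can be sketched with a toy dependency model: components exercised by passing tests are exonerated, and the remaining components implicated by failing tests stay suspect. This is a simplified illustration of the reasoning style, not Qualtech's algorithm.

      # Hypothetical pass/fail reasoning over a test-to-component model.
      covers = {              # test -> components exercised by that test
          "t1": {"pump", "valve"},
          "t2": {"valve"},
          "t3": {"sensor"},
      }
      results = {"t1": "fail", "t2": "pass", "t3": "pass"}

      good = set().union(*(covers[t] for t, r in results.items() if r == "pass"))
      implicated = set().union(*(covers[t] for t, r in results.items()
                                 if r == "fail"))
      suspect = implicated - good        # implicated but not exonerated
      unknown = set().union(*covers.values()) - good - suspect
      print("good:", good, "suspect:", suspect, "unknown:", unknown)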

  6. Diagnostics Tools Identify Faults Prior to Failure

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Through the SBIR program, Rochester, New York-based Impact Technologies LLC collaborated with Ames Research Center to commercialize the Center's Hybrid Diagnostic Engine, or HyDE, software. The fault-detecting program is now incorporated into a software suite that identifies potential faults early in the design phase of systems ranging from printers to vehicles and robots, saving time and money.

  7. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  8. Analytical Utility of Mass Spectral Binning in Proteomic Experiments by SPectral Immonium Ion Detection (SPIID)*

    PubMed Central

    Kelstrup, Christian D.; Frese, Christian; Heck, Albert J. R.; Olsen, Jesper V.; Nielsen, Michael L.

    2014-01-01

    Unambiguous identification of tandem mass spectra is a cornerstone in mass-spectrometry-based proteomics. As the study of post-translational modifications (PTMs) by means of shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry, the so-called diagnostic ions, which unequivocally identify a given mass spectrum as related to a specific PTM. Although such ions offer tremendous analytical advantages, algorithms to decipher MS/MS spectra for the presence of diagnostic ions in an unbiased manner are currently lacking. Here, we present a systematic spectral-pattern-based approach for the discovery of diagnostic ions and new fragmentation mechanisms in shotgun proteomics datasets. The developed software tool is designed to analyze large sets of high-resolution peptide fragmentation spectra independent of the fragmentation method, instrument type, or protease employed. To benchmark the software tool, we analyzed large higher-energy collisional dissociation (HCD) datasets of samples containing phosphorylation, ubiquitylation, SUMOylation, formylation, and lysine acetylation. Using the developed software tool, we were able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Because the investigated tandem mass spectra data were acquired with high mass accuracy, unambiguous interpretation and determination of the chemical composition for the majority of detected fragment ions was feasible. Collectively, we present a freely available software tool that allows for comprehensive and automatic analysis of analogous product ions in tandem mass spectra and systematic mapping of fragmentation mechanisms related to common amino acids. PMID:24895383
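
    The histogram comparison described above can be sketched in a few lines: bin fragment m/z values across modified and unmodified spectra and flag bins enriched in the modified set. The spectra and threshold are invented; m/z 126.09 corresponds to a known acetyl-lysine diagnostic ion.

      # Hypothetical histogram-based diagnostic-ion discovery.
      from collections import Counter

      def bin_counts(spectra, width=0.01):
          """Count fragment ions per m/z bin across a set of spectra."""
          return Counter(round(mz / width) for spec in spectra for mz in spec)

      modified   = [[126.091, 245.1], [126.091, 310.2]]  # invented spectra
      unmodified = [[245.1, 310.2], [245.1]]

      mod, unmod = bin_counts(modified), bin_counts(unmodified)
      for b in mod:
          enrichment = mod[b] / len(modified) - unmod.get(b, 0) / len(unmodified)
          if enrichment > 0.5:   # arbitrary threshold for the sketch
              print(f"candidate diagnostic ion near m/z {b * 0.01:.2f}")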

  9. Diagnostic Analyzer for Gearboxes (DAG): User's Guide. Version 3.1 for Microsoft Windows 3.1

    NASA Technical Reports Server (NTRS)

    Jammu, Vinay B.; Kourosh, Danai

    1997-01-01

    This documentation describes the Diagnostic Analyzer for Gearboxes (DAG) software for performing fault diagnosis of gearboxes. First, the user constructs a graphical representation of the gearbox using the gear, bearing, shaft, and sensor tools contained in the DAG software. Next, a set of vibration features is required, obtained by processing the vibration signals recorded from the gearbox with a signal analyzer. Given this information, the DAG software uses an unsupervised neural network referred to as the Fault Detection Network (FDN) to identify the occurrence of faults, and a pattern classifier called the Single Category-Based Classifier (SCBC) for abnormality scaling of individual vibration features. The abnormality-scaled vibration features are then used as inputs to a Structure-Based Connectionist Network (SBCN) for identifying faults in gearbox subsystems and components. The weights of the SBCN represent its diagnostic knowledge and are derived from the structure of the gearbox graphically represented in DAG. The outputs of the SBCN are fault possibility values between 0 and 1 for individual subsystems and components in the gearbox, with 1 representing a definite fault and 0 representing normality. This manual describes the steps involved in creating the diagnostic gearbox model, along with the options and analysis tools of the DAG software.
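
    The structure-derived scoring can be sketched as follows: abnormality-scaled features (0 to 1) propagate to the components structurally connected to them, and each component's fault possibility is the strongest evidence it receives. This max-propagation rule is a guess at the flavor of the SBCN, not its published weighting scheme, and the model data are invented.

      # Hypothetical structure-based fault possibility scoring.
      structure = {   # feature -> components structurally connected to it
          "mesh_energy": ["gear", "shaft"],
          "band_kurtosis": ["bearing"],
      }
      abnormality = {"mesh_energy": 0.8, "band_kurtosis": 0.1}  # from scaling

      possibility = {}
      for feature, comps in structure.items():
          for comp in comps:
              # component possibility: max abnormality among its features
              possibility[comp] = max(possibility.get(comp, 0.0),
                                      abnormality[feature])

      print(possibility)   # e.g. {'gear': 0.8, 'shaft': 0.8, 'bearing': 0.1}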

  10. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well designed and thoroughly integrated application.

  11. Feasibility of streamlining an interactive Bayesian-based diagnostic support tool designed for clinical practice

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa

    2016-03-01

    In radiology, diagnostic errors occur either through the failure of detection or incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases that lead a radiologist to an incorrect diagnosis despite correctly recognized abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based, open-source software tool. In this tool, the radiologist first selects a network of choice (e.g., basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are shown on the screen, serving both as a checklist and as an input mechanism. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
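
    The post-test update at the core of such a tool can be sketched as a naive Bayes step over one imaging feature. The diagnoses, likelihoods, and numbers below are invented for illustration and are not taken from the authors' networks.

      # Hypothetical pre-test to post-test probability update for one feature.
      pretest = {"glioma": 0.2, "lymphoma": 0.1, "abscess": 0.7}
      # P(feature present | diagnosis), invented values:
      p_restricted_diffusion = {"glioma": 0.3, "lymphoma": 0.8, "abscess": 0.9}

      def update(prior, likelihood, observed=True):
          post = {dx: p * (likelihood[dx] if observed else 1 - likelihood[dx])
                  for dx, p in prior.items()}
          z = sum(post.values())              # normalize so they sum to 1
          return {dx: p / z for dx, p in post.items()}

      posttest = update(pretest, p_restricted_diffusion, observed=True)
      print({dx: round(p, 3) for dx, p in posttest.items()})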

  12. Building Diagnostic Market Deployment - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, S.; Gayeski, N.

    2012-04-30

    Operational faults are pervasive across the commercial buildings sector, wasting energy and increasing energy costs by up to 30% (Mills 2009, Liu et al. 2003, Claridge et al. 2000, Katipamula and Brambley 2008, and Brambley and Katipamula 2009). Automated fault detection and diagnostic (AFDD) tools provide capabilities essential for detecting and correcting these problems and eliminating the associated energy waste and costs. The U.S. Department of Energy's (DOE) Building Technology Program (BTP) has previously invested in developing and testing such diagnostic tools for whole-building (and major system) energy use, air handlers, chillers, cooling towers, chilled-water distribution systems, and boilers. These diagnostic processes can be used to make commercial buildings more energy efficient. The work described in this report was done as part of a Cooperative Research and Development Agreement (CRADA) between the U.S. Department of Energy's Pacific Northwest National Laboratory (PNNL) and KGS Building LLC (KGS). PNNL and KGS both believe that the widespread adoption of AFDD tools will result in significant reductions in energy and peak energy consumption. The report provides an introduction and summary of the various tasks performed under the CRADA. The CRADA project had three major focus areas: (1) technical assistance for Whole Building Energy Diagnostician (WBE) commercialization, (2) market transfer of the Outdoor Air/Economizer Diagnostician (OAE), and (3) development and deployment of automated diagnostics to improve large commercial building operations. PNNL has previously developed two diagnostic tools: (1) the whole building energy (WBE) diagnostician and (2) the outdoor air/economizer (OAE) diagnostician. The WBE diagnostician is currently licensed non-exclusively to one company. As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite, Clockworks. PNNL also provided validation data sets and the WBE software tool to validate the KGS implementation. The OAE diagnostician automatically detects and diagnoses problems with outdoor air ventilation and economizer operation for air handling units (AHUs) in commercial buildings using data available from building automation systems (BASs). As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite. PNNL also provided validation data sets and the OAE software tool to validate the KGS implementation. Finally, as part of this CRADA project, PNNL developed new processes to automate parts of the re-tuning process and transferred those processes to KGS for integration into their software product. The transfer of DOE-funded technologies will transform the commercial buildings sector by making buildings more energy efficient and reducing their carbon footprint. As part of the CRADA with PNNL, KGS implemented the whole building energy diagnostician, a portion of the outdoor air/economizer diagnostician, and a number of measures that automate the identification of re-tuning measures.
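
    One OAE-style rule can be sketched directly from BAS data: if outdoor air is markedly cooler than return air while the unit is mechanically cooling, the economizer damper should be open. The thresholds and signal names are invented, not PNNL's actual rule set.

      # Hypothetical single AFDD rule for economizer operation.
      def check_economizer(outdoor_c, return_c, cooling_on, damper_frac):
          """Return a fault message, or None, for one BAS data sample."""
          if cooling_on and outdoor_c < return_c - 2.0 and damper_frac < 0.5:
              return ("economizer fault: free cooling available "
                      f"(OAT {outdoor_c} C < RAT {return_c} C) but damper at "
                      f"{damper_frac:.0%}")
          return None

      print(check_economizer(outdoor_c=15.0, return_c=24.0,
                             cooling_on=True, damper_frac=0.2))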

  13. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy, and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed, and all necessary components, e.g. GUI, viewer, registration tools, etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), automatically registers and visualizes all images (T1w, T2w, DWI, etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning is performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success through synchronization and display of pre-therapeutic, therapy, and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  14. Object oriented fault diagnosis system for space shuttle main engine redlines

    NASA Technical Reports Server (NTRS)

    Rogers, John S.; Mohapatra, Saroj Kumar

    1990-01-01

    A great deal of attention has recently been given to Artificial Intelligence research in the area of computer-aided diagnostics. Due to the dynamic and complex nature of space shuttle red-line parameters, a research effort is under way to develop a real-time diagnostic tool that will employ historical and engineering rulebases as well as sensor validity checking. The capability of AI software development tools (KEE and G2) will be explored by applying object-oriented programming techniques in accomplishing the diagnostic evaluation.

  15. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies of sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
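
    Reports (3) and (4) hinge on isolation: failure modes that trip exactly the same set of tests cannot be told apart and fall into one ambiguity group. A minimal sketch of that grouping follows, with invented model data rather than the ETA Tool's file formats.

      # Hypothetical ambiguity-group computation for failure mode isolation.
      from collections import defaultdict

      detected_by = {   # failure mode -> tests that detect it (invented)
          "FM_a": frozenset({"t1", "t2"}),
          "FM_b": frozenset({"t1", "t2"}),   # indistinguishable from FM_a
          "FM_c": frozenset({"t3"}),
      }

      groups = defaultdict(list)
      for fm, signature in detected_by.items():
          groups[signature].append(fm)

      for signature, fms in groups.items():
          print(sorted(signature), "->", fms)  # groups of size >1 limit isolation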

  16. Generation of Controlled Analog Emissions from Embedded Devices using Software Stress Methods

    DTIC Science & Technology

    2017-03-01

    Generation of Controlled Analog Emissions from Embedded Devices using Software Stress Methods. Oren Sternberg, Jonathan H. Nelson, Israel Perez... Abstract: In this paper, we present a new method that uses software diagnostic tools to study the generation of induced spurious physical emissions from... types of attacks warrants an understanding of unwanted signal generation. We examine this connection by observing the emission profile of an embedded

  17. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
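
    The acceptance test can be sketched as a tolerance check: a tool's predicted savings for each retrofit measure are compared against the reference range produced by the state-of-the-art simulations. The measure names and numbers below are invented.

      # Hypothetical BESTEST-EX-style comparison against a reference range.
      reference_range = {"attic_insulation": (8.0, 12.0)}  # % savings, min/max
      prediction = {"attic_insulation": 13.1}              # tool under test

      for measure, (lo, hi) in reference_range.items():
          value = prediction[measure]
          status = "within" if lo <= value <= hi else "outside"
          print(f"{measure}: predicted {value}% savings is {status} "
                f"the reference range [{lo}, {hi}]%")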

  18. Energy Assessment Helps Kaiser Aluminum Save Energy and Improve Productivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2008-07-01

    The Kaiser Aluminum plant in Sherman, Texas, adjusted controls and made repairs to a furnace for a simple payback of 1 month. Kaiser adopted DOE's Process Heating Assessment and Survey Tool (PHAST) software as the corporate diagnostic tool and has used it to evaluate process heating systems at five other aluminum plants.

  19. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  20. Modeling Complex Workflow in Molecular Diagnostics

    PubMed Central

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  1. Open architecture CNC system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tal, J.; Lopez, A.; Edwards, J.M.

    1995-04-01

    In this paper, an alternative solution to the traditional CNC machine tool controller has been introduced. Software and hardware modules have been described and their incorporation in a CNC control system has been outlined. This type of CNC machine tool controller demonstrates that technology is accessible and can be readily implemented into an open architecture machine tool controller. Benefit to the user is greater controller flexibility, while being economically achievable. PC-based motion as well as non-motion features will provide flexibility through a Windows environment. Upgrading this type of controller system through software revisions will keep the machine tool in a competitive state with minimal effort. Software and hardware modules are mass produced, permitting competitive procurement and incorporation. Open architecture CNC systems provide diagnostics, thus enhancing maintainability and machine tool up-time. A major concern of traditional CNC systems has been operator training time. Training time can be greatly minimized by making use of Windows environment features.

  2. Diagnosis and Prognosis of Weapon Systems

    NASA Technical Reports Server (NTRS)

    Nolan, Mary; Catania, Rebecca; deMare, Gregory

    2005-01-01

    The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.

  3. LevRad software as a tool to learn how to proceed with an evaluation of barriers.

    PubMed

    Ferreira, C C; Souza, S O

    2011-05-30

    We developed the software LevRad with the objective of teaching how to proceed in an analysis of barriers shielding against x-rays, to minimize the contact of the professional or the student with x-rays and also to prevent wearing out of the x-ray equipment. Several tests of the software were performed, and preliminary results indicate that LevRad is efficient as a complementary tool for the development of professionals related to diagnostic radiology. In the case of education, an advantage is gained when the beginner uses the software before his or her first contact with actual x-ray equipment. The software introduces basic knowledge about evaluation of barriers, prevents wearing out of the x-ray tube, reinforces teaching of evaluation of barriers, and reduces the collective effective dose by avoiding unnecessary exposures when possible.

  4. DECIDE: a software for computer-assisted evaluation of diagnostic test performance.

    PubMed

    Chiecchio, A; Bo, A; Manzone, P; Giglioli, F

    1993-05-01

    The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides an organic system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or higher performing graphics card.
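
    Step (d), choosing the optimal diagnostic cut-off, is commonly done by sweeping candidate thresholds and maximizing the Youden index (sensitivity + specificity - 1). A small sketch with invented test values follows; DECIDE's own criterion may differ.

      # Hypothetical optimal cut-off selection via the Youden index.
      diseased = [4.1, 5.3, 6.8, 7.2, 5.9]   # test values, positive cases
      healthy  = [2.0, 3.1, 3.8, 4.5, 2.9]   # test values, negative cases

      def youden(cut):
          sens = sum(v >= cut for v in diseased) / len(diseased)
          spec = sum(v < cut for v in healthy) / len(healthy)
          return sens + spec - 1

      best = max(sorted(diseased + healthy), key=youden)
      print(f"optimal cut-off {best}: Youden index {youden(best):.2f}")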

  5. Evaluation of whole genome sequencing and software tools for drug susceptibility testing of Mycobacterium tuberculosis.

    PubMed

    van Beek, J; Haanperä, M; Smit, P W; Mentula, S; Soini, H

    2018-04-11

    Culture-based assays are currently the reference standard for drug susceptibility testing for Mycobacterium tuberculosis. They provide good sensitivity and specificity but are time consuming. The objective of this study was to evaluate whether whole genome sequencing (WGS), combined with software tools for data analysis, can replace routine culture-based assays for drug susceptibility testing of M. tuberculosis. M. tuberculosis cultures sent to the Finnish mycobacterial reference laboratory in 2014 (n = 211) were phenotypically tested by Mycobacteria Growth Indicator Tube (MGIT) for first-line drug susceptibilities. WGS was performed for all isolates using the Illumina MiSeq system, and data were analysed using five software tools (PhyResSE, Mykrobe Predictor, TB Profiler, TGS-TB and KvarQ). Diagnostic time and reagent costs were estimated for both methods. The sensitivity of the five software tools to predict any resistance among strains was almost identical, ranging from 74% to 80%, and specificity was more than 95% for all software tools except for TGS-TB. The sensitivity and specificity to predict resistance to individual drugs varied considerably among the software tools. Reagent costs for MGIT and WGS were €26 and €143 per isolate respectively. Turnaround time for MGIT was 19 days (range 10-50 days) for first-line drugs, and turnaround time for WGS was estimated to be 5 days (range 3-7 days). WGS could be used as a prescreening assay for drug susceptibility testing with confirmation of resistant strains by MGIT. The functionality and ease of use of the software tools need to be improved. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
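
    The per-tool evaluation reduces to a confusion matrix against the phenotypic MGIT reference. A minimal sketch with invented isolates and resistance calls (R = resistant, S = susceptible):

      # Hypothetical sensitivity/specificity of one WGS tool versus MGIT.
      mgit = {"iso1": "R", "iso2": "S", "iso3": "R", "iso4": "S", "iso5": "R"}
      tool = {"iso1": "R", "iso2": "S", "iso3": "S", "iso4": "S", "iso5": "R"}

      tp = sum(tool[i] == "R" and mgit[i] == "R" for i in mgit)
      tn = sum(tool[i] == "S" and mgit[i] == "S" for i in mgit)
      sens = tp / sum(v == "R" for v in mgit.values())
      spec = tn / sum(v == "S" for v in mgit.values())
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")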

  6. Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.

    PubMed

    Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S

    2015-10-01

    On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created open-source automated case log software. Finally, we evaluated the effect of the software tool on the residency in a postimplementation survey 1 month later. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data were inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user against historical aggregate average volumes, is the most often used feature of the software. Server logs demonstrate that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  7. FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics

    NASA Technical Reports Server (NTRS)

    Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg

    1993-01-01

    FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics, in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high-resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 was developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.

  8. Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.

    PubMed

    Haramija, Marko

    2018-03-01

    Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
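
    Because SIS functions are just filters over in-memory spectra, a software precursor ion scan and a software neutral loss scan each reduce to a few lines. The dataset below is invented; m/z 204.087 and 146.057 are well-known glycan oxonium-ion and deoxyhexose-loss masses.

      # Hypothetical software ion scan (SIS) functions over an MS/MS dataset.
      dataset = [   # (precursor m/z, fragment m/z list), invented spectra
          (500.25, [204.087, 354.193, 138.055]),
          (622.30, [204.087, 512.20]),
          (450.18, [300.10]),
      ]
      TOL = 0.01    # m/z matching tolerance

      def precursor_ion_scan(fragment_mz):
          """Precursors whose spectra contain the given fragment (software PIS)."""
          return [prec for prec, frags in dataset
                  if any(abs(f - fragment_mz) <= TOL for f in frags)]

      def neutral_loss_scan(loss):
          """Precursors showing a fragment at precursor minus loss (software NLS)."""
          return [prec for prec, frags in dataset
                  if any(abs((prec - f) - loss) <= TOL for f in frags)]

      print(precursor_ion_scan(204.087))  # HexNAc oxonium ion -> [500.25, 622.30]
      print(neutral_loss_scan(146.057))   # deoxyhexose loss   -> [500.25]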

  9. Contribution of PCR Denaturing Gradient Gel Electrophoresis Combined with Mixed Chromatogram Software Separation for Complex Urinary Sample Analysis.

    PubMed

    Kotásková, Iva; Mališová, Barbora; Obručová, Hana; Holá, Veronika; Peroutková, Tereza; Růžička, Filip; Freiberger, Tomáš

    2017-01-01

    Complex samples are a challenge for sequencing-based broad-range diagnostics. We analysed 19 urinary catheter, ureteral Double-J catheter, and urine samples using 3 methodological approaches. Out of the total 84 operational taxonomic units, 37, 61, and 88% were identified by culture, PCR-DGGE-SS (PCR denaturing gradient gel electrophoresis followed by Sanger sequencing), and PCR-DGGE-RM (PCR-DGGE combined with software chromatogram separation by the RipSeq Mixed tool), respectively. The latter approach was shown to be an efficient tool to complement culture in complex sample assessment. © 2017 S. Karger AG, Basel.

  10. Automated Diagnosis and Control of Complex Systems

    NASA Technical Reports Server (NTRS)

    Kurien, James; Plaunt, Christian; Cannon, Howard; Shirley, Mark; Taylor, Will; Nayak, P.; Hudson, Benoit; Bachmann, Andrew; Brownston, Lee; Hayden, Sandra

    2007-01-01

    Livingstone2 is a reusable, artificial intelligence (AI) software system designed to assist spacecraft, life support systems, chemical plants, or other complex systems by operating with minimal human supervision, even in the face of hardware failures or unexpected events. The software diagnoses the current state of the spacecraft or other system, and recommends commands or repair actions that will allow the system to continue operation. Livingstone2 is an enhancement of the Livingstone diagnosis system that was flight-tested onboard the Deep Space One spacecraft in 1999. This version tracks multiple diagnostic hypotheses, rather than just a single hypothesis as in the previous version. It is also able to revise diagnostic decisions made in the past when additional observations become available; without this capability, Livingstone might arrive at an incorrect hypothesis. Re-architecting and re-implementing the system in C++ has increased performance. Usability has been improved by creating a set of development tools that is closely integrated with the Livingstone2 engine. In addition to the core diagnosis engine, Livingstone2 includes a compiler that translates diagnostic models written in a Java-like language into Livingstone2's language, and a broad set of graphical tools for model development.

  11. Pilot Study of an Open-source Image Analysis Software for Automated Screening of Conventional Cervical Smears.

    PubMed

    Sanyal, Parikshit; Ganguli, Prosenjit; Barui, Sanghita; Deb, Prabal

    2018-01-01

    The Pap-stained cervical smear is a screening tool for cervical cancer. Commercial systems are used for automated screening of liquid-based cervical smears. However, there is no image analysis software used for conventional cervical smears. The aim of this study was to develop and test the diagnostic accuracy of software for analysis of conventional smears. The software was developed using the Python programming language and open-source libraries. It was standardized with images from the Bethesda Interobserver Reproducibility Project. One hundred thirty images from smears reported as Negative for Intraepithelial Lesion or Malignancy (NILM), and 45 images in which some abnormality had been reported, were collected from the archives of the hospital. The software was then tested on the images. The software was able to segregate images based on overall nuclear:cytoplasmic ratio, coefficient of variation (CV) in nuclear size, nuclear membrane irregularity, and clustering. 68.88% of abnormal images were flagged by the software, as well as 19.23% of NILM images. The major difficulties faced were segmentation of overlapping cell clusters and separation of neutrophils. The software shows potential as a screening tool for conventional cervical smears; however, further refinement of the technique is required.
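
    The feature-based flagging can be sketched once segmentation has produced per-image measurements; segmentation itself (the hard part, per the authors) is omitted, and the thresholds and measurements below are invented rather than the study's values.

      # Hypothetical per-image flagging from nuclear/cytoplasmic measurements.
      import statistics

      def flag_image(nuclear_areas, cytoplasm_area):
          nc_ratio = sum(nuclear_areas) / cytoplasm_area
          size_cv = (statistics.pstdev(nuclear_areas)
                     / statistics.mean(nuclear_areas))
          return nc_ratio > 0.25 or size_cv > 0.40  # flag for human review

      print(flag_image(nuclear_areas=[120, 180, 420], cytoplasm_area=900))
      print(flag_image(nuclear_areas=[100, 110, 105], cytoplasm_area=2000))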

  12. The road map towards providing a robust Raman spectroscopy-based cancer diagnostic platform and integration into clinic

    NASA Astrophysics Data System (ADS)

    Lau, Katherine; Isabelle, Martin; Lloyd, Gavin R.; Old, Oliver; Shepherd, Neil; Bell, Ian M.; Dorney, Jennifer; Lewis, Aaran; Gaifulina, Riana; Rodriguez-Justo, Manuel; Kendall, Catherine; Stone, Nicolas; Thomas, Geraint; Reece, David

    2016-03-01

    Despite its demonstrated potential as an accurate cancer diagnostic tool, Raman spectroscopy (RS) is yet to be adopted by the clinic for histopathology reviews. The Stratified Medicine through Advanced Raman Technologies (SMART) consortium has begun to address some of the hurdles to its adoption for cancer diagnosis. These hurdles include awareness and acceptance of the technology, practicality of integration into the histopathology workflow, data reproducibility, and availability of transferrable models. We have formed a consortium to jointly develop optimised protocols for tissue sample preparation, data collection, and analysis. These protocols will be supported by provision of suitable hardware and software tools to allow statistically sound classification models to be built and transferred for use on different systems. In addition, we are building a validated gastrointestinal (GI) cancers model, which can be trialled as part of the histopathology workflow at hospitals, and a classification tool. At the end of the project, we aim to deliver a robust Raman-based diagnostic platform to enable clinical researchers to stage cancer, define tumour margins, build cancer diagnostic models, and discover novel disease biomarkers.

  13. Artificial intelligence against breast cancer (A.N.N.E.S-B.C.-Project).

    PubMed

    Parmeggiani, Domenico; Avenia, Nicola; Sanguinetti, Alessandro; Ruggiero, Roberto; Docimo, Giovanni; Siciliano, Mattia; Ambrosino, Pasquale; Madonna, Imma; Peltrini, Roberto; Parmeggiani, Umberto

    2012-01-01

    Our preliminary study examined the development of an advanced innovative technology with the objectives of (1) developing methodologies and algorithms for an Artificial Neural Network (ANN) system to improve the interpretation of mammography and ultrasonography images, and (2) creating autonomous software as a diagnostic tool for physicians, allowing the advanced application of databases using Artificial Intelligence (Expert System). Since 2004, 550 female patients over 40 years old were divided into two groups: 1) 310 patients underwent ultrasound every 6 months and mammography every year, read by expert radiologists; 2) 240 patients had the same screening program and were also examined by our diagnosis software, developed with ANN-ES technology by the Engineering Aircraft Research Project team. The information was continually updated and returned to the Expert System, defining the principal rules of automatic diagnosis. In the second group we compared: the expert radiologist's decision; the ANN-ES decision; and the expert radiologist's decision combined with the ANN-ES. The second group had significantly better cancer diagnosis and better specificity for breast lesion risk, with the best results when the radiologist's decision was supported by the ANN software. The ANN-ES group was able to select, by anamnestic, diagnostic, and genetic means, 8 patients for prophylactic surgery, finding 4 cancers at a very early stage. Although this is only a preliminary study, this innovative diagnostic tool seems to provide better positive and negative predictive value in cancer diagnosis as well as in breast risk lesion identification.

  14. Reviews of Selected System and Software Tools for Strategic Defense Applications

    DTIC Science & Technology

    1990-02-01

    Interleaf and FrameMaker. Static Diagnostics: Basic testing includes validating flows, detecting orphan activity, and checking completeness of activities... Publisher, Aldus PageMaker, Unix pic, Apple .pict metafile, Interleaf, FrameMaker, or PostScript format. There are no forms for standard documents such as...

  15. A New Look at NASA: Strategic Research In Information Technology

    NASA Technical Reports Server (NTRS)

    Alfano, David; Tu, Eugene (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.

  16. A usability evaluation of medical software at an expert conference setting.

    PubMed

    Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel

    2014-01-01

    A usability test was employed to evaluate two medical software applications at an expert conference setting. One software application is a medical diagnostic tool (electrocardiogram [ECG] viewer) and the other is a medical research tool (electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain, thus (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the predefined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire, respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min was dedicated to the pre-test and post-test questionnaires, the average time dedicated to user interaction with the medical software was 24.69 min (SD=10.28). Given we have shown that usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given that delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts. This would otherwise be very expensive to arrange. A conference-based approach also allows for data to be collected over a few days as opposed to months, by avoiding administration duties normally involved in a laboratory-based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating from two independent usability experts. On a rating scale of 1 to 4 (where 1=cosmetic and 4=critical), the average severity rating for the ECG viewer was 2.24 (SD=1.09) and the average severity rating for the EMS application was 2.34 (SD=0.97). We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications. This statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were also able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain. It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time consuming, and less expensive. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  17. Improving the quality of EHR recording in primary care: a data quality feedback tool.

    PubMed

    van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A

    2017-01-01

    Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
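    The headline metric in this study, the percentage of episodes of care with a meaningful diagnostic code, can be computed per practice and software package in a few lines. A minimal sketch, assuming a hypothetical episode table; the tool's actual feedback reports are far richer:

      from collections import defaultdict

      # Hypothetical episode records: (practice, software_package, diagnostic_code),
      # where an empty code means no meaningful diagnosis was recorded.
      episodes = [
          ("practice_A", "package_X", "K74"),
          ("practice_A", "package_X", ""),
          ("practice_B", "package_Y", "T90"),
          ("practice_B", "package_Y", "R05"),
      ]

      totals, coded = defaultdict(int), defaultdict(int)
      for practice, package, code in episodes:
          totals[(practice, package)] += 1
          coded[(practice, package)] += bool(code)

      for key, n in totals.items():
          print(f"{key}: {100 * coded[key] / n:.0f}% of episodes meaningfully coded")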

  18. Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Willse, Alan R.

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize the analysis of sets of microarray images. This tool provides several methods of identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.
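    Spot quantification of the kind described typically reduces to comparing foreground (spot) pixels against local background pixels. A minimal sketch of such a statistic, assuming the masks are already available (AMIA itself is MATLAB code; this illustration uses Python/NumPy):

      import numpy as np

      def spot_statistics(image, spot_mask, background_mask):
          # Summarise one spot: foreground/background medians, the
          # background-corrected signal and a crude signal-to-background ratio.
          fg, bg = image[spot_mask], image[background_mask]
          fg_med, bg_med = float(np.median(fg)), float(np.median(bg))
          return {"fg_median": fg_med, "bg_median": bg_med,
                  "signal": fg_med - bg_med,
                  "snr": fg_med / max(bg_med, 1e-9)}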

  19. [The development and evaluation of software to verify diagnostic accuracy].

    PubMed

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was implemented with PERL and the MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding its technical quality and, according to the students, helped in their learning and may become an educational tool to teach the process of nursing diagnosis.
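    The core of such a comparison is a distance between the student's fuzzy relationship values and the specialists' values. A minimal sketch under that assumption (the paper's actual fuzzy scoring model may differ):

      def performance_score(student, specialist):
          # Both arguments map a diagnosis (or defining characteristic) to a
          # relationship value in [0, 1]; a score of 1.0 means perfect agreement.
          diffs = [abs(student[k] - specialist[k]) for k in specialist]
          return 1 - sum(diffs) / len(diffs)

      specialist = {"ineffective airway clearance": 0.9, "acute pain": 0.2}
      student = {"ineffective airway clearance": 0.7, "acute pain": 0.4}
      print(performance_score(student, specialist))  # 0.8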

  20. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarray has become an essential medical genetic diagnostic tool owing to its high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are available to perform this work. Each package targets different sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose an appropriate probe design package. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
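    Of the three criteria, melting temperature is the most mechanical to screen for. A minimal sketch using two widely known approximations (the Wallace rule for short oligos and a GC-content formula for longer probes); production probe designers use more accurate nearest-neighbour thermodynamics:

      def melting_temperature(seq):
          seq = seq.upper()
          at = seq.count("A") + seq.count("T")
          gc = seq.count("G") + seq.count("C")
          if len(seq) < 14:
              return 2 * at + 4 * gc                     # Wallace rule, degrees C
          return 64.9 + 41 * (gc - 16.4) / len(seq)      # GC-content approximation

      print(melting_temperature("ACGTACGTACGT"))          # 36
      print(melting_temperature("ACGTACGTACGTACGTACGT"))  # ~51.8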

  1. Computer-aided diagnosis software for vulvovaginal candidiasis detection from Pap smear images.

    PubMed

    Momenzadeh, Mohammadreza; Vard, Alireza; Talebi, Ardeshir; Mehri Dehnavi, Alireza; Rabbani, Hossein

    2018-01-01

    Vulvovaginal candidiasis (VVC) is a common gynecologic infection that occurs when there is an overgrowth of the yeast Candida. VVC diagnosis is usually done by observing a Pap smear sample under a microscope and searching for the conidium and mycelium components of Candida. This manual method is time-consuming, subjective and tedious. Any diagnostic tool that detects VVC semi- or fully automatically can be very helpful to pathologists. This article presents computer-aided diagnosis (CAD) software to improve human diagnosis of VVC from Pap smear samples. The proposed software is designed based on phenotypic and morphological features of Candida in Pap smear sample images. The software provides a user-friendly interface consisting of a set of image processing tools and analytical results that help to detect Candida and determine the severity of illness. The software was evaluated on 200 Pap smear sample images and obtained a specificity of 91.04% and a sensitivity of 92.48% in detecting VVC. As a result, the use of the proposed software reduces diagnostic time and can be employed as a second objective opinion for pathologists. © 2017 Wiley Periodicals, Inc.
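    Sensitivity and specificity figures such as those above follow directly from a confusion matrix. A minimal sketch with hypothetical counts (the abstract does not publish the full confusion matrix; these counts are merely chosen to land near the reported figures):

      def sensitivity_specificity(tp, fn, tn, fp):
          # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
          return tp / (tp + fn), tn / (tn + fp)

      # Illustrative counts only (total 200 images).
      sens, spec = sensitivity_specificity(tp=123, fn=10, tn=61, fp=6)
      print(f"sensitivity={sens:.2%}, specificity={spec:.2%}")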

  2. An Assessment of Integrated Health Management (IHM) Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lybeck; M. Tawfik; L. Bond

    In order to meet the ever-increasing demand for energy, the United States nuclear industry is turning to life extension of existing nuclear power plants (NPPs). Economically ensuring the safe, secure, and reliable operation of aging nuclear power plants presents many challenges. The 2009 Light Water Reactor Sustainability Workshop identified online monitoring of active and structural components as essential to the better understanding and management of the challenges posed by aging nuclear power plants. Additionally, there is increasing adoption of condition-based maintenance (CBM) for active components in NPPs. These techniques provide a foundation upon which a variety of advanced online surveillance, diagnostic, and prognostic techniques can be deployed to continuously monitor and assess the health of NPP systems and components. The next step in the development of advanced online monitoring is to move beyond CBM to estimating the remaining useful life of active components using prognostic tools. Deployment of prognostic health management (PHM) on the scale of a NPP requires the use of an integrated health management (IHM) framework - a software product (or suite of products) used to manage the necessary elements needed for a complete implementation of online monitoring and prognostics. This paper provides a thoughtful look at the desirable functions and features of IHM architectures. A full PHM system involves several modules, including data acquisition, system modeling, fault detection, fault diagnostics, system prognostics, and advisory generation (operations and maintenance planning). The standards applicable to PHM applications are identified and summarized. A list of evaluation criteria for PHM software products, developed to ensure scalability of the toolset to an environment with the complexity of a NPP, is presented. Fourteen commercially available PHM software products are identified and classified into four groups: research tools, PHM system development tools, deployable architectures, and peripheral tools.

  3. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform compatibility; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
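    The wrapping approach described here (an existing science routine exposed through a Python web framework) can be pictured in a few lines of Flask. A minimal sketch; the route, function and parameter names are hypothetical, not CMDA's actual API:

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      def compute_anomaly(variable, start_year, end_year):
          # Placeholder for the pre-existing science application code.
          return {"variable": variable, "years": [start_year, end_year],
                  "mean_anomaly": 0.42}

      @app.route("/diagnostics/anomaly")
      def anomaly_service():
          result = compute_anomaly(request.args.get("variable", "T2m"),
                                   int(request.args.get("start", 2000)),
                                   int(request.args.get("end", 2010)))
          return jsonify(result)  # only the small result crosses the network

      if __name__ == "__main__":
          app.run()  # in production, served by Gunicorn workers instead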

  4. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run, e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  5. Intuitive and interpretable visual communication of a complex statistical model of disease progression and risk.

    PubMed

    Jieyi Li; Arandjelovic, Ognjen

    2017-07-01

    Computer science, and machine learning in particular, are increasingly lauded for their potential to aid medical practice. However, the highly technical nature of state-of-the-art techniques can be a major obstacle to their usability by health care professionals and thus to their adoption and actual practical benefit. In this paper we describe a software tool which focuses on the visualization of predictions made by a recently developed method which leverages data in the form of large-scale electronic records for making diagnostic predictions. Guided by risk predictions, our tool allows the user to explore interactively different diagnostic trajectories, or display cumulative long-term prognostics, in an intuitive and easily interpretable manner.

  6. Open Energy Information System version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OpenEIS was created to provide standard methods for authoring, sharing, testing, using, and improving algorithms for operational building energy efficiency with building managers and building owners. OpenEIS is designed as a no-cost/low-cost solution that will propagate the fault detection and diagnostic (FDD) solutions into the marketplace by providing state-of-the-art analytical and diagnostic algorithms. As OpenEIS penetrates the market, demand by control system manufacturers and integrators serving small and medium commercial customers will help push these types of commercial software tool offerings into the broader marketplace.

  7. Computer Assisted Thermography And Its Application In Ovulation Detection

    NASA Astrophysics Data System (ADS)

    Rao, K. H.; Shah, A. V.

    1984-08-01

    Hardware and software of a computer-assisted image analyzing system used for infrared images in medical applications are discussed. The application of computer-assisted thermography (CAT) as a complementary diagnostic tool in centralized diagnostic management is proposed. The authors adopted computer-assisted thermography to study physiological changes in the breasts related to the hormones characterizing the menstrual cycle of a woman. Based on clinical experiments followed by thermal image analysis, they suggest that the 'differential skin temperature' (DST) be measured to detect the fertility interval in the menstrual cycle of a woman.

  8. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.
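    Benchmarks of this kind score candidate methods with common detection metrics across blind test cases. A minimal sketch of two generic metrics (ProDiMES defines its own, more detailed metric suite):

      def detection_metrics(truth, predicted):
          # truth/predicted: one label per test case; None means "no fault".
          pairs = list(zip(truth, predicted))
          faults = sum(t is not None for t in truth)
          nominal = len(truth) - faults
          detected = sum(t is not None and p is not None for t, p in pairs)
          false_alarms = sum(t is None and p is not None for t, p in pairs)
          return {"detection_rate": detected / faults,
                  "false_alarm_rate": false_alarms / nominal}

      truth     = ["fan fault", None, "compressor fault", None]
      predicted = ["fan fault", None, None, "sensor fault"]
      print(detection_metrics(truth, predicted))
      # {'detection_rate': 0.5, 'false_alarm_rate': 0.5}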

  9. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
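    The graph-with-attributes modelling style described above is easy to sketch with networkx. A minimal illustration; the node and link property names are assumptions, not the package's actual schema:

      import networkx as nx

      net = nx.Graph()
      net.add_node("alice", memory_qubits=4)
      net.add_node("bob", memory_qubits=4)
      net.add_edge("alice", "bob", length_km=10,
                   classical_delay_us=50, photon_loss_db_per_km=0.2)

      # A discrete-event simulator would consume these attributes; here we
      # simply derive a static link property from them.
      for u, v, props in net.edges(data=True):
          loss = props["length_km"] * props["photon_loss_db_per_km"]
          print(f"{u}-{v}: total photonic loss {loss:.1f} dB")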

  10. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
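    A common graph-based dependency representation in such hybrid schemes is the D-matrix, mapping each fault to the set of tests it affects; diagnosis then amounts to finding the faults consistent with the observed test outcomes. A minimal sketch with illustrative automotive names:

      # Fault -> set of tests that the fault causes to fail (a toy D-matrix).
      D_MATRIX = {
          "injector_clog":  {"fuel_pressure_low", "misfire_detected"},
          "sensor_drift":   {"fuel_pressure_low"},
          "ignition_fault": {"misfire_detected"},
      }

      def candidate_faults(failed_tests):
          # Single-fault assumption: keep faults whose test signature matches
          # the observed set of failed tests exactly.
          return [f for f, sig in D_MATRIX.items() if sig == failed_tests]

      print(candidate_faults({"fuel_pressure_low", "misfire_detected"}))
      # ['injector_clog']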

  11. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days earlier: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added to the Mars Science Laboratory (MSL) sequencing server, which is responsible for generating all MSL binary SCMFs from RML input sequences. First, the sequencing server logs every SCMF it generates into a MySQL database, along with the high-level RML file and dictionary name inputs used to create the SCMF; the SCMF is indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline, where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
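    Retrieval by hash over a RESTful interface is simple to picture. A minimal client-side sketch using the requests library; the host name and route are hypothetical, not the mission's actual endpoint:

      import requests

      BASE_URL = "http://sequencing-server.example/scmf"  # hypothetical host/route

      def fetch_scmf(evr_hash):
          # The hash embedded in a command EVR keys the compiled sequence.
          response = requests.get(f"{BASE_URL}/{evr_hash}", timeout=10)
          response.raise_for_status()
          return response.content  # binary SCMF; the RML source would have its own route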

  12. Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.

    2010-01-01

    The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model, a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.

  13. Creation of a simple natural language processing tool to support an imaging utilization quality dashboard.

    PubMed

    Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo

    2017-05-01

    Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
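    The essence of such a user-configurable classifier is keyword matching with negation handling. A minimal sketch in that spirit; the term lists stand in for the tool's five modifiable parameters and are illustrative, not the published configuration (the actual tool is available at the URL above):

      import re

      POSITIVE_TERMS = ["deep venous thrombosis", "pulmonary embolism", "thrombus"]
      NEGATIONS = ["no ", "without ", "negative for "]

      def classify_report(text):
          # A report is positive if a finding term occurs without a nearby negation.
          text = text.lower()
          for term in POSITIVE_TERMS:
              for match in re.finditer(re.escape(term), text):
                  window = text[max(0, match.start() - 30):match.start()]
                  if not any(neg in window for neg in NEGATIONS):
                      return "VTE-positive"
          return "VTE-negative"

      print(classify_report("No evidence of pulmonary embolism."))          # VTE-negative
      print(classify_report("Acute pulmonary embolism in the right lobe."))  # VTE-positive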

  14. TU-AB-202-03: Prediction of PET Transfer Uncertainty by DIR Error Estimating Software, AUTODIRECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Phillips, J

    2016-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool, but DIR errors can adversely affect its clinical applications. To estimate voxel-specific DIR uncertainty, a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), has been developed and validated. This work tests the ability of this software to predict uncertainty for the transfer of standard uptake values (SUV) from positron-emission tomography (PET) with DIR. Methods: Virtual phantoms are used for this study. Each phantom has a planning computed tomography (CT) image and a diagnostic PET-CT image set. A deformation was digitally applied to the diagnostic CT to create the planning CT image and establish a known deformation between the images. One lung and three rectum patient datasets were employed to create the virtual phantoms. Both of these sites have difficult deformation scenarios associated with them, which can affect DIR accuracy (lung tissue sliding and changes in rectal filling). The virtual phantoms were created to simulate these scenarios by introducing discontinuities in the deformation field at the lung and rectum borders. The DIR algorithm from the Plastimatch software was applied to these phantoms. The SUV mapping errors from the DIR were then compared to those predicted by AUTODIRECT. Results: The SUV error distributions closely followed the AUTODIRECT-predicted error distribution for the 4 test cases. The minimum and maximum PET SUVs were produced from AUTODIRECT at the 95% confidence interval before applying gradient-based SUV segmentation for each of these volumes. Notably, 93.5% of the target volume warped by the true deformation was included within the AUTODIRECT-predicted maximum SUV volume after the segmentation, while 78.9% of the target volume was within the target volume warped by Plastimatch. Conclusion: The AUTODIRECT framework is able to predict PET transfer uncertainty caused by DIR, which enables an understanding of the associated target volume uncertainty.

  15. MYST: a comprehensive high-level AO control tool for GeMS

    NASA Astrophysics Data System (ADS)

    Rigaut, F.; Neichel, B.; Bec, M.; Garcia-Rissman, A.

    2010-07-01

    MYST is the Gemini MCAO System (GeMS) high-level control GUI. It is written in Yorick, Python and C. In this paper, we review the software architecture of MYST and its primary purposes, which are many: real-time display, high-level diagnostics, calibrations, and acting as executor/sequencer of high-level actions (closing the loop, coordinating dithers, etc.).

  16. Novel calibration tools and validation concepts for microarray-based platforms used in molecular diagnostics and food safety control.

    PubMed

    Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U

    2015-04-01

    Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes present "Cy3" and "Cy5" analogues with improved photostability, chosen based upon their spectroscopic properties closely matching those of common labels for the green and red channels of microarray scanners. This simple tool allows the user to efficiently and regularly assess and control the performance of the microarray scanner provided with the biochip platform and to compare different scanners. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.

  17. Virtual chromoendoscopy can be a useful software tool in capsule endoscopy.

    PubMed

    Duque, Gabriela; Almeida, Nuno; Figueiredo, Pedro; Monsanto, Pedro; Lopes, Sandra; Freire, Paulo; Ferreira, Manuela; Carvalho, Rita; Gouveia, Hermano; Sofia, Carlos

    2012-05-01

    Capsule endoscopy (CE) has revolutionized the study of the small bowel. One major drawback of this technique is that we cannot interfere with the image acquisition process. Therefore, the development of new software tools that could modify the images and increase both detection and diagnosis of small-bowel lesions would be very useful. The Flexible Spectral Imaging Color Enhancement (FICE) system that allows for virtual chromoendoscopy is one of these software tools. The aim of this study was to evaluate the reproducibility and diagnostic accuracy of the FICE system in CE. This prospective study involved 20 patients. First, four physicians interpreted 150 static FICE images and the overall agreement between them was determined using the Fleiss kappa test. Second, two experienced gastroenterologists, blinded to each other's results, analyzed the complete 20 video streams. One interpreted conventional capsule videos and the other the CE-FICE videos at setting 2. All findings were reported, regardless of their clinical value. Non-concordant findings between both interpretations were analyzed by a consensus panel of four gastroenterologists who reached a final result (positive or negative finding). In the first arm of the study, the overall concordance between the four gastroenterologists was substantial (0.650). In the second arm, the conventional mode identified 75 findings and the CE-FICE mode 95. The CE-FICE mode did not miss any lesions identified by the conventional mode and allowed the identification of a higher number of angiodysplasias (35 vs. 32) and erosions (41 vs. 24). There is reproducibility for the interpretation of CE-FICE images between different observers experienced in conventional CE. The use of virtual chromoendoscopy in CE seems to increase its diagnostic accuracy by highlighting small-bowel erosions and angiodysplasias that weren't identified by the conventional mode.
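    The inter-rater agreement statistic used in the first arm, Fleiss' kappa, is available in statsmodels. A minimal sketch with hypothetical ratings (0 = no lesion, 1 = lesion) from four raters on ten images; the study's actual rating data are not reproduced here:

      import numpy as np
      from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

      # Rows = images, columns = the four physicians; illustrative values only.
      ratings = np.array([
          [1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 0, 1], [0, 0, 0, 1], [1, 1, 1, 1],
          [0, 0, 0, 0], [1, 0, 1, 1], [0, 0, 0, 0], [1, 1, 1, 0], [0, 0, 1, 0],
      ])
      table, _ = aggregate_raters(ratings)  # images x categories count table
      print(f"Fleiss kappa: {fleiss_kappa(table):.3f}")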

  18. An open source software for analysis of dynamic contrast enhanced magnetic resonance images: UMMPerfusion revisited.

    PubMed

    Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald

    2016-01-14

    Perfusion imaging has become an important image-based tool to derive physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-)vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one-compartment model (1CP), a two-compartment exchange model (2CXM), a two-compartment uptake model (2CUM), a two-compartment filtration model (2FM) and the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during the perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open-source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it can also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated during data analysis ensure a certain level of quality control.
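    As an illustration of what such a compartment model computes, the extended Tofts model expresses tissue concentration as Ct(t) = vp*Cp(t) + Ktrans * integral_0^t Cp(tau) * exp(-Ktrans*(t - tau)/ve) dtau. A minimal numerical sketch by discrete convolution on a uniform time grid, with a toy arterial input function (this is the textbook model, not UMMPerfusion's code):

      import numpy as np

      def extended_tofts(t, cp, ktrans, ve, vp):
          dt = t[1] - t[0]
          irf = np.exp(-ktrans * t / ve)              # impulse response
          conv = np.convolve(cp, irf)[: len(t)] * dt  # Cp (*) IRF
          return vp * cp + ktrans * conv

      t = np.linspace(0, 300, 301)            # seconds
      cp = np.exp(-(t - 10) / 60) * (t > 10)  # toy arterial input function
      ct = extended_tofts(t, cp, ktrans=0.1 / 60, ve=0.2, vp=0.05)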

  19. MDSplus data acquisition on the National Spherical Torus Experiment (NSTX)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W. Davis; P. Roney; T. Carroll

    The MDSplus data acquisition system has been used successfully since the 1999 startup of NSTX [National Spherical Torus Experiment] for control, data acquisition, and analysis for diagnostic subsystems. For each plasma "shot" on NSTX, about 75 MB of data are acquired and loaded into MDSplus hierarchical data structures in 2-3 minutes. Physicists adapted to the MDSplus software tools with no real difficulty. Some locally developed tools are described. The support from the developers at MIT [Massachusetts Institute of Technology] was timely and insightful. The use of MDSplus has resulted in a significant cost savings for NSTX.

  20. Three-dimensional virtual bronchoscopy using a tablet computer to guide real-time transbronchial needle aspiration.

    PubMed

    Fiorelli, Alfonso; Raucci, Antonio; Cascone, Roberto; Reginelli, Alfonso; Di Natale, Davide; Santoriello, Carlo; Capuozzo, Antonio; Grassi, Roberto; Serra, Nicola; Polverino, Mario; Santini, Mario

    2017-04-01

    We proposed a new virtual bronchoscopy tool to improve the accuracy of traditional transbronchial needle aspiration for mediastinal staging. Chest computed tomographic images (1 mm thickness) were reconstructed with OsiriX software to produce a virtual bronchoscopic simulation. The target adenopathy was identified by measuring its distance from the carina on multiplanar reconstruction images. The static images were uploaded into iMovie software, which produced a virtual bronchoscopic movie from the images; the movie was then transferred to a tablet computer to provide real-time guidance during a biopsy. To test the validity of our tool, we retrospectively divided all consecutive patients undergoing transbronchial needle aspiration into two groups based on whether the biopsy was guided by virtual bronchoscopy (virtual bronchoscopy group) or not (traditional group). The intergroup diagnostic yields were statistically compared. Our analysis included 53 patients in the traditional and 53 in the virtual bronchoscopy group. The sensitivity, specificity, positive predictive value, negative predictive value and diagnostic accuracy for the traditional group were 66.6%, 100%, 100%, 10.53% and 67.92%, respectively, and for the virtual bronchoscopy group were 84.31%, 100%, 100%, 20% and 84.91%, respectively. The sensitivity (P = 0.011) and diagnostic accuracy (P = 0.011) of sampling the paratracheal station were better for the virtual bronchoscopy group than for the traditional group; no significant differences were found for the subcarinal lymph node. Our tool is simple, economic and available in all centres. It guided needle insertion in real time, thereby improving the accuracy of traditional transbronchial needle aspiration, especially when target lesions are located in a difficult site like the paratracheal station. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  1. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    PubMed Central

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  2. A flexible tool for diagnosing water, energy, and entropy budgets in climate models

    NASA Astrophysics Data System (ADS)

    Lembo, Valerio; Lucarini, Valerio

    2017-04-01

    We have developed new, flexible software for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO), a collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB software to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
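    The CDO operators mentioned here are command-line tools and can be scripted directly. A minimal sketch of the kind of calls such a diagnostic relies on; the file and variable names are illustrative, not the tool's actual pipeline:

      import subprocess

      # Time-mean map of a net energy flux field.
      subprocess.run(["cdo", "timmean", "net_flux.nc", "timmean_map.nc"], check=True)

      # Global (field) mean time series of the same field.
      subprocess.run(["cdo", "fldmean", "net_flux.nc", "global_mean_ts.nc"], check=True)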

  3. Diagnostic value of using 18F-FDG PET and PET/CT in immunocompetent patients with primary central nervous system lymphoma: A systematic review and meta-analysis.

    PubMed

    Zou, Yaru; Tong, Jianjing; Leng, Haiyan; Jiang, Jingwei; Pan, Meng; Chen, Zi

    2017-06-20

    18F-fluorodeoxyglucose (18F-FDG) positron emission tomography (PET) and PET/CT have become two of the most powerful tools for malignant lymphoma exploration, but their diagnostic role in primary central nervous system lymphoma (PCNSL) is still disputed. The purpose of our study was to assess the usefulness of 18F-FDG PET and PET/CT for detecting PCNSL. PubMed/MEDLINE, Embase and the Cochrane Library were systematically searched for potential publications (last updated on July 16th, 2016), and the reference lists of included articles were also checked. Original articles that reported data on patients who were suspected of having PCNSL were considered suitable for inclusion. The sensitivities and specificities of 18F-FDG PET and PET/CT in each study were evaluated, and the Stata and Meta-Disc software packages were employed for data analysis. A total of 129 patients, obtained from eight eligible studies, were included in this systematic review and meta-analysis. The performance of 18F-FDG PET and PET/CT for diagnosing PCNSL was as follows: the pooled sensitivity was 0.88 (95% CI: 0.80-0.94), specificity was 0.86 (95% CI: 0.73-0.94), positive likelihood ratio (PLR) was 3.99 (95% CI: 2.31-6.90), negative likelihood ratio (NLR) was 0.11 (95% CI: 0.04-0.32), and diagnostic odds ratio (DOR) was 33.40 (95% CI: 10.40-107.3). In addition, the area under the curve (AUC) and the Q index were 0.9192 and 0.8525, respectively. 18F-FDG PET and PET/CT showed considerable accuracy in identifying PCNSL in immunocompetent patients and could be a valuable radiological diagnostic tool for PCNSL.
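    For a single study, the likelihood ratios and diagnostic odds ratio follow directly from sensitivity and specificity. A minimal sketch (note that the pooled values quoted above come from a bivariate meta-analytic model, so they differ from these naive point estimates):

      def likelihood_ratios(sensitivity, specificity):
          plr = sensitivity / (1 - specificity)   # positive likelihood ratio
          nlr = (1 - sensitivity) / specificity   # negative likelihood ratio
          return plr, nlr, plr / nlr              # DOR = PLR / NLR

      plr, nlr, dor = likelihood_ratios(0.88, 0.86)
      print(f"PLR={plr:.2f}, NLR={nlr:.2f}, DOR={dor:.1f}")  # PLR=6.29, NLR=0.14, DOR=45.1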

  4. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real-time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer-controlled hardware set prior to its deployment in its functional environment, as well as test the equipment set by supplying certain well-known stimuli. The successor system (FTF) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special function keyboard handler. The additional features require the inclusion of much new software beyond the original set from which FTF was developed. As a result, it is necessary to split the system into a dual programming configuration with high rates of interground communications. A generalized information routing mechanism is used to support this configuration.

  5. Control and Information Systems for the National Ignition Facility

    DOE PAGES

    Brunton, Gordon; Casey, Allan; Christensen, Marvin; ...

    2017-03-23

    Orchestration of every National Ignition Facility (NIF) shot cycle is managed by the Integrated Computer Control System (ICCS), which uses a scalable software architecture running code on more than 1950 front-end processors, embedded controllers, and supervisory servers. The ICCS operates laser and industrial control hardware containing 66 000 control and monitor points to ensure that all of NIF's laser beams arrive at the target within 30 ps of each other and are aligned to a pointing accuracy of less than 50 μm root-mean-square, while ensuring that a host of diagnostic instruments record data in a few billionths of a second. NIF's automated control subsystems are built from a common object-oriented software framework that distributes the software across the computer network and achieves interoperation between different software languages and target architectures. A large suite of business and scientific software tools supports experimental planning, experimental setup, facility configuration, and post-shot analysis. Standard business services using open-source software, commercial workflow tools, and database and messaging technologies have been developed. An information technology infrastructure consisting of servers, network devices, and storage provides the foundation for these systems. This work is an overview of the control and information systems used to support a wide variety of experiments during the National Ignition Campaign.

  6. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a test procedure created to increase the quality and accuracy of energy analysis tools for the building retrofit market. BESTEST-EX enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. In the utility bill calibration test cases, participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participants' energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  7. CSE database: extended annotations and new recommendations for ECG software testing.

    PubMed

    Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie

    2017-08-01

    Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. In order to diagnose a person based on the ECG, cardiologists can use automatic diagnostic algorithms. Research in this area is still necessary. In order to compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed by five new cardiologists independently, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even among standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. The 4R consensus represents the correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for the establishment of accuracy recommendations. The accuracy was determined in terms of sensitivity (79.20-86.81%), positive predictive value (79.10-87.11%), and the Jaccard coefficient (72.21-81.14%). Within these ranges, the accuracy of the software is comparable with the accuracy of cardiologists. This quantification of correct-classification accuracy is unique. Diagnostic software developers can objectively evaluate the success of their algorithms and promote their further development. The annotations and recommendations proposed in this work will allow for faster development and testing of classification software. As a result, this might facilitate cardiologists' work and lead to faster diagnoses and earlier treatment.
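    The agreement measures named here can be computed by treating each reading as a set of diagnoses and comparing it with the 4R consensus. A minimal sketch with hypothetical diagnosis labels:

      def diagnosis_agreement(proposed, consensus):
          # Set-based sensitivity, positive predictive value and Jaccard coefficient.
          proposed, consensus = set(proposed), set(consensus)
          tp = len(proposed & consensus)
          return (tp / len(consensus),             # sensitivity
                  tp / len(proposed),              # positive predictive value
                  tp / len(proposed | consensus))  # Jaccard coefficient

      print(diagnosis_agreement(["AFIB", "LVH"], ["AFIB", "LVH", "RBBB"]))
      # (0.666..., 1.0, 0.666...)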

  8. Telepathology: design of a modular system.

    PubMed

    Brauchli, K; Christen, H; Meyer, P; Haroske, G; Meyer, W; Kunze, K D; Otto, R; Oberholzer, M

    2000-01-01

    Although telepathology systems have been developed for more than a decade, they are still not a widespread tool for routine diagnostic applications. A lack of interoperability, software that does not satisfy user needs, and high costs have been identified as reasons. In this paper we demonstrate that, with a clear separation of the tasks required for a telepathology application, telepathology systems can be built in a modular way, where many modules can be implemented using standard software components. With such a modular design, systems can be easily adapted to changing user needs and new technological developments, and it is easier to integrate modular systems into existing environments.

  9. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  10. An intelligent advisory system for pre-launch processing

    NASA Technical Reports Server (NTRS)

    Engrand, Peter A.; Mitchell, Tami

    1991-01-01

    The shuttle system of interest in this paper is the shuttle's data processing system (DPS). The DPS is composed of the following: (1) general purpose computers (GPC); (2) a multifunction CRT display system (MCDS); (3) mass memory units (MMU); and (4) a multiplexer/demultiplexer (MDM) and related software. In order to ensure the correct functioning of shuttle systems, some level of automatic error detection has been incorporated into all shuttle systems. For the DPS, error detection equipment has been incorporated into all of its subsystems. The automated diagnostic system described here, the MCDS diagnostic tool, aids in more efficient processing of the DPS.

  11. Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic

    NASA Technical Reports Server (NTRS)

    Hjermstad, Chris

    1986-01-01

    Many essential software functions in the mission-critical computer resource application domain depend on floating-point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating-point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating-point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating-point computation.
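    The flavor of what such a diagnostic probes can be shown with the classic empirical determination of machine epsilon (Paranoia itself runs a far larger battery of tests, in Ada rather than Python):

      # Find the machine epsilon: the smallest power of two whose addition
      # to 1.0 still changes the result.
      eps = 1.0
      while 1.0 + eps / 2 != 1.0:
          eps /= 2
      print(eps)  # 2.220446049250313e-16 on IEEE 754 double precision

      import sys
      print(sys.float_info.epsilon)  # the platform's advertised value, for comparison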

  14. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error-prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal development environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the running of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  15. Imaging of femoroacetabular impingement – current concepts

    PubMed Central

    Albers, Christoph E.; Wambeek, Nicholas; Hanke, Markus S.; Schmaranzer, Florian; Prosser, Gareth H.; Yates, Piers J.

    2016-01-01

    Following the recognition of femoroacetabular impingement (FAI) as a clinical entity, diagnostic tools have continuously evolved. While the diagnosis of FAI is primarily made based on the patients’ history and clinical examination, imaging of FAI is indispensable. Routine diagnostic work-up consists of a set of plain radiographs, magnetic resonance imaging (MRI) and MR-arthrography. Recent advances in MRI technology include biochemically sensitive sequences bearing the potential to detect degenerative changes of the hip joint at an early stage, prior to their appearance on conventional imaging modalities. Computed tomography may serve as an adjunct. Advantages of CT include superior bone-to-soft-tissue contrast, making CT applicable for image-guiding software tools that allow evaluation of the underlying dynamic mechanisms causing FAI. This article provides a summary of current concepts of imaging in FAI and a review of the literature on recent advances, and their application to clinical practice. PMID:29632685

  16. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists struggle to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  17. System diagnostic builder

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Burke, Roger

    1992-01-01

    The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using a number of different data sets, such as those produced by a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
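
    A minimal sketch of how such a rule-based evaluator with fuzzy boundaries might classify a data point is given below. The attribute names, triangular membership function, and thresholds are illustrative assumptions for this example, not the actual SDB implementation.

      # Toy rule-based evaluator with fuzzy sets. All names and numbers are
      # invented for illustration.

      def triangular_membership(x, left, peak, right):
          """Degree (0..1) to which x belongs to a triangular fuzzy set."""
          if x <= left or x >= right:
              return 0.0
          if x <= peak:
              return (x - left) / (peak - left)
          return (right - x) / (right - peak)

      def classify(sample, rules):
          """Return the label of the rule with the highest combined membership.

          `rules` maps a label to per-attribute fuzzy sets (left, peak, right);
          a rule's score is the minimum membership across its attributes, a
          common conjunction operator in fuzzy logic.
          """
          best_label, best_score = "unclassified", 0.0
          for label, sets in rules.items():
              score = min(triangular_membership(sample[attr], *bounds)
                          for attr, bounds in sets.items())
              if score > best_score:
                  best_label, best_score = label, score
          return best_label, best_score

      # Hypothetical trained rules for two behaviors of the subject software.
      rules = {
          "nominal":   {"rate": (0.0, 0.5, 1.0), "error": (0.0, 0.1, 0.3)},
          "divergent": {"rate": (0.8, 1.5, 3.0), "error": (0.2, 0.6, 1.0)},
      }
      print(classify({"rate": 1.2, "error": 0.5}, rules))  # ('divergent', 0.57...)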

  18. Development and implementation of a 'Mental Health Finder' software tool within an electronic medical record system.

    PubMed

    Swan, D; Hannigan, A; Higgins, S; McDonnell, R; Meagher, D; Cullen, W

    2017-02-01

    In Ireland, as in many other healthcare systems, mental health service provision is being reconfigured with a move toward more care in the community, and particularly primary care. Recording and surveillance systems for mental health information and activities in primary care are needed for service planning and quality improvement. We describe the development and initial implementation of a software tool ('mental health finder') within a widely used primary care electronic medical record system (EMR) in Ireland to enable large-scale data collection on the epidemiology and management of mental health and substance use problems among patients attending general practice. In collaboration with the Irish Primary Care Research Network (IPCRN), we developed the 'Mental Health Finder' as a software plug-in to a commonly used primary care EMR system to facilitate data collection on mental health diagnoses and pharmacological treatments among patients. The finder searches for and identifies patients based on diagnostic coding and/or prescribed medicines. It was initially implemented among a convenience sample of six GP practices. Prevalence of mental health and substance use problems across the six practices, as identified by the finder, was 9.4% (range 6.9-12.7%). Of the identified patients, 61.9% were female and 25.8% were private patients. One-third (33.4%) of identified patients were prescribed more than one class of psychotropic medication. Of the patients identified by the finder, 89.9% were identifiable via prescribing data and 23.7% via diagnostic coding. The finder is a feasible and promising methodology for large-scale data collection on mental health problems in primary care.
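
    The core selection logic of such a finder can be sketched in a few lines: a patient is flagged if either a relevant diagnostic code or a psychotropic prescription is present, consistent with the reported overlap of the two criteria. The record layout, code prefixes, and medication classes below are assumptions for illustration, not the actual EMR schema or the study's code lists.

      # Sketch of diagnosis-or-prescription case finding over EMR records.
      # ICD-10-style "F" prefixes and the drug classes are hypothetical.

      MH_CODE_PREFIXES = ("F",)            # e.g. mental/behavioural chapter codes
      PSYCHOTROPIC_CLASSES = {"antidepressant", "antipsychotic", "anxiolytic"}

      def is_case(patient):
          by_code = any(dx.startswith(MH_CODE_PREFIXES) for dx in patient["diagnoses"])
          by_rx = any(rx["class"] in PSYCHOTROPIC_CLASSES for rx in patient["prescriptions"])
          return by_code or by_rx

      patients = [
          {"id": 1, "diagnoses": ["F32.1"], "prescriptions": []},
          {"id": 2, "diagnoses": [], "prescriptions": [{"class": "antidepressant"}]},
          {"id": 3, "diagnoses": ["I10"], "prescriptions": []},
      ]
      print([p["id"] for p in patients if is_case(p)])  # -> [1, 2]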

  19. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image QC routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each QC tool will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each tool are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and integrating with other popular image processing tools are also discussed.

  20. Refining Measurement of Substance Use Disorders among Women of Child-bearing Age Using Hospital Records: The Development of the Explicit-Mention Substance Abuse Need for Treatment in Women (EMSANT-W) Algorithm

    PubMed Central

    Derrington, Taletha Mae; Bernstein, Judith; Belanoff, Candice; Cabral, Howard J.; Babakhanlou-Chase, Hermik; Diop, Hafsatou; Evans, Stephen R.; Kotelchuck, Milton

    2015-01-01

    Substance use disorder (SUD) in women of reproductive age is associated with adverse health consequences for both women and their offspring. US states need a feasible population-based, case-identification tool to generate better approximations of SUD prevalence, treatment use, and treatment outcomes among women. This article presents the development of the Explicit Mention Substance Abuse Need for Treatment in Women (EMSANT-W), a gender-tailored tool based upon existing International Classification of Diseases, 9th Revision, Clinical Modification diagnostic code-based groupers that can be applied to hospital administrative data. Gender-tailoring entailed the addition of codes related to infants, pregnancy, and prescription drug abuse, as well as the creation of inclusion/exclusion rules based on other conditions present in the diagnostic record. Among 1,728,027 women and associated infants who accessed hospital care from January 1, 2002 to December 31, 2008 in Massachusetts, EMSANT-W identified 103,059 women with probable SUD. EMSANT-W identified 4,116 women who were not identified by the widely used Clinical Classifications Software for Mental Health and Substance Abuse (CCS-MHSA) and did not capture 853 women identified by CCS-MHSA. Content and approach innovations in EMSANT-W address potential limitations of the Clinical Classifications Software, and create a methodologically sound, gender-tailored and feasible population-based tool for identifying women of reproductive age in need of further evaluation for SUD treatment. Rapid changes in health care service infrastructure, delivery systems and policies require tools such as the EMSANT-W that provide more precise identification methods for sub-populations and can serve as the foundation for analyses of treatment use and outcomes. PMID:25680703

  1. Refining Measurement of Substance Use Disorders Among Women of Child-Bearing Age Using Hospital Records: The Development of the Explicit-Mention Substance Abuse Need for Treatment in Women (EMSANT-W) Algorithm.

    PubMed

    Derrington, Taletha Mae; Bernstein, Judith; Belanoff, Candice; Cabral, Howard J; Babakhanlou-Chase, Hermik; Diop, Hafsatou; Evans, Stephen R; Kotelchuck, Milton

    2015-10-01

    Substance use disorder (SUD) in women of reproductive age is associated with adverse health consequences for both women and their offspring. US states need a feasible population-based, case-identification tool to generate better approximations of SUD prevalence, treatment use, and treatment outcomes among women. This article presents the development of the Explicit Mention Substance Abuse Need for Treatment in Women (EMSANT-W), a gender-tailored tool based upon existing International Classification of Diseases, 9th Revision, Clinical Modification diagnostic code-based groupers that can be applied to hospital administrative data. Gender-tailoring entailed the addition of codes related to infants, pregnancy, and prescription drug abuse, as well as the creation of inclusion/exclusion rules based on other conditions present in the diagnostic record. Among 1,728,027 women and associated infants who accessed hospital care from January 1, 2002 to December 31, 2008 in Massachusetts, EMSANT-W identified 103,059 women with probable SUD. EMSANT-W identified 4,116 women who were not identified by the widely used Clinical Classifications Software for Mental Health and Substance Abuse (CCS-MHSA) and did not capture 853 women identified by CCS-MHSA. Content and approach innovations in EMSANT-W address potential limitations of the Clinical Classifications Software, and create a methodologically sound, gender-tailored and feasible population-based tool for identifying women of reproductive age in need of further evaluation for SUD treatment. Rapid changes in health care service infrastructure, delivery systems and policies require tools such as the EMSANT-W that provide more precise identification methods for sub-populations and can serve as the foundation for analyses of treatment use and outcomes.

  2. Ada Programming Support Environment (APSE) Evaluation and Validation (E&V) Team

    DTIC Science & Technology

    1991-12-31

    standards. The purpose of the team was to assist the project in several ways. Raymond Szymanski of Wright Research and Development Center (WRDC, now...debuggers, program library systems, and compiler diagnostics. The test suite does not include explicit tests for the existence of language features. The...support software is a set of tools and procedures which assist in preparing and executing the test suite, in extracting data from the results of

  3. Computational diagnosis of canine lymphoma

    NASA Astrophysics Data System (ADS)

    Mirkes, E. M.; Alexandrakis, I.; Slater, K.; Tuli, R.; Gorban, A. N.

    2014-03-01

    One out of four dogs will develop cancer in their lifetime and 20% of those will be lymphoma cases. PetScreen developed a lymphoma blood test using serum samples collected from several veterinary practices. The samples were fractionated and analysed by mass spectrometry. Two protein peaks with the highest diagnostic power were selected and further identified as the acute-phase proteins C-reactive protein (CRP) and haptoglobin. Data mining methods were then applied to the collected data for the development of an online computer-assisted veterinary diagnostic tool. The generated software can be used as a diagnostic, monitoring and screening tool. Initially, the diagnosis of lymphoma was formulated as a classification problem and later refined as lymphoma risk estimation. Three methods (decision trees, kNN and probability density evaluation) were used for classification and risk estimation, and several preprocessing approaches were implemented to create the diagnostic system. For the differential diagnosis, the best solution gave a sensitivity and specificity of 83.5% and 77%, respectively (using three input features: CRP, haptoglobin and a standard clinical symptom). For the screening task, the decision tree method provided the best result, with sensitivity and specificity of 81.4% and >99%, respectively (using the same input features). Furthermore, the development and application of new techniques for the generation of risk maps allowed their user-friendly visualization.
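
    As a rough illustration of the kNN variant of the classifiers mentioned above, the sketch below estimates lymphoma risk as the fraction of positive cases among the k nearest training samples in the CRP-haptoglobin feature plane. All values, the scaling, and the distance metric are invented for illustration; the published tool also uses a clinical symptom feature and other methods.

      # Toy k-nearest-neighbours risk estimator over two serum features.
      import math

      def knn_risk(train, query, k=5):
          """train: list of ((crp, haptoglobin), label) with label 1 = lymphoma."""
          nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
          return sum(label for _, label in nearest) / k

      train = [((5.0, 1.2), 0), ((7.1, 1.0), 0), ((42.0, 3.8), 1),
               ((55.3, 4.1), 1), ((6.3, 1.5), 0), ((61.0, 3.5), 1)]
      print(knn_risk(train, (48.0, 3.9), k=3))  # -> 1.0, a high estimated risk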

  4. iDermatoPath - a novel software tool for mitosis detection in H&E-stained tissue sections of malignant melanoma.

    PubMed

    Andres, C; Andres-Belloni, B; Hein, R; Biedermann, T; Schäpe, A; Brieu, N; Schönmeyer, R; Yigitsoy, M; Ring, J; Schmidt, G; Harder, N

    2017-07-01

    Malignant Melanoma (MM) is characterized by a growing incidence and a high malignant potential. Besides well-defined prognostic factors such as tumour thickness and ulceration, the Mitotic Rate (MR) was included in the AJCC recommendations for diagnosis and treatment of MM. In daily routine, the identification of a single mitosis can be difficult on haematoxylin and eosin slides alone. Several studies showed large inter- and intra-individual variability in detecting the MR in MM, even among very experienced investigators, thus motivating a computer-assisted method. The objective was to develop a software system for mitosis detection in MM on H&E slides based on machine learning for diagnostic support. We developed a computer-aided staging support system based on image analysis and machine learning on the basis of 59 MM specimens. Our approach automatically detects tumour regions, identifies mitotic nuclei and classifies them with respect to their diagnostic relevance. A convenient user interface enables the investigator to browse through the proposed mitoses for fast and efficient diagnosing. A quantitative evaluation on manually labelled ground truth data revealed that the tumour region detection yields a medium spatial overlap index (Dice coefficient) of 0.72. For the mitosis detection, we obtained accuracies above 83%. On the technical side, the developed iDermatoPath software tool provides a novel approach for mitosis detection in MM, which can be further improved using more training data such as dermatopathologist annotations. On the practical side, a first evaluation of the clinical utility was positive, although this approach provides most benefit for difficult cases in a research setting. Assuming all slides to be digitally processed and reported in the near future, this method could become a helpful additional tool for the pathologist.

  5. The use of digital images in pathology.

    PubMed

    Furness, P N

    1997-11-01

    Digital images are routinely used by the publishing industry, but most diagnostic pathologists are unfamiliar with the technology and its possibilities. This review aims to explain the basic principles of digital image acquisition, storage, manipulation and use, and the possibilities provided not only in research, but also in teaching and in routine diagnostic pathology. Images of natural objects are usually expressed digitally as 'bitmaps': rectilinear arrays of small dots. The size of each dot can vary, but so can its information content in terms, for example, of colour, greyscale or opacity. Various file formats and compression algorithms are available. Video cameras connected to microscopes are familiar to most pathologists; video images can be converted directly to a digital form by a suitably equipped computer. Digital cameras and scanners are alternative acquisition tools of relevance to pathologists. Once acquired, a digital image can easily be subjected to the digital equivalent of any conventional darkroom manipulation and modern software allows much more flexibility, to such an extent that a new tool for scientific fraud has been created. For research, image enhancement and analysis is an increasingly powerful and affordable tool. Morphometric measurements are, after many predictions, at last beginning to be part of the toolkit of the diagnostic pathologist. In teaching, the potential to create dramatic yet informative presentations is demonstrated daily by the publishing industry; such methods are readily applicable to the classroom. The combination of digital images and the Internet raises many possibilities; for example, instead of seeking one expert diagnostic opinion, one could simultaneously seek the opinion of many, all around the globe. It is inevitable that in the coming years the use of digital images will spread from the laboratory to the medical curriculum and to the whole of diagnostic pathology.

  6. PAUSE: Predictive Analytics Using SPARQL-Endpoints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela; Bond, Nathaniel

    2014-07-11

    This invention relates to the medical industry and more specifically to methods of predicting risks. With the impetus towards personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledgebase, etc.) to predict diagnostic risks is fast emerging. We describe a software solution that leverages hardware for scalable in-memory analytics and applies next-generation semantic query tools on medical data.
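
    A flavor of such semantic queries: the sketch below sends a SPARQL SELECT to an endpoint and iterates over the JSON result bindings. The endpoint URL, graph vocabulary, and predicate names are hypothetical; only the SPARQLWrapper client calls reflect a real library.

      # Querying a medical knowledgebase through a SPARQL endpoint.
      from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

      endpoint = SPARQLWrapper("http://example.org/medical/sparql")  # hypothetical
      endpoint.setQuery("""
          PREFIX ex: <http://example.org/vocab#>
          SELECT ?patient ?risk WHERE {
              ?patient ex:hasLabResult ?lab .
              ?lab ex:analyte ex:Troponin ; ex:value ?v .
              ?patient ex:predictedRisk ?risk .
              FILTER (?v > 0.4)
          } LIMIT 10
      """)
      endpoint.setReturnFormat(JSON)
      for row in endpoint.query().convert()["results"]["bindings"]:
          print(row["patient"]["value"], row["risk"]["value"])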

  7. Casimage project: a digital teaching files authoring environment.

    PubMed

    Rosset, Antoine; Muller, Henning; Martins, Martina; Dfouni, Natalia; Vallée, Jean-Paul; Ratib, Osman

    2004-04-01

    The goal of the Casimage project is to offer an authoring and editing environment integrated with the Picture Archiving and Communication Systems (PACS) for creating image-based electronic teaching files. This software is based on a client/server architecture allowing remote access of users to a central database. This authoring environment allows radiologists to create reference databases and collection of digital images for teaching and research directly from clinical cases being reviewed on PACS diagnostic workstations. The environment includes all tools to create teaching files, including textual description, annotations, and image manipulation. The software also allows users to generate stand-alone CD-ROMs and web-based teaching files to easily share their collections. The system includes a web server compatible with the Medical Imaging Resource Center standard (MIRC, http://mirc.rsna.org) to easily integrate collections in the RSNA web network dedicated to teaching files. This software could be installed on any PACS workstation to allow users to add new cases at any time and anywhere during clinical operations. Several images collections were created with this tool, including thoracic imaging that was subsequently made available on a CD-Rom and on our web site and through the MIRC network for public access.

  8. A Model-Based Expert System for Space Power Distribution Diagnostics

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-01-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
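
    The constraint-suspension idea can be illustrated compactly: treat each component as a constraint relating its inputs and output, and nominate as a diagnosis candidate any component whose suspension restores consistency between the model and the observations. The two-adder example below is invented, and for brevity every intermediate value is observed directly; a full implementation would infer unobserved values through the remaining constraints.

      # Minimal constraint suspension over a toy two-adder circuit.

      def consistent(observations, suspended, components):
          """Check model/observation agreement with one constraint suspended."""
          for name, (fn, ins, out) in components.items():
              if name == suspended:
                  continue
              if fn(*[observations[i] for i in ins]) != observations[out]:
                  return False
          return True

      components = {
          "add1": (lambda a, b: a + b, ("a", "b"), "x"),
          "add2": (lambda x, c: x + c, ("x", "c"), "y"),
      }
      # y should be 8 if everything works, but we measured 9.
      obs = {"a": 1, "b": 2, "x": 3, "c": 5, "y": 9}
      print([n for n in components if consistent(obs, n, components)])
      # -> ['add2']: suspending add2 restores consistency, so add2 is suspect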

  9. Radiation biodosimetry: Applications for spaceflight

    NASA Astrophysics Data System (ADS)

    Blakely, W. F.; Miller, A. C.; Grace, M. B.; McLeland, C. B.; Luo, L.; Muderhwa, J. M.; Miner, V. L.; Prasanna, P. G. S.

    The multiparametric dosimetry system that we are developing for medical radiological defense applications could be adapted for spaceflight environments. The system complements the internationally accepted personnel dosimeters and cytogenetic analysis of chromosome aberrations, considered the best means of documenting radiation doses for health records. Our system consists of a portable hematology analyzer, molecular biodosimetry using nucleic acid and antigen-based diagnostic equipment, and a dose assessment management software application. A dry-capillary tube reagent-based centrifuge blood cell counter (QBC Autoread Plus, Becton Dickinson Bioscience) measures peripheral blood lymphocytes and monocytes, which could determine radiation dose based on the kinetics of blood cell depletion. Molecular biomarkers for ionizing radiation exposure (gene expression changes, blood proteins) can be measured in real time using such diagnostic detection technologies as miniaturized nucleic acid sequences and antigen-based biosensors, but they require validation of dose-dependent targets and development of optimized protocols and analysis systems. The Biodosimetry Assessment Tool, a software application, calculates radiation dose based on a patient's physical signs and symptoms and blood cell count analysis. It also annotates location of personnel dosimeters, displays a summary of a patient's dosimetric information to healthcare professionals, and archives the data for further use. These radiation assessment diagnostic technologies can have dual-use applications supporting general medical-related care.

  10. Educational Software Applied in Teaching Electrocardiogram: A Systematic Review

    PubMed Central

    Chaves, Rafael O.; de Souza, Érica F.; Seruffo, Marcos C. R.; Francês, Carlos R. L.

    2018-01-01

    Background The electrocardiogram (ECG) is the most used diagnostic tool in medicine; in this sense, it is essential that medical undergraduates learn how to interpret it correctly while they are still in training. Naturally, they go through classic learning (e.g., lectures and speeches). However, they are not often efficiently trained in analyzing ECG results. In this regard, complementary educational support tools, such as educational software, should be considered a valuable approach for medical training purposes. Methods We performed a literature review in six electronic databases, considering studies published before April 2017. The resulting set comprises 2,467 studies. From this collection, 12 studies were initially selected; we then carried out a snowballing process through the reference lists of these studies, identifying five further relevant studies, for a total of 17 articles that passed all stages and criteria. Results The results show that 52.9% of the software tools were tutorials and 58.8% were designed to be run locally on a computer. The subjects most discussed were the teaching of electrophysiology and/or cardiac physiology and the identification of ECG patterns and/or arrhythmias. Conclusions We found positive results with the introduction of educational software for ECG teaching. However, there is a clear need for using higher quality research methodologies and the inclusion of appropriate controls, in order to obtain more precise conclusions about how beneficial the inclusion of such tools can be for the practices of ECG interpretation. PMID:29736398

  11. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  12. Rapid bacterial diagnostics via surface enhanced Raman microscopy.

    PubMed

    Premasiri, W R; Sauer-Budge, A F; Lee, J C; Klapperich, C M; Ziegler, L D

    2012-06-01

    There is a continuing need to develop new techniques for the rapid and specific identification of bacterial pathogens in human body fluids, especially given the increasing prevalence of drug-resistant strains. Efforts to develop a surface-enhanced Raman spectroscopy (SERS) based approach, encompassing sample preparation, SERS substrates, portable Raman microscopy instrumentation and novel identification software, are described. The progress made in each of these areas in our laboratory is summarized and illustrated with a spiked infectious sample for urinary tract infection (UTI) diagnostics. SERS bacterial spectra exhibit both enhanced sensitivity and specificity, allowing the development of an easy-to-use, portable, optical platform for pathogen detection and identification. SERS of bacterial cells is shown to offer not only reproducible molecular spectroscopic signatures for analytical applications in clinical diagnostics, but also a new tool for studying biochemical activity in real time at the outer layers of these organisms.

  13. Expert diagnostics system as a part of analysis software for power mission operations

    NASA Technical Reports Server (NTRS)

    Harris, Jennifer A.; Bahrami, Khosrow A.

    1993-01-01

    The operation of interplanetary spacecraft at JPL has become an increasingly complex activity. This complexity is due to advanced spacecraft designs and ambitious mission objectives which lead to operations requirements that are more demanding than those of any previous mission. For this reason, several productivity enhancement measures are underway at JPL within mission operations, particularly in the spacecraft analysis area. These measures aimed at spacecraft analysis include: the development of a multi-mission, multi-subsystem operations environment; the introduction of automated tools into this environment; and the development of an expert diagnostics system. This paper discusses an effort to integrate the above mentioned productivity enhancement measures. A prototype was developed that integrates an expert diagnostics system into a multi-mission, multi-subsystem operations environment using the Galileo Power / Pyro Subsystem as a testbed. This prototype will be discussed in addition to background information associated with it.

  14. DMS augmented monitoring and diagnosis application (DMS AMDA) prototype

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Boyd, Mark A.; Iverson, David L.; Donnell, Brian; Lauritsen, Janet; Doubek, Sharon; Gibson, Jim; Monahan, Christine; Rosenthal, Donald A.

    1993-01-01

    The Data Management System Augmented Monitoring and Diagnosis Application (DMS AMDA) is currently under development at NASA Ames Research Center (ARC). It will provide automated monitoring and diagnosis capabilities for the Space Station Freedom (SSF) Data Management System (DMS) in the Control Center Complex (CCC) at NASA Johnson Space Center. Several advanced automation applications are under development for use in the CCC for other SSF subsystems. The DMS AMDA, however, is the first application to utilize digraph failure analysis techniques and the Extended Realtime FEAT (ERF) application as the core of its diagnostic system design, since the other projects were begun before the digraph tools were available. Model-based diagnosis and expert systems techniques will provide additional capabilities and augment ERF where appropriate. Utilizing system knowledge captured in digraphs during the design phase should yield both cost savings and a technical advantage during implementation of the diagnostic software. This paper addresses both the programmatic and technical considerations of this approach, and describes the software design and initial prototyping effort.

  15. Advanced Monitoring to Improve Combustion Turbine/Combined Cycle Reliability, Availability & Maintainability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leonard Angello

    2005-09-30

    Power generators are concerned with the maintenance costs associated with the advanced turbines that they are purchasing. Since these machines do not have fully established Operation and Maintenance (O&M) track records, power generators face financial risk due to uncertain future maintenance costs. This risk is of particular concern, as the electricity industry transitions to a competitive business environment in which unexpected O&M costs cannot be passed through to consumers. These concerns have accelerated the need for intelligent software-based diagnostic systems that can monitor the health of a combustion turbine in real time and provide valuable information on the machine's performance to its owner/operators. EPRI, Impact Technologies, Boyce Engineering, and Progress Energy have teamed to develop a suite of intelligent software tools integrated with a diagnostic monitoring platform that, in real time, interpret data to assess the 'total health' of combustion turbines. The 'Combustion Turbine Health Management System' (CTHMS) will consist of a series of 'Dynamic Link Library' (DLL) programs residing on a diagnostic monitoring platform that accepts turbine health data from existing monitoring instrumentation. CTHMS interprets sensor and instrument outputs, correlates them to a machine's condition, provides interpretive analyses, projects servicing intervals, and estimates remaining component life. In addition, the CTHMS enables real-time anomaly detection and diagnostics of performance and mechanical faults, enabling power producers to more accurately predict critical component remaining useful life and turbine degradation.

  16. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  17. Performance of a Method to Standardize Breast Ultrasound Interpretation Using Image Processing and Case-Based Reasoning

    NASA Astrophysics Data System (ADS)

    André, M. P.; Galperin, M.; Berry, A.; Ojeda-Fournier, H.; O'Boyle, M.; Olson, L.; Comstock, C.; Taylor, A.; Ledgerwood, M.

    Our computer-aided diagnostic (CADx) tool uses advanced image processing and artificial intelligence to analyze findings on breast sonography images. The goal is to standardize reporting of such findings using well-defined descriptors and to improve accuracy and reproducibility of interpretation of breast ultrasound by radiologists. This study examined several factors that may impact accuracy and reproducibility of the CADx software, which proved to be highly accurate and stable over several operating conditions.

  18. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because automation installations are still few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  19. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    PubMed

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining high quality patient care promoted by professional societies in radiology and accreditation organizations such as the American College of Radiology (ACR) and Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
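
    The aggregate at the heart of such a dashboard is simple; a sketch using an in-memory SQLite stand-in for the MySQL backend described above is given below. The table and column names are assumptions for illustration, not the actual schema.

      # Complication rate and diagnostic yield per operator from a procedure log.
      import sqlite3  # stand-in for the MySQL backend described in the abstract

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE procedure_log (
              operator TEXT, modality TEXT,      -- 'US' or 'CT'
              complication INTEGER,              -- 1 if any complication occurred
              diagnostic INTEGER                 -- 1 if the specimen was diagnostic
          );
          INSERT INTO procedure_log VALUES
              ('smith', 'US', 0, 1), ('smith', 'CT', 1, 1),
              ('jones', 'US', 0, 0), ('jones', 'US', 0, 1);
      """)
      for row in conn.execute("""
          SELECT operator,
                 COUNT(*)                  AS n,
                 100.0 * AVG(complication) AS complication_rate_pct,
                 100.0 * AVG(diagnostic)   AS diagnostic_yield_pct
          FROM procedure_log
          GROUP BY operator
      """):
          print(row)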

  20. Acoustic Emission Analysis Applet (AEAA) Software

    NASA Technical Reports Server (NTRS)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  1. tsiR: An R package for time-series Susceptible-Infected-Recovered models of epidemics.

    PubMed

    Becker, Alexander D; Grenfell, Bryan T

    2017-01-01

    tsiR is an open source software package implemented in the R programming language designed to analyze infectious disease time-series data. The software extends a well-studied and widely-applied algorithm, the time-series Susceptible-Infected-Recovered (TSIR) model, to infer parameters from incidence data, such as contact seasonality, and to forward simulate the underlying mechanistic model. The tsiR package aggregates a number of different fitting features previously described in the literature in a user-friendly way, providing support for their broader adoption in infectious disease research. Also included in tsiR are a number of diagnostic tools to assess the fit of the TSIR model. This package should be useful for researchers analyzing incidence data for fully-immunizing infectious diseases.
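
    For readers unfamiliar with the underlying mechanistic model, a minimal forward simulation of the TSIR update, E[I(t+1)] = beta_t * S_t * I_t^alpha / N with susceptible reconstruction S(t+1) = S_t + B_t - I(t+1), is sketched below in Python (the package itself is in R). The seasonal transmission values, birth rate, and population sizes are invented for illustration.

      # Biweekly TSIR forward simulation with an invented seasonal beta.
      import numpy as np

      def simulate_tsir(beta, S0=50_000, I0=10, N=500_000, alpha=0.97,
                        births=120, steps=260, seed=1):
          rng = np.random.default_rng(seed)
          S, I, series = S0, I0, []
          for t in range(steps):
              lam = beta[t % len(beta)] * S * I ** alpha / N  # expected incidence
              I = min(rng.poisson(lam), S)    # new cases, bounded by susceptibles
              S = S + births - I              # susceptible reconstruction
              series.append(I)
          return np.array(series)

      seasonal_beta = 25 + 10 * np.sin(np.linspace(0, 2 * np.pi, 26))  # 26 biweeks
      cases = simulate_tsir(seasonal_beta)
      print(cases[:10], cases.max())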

  2. PPM Receiver Implemented in Software

    NASA Technical Reports Server (NTRS)

    Gray, Andrew; Kang, Edward; Lay, Norman; Vilnrotter, Victor; Srinivasan, Meera; Lee, Clement

    2010-01-01

    A computer program has been written as a tool for developing optical pulse-position- modulation (PPM) receivers in which photodetector outputs are fed to analog-to-digital converters (ADCs) and all subsequent signal processing is performed digitally. The program can be used, for example, to simulate an all-digital version of the PPM receiver described in Parallel Processing of Broad-Band PPM Signals (NPO-40711), which appears elsewhere in this issue of NASA Tech Briefs. The program can also be translated into a design for digital PPM receiver hardware. The most notable innovation embodied in the software and the underlying PPM-reception concept is a digital processing subsystem that performs synchronization of PPM time slots, even though the digital processing is, itself, asynchronous in the sense that no attempt is made to synchronize it with the incoming optical signal a priori and there is no feedback to analog signal processing subsystems or ADCs. Functions performed by the software receiver include time-slot synchronization, symbol synchronization, coding preprocessing, and diagnostic functions. The program is written in the MATLAB and Simulink software system. The software receiver is highly parameterized and, hence, programmable: for example, slot- and symbol-synchronization filters have programmable bandwidths.
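
    The slot-synchronization idea, aligning slot boundaries purely in post-processing with no feedback to the ADCs, can be illustrated with a toy search over candidate sample offsets that keeps the offset concentrating the most energy into single slots. The signal model and peakiness metric below are invented for illustration and are not the NASA receiver's actual algorithm.

      # Toy asynchronous PPM slot synchronization by offset search.
      import numpy as np

      SLOT, SLOTS_PER_SYMBOL, SYMBOLS = 8, 4, 200   # samples/slot, slots/symbol
      rng = np.random.default_rng(0)

      # Received waveform: one pulsed slot per symbol plus noise, with an
      # unknown sample offset the receiver must recover.
      true_offset = 5
      x = rng.normal(0, 0.3, SLOT * SLOTS_PER_SYMBOL * SYMBOLS + true_offset)
      for s in range(SYMBOLS):
          slot = rng.integers(SLOTS_PER_SYMBOL)
          start = true_offset + (s * SLOTS_PER_SYMBOL + slot) * SLOT
          x[start:start + SLOT] += 1.0

      def slot_metric(x, offset):
          n = (len(x) - offset) // SLOT * SLOT
          slots = x[offset:offset + n].reshape(-1, SLOT).sum(axis=1)
          m = len(slots) // SLOTS_PER_SYMBOL * SLOTS_PER_SYMBOL
          symbols = slots[:m].reshape(-1, SLOTS_PER_SYMBOL)
          return (symbols.max(axis=1) - symbols.mean(axis=1)).sum()  # peakiness

      print(max(range(SLOT), key=lambda o: slot_metric(x, o)))  # -> 5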

  3. Receiver operating characteristic (ROC) curves: review of methods with applications in diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Obuchowski, Nancy A.; Bullen, Jennifer A.

    2018-04-01

    Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
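
    An empirical ROC curve and its area can be computed in a few lines, which is essentially what the reviewed software automates (alongside study design, comparison tests, and bias corrections). The labels and scores below are made up, and the sketch assumes distinct score values; tied scores require grouped thresholds.

      # Empirical ROC points and AUC by sweeping the decision threshold.
      import numpy as np

      def roc_points(labels, scores):
          """Return (fpr, tpr) arrays, one point per threshold."""
          order = np.argsort(-np.asarray(scores))
          labels = np.asarray(labels)[order]
          tpr = np.cumsum(labels) / labels.sum()
          fpr = np.cumsum(1 - labels) / (1 - labels).sum()
          return np.r_[0, fpr], np.r_[0, tpr]

      labels = [1, 1, 0, 1, 0, 0, 1, 0]
      scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.2]
      fpr, tpr = roc_points(labels, scores)
      auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))  # trapezoid rule
      print(auc)  # -> 0.75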

  4. A free software for the calculation of T2* values for iron overload assessment.

    PubMed

    Fernandes, Juliano Lara; Fioravante, Luciana Andrea Barozi; Verissimo, Monica P; Loggetto, Sandra R

    2017-06-01

    Background Iron overload assessment with magnetic resonance imaging (MRI) using T2* has become a key diagnostic method in the management of many diseases. Quantitative analysis of the MRI images with a cost-effective tool has been a limitation to increased use of the method. Purpose To provide a free software solution for this purpose, comparing the results with a commercial solution. Material and Methods The free tool was developed as a standalone program to be directly downloaded and run on a common personal computer platform without the need for a dedicated workstation. Liver and cardiac T2* values were calculated using both tools and the values obtained compared between them in a group of 56 patients with suspected iron overload using Bland-Altman plots and concordance correlation coefficients (CCC). Results In the heart, the mean T2* difference between the two methods was 0.46 ms (95% confidence interval [CI], -0.037 to 0.965) and in the liver 0.49 ms (95% CI, 0.257-0.722). The CCCs for the heart and the liver were high (0.98 [95% CI, 0.966-0.988] with a Pearson ρ of 0.9811, and 0.991 [95% CI, 0.986-0.994] with a Pearson ρ of 0.996, respectively). No significant differences were observed when analyzing only patients with abnormal concentrations of iron in both organs compared to the whole cohort. Conclusion The proposed free software tool is accurate for calculation of T2* values of the liver and heart and might be a solution for centers that cannot use paid commercial solutions.
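
    The underlying computation in tools of this kind is a mono-exponential fit of signal against echo time, S(TE) = S0 * exp(-TE / T2*). A minimal sketch with synthetic data follows (requires NumPy and SciPy); real pipelines may add offset or truncation models for severe iron overload.

      # Mono-exponential T2* estimation from multi-echo signal intensities.
      import numpy as np
      from scipy.optimize import curve_fit

      def decay(te, s0, t2star):
          return s0 * np.exp(-te / t2star)

      te = np.array([1.0, 2.5, 4.0, 6.0, 9.0, 12.0, 15.0])   # echo times, ms
      signal = decay(te, 1000.0, 8.0) + np.random.default_rng(0).normal(0, 5, te.size)

      (s0_fit, t2star_fit), _ = curve_fit(decay, te, signal, p0=(signal[0], 10.0))
      print(f"T2* = {t2star_fit:.2f} ms")  # ~8 ms, abnormal for the heart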

  5. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  6. Model-based reasoning for power system management using KATE and the SSM/PMAD

    NASA Technical Reports Server (NTRS)

    Morris, Robert A.; Gonzalez, Avelino J.; Carreira, Daniel J.; Mckenzie, F. D.; Gann, Brian

    1993-01-01

    The overall goal of this research effort has been the development of a software system which automates tasks related to monitoring and controlling electrical power distribution in spacecraft electrical power systems. The resulting software system is called the Intelligent Power Controller (IPC). The specific tasks performed by the IPC include continuous monitoring of the flow of power from a source to a set of loads, fast detection of anomalous behavior indicating a fault in one of the components of the distribution system, generation of a diagnosis (explanation) of anomalous behavior, isolation of the faulty object from the remainder of the system, and maintenance of the flow of power to critical loads and systems (e.g., life support) despite the presence of fault conditions (recovery). The IPC system has evolved out of KATE (Knowledge-based Autonomous Test Engineer), developed at NASA-KSC. KATE consists of a set of software tools for developing and applying structure and behavior models to monitoring, diagnostic, and control applications.

  7. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system, essentially an evaluation of how observable the system's behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution spanning all phases of the system life cycle, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
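
    The abstraction behind such testability analyses is a dependency (D-) matrix relating faults to the tests that can observe them; detection coverage and isolation ambiguity fall out directly. The sketch below uses an invented matrix for illustration and is not TEAMS-Designer's actual algorithm.

      # Toy testability metrics from a fault-to-test dependency matrix.
      from collections import defaultdict

      D = {                 # fault -> set of tests that detect it
          "pump_fail":   {"t_pressure", "t_flow"},
          "valve_stuck": {"t_flow"},
          "sensor_bias": {"t_flow"},
          "leak":        set(),          # undetectable with current sensors
      }

      detected = [f for f, tests in D.items() if tests]
      print(f"fault detection coverage: {len(detected) / len(D):.0%}")  # 75%

      # Faults with identical test signatures cannot be isolated from one
      # another; they form ambiguity groups.
      groups = defaultdict(list)
      for fault, tests in D.items():
          groups[frozenset(tests)].append(fault)
      for faults in groups.values():
          if len(faults) > 1:
              print("ambiguity group (same signature):", faults)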

  8. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls using different imaging diagnostic modalities. Medical information systems are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis of this project is based on skeletal X-ray imaging. It involves a computerised image method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The database is implemented with ASP.NET 3.5 and C# technologies for our Web-based application. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins were necessary, and access to patient data was logged for auditing. For security, all data transmissions were carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report form) and images and allows each user to know precisely what his task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis that enable us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.

  9. Evidence-based development of a diagnosis-dependent therapy planning system and its implementation in modern diagnostic software.

    PubMed

    Ahlers, M O; Jakstat, H A

    2005-07-01

    The prerequisite for structured individual therapy of craniomandibular dysfunctions is differential diagnostics. Suggestions for the structured recording of findings and their structured evaluation beyond the global diagnosis of "craniomandibular disorders" have been published. Only this structured approach enables computerization of the diagnostic process. The respective software is available for use in practice (CMDcheck for CMD screening, CMDfact for the differential diagnostics). Based on these structured diagnostics, knowledge-based therapy planning is also conceivable. The prerequisite for this would be a model of achieving consensus on the indicated forms of therapy related to the diagnosis. Therefore, a procedure for evidence-based achievement of consensus on suitable forms of therapy in CMD was developed first in multicentric cooperation, and then implemented in corresponding software. The clinical knowledge of experienced specialists was consciously included in the consensus achievement process. At the same time, anonymized mathematical statistical evaluations were used for control and objectification. Different examiners from different departments of several universities, working independently of one another, assigned the theoretically conceivable therapeutic alternatives to the already published diagnostic scheme. After anonymization, the correlation of these assignments was then calculated mathematically. For those cases in which no agreement initially existed, consensus was subsequently arrived at in the course of a consensus conference on the basis of literature evaluations and the discussion of clinical case examples. This consensus in turn served as the basis of a therapy planner implemented in the above-mentioned diagnostic software CMDfact. Contributing to quality assurance, the principles of programming this assistant as well as the interface for linking into the diagnostic software are documented and also published here.

  10. A Bayesian approach to the characterization of electroencephalographic recordings in premature infants

    NASA Astrophysics Data System (ADS)

    Mitchell, Timothy J.

    Preterm infants are particularly susceptible to cerebral injury, and electroencephalographic (EEG) recordings provide an important diagnostic tool for determining cerebral health. However, interpreting these EEG recordings is challenging and requires the skills of a trained electroencephalographer. Because these EEG specialists are rare, an automated interpretation of newborn EEG recordings would increase access to an important diagnostic tool for physicians. To automate this procedure, we employ a novel Bayesian approach to compute the probability of EEG features (waveforms) including suppression, delta brushes, and delta waves. The power of this approach lies not only in its ability to closely mimic the techniques used by EEG specialists, but also in its ability to be generalized to identify other waveforms that may be of interest for future work. The results of these calculations are used in a program designed to output simple statistics related to the presence or absence of such features. Direct comparison of the software with expert human readers has indicated satisfactory performance, and the algorithm has shown promise in its ability to distinguish between infants with normal neurodevelopmental outcome and those with poor neurodevelopmental outcome.
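
    A toy sketch of the Bayesian step described above: computing the posterior probability that an EEG segment contains a given waveform from one extracted feature. The Gaussian feature model, its parameters, and the prior are invented for illustration; the actual likelihood models in the work are far richer.

        import math

        def gaussian(x, mu, sigma):
            """Gaussian likelihood of a scalar feature value."""
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

        def posterior_present(x, prior=0.2):
            # Assumed feature distributions under "waveform present" vs "absent".
            like_present = gaussian(x, mu=4.0, sigma=1.0)
            like_absent = gaussian(x, mu=1.0, sigma=1.0)
            evidence = like_present * prior + like_absent * (1 - prior)
            return like_present * prior / evidence    # Bayes' rule

        print(posterior_present(3.5))   # probability the waveform is present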

  11. Computer-aided dermoscopy for diagnosis of melanoma

    PubMed Central

    Barzegari, Masoomeh; Ghaninezhad, Haiedeh; Mansoori, Parisa; Taheri, Arash; Naraghi, Zahra S; Asgari, Masood

    2005-01-01

    Background Computer-aided dermoscopy using artificial neural networks has been reported to be an accurate tool for the evaluation of pigmented skin lesions. We set out to determine the sensitivity and specificity of a computer-aided dermoscopy system for diagnosis of melanoma in Iranian patients. Methods We studied 122 pigmented skin lesions which were referred for diagnostic evaluation or cosmetic reasons. Each lesion was examined by two clinicians with the naked eye, and all of their clinical diagnostic considerations were recorded. The lesions were analyzed using a microDERM® dermoscopy unit. The output value of the software for each lesion was a score between 0 and 10. All of the lesions were excised and examined histologically. Results Histopathological examination revealed melanoma in six lesions. Considering only the most likely clinical diagnosis, sensitivity and specificity of clinical examination for diagnosis of melanoma were 83% and 96%, respectively. Considering all clinical diagnostic considerations, the sensitivity and specificity were 100% and 89%. Choosing a cut-off point of 7.88 for the dermoscopy score, the sensitivity and specificity of the score for diagnosis of melanoma were 83% and 96%, respectively. Setting the cut-off point at 7.34, the sensitivity and specificity were 100% and 90%. Conclusion The diagnostic accuracy of the dermoscopy system was at the level of clinical examination by dermatologists with the naked eye. This system may represent a useful tool for screening of melanoma, particularly at centers not experienced in the field of pigmented skin lesions. PMID:16000171
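
    A small sketch of how a score cut-off trades sensitivity against specificity, as with the two cut-offs reported above. The scores and histology labels below are invented; the study's actual microDERM scores are not reproduced here.

        # labels: 1 = melanoma on histology, 0 = benign; scores on the 0-10 scale
        scores = [8.1, 7.5, 9.0, 6.9, 3.2, 5.0, 7.9, 2.1]   # invented
        labels = [1,   1,   1,   1,   0,   0,   0,   0]     # invented

        def sens_spec(cutoff):
            tp = sum(s >= cutoff and y == 1 for s, y in zip(scores, labels))
            fn = sum(s < cutoff and y == 1 for s, y in zip(scores, labels))
            tn = sum(s < cutoff and y == 0 for s, y in zip(scores, labels))
            fp = sum(s >= cutoff and y == 0 for s, y in zip(scores, labels))
            return tp / (tp + fn), tn / (tn + fp)

        # Lowering the cut-off raises sensitivity at some cost in specificity.
        for cutoff in (7.88, 7.34):
            print(cutoff, sens_spec(cutoff))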

  12. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  13. RIEMS: a software pipeline for sensitive and comprehensive taxonomic classification of reads from metagenomics datasets.

    PubMed

    Scheuch, Matthias; Höper, Dirk; Beer, Martin

    2015-03-03

    Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome this bottleneck, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS taxonomically assigns every individual read sequence within a dataset by cascading different sequence analyses of decreasing assignment stringency, using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011, when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.
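
    A schematic sketch of the cascading strategy just described: the most stringent analysis runs first, and only reads it leaves unassigned are passed to less stringent stages. The stage names and their trivial bodies are placeholders, not the actual RIEMS pipeline.

        # Stages ordered by decreasing stringency; each returns a taxon or None.
        def exact_nucleotide_match(read):   return None              # placeholder: stringent aligner
        def relaxed_nucleotide_match(read): return None              # placeholder: permissive search
        def protein_level_match(read):      return "Bunyaviridae"    # placeholder: last resort

        STAGES = [exact_nucleotide_match, relaxed_nucleotide_match, protein_level_match]

        def classify(read):
            for stage in STAGES:
                taxon = stage(read)
                if taxon is not None:        # stop at the first (most stringent) hit
                    return taxon, stage.__name__
            return "unassigned", None

        print(classify("ACGTACGT"))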

  14. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about tools used to build and manage software at the source code level on the Peregrine system. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  15. ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.

    PubMed

    Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie

    2018-03-01

    ATAC-seq (Assays for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has a higher signal-to-noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html.

  16. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert

    1985-01-01

    The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.

  17. Debugging and Performance Analysis Software Tools for Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about debugging and performance analysis software tools available to use with the Peregrine system, including Allinea.

  18. AirLab: a cloud-based platform to manage and share antibody-based single-cell research.

    PubMed

    Catena, Raúl; Özcan, Alaz; Jacobs, Andrea; Chevrier, Stephane; Bodenmiller, Bernd

    2016-06-29

    Single-cell analysis technologies are essential tools in research and clinical diagnostics. These methods include flow cytometry, mass cytometry, and other microfluidics-based technologies. Most laboratories that employ these methods maintain large repositories of antibodies. These ever-growing collections of antibodies, their multiple conjugates, and the large amounts of data generated in assays using specific antibodies and conditions make a dedicated software solution necessary. We have developed AirLab, a cloud-based tool with web and mobile interfaces, for the organization of these data. AirLab streamlines the processes of antibody purchase, organization, and storage, antibody panel creation, results logging, and antibody validation data sharing and distribution. Furthermore, AirLab enables inventory of other laboratory stocks, such as primers or clinical samples, through user-controlled customization. Thus, AirLab is a mobile-powered and flexible tool that harnesses the capabilities of mobile tools and cloud-based technology to facilitate inventory and sharing of antibody and sample collections and associated validation data.

  19. Development of a comprehensive software engineering environment

    NASA Technical Reports Server (NTRS)

    Hartrum, Thomas C.; Lamont, Gary B.

    1987-01-01

    The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial global approach was developed starting in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.

  20. Developing and Testing of a Software Prototype to Support Diagnostic Reasoning of Nursing Students.

    PubMed

    de Sousa, Vanessa Emille Carvalho; de Oliveira Lopes, Marcos Venícios; Keenan, Gail M; Lopez, Karen Dunn

    2018-04-01

    To design and test educational software to improve nursing students' diagnostic reasoning through NANDA-I-based clinical scenarios. A mixed method approach was used and included content validation by a panel of 13 experts and prototype testing by a sample of 56 students. Experts' suggestions included writing adjustments, new response options, and replacement of clinical information in the scenarios. Percentages of students' correct answers were 65.7%, 62.2%, and 60.5% for related factors, defining characteristics, and nursing diagnoses, respectively. Full development of this software shows strong potential for enhancing students' diagnostic reasoning. New graduates may be able to apply diagnostic reasoning more rapidly by exercising their diagnostic skills within this software. © 2016 NANDA International, Inc.

  1. Diagnostic performance of confocal laser endomicroscopy for optical diagnosis of gastric intestinal metaplasia: a meta-analysis.

    PubMed

    He, Xing-Kang; Liu, Dan; Sun, Lei-Min

    2016-09-05

    Gastric intestinal metaplasia (IM) is generally considered a precancerous condition and a related risk factor for intestinal-type gastric cancer. However, an accurate endoscopic diagnosis of IM is a clinical challenge. Confocal Laser Endomicroscopy (CLE) is a newly developed technique that can provide real-time magnified images and visualize tissues at cellular or subcellular levels. The aim of this study is to clarify the diagnostic value of CLE in the detection of IM in patients at high risk of gastric cancer. Systematic literature searches up to April 2015 in the PubMed, Embase, Web of Science and Cochrane Library databases were conducted by two reviewers independently. The Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool was applied to assess study quality and to reduce potential bias. A meta-analysis using Meta-DiSc (version 1.4) and STATA software (version 13) was performed. A total of four studies, enrolling 218 patients and 579 lesions, were included in this meta-analysis. On a per-lesion basis, the pooled sensitivity and specificity of CLE were 0.97 (95 % confidence interval (CI) = 0.94-0.98) and 0.94 (95 % CI = 0.91-0.97), respectively. The pooled positive likelihood ratio (PLR) and negative likelihood ratio (NLR) were 15.20 (95 % CI = 9.46-24.41) and 0.04 (95 % CI = 0.02-0.07), respectively. The pooled diagnostic odds ratio (DOR) was 479.59 (95 % CI = 205.64-1118.51), and the area under the summary receiver operating characteristic (SROC) curve was 0.9884. There was no statistically significant publication bias. CLE is a promising endoscopic tool for the detection of IM, with relatively high diagnostic value in patients at high risk of gastric cancer.
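
    A small sketch of the per-lesion metrics reported above (sensitivity, specificity, likelihood ratios, DOR), computed from one hypothetical 2x2 table; the meta-analysis itself pools such tables across studies with Meta-DiSc and STATA.

        # Hypothetical per-lesion counts: CLE result vs. histology truth (invented).
        tp, fp, fn, tn = 180, 20, 6, 300

        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        plr = sens / (1 - spec)             # positive likelihood ratio
        nlr = (1 - sens) / spec             # negative likelihood ratio
        dor = plr / nlr                     # diagnostic odds ratio = (tp*tn)/(fp*fn)

        print(f"sens={sens:.2f} spec={spec:.2f} PLR={plr:.1f} NLR={nlr:.3f} DOR={dor:.0f}")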

  2. Digital diagnosis of medical images

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Kuismin, Raimo; Jormalainen, Raimo; Dastidar, Prasun; Frey, Harry; Eskola, Hannu

    2001-08-01

    The popularity of digital imaging devices and PACS installations has increased during the last years. Still, images are analyzed and diagnosed using conventional techniques. Our research group began to study the requirements for digital image diagnostic methods to be applied together with PACS systems. The research was focused on various image analysis procedures (e.g., segmentation, volumetry, 3D visualization, image fusion, anatomic atlas, etc.) that could be useful in medical diagnosis. We have developed Image Analysis software (www.medimag.net) to enable several image-processing applications in medical diagnosis, such as volumetry, multimodal visualization, and 3D visualization. We have also developed a commercial scalable image archive system (ActaServer, supports DICOM) based on component technology (www.acta.fi), and several telemedicine applications. All the software and systems operate in an NT environment and are in clinical use in several hospitals. The analysis software has been applied in clinical work and utilized in numerous patient cases (500 patients). This method has been used in the diagnosis, therapy and follow-up of various diseases of the central nervous system (CNS), respiratory system (RS) and human reproductive system (HRS). In many of these diseases, e.g. Systemic Lupus Erythematosus (CNS), nasal airway diseases (RS) and ovarian tumors (HRS), these methods have been used for the first time in clinical work. According to our results, digital diagnosis improves diagnostic capabilities, and together with PACS installations it will become a standard tool during the next decade by enabling more accurate diagnosis and patient follow-up.

  3. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  4. Evolving software reengineering technology for the emerging innovative-competitive era

    NASA Technical Reports Server (NTRS)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications, which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state-of-the-art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.

  5. Wireless networking for the dental office: current wireless standards and security protocols.

    PubMed

    Mupparapu, Muralidhar; Arora, Sarika

    2004-11-15

    Digital radiography has gained immense popularity in dentistry today in spite of the early difficulty for the profession to embrace the technology. The transition from film to digital has been happening at a faster pace in the fields of Orthodontics, Oral Surgery, Endodontics, Periodontics, and other specialties where the radiographic images (periapical, bitewing, panoramic, cephalometric, and skull radiographs) are being acquired digitally, stored within a server locally, and eventually accessed for diagnostic purposes, along with the rest of the patient data via the patient management software (PMS). A review of the literature shows the diagnostic performance of digital radiography is at least comparable to or even better than that of conventional radiography. Similarly, other digital diagnostic tools like caries detectors, cephalometric analysis software, and digital scanners were used for many years for diagnosis and treatment planning purposes. The introduction of wireless charge-coupled device (CCD) sensors in early 2004 (Schick Technologies, Long Island City, NY) has moved digital radiography a step further into the wireless era. As with any emerging technology, there are concerns that should be looked into before adapting to the wireless environment. Foremost is the network security involved in the installation and usage of these wireless networks. This article deals with the existing standards and choices in wireless technologies that are available for implementation within a contemporary dental office. The network security protocols that protect the patient data and boost the efficiency of modern day dental clinics are enumerated.

  6. Accurate analysis and visualization of cardiac (11)C-PIB uptake in amyloidosis with semiautomatic software.

    PubMed

    Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark

    2016-08-01

    (11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic methods for analysis of PET data are now available, but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of the (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78), and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polar maps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polar maps and histograms make visual interpretation fast and simple.

  7. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  8. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  9. Increasing rigor in NMR-based metabolomics through validated and open source tools

    PubMed Central

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2016-01-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760

  10. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    PubMed

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  11. Autonomous system for Web-based microarray image analysis.

    PubMed

    Bozinov, Daniel

    2003-12-01

    Software-based feature extraction from DNA microarray images still requires human intervention on various levels. Manual adjustment of grid and metagrid parameters, precise alignment of superimposed grid templates and gene spots, or simply identification of large-scale artifacts have to be performed beforehand to reliably analyze DNA signals and correctly quantify their expression values. Ideally, a Web-based system whose input is confined to a single microarray image and whose output is a data table containing measurements for all gene spots would directly transform raw image data into abstracted gene expression tables. Sophisticated algorithms with advanced procedures for iterative correction can overcome imminent challenges in image processing. Herein is introduced an integrated software system with a Java-based interface on the client side that allows for decentralized access and furthermore enables the scientist to instantly employ the most updated software version at any given time. This software tool extends PixClust, as used in Extractiff, incorporating Java Web Start deployment technology. Ultimately, this setup is destined for high-throughput pipelines in genome-wide medical diagnostics labs or microarray core facilities aimed at providing fully automated service to their users.

  12. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software.

    PubMed

    Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C; Picano, Eugenio

    2012-11-01

    Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risk was derived from the Biological Effects of Ionising Radiation VII Committee, 2006. With simple input functions (demographics, age, gender) the user selects from a predetermined menu variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (as extra lifetime cancer risk in percent), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
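
    A toy sketch of the kind of arithmetic the tool performs: summing effective doses (in mSv) over a patient's exposure history and expressing the total as chest X-ray equivalents. The per-exam doses and the 0.02 mSv chest X-ray figure are typical published reference values used here for illustration, not the tool's internal tables.

        # Illustrative reference effective doses in mSv (approximate literature values).
        REFERENCE_DOSE_MSV = {
            "chest_xray": 0.02,
            "chest_ct": 7.0,
            "coronary_angiography": 7.0,
            "cardiac_scintigraphy": 9.0,
        }

        def cumulative_dose(history):
            """history: list of exam names; returns (total mSv, chest X-ray equivalents)."""
            total = sum(REFERENCE_DOSE_MSV[exam] for exam in history)
            return total, total / REFERENCE_DOSE_MSV["chest_xray"]

        msv, cxr = cumulative_dose(["chest_ct", "cardiac_scintigraphy", "chest_xray"])
        print(f"{msv:.1f} mSv ~ {cxr:.0f} chest X-rays")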

  13. A Diagnostic System for Studying Energy Partitioning and Assessing the Response of the Ionosphere during HAARP Modification Experiments

    NASA Technical Reports Server (NTRS)

    Djuth, Frank T.; Elder, John H.; Williams, Kenneth L.

    1996-01-01

    This research program focused on the construction of several key radio wave diagnostics in support of the HF Active Auroral Ionospheric Research Program (HAARP). Project activities led to the design, development, and fabrication of a variety of hardware units and to the development of several menu-driven software packages for data acquisition and analysis. The principal instrumentation includes an HF (28 MHz) radar system, a VHF (50 MHz) radar system, and a high-speed radar processor consisting of three separable processing units. The processor system supports the HF and VHF radars and is capable of acquiring very detailed data with large incoherent scatter radars. In addition, a tunable HF receiver system having high dynamic range was developed primarily for measurements of stimulated electromagnetic emissions (SEE). A separate processor unit was constructed for the SEE receiver. Finally, a large amount of support instrumentation was developed to accommodate complex field experiments. Overall, the HAARP diagnostics are powerful tools for studying diverse ionospheric modification phenomena. They are also flexible enough to support a host of other missions beyond the scope of HAARP. Many new research programs have been initiated by applying the HAARP diagnostics to studies of natural atmospheric processes.

  14. New bone post-processing tools in forensic imaging: a multi-reader feasibility study to evaluate detection time and diagnostic accuracy in rib fracture assessment.

    PubMed

    Glemser, Philip A; Pfleiderer, Michael; Heger, Anna; Tremper, Jan; Krauskopf, Astrid; Schlemmer, Heinz-Peter; Yen, Kathrin; Simons, David

    2017-03-01

    The aim of this multi-reader feasibility study was to evaluate new post-processing CT imaging tools for rib fracture assessment in forensic cases by analyzing detection time and diagnostic accuracy. Thirty autopsy cases (20 with and 10 without rib fractures at autopsy) were randomly selected and included in this study. All cases received a native whole body CT scan prior to the autopsy procedure, which included dissection and careful evaluation of each rib. In addition to standard transverse sections (modality A), CT images were subjected to a reconstruction algorithm to compute axial labelling of the ribs (modality B) as well as "unfolding" visualizations of the rib cage (modality C, "eagle tool"). Three radiologists with different clinical and forensic experience, blinded to the autopsy results, evaluated all cases with the order of modality and case randomized. Each reader's rib fracture assessment was evaluated against autopsy and against a CT consensus read as the radiologic reference. A detailed evaluation of relevant test parameters revealed better accordance with the CT consensus read than with the autopsy. Modality C was the significantly quickest rib fracture detection modality, despite slightly reduced statistical test parameters compared to modalities A and B. Modern CT post-processing software is able to shorten reading time and to increase sensitivity and specificity compared to standard autopsy alone. The eagle tool is easy to use and suited for initial rib fracture screening prior to autopsy, and can therefore be beneficial for forensic pathologists.

  15. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types of information that are returned from the tools to the human user, and the forms in which these outputs are presented.

  16. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  17. 3D printing from diagnostic images: a radiologist's primer with an emphasis on musculoskeletal imaging-putting the 3D printing of pathology into the hands of every physician.

    PubMed

    Friedman, Tamir; Michalski, Mark; Goodman, T Rob; Brown, J Elliott

    2016-03-01

    Three-dimensional (3D) printing has recently erupted into the medical arena due to decreased costs and increased availability of printers and software tools. Due to lack of detailed information in the medical literature on the methods for 3D printing, we have reviewed the medical and engineering literature on the various methods for 3D printing and compiled them into a practical "how to" format, thereby enabling the novice to start 3D printing with very limited funds. We describe (1) background knowledge, (2) imaging parameters, (3) software, (4) hardware, (5) post-processing, and (6) financial aspects required to cost-effectively reproduce a patient's disease ex vivo so that the patient, engineer and surgeon may hold the anatomy and associated pathology in their hands.

  18. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  19. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  20. Foreign Object Damage Identification in Turbine Engines

    NASA Technical Reports Server (NTRS)

    Strack, William; Zhang, Desheng; Turso, James; Pavlik, William; Lopez, Isaac

    2005-01-01

    This report summarizes the collective work of a five-person team from different organizations examining the problem of detecting foreign object damage (FOD) events in turbofan engines from gas path thermodynamic and bearing accelerometer sensors, and determining the severity of damage to each component (diagnosis). Several detection and diagnostic approaches were investigated and a software tool (FODID) was developed to assist researchers in detecting and diagnosing FOD events. These approaches include (1) fan efficiency deviation computed from upstream and downstream temperature/pressure measurements, (2) gas path weighted least squares estimation of component health parameter deficiencies, (3) Kalman filter estimation of component health parameters, and (4) use of structural vibration signal processing to detect both large and small FOD events. The last three of these approaches require a significant amount of computation in conjunction with a physics-based analytic model of the underlying phenomenon: the NPSS thermodynamic cycle code for approaches 1 to 3 and the DyRoBeS reduced-order rotor dynamics code for approach 4. A potential application of the FODID software tool, in addition to its detection/diagnosis role, is using its sensitivity results to help identify the best types of sensors and their optimum locations within the gas path, and similarly for bearing accelerometers.
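
    A bare-bones sketch of the idea behind approach 3, Kalman-filter estimation of a component health parameter (e.g., an efficiency deficiency) from noisy gas path measurements. The scalar random-walk system and noise variances here are invented for illustration; the real estimator runs against the NPSS cycle model.

        import random

        # Scalar model: hidden health parameter x drifts slowly; we observe z = x + noise.
        q, r = 1e-5, 0.01          # assumed process and measurement noise variances
        x_est, p = 0.0, 1.0        # initial estimate and its variance
        true_x = -0.02             # hidden efficiency deficiency after a FOD event

        for _ in range(50):
            z = true_x + random.gauss(0.0, r ** 0.5)   # noisy measurement
            p += q                                      # predict step
            k = p / (p + r)                             # Kalman gain
            x_est += k * (z - x_est)                    # measurement update
            p *= (1 - k)

        print(f"estimated deficiency: {x_est:.3f}")     # converges toward true_x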

  1. Diagnostic Algorithm Benchmarking

    NASA Technical Reports Server (NTRS)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking of diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  2. Diagnostic workstation for digital hand atlas in bone age assessment

    NASA Astrophysics Data System (ADS)

    Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Ominsky, Steven

    1998-06-01

    Bone age assessment by radiological examination of a hand and wrist image is a procedure frequently performed in pediatric patients to evaluate growth disorders, determine growth potential in children and monitor therapy effects. The assessment method currently used in radiological diagnosis is based on atlas matching of the diagnosed hand image with a reference set of atlas patterns, which was developed in the 1950s and is not fully applicable to children of today. We intend to implement a diagnostic workstation for creating a new reference set of clinically normal images which will serve as a digital atlas and can be used for computer-assisted bone age assessment. In this paper, we present the initial data-collection and system setup phase of this five-year research program. We describe the system design, user interface implementation and software tool development for collection, visualization, management and processing of clinically normal hand and wrist images.

  3. The effect of general anesthesia versus intravenous sedation on diagnostic yield and success in electromagnetic navigation bronchoscopy.

    PubMed

    Bowling, Mark R; Kohan, Matthew W; Walker, Paul; Efird, Jimmy; Ben Or, Sharon

    2015-01-01

    Navigational bronchoscopy is utilized to guide biopsies of peripheral lung nodules and place fiducial markers for treatment of limited stage lung cancer with stereotactic body radiotherapy. The type of sedation used for this procedure remains controversial. We performed a retrospective chart review to evaluate differences in diagnostic yield and overall success of the procedure based on anesthesia type. Electromagnetic navigational bronchoscopy was performed using the superDimension software system. Once the targeted lesion was within reach, multiple tissue samples were obtained. Statistical analysis was used to correlate the yield with the type of sedation among other factors. A procedure was defined as successful if a diagnosis was made or a fiducial marker was adequately placed. Navigational bronchoscopy was performed on a total of 120 targeted lesions. The overall complication rate of the procedure was 4.1%. The diagnostic yield and success of the procedure were 74% and 87%, respectively. Duration of the procedure was the only significant difference between the general anesthesia and IV sedation groups (mean, 58 vs. 43 min, P=0.0005). A larger tumor size was associated with a higher diagnostic yield (P=0.032). No other variable had a statistically significant effect on diagnostic yield or procedure success. Navigational bronchoscopy is a safe and effective pulmonary diagnostic tool with a relatively low complication rate. The diagnostic yield and overall success of the procedure do not seem to be affected by the type of sedation used.

  4. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  5. Halitosis amongst students in tertiary institutions in Lagos state.

    PubMed

    Arinola, J E; Olukoju, O O

    2012-12-01

    Halitosis is defined as a noticeable unpleasant odor from the mouth. It is a medico-social problem that affects a significant number of people around the world. Research reveals that nearly 50% of the adult population has halitosis. To determine the level of awareness of halitosis and the prevalence of the condition amongst students in tertiary institutions as a baseline survey. For this project, 100 students from three tertiary institutions in Lagos state were chosen: University of Lagos, Lagos State University, Ojo campus and Yaba College of Technology. A semi-structured questionnaire and a practical testing/diagnostic tool were utilized. Data collected were collated and analyzed using Microsoft Excel 2007 and SPSS statistical software. Most of the respondents were single and Christian. The level of awareness of halitosis was high. Results showed that 15%, 2% and 22% from UNILAG, LASU and YCT, respectively, said they had halitosis. Using the diagnostic tool, 6%, 8% and 2%, respectively, were positive for halitosis. There is a high level of awareness of halitosis among the respondents. The prevalence of the disorder is low; however, it is recommended that enlightenment campaigns be mounted in schools to improve the level of awareness and treatment seeking.

  6. Virtual surgery in a (tele-)radiology framework.

    PubMed

    Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P

    1999-09-01

    This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.

  7. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools in the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime of the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
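
    A simplified numpy sketch of the aggregation idea behind the prototype diagnostic: averaging fine-grid cell values onto a user-defined coarse grid via a precomputed cell-to-coarse-cell mapping. ICON's icosahedral grid, area weighting, and MPI decomposition are far more involved; this only illustrates the reduction step.

        import numpy as np

        # Invented sizes: 12 fine cells mapped onto 3 coarse cells.
        fine_values = np.arange(12, dtype=float)      # one field on the fine grid
        coarse_index = np.repeat(np.arange(3), 4)     # fine cell -> coarse cell

        # Mean per coarse cell (uniform weights here for simplicity).
        sums = np.bincount(coarse_index, weights=fine_values, minlength=3)
        counts = np.bincount(coarse_index, minlength=3)
        coarse_values = sums / counts

        print(coarse_values)   # [1.5 5.5 9.5]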

  8. Radiation biodosimetry: applications for spaceflight

    NASA Astrophysics Data System (ADS)

    Blakely, W.; Miller, A.; Grace, M.; Prasanna, P.; Muderhwa, J.

    The multiparametric dosimetry system that we are developing for medical radiological defense applications could be adapted for spaceflight environments. The system complements the internationally accepted cytogenetic analysis of chromosome aberrations, considered the best means of documenting radiation doses for health records. Our system consists of a dose assessment software application, a portable blood cell counter, and molecular biodosimetry using miniaturized equipment. The Biodosimetry Assessment Tool (BAT) software application calculates radiation dose based on a patient's physical signs and symptoms and blood analysis, annotates the location of personnel dosimeters, displays a summary of a patient's dosimetric information to healthcare professionals, and archives the data for further use. The dry reagent centrifuge-based blood cell counter (QBC Autoread Plus, Becton Dickinson Biosciences) measures peripheral blood lymphocytes and monocytes, which could determine radiation dose based on the kinetics of blood cell depletion. Molecular biomarkers for ionizing radiation exposure (gene expression changes, blood proteins), once dose-dependent targets are identified, optimized, and validated, will make use of miniaturized diagnostic equipment for nucleic acid sequence and antigen-based biosensor detection technologies. These radiation assessment diagnostic technologies can have dual use for other medical related applications. [The Armed Forces Radiobiology Research Institute, under work unit AFRRI-01-3, and the Defense Threat Reduction Agency, under contract GG4661, supported this research.]

  9. Multi-loci diagnosis of acute lymphoblastic leukaemia with high-throughput sequencing and bioinformatics analysis.

    PubMed

    Ferret, Yann; Caillault, Aurélie; Sebda, Shéhérazade; Duez, Marc; Grardel, Nathalie; Duployez, Nicolas; Villenet, Céline; Figeac, Martin; Preudhomme, Claude; Salson, Mikaël; Giraud, Mathieu

    2016-05-01

    High-throughput sequencing (HTS) is considered a technical revolution that has improved our knowledge of lymphoid and autoimmune diseases, changing our approach to leukaemia both at diagnosis and during follow-up. As part of an immunoglobulin/T cell receptor-based minimal residual disease (MRD) assessment of acute lymphoblastic leukaemia patients, we assessed the performance and feasibility of the replacement of the first steps of the approach based on DNA isolation and Sanger sequencing, using an HTS protocol combined with bioinformatics analysis and visualization using the Vidjil software. We prospectively analysed the diagnostic and relapse samples of 34 paediatric patients, thus identifying 125 leukaemic clones with recombinations on multiple loci (TRG, TRD, IGH and IGK), including Dd2/Dd3 and Intron/KDE rearrangements. Sequencing failures were halved (14% vs. 34%, P = 0.0007), enabling more patients to be monitored. Furthermore, more markers per patient could be monitored, reducing the probability of false negative MRD results. The whole analysis, from sample receipt to clinical validation, was shorter than our current diagnostic protocol, with equal resources. V(D)J recombination was successfully assigned by the software, even for unusual recombinations. This study emphasizes the progress that HTS with adapted bioinformatics tools can bring to the diagnosis of leukaemia patients. © 2016 John Wiley & Sons Ltd.

  10. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to support diagnosis of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including the design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
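
    To make failure-effect propagation concrete, the sketch below evaluates a toy FFM as reachability over a directed graph; the component and sensor names are hypothetical, and production tools such as TEAMS Designer use far richer representations:

        from collections import deque

        # Toy functional fault model: nodes are failure-effect locations,
        # directed edges are failure-effect propagation paths (hypothetical).
        edges = {
            "valve_stuck_closed": ["line_pressure_low"],
            "line_pressure_low": ["pump_cavitation", "sensor_P1"],
            "pump_cavitation": ["sensor_V1"],
            "sensor_P1": [],
            "sensor_V1": [],
        }
        failure_modes = ["valve_stuck_closed", "pump_cavitation"]
        tests = {"sensor_P1", "sensor_V1"}

        def reachable(graph, start):
            """Return every node reachable from `start` along propagation paths."""
            seen, queue = {start}, deque([start])
            while queue:
                node = queue.popleft()
                for nxt in graph.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        # Detectability: which observation points can see each failure mode.
        for fm in failure_modes:
            print(fm, "->", sorted(reachable(edges, fm) & tests))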

  11. Raman spectral post-processing for oral tissue discrimination – a step for an automatized diagnostic system

    PubMed Central

    Carvalho, Luis Felipe C. S.; Nogueira, Marcelo Saito; Neto, Lázaro P. M.; Bhattacharjee, Tanmoy T.; Martin, Airton A.

    2017-01-01

    Most oral injuries are diagnosed by histopathological analysis of a biopsy, which is an invasive procedure and does not give immediate results. On the other hand, Raman spectroscopy is a real-time and minimally invasive analytical tool with potential for the diagnosis of diseases. That diagnostic potential can be improved by data post-processing. Hence, this study aims to evaluate the performance of preprocessing steps and multivariate analysis methods for the classification of normal tissue and pathological oral lesion spectra. A total of 80 spectra acquired from normal and abnormal tissues using optical fiber Raman-based spectroscopy (OFRS) were subjected, after area normalization or maximum-intensity normalization, to PCA preprocessing on the z-scored data set and then to the KNN (K-nearest neighbors), J48 (unpruned C4.5 decision tree), RBF (radial basis function), RF (random forest), and MLP (multilayer perceptron) classifiers in the WEKA software (Waikato Environment for Knowledge Analysis). Our results suggest the best classification was achieved by using maximum-intensity normalization followed by MLP. Based on these results, software for automated analysis can be generated and validated using larger data sets. This would aid quick comprehension of spectroscopic data and easy diagnosis by medical practitioners in clinical settings. PMID:29188115
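
    As a rough sketch of the best-performing pipeline reported here (maximum-intensity normalization followed by an MLP), the following uses scikit-learn on synthetic spectra as a stand-in for the study's WEKA workflow; the PCA dimensionality and network size are illustrative:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import FunctionTransformer, StandardScaler

        # Synthetic stand-ins: 80 spectra, 600 wavenumbers; 0 = normal, 1 = lesion.
        rng = np.random.default_rng(0)
        X = rng.random((80, 600))
        y = rng.integers(0, 2, 80)

        # Maximum-intensity normalization: scale each spectrum to a peak of 1.
        max_norm = FunctionTransformer(lambda s: s / s.max(axis=1, keepdims=True))

        pipe = make_pipeline(
            max_norm,              # per-spectrum normalization
            StandardScaler(),      # z-score each wavenumber across spectra
            PCA(n_components=10),  # dimensionality reduction before classifying
            MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
        )
        print(cross_val_score(pipe, X, y, cv=5).mean())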

  12. Raman spectral post-processing for oral tissue discrimination - a step for an automatized diagnostic system.

    PubMed

    Carvalho, Luis Felipe C S; Nogueira, Marcelo Saito; Neto, Lázaro P M; Bhattacharjee, Tanmoy T; Martin, Airton A

    2017-11-01

    Most oral injuries are diagnosed by histopathological analysis of a biopsy, which is an invasive procedure and does not give immediate results. On the other hand, Raman spectroscopy is a real-time and minimally invasive analytical tool with potential for the diagnosis of diseases. That diagnostic potential can be improved by data post-processing. Hence, this study aims to evaluate the performance of preprocessing steps and multivariate analysis methods for the classification of normal tissue and pathological oral lesion spectra. A total of 80 spectra acquired from normal and abnormal tissues using optical fiber Raman-based spectroscopy (OFRS) were subjected, after area normalization or maximum-intensity normalization, to PCA preprocessing on the z-scored data set and then to the KNN (K-nearest neighbors), J48 (unpruned C4.5 decision tree), RBF (radial basis function), RF (random forest), and MLP (multilayer perceptron) classifiers in the WEKA software (Waikato Environment for Knowledge Analysis). Our results suggest the best classification was achieved by using maximum-intensity normalization followed by MLP. Based on these results, software for automated analysis can be generated and validated using larger data sets. This would aid quick comprehension of spectroscopic data and easy diagnosis by medical practitioners in clinical settings.

  13. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  14. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    PubMed

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extensibility. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
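
    A minimal sketch of the learning-curve monitoring idea, using scikit-learn on synthetic data; it illustrates the concept only and does not show CDM's actual libraries or interfaces:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import learning_curve

        # Synthetic stand-in for data accumulating in an eCRF system.
        X, y = make_classification(n_samples=600, n_features=20, random_state=0)

        # Cross-validated performance as a function of patient inclusions.
        sizes, train_scores, valid_scores = learning_curve(
            LogisticRegression(max_iter=1000), X, y,
            train_sizes=np.linspace(0.1, 1.0, 5), cv=5, scoring="roc_auc",
        )
        for n, score in zip(sizes, valid_scores.mean(axis=1)):
            print(f"{n:4d} inclusions: AUC = {score:.3f}")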

  15. Integrating and Managing BIM in GIS, Software Review

    NASA Astrophysics Data System (ADS)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years, the number of GIS software packages that provide tools to integrate building information in a geographic context has risen sharply. More and more GIS products are adding tools for this purpose, and other software projects are regularly extending these tools. However, each package has its own strengths, weaknesses, and intended use. This paper provides a thorough review to investigate the software capabilities and clarify their purpose. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and the software under review was then used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, the supported import/export formats, and the way building information is imported.

  16. ReQON: a Bioconductor package for recalibrating quality scores from next-generation sequencing data

    PubMed Central

    2012-01-01

    Background: Next-generation sequencing technologies have become important tools for genome-wide studies. However, the quality scores that are assigned to each base have been shown to be inaccurate. If the quality scores are used in downstream analyses, these inaccuracies can have a significant impact on the results. Results: Here we present ReQON, a tool that recalibrates the base quality scores from an input BAM file of aligned sequencing data using logistic regression. ReQON also generates diagnostic plots showing the effectiveness of the recalibration. We show that ReQON produces quality scores that are both more accurate, in the sense that they more closely correspond to the probability of a sequencing error, and do a better job of discriminating between sequencing errors and non-errors than the original quality scores. We also compare ReQON to other available recalibration tools and show that ReQON is less biased and performs favorably in terms of quality score accuracy. Conclusion: ReQON is an open-source software package, written in R and available through Bioconductor, for recalibrating base quality scores for next-generation sequencing data. ReQON produces a new BAM file with more accurate quality scores, which can improve the results of downstream analysis, and produces several diagnostic plots showing the effectiveness of the recalibration. PMID:22946927
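
    A toy sketch of the underlying idea, recalibrating quality scores by regressing error status on covariates and converting the predicted error probability back to a Phred score; ReQON itself is an R/Bioconductor package operating on BAM files, and the data below are synthetic:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic training data: one row per sequenced base.
        rng = np.random.default_rng(0)
        n = 10_000
        reported_q = rng.integers(10, 41, n)    # reported Phred quality
        cycle = rng.integers(0, 100, n)         # position within the read
        p_true = np.clip(2.0 * 10 ** (-reported_q / 10), 0, 1)  # scores too optimistic
        is_error = rng.random(n) < p_true       # simulated sequencing errors

        # Regress error status on reported quality and cycle.
        features = np.column_stack([reported_q, cycle])
        model = LogisticRegression().fit(features, is_error)

        # Recalibrated Phred score: Q = -10 * log10(P(error)).
        p_err = model.predict_proba(features)[:, 1]
        recalibrated_q = -10 * np.log10(np.clip(p_err, 1e-10, 1))
        print(recalibrated_q[:5].round(1))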

  17. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  18. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
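
    The comparison statistics described above are straightforward to compute on co-registered rasters; a minimal NumPy sketch with synthetic grids standing in for the GIS workflow:

        import numpy as np

        def compare_dems(reference, candidate):
            """Difference statistics between a reference DEM and a generated DEM.

            Both inputs are 2-D elevation rasters on the same grid; NaN marks
            no-data cells, which are excluded from the statistics.
            """
            diff = candidate - reference
            valid = diff[~np.isnan(diff)]
            return {
                "min": valid.min(),
                "max": valid.max(),
                "mean": valid.mean(),
                "rmse": np.sqrt(np.mean(valid ** 2)),
            }

        # Toy rasters standing in for the LIDAR-derived test-site grids.
        rng = np.random.default_rng(1)
        ref = rng.random((100, 100)) * 50 + 500
        cand = ref + rng.normal(0, 0.3, ref.shape)
        print({k: round(v, 3) for k, v in compare_dems(ref, cand).items()})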

  19. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.

  20. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  1. WinHPC System Software | High-Performance Computing | NREL

    Science.gov Websites

    Learn about the software applications, tools, and toolchains available on the WinHPC system for industrial applications, including the Intel Compilers development tool and toolchain suite.

  2. Installing and Setting Up the Git Software Tool on OS X | High-Performance Computing | NREL

    Science.gov Websites

    Learn how to install and set up the Git software tool on OS X for use with the Peregrine system. The binary installer for OS X is the easiest option; the latest version of Git can be downloaded from http://git-scm.com.

  3. Computer-Aided Nodule Assessment and Risk Yield Risk Management of Adenocarcinoma: The Future of Imaging?

    PubMed

    Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias

    2016-01-01

    Increased clinical use of chest high-resolution computed tomography results in increased identification of lung adenocarcinomas and persistent subsolid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to noninvasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semiquantitative measures to decrease interrater and intrarater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still suboptimal, require validation and are not yet clinically applicable. The computer-aided nodule assessment and risk yield software application represents a validated tool for automated, quantitative, and noninvasive risk stratification of adenocarcinoma lung nodules. Computer-aided nodule assessment and risk yield correlates well with consensus histology and postsurgical patient outcomes, and therefore may help to guide individualized patient management, for example, in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. CANARY Risk Management of Adenocarcinoma: The Future of Imaging?

    PubMed Central

    Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A.; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias

    2016-01-01

    Increased clinical utilization of chest high resolution computed tomography results in increased identification of lung adenocarcinomas and persistent sub-solid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to non-invasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semi-quantitative measures to decrease inter- and intra-rater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still sub-optimal, require validation and are not yet clinically applicable. The Computer-Aided Nodule Assessment and Risk Yield (CANARY) software application represents a validated tool for automated, quantitative, non-invasive risk stratification of adenocarcinoma lung nodules. CANARY correlates well with consensus histology and post-surgical patient outcomes and therefore may help to guide individualized patient management, e.g., in identification of nodules amenable to radiological surveillance, or in need of adjunctive therapy. PMID:27568149

  5. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects, software development faces great pressure with respect to quality, cost, and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix), featuring UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  6. PC Software for Artificial Intelligence Applications.

    PubMed

    Epp, H; Kalin, M; Miller, D

    1988-05-06

    Our review has emphasized that AI tools are programming languages inspired by some problem-solving paradigm. We want to underscore their status as programming languages; even if an AI tool seems to fit a problem perfectly, its proficient use still requires the training and practice associated with any programming language. The programming manuals for PC-Plus, Smalltalk/V, and Nexpert Object are all tutorial in nature, and the corresponding software packages come with sample applications. We find the manuals to be uniformly good introductions that try to anticipate the problems of a user who is new to the technology. All three vendors offer free technical support by telephone to licensed users. AI tools are sometimes oversold as a way to make programming easy or to avoid it altogether. The truth is that AI tools demand programming, but programming that allows you to concentrate on the essentials of the problem. If we had to implement a diagnostic system, we would look first to a product such as PC-Plus rather than BASIC or C, because PC-Plus is designed specifically for such a problem, whereas these conventional languages are not. If we had to implement a system that required graphical interfaces and could benefit from inheritance, we would look first to an object-oriented system such as Smalltalk/V that provides built-in mechanisms for both. If we had to implement an expert system that called for some mix of AI and conventional techniques, we would look first to a product such as Nexpert Object that integrates various problem-solving technologies. Finally, we might use FORTRAN if we were concerned primarily with programming a well-defined numerical algorithm. AI tools are a valuable complement to traditional languages.

  7. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Human Vision-Motivated Algorithm Allows Consistent Retinal Vessel Classification Based on Local Color Contrast for Advancing General Diagnostic Exams.

    PubMed

    Ivanov, Iliya V; Leitritz, Martin A; Norrenberg, Lars A; Völker, Michael; Dynowski, Marek; Ueffing, Marius; Dietter, Johannes

    2016-02-01

    Abnormalities of blood vessel anatomy, morphology, and ratio can serve as important diagnostic markers for retinal diseases such as AMD or diabetic retinopathy. Large cohort studies demand automated and quantitative image analysis of vascular abnormalities. Therefore, we developed an analytical software tool to enable automated standardized classification of blood vessels supporting clinical reading. A dataset of 61 images was collected from a total of 33 women and 8 men with a median age of 38 years. The pupils were not dilated, and images were taken after dark adaption. In contrast to current methods in which classification is based on vessel profile intensity averages, and similar to human vision, local color contrast was chosen as a discriminator to allow artery-vein discrimination and arterial-venous ratio (AVR) calculation without vessel tracking. With 83% ± 1% (standard error of the mean) on our dataset, the best classification was achieved using weighted lightness information from a combination of the red, green, and blue channels. Tested on an independent dataset, our method reached 89% correct classification, which, when benchmarked against conventional ophthalmologic classification, shows significantly improved classification scores. Our study demonstrates that vessel classification based on local color contrast can cope with inter- or intraimage lightness variability and allows consistent AVR calculation. We offer an open-source implementation of this method upon request, which can be integrated into existing tool sets and applied to general diagnostic exams.
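
    A schematic sketch of classification by local color contrast; the channel weights and decision threshold below are placeholders for illustration, not the values selected in the study:

        import numpy as np

        def weighted_lightness(rgb, weights=(0.5, 0.4, 0.1)):
            """Combine R, G, B into one lightness channel (illustrative weights)."""
            return (rgb * np.asarray(weights, dtype=float)).sum(axis=-1)

        def local_contrast(image_rgb, vessel_mask, background_mask):
            """Contrast of a vessel segment against its local background."""
            light = weighted_lightness(image_rgb)
            return light[vessel_mask].mean() - light[background_mask].mean()

        def classify(contrast, threshold=-0.05):
            """Arteries reflect more light than veins, so a higher local
            contrast relative to the background suggests an artery."""
            return "artery" if contrast > threshold else "vein"

        # Toy fundus patch with a hypothetical vessel mask.
        rng = np.random.default_rng(0)
        patch = rng.random((32, 32, 3))
        vessel = np.zeros((32, 32), dtype=bool)
        vessel[14:18, :] = True
        print(classify(local_contrast(patch, vessel, ~vessel)))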

  9. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  10. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently, software engineers are the essential integrating factor tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools, rather than the tools empowering the engineers. Among the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address these issues is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  11. Transatlantic Comparison of CT Radiation Doses in the Era of Radiation Dose-Tracking Software.

    PubMed

    Parakh, Anushri; Euler, Andre; Szucs-Farkas, Zsolt; Schindera, Sebastian T

    2017-12-01

    The purpose of this study is to compare diagnostic reference levels from a local European CT dose registry, using radiation-tracking software from a large patient sample, with preexisting European and North American diagnostic reference levels. Data (n = 43,761 CT scans obtained over the course of 2 years) for the European local CT dose registry were obtained from eight CT scanners at six institutions. Means, medians, and interquartile ranges of volumetric CT dose index (CTDIvol), dose-length product (DLP), size-specific dose estimate, and effective dose values for CT examinations of the head, paranasal sinuses, thorax, pulmonary angiogram, abdomen-pelvis, renal-colic, thorax-abdomen-pelvis, and thoracoabdominal angiogram were obtained using radiation-tracking software. Metrics from this registry were compared with diagnostic reference levels from Canada and California (published in 2015), the American College of Radiology (ACR) dose index registry (2015), and national diagnostic reference levels from local CT dose registries in Switzerland (2010), the United Kingdom (2011), and Portugal (2015). Our local registry had a lower 75th percentile CTDIvol for all protocols than did the individual internationally sourced data. Compared with our study, the ACR dose index registry had higher 75th percentile CTDIvol values by 55% for head, 240% for thorax, 28% for abdomen-pelvis, 42% for thorax-abdomen-pelvis, 128% for pulmonary angiogram, 138% for renal-colic, and 58% for paranasal sinus studies. Our local registry had lower diagnostic reference level values than did existing European and North American diagnostic reference levels. Automated radiation-tracking software could be used to establish and update existing diagnostic reference levels because it is capable of analyzing large datasets meaningfully.

  12. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers now can generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs, and web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the ACME overall user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.
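
    The individual diagnostics are typically simple field statistics; below is a minimal model-versus-observation sketch in plain NumPy, illustrating the kind of metric such a framework computes rather than UVCMetrics' actual UV-CDAT-based implementation:

        import numpy as np

        def diagnostic_metrics(model_field, obs_field):
            """Basic model-versus-observation diagnostics for a 2-D field."""
            diff = model_field - obs_field
            return {
                "bias": diff.mean(),
                "rmse": np.sqrt((diff ** 2).mean()),
                "corr": np.corrcoef(model_field.ravel(), obs_field.ravel())[0, 1],
            }

        # Synthetic stand-ins for a gridded field (e.g. surface temperature).
        rng = np.random.default_rng(0)
        obs = rng.random((90, 180))
        model = obs + rng.normal(0, 0.1, obs.shape)
        print({k: round(v, 3) for k, v in diagnostic_metrics(model, obs).items()})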

  13. ddpcr: an R package and web application for analysis of droplet digital PCR data.

    PubMed

    Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer

    2016-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA which holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr, a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data using an interactive graphical interface.
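
    The "digital" quantification rests on a standard Poisson correction of the positive-droplet fraction; here is a minimal sketch of that calculation (the droplet volume is an illustrative value, and this is not the ddpcr package's API):

        import math

        def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
            """Estimate target concentration (copies/uL) from droplet counts.

            Uses the Poisson correction: the mean number of copies per
            droplet is lambda = -ln(1 - p), where p is the fraction of
            positive droplets.
            """
            p = positive / total
            lam = -math.log(1.0 - p)                 # copies per droplet
            return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

        # Example: 4,000 positive droplets out of 15,000 accepted droplets.
        print(round(ddpcr_concentration(4000, 15000), 1))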

  14. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  15. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  16. MFV-class: a multi-faceted visualization tool of object classes.

    PubMed

    Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting

    2004-11-01

    Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are some classes that have complicated structure and relationships. Thus, in the processes of software maintenance, testing, reengineering, reuse, and restructuring, it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory, and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of a class to uncover manifold facets of class contents. It enables visualizing three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.

  17. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    NASA Technical Reports Server (NTRS)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  18. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser.

  19. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  20. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use. PMID:27803802

  1. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use.

  2. Evaluation of the diagnostic accuracy of CareStart G6PD deficiency Rapid Diagnostic Test (RDT) in a malaria endemic area in Ghana, Africa.

    PubMed

    Adu-Gyasi, Dennis; Asante, Kwaku Poku; Newton, Sam; Dosoo, David; Amoako, Sabastina; Adjei, George; Amoako, Nicholas; Ankrah, Love; Tchum, Samuel Kofi; Mahama, Emmanuel; Agyemang, Veronica; Kayan, Kingsley; Owusu-Agyei, Seth

    2015-01-01

    Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most widespread enzyme defect that can result in red cell breakdown under oxidative stress when exposed to certain medicines including antimalarials. We evaluated the diagnostic accuracy of CareStart G6PD deficiency Rapid Diagnostic Test (RDT) as a point-of-care tool for screening G6PD deficiency. A cross-sectional study was conducted among 206 randomly selected and consented participants from a group with known G6PD deficiency status between February 2013 and June 2013. A maximum of 1.6 ml of capillary blood was used for G6PD deficiency screening using the CareStart G6PD RDT and the Trinity qualitative method, with the Trinity quantitative method as the "gold standard". Samples were also screened for the presence of malaria parasites. Data entry and analysis were done using Microsoft Access 2010 and Stata Software version 12. Kintampo Health Research Centre Institutional Ethics Committee granted ethical approval. The sensitivity (SE) and specificity (SP) of the CareStart G6PD deficiency RDT were 100% and 72.1%, respectively, compared to the Trinity quantitative method, and 98.9% and 96.2% compared to the Trinity qualitative method. Malaria infection status had no significant effect (P=0.199) on the performance of the G6PD RDT test kit compared to the "gold standard". The outcome of this study suggests that the diagnostic performance of the CareStart G6PD deficiency RDT kit was high and acceptable for determining G6PD deficiency status in a high malaria endemic area in Ghana. The RDT kit presents as an attractive point-of-care tool for rapid G6PD deficiency testing in areas with high temperatures and less expertise. The CareStart G6PD deficiency RDT kit could be used to screen malaria patients before administration of the fixed dose primaquine with artemisinin-based combination therapy.
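
    Sensitivity and specificity follow directly from the confusion matrix against the gold standard; a minimal sketch with illustrative counts (the abstract does not report the full 2x2 breakdown of the 206 participants):

        def diagnostic_accuracy(tp, fp, fn, tn):
            """Sensitivity and specificity of a screening test vs. a gold standard."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
            }

        # Hypothetical counts: 50 truly deficient, 156 truly normal, 10 false positives.
        print(diagnostic_accuracy(tp=50, fp=10, fn=0, tn=146))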

  3. Overview of aerothermodynamic loads definition study

    NASA Technical Reports Server (NTRS)

    Gaugler, Raymond E.

    1989-01-01

    Over the years, NASA has been conducting the Advanced Earth-to-Orbit (AETO) Propulsion Technology Program to provide the knowledge, understanding, and design methodology that will allow the development of advanced Earth-to-orbit propulsion systems with high performance, extended service life, automated operations, and diagnostics for in-flight health monitoring. The objective of the Aerothermodynamic Loads Definition Study is to develop methods to more accurately predict the operating environment in AETO propulsion systems, such as the Space Shuttle Main Engine (SSME) powerhead. The approach taken consists of two parts: to modify, apply, and disseminate existing computational fluid dynamics tools in response to current needs, and to develop new technology that will enable more accurate computation of the time-averaged and unsteady aerothermodynamic loads in the SSME powerhead. The software tools are detailed. Significant progress was made in the area of turbomachinery, where there is an overlap between the AETO efforts and research in the aeronautical gas turbine field.

  4. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  5. Software Management Environment (SME): Components and algorithms

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1994-01-01

    This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented experienced-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'

  6. Diagnostic Tools for Acute Anterior Cruciate Ligament Injury: GNRB, Lachman Test, and Telos.

    PubMed

    Ryu, Seung Min; Na, Ho Dong; Shon, Oog Jin

    2018-06-01

    The purpose of this study is to compare the accuracy of the GNRB arthrometer (Genourob), the Lachman test, and the Telos device (Telos GmbH) in acute anterior cruciate ligament (ACL) injuries and to evaluate the accuracy of each diagnostic tool according to the length of time from injury to examination. From September 2015 to September 2016, 40 cases of complete ACL rupture were reviewed. We divided the time from injury to examination into three periods of 10 days each and analyzed the diagnostic tools according to the time frame. An analysis of the area under the curve (AUC) of a receiver operating characteristic curve showed that all diagnostic tools were fairly informative. The GNRB showed a higher AUC than the other diagnostic tools. In 10 cases assessed within 10 days after injury, the GNRB showed a statistically significant side-to-side difference in laxity (p<0.001), whereas the Telos test and Lachman test did not show significantly different laxity (p=0.541 and p=0.413, respectively). All diagnostic values of the GNRB were better than those of the other diagnostic tools in acute ACL injuries. The GNRB was more effective in acute ACL injuries examined within 10 days of injury. The GNRB arthrometer can be a useful diagnostic tool for acute ACL injuries.

  7. Data management software concept for WEST plasma measurement system

    NASA Astrophysics Data System (ADS)

    Zienkiewicz, P.; Kasprowicz, G.; Byszuk, A.; Wojeński, A.; Kolasinski, P.; Cieszewski, R.; Czarski, T.; Chernyshova, M.; Pozniak, K.; Zabolotny, W.; Juszczyk, B.; Mazon, D.; Malard, P.

    2014-11-01

    This paper describes the concept of data management software for the multichannel readout system of the GEM detector used in the WEST plasma experiment. The proposed system consists of three separate communication channels: a fast data channel, a diagnostics channel, and a slow data channel. The fast data channel is provided by an FPGA with integrated ARM cores, which delivers direct readout data from the analog front ends over 10GbE at short, guaranteed intervals. The slow data channel is provided by multiple fast CPUs running the GNU/Linux OS and appropriate software, which deliver detailed readout data after processing. The diagnostics channel provides detailed feedback for control purposes.

  8. Applying Model-Based Reasoning to the FDIR of the Command and Data Handling Subsystem of the International Space Station

    NASA Technical Reports Server (NTRS)

    Robinson, Peter; Shirley, Mark; Fletcher, Daryl; Alena, Rick; Duncavage, Dan; Lee, Charles

    2003-01-01

    All of the International Space Station (ISS) systems which require computer control depend upon the hardware and software of the Command and Data Handling (C&DH) system, currently a network of over 30 386-class computers called Multiplexer/Demultiplexers (MDMs) [18]. The Caution and Warning System (C&W) [7], a set of software tasks that runs on the MDMs, is responsible for detecting, classifying, and reporting errors in all ISS subsystems including the C&DH. Fault Detection, Isolation and Recovery (FDIR) of these errors is typically handled with a combination of automatic and human effort. We are developing an Advanced Diagnostic System (ADS) to augment the C&W system with decision support tools to aid in root cause analysis as well as resolve differing human and machine C&DH state estimates. These tools, which draw from work in model-based reasoning [16,29], will improve the speed and accuracy of flight controllers by reducing the uncertainty in C&DH state estimation, allowing for a more complete assessment of risk. We have run tests with ISS telemetry and focus on those C&W events which relate to the C&DH system itself. This paper describes our initial results and subsequent plans.

  9. DNAseq Workflow in a Diagnostic Context and an Example of a User Friendly Implementation.

    PubMed

    Wolf, Beat; Kuonen, Pierre; Dandekar, Thomas; Atlan, David

    2015-01-01

    Over recent years, next-generation sequencing (NGS) technologies have evolved from costly tools used by very few to a much more accessible and economically viable technology. Through this recently gained popularity, their use cases have expanded from research environments into clinical settings. But the technical know-how and infrastructure required to analyze the data remain an obstacle for a wider adoption of this technology, especially in smaller laboratories. We present GensearchNGS, a commercial DNAseq software suite distributed by Phenosystems SA. The focus of GensearchNGS is the optimal usage of already existing infrastructure, while keeping its use simple. This is achieved through the integration of existing tools in a comprehensive software environment, as well as custom algorithms developed with the restrictions of limited infrastructures in mind. This includes the possibility to connect multiple computers to speed up computing-intensive parts of the analysis, such as sequence alignments. We present a typical DNAseq workflow for NGS data analysis and the approach GensearchNGS takes to implement it. The presented workflow goes from raw data quality control to the final variant report. This includes features such as gene panels and the integration of online databases, like Ensembl for annotations or Cafe Variome for variant sharing.

  10. Spacewire on Earth orbiting scatterometers

    NASA Technical Reports Server (NTRS)

    Bachmann, Alex; Lang, Minh; Lux, James; Steffke, Richard

    2002-01-01

    The need for a high-speed, reliable, and easy-to-implement communication link has led to the development of a space flight oriented version of IEEE 1355 called SpaceWire. SpaceWire is based on high-speed (200 Mbps) serial point-to-point links using Low Voltage Differential Signaling (LVDS). SpaceWire has provisions for routing messages between a large network of processors, using wormhole routing for low overhead and latency. (Additionally, space-qualified hybrids are available which provide the link layer to the user's bus.) A test bed of multiple digital signal processor breadboards, demonstrating the ability to meet signal processing requirements for an orbiting scatterometer, has been implemented using three Astrium MCM-DSPs; each breadboard consists of a Multi Chip Module (MCM) that combines a space-qualified Digital Signal Processor and peripherals, including IEEE-1355 links. With the addition of appropriate physical layer interfaces and software on the DSP, the SpaceWire link is used to communicate between processors on the test bed, e.g. sending timing references, commands, status, and science data among the processors. Results are presented on development issues surrounding the use of SpaceWire in this environment, from physical layer implementation (cables, connectors, LVDS drivers) to diagnostic tools, driver firmware, and development methodology. The tools, methods, and hardware and software challenges, along with preliminary performance, are investigated and discussed.

  11. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...the result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  12. Reviews of Instructional Software in Scholarly Journals: A Selected Bibliography.

    ERIC Educational Resources Information Center

    Bantz, David A.; And Others

    This bibliography lists reviews of more than 100 instructional software packages, which are arranged alphabetically by discipline. Information provided for each entry includes the topical emphasis, type of software (i.e., simulation, tutorial, analysis tool, test generator, database, writing tool, drill, plotting tool, videodisc), the journal…

  13. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  14. Assistive Software Tools for Secondary-Level Students with Literacy Difficulties

    ERIC Educational Resources Information Center

    Lange, Alissa A.; McPhillips, Martin; Mulhern, Gerry; Wylie, Judith

    2006-01-01

    The present study assessed the compensatory effectiveness of four assistive software tools (speech synthesis, spellchecker, homophone tool, and dictionary) on literacy. Secondary-level students (N = 93) with reading difficulties completed computer-based tests of literacy skills. Training on their respective software followed for those assigned to…

  15. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone application).

  16. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification, and (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  17. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
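
    The correction described above reduces to an element-wise product of the predicted image with per-pixel factors interpolated from a look-up table. The following sketch illustrates the idea in Python with NumPy/SciPy; the function name, the look-up table, and the shielded-fraction map are hypothetical stand-ins, not the authors' C# implementation.

        import numpy as np
        from scipy.interpolate import interp1d

        def correct_pdip(pdip, shielded_fraction, lut_fraction, lut_factor):
            """Apply MLC-transmission correction factors to a predicted EPID image.

            pdip              -- 2D array, predicted portal dose image from the TPS
            shielded_fraction -- 2D array, fraction of beam-on time each pixel is
                                 shielded by MLC leaves (derived from the MLC file)
            lut_fraction, lut_factor -- 1D arrays defining the correction look-up table
            """
            # Interpolate a correction factor for every pixel from the look-up table
            lut = interp1d(lut_fraction, lut_factor, bounds_error=False,
                           fill_value=(lut_factor[0], lut_factor[-1]))
            correction = lut(shielded_fraction)
            # The corrected image is the element-wise product of prediction and factors
            return pdip * correction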

  18. Lessons Learned in the Livingstone 2 on Earth Observing One Flight Experiment

    NASA Technical Reports Server (NTRS)

    Hayden, Sandra C.; Sweet, Adam J.; Shulman, Seth

    2005-01-01

The Livingstone 2 (L2) model-based diagnosis software is a reusable diagnostic tool for monitoring complex systems. In 2004, L2 was integrated with the JPL Autonomous Sciencecraft Experiment (ASE) and deployed on-board Goddard's Earth Observing One (EO-1) remote sensing satellite, to monitor and diagnose the EO-1 space science instruments and imaging sequence. This paper reports on lessons learned from this flight experiment. The goals for this experiment, including validation of minimum success criteria and of a series of diagnostic scenarios, have all been successfully met. Long-term operations in space are on-going, as a test of the maturity of the system, with L2 performance remaining flawless. L2 has demonstrated the ability to track the state of the system during nominal operations, detect simulated abnormalities in operations and isolate failures to their root cause fault. Specific advances demonstrated include diagnosis of ambiguity groups rather than a single fault candidate; hypothesis revision given new sensor evidence about the state of the system; and the capability to check for faults in a dynamic system without having to wait until the system is quiescent. The major benefits of this advanced health management technology are to increase mission duration and reliability through intelligent fault protection, and robust autonomous operations with reduced dependency on supervisory operations from Earth. The workload for operators will be reduced by telemetry of processed state-of-health information rather than raw data. The long-term vision is that of making diagnosis available to the onboard planner or executive, allowing autonomy software to re-plan in order to work around known component failures. For a system that is expected to evolve substantially over its lifetime, as for the International Space Station, the model-based approach has definite advantages over rule-based expert systems and limit-checking fault protection systems, as these do not scale well. The model-based approach facilitates reuse of the L2 diagnostic software; only the model of the system to be diagnosed and the telemetry monitoring software have to be rebuilt for a new system or expanded for a growing system. The hierarchical L2 model supports modularity and extensibility, and as such is a suitable solution for integrated system health management as envisioned for systems-of-systems.

  19. The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).

    PubMed

    Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai

    2010-08-01

In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval between repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  20. Dataflow Design Tool: User's Manual

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1996-01-01

    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
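
    To make the dataflow analyses concrete: for a dataflow graph annotated with task execution times, the critical-path (longest-path) time bounds the iteration latency, and the total work divided by the repetition period bounds the processor count. The sketch below computes these two classic bounds for a small DAG; the graph, the task times, and the function names are illustrative assumptions, not the tool's actual algorithms.

        from functools import lru_cache

        # Hypothetical dataflow graph: task -> (execution time, list of successors)
        GRAPH = {
            "read":    (2, ["filter"]),
            "filter":  (5, ["control"]),
            "gain":    (3, ["control"]),
            "control": (4, []),
        }

        def critical_path(graph):
            """Longest execution-time path through the DAG (a latency lower bound)."""
            @lru_cache(maxsize=None)
            def longest_from(task):
                time, succs = graph[task]
                return time + max((longest_from(s) for s in succs), default=0)
            return max(longest_from(t) for t in graph)

        def min_processors(graph, period):
            """Lower bound on identical processors needed for a repetition period."""
            total_work = sum(time for time, _ in graph.values())
            return -(-total_work // period)  # ceiling division

        print(critical_path(GRAPH))      # 11 (read -> filter -> control)
        print(min_processors(GRAPH, 7))  # 2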

  1. Monitoring machining conditions by infrared images

    NASA Astrophysics Data System (ADS)

    Borelli, Joao E.; Gonzaga Trabasso, Luis; Gonzaga, Adilson; Coelho, Reginaldo T.

    2001-03-01

During the machining process, knowledge of the temperature is the most important factor in tool analysis. It allows control of the main factors that influence tool use, lifetime, and waste. The temperature in the contact area between the piece and the tool results from the material removal in the cutting operation, and it is difficult to obtain because the tool and the work piece are in motion. One way to measure the temperature in this situation is to detect the infrared radiation. This work presents a new methodology for diagnosis and monitoring of machining processes with the use of infrared images. The infrared image provides a map in gray tones of the elements in the process: tool, work piece and chips. Each gray tone in the image corresponds to a certain temperature for each one of those materials, and the relationship between the gray tones and the temperature is obtained by prior calibration of the infrared camera. The system developed in this work uses an infrared camera, a frame grabber board and software composed of three modules. The first module performs image acquisition and processing. The second module extracts image features and builds the feature vector. Finally, the third module uses fuzzy logic to evaluate the feature vector and supplies the tool state diagnostic as output.
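
    A minimal sketch of the gray-tone-to-temperature step, assuming a per-material calibration table relating camera gray levels to temperatures; the table values and function names are hypothetical, not the paper's calibration data.

        import numpy as np

        # Hypothetical calibration for one material: gray level -> temperature (deg C),
        # obtained from a prior infrared-camera calibration run
        GRAY_LEVELS  = np.array([  0,  64, 128, 192, 255])
        TEMPERATURES = np.array([ 20, 150, 310, 480, 650])

        def gray_to_temperature(image):
            """Map an 8-bit infrared image to a temperature map by interpolation."""
            return np.interp(image.astype(float), GRAY_LEVELS, TEMPERATURES)

        tool_region = np.array([[130, 200], [180, 255]], dtype=np.uint8)
        print(gray_to_temperature(tool_region).max())  # 650.0 at the hottest pixel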

  2. Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics

    DOE PAGES

    Scott, S. D.; Mumgaard, R. T.

    2016-07-20

A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (mse) diagnostics. The software supports multi-spectral line-polarization mse diagnostics which simultaneously measure emission at the mse σ and π lines as well as at two "background" wavelengths that are displaced from the mse spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the mse photo-elastic modulators (pem's) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to pem retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the mse diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
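
    The multi-spectral background estimate amounts to extrapolating the background measured at the two displaced wavelengths onto the σ and π wavelengths, sample by sample. A deliberately simplified linear sketch in NumPy (the wavelengths, arrays, and function name are placeholders, not the suite's actual algorithm):

        import numpy as np

        def interpolate_background(lam_sigma, lam_pi, lam_bg1, lam_bg2, bg1, bg2):
            """Linearly interpolate background light measured at two displaced
            wavelengths onto the sigma and pi wavelengths, per time sample."""
            slope = (bg2 - bg1) / (lam_bg2 - lam_bg1)
            bg_sigma = bg1 + slope * (lam_sigma - lam_bg1)
            bg_pi    = bg1 + slope * (lam_pi    - lam_bg1)
            return bg_sigma, bg_pi

        t = np.linspace(0.0, 1.0, 5)
        bg1 = 1.0 + 0.5 * t     # background channel 1, rising in time
        bg2 = 1.2 + 0.5 * t     # background channel 2
        sig_bg, pi_bg = interpolate_background(659.0, 660.0, 657.0, 662.0, bg1, bg2)
        print(sig_bg)           # background estimate at the sigma wavelength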

  3. Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S. D.; Mumgaard, R. T.

A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (mse) diagnostics. The software supports multi-spectral line-polarization mse diagnostics which simultaneously measure emission at the mse σ and π lines as well as at two "background" wavelengths that are displaced from the mse spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the mse photo-elastic modulators (pem's) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to pem retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the mse diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.

  4. GRAIL-genQuest: A comprehensive computational system for DNA sequence analysis. Final report, DOE SBIR Phase II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manning, Ruth Ann

Recent advances in DNA sequencing and genome mapping technologies are making it possible, for the first time in history, to find genes in plants and animals and to elucidate their function. This means that diagnostics and therapeutics can be developed for human diseases such as cancer, obesity, hypertension, and cardiovascular problems. Crop and animal strains can be developed that are hardier, resistant to diseases, and produce higher yields. The challenge is to develop tools that will find the nucleotides in the DNA of a living organism that comprise a particular gene. In the human genome alone, it is estimated that only a small fraction of the approximately 3 billion pairs of nucleotides code for some 100,000 human genes. Efficient tools to locate the protein-coding nucleotides within a genome and identify their function can therefore be of significant value. Software tools such as ApoCom GRAIL™ have assisted in this search. GRAIL can be used to analyze genome information, to identify exons (coding regions) and to construct gene models. Using a neural network approach, this software can "learn" sequence patterns and refine its ability to recognize a pattern as it is exposed to more and more examples of it. Since 1992, versions of GRAIL™ have been publicly available over the Internet from Oak Ridge National Laboratory. Because of the potential for security and patent compromise, these Internet versions are not available to many researchers in pharmaceutical and biotechnology companies who cannot send proprietary sequences past their data-secure firewalls. ApoCom is making available commercial versions of the GRAIL™ software to run self-contained over local area networks. As part of the commercialization effort, ApoCom has developed a new Java™-based graphical user interface, the ApoCom Client Tool for Genomics (ACTG)™. Two products, ApoCom GRAIL™ Network Edition and ApoCom GRAIL™ Personal Edition, have been developed to reach two diverse niche markets in the Phase III commercialization of this software. As a result of this project, ApoCom GRAIL™ can now be made available to the desktop (UNIX®, Windows® 95 and Windows NT®, or Mac™ OS) of any researcher who needs it.

  5. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    PubMed

    Kwasa, Judith; Cettomai, Deanna; Lwanya, Edwin; Osiemo, Dennis; Oyaro, Patrick; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire L

    2012-01-01

To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) that would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD likely remains under-diagnosed because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool that is practical to implement in resource-limited settings is urgently needed. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (κ = 0.03-0.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
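
    For readers unfamiliar with the metrics, sensitivity, specificity, and Cohen's kappa all follow from simple 2x2 counts. A generic sketch (the counts below are invented for illustration, not the study's data):

        def sensitivity_specificity(tp, fn, tn, fp):
            """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
            return tp / (tp + fn), tn / (tn + fp)

        def cohens_kappa(a, b, c, d):
            """Kappa for a 2x2 agreement table [[a, b], [c, d]] between two raters."""
            n = a + b + c + d
            po = (a + d) / n                                     # observed agreement
            pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
            return (po - pe) / (1 - pe)

        # Illustrative numbers only
        sens, spec = sensitivity_specificity(tp=5, fn=3, tn=18, fp=9)
        print(round(sens, 2), round(spec, 2))    # 0.62 0.67
        print(round(cohens_kappa(10, 5, 4, 11), 2))  # 0.4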

  6. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as open-source, user-friendly software.
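
    In standard maximum-entropy notation, the weight α balances the data misfit χ² against the entropy S of the spectrum relative to a default model. A compact statement of the usual objective, in its generic textbook form (not necessarily the exact functional minimized by this particular implementation):

        Q[A] = \tfrac{1}{2}\,\chi^{2}[A] - \alpha\, S[A],
        \qquad
        S[A] = \int \mathrm{d}\omega\,
        \Big[\, A(\omega) - m(\omega) - A(\omega)\,\ln\frac{A(\omega)}{m(\omega)} \,\Big],

    where A(ω) is the spectral function, m(ω) the default model, and the spectrum returned is the minimizer of Q at the chosen α.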

  7. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as open-source, user-friendly software.

  8. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans.

    PubMed

    Ritchie, Alexander J; Sanghera, Calvin; Jacobs, Colin; Zhang, Wei; Mayo, John; Schmidt, Heidi; Gingras, Michel; Pasian, Sergio; Stewart, Lori; Tsai, Scott; Manos, Daria; Seely, Jean M; Burrowes, Paul; Bhatia, Rick; Atkar-Khattra, Sukhinder; van Ginneken, Bram; Tammemagi, Martin; Tsao, Ming Sound; Lam, Stephen

    2016-05-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in which a technician assisted by computer vision (CV) software acts as a first reader with the aim to improve speed, consistency, and quality of scan interpretation. Without knowledge of the diagnosis, a technician reviewed 828 randomly batched scans (136 with lung cancers, 556 with benign nodules, and 136 without nodules) from the baseline Pan-Canadian Early Detection of Lung Cancer Study that had been annotated by the CV software CIRRUS Lung Screening (Diagnostic Image Analysis Group, Nijmegen, The Netherlands). The scans were classified as either normal (no nodules ≥1 mm or benign nodules) or abnormal (nodules or other abnormality). The results were compared with the diagnostic interpretation by Pan-Canadian Early Detection of Lung Cancer Study radiologists. The overall sensitivity and specificity of the technician in identifying an abnormal scan were 97.8% (95% confidence interval: 96.4-98.8) and 98.0% (95% confidence interval: 89.5-99.7), respectively. Of the 112 prevalent nodules that were found to be malignant in follow-up, 92.9% were correctly identified by the technician plus CV compared with 84.8% by the study radiologists. The average time taken by the technician to review a scan after CV processing was 208 ± 120 seconds. Prescreening CV software and a technician as first reader is a promising strategy for improving the consistency and quality of screening interpretation of LDCT scans. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  9. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  10. Estimating population diversity with CatchAll

    PubMed Central

    Bunge, John; Woodard, Linda; Böhning, Dankmar; Foster, James A.; Connolly, Sean; Allen, Heather K.

    2012-01-01

    Motivation: The massive data produced by next-generation sequencing require advanced statistical tools. We address estimating the total diversity or species richness in a population. To date, only relatively simple methods have been implemented in available software. There is a need for software employing modern, computationally intensive statistical analyses including error, goodness-of-fit and robustness assessments. Results: We present CatchAll, a fast, easy-to-use, platform-independent program that computes maximum likelihood estimates for finite-mixture models, weighted linear regression-based analyses and coverage-based non-parametric methods, along with outlier diagnostics. Given sample ‘frequency count’ data, CatchAll computes 12 different diversity estimates and applies a model-selection algorithm. CatchAll also derives discounted diversity estimates to adjust for possibly uncertain low-frequency counts. It is accompanied by an Excel-based graphics program. Availability: Free executable downloads for Linux, Windows and Mac OS, with manual and source code, at www.northeastern.edu/catchall. Contact: jab18@cornell.edu PMID:22333246
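
    As a concrete illustration of what "frequency count" data look like and how a simple richness estimator uses them, here is the classic non-parametric Chao1 estimate, one of many approaches of the kind CatchAll implements (this sketch is not CatchAll's code):

        def chao1(frequency_counts):
            """Chao1 lower-bound richness estimate from frequency-count data.

            frequency_counts maps k -> number of species observed exactly k times.
            """
            s_obs = sum(frequency_counts.values())   # observed species richness
            f1 = frequency_counts.get(1, 0)          # singletons
            f2 = frequency_counts.get(2, 0)          # doubletons
            if f2 == 0:
                return s_obs + f1 * (f1 - 1) / 2.0   # bias-corrected variant
            return s_obs + f1 * f1 / (2.0 * f2)

        # Hypothetical sample: 50 singletons, 20 doubletons, 30 more-abundant species
        print(chao1({1: 50, 2: 20, 3: 12, 5: 10, 10: 8}))  # 100 + 62.5 = 162.5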

  11. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  12. Technology and the future of medical equipment maintenance.

    PubMed

    Wear, J O

    1999-05-01

Maintenance of medical equipment has been changing rapidly in the past few years. It is changing more rapidly in developed countries, but changes are also occurring in developing countries. Some of the changes may permit improved maintenance on the higher technology equipment in developing countries, since they do not require onsite expertise. Technology has had an increasing impact on the development of medical equipment with the increased use of microprocessors and computers. With miniaturization from space technology and electronic chip design, powerful microprocessors and computers have been built into medical equipment. The improvement in manufacturing technology has increased the quality of parts and therefore of the medical equipment. This has resulted in increased mean time between failures and reduced maintenance needs. This has made equipment more reliable in remote areas and developing countries. The built-in computers and advances in software design have brought about self-diagnostics in medical equipment. Technicians now have a strong tool to be used in maintenance. One problem in this area is getting access to the self-diagnostics; some manufacturers will not readily provide this access to the owner of the equipment. Advances in telecommunications, in conjunction with self-diagnostics, make remote diagnosis and repair available. Since components can no longer be repaired, a remote repair technician can instruct an operator or an on-site repairman on board replacement. In case of software problems, the remote repair technician may perform the repairs over the telephone. It is possible for the equipment to be monitored remotely by modem without interfering with the operation of the equipment. These changes in technology require changes in the training of biomedical engineering technicians (BMETs). They must have training in computers and telecommunications. Some of this training can be done with telecommunications and computers.

  13. Ophthalmologic diagnostic tool using MR images for biomechanically-based muscle volume deformation

    NASA Astrophysics Data System (ADS)

    Buchberger, Michael; Kaltofen, Thomas

    2003-05-01

We would like to give a work-in-progress report on our ophthalmologic diagnostic software system, which performs biomechanically-based muscle volume deformations using MR images. For reconstructing a three-dimensional representation of an extraocular eye muscle, a sufficient number of high-resolution MR images is used, each representing a slice of the muscle. In addition, threshold values are given, which restrict the amount of data used from the MR images. The Marching Cubes algorithm is then applied, resulting in a 3D polygonal representation of the muscle, which can be rendered efficiently. A transformation to a dynamic, deformable model is applied by calculating the center of gravity of each muscle slice, approximating the muscle path and subsequently adding Hermite splines through the centers of gravity of all slices. Then, a radius function is defined for each slice, completing the transformation of the static 3D polygon model. Finally, this paper describes future extensions to our system. One of these extensions is the support for additional calculations and measurements within the reconstructed 3D muscle representation. Globe translation, localization of muscle pulleys by analyzing the 3D reconstruction in two different gaze positions, and other diagnostic measurements will be available.
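
    The muscle-path construction described above (per-slice centers of gravity joined by a smooth spline) can be sketched as follows; SciPy's CubicSpline stands in for the Hermite splines of the paper, and the slice masks are hypothetical:

        import numpy as np
        from scipy.interpolate import CubicSpline

        def slice_centroid(mask):
            """Center of gravity (row, col) of a binary segmentation mask."""
            rows, cols = np.nonzero(mask)
            return rows.mean(), cols.mean()

        # Hypothetical per-slice segmentation masks along the muscle (z axis)
        masks = [np.zeros((64, 64), dtype=bool) for _ in range(5)]
        for z, m in enumerate(masks):
            m[30 + z : 34 + z, 20:40] = True   # cross-section drifts with z

        z = np.arange(len(masks))
        centroids = np.array([slice_centroid(m) for m in masks])
        path = CubicSpline(z, centroids)       # smooth muscle path (row, col vs z)
        print(path(2.5))                       # interpolated centroid between slices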

  14. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford

Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
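
    A fault signature as described, structured expert knowledge for detecting and then verifying a specific fault, might be represented as a simple record type. The fields below are illustrative guesses at the kind of content involved, not the actual AFS Database schema:

        from dataclasses import dataclass, field

        @dataclass
        class FaultSignature:
            """Structured representation of expert fault-detection knowledge
            (illustrative fields only; not the FW-PHM AFS schema)."""
            asset_type: str          # e.g. "generator step-up transformer"
            fault_mode: str          # e.g. "winding insulation degradation"
            symptoms: list = field(default_factory=list)          # observable signs
            verification_tests: list = field(default_factory=list)

        sig = FaultSignature(
            asset_type="generator step-up transformer",
            fault_mode="winding insulation degradation",
            symptoms=["rising dissolved-gas levels", "elevated winding temperature"],
            verification_tests=["dissolved gas analysis", "insulation resistance test"],
        )
        print(sig.fault_mode)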

  15. Hybrid-fusion SPECT/CT systems in parathyroid adenoma: Technological improvements and added clinical diagnostic value.

    PubMed

    Wong, K K; Chondrogiannis, S; Bowles, H; Fuster, D; Sánchez, N; Rampin, L; Rubello, D

Nuclear medicine traditionally employs planar and single photon emission computed tomography (SPECT) imaging techniques to depict the biodistribution of radiotracers for the diagnostic investigation of a range of disorders of endocrine gland function. The usefulness of combining functional information with anatomy derived from computed tomography (CT), magnetic resonance imaging (MRI), and high resolution ultrasound (US) has long been appreciated, either using visual side-by-side correlation or software-based co-registration. The emergence of hybrid SPECT/CT camera technology now allows the simultaneous acquisition of combined multi-modality imaging, with seamless fusion of 3D volume datasets. Thus, it is not surprising that there is growing literature describing the many advantages that contemporary SPECT/CT technology brings to radionuclide investigation of endocrine disorders, showing potential advantages for the pre-operative localization of parathyroid adenomas using a minimally invasive surgical approach, especially in the presence of ectopic glands and in multiglandular disease. In conclusion, hybrid SPECT/CT imaging has become an essential tool for ensuring the most accurate diagnosis in the management of patients with hyperparathyroidism. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  16. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  17. Usefulness of dual-energy computed tomography with and without dedicated software in identifying uric acid kidney stones.

    PubMed

    Salvador, R; Luque, M P; Ciudin, A; Paño, B; Buñesch, L; Sebastia, C; Nicolau, C

    2016-01-01

To prospectively evaluate the usefulness of dual-energy computed tomography (DECT) with and without dedicated software in identifying uric acid kidney stones in vivo. We studied 65 kidney stones in 63 patients. All stones were analyzed in vivo by DECT and ex vivo by spectrophotometry. We evaluated the diagnostic performance in identifying uric acid stones with DECT by analyzing the radiologic densities with dedicated software and without it (through manual measurements), as well as by analyzing the attenuation ratios of the stones at both energies with and without the dedicated software. The six uric acid stones included were correctly identified by evaluating the attenuation ratios with a cutoff of 1.21, both with the dedicated software and without it, yielding perfect diagnostic performance without false positives or false negatives. The analysis of the stones' attenuation values yielded areas under the receiver operating characteristic curve for the classification of uric acid stones of 0.92 for the measurements done with the software and 0.89 for the manual measurements; a cutoff of 538 HU yielded 84% (42/50) diagnostic accuracy for the software and 83.1% (54/65) for the manual measurements. DECT enabled the uric acid stones to be identified correctly through the calculation of the ratio of the attenuations at the two energies. The results obtained with the dedicated software were similar to those obtained manually. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
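
    The classification rule itself is a one-line ratio test. A minimal sketch (the HU values are invented for illustration; the 1.21 cutoff is the study's):

        def is_uric_acid_stone(hu_low_kv, hu_high_kv, cutoff=1.21):
            """Classify a kidney stone from its dual-energy attenuation ratio.

            Uric acid stones show a low attenuation ratio between the low- and
            high-energy acquisitions; ratios above the cutoff suggest other
            compositions (study cutoff: 1.21).
            """
            return (hu_low_kv / hu_high_kv) < cutoff

        print(is_uric_acid_stone(hu_low_kv=480.0, hu_high_kv=430.0))  # 1.12 -> True
        print(is_uric_acid_stone(hu_low_kv=900.0, hu_high_kv=600.0))  # 1.50 -> False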

  18. User Studies: Developing Learning Strategy Tool Software for Children.

    ERIC Educational Resources Information Center

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  19. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  20. MoniQA: a general approach to monitor quality assurance

    NASA Astrophysics Data System (ADS)

    Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.

    2006-03-01

    MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed in our medical physics team. It is a complete Java TM - based modular environment for the evaluation of radiological viewing devices and it thus fits in the global quality assurance network of our (film less) radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presentation of the necessary test patterns and by automated data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped in schemes to implement protocols (i.e. AAPMtg18, DIN and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a film less radiology department. Learning time was very limited. A constancy check -with the new pattern that assesses luminance decrease, resolution problems and geometric distortion- takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We will report on the software and its usability: practicality of the constancy check tests in our hospital and on the results from acceptance tests of viewing stations for digital mammography.

  1. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed in the course of this investigation; it appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
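
    Ordinary coherence, the building block that the hyper-coherence extends, measures the frequency-resolved linear relationship between two signals. A short SciPy illustration with synthetic signals (the hyper-coherence itself is not reproduced here):

        import numpy as np
        from scipy.signal import coherence

        fs = 2048.0                              # sample rate, Hz
        t = np.arange(0, 4.0, 1.0 / fs)
        rng = np.random.default_rng(0)

        shared = np.sin(2 * np.pi * 180.0 * t)   # common 180 Hz component
        x = shared + 0.5 * rng.standard_normal(t.size)
        y = 0.8 * shared + 0.5 * rng.standard_normal(t.size)

        f, cxy = coherence(x, y, fs=fs, nperseg=1024)
        print(f[np.argmax(cxy)])                 # peak coherence near 180 Hz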

  2. From microscopy to whole slide digital images: a century and a half of image analysis.

    PubMed

    Taylor, Clive R

    2011-12-01

    In the year 1850, microscopes had evolved in quality to the point that the "first pathologists emerged from the treacherous swamps of medieval practice onto the relatively firm ground that histopathology seemed to offer." These early pathologists began to practice the art of image analysis, and diagnostic surgical pathology was born. Today the traditional microscope, in the hands of an experienced pathologist, is established as the gold standard for diagnosis of cancer and other diseases. Nonetheless, it is a tool and a technology that is more than 150 years old. Rapid advances in the capabilities of digital imaging hardware and software now offer the real possibility of moving to a new level of practice, using whole slide digital images for diagnosis, education, and research in morphologic pathology. Potential efficiencies in work flow and diagnostic integration, coupled with the use of powerful new analytic methods, promise radically to change the future shape of surgical pathology.

  3. Wireless Infrastructure for Performing Monitoring, Diagnostics, and Control HVAC and Other Energy-Using Systems in Small Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick O'Neill

This project focused on developing a low-cost wireless infrastructure for monitoring, diagnosing, and controlling building systems and equipment. End users receive information via the Internet and need only a web browser and Internet connection. The system used wireless communications for: (1) collecting data centrally on site from many wireless sensors installed on building equipment, (2) transmitting control signals to actuators and (3) transmitting data to an offsite network operations center where it is processed and made available to clients on the Web (see Figure 1). Although this wireless infrastructure can be applied to any building system, it was tested on two representative applications: (1) monitoring and diagnostics for packaged rooftop HVAC units used widely on small commercial buildings and (2) continuous diagnosis and control of scheduling errors such as lights and equipment left on during unoccupied hours. This project developed a generic infrastructure for performance monitoring, diagnostics, and control, applicable to a broad range of building systems and equipment, but targeted specifically to small to medium commercial buildings (an underserved market segment). The proposed solution is based on two wireless technologies. The first, wireless telemetry, is used for cell phones and paging and is reliable and widely available. This risk proved to be easily managed during the project. The second technology is on-site wireless communication for acquiring data from sensors and transmitting control signals. The technology must enable communication with many nodes, overcome physical obstructions, operate in environments with other electrical equipment, support operation with on-board power (instead of line power) for some applications, operate at low transmission power in license-free radio bands, and be low cost. We proposed wireless mesh networking to meet these needs. This technology is relatively new and has been applied only in research and tests. This proved to be a major challenge for the project and was ultimately abandoned in favor of a directly wired solution for collecting sensor data at the building. The primary reason for this was the relatively short ranges at which we were able to effectively place the sensor nodes from the central receiving unit. Several different mesh technologies were attempted with similar results. Two hardware devices were created during the original performance period of the project. The first device, the WEB-MC, is a master control unit that has two radios, a CPU, memory, and serves as the central communications device for the WEB-MC System (currently called the 'BEST Wireless HVAC Maintenance System' as a tentative commercial product name). The WEB-MC communicates with the local mesh network system via one of its antennas. Communication with the mesh network enables the WEB-MC to configure the network, send/receive data from individual motes, and serves as the primary mechanism for collecting sensor data at remote locations. The second antenna enables the WEB-MC to connect to a cellular network ('Long-Haul Communications') to transfer data to and from the NorthWrite Network Operations Center (NOC). A third 'all-in-one' hardware solution was created after the project was extended (Phase 2) and additional resources were provided. The project team leveraged a project funded by the State of Washington to develop a hardware solution that integrated the functionality of the original two devices.
The primary reason for this approach was to eliminate the mesh network technical difficulties that severely limited the functionality of the original hardware approach. There were five separate software developments required to deliver the functionality needed for this project: the Data Server (or Network Operations Center), Web Application, Diagnostic Software, WEB-MC Embedded Software, and Mote Embedded Software. Each of these developments was necessarily dependent on the others. This resulted in a challenging management task, requiring high-bandwidth communications among all the team members. Fortunately, the project team performed exceptionally well together and was able to work through the various challenges that this presented - for example, when one software tool required a detailed description of the output of a second tool before that tool had been fully designed.

  4. MatMRI and MatHIFU: software toolboxes for real-time monitoring and control of MR-guided HIFU

    PubMed Central

    2013-01-01

Background: The availability of open and versatile software tools is a key feature to facilitate pre-clinical research for magnetic resonance imaging (MRI) and magnetic resonance-guided high-intensity focused ultrasound (MR-HIFU) and expedite clinical translation of diagnostic and therapeutic medical applications. In the present study, two customizable software tools that were developed at the Thunder Bay Regional Research Institute are presented for use with both MRI and MR-HIFU. Both tools operate in a MATLAB® environment. The first tool is named MatMRI and enables real-time, dynamic acquisition of MR images with a Philips MRI scanner. The second tool is named MatHIFU and enables the execution and dynamic modification of user-defined treatment protocols with the Philips Sonalleve MR-HIFU therapy system to perform ultrasound exposures in MR-HIFU therapy applications. Methods: MatMRI requires four basic steps: initiate communication, subscribe to MRI data, query for new images, and unsubscribe. MatMRI can also pause/resume the imaging and perform real-time updates of the location and orientation of images. MatHIFU follows a similar sequence: initiate communication, prepare the treatment protocol, and execute it. MatHIFU can monitor the state of execution and, if required, modify the protocol in real time. Results: Four applications were developed to showcase the capabilities of MatMRI and MatHIFU for pre-clinical research. Firstly, MatMRI was integrated with an existing small animal MR-HIFU system (FUS Instruments, Toronto, Ontario, Canada) to provide real-time temperature measurements. Secondly, MatMRI was used to perform T2-based MR thermometry in the bone marrow. Thirdly, MatHIFU was used to automate acoustic hydrophone measurements on a per-element basis for the 256-element transducer of the Sonalleve system. Finally, MatMRI and MatHIFU were combined to produce and image a heating pattern that recreates the word ‘HIFU’ in a tissue-mimicking heating phantom. Conclusions: MatMRI and MatHIFU leverage existing MRI and MR-HIFU clinical platforms to facilitate pre-clinical research. MatMRI substantially simplifies the real-time acquisition and processing of MR data. MatHIFU facilitates the testing and characterization of new therapy applications using the Philips Sonalleve clinical MR-HIFU system. Under coordination with Philips Healthcare, both MatMRI and MatHIFU are intended to be freely available as open-source software packages to other research groups. PMID:25512856
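
    The four-step MatMRI pattern (connect, subscribe, poll for new images, unsubscribe) is a generic real-time-client loop. The Python sketch below shows the shape of such a loop against a hypothetical scanner interface; every class and method name is invented for illustration and does not correspond to the MatMRI API:

        import time

        class ScannerClient:
            """Hypothetical real-time image client (not the MatMRI API)."""
            def connect(self): ...
            def subscribe(self, stream): ...
            def poll_new_images(self): return []  # would return new images
            def unsubscribe(self): ...
            def disconnect(self): ...

        def monitor(client, process_image, duration_s=10.0):
            # Step 1: initiate communication; step 2: subscribe to image data
            client.connect()
            client.subscribe("dynamic_images")
            deadline = time.monotonic() + duration_s
            try:
                # Step 3: repeatedly query for new images as they are acquired
                while time.monotonic() < deadline:
                    for image in client.poll_new_images():
                        process_image(image)   # e.g. real-time thermometry
                    time.sleep(0.05)
            finally:
                # Step 4: unsubscribe and close the connection
                client.unsubscribe()
                client.disconnect()

        monitor(ScannerClient(), print, duration_s=0.2)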

  5. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. The paper discusses how OSS tools are used and the benefits they bring, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.

  6. Tools for Administration of a UNIX-Based Network

    NASA Technical Reports Server (NTRS)

    LeClaire, Stephen; Farrar, Edward

    2004-01-01

Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.

  7. A GPU Simulation Tool for Training and Optimisation in 2D Digital X-Ray Imaging.

    PubMed

    Gallio, Elena; Rampado, Osvaldo; Gianaria, Elena; Bianchi, Silvio Diego; Ropolo, Roberto

    2015-01-01

Conventional radiology is performed by means of digital detectors, with various types of technology and different performance in terms of efficiency and image quality. Following the arrival of a new digital detector in a radiology department, all the staff involved should adapt the procedure parameters to the properties of the detector, in order to achieve an optimal result in terms of correct diagnostic information and minimum radiation risks for the patient. The aim of this study was to develop and validate software capable of simulating a digital X-ray imaging system, using graphics processing unit computing. All radiological image components were implemented in this application: an X-ray tube with primary beam, a virtual patient, noise, scatter radiation, a grid and a digital detector. Three different digital detectors (two digital radiography systems and one computed radiography system) were implemented. In order to validate the software, we carried out a quantitative comparison of geometrical and anthropomorphic phantom simulated images with those acquired. In terms of average pixel values, the maximum differences were below 15%, while the noise values were in agreement with a maximum difference of 20%. The relative trends of contrast to noise ratio versus beam energy and intensity were well simulated. Total calculation times were below 3 seconds for clinical images with pixel sizes below 0.2 mm. The application proved to be efficient and realistic. Short calculation times and the accuracy of the results obtained make this software a useful tool for training operators and for dose optimisation studies.
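
    The simulation chain enumerated above composes per pixel: primary fluence is attenuated by the virtual patient, scatter is added, the grid transmits scatter only partially, and quantum noise is applied at the detector. A toy NumPy sketch of that composition (all factors and names invented; no GPU code shown):

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_image(primary, patient_transmission, scatter_fraction=0.3,
                           grid_scatter_transmission=0.2, gain=100.0):
            """Toy per-pixel composition of a 2D X-ray imaging chain."""
            transmitted = primary * patient_transmission     # attenuation by patient
            scatter = scatter_fraction * transmitted.mean()  # crude uniform scatter
            signal = transmitted + grid_scatter_transmission * scatter
            quanta = rng.poisson(signal * gain)              # quantum (Poisson) noise
            return quanta / gain

        primary = np.full((4, 4), 10.0)
        patient = rng.uniform(0.1, 0.9, size=(4, 4))
        print(simulate_image(primary, patient).round(2))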

  8. An integrated approach for increasing breeding efficiency in apple and peach in Europe.

    PubMed

    Laurens, Francois; Aranzana, Maria José; Arus, Pere; Bassi, Daniele; Bink, Marco; Bonany, Joan; Caprera, Andrea; Corelli-Grappadelli, Luca; Costes, Evelyne; Durel, Charles-Eric; Mauroux, Jehan-Baptiste; Muranty, Hélène; Nazzicari, Nelson; Pascal, Thierry; Patocchi, Andrea; Peil, Andreas; Quilot-Turion, Bénédicte; Rossini, Laura; Stella, Alessandra; Troggio, Michela; Velasco, Riccardo; van de Weg, Eric

    2018-01-01

Despite the availability of whole genome sequences of apple and peach, there has been a considerable gap between genomics and breeding. To bridge the gap, the European Union funded the FruitBreedomics project (March 2011 to August 2015) involving 28 research institutes and private companies. Three complementary approaches were pursued: (i) tool and software development, (ii) deciphering the genetic control of main horticultural traits taking into account allelic diversity, and (iii) developing plant materials, tools and methodologies for breeders. Decisive breakthroughs were made, including ready-to-use DNA diagnostic tests for Marker Assisted Breeding, new dense SNP arrays in apple and peach, new phenotyping methods for some complex traits, and software for gene/QTL discovery on breeding germplasm via Pedigree Based Analysis (PBA). This resulted in the discovery of highly predictive molecular markers for traits of horticultural interest via PBA and via Genome Wide Association Studies (GWAS) on several European genebank collections. FruitBreedomics also developed pre-breeding plant materials in which multiple sources of resistance were pyramided, and software that can support breeders in their selection activities. Through FruitBreedomics, significant progress was made in the field of apple and peach breeding, genetics, genomics and bioinformatics, on which breeders, germplasm curators and scientists will be able to build. A major part of the data collected during the project has been stored in the FruitBreedomics database and has been made available to the public. This review covers the scientific discoveries made in this major endeavour and offers perspectives on apple and peach breeding and genomics in Europe and beyond.

  9. Scanning and Measuring Device for Diagnostic of Barrel Bore

    NASA Astrophysics Data System (ADS)

    Marvan, Ales; Hajek, Josef; Vana, Jan; Dvorak, Radim; Drahansky, Martin; Jankovych, Robert; Skvarek, Jozef

The article discusses the design, mechanics, electronics and software of a robot for the diagnosis of gun barrels with calibers from 120 mm to 155 mm. This diagnostic device is intended primarily for experimental research and for verifying appropriate methods and technologies for diagnosing the main bore of guns. The article also discusses the design of the sensors and software, data processing, and the reconstruction of images obtained by scanning the surface of the bore.

  10. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality and for ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  11. Using EMIS to Identify Top Opportunities for Commercial Building Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guanjing; Singla, Rupam; Granderson, Jessica

Energy Management and Information Systems (EMIS) comprise a broad family of tools and services to manage commercial building energy use. These technologies offer a mix of capabilities to store, display, and analyze energy use and system data, and in some cases, provide control. EMIS technologies enable 10–20 percent site energy savings in best practice implementations. Energy Information System (EIS) and Fault Detection and Diagnosis (FDD) systems are two key technologies in the EMIS family. Energy Information Systems are broadly defined as the web-based software, data acquisition hardware, and communication systems used to analyze and display building energy performance. At a minimum, an EIS provides daily, hourly or sub-hourly interval meter data at the whole-building level, with graphical and analytical capability. Fault Detection and Diagnosis systems automatically identify heating, ventilation, and air-conditioning (HVAC) system or equipment-level performance issues, and in some cases are able to isolate the root causes of the problem. They use computer algorithms to continuously analyze system-level operational data to detect faults and diagnose their causes. Many FDD tools integrate the trend log data from a Building Automation System (BAS) but otherwise are stand-alone software packages; other types of FDD tools are implemented as “on-board” equipment-embedded diagnostics. (This document focuses on the former.) Analysis approaches adopted in FDD technologies span a variety of techniques, from rule-based methods to process-history-based approaches. FDD tools automate investigations that would otherwise require manual data inspection by someone with expert knowledge, thereby expanding accessibility and breadth of analysis opportunity, and also reducing complexity.
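
    Rule-based FDD, the simplest of the analysis approaches mentioned, encodes expert checks directly against trend-log data. A minimal sketch of one such rule; the fault condition is a classic air-handler fault, but the threshold and field names are illustrative assumptions:

        def simultaneous_heating_cooling(samples, min_valve_pct=10.0):
            """Flag trend samples where heating and cooling valves are open at the
            same time, a classic energy-wasting fault (illustrative rule only)."""
            return [
                s["timestamp"]
                for s in samples
                if s["heating_valve_pct"] > min_valve_pct
                and s["cooling_valve_pct"] > min_valve_pct
            ]

        trend = [
            {"timestamp": "08:00", "heating_valve_pct": 0.0,  "cooling_valve_pct": 35.0},
            {"timestamp": "08:15", "heating_valve_pct": 25.0, "cooling_valve_pct": 30.0},
        ]
        print(simultaneous_heating_cooling(trend))  # ['08:15']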

  12. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  13. Breast-specific gamma camera imaging with 99mTc-MIBI has better diagnostic performance than magnetic resonance imaging in breast cancer patients: A meta-analysis.

    PubMed

    Zhang, Aimi; Li, Panli; Liu, Qiufang; Song, Shaoli

    2017-01-01

    This study aimed to evaluate the diagnostic role of breast-specific gamma camera imaging (BSGI) with technetium-99m-methoxy isobutyl isonitrile (99mTc-MIBI) and magnetic resonance imaging (MRI) in patients with breast cancer through a meta-analysis. Three reviewers searched articles published in medical journals before June 2016 in MEDLINE, EMBASE and Springer Databases; the references listed in original articles were also retrieved. We used the quality assessment of diagnostic accuracy studies (QUADAS) tool to assess the quality of the included studies. Heterogeneity, pooled sensitivity and specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio (DOR) and summary receiver operating characteristic (SROC) curves were calculated by Meta-DiSc software to estimate the diagnostic performance of BSGI and MRI. Ten studies with 517 patients were included after meeting the inclusion criteria. We did a subgroup analysis of the same data type. The pooled sensitivities of BSGI and MRI were: 0.84 (95% CI, 0.79-0.88) and 0.89 (95% CI, 0.84-0.92) respectively, and the pooled specificities of BSGI and MRI were: 0.82 (95% CI, 0.74-0.88) and 0.39 (95% CI, 0.30-0.49) respectively. The areas under the SROC curve of BSGI and MRI were 0.93 and 0.72 respectively. The results of our meta-analysis indicated that compared with MRI, BSGI has similar sensitivity, higher specificity, better diagnostic performance, and can be widely used in clinical practice.
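
    For readers unfamiliar with these metrics, the Python sketch below derives likelihood ratios and the diagnostic odds ratio directly from the pooled sensitivity and specificity quoted above. This is only a rough illustration: tools such as Meta-DiSc pool these quantities per study rather than deriving them from pooled averages.

        # Back-of-the-envelope diagnostic metrics from the pooled estimates above.
        def diagnostic_metrics(sens, spec):
            lr_pos = sens / (1 - spec)   # positive likelihood ratio
            lr_neg = (1 - sens) / spec   # negative likelihood ratio
            dor = lr_pos / lr_neg        # diagnostic odds ratio
            return lr_pos, lr_neg, dor

        for name, sens, spec in [("BSGI", 0.84, 0.82), ("MRI", 0.89, 0.39)]:
            lr_p, lr_n, dor = diagnostic_metrics(sens, spec)
            print(f"{name}: LR+ = {lr_p:.2f}  LR- = {lr_n:.2f}  DOR = {dor:.1f}")

    Running this shows the point the abstract makes: MRI's poor pooled specificity drags its likelihood ratios toward 1, while BSGI's balanced sensitivity and specificity yield a much higher diagnostic odds ratio.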

  14. Galaxy emission line classification using three-dimensional line ratio diagrams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, Frédéric P. A.; Dopita, Michael A.; Kewley, Lisa J.

    2014-10-01

    Two-dimensional (2D) line ratio diagnostic diagrams have become a key tool in understanding the excitation mechanisms of galaxies. The curves used to separate the different regions—H II-like or excited by an active galactic nucleus (AGN)—have been refined over time but the core technique has not evolved significantly. However, the classification of galaxies based on their emission line ratios really is a multi-dimensional problem. Here we exploit recent software developments to explore the potential of three-dimensional (3D) line ratio diagnostic diagrams. We introduce the ZQE diagrams, which are a specific set of 3D diagrams that separate the oxygen abundance and the ionization parameter of H II region-like spectra and also enable us to probe the excitation mechanism of the gas. By examining these new 3D spaces interactively, we define the ZE diagnostics, a new set of 2D diagnostics that can provide the metallicity of objects excited by hot young stars and that cleanly separate H II region-like objects from the different classes of AGNs. We show that these ZE diagnostics are consistent with the key log [N II]/Hα versus log [O III]/Hβ diagnostic currently used by the community. They also have the advantage of attaching a probability that a given object belongs to one class or the other. Finally, we discuss briefly why ZQE diagrams can provide a new way to differentiate and study the different classes of AGNs in anticipation of a dedicated follow-up study.
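
    The 2D line-ratio technique that the ZE diagnostics refine can be illustrated with a short Python sketch on the log [N II]/Hα versus log [O III]/Hβ plane, using the widely cited Kewley et al. (2001) maximum-starburst curve. Note this is the standard 2D diagnostic, not the paper's new 3D ZQE/ZE diagnostics.

        import numpy as np

        # Classify a spectrum on the log([N II]/Ha) vs log([O III]/Hb) plane.
        # Points above the Kewley et al. (2001) curve are AGN-like.
        def is_agn(n2_ha, o3_hb):
            x, y = np.log10(n2_ha), np.log10(o3_hb)
            if x >= 0.47:            # the curve diverges at x = 0.47
                return True
            return y > 0.61 / (x - 0.47) + 1.19

        print(is_agn(0.1, 0.3))   # H II-like ratios -> False
        print(is_agn(1.0, 5.0))   # AGN-like ratios  -> True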

  15. 47 CFR 73.9007 - Robustness requirements for covered demodulator products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Digital Broadcast Television Redistribution Control § 73.9007...-available tools or equipment also means specialized electronic tools or software tools that are widely... requirements set forth in this subpart. Such specialized electronic tools or software tools includes, but is...

  16. Problem Representation, Background Evidence, Analysis, Recommendation: An Oral Case Presentation Tool to Promote Diagnostic Reasoning.

    PubMed

    Carter, Cristina; Akar-Ghibril, Nicole; Sestokas, Jeff; Dixon, Gabrina; Bradford, Wilhelmina; Ottolini, Mary

    2018-03-01

    Oral case presentations provide an opportunity for trainees to communicate diagnostic reasoning at the bedside. However, few tools exist to enable faculty to provide effective feedback. We developed a tool to assess diagnostic reasoning and communication during oral case presentations. Published by Elsevier Inc.

  17. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.

  18. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  19. Integrated approach to ischemic heart disease. The one-stop shop.

    PubMed

    Kramer, C M

    1998-05-01

    Magnetic resonance imaging is unique in its variety of applications for imaging the cardiovascular system. A thorough assessment of myocardial structure, function, and perfusion; assessment of coronary artery anatomy and flow; and spectroscopic evaluation of cardiac energetics can be readily performed by magnetic resonance imaging. One key to the advancement of cardiac magnetic resonance imaging as a clinical tool is the combination of these studies into a single integrated examination, the so-called one-stop shop. Improvements in magnetic resonance hardware, software, and imaging speed now permit this integrated examination. Cardiac magnetic resonance is a powerful technique with the potential to replace or complement other commonly used techniques in the diagnostic armamentarium of physicians caring for patients with ischemic heart disease.

  20. Dysferlin quantification in monocytes for rapid screening for dysferlinopathies.

    PubMed

    Sánchez-Chapul, Laura; Ángel-Muñoz, Miguel Del; Ruano-Calderón, Luis; Luna-Angulo, Alexandra; Coral-Vázquez, Ramón; Hernández-Hernández, Óscar; Magaña, Jonathan J; León-Hernández, Saúl R; Escobar-Cedillo, Rosa E; Vargas, Steven

    2016-12-01

    In this study, we determined normal levels of dysferlin expression in CD14 + monocytes by flow cytometry (FC) as a screening tool for dysferlinopathies. Monocytes from 183 healthy individuals and 29 patients were immunolabeled, run on a FACSCalibur flow cytometer, and analyzed by FlowJo software. The relative quantity of dysferlin was expressed as mean fluorescence intensity (MFI). Performance of this diagnostic test was assessed by calculating likelihood ratios at different MFI cut-off points, which allowed definition of 4 disease classification groups in a simplified algorithm. The MFI value may differentiate patients with dysferlinopathy from healthy individuals; it may be a useful marker for screening purposes. Muscle Nerve 54: 1064-1071, 2016. © 2016 Wiley Periodicals, Inc.
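
    As an illustration of how likelihood ratios at different MFI cut-off points can be derived, the Python sketch below computes LR+ values from labeled MFI measurements. All values and cut-offs here are invented and do not reflect the study's data.

        # Sketch: positive likelihood ratio at different MFI cut-offs.
        def lr_positive(patient_mfi, control_mfi, cutoff):
            """LR+ for calling 'dysferlin-deficient' when MFI falls below the cut-off."""
            sens = sum(m < cutoff for m in patient_mfi) / len(patient_mfi)
            fpr = sum(m < cutoff for m in control_mfi) / len(control_mfi)
            return sens / fpr if fpr > 0 else float("inf")

        patients = [12, 15, 18, 22, 30, 35]     # hypothetical MFI values
        controls = [45, 60, 70, 80, 95, 110]
        for cutoff in (40, 55, 65):
            print(f"cut-off {cutoff}: LR+ = {lr_positive(patients, controls, cutoff)}")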

  1. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges thar we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identrfj, open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  2. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  3. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    Under NREL's computer-aided engineering effort, software tools help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs). [Image: simulation of a battery pack]

  4. Small PACS implementation using publicly available software

    NASA Astrophysics Data System (ADS)

    Passadore, Diego J.; Isoardi, Roberto A.; Gonzalez Nicolini, Federico J.; Ariza, P. P.; Novas, C. V.; Omati, S. A.

    1998-07-01

    Building cost-effective PACS solutions is a main concern in developing countries. Hardware and software components are generally much more expensive than in developed countries, and tighter financial constraints further contribute to a slow rate of implementation of PACS. The extensive use of the Internet for sharing resources and information has brought a broad number of freely available software packages to an ever-increasing number of users. In the field of medical imaging, it is possible to find image format conversion packages, DICOM-compliant servers for all kinds of service classes, databases, web servers, image visualization, manipulation and analysis tools, etc. This paper describes a PACS implementation for review and storage built on freely available software. It currently integrates four diagnostic modalities (PET, CT, MR and NM), a Radiotherapy Treatment Planning workstation, and several computers in a local area network, for image storage, database management and image review, processing and analysis. It also includes a web-based application that allows remote users to query the archive for studies from any workstation and to view the corresponding images and reports. We conclude that the advantage of using this approach is twofold: it allows a full understanding of all the issues involved in the implementation of a PACS, and it also contributes to keeping costs down while enabling the development of a functional system for storage, distribution and review that can prove helpful for radiologists and referring physicians.
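
    To give a flavor of the freely available DICOM tooling such a PACS builds on, the sketch below performs a DICOM verification (C-ECHO) against an archive using the open-source Python package pynetdicom. The host, port, and AE titles are placeholders, not the system described above.

        from pynetdicom import AE

        VERIFICATION = "1.2.840.10008.1.1"   # standard Verification SOP Class UID

        ae = AE(ae_title="TEST_SCU")
        ae.add_requested_context(VERIFICATION)

        # Placeholder archive address and called AE title.
        assoc = ae.associate("pacs.example.org", 11112, ae_title="ARCHIVE")
        if assoc.is_established:
            status = assoc.send_c_echo()     # returns a status dataset
            if status:
                print("C-ECHO status: 0x{0:04X}".format(status.Status))
            assoc.release()
        else:
            print("Could not associate with the archive")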

  5. Diagnostic value of 3D time-of-flight MRA in trigeminal neuralgia.

    PubMed

    Cai, Jing; Xin, Zhen-Xue; Zhang, Yu-Qiang; Sun, Jie; Lu, Ji-Liang; Xie, Feng

    2015-08-01

    The aim of this meta-analysis was to evaluate the diagnostic value of 3D time-of-flight magnetic resonance angiography (3D-TOF-MRA) in trigeminal neuralgia (TN). Relevant studies were identified by computerized database searches supplemented by manual search strategies. The studies were included in accordance with stringent inclusion and exclusion criteria. Following a multistep screening process, high quality studies related to the diagnostic value of 3D-TOF-MRA in TN were selected for meta-analysis. Statistical analyses were conducted using Statistical Analysis Software (version 8.2; SAS Institute, Cary, NC, USA) and Meta Disc (version 1.4; Unit of Clinical Biostatistics, Ramon y Cajal Hospital, Madrid, Spain). For the present meta-analysis, we initially retrieved 95 studies from database searches. A total of 13 studies were eventually enrolled containing a combined total of 1084 TN patients. The meta-analysis results demonstrated that the sensitivity and specificity of the diagnostic value of 3D-TOF-MRA in TN were 95% (95% confidence interval [CI] 0.93-0.96) and 77% (95% CI 0.66-0.86), respectively. The pooled positive likelihood ratio and negative likelihood ratio were 2.72 (95% CI 1.81-4.09) and 0.08 (95% CI 0.06-0.12), respectively. The pooled diagnostic odds ratio of 3D-TOF-MRA in TN was 52.92 (95% CI 26.39-106.11), and the corresponding area under the curve in the summary receiver operating characteristic curve based on the 3D-TOF-MRA diagnostic image of observers was 0.9695 (standard error 0.0165). Our results suggest that 3D-TOF-MRA has excellent sensitivity and specificity as a diagnostic tool for TN, and that it can accurately identify neurovascular compression in TN patients. Copyright © 2015 Elsevier Ltd. All rights reserved.
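
    To connect the reported diagnostic odds ratio to the area under the SROC curve, the Python sketch below traces the symmetric SROC implied by a constant DOR and integrates it numerically. This simplified constant-DOR model yields an AUC near 0.94 for DOR = 52.92, in the same high-accuracy range as, but not identical to, the fitted value of 0.9695 reported above.

        import numpy as np

        def sroc_auc(dor, n=100000):
            # Symmetric SROC: at every FPR the TPR odds are DOR times the FPR odds.
            fpr = np.linspace(1e-6, 1 - 1e-6, n)
            odds = dor * fpr / (1 - fpr)
            tpr = odds / (1 + odds)
            # trapezoidal integration of TPR over FPR
            return float(((tpr[1:] + tpr[:-1]) / 2 * np.diff(fpr)).sum())

        print(f"AUC implied by DOR = 52.92: {sroc_auc(52.92):.3f}")   # ~0.94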

  6. Methodology for automating software systems. Task 1 of the foundations for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1989-01-01

    The early stages of a research program designed to establish an experimental research platform for software engineering are described. Major emphasis is placed on Computer Assisted Software Engineering (CASE). The Poor Man's CASE Tool is based on the Apple Macintosh system, employing available software including Focal Point II, Hypercard, XRefText, and Macproject. These programs are functional in themselves, but through advanced linking are available for operation from within the tool being developed. The research platform is intended to merge software engineering technology with artificial intelligence (AI). In the first prototype of the PMCT, however, the sections of AI are not included. CASE tools assist the software engineer in planning goals, routes to those goals, and ways to measure progress. The method described allows software to be synthesized instead of being written or built.

  7. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of a hand-held tool is for the purpose of controlling the speed of a fastener interface mechanism and the torque applied to fasteners by the fastener interface mechanism of the hand-held tool, and for monitoring the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.
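
    The speed/torque control idea can be sketched in a few lines of Python. The motor interface below is a hypothetical stand-in for the tool's embedded in-tool software, and all numbers are invented.

        class FakeMotor:
            """Simulated motor: indicated torque ramps up as the fastener seats."""
            def __init__(self):
                self.rpm = 0
                self._torque = 0.0
            def set_speed(self, rpm):
                self.rpm = rpm
            def stop(self):
                self.rpm = 0
            def read_torque(self):
                self._torque += 0.8              # crude ramp per control tick
                return self._torque

        def run_fastener(motor, target_torque=12.0, free_run_rpm=400, seat_rpm=60):
            """Run fast until the fastener seats, slow down, stop at target torque."""
            history = []
            motor.set_speed(free_run_rpm)
            while True:
                torque = motor.read_torque()
                history.append((motor.rpm, torque))   # performance history for analysis
                if torque >= target_torque:
                    motor.stop()
                    break
                if torque >= 0.5 * target_torque:     # seated: drop to seating speed
                    motor.set_speed(seat_rpm)
            return history

        print(len(run_fastener(FakeMotor())), "control ticks recorded")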

  8. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  9. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of software production has increased by only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  10. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
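
    A minimal Python sketch of the annunciator-mapping idea follows: alarm rules attach to vital-sign fields and fire when a limit is crossed. The signal names, limits, and print-based "playback" are illustrative assumptions rather than PT-SAFE's actual interface.

        # Alarm rules mapped to vital-sign fields; limits are invented.
        ALARM_RULES = {
            "spo2": lambda v: v < 90,                    # oxygen saturation, percent
            "heart_rate": lambda v: v < 40 or v > 140,   # beats per minute
            "etco2": lambda v: v > 60,                   # end-tidal CO2, mmHg
        }

        def annunciate(sample, play=print):
            """Check one monitor sample against every rule; 'play' stands in
            for the audio back-end that would annunciate a wave file."""
            for signal, breached in ALARM_RULES.items():
                if signal in sample and breached(sample[signal]):
                    play(f"ALARM: {signal} = {sample[signal]}")

        annunciate({"spo2": 86, "heart_rate": 72, "etco2": 64})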

  11. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  12. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    Abstract With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
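
    As a hedged illustration of what even crude, automated quality measurement can look like, the Python sketch below computes two simple proxies (non-blank lines of code and branching density) for C/C++ source files. Real assessments of the kind described above rely on proper static-analysis tooling; the keyword heuristic here is an oversimplification.

        import re
        import sys

        # Branch points approximated by keywords and logical operators.
        BRANCHES = re.compile(r"\b(?:if|for|while|case)\b|&&|\|\|")

        def quality_proxies(path):
            with open(path, errors="replace") as fh:
                lines = fh.readlines()
            # keep non-blank lines that are not obvious '//' comments (naive filter)
            code = [l for l in lines if l.strip() and not l.lstrip().startswith("//")]
            branches = sum(len(BRANCHES.findall(l)) for l in code)
            return {"loc": len(code),
                    "branches_per_100_loc": round(100 * branches / max(len(code), 1), 1)}

        if __name__ == "__main__":
            for src in sys.argv[1:]:        # e.g., python proxies.py main.c tree.cpp
                print(src, quality_proxies(src))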

  13. Spatial resolution measurements by Radia diagnostic software with SEDENTEXCT image quality phantom in cone beam CT for dental use.

    PubMed

    Watanabe, Hiroshi; Nomura, Yoshikazu; Kuribayashi, Ami; Kurabayashi, Tohru

    2018-02-01

    We aimed to employ the Radia diagnostic software with the Safety and Efficacy of a New and Emerging Dental X-ray Modality (SEDENTEXCT) image quality (IQ) phantom in CT, and to evaluate its validity. The SEDENTEXCT IQ phantom and Radia diagnostic software were employed. The phantom was scanned using one medical full-body CT and two dentomaxillofacial cone beam CTs. The obtained images were imported to the Radia software, and the spatial resolution outputs were evaluated. The oversampling method was employed using our original wire phantom as a reference. The resultant modulation transfer function (MTF) curves were compared. The null hypothesis was that MTF curves generated using both methods would be in agreement. One-way analysis of variance tests were applied to the f50 and f10 values from the MTF curves. The f10 values were subjectively confirmed by observing the line pair modules. The Radia software reported the MTF curves on the xy-plane of the CT scans, but could not return f50 and f10 values on the z-axis. The null hypothesis concerning the reported MTF curves on the xy-plane was rejected. There were significant differences between the results of the Radia software and our reference method, except for f10 values in CS9300. These findings were consistent with our line pair observations. We evaluated the validity of the Radia software with the SEDENTEXCT IQ phantom. The measurements provided were semi-automatic, albeit with problems, and differed statistically from our reference. We hope the manufacturer will overcome these limitations.
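
    The oversampling idea behind the reference method can be illustrated in Python: Fourier-transform a line spread function (LSF), normalize, and read off f50 and f10 as the frequencies where the MTF falls below 0.5 and 0.1. The Gaussian profile and sampling pitch below are synthetic assumptions standing in for real wire-phantom data.

        import numpy as np

        pitch = 0.1                                   # mm per (oversampled) sample
        x = np.arange(-64, 64) * pitch
        lsf = np.exp(-0.5 * (x / 0.25) ** 2)          # synthetic LSF, sigma = 0.25 mm

        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]                                 # normalize to 1 at zero frequency
        freqs = np.fft.rfftfreq(len(lsf), d=pitch)    # cycles per mm

        def f_at(level):
            idx = np.argmax(mtf < level)              # first frequency below the level
            return freqs[idx]

        print(f"f50 = {f_at(0.5):.2f} lp/mm, f10 = {f_at(0.1):.2f} lp/mm")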

  14. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    PubMed

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs, to provide an overview of their degree of validation, as well as to identify gaps in current knowledge and prospects for future innovation. Structured literature review in PubMed and Embase complemented by Google search and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with a modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base was published for two (ALMANACH and ePOCT); ALMANACH and ePOCT were validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies, using data gathered through larger studies to adapt the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and continuous integration of clinically useful diagnostic and treatment innovations. Copyright © 2018. Published by Elsevier Ltd.

  15. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  16. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  17. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  18. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

    This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture design environment.

  19. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  20. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.

    1984-01-01

    The project to automate the management of software production systems is described. The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed in prototype form. The construction methods are described.

  1. BH-ShaDe: A Software Tool That Assists Architecture Students in the Ill-Structured Task of Housing Design

    ERIC Educational Resources Information Center

    Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis

    2016-01-01

    In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…

  2. A subscale facility for liquid rocket propulsion diagnostics at Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Raines, N. G.; Bircher, F. E.; Chenevert, D. J.

    1991-01-01

    The Diagnostics Testbed Facility (DTF) at NASA's John C. Stennis Space Center in Mississippi was designed to provide a testbed for the development of rocket engine exhaust plume diagnostics instrumentation. A 1200-lb thrust liquid oxygen/gaseous hydrogen thruster is used as the plume source for experimentation and instrument development. Theoretical comparative studies have been performed with aerothermodynamic codes to ensure that the DTF thruster (DTFT) has been optimized to produce a plume with pressure and temperature conditions as much like the plume of the Space Shuttle Main Engine as possible. Operation of the DTFT is controlled by an icon-driven software program using a series of soft switches. Data acquisition is performed using the same software program. A number of plume diagnostics experiments have utilized the unique capabilities of the DTF.

  3. GPM Timeline Inhibits For IT Processing

    NASA Technical Reports Server (NTRS)

    Dion, Shirley K.

    2014-01-01

    The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. The Global Precipitation Measurement (GPM) Mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of inhibit and safety-critical software design for hazardous subsystems such as the high-gain antenna boom, solar array, and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new tool, the safety inhibit timeline, was created for the management of inhibits and their controls during spacecraft buildup and testing during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls the electrical inhibits. The Safety Inhibit Timeline Tool strengthens the Safety Analyst's understanding of the removal of inhibits during the IT process with safety-critical software. With this tool, the Safety Analyst can confirm proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to understanding inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states with each phase of hardware and software testing and the impact of safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.

  4. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  5. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand

    PubMed Central

    2018-01-01

    Background The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. Methods On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. Results All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. Conclusion These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. PMID:29441756

  6. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  7. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    PubMed

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
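
    A prioritization matrix reduces to a weighted scoring exercise, so it can be illustrated in a few lines of Python: candidate enhancements are ranked by their weighted criterion scores. The criteria, weights, and scores below are invented (and here a higher "effort" score means less effort required).

        # Minimal prioritization matrix: rank candidates by weighted score.
        weights = {"patient_safety": 0.5, "call_volume": 0.3, "effort": 0.2}

        candidates = {
            "chest-pain algorithm": {"patient_safety": 9, "call_volume": 8, "effort": 4},
            "rash algorithm": {"patient_safety": 3, "call_volume": 6, "effort": 8},
            "medication questions": {"patient_safety": 5, "call_volume": 9, "effort": 6},
        }

        def priority(scores):
            return sum(weights[c] * s for c, s in scores.items())

        for name, scores in sorted(candidates.items(), key=lambda kv: -priority(kv[1])):
            print(f"{priority(scores):4.1f}  {name}")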

  8. Comparison of two high-resolution manometry software systems in evaluating esophageal motor function.

    PubMed

    Rengarajan, A; Drapekin, J; Patel, A; Gyawali, C P

    2016-12-01

    High-resolution manometry (HRM) utilizes software tools to diagnose esophageal motor disorders. Performance of these software metrics could be affected by averaging and by software characteristics of different manufacturers. High-resolution manometry studies on 86 patients referred for antireflux surgery (61.6 ± 1.4 years, 70% F) and 20 healthy controls (27.9 ± 0.7 years, 45% F) were first subject to standard analysis (Medtronic, Duluth, GA, USA). Coordinates for each of 10 test swallows were exported and averaged to generate a composite swallow. The swallows and averaged composites were imported as ASCII file format into Manoview (Medtronic) and Medical Measurement Systems database reporter (MMS, Dover, NH, USA), and analyses repeated. Comparisons were made between standard and composite swallow interpretations. Correlation between the two systems was high for mean distal contractile integral (DCI, r² ≥ 0.9) but lower for integrated relaxation pressure (IRP, r² = 0.7). Excluding achalasia, six patients with outflow obstruction (mean IRP 23.2 ± 2.1 with 10-swallow average) were identified by both systems. An additional nine patients (10.5%) were identified as having outflow obstruction (15 mmHg threshold) with MMS 10-swallow evaluation and four with MMS composite swallow evaluation; only one was confirmed. Ineffective esophageal motility was diagnosed by 10-swallow evaluation in 19 (22.1%) with Manoview and 20 (23.3%) with MMS. On Manoview composite, 17 had DCI < 450 mmHg·s·cm, and on MMS composite, 21 (p ≥ 0.85 for each comparison), but these did not impact diagnostic conclusions. Comparison of 10-swallow and composite-swallow analyses demonstrates variability in software metrics between manometry systems. Our data support use of manufacturer-specific software metrics on 10-swallow sequences. © 2016 John Wiley & Sons Ltd.
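
    The averaging question the study probes can be illustrated with a toy Python sketch: because metrics such as DCI involve a nonlinear floor (only pressures above 20 mmHg count), the metric of an averaged composite swallow need not equal the average of per-swallow metrics. The pressure grids below are synthetic, and the metric is a simplified DCI-like integral, not either vendor's implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        swallows = [15 + 15 * rng.random((50, 12)) for _ in range(10)]  # time x sensors

        composite = np.mean(swallows, axis=0)        # point-wise averaged swallow

        def dci_like(grid, dt=0.1, dx=1.0, floor=20.0):
            """Integrate pressure in excess of the 20 mmHg floor over time and length."""
            return np.clip(grid - floor, 0.0, None).sum() * dt * dx   # mmHg*s*cm

        per_swallow = [dci_like(s) for s in swallows]
        print(f"mean of 10 per-swallow DCIs: {np.mean(per_swallow):.0f} mmHg*s*cm")
        print(f"DCI of the composite swallow: {dci_like(composite):.0f} mmHg*s*cm")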

  9. Use and Perceived Benefits of Handheld Computer-based Clinical References

    PubMed Central

    Rothschild, Jeffrey M.; Fang, Edward; Liu, Vincent; Litvak, Irina; Yoon, Cathy; Bates, David W.

    2006-01-01

    Objective Clinicians are increasingly using handheld computers (HC) during patient care. We sought to assess the role of HC-based clinical reference software in medical practice by conducting a survey and assessing actual usage behavior. Design During a 2-week period in February 2005, 3600 users of a HC-based clinical reference application were asked by e-mail to complete a survey and permit analysis of their usage patterns. The software includes a pharmacopeia, an infectious disease reference, a medical diagnostic and therapeutic reference and transmits medical alerts and other notifications during HC synchronizations. Software usage data were captured during HC synchronization for the 4 weeks prior to survey completion. Measurements Survey responses and software usage data. Results The survey response rate was 42% (n = 1501). Physicians reported using the clinical reference software for a mean of 4 years and 39% reported using the software during more than half of patient encounters. Physicians who synchronized their HC during the data collection period (n = 1249; 83%) used the pharmacopeia for unique drug lookups a mean of 6.3 times per day (SD 12.4). The majority of users (61%) believed that in the prior 4 weeks, use of the clinical reference prevented adverse drug events or medication errors 3 or more times. Physicians also believed that alerts and other notifications improved patient care if they were public health warnings (e.g. about influenza), new immunization guidelines or drug alert warnings (e.g. rofecoxib withdrawal). Conclusion Current adopters of HC-based medical references use these tools frequently, and found them to improve patient care and be valuable in learning of recent alerts and warnings. PMID:16929041

  10. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  11. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned by exploring component technologies, enabling the integration of stand-alone domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real time wavelet based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features.
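
    A short Python sketch of the wavelet denoising step, using the open-source PyWavelets package: decompose, soft-threshold the detail coefficients, and reconstruct. The synthetic image and 3-sigma threshold are assumptions for illustration, not the tool's actual transform or parameters.

        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        img = rng.normal(0.0, 10.0, (256, 256))       # background noise, sigma = 10
        yy, xx = np.mgrid[:256, :256]
        img[(yy - 128) ** 2 + (xx - 128) ** 2 < 40 ** 2] += 50.0   # faint disc

        coeffs = pywt.wavedec2(img, "db4", level=3)   # [cA3, (cH3, cV3, cD3), ...]
        thr = 3 * 10.0                                # ~3 sigma soft threshold
        den = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in lvl)
                             for lvl in coeffs[1:]]
        out = pywt.waverec2(den, "db4")

        flat = np.s_[:64, :64]                        # corner far from the disc
        print(f"noise std: {img[flat].std():.1f} -> {out[flat].std():.1f}")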

  12. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
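
    A toy Python/ElementTree sketch of the document-oriented idea: the classification hierarchy becomes nested elements carrying codes and rubrics. The element and attribute names are invented for illustration and are not the authors' schema or the CEN/TC 251 standard.

        import xml.etree.ElementTree as ET

        # Build a tiny hierarchical classification fragment (chapter > block > category).
        root = ET.Element("classification", name="ICD-10", language="en")
        chapter = ET.SubElement(root, "class", code="IX", kind="chapter")
        ET.SubElement(chapter, "rubric").text = "Diseases of the circulatory system"
        block = ET.SubElement(chapter, "class", code="I20-I25", kind="block")
        cat = ET.SubElement(block, "class", code="I21", kind="category")
        ET.SubElement(cat, "rubric").text = "Acute myocardial infarction"

        print(ET.tostring(root, encoding="unicode"))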

  13. Transparent ICD and DRG Coding Using Information Technology: Linking and Associating Information Sources with the eXtensible Markup Language

    PubMed Central

    Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813

  14. A Unified Overset Grid Generation Graphical Interface and New Concepts on Automatic Gridding Around Surface Discontinuities

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin (Technical Monitor)

    2002-01-01

    For many years, generation of overset grids for complex configurations has required the use of a number of different independently developed software utilities. Results created by each step were then visualized using a separate visualization tool before moving on to the next. A new software tool called OVERGRID was developed which allows the user to perform all the grid generation steps and visualization under one environment. OVERGRID provides grid diagnostic functions such as surface tangent and normal checks as well as grid manipulation functions such as extraction, extrapolation, concatenation, redistribution, smoothing, and projection. Moreover, it also contains hyperbolic surface and volume grid generation modules that are specifically suited for overset grid generation. This is the first time that such a unified interface has existed for the creation of overset grids for complex geometries. New concepts on automatic overset surface grid generation around surface discontinuities will also be briefly presented. Special control curves on the surface, such as intersection curves, sharp edges, and open boundaries, are called seam curves. The seam curves are first automatically extracted from a multiple panel network description of the surface. Points where three or more seam curves meet are automatically identified and are called seam corners. Seam corner surface grids are automatically generated using a singular axis topology. Hyperbolic surface grids are then grown from the seam curves that are automatically trimmed away from the seam corners.
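
    The seam-corner rule lends itself to a compact illustration: given seam curves as point sequences, corners are endpoints shared by three or more curves. The Python sketch below assumes exact-match endpoints and ignores the tolerance handling a real surface mesh would need.

        from collections import Counter

        def seam_corners(curves):
            """Return endpoints where three or more seam curves meet."""
            endpoint_counts = Counter()
            for curve in curves:
                endpoint_counts[curve[0]] += 1
                endpoint_counts[curve[-1]] += 1
            return [p for p, n in endpoint_counts.items() if n >= 3]

        curves = [
            [(0, 0, 0), (1, 0, 0)],      # three seam curves meeting at the origin
            [(0, 0, 0), (0, 1, 0)],
            [(0, 0, 0), (0, 0, 1)],
            [(1, 0, 0), (0, 1, 0)],      # a curve joining two other endpoints
        ]
        print(seam_corners(curves))      # -> [(0, 0, 0)]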

  15. Quality Assessment of Comparative Diagnostic Accuracy Studies: Our Experience Using a Modified Version of the QUADAS-2 Tool

    ERIC Educational Resources Information Center

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-01-01

    Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As…

  16. Ensemble: an Architecture for Mission-Operations Software

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Powell, Mark; Fox, Jason; Rabe, Kenneth; Shu, IHsiang; McCurdy, Michael; Vera, Alonso

    2008-01-01

    Ensemble is the name of an open architecture for, and a methodology for the development of, spacecraft mission operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software. Ensemble capitalizes on the strengths of the open-source Eclipse software and its architecture to address several issues that have arisen repeatedly in the development of mission-operations software: Heretofore, mission-operations application programs have been developed in disparate programming environments and integrated during the final stages of development of missions. The programs have been poorly integrated, and it has been costly to develop, test, and deploy them. Users of each program have been forced to interact with several different graphical user interfaces (GUIs). Also, the strategy typically used in integrating the programs has yielded serial chains of operational software tools of such a nature that, during use of a given tool, it has not been possible to gain access to the capabilities afforded by other tools. In contrast, the Ensemble approach offers a low-risk path towards tighter integration of mission-operations software tools.

  17. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime that may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design, and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, and the lack of real-time simulation tools and of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  18. Family-Based Benchmarking of Copy Number Variation Detection Software.

    PubMed

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.
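
    The intra-familial validation principle can be sketched as an interval-overlap test: a child's CNV call counts as validated if a parental call of the same type overlaps it sufficiently. The 50% reciprocal-overlap threshold and the tuple format below are illustrative assumptions, not the paper's exact criteria.

      # Sketch of intra-familial validation: a child CNV call is "validated" if
      # a parental call of the same type overlaps it sufficiently. The 50%
      # reciprocal-overlap threshold is an illustrative assumption.

      def reciprocal_overlap(a, b):
          """a, b: (chrom, start, end, cnv_type) tuples."""
          if a[0] != b[0] or a[3] != b[3]:
              return 0.0
          inter = min(a[2], b[2]) - max(a[1], b[1])
          if inter <= 0:
              return 0.0
          return min(inter / (a[2] - a[1]), inter / (b[2] - b[1]))

      def validation_rate(child_calls, parent_calls, threshold=0.5):
          validated = sum(
              any(reciprocal_overlap(c, p) >= threshold for p in parent_calls)
              for c in child_calls
          )
          return validated / len(child_calls) if child_calls else 0.0

      child   = [("chr1", 1000, 5000, "del"), ("chr2", 100, 900, "dup")]
      parents = [("chr1", 1200, 5200, "del"), ("chr7", 100, 900, "dup")]
      print(validation_rate(child, parents))  # 0.5 -> one of two calls validated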

  19. Plasma Diagnostics: Use and Justification in an Industrial Environment

    NASA Astrophysics Data System (ADS)

    Loewenhardt, Peter

    1998-10-01

    Plasma diagnostics have played a major role in the development of plasma processing tools in the semiconductor industry. As can be seen in marketing materials from semiconductor equipment manufacturers, results from plasma diagnostic equipment can be a powerful tool in selling the technological leadership of a tool design. Some diagnostics, such as optical emission for endpoint determination, have long been used for simple process control, but in recent years more sophisticated and involved diagnostic tools have been utilized in chamber and plasma source development and optimization. It is now common to find at semiconductor equipment companies an assortment of tools such as Langmuir probes, mass spectrometers, spatial optical emission probes, and impedance, ion energy, and ion flux probes. An outline of how the importance of plasma diagnostics has grown at an equipment manufacturer over the last decade will be given, with examples of significant and useful results obtained. Examples will include the development and optimization of an inductive plasma source, trends and hardware effects on ion energy distributions, mass spectrometry influences on process development, and investigations of plasma-wall interactions. Plasma diagnostic focus, in-house development, and proliferation in an environment where financial justification requirements are both strong and necessary will be discussed.

  20. Estimating Computer-Based Training Development Times

    DTIC Science & Technology

    1987-10-14

    ...beginners, must be sure they interpret terms correctly. As a result of this informal validation, the authors suggest refinements in the tool which... Productivity tools available: automated design tools, text processor interfaces, flowcharting software, software interfaces, multimedia interfaces...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svetlana Shasharina

    The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into those applications, testing the tools in the applications, and modifying them to be more usable.

  2. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software

    PubMed Central

    Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-01

    There are several specific solutions for accessing the sensors and actuators present in any process or system through a TCP/IP network, either local or a wide-area type like the Internet. The use of sensors and actuators of different natures and diverse interfaces (SPI, I2C, analogue, etc.) makes homogeneous and secure access to them from a network more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools specifically designed to cover the different issues concerning access to sensors and actuators, together with two proposed low-cost hardware architectures to operate with those software tools, is presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates the sensors and actuators into the network, called the convergence subsystem in this paper, and the Human Machine Interface (HMI), designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described, and experimental implementations with the proposed tools are presented. PMID:28067801
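
    The convergence-subsystem idea can be illustrated with a few lines of standard-library Python: a TCP server that hands any client a sensor reading as one JSON line. The read_sensor() stub is hypothetical and stands in for a real SPI/I2C/analogue driver; the paper's actual tools are built around EJS.

      # Minimal sketch of a "convergence subsystem": expose a sensor reading to
      # any TCP/IP client as one JSON line per connection. read_sensor() is a
      # hypothetical stub for whatever SPI/I2C/ADC driver the hardware needs.
      import json
      import socketserver
      import time

      def read_sensor():
          # Hypothetical driver call; replace with real SPI/I2C/analogue access.
          return {"sensor": "temperature", "value": 21.5, "timestamp": time.time()}

      class SensorHandler(socketserver.StreamRequestHandler):
          def handle(self):
              self.wfile.write((json.dumps(read_sensor()) + "\n").encode("utf-8"))

      if __name__ == "__main__":
          with socketserver.TCPServer(("0.0.0.0", 5000), SensorHandler) as server:
              server.serve_forever()  # each connection receives one fresh reading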

  3. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing the sensors and actuators present in any process or system through a TCP/IP network, either local or a wide-area type like the Internet. The use of sensors and actuators of different natures and diverse interfaces (SPI, I2C, analogue, etc.) makes homogeneous and secure access to them from a network more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools specifically designed to cover the different issues concerning access to sensors and actuators, together with two proposed low-cost hardware architectures to operate with those software tools, is presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates the sensors and actuators into the network, called the convergence subsystem in this paper, and the Human Machine Interface (HMI), designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described, and experimental implementations with the proposed tools are presented.

  4. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.

  5. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  6. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
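
    The core of the LFQbench idea translates readily into a few lines of code: in a hybrid-proteome benchmark each species is spiked at a known ratio between samples, so the expected log-ratio is known and observed ratios can be scored for accuracy (bias from truth) and precision (spread). The published package is an R package; the Python sketch below, with made-up numbers, only illustrates the metrics.

      # Sketch of the LFQbench idea (the published package is an R package):
      # each species has a known spike-in ratio between samples A and B, so the
      # expected log2(A/B) is known per species. Accuracy is the deviation of
      # the median observed ratio from the expected one; precision is the
      # spread of observed ratios. Intensities below are made-up illustrations.
      import math
      import statistics

      # observed protein intensities: (species, intensity_A, intensity_B)
      observed = [
          ("human", 1.00e6, 0.99e6), ("human", 2.1e5, 2.0e5),  # expected 1:1
          ("yeast", 4.1e5, 2.0e5),   ("yeast", 8.3e4, 4.0e4),  # expected 2:1
      ]
      expected_log2 = {"human": 0.0, "yeast": 1.0}

      for species, exp in expected_log2.items():
          ratios = [math.log2(a / b) for s, a, b in observed if s == species]
          accuracy = statistics.median(ratios) - exp   # bias from the truth
          precision = statistics.stdev(ratios)         # spread across proteins
          print(f"{species}: median error={accuracy:+.3f}, stdev={precision:.3f}")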

  7. Diagnostic Testing Package DX v 2.0 Technical Specification. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David

    This paper contains the technical specifications, schematic diagrams, and program printout for a computer software package for the development and administration of diagnostic tests. The second version of the Diagnostic Testing Package DX consists of a PASCAL-based set of modules located in two main programs: (1) EDITTEST creates, modifies, and…

  8. Space Shuttle Software Development and Certification

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Henderson, Johnnie A

    2000-01-01

    Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.

  9. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    PubMed

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    Quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over the cardiac cycle, including systole and diastole. In this paper, a software tool that conducts AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK), and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.
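
    One simple way to realize landmark tracking of this kind is normalized cross-correlation template matching from frame to frame, as in the numpy-only sketch below. This merely illustrates the tracking idea; the published tool is implemented in C++ on ITK/VTK/Qt.

      # Sketch of frame-to-frame landmark tracking by normalized cross-
      # correlation template matching, numpy only. Illustrates the tracking
      # idea on a 2D image series; not the paper's ITK/VTK/Qt implementation.
      import numpy as np

      def track(template, frame):
          """Return (row, col) of the best match of `template` inside `frame`."""
          th, tw = template.shape
          t = (template - template.mean()) / (template.std() + 1e-12)
          best, best_pos = -np.inf, (0, 0)
          for r in range(frame.shape[0] - th + 1):
              for c in range(frame.shape[1] - tw + 1):
                  patch = frame[r:r + th, c:c + tw]
                  p = (patch - patch.mean()) / (patch.std() + 1e-12)
                  score = float((t * p).mean())
                  if score > best:
                      best, best_pos = score, (r, c)
          return best_pos

      rng = np.random.default_rng(0)
      frame0 = rng.normal(size=(64, 64))
      template = frame0[30:38, 20:28].copy()   # patch around the AVJ landmark
      frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))  # simulated motion
      print(track(template, frame1))           # (32, 23): landmark moved by (2, 3)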

  10. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision-analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
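
    The fusion pattern itself can be shown in miniature: a numerical detector flags statistical anomalies in telemetry, and a symbolic rule base interprets them, with an event raised only when both layers agree. The toy sketch below illustrates that pattern only; it is not the BEAM or SHINE implementation.

      # Toy sketch of the fusion pattern: a numerical detector flags
      # statistical anomalies, a symbolic rule base interprets them, and an
      # event is raised only when both agree -- reducing false alarms.
      import statistics

      def numeric_anomaly(history, value, k=4.0):
          """Flag values more than k sigma from the running mean."""
          mu = statistics.mean(history)
          sigma = statistics.stdev(history) or 1e-12
          return abs(value - mu) > k * sigma

      # Symbolic knowledge base: (sensor, condition) -> diagnosis
      RULES = {
          ("pump_pressure", "low"):  "possible pump cavitation",
          ("pump_pressure", "high"): "possible line blockage",
      }

      def fuse(sensor, history, value):
          if not numeric_anomaly(history, value):
              return None                        # numeric layer sees nothing
          condition = "low" if value < statistics.mean(history) else "high"
          return RULES.get((sensor, condition))  # symbolic layer names the event

      history = [100.0, 101.2, 99.5, 100.4, 100.1]
      print(fuse("pump_pressure", history, 80.0))   # -> possible pump cavitation
      print(fuse("pump_pressure", history, 100.3))  # -> None (no event)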

  11. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  12. Security Risks: Management and Mitigation in the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.

    2004-01-01

    A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.

  13. Identification of facilitators and barriers to residents' use of a clinical reasoning tool.

    PubMed

    DiNardo, Deborah; Tilstra, Sarah; McNeil, Melissa; Follansbee, William; Zimmer, Shanta; Farris, Coreen; Barnato, Amber E

    2018-03-28

    While there is some experimental evidence to support the use of cognitive forcing strategies to reduce diagnostic error in residents, the potential usability of such strategies in the clinical setting has not been explored. We sought to test the effect of a clinical reasoning tool on diagnostic accuracy and to obtain feedback on its usability and acceptability. We conducted a randomized behavioral experiment testing the effect of this tool on diagnostic accuracy on written cases among post-graduate 3 (PGY-3) residents at a single internal medical residency program in 2014. Residents completed written clinical cases in a proctored setting with and without prompts to use the tool. The tool encouraged reflection on concordant and discordant aspects of each case. We used random effects regression to assess the effect of the tool on diagnostic accuracy of the independent case sets, controlling for case complexity. We then conducted audiotaped structured focus group debriefing sessions and reviewed the tapes for facilitators and barriers to use of the tool. Of 51 eligible PGY-3 residents, 34 (67%) participated in the study. The average diagnostic accuracy increased from 52% to 60% with the tool, a difference that just met the test for statistical significance in adjusted analyses (p=0.05). Residents reported that the tool was generally acceptable and understandable but did not recognize its utility for use with simple cases, suggesting the presence of overconfidence bias. A clinical reasoning tool improved residents' diagnostic accuracy on written cases. Overconfidence bias is a potential barrier to its use in the clinical setting.

  14. An ontology-driven, diagnostic modeling system.

    PubMed

    Haug, Peter J; Ferraro, Jeffrey P; Holmen, John; Wu, Xinzi; Mynam, Kumar; Ebert, Matthew; Dean, Nathan; Jones, Jason

    2013-06-01

    To present a system that uses knowledge stored in a medical ontology to automate the development of diagnostic decision support systems. To illustrate its function through an example focused on the development of a tool for diagnosing pneumonia. We developed a system that automates the creation of diagnostic decision-support applications. It relies on a medical ontology to direct the acquisition of clinical data from a clinical data warehouse and uses an automated analytic system to apply a sequence of machine learning algorithms that create applications for diagnostic screening. We refer to this system as the ontology-driven diagnostic modeling system (ODMS). We tested this system using samples of patient data collected in Salt Lake City emergency rooms and stored in Intermountain Healthcare's enterprise data warehouse. The system was used in the preliminary development steps of a tool to identify patients with pneumonia in the emergency department. This tool was compared with a manually created diagnostic tool derived from a curated dataset. The manually created tool is currently in clinical use. The automatically created tool had an area under the receiver operating characteristic curve of 0.920 (95% CI 0.916 to 0.924), compared with 0.944 (95% CI 0.942 to 0.947) for the manually created tool. Initial testing of the ODMS demonstrates promising accuracy for the highly automated results and illustrates the route to model improvement. The use of medical knowledge, embedded in ontologies, to direct the initial development of diagnostic computing systems appears feasible.
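
    The ontology-driven idea can be sketched with scikit-learn: the ontology entry for a disease lists the clinically relevant findings, and that list drives both which columns are pulled from the warehouse and what the learner sees. The feature names and synthetic data below are illustrative assumptions; the actual ODMS works against Intermountain's warehouse with a sequence of algorithms.

      # Minimal sketch of the ontology-driven idea: the ontology entry for a
      # disease lists the relevant findings, which drives both data extraction
      # and model training. Feature names and data are illustrative only.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      ONTOLOGY = {"pneumonia": ["temperature", "respiratory_rate", "wbc_count", "spo2"]}

      def build_model(disease, warehouse, labels):
          features = ONTOLOGY[disease]               # ontology drives acquisition
          X = np.column_stack([warehouse[f] for f in features])
          X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
          model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
          return model, auc

      rng = np.random.default_rng(0)
      n = 500
      labels = rng.integers(0, 2, n)
      warehouse = {  # synthetic stand-in for the clinical data warehouse
          "temperature":      37.0 + labels * 1.5 + rng.normal(0, 0.7, n),
          "respiratory_rate": 16 + labels * 6 + rng.normal(0, 3, n),
          "wbc_count":        7 + labels * 4 + rng.normal(0, 2, n),
          "spo2":             97 - labels * 4 + rng.normal(0, 2, n),
      }
      model, auc = build_model("pneumonia", warehouse, labels)
      print(f"AUC = {auc:.3f}")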

  15. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  16. Diagnostic accuracy of the 14C-urea breath test in Helicobacter pylori infections: a meta-analysis.

    PubMed

    Zhou, Qiaohui; Li, Ling; Ai, Yaowei; Pan, Zhihong; Guo, Mingwen; Han, Jingbo

    2017-01-01

    To summarize and appraise the available literature regarding the use of the 14C-urea breath test (UBT) in the diagnosis of Helicobacter pylori infections in adult patients with dyspepsia and to calculate pooled diagnostic accuracy measures. We systematically searched the PubMed, EMBASE, Cochrane Library, Chinese Journals Full-text (CNKI) and CBMDisc databases to identify published data regarding the sensitivity, specificity, and other measures of diagnostic accuracy of the 14C-urea breath test in the diagnosis of Helicobacter pylori infections in adult patients with dyspeptic symptoms. Risk of bias was assessed using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies)-2 tool. Statistical analyses were performed using Meta-Disc 1.4 software and STATA. Eighteen studies met the inclusion criteria. Pooled results indicated that the 14C-urea breath test showed a diagnostic sensitivity of 0.96 (95% CI 0.95 to 0.96) and specificity of 0.93 (95% CI 0.91 to 0.94). The positive likelihood ratio (PLR) was 12.27 (95% CI 8.17 to 18.44), the negative likelihood ratio (NLR) was 0.05 (95% CI 0.04 to 0.07), and the area under the curve was 0.985. The diagnostic odds ratio (DOR) was 294.95 (95% CI 178.37 to 487.70). The 14C-urea breath test showed sufficient sensitivity and specificity for diagnosing Helicobacter pylori infection, but unexplained heterogeneity after meta-regression and several subgroup analyses remained. The UBT has high accuracy for diagnosing H. pylori infections in adult patients with dyspepsia. However, the reliability of these diagnostic meta-analytic estimates is limited by significant heterogeneity due to unknown factors.
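
    Both likelihood ratios follow directly from sensitivity and specificity, as the short sketch below shows. Note that the pooled PLR (12.27) is estimated by pooling per-study ratios, so it need not equal the identity applied to the pooled sensitivity and specificity.

      # Likelihood ratios follow directly from sensitivity and specificity.
      # Meta-analyses pool LRs across studies, so the pooled PLR (12.27) need
      # not equal the identity applied to the pooled sensitivity/specificity.
      def likelihood_ratios(sensitivity, specificity):
          plr = sensitivity / (1.0 - specificity)   # positive likelihood ratio
          nlr = (1.0 - sensitivity) / specificity   # negative likelihood ratio
          return plr, nlr

      plr, nlr = likelihood_ratios(0.96, 0.93)
      print(f"PLR = {plr:.2f}, NLR = {nlr:.3f}")    # PLR = 13.71, NLR = 0.043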

  17. The Software Management Environment (SME)

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  18. Diagnostic performance of an automated analysis software for the diagnosis of Alzheimer’s dementia with 18F FDG PET

    PubMed Central

    Partovi, Sasan; Yuh, Roger; Pirozzi, Sara; Lu, Ziang; Couturier, Spencer; Grosse, Ulrich; Schluchter, Mark D; Nelson, Aaron; Jones, Robert; O’Donnell, James K; Faulhaber, Peter

    2017-01-01

    The objective of this study was to assess the ability of a quantitative software-aided approach to improve the diagnostic accuracy of 18F FDG PET for Alzheimer’s dementia over visual analysis alone. Twenty normal subjects (M:F-12:8; mean age 80.6 years) and twenty mild AD subjects (M:F-12:8; mean age 70.6 years) with 18F FDG PET scans were obtained from the ADNI database. Three blinded readers interpreted these PET images first using a visual qualitative approach and then using a quantitative software-aided approach. Images were classified on two five-point scales based on normal/abnormal (1-definitely normal; 5-definitely abnormal) and presence of AD (1-definitely not AD; 5-definitely AD). Diagnostic sensitivity, specificity, and accuracy for both approaches were compared based on the aforementioned scales. The sensitivity, specificity, and accuracy for the normal vs. abnormal readings of all readers combined were higher when comparing the software-aided vs. visual approach (sensitivity 0.93 vs. 0.83 P = 0.0466; specificity 0.85 vs. 0.60 P = 0.0005; accuracy 0.89 vs. 0.72 P<0.0001). The specificity and accuracy for absence vs. presence of AD of all readers combined were higher when comparing the software-aided vs. visual approach (specificity 0.90 vs. 0.70 P = 0.0008; accuracy 0.81 vs. 0.72 P = 0.0356). Sensitivities of the software-aided and visual approaches did not differ significantly (0.72 vs. 0.73 P = 0.74). The quantitative software-aided approach appears to improve the performance of 18F FDG PET for the diagnosis of mild AD. It may be helpful for experienced 18F FDG PET readers analyzing challenging cases. PMID:28123864

  19. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software.

    PubMed

    Ebersberger, Ullrich; Marcus, Roy P; Schoepf, U Joseph; Lo, Gladys G; Wang, Yining; Blanke, Philipp; Geyer, Lucas L; Gray, J Cranston; McQuiston, Andrew D; Cho, Young Jun; Scheuering, Michael; Canstein, Christian; Nikolaou, Konstantin; Hoffmann, Ellen; Bamberg, Fabian

    2014-01-01

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min respectively (P < 0.01). There was strong agreement between the two approaches for both measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88; both P < 0.01), and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. • Myocardial perfusion CT is attractive for comprehensive coronary heart disease assessment. • Traditional image analysis methods are cumbersome and time-consuming. • Automated 3D perfusion software shortens analysis times. • Automated 3D perfusion software increases standardisation of myocardial perfusion CT. • Automated, standardised analysis fosters myocardial perfusion CT integration into clinical practice.

  20. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  1. Computational modelling of genome-scale metabolic networks and its application to CHO cell cultures.

    PubMed

    Rejc, Živa; Magdevska, Lidija; Tršelič, Tilen; Osolin, Timotej; Vodopivec, Rok; Mraz, Jakob; Pavliha, Eva; Zimic, Nikolaj; Cvitanović, Tanja; Rozman, Damjana; Moškon, Miha; Mraz, Miha

    2017-09-01

    Genome-scale metabolic models (GEMs) have become increasingly important in recent years. Currently, GEMs are the most accurate in silico representation of the genotype-phenotype link. They allow us to study complex networks from the systems perspective. Their application may drastically reduce the amount of experimental and clinical work, improve diagnostic tools and increase our understanding of complex biological phenomena. GEMs have also demonstrated high potential for the optimisation of bio-based production of recombinant proteins. Herein, we review the basic concepts, methods, resources and software tools used for the reconstruction and application of GEMs. We overview the evolution of the modelling efforts devoted to the metabolism of Chinese Hamster Ovary (CHO) cells. We present a case study on CHO cell metabolism under different amino acid depletions. This leads us to the identification of the most influential as well as essential amino acids in selected CHO cell lines. Copyright © 2017 Elsevier Ltd. All rights reserved.
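
    An experiment of this kind is commonly scripted with COBRApy, a widely used Python package for constraint-based modeling of GEMs (the paper itself does not prescribe a specific toolkit). In the hedged sketch below, the model file and exchange-reaction identifiers are assumptions that depend on the particular CHO reconstruction used.

      # Hedged sketch of an amino-acid-depletion experiment on a GEM using
      # COBRApy. The model path and exchange-reaction IDs are assumptions that
      # depend on the specific CHO reconstruction; the paper does not
      # prescribe this toolkit.
      import cobra

      model = cobra.io.read_sbml_model("iCHO_model.xml")  # hypothetical CHO GEM
      baseline = model.slim_optimize()                    # growth, full medium

      amino_acid_exchanges = ["EX_arg__L_e", "EX_asn__L_e", "EX_gln__L_e"]  # assumed IDs
      for rxn_id in amino_acid_exchanges:
          with model:                                     # changes revert on exit
              model.reactions.get_by_id(rxn_id).lower_bound = 0.0  # block uptake
              growth = model.slim_optimize()
          essential = growth < 1e-6
          print(f"{rxn_id}: growth {growth:.4f} vs {baseline:.4f}"
                f" -> {'essential' if essential else 'dispensable'} uptake")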

  2. SUNREL Related Links | Buildings | NREL

    Science.gov Websites

    SUNREL Related Links: the DOE Simulation Software Tools Directory, a directory of 301 building software tools for evaluation of energy efficiency, renewable energy, and sustainability in buildings; and the TREAT Software Program, a computer program that uses SUNREL and is designed to provide

  3. IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.

    ERIC Educational Resources Information Center

    Sheehan, Mark C.; Williams, James G.

    1987-01-01

    Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)

  4. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
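
    The modeling idea, components with normal and faulty modes whose processes post delayed effects onto an event queue, fits in a small discrete-event sketch. The example below illustrates the concept only; CONFIG's graphical model libraries are far richer.

      # Tiny sketch of the modeling idea: components have normal/faulty modes,
      # and mode processes post delayed effects onto a discrete-event queue.
      # Concept illustration only; not CONFIG's actual implementation.
      import heapq

      class Valve:
          def __init__(self, name, mode="normal"):
              self.name, self.mode, self.flow = name, mode, "none"

          def command_open(self, schedule, now):
              # A stuck-closed fault mode suppresses the downstream effect.
              if self.mode == "normal":
                  schedule(now + 2.0, self._effect_flow)   # effect after delay
              else:
                  print(f"t={now:.1f}: {self.name} commanded open, but {self.mode}")

          def _effect_flow(self, now):
              self.flow = "nominal"
              print(f"t={now:.1f}: {self.name} flow becomes {self.flow}")

      events = []  # (time, seq, callback) min-heap
      seq = 0
      def schedule(t, fn):
          global seq
          heapq.heappush(events, (t, seq, fn))
          seq += 1

      v1, v2 = Valve("V1"), Valve("V2", mode="stuck-closed")
      schedule(0.0, lambda now: v1.command_open(schedule, now))
      schedule(0.0, lambda now: v2.command_open(schedule, now))

      while events:
          t, _, fn = heapq.heappop(events)
          fn(t)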

  5. A simple computer-based measurement and analysis system of pulmonary auscultation sounds.

    PubMed

    Polat, Hüseyin; Güler, Inan

    2004-12-01

    Listening to various lung sounds has proven to be an important diagnostic tool for detecting and monitoring certain types of lung diseases. In this study a computer-based system has been designed for easy measurement and analysis of lung sounds using the software package DasyLAB. The designed system presents the following features: it is able to digitally record the lung sounds, which are captured with an electronic stethoscope plugged into a sound card on a portable computer, display the lung sound waveform for auscultation sites, record the lung sound in ASCII format, acoustically reproduce the lung sound, edit and print the sound waveforms, display its time-expanded waveform, compute the Fast Fourier Transform (FFT), and display the power spectrum and spectrogram.
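
    The same analysis chain is easy to reproduce with scipy: compute the power spectrum of a recorded signal via the FFT, plus a spectrogram for a time-frequency view. The sampling rate and window parameters below are illustrative, not those of the DasyLAB system, and the signal is synthetic.

      # Sketch of the analysis chain with scipy: FFT power spectrum and a
      # spectrogram of a (here synthetic) lung-sound signal. Parameters are
      # illustrative, not those of the DasyLAB system.
      import numpy as np
      from scipy import signal

      fs = 8000                                  # Hz, illustrative sampling rate
      t = np.arange(0, 2.0, 1 / fs)
      x = np.sin(2 * np.pi * 150 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

      # Power spectrum from the FFT of the whole record
      spectrum = np.abs(np.fft.rfft(x)) ** 2
      freqs = np.fft.rfftfreq(x.size, 1 / fs)
      print(f"dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")  # ~150 Hz

      # Time-frequency view (spectrogram), e.g. to localize crackles or wheezes
      f, times, Sxx = signal.spectrogram(x, fs=fs, nperseg=512, noverlap=256)
      print(f"spectrogram shape: {Sxx.shape} (freq bins x time frames)")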

  6. Ultrasonography in gastroenterology.

    PubMed

    Ødegaard, Svein; Nesje, Lars B; Hausken, Trygve; Gilja, Odd Helge

    2015-06-01

    Ultrasonography (US) is a safe and widely available real-time, high-resolution imaging method, which during the last decades has been increasingly integrated as a clinical tool in gastroenterology. New US applications have emerged with enhanced data-processing software and new technical solutions, including strain evaluation, three-dimensional imaging, and the use of ultrasound contrast agents. Specific gastroenterological applications have been developed by combining US with other diagnostic or therapeutic methods, such as endoscopy, manometry, puncture needles, diathermy, and stents. US provides detailed structural information about visceral organs without hazard to patients and can play an important clinical role by reducing the need for invasive procedures. This paper presents different aspects of US in gastroenterology, with a special emphasis on the contribution of Nordic scientists to developing clinical applications.

  7. Assessment of Semi-Structured Clinical Interview for Mobile Phone Addiction Disorder.

    PubMed

    Alavi, Seyyed Salman; Mohammadi, Mohammad Reza; Jannatifard, Fereshteh; Mohammadi Kalhori, Soroush; Sepahbodi, Ghazal; BabaReisi, Mohammad; Sajedi, Sahar; Farshchi, Mojtaba; KhodaKarami, Rasul; Hatami Kasvaee, Vahid

    2016-04-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) classified mobile phone addiction disorder under "impulse control disorder not elsewhere classified". This study surveyed the diagnostic criteria of DSM-IV-TR for the diagnosis of mobile phone addiction in correspondence with Iranian society and culture. Two hundred fifty students of Tehran universities were entered into this descriptive-analytical and cross-sectional study. A quota sampling method was used. First, a semi-structured clinical interview (based on DSM-IV-TR) was performed for all the cases, and another specialist reevaluated the interviews. Data were analyzed using content validity, inter-scorer reliability (Kappa coefficient), and test-retest via SPSS18 software. The content validity of the semi-structured clinical interview matched the DSM-IV-TR criteria for behavioral addiction. Moreover, their content was appropriate, and two items, "SMS pathological use" and "High monthly cost of using the mobile phone", were added to promote its validity. Internal reliability (Kappa) and test-retest reliability were 0.55 and r = 0.4 (p < 0.01), respectively. The results of this study revealed that the semi-structured diagnostic criteria of DSM-IV-TR are valid and reliable for diagnosing mobile phone addiction, and this instrument is an effective tool to diagnose this disorder.

  8. The Holistic Targeting (HOT) Methodology as the Means to Improve Information Operations (IO) Target Development and Prioritization

    DTIC Science & Technology

    2008-09-01

    ...the use of Compendium software to facilitate targeting problem understanding, and the network analysis tool Palantir as an efficient and tailored semi-automated means to... (Report sections include: HOT objectives using Compendium software; HOT target prioritization and development using Palantir software.)

  9. Validation of Tendril TrueHome Using Software-to-Software Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan

    This study performed a comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and to resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation and scrutiny.

  10. Capabilities of software "Vector-M" for a diagnostics of the ionosphere state from auroral emissions images and plasma characteristics from the different orbits as a part of the system of control of space weather

    NASA Astrophysics Data System (ADS)

    Avdyushev, V.; Banshchikova, M.; Chuvashov, I.; Kuzmin, A.

    2017-09-01

    This paper presents the capabilities of the "Vector-M" software for diagnosing the state of the ionosphere from auroral emission images and plasma characteristics obtained from different orbits, as part of a space weather monitoring and control system. "Vector-M" was developed by the celestial mechanics and astrometry department of Tomsk State University in collaboration with the Space Research Institute (Moscow) and the Central Aerological Observatory of the Russian Federal Service for Hydrometeorology and Environmental Monitoring. The software is intended to calculate attendant geophysical and astronomical information for the spacecraft's centre of mass and for the observation region in the experiment with the Aurovisor-VIS/MP auroral imager on the orbit of the prospective Meteor-MP spacecraft.

  11. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  12. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred top MIS executives.

  13. Evaluation of the Diagnostic Accuracy of CareStart G6PD Deficiency Rapid Diagnostic Test (RDT) in a Malaria Endemic Area in Ghana, Africa

    PubMed Central

    Adu-Gyasi, Dennis; Asante, Kwaku Poku; Newton, Sam; Dosoo, David; Amoako, Sabastina; Adjei, George; Amoako, Nicholas; Ankrah, Love; Tchum, Samuel Kofi; Mahama, Emmanuel; Agyemang, Veronica; Kayan, Kingsley; Owusu-Agyei, Seth

    2015-01-01

    Background Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most widespread enzyme defect that can result in red cell breakdown under oxidative stress when exposed to certain medicines, including antimalarials. We evaluated the diagnostic accuracy of the CareStart G6PD deficiency Rapid Diagnostic Test (RDT) as a point-of-care tool for screening G6PD deficiency. Methods A cross-sectional study was conducted among 206 randomly selected and consented participants from a group with known G6PD deficiency status between February 2013 and June 2013. A maximum of 1.6 ml of capillary blood was used for G6PD deficiency screening using the CareStart G6PD RDT and the Trinity qualitative method, with the Trinity quantitative method as the “gold standard”. Samples were also screened for the presence of malaria parasites. Data entry and analysis were done using Microsoft Access 2010 and Stata Software version 12. Kintampo Health Research Centre Institutional Ethics Committee granted ethical approval. Results The sensitivity (SE) and specificity (SP) of the CareStart G6PD deficiency RDT were 100% and 72.1%, respectively, compared to the Trinity quantitative method, and 98.9% and 96.2% compared to the Trinity qualitative method. Malaria infection status had no significant effect (P=0.199) on the performance of the G6PD RDT test kit compared to the “gold standard”. Conclusions The outcome of this study suggests that the diagnostic performance of the CareStart G6PD deficiency RDT kit was high and is acceptable for determining G6PD deficiency status in a high malaria endemic area in Ghana. The RDT kit is an attractive point-of-care tool for rapid G6PD deficiency testing in areas with high temperatures and limited expertise. The CareStart G6PD deficiency RDT kit could be used to screen malaria patients before administration of fixed-dose primaquine with artemisinin-based combination therapy. PMID:25885097

  14. Runtime Performance Monitoring Tool for RTEMS System Software

    NASA Astrophysics Data System (ADS)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used both during the software development phase and during in-orbit operation. Our implemented target agent is lightweight and has small overhead, using the SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.

  15. Use of Software Tools in Teaching Relational Database Design.

    ERIC Educational Resources Information Center

    McIntyre, D. R.; And Others

    1995-01-01

    Discusses the use of state-of-the-art software tools in teaching a graduate, advanced, relational database design course. Results indicated a positive student response to the prototype of expert systems software and a willingness to utilize this new technology both in their studies and in future work applications. (JKP)

  16. Analyzing the Core Flight Software (CFS) with SAVE

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David

    2008-01-01

    This viewgraph presentation describes the SAVE tool and its application to the Core Flight Software (CFS). The contents include: 1) Fraunhofer - a short intro; 2) Context of this Collaboration; 3) CFS - Core Flight Software; 4) The SAVE Tool; 5) Applying SAVE to CFS - a few example analyses; and 6) Goals.

  17. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  18. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  19. Introducing the CUAHSI Hydrologic Information System Desktop Application (HydroDesktop) and Open Development Community

    NASA Astrophysics Data System (ADS)

    Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.

    2009-12-01

    The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software “stack” to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called “HydroDesktop” and the development of an online open-source software development community to update and maintain the software. HydroDesktop is a local (i.e., not server-based) client-side software tool that ultimately will run on multiple operating systems and will provide a highly usable level of access to HIS services. The software provides many key capabilities including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm, and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an “iterative” or “spiral” software development approach, which will be described in this presentation.
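
    Since WaterML is plain XML delivered over web services, a client can consume it with nothing more than the standard library. A hedged sketch follows: the endpoint URL is hypothetical, and the element and namespace names follow common WaterML 1.x conventions, so a real HIS server may differ in detail.

      # Hedged sketch: fetching and parsing a WaterML 1.x response. The URL is
      # hypothetical, and element/namespace names follow common WaterML 1.x
      # conventions; a real HIS server may differ in detail.
      import urllib.request
      import xml.etree.ElementTree as ET

      URL = "http://example.org/cuahsi_1_1.asmx/GetValuesObject?..."  # hypothetical
      NS = {"wml": "http://www.cuahsi.org/waterML/1.1/"}

      with urllib.request.urlopen(URL) as resp:
          tree = ET.parse(resp)

      # Each <value> carries a timestamp attribute and the observation as text.
      for v in tree.iter(f"{{{NS['wml']}}}value"):
          print(v.get("dateTime"), v.text)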

  20. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  1. Reporting completeness and transparency of meta-analyses of depression screening tool accuracy: A comparison of meta-analyses published before and after the PRISMA statement.

    PubMed

    Rice, Danielle B; Kloda, Lorie A; Shrier, Ian; Thombs, Brett D

    2016-08-01

    Meta-analyses that are conducted rigorously and reported completely and transparently can provide accurate evidence to inform the best possible healthcare decisions. Guideline makers have raised concerns about the utility of existing evidence on the diagnostic accuracy of depression screening tools. The objective of our study was to evaluate the transparency and completeness of reporting in meta-analyses of the diagnostic accuracy of depression screening tools using the PRISMA tool adapted for diagnostic test accuracy meta-analyses. We searched MEDLINE and PsycINFO from January 1, 2005 through March 13, 2016 for recent meta-analyses in any language on the diagnostic accuracy of depression screening tools. Two reviewers independently assessed the transparency in reporting using the PRISMA tool with appropriate adaptations made for studies of diagnostic test accuracy. We identified 21 eligible meta-analyses. Twelve of 21 meta-analyses complied with at least 50% of adapted PRISMA items. Of 30 adapted PRISMA items, 11 were fulfilled by ≥80% of included meta-analyses, 3 by 50-79% of meta-analyses, 7 by 25-45% of meta-analyses, and 9 by <25%. On average, post-PRISMA meta-analyses complied with 17 of 30 items compared to 13 of 30 items pre-PRISMA. Deficiencies were identified in the transparency of reporting in meta-analyses of the diagnostic test accuracy of depression screening tools. Authors, reviewers, and editors should adhere to the PRISMA statement to improve the reporting of meta-analyses of the diagnostic accuracy of depression screening tools. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. A Self-Diagnostic System for the M6 Accelerometer

    NASA Technical Reports Server (NTRS)

    Flanagan, Patrick M.; Lekki, John

    2001-01-01

    The design of a Self-Diagnostic (SD) accelerometer system for the Space Shuttle Main Engine is presented. This retrofit system connects diagnostic electronic hardware and software to the current M6 accelerometer system. This paper discusses the general operation of the M6 accelerometer SD system and procedures for developing and evaluating the SD system. Signal processing techniques using M6 accelerometer diagnostic data are explained. Test results include diagnostic data responding to changing ambient temperature, mounting torque and base mounting impedance.

  3. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    A directory of software tools available on NREL's Peregrine high-performance computing system, including debuggers, performance-analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.

  4. Validation of Quantitative Multimodality Analysis of Telomerase Activity in Urine Cells as a Noninvasive Diagnostic and Prognostic Tool for Prostate Cancer

    DTIC Science & Technology

    2005-08-01

    Report documentation excerpt (grant W81XWH-04-1-0774): a patient in the present study, previously misdiagnosed with BPH and inflammation, was eventually revealed to have prostate cancer with a Gleason score of 7.

  5. Analysis of laparoscopy in trauma.

    PubMed

    Villavicencio, R T; Aucar, J A

    1999-07-01

    The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.

  6. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-04-01

    Report documentation excerpt: by Christopher Bonine, Man-Tak Shing, and Thomas W. Otani, Naval Postgraduate School; published April 1, 2013; approved for public release. Bonine is a lieutenant in the United States Navy, currently assigned to the Navy Cyber Defense…

  7. Evaluating online diagnostic decision support tools for the clinical setting.

    PubMed

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct that helps clinicians make clinical decisions and improve patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included 6 categories: general information, content, quality control, search, clinical results, and other features. We developed diagnostically challenging clinical case scenarios, based on real patient experience, that were commonly missed by junior medical staff. The evaluation was divided into 2 phases: an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team, and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the 11 tools that were assessed against the evaluation instrument, only 4 completely met the DDS definition adopted for this evaluation and were able to produce a differential diagnosis. In the initial phase of the evaluation, 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall, the differentiating factors among the top 3 DDS tools were diagnostic accuracy ranking, ease of use, and the confidence and credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  8. Design and validation of an improved graphical user interface with the 'Tool ball'.

    PubMed

    Lee, Kuo-Wei; Lee, Ying-Chu

    2012-01-01

    The purpose of this research is to introduce the design of an improved graphical user interface (GUI) and to verify the operational efficiency of the proposed interface. Until now, clicking the toolbar with the mouse has been the usual way to operate software functions. In our research, we designed an improved graphical user interface - a tool ball that is operated by a mouse wheel to perform software functions. Several experiments were conducted to measure the time needed to operate certain software functions with the traditional combination of "mouse click + tool button" and the proposed integration of "mouse wheel + tool ball". The results indicate that the tool ball design can accelerate the speed of operating software functions, decrease the number of icons on the screen, and expand the applications of the mouse wheel.

  9. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices, including use of public version control, code review, and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  10. Two New Tools for Glycopeptide Analysis Researchers: A Glycopeptide Decoy Generator and a Large Data Set of Assigned CID Spectra of Glycopeptides.

    PubMed

    Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather

    2017-08-04

    The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert-assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.
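
    The abstract does not describe the decoy-generation algorithm itself, so the sketch below illustrates one plausible scheme - perturbing the monosaccharide counts of a target glycan composition - purely as an assumed example of decoy generation for scoring-algorithm testing.

```python
# Hedged sketch of decoy generation for scoring-algorithm testing. The
# actual strategy used by Glycopeptide Decoy Generator is not described
# in the abstract; decoys here are made by perturbing the monosaccharide
# counts of the target composition (an assumed, illustrative scheme).
import random

def make_decoys(target, n_decoys=20, seed=1):
    """target: composition dict like {'Hex': 5, 'HexNAc': 4, 'NeuAc': 2}."""
    rng = random.Random(seed)
    decoys = []
    while len(decoys) < n_decoys:
        decoy = {k: max(0, v + rng.choice([-2, -1, 1, 2])) for k, v in target.items()}
        if decoy != target and decoy not in decoys:   # decoys must differ from the target
            decoys.append(decoy)
    return decoys

for d in make_decoys({"Hex": 5, "HexNAc": 4, "NeuAc": 2, "Fuc": 1}, n_decoys=3):
    print(d)
```

    A scoring algorithm under test should then rank the target assignment above all of its decoys; counting how often it fails gives an empirical error estimate.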

  11. Validation of next generation sequencing technologies in comparison to current diagnostic gold standards for BRAF, EGFR and KRAS mutational analysis.

    PubMed

    McCourt, Clare M; McArt, Darragh G; Mills, Ken; Catherwood, Mark A; Maxwell, Perry; Waugh, David J; Hamilton, Peter; O'Sullivan, Joe M; Salto-Tellez, Manuel

    2013-01-01

    Next Generation Sequencing (NGS) has the potential of becoming an important tool in clinical diagnosis and therapeutic decision-making in oncology owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison to current gold standard methods, and the potential to sequence a large number of cancer-driving genes at one time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent current standard-of-care, and its reliability to generate concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas), already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger Sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms and an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.

  12. Discrimination of Isomers of Released N- and O-Glycans Using Diagnostic Product Ions in Negative Ion PGC-LC-ESI-MS/MS

    NASA Astrophysics Data System (ADS)

    Ashwood, Christopher; Lin, Chi-Hung; Thaysen-Andersen, Morten; Packer, Nicolle H.

    2018-03-01

    Profiling cellular protein glycosylation is challenging due to the presence of highly similar glycan structures that play diverse roles in cellular physiology. As the anomericity and the exact linkage type of a single glycosidic bond can influence glycan function, there is a demand for improved and automated methods to confirm detailed structural features and to discriminate between structurally similar isomers, overcoming a significant bottleneck in the analysis of data generated by glycomics experiments. We used porous graphitized carbon-LC-ESI-MS/MS to separate and detect released N- and O-glycan isomers from mammalian model glycoproteins using negative mode resonance activation CID-MS/MS. By interrogating similar fragment spectra from closely related glycan isomers that differ only in arm position and sialyl linkage, product fragment ions for discrimination between these features were discovered. Using the Skyline software, at least two diagnostic fragment ions of high specificity were validated for automated discrimination of sialylation and arm position in N-glycan structures, and sialylation in O-glycan structures, complementing existing structural diagnostic ions. These diagnostic ions were shown to be useful for isomer discrimination using both linear and 3D ion trap mass spectrometers when analyzing complex glycan mixtures from cell lysates. Skyline was found to serve as a useful tool for automated assessment of glycan isomer discrimination. This platform-independent workflow can potentially be extended to automate the characterization and quantitation of other challenging glycan isomers.
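
    A minimal sketch of the automated discrimination idea, assuming placeholder diagnostic ion m/z values (the validated ions from the study are not reproduced here): observed fragment peaks are matched against per-isomer diagnostic ion lists within a mass tolerance.

```python
# Minimal sketch of isomer discrimination via diagnostic product ions,
# in the spirit of the Skyline workflow described above. The m/z values
# and tolerance below are placeholders, not the study's validated ions.
DIAGNOSTIC_IONS = {
    "sialyl-linkage-A": [655.2, 290.1],   # hypothetical diagnostic m/z
    "sialyl-linkage-B": [306.1, 408.2],   # hypothetical diagnostic m/z
}
TOLERANCE = 0.02  # m/z match tolerance (assumed)

def match_isomer(peaks):
    """peaks: observed fragment m/z values from one MS/MS spectrum."""
    scores = {}
    for isomer, ions in DIAGNOSTIC_IONS.items():
        hits = sum(any(abs(p - ion) <= TOLERANCE for p in peaks) for ion in ions)
        scores[isomer] = hits / len(ions)     # fraction of diagnostic ions seen
    return max(scores, key=scores.get), scores

isomer, scores = match_isomer([655.21, 290.09, 512.30])
print(isomer, scores)
```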

  13. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his Computer Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  14. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  15. USER'S GUIDE: Strategic Waste Minimization Initiative (SWAMI) Version 2.0 - A Software Tool to Aid in Process Analysis for Pollution Prevention

    EPA Science Inventory

    The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...

  16. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    Federal Register notice excerpt: Department of Labor, Employment and Training Administration [TA-W-74,554] - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; notice of a determination regarding the TAA petition filed on behalf of workers at International Business Machines (IBM), Software Group Business Unit…

  17. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  18. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  19. Predicting tool life in turning operations using neural networks and image processing

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.

    2018-05-01

    A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the parameter of tool wear, VB, is measured with conventional methods and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges and the subsequent model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second is obtained from the Neural Wear software that estimates tool wear using edge images. Although the completely automated solution, Neural Wear software for tool wear recognition plus the ANN model of tool life prediction, presented a slightly higher error than the direct measurements, it was within the same range and can meet all industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.
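
    The sketch below illustrates the two-step idea on synthetic data: an ANN is trained on wear measurements from two cutting edges and evaluated on a third. The wear model, noise levels, and network size are all assumptions for illustration.

```python
# Illustrative sketch of the two-step approach: train an ANN on tool-wear
# data from two cutting edges, then predict wear on a held-out third edge.
# Data are synthetic; the study's inputs (image-derived VB, etc.) differ.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def edge_data(noise=0.005):
    t = np.linspace(1, 30, 30)                                   # cutting time, min
    vb = 0.01 * t + 0.002 * t**1.5 + rng.normal(0, noise, t.size)  # flank wear VB, mm
    return t.reshape(-1, 1), vb

X1, y1 = edge_data()   # edge 1 (training)
X2, y2 = edge_data()   # edge 2 (training)
X3, y3 = edge_data()   # edge 3 (held out)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(np.vstack([X1, X2]), np.concatenate([y1, y2]))

pred = model.predict(X3)
print("mean abs. error on held-out edge (mm):", float(np.abs(pred - y3).mean()))
```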

  20. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, and incorporating novel analysis features, providing a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757
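
    A conceptual sketch of the bin-then-test idea, with toy data and a simple carrier-count burden test; BioBin's actual binning and association machinery is considerably richer.

```python
# Conceptual sketch of biologically inspired binning plus a burden-style
# association test. BioBin's real algorithms are richer; this only
# illustrates the bin-then-test idea with invented numbers.
from collections import defaultdict
from scipy.stats import fisher_exact

# (gene, minor-allele frequency) for observed variants -- toy values
variants = [("BRCA2", 0.001), ("BRCA2", 0.004), ("TP53", 0.002), ("TP53", 0.020)]
RARE_MAF = 0.01

bins = defaultdict(list)
for gene, maf in variants:
    if maf < RARE_MAF:                   # bin only rare variants, by gene
        bins[gene].append(maf)

# Hypothetical carrier counts per bin: (case carriers, control carriers)
carriers = {"BRCA2": (30, 12), "TP53": (8, 9)}
n_cases = n_controls = 500

for gene in bins:
    a, b = carriers[gene]
    table = [[a, n_cases - a], [b, n_controls - b]]
    odds, p = fisher_exact(table)        # simple burden-style test per bin
    print(f"{gene}: {len(bins[gene])} rare variants, OR={odds:.2f}, p={p:.3g}")
```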

  1. Software tool for data mining and its applications

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems, and to the diagnosis of brain glioma.
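
    Since the tool itself is Visual C++ and its internals are not given in the abstract, the toy pipeline below merely illustrates two of the listed function models - PCA for pattern recognition feeding a support vector machine - on a standard data set.

```python
# Toy pipeline combining two of the listed function models: PCA for
# pattern recognition and a support vector machine for classification.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = make_pipeline(PCA(n_components=2), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 3))
```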

  2. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of using the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.

  3. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-06-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.

  4. Diagnostic Tools for Performance Evaluation of Innovative In-Situ Remediation Technologies at Chlorinated Solvent-Contaminated Sites

    DTIC Science & Technology

    2011-07-01

    Report documentation excerpt (standard form boilerplate removed): the demonstration compared these innovative methods with conventional diagnostic tools that are currently used for assessing bioremediation performance; demonstration results involved 3-D multi-level systems. Contact: Rula Deeb, (510) 596-…

  5. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was on average, over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools.
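
    A minimal sketch of the core computation described: candidate sampling-time designs for a one-compartment bolus model C(t) = (D/V)e^(-kt) are scored by the determinant of the Fisher Information Matrix (D-optimality). Parameter values and designs are invented, and PopED lite's constrained optimization is not reproduced.

```python
# Minimal sketch of FIM-based design scoring: evaluate candidate
# sampling-time designs for a one-compartment bolus PK model by the
# log-determinant of the Fisher Information Matrix (D-optimality).
import numpy as np

D, V, k, sigma = 100.0, 10.0, 0.3, 0.5   # dose, volume, elimination rate, noise SD (toy)

def fim(times):
    t = np.asarray(times, dtype=float)
    c = (D / V) * np.exp(-k * t)         # model prediction C(t)
    dV = -c / V                          # sensitivity of C(t) w.r.t. V
    dk = -t * c                          # sensitivity of C(t) w.r.t. k
    J = np.column_stack([dV, dk])
    return J.T @ J / sigma**2            # FIM under i.i.d. Gaussian noise

designs = {"early": [0.5, 1, 2], "spread": [0.5, 3, 8], "late": [6, 8, 10]}
for name, times in designs.items():
    print(name, "log det FIM =", round(np.linalg.slogdet(fim(times))[1], 2))
```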

  6. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  7. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  8. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  9. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  10. Small Portable Analyzer Diagnostic Equipment (SPADE) Program -- Diagnostic Software Validation

    DTIC Science & Technology

    1984-07-01

    Report documentation excerpt (garbled in source): cited standards include electromagnetic emission and susceptibility requirements for the control of electromagnetic interference in electronic equipment. Significant efforts were expended to simulate spalling failures associated with naturally…

  11. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. The aim of this review was to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review on the basis of either ≥ 50 citations on Web of Science (as of 08/09/16) or reported use of the tool in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.

  12. Optimal Combination of Non-Invasive Tools for the Early Detection of Potentially Life-Threatening Emergencies in Gynecology

    PubMed Central

    Varas, Catalina; Ravit, Marion; Mimoun, Camille; Panel, Pierre; Huchon, Cyrille; Fauconnier, Arnaud

    2016-01-01

    Objectives: Potentially life-threatening gynecological emergencies (G-PLEs) are acute pelvic conditions that may spontaneously evolve into a life-threatening situation, or those for which there is a risk of sequelae or death in the absence of prompt diagnosis and treatment. The objective of this study was to identify the best combination of non-invasive diagnostic tools to ensure an accurate diagnosis and timely response when faced with G-PLEs for patients arriving with acute pelvic pain at the Gynecological Emergency Department (ED). Methods: The data on non-invasive diagnostic tools were sourced from the records of patients presenting at the ED of two hospitals in the Parisian suburbs (France) with acute pelvic pain between September 2006 and April 2008. The medical history of the patients was obtained through a standardized questionnaire completed for a prospective observational study, and missing information was completed with data sourced from the medical forms. Diagnostic tool categories were predefined as a collection of signs or symptoms. We analyzed the association of each sign/symptom with G-PLEs using Pearson's Chi-Square or Fisher's exact tests. Symptoms and signs associated with G-PLEs (p-value < 0.20) were subjected to logistic regression to evaluate the diagnostic value of each of the predefined diagnostic tools, singly and in various combinations. Results: The data of 365 patients with acute pelvic pain were analyzed, of whom 103 were confirmed to have a PLE. We analyzed five diagnostic tools by logistic regression: Triage Process, History-Taking, Physical Examination, Ultrasonography, and Biological Exams. The combination of History-Taking and Ultrasonography had a C-index of 0.83, the highest for a model combining two tools. Conclusions: The use of a standardized self-assessment questionnaire for history-taking and focal ultrasound examination were found to be the most successful tool combination for the diagnosis of gynecological emergencies in a Gynecological ED. Additional tools, such as physical examination, do not add substantial diagnostic value. PMID:27583697
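
    The sketch below mimics the reported analysis on simulated data: two tool-derived predictors are combined in a logistic regression, and discrimination is summarized by the C-index (area under the ROC curve). All numbers are toy values, not the study's data.

```python
# Sketch of the analysis style described: combine two diagnostic tools
# in a logistic regression and measure discrimination with the C-index.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 365
y = rng.binomial(1, 103 / 365, n)            # 1 = potentially life-threatening (toy)
history = 1.2 * y + rng.normal(0, 1, n)      # history-taking score (simulated)
ultrasound = 1.0 * y + rng.normal(0, 1, n)   # ultrasonography score (simulated)

X = np.column_stack([history, ultrasound])
model = LogisticRegression().fit(X, y)
c_index = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("C-index of the two-tool model:", round(c_index, 2))
```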

  13. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mbaya, Timmy; Mengshoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft and UAVs, increasingly depend on highly complex safety-critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We focus on the approach to developing reliable and robust health models for the combined software and sensor systems.
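
    A tiny hand-coded illustration of the diagnostic-reasoning idea, assuming invented probabilities: two root causes (software fault, sensor fault) can explain an observed signal discrepancy, and the posterior probability of a software fault is computed by enumeration. The paper's models are far larger.

```python
# Tiny hand-coded Bayesian-network-style diagnosis: enumerate the joint
# distribution over two fault causes given an observed discrepancy.
# All probabilities are invented for illustration.
from itertools import product

P_SW, P_SENS = 0.01, 0.05                # priors on software / sensor fault

def p_disc(sw, sens):
    # P(discrepancy | causes): a noisy-OR-style table (assumed)
    if sw and sens: return 0.99
    if sw:          return 0.90
    if sens:        return 0.80
    return 0.02

num = den = 0.0
for sw, sens in product([True, False], repeat=2):
    joint = ((P_SW if sw else 1 - P_SW)
             * (P_SENS if sens else 1 - P_SENS)
             * p_disc(sw, sens))
    den += joint
    if sw:
        num += joint

print("P(software fault | discrepancy) =", round(num / den, 3))
```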

  14. APT: what it has enabled us to do

    NASA Astrophysics Data System (ADS)

    Blacker, Brett S.; Golombek, Daniel

    2004-09-01

    With the development and operational deployment of the Astronomer's Proposal Tool (APT), Hubble Space Telescope (HST) proposers have been provided with an integrated toolset for Phase I and Phase II. This toolset consists of editors for filling out proposal information, an Orbit Planner for determining observation feasibility, a Visit Planner for determining schedulability, diagnostic and reporting tools, and an integrated Visual Target Tuner (VTT) for viewing exposure specifications. The VTT can also overlay HST's field of view on user-selected Flexible Image Transport System (FITS) images, perform bright object checks, and query the HST archive. In addition to these direct benefits for the HST user, STScI's internal Phase I process has been able to take advantage of the APT products. APT has enabled a substantial streamlining of the process and software processing tools, which compressed the Phase I to Phase II schedule by three months, allowing observations to be scheduled earlier and thus further benefiting HST observers. Some of the improvements to our process include: creating a compact disk (CD) of Phase I products; being able to print all proposals on the day of the deadline; linking the proposal in Portable Document Format (PDF) with a database; and being able to run all Phase I software on a single platform. In this paper we discuss the operational results of using APT for HST's Cycles 12 and 13 Phase I process and show the improvements for the users and the overall process that are allowing STScI to obtain scientific results with HST three months earlier than in previous years. We also show how APT can be and is being used for multiple missions.

  15. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components - the rule-based expert system, a graphics user interface, and communications software - make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and the communications code was incorporated into an Ada expert system that reads data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.
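
    For readers unfamiliar with the rule-based style described, here is a minimal forward-chaining engine with invented facts and rules; the actual system was written in Ada via ART-Ada, not Python.

```python
# Minimal forward-chaining rule engine illustrating rule-based diagnosis.
# Facts and rules are invented examples, not the APEX/SMART-Ada rule base.
facts = {"bus_voltage_low", "switch_commanded_closed"}

# Each rule: (set of antecedent facts, fact to assert when they all hold)
rules = [
    ({"bus_voltage_low", "switch_commanded_closed"}, "switch_fault_suspected"),
    ({"switch_fault_suspected"}, "display_fault_on_screen"),
]

changed = True
while changed:                          # fire rules until no new facts appear
    changed = False
    for antecedents, consequent in rules:
        if antecedents <= facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print(sorted(facts))
```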

  16. Investigating the Effectiveness of Classroom Diagnostic Tools

    ERIC Educational Resources Information Center

    Schultz, Robert K.

    2012-01-01

    The primary purposes of the study are to investigate what teachers experience while using the Classroom Diagnostic Tools (CDT) and to relate those experiences to the rate of growth in students' mathematics achievement. The CDT contains three components: an online computer adaptive diagnostic test, interactive web-based student reports, and…

  17. Pre- and Post-Processing Tools to Streamline the CFD Process

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne Miller

    2002-01-01

    This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.

  18. A NASA-wide approach toward cost-effective, high-quality software through reuse

    NASA Technical Reports Server (NTRS)

    Scheper, Charlotte O. (Editor); Smith, Kathryn A. (Editor)

    1993-01-01

    NASA Langley Research Center sponsored the second Workshop on NASA Research in Software Reuse on May 5-6, 1992 at the Research Triangle Park, North Carolina. The workshop was hosted by the Research Triangle Institute. Participants came from the three NASA centers, four NASA contractor companies, two research institutes and the Air Force's Rome Laboratory. The purpose of the workshop was to exchange information on software reuse tool development, particularly with respect to tool needs, requirements, and effectiveness. The participants presented the software reuse activities and tools being developed and used by their individual centers and programs. These programs address a wide range of reuse issues. The group also developed a mission and goals for software reuse within NASA. This publication summarizes the presentations and the issues discussed during the workshop.

  19. Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed during the period of time before the launch, including accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.
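
    The sketch below gives a rough, single-envelope flavor of inductive monitoring in the spirit of IMS: nominal sensor vectors define an envelope of normal behavior, and new vectors falling outside it are flagged. IMS itself learns multiple clusters; the data and margins here are invented.

```python
# Rough single-envelope sketch of inductive anomaly detection: learn
# per-channel bounds from nominal training data, flag departures.
# (IMS learns multiple clusters; this is a deliberate simplification.)
import numpy as np

rng = np.random.default_rng(3)
nominal = rng.normal([100.0, 20.0], [2.0, 0.5], size=(500, 2))  # nominal training vectors

lo = nominal.min(axis=0) - 0.5   # envelope lower bounds (margin assumed)
hi = nominal.max(axis=0) + 0.5   # envelope upper bounds

def anomalous(x):
    return bool(np.any(x < lo) or np.any(x > hi))

print(anomalous(np.array([101.0, 20.2])))   # inside envelope  -> False
print(anomalous(np.array([115.0, 20.2])))   # outside envelope -> True
```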

  20. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
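
    As an illustration of this kind of spec-to-code translation, the sketch below emits instruction-list-style rungs (a textual equivalent of ladder logic) from a small declarative specification. The spec format and signal names are invented stand-ins; the KSC tool's actual input language is not described in the abstract.

```python
# Conceptual sketch of spec-to-ladder translation: each declarative rule
# ("energize output when all listed inputs are on") becomes one
# instruction-list rung. Everything here is an invented stand-in.
spec = [
    {"output": "VENT_VALVE_OPEN", "when_all": ["PRESSURE_OK", "ARM_CMD"]},
    {"output": "ALARM", "when_all": ["PRESSURE_HIGH"]},
]

def to_rung(rule):
    lines = [f"LD  {rule['when_all'][0]}"]            # load first contact
    lines += [f"AND {sig}" for sig in rule["when_all"][1:]]  # series contacts
    lines.append(f"OUT {rule['output']}")             # coil
    return "\n".join(lines)

for rule in spec:
    print(to_rung(rule))
    print("---")
```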

  1. Diagnostic Tools for the Systemic Reform of Schools.

    ERIC Educational Resources Information Center

    Amsler, Mary; Kirsch, Kayla

    This paper presents three interrelated diagnostic tools that can be used by school staff as they begin to plan a systemic reform effort. These tools are designed to help educators reflect on their experiences in creating changes in their school and to examine the current barriers to and supports for the change process. The tools help school design…

  2. Measurement properties of screening and diagnostic tools for autism spectrum adults of mean normal intelligence: A systematic review.

    PubMed

    Baghdadli, A; Russet, F; Mottron, L

    2017-07-01

    The autism spectrum (AS) is a multifaceted neurodevelopmental variant associated with lifelong challenges. Despite the relevant importance of identifying AS in adults for epidemiological, public health, and quality of life issues, the measurement properties of the tools currently used to screen and diagnose adults without intellectual disabilities (ID) have not been assessed. This systematic review addresses the accuracy, reliability, and validity of the reported AS screening and diagnostic tools used in adults without ID. Electronic databases and bibliographies were searched, and identified papers evaluated against inclusion criteria. The PRISMA statement was used for reporting the review. We evaluated the quality of the papers using the COSMIN Checklist for psychometric data, and QUADAS-2 for diagnostic data. For the COSMIN assessment, evidence was considered to be strong when several methodologically good articles, or one excellent article, reported consistent evidence for or against a measurement property. For the QUADAS ratings, evidence was considered to be "satisfactory" if at least one study was rated with a low risk of bias and low concern about applicability. We included 38 articles comprising 32 studies, five reviews, and one book chapter and assessed nine tools (three diagnostic and six screening, including eight of their short versions). Among screening tools, only AQ-50, AQ-S, and RAADS-R and RAADS-14 were found to provide satisfactory or intermediate values for their psychometric properties, supported by strong or moderate evidence. Nevertheless, risks of bias and concerns on the applicability of these tools limit the evidence on their diagnostic properties. We found that none of the gold standard diagnostic tools used for children had satisfactory measurement properties. There is limited evidence for the measurement properties of the screening and diagnostic tools used for AS adults with a mean normal range of measured intelligence. This may lessen the validity of conclusions and public health decisions on an important fraction of the adult autistic population. This not only justifies further validation studies of screening and diagnostic tools for autistic adults, but also supports the parallel use of self-reported information and clinical expertise with these instruments during the diagnostic process.

  3. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
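
    A minimal docker-compose.yml sketch of the pattern described - one web application plus shared database infrastructure started together with a single `docker-compose up` - with hypothetical image names and credentials; the platform's actual Compose file is not reproduced here.

```yaml
# Minimal docker-compose.yml sketch (service and image names hypothetical):
# one web application plus shared database infrastructure, started
# together with `docker-compose up -d`.
version: "3"
services:
  webapp:
    image: example/screening-webapp:latest   # placeholder image name
    ports:
      - "8080:80"
    depends_on:
      - db
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example             # placeholder credential
    volumes:
      - dbdata:/var/lib/postgresql/data
volumes:
  dbdata:
```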

  4. Canadian Open Genetics Repository (COGR): a unified clinical genomics database as a community resource for standardising and sharing genetic interpretations.

    PubMed

    Lerner-Ellis, Jordan; Wang, Marina; White, Shana; Lebo, Matthew S

    2015-07-01

    The Canadian Open Genetics Repository is a collaborative effort for the collection, storage, sharing and robust analysis of variants reported by medical diagnostics laboratories across Canada. As clinical laboratories adopt modern genomics technologies, the need for this type of collaborative framework is increasingly important. A survey to assess existing protocols for variant classification and reporting was delivered to clinical genetics laboratories across Canada. Based on feedback from this survey, a variant assessment tool was made available to all laboratories. Each participating laboratory was provided with an instance of GeneInsight, a software featuring versioning and approval processes for variant assessments and interpretations and allowing for variant data to be shared between instances. Guidelines were established for sharing data among clinical laboratories and in the final outreach phase, data will be made readily available to patient advocacy groups for general use. The survey demonstrated the need for improved standardisation and data sharing across the country. A variant assessment template was made available to the community to aid with standardisation. Instances of the GeneInsight tool were provided to clinical diagnostic laboratories across Canada for the purpose of uploading, transferring, accessing and sharing variant data. As an ongoing endeavour and a permanent resource, the Canadian Open Genetics Repository aims to serve as a focal point for the collaboration of Canadian laboratories with other countries in the development of tools that take full advantage of laboratory data in diagnosing, managing and treating genetic diseases.

  5. Capturing Your Students' Attention is Easier with this Month's Software Selections. Blue Ribbon Reviews

    ERIC Educational Resources Information Center

    Lindroth, Linda

    2005-01-01

    This article describes new presentation tools and game shows that can make the classroom into a learning stage. RM Easiteach Studio, a presentation software from RM Educational Software, provides teaching tools for use on any interactive whiteboard. Classroom Jeopardy[R] from Educational Insights includes a scoreboard/base control unit, three…

  6. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  7. Research flight software engineering and MUST, an integrated system of support tools

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    Consideration is given to software development to support NASA flight research. The Multipurpose User-Oriented Software Technology (MUST) program, designed to integrate digital systems into flight research, is discussed. Particular attention is given to the program's special interactive user interface, subroutine library, assemblers, compiler, automatic documentation tools, and test and simulation subsystems.

  8. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks were conducted, some observations were obtained, and several possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion of a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  9. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  10. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., object-oriented) have appeared, and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that products meet NASA requirements for reliability measurement. For the new software reliability models of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability models to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated in a tool such as SMERFS'3. This tool, with better models, would greatly add value in assessing GSFC projects.
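
    To make the modeling concrete, the following minimal sketch fits one classic software reliability growth model of the kind tools like SMERFS'3 implement, the Goel-Okumoto NHPP with mean-value function m(t) = a(1 - e^(-bt)); the failure data and starting values are invented for illustration:

        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            # Expected cumulative failures by time t: a = total expected
            # faults, b = per-fault detection rate.
            return a * (1.0 - np.exp(-b * t))

        # Invented test data: week number and cumulative failures observed.
        weeks = np.arange(1, 11, dtype=float)
        cum_failures = np.array([4, 9, 13, 16, 19, 21, 22, 24, 24, 25], dtype=float)

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(30.0, 0.2))
        print(f"estimated total faults: {a_hat:.1f}, detection rate: {b_hat:.3f}")
        print(f"predicted residual faults: {a_hat - cum_failures[-1]:.1f}")

    The estimated residual fault count is the kind of quantity a reliability tool would report to demonstrate whether a product meets a measurable reliability requirement.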

  11. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed in C# under Microsoft .NET to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the author's radiation oncology clinic. During more than 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. The software tool is potentially useful for any radiation oncology clinic that is either in the process of pursuing or maintaining American College of Radiology accreditation.

  12. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  13. Dependency visualization for complex system understanding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smart, J. Allison Cory

    1994-09-01

    With the volume of software in production use dramatically increasing, the importance of software maintenance has become strikingly apparent. Techniques are now being sought and developed for reverse engineering and for design extraction and recovery. At present, numerous commercial products and research tools exist which are capable of visualizing a variety of programming languages and software constructs. The list of new tools and services continues to grow rapidly. Although the scope of the existing commercial and academic product set is quite broad, these tools still share a common underlying problem. The ability of each tool to visually organize object representations is increasingly impaired as the number of components and component dependencies within systems increases. Regardless of how objects are defined, complex "spaghetti" networks result in nearly all large system cases. While this problem is immediately apparent in modern systems analysis involving large software implementations, it is not new. As will be discussed in Chapter 2, related problems involving the theory of graphs were identified long ago. This important theoretical foundation provides a useful vehicle for representing and analyzing complex system structures. While the utility of directed-graph-based concepts in software tool design has been demonstrated in the literature, these tools still lack the capabilities necessary for large system comprehension. This foundation must therefore be expanded with new organizational and visualization constructs necessary to meet this challenge. This dissertation addresses this need by constructing a conceptual model and a set of methods for interactively exploring, organizing, and understanding the structure of complex software systems.
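
    As a small illustration of the directed-graph foundation invoked above, the sketch below encodes a hypothetical module dependency graph and uses Python's standard graphlib either to produce a dependency-respecting ordering or to report the kind of cycle that turns a structure into "spaghetti"; the module names are invented:

        from graphlib import TopologicalSorter, CycleError

        # Hypothetical component graph: each module maps to the modules it
        # depends on.
        deps = {
            "ui":   {"core", "net"},
            "net":  {"core"},
            "core": {"util"},
            "util": {"core"},   # deliberate cycle: core <-> util
        }

        try:
            layers = list(TopologicalSorter(deps).static_order())
            print("dependency-respecting order:", layers)
        except CycleError as err:
            # err.args[1] holds the offending cycle, e.g. ['core', 'util', 'core']
            print("cycle detected:", err.args[1])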

  14. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    PubMed Central

    The Galaxy Team; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for life sciences as a whole. Today evolutionary and population biology reasoning are essential for interpretation of large complex datasets that are characteristic of all domains of today’s life sciences, ranging from cancer biology to microbial ecology. This situation makes algorithms and software tools developed by our community more important than ever before. This means that we, developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments as well as provide adequate documentation and training. PMID:29688462

  15. Virtual Reality as Innovative Approach to the Interior Designing

    NASA Astrophysics Data System (ADS)

    Kaleja, Pavol; Kozlovská, Mária

    2017-06-01

    We can observe significant potential of information and communication technologies (ICT) in the interior designing field through the development of software and hardware virtual reality tools. Using ICT tools offers a realistic perception of a proposal in its initial idea (the study). A group of real-time visualization tools, supported by hardware such as the Oculus Rift and HTC Vive, provides free walkthrough and movement in a virtual interior with the possibility of virtual designing. By improving ICT software tools for designing in virtual reality we can achieve a still more realistic virtual environment. This contribution presents a proposal for an innovative approach to interior designing in virtual reality, using the latest software and hardware ICT virtual reality technologies.

  16. ReGaTE: Registration of Galaxy Tools in Elixir

    PubMed Central

    Mareuil, Fabien; Deveaud, Eric; Kalaš, Matúš; Soranzo, Nicola; van den Beek, Marius; Grüning, Björn; Ison, Jon; Ménager, Hervé

    2017-01-01

    Abstract Background: Bioinformaticians routinely use multiple software tools and data sources in their day-to-day work and have been guided in their choices by a number of cataloguing initiatives. The ELIXIR Tools and Data Services Registry (bio.tools) aims to provide a central information point, independent of any specific scientific scope within bioinformatics or technological implementation. Meanwhile, efforts to integrate bioinformatics software in workbench and workflow environments have accelerated to enable the design, automation, and reproducibility of bioinformatics experiments. One such popular environment is the Galaxy framework, with currently more than 80 publicly available Galaxy servers around the world. In the context of a generic registry for bioinformatics software, such as bio.tools, Galaxy instances constitute a major source of valuable content. Yet there has been, to date, no convenient mechanism to register such services en masse. Findings: We present ReGaTE (Registration of Galaxy Tools in Elixir), a software utility that automates the process of registering the services available in a Galaxy instance. This utility uses the BioBlend application program interface to extract service metadata from a Galaxy server, enhance the metadata with the scientific information required by bio.tools, and push it to the registry. Conclusions: ReGaTE provides a fast and convenient way to publish Galaxy services in bio.tools. By doing so, service providers may increase the visibility of their services while enriching the software discovery function that bio.tools provides for its users. The source code of ReGaTE is freely available on GitHub at https://github.com/C3BI-pasteur-fr/ReGaTE. PMID:28402416
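
    As a rough sketch of the first step ReGaTE automates, pulling tool metadata from a Galaxy server through the BioBlend API, something like the following could be used; the server URL and API key are placeholders, and the metadata ReGaTE actually maps to bio.tools is richer than shown:

        from bioblend.galaxy import GalaxyInstance

        # Placeholder endpoint and key; any reachable Galaxy instance works.
        gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

        # List the tools the instance exposes and keep registry-relevant fields.
        for tool in gi.tools.get_tools():
            print({
                "id": tool.get("id"),
                "name": tool.get("name"),
                "description": tool.get("description"),
            })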

  17. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    NASA Astrophysics Data System (ADS)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple to learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  18. Polarization sensitive camera for the in vitro diagnostic and monitoring of dental erosion

    NASA Astrophysics Data System (ADS)

    Bossen, Anke; Rakhmatullina, Ekaterina; Lussi, Adrian; Meier, Christoph

    Due to the frequent consumption of acidic food and beverages, the prevalence of dental erosion is increasing worldwide. In the initial erosion stage, the hard dental tissue is softened by acidic demineralization. As erosion progresses, gradual tissue wear occurs, resulting in thinning of the enamel. Complete loss of the enamel tissue can be observed in severe clinical cases. Therefore, it is essential to provide a diagnostic tool for the accurate detection and monitoring of dental erosion already at early stages. In this manuscript, we present the development of a polarization-sensitive imaging camera for the visualization and quantification of dental erosion. The system consists of two CMOS cameras mounted on two sides of a polarizing beamsplitter. A horizontally linearly polarized light source is positioned orthogonal to the camera to ensure illumination incidence and detection angles of 45°. The specular reflected light from the enamel surface is collected with an objective lens mounted on the beamsplitter and divided into horizontal (H) and vertical (V) components on each associated camera. Images of non-eroded and eroded enamel surfaces at different erosion degrees were recorded and assessed with diagnostic software. The software was designed to generate and display two types of images: the distribution of the reflection intensity (V) and a polarization ratio (H-V)/(H+V) throughout the analyzed tissue area. The measurement and visualization of these two optical parameters, i.e. specular reflection intensity and polarization ratio, allowed detection and quantification of enamel erosion at early stages in vitro.
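
    Given co-registered frames from the two cameras, the two displayed quantities reduce to simple per-pixel arithmetic; a minimal sketch, with synthetic frames standing in for real acquisitions:

        import numpy as np

        # Synthetic stand-ins for the horizontal (H) and vertical (V) frames
        # captured behind the polarizing beamsplitter.
        rng = np.random.default_rng(0)
        H = rng.uniform(10.0, 200.0, size=(480, 640))
        V = rng.uniform(10.0, 200.0, size=(480, 640))

        reflection_intensity = V                    # first display: intensity (V)
        denom = H + V
        # Second display: per-pixel polarization ratio, guarded against
        # division by zero in dark regions.
        polarization_ratio = np.where(denom > 0, (H - V) / denom, 0.0)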

  19. Designing Real-time Decision Support for Trauma Resuscitations

    PubMed Central

    Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.

    2016-01-01

    Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts would rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among the 26 emergency physicians who responded (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010

  20. Diagnosing Chronic Pancreatitis: Comparison and Evaluation of Different Diagnostic Tools.

    PubMed

    Issa, Yama; van Santvoort, Hjalmar C; van Dieren, Susan; Besselink, Marc G; Boermeester, Marja A; Ahmed Ali, Usama

    2017-10-01

    This study aims to compare the M-ANNHEIM, Büchler, and Lüneburg diagnostic tools for chronic pancreatitis (CP). A cross-sectional analysis of the development of CP was performed in a prospectively collected multicenter cohort including 669 patients after a first episode of acute pancreatitis. We compared the individual components of the M-ANNHEIM, Büchler, and Lüneburg tools, the agreement between tools, and estimated diagnostic accuracy using Bayesian latent-class analysis. A total of 669 patients with acute pancreatitis followed up for a median period of 57 (interquartile range, 42-70) months were included. Chronic pancreatitis was diagnosed in 50 patients (7%), 59 patients (9%), and 61 patients (9%) by the M-ANNHEIM, Lüneburg, and Büchler tools, respectively. The overall agreement between these tools was substantial (κ = 0.75). Differences between the tools regarding the following criteria led to significant changes in the total number of diagnoses of CP: abdominal pain, recurrent pancreatitis, moderate to marked ductal lesions, endocrine and exocrine insufficiency, pancreatic calcifications, and pancreatic pseudocysts. The Büchler tool had the highest sensitivity (94%), followed by the M-ANNHEIM (87%), and finally the Lüneburg tool (81%). Differences between diagnostic tools for CP are mainly attributed to presence of clinical symptoms, endocrine insufficiency, and certain morphological complications.
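
    The substantial agreement reported above is a kappa statistic; as a rough illustration (not the study's code), here is a minimal sketch of Cohen's kappa for two tools' binary diagnoses, with invented patient vectors:

        def cohen_kappa(a, b):
            """Cohen's kappa for two binary raters (1 = CP diagnosed)."""
            n = len(a)
            p_obs = sum(x == y for x, y in zip(a, b)) / n      # observed agreement
            pa, pb = sum(a) / n, sum(b) / n
            p_exp = pa * pb + (1 - pa) * (1 - pb)              # chance agreement
            return (p_obs - p_exp) / (1 - p_exp)

        # Invented example: diagnoses by two tools on ten patients.
        m_annheim = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
        buchler   = [1, 0, 0, 1, 0, 1, 0, 1, 0, 0]
        print(f"kappa = {cohen_kappa(m_annheim, buchler):.2f}")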

  1. The Pathway Tools software.

    PubMed

    Karp, Peter D; Paley, Suzanne; Romero, Pedro

    2002-01-01

    Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.

  2. FunRich proteomics software analysis, let the fun begin!

    PubMed

    Benito-Martin, Alberto; Peinado, Héctor

    2015-08-01

    Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich, an open-access software tool that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations.
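
    Functional enrichment of the kind FunRich reports is commonly scored with a hypergeometric test; a minimal sketch under that assumption (the gene counts are hypothetical, and FunRich's exact statistics may differ):

        from scipy.stats import hypergeom

        # Background of N annotated genes, K of them in the pathway of interest;
        # the experiment yields a hit list of n genes, k of which are in the pathway.
        N, K, n, k = 20000, 150, 300, 12

        # P(X >= k): probability of at least k pathway genes appearing by chance.
        p_enrichment = hypergeom.sf(k - 1, N, K, n)
        print(f"enrichment p-value: {p_enrichment:.2e}")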

  3. New technologies for supporting real-time on-board software development

    NASA Astrophysics Data System (ADS)

    Kerridge, D.

    1995-03-01

    The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, this HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.

  4. Social Software in Academia

    ERIC Educational Resources Information Center

    Bryant, Todd

    2006-01-01

    Considerable buzz has appeared on the Internet over a group of new tools labeled social software. These tools can expand discussion beyond the classroom and provide new ways for students to collaborate and communicate within their class or around the world. Dickinson College has implemented two of the best-known tools, the wiki and the blog, in…

  5. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 4

    DTIC Science & Technology

    2005-04-01

    older automated cost-estimating tools are no longer being actively marketed but are still in use, such as CheckPoint, COCOMO, ESTIMACS, REVIC, and SPQR ... estimation tools: SPQR/20, Checkpoint, and KnowledgePlan. These software estimation tools pioneered the use of function point metrics for sizing and

  6. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…

  7. New software tools for enhanced precision in robot-assisted laser phonomicrosurgery.

    PubMed

    Dagnino, Giulio; Mattos, Leonardo S; Caldwell, Darwin G

    2012-01-01

    This paper describes a new software package created to enhance precision during robot-assisted laser phonomicrosurgery procedures. The new software is composed of three tools for camera calibration, automatic tumor segmentation, and laser tracking. These were designed and developed to improve the outcome of this demanding microsurgical technique, and were tested herein to produce quantitative performance data. The experimental setup was based on the motorized laser micromanipulator created by Istituto Italiano di Tecnologia, and the experimental protocols followed are fully described in this paper. The results show the new tools are robust and effective: the camera calibration tool reduced residual errors (RMSE) to 0.009 ± 0.002 mm under 40× microscope magnification; the automatic tumor segmentation tool resulted in deep lesion segmentations comparable to manual segmentations (RMSE = 0.160 ± 0.028 mm under 40× magnification); and the laser tracker tool proved to be reliable even during cutting procedures (RMSE = 0.073 ± 0.023 mm under 40× magnification). These results demonstrate the new software package can provide excellent improvements to the previous microsurgical system, leading to important enhancements in surgical outcome.

  8. [Relevance of big data for molecular diagnostics].

    PubMed

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research already introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This especially includes omics technologies such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  9. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  10. Developing sustainable software solutions for bioinformatics by the “Butterfly” paradigm

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2014-01-01

    Software design and sustainable software engineering are essential for the long-term development of bioinformatics software. Typical challenges in an academic environment are short-term contracts, island solutions, pragmatic approaches and loose documentation. Upcoming new challenges are big data, complex data sets, software compatibility and rapid changes in data representation. Our approach to cope with these challenges consists of iterative intertwined cycles of development (the “Butterfly” paradigm) for key steps in scientific software engineering. User feedback is valued, as is software planning in a sustainable and interoperable way. Tool usage should be easy and intuitive. A middleware supports a user-friendly Graphical User Interface (GUI) as well as database/tool development independently. We validated the approach in our own software development and compared the different design paradigms in various software solutions. PMID:25383181

  11. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.

  12. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  13. Computer applications making rapid advances in high throughput microbial proteomics (HTMP).

    PubMed

    Anandkumar, Balakrishna; Haga, Steve W; Wu, Hui-Fen

    2014-02-01

    The last few decades have seen the rise of widely available proteomics tools. From new data acquisition devices, such as MALDI-MS and 2DE, to new database-searching software, these new products have paved the way for high throughput microbial proteomics (HTMP). These tools are enabling researchers to gain new insights into microbial metabolism, and are opening up new areas of study, such as protein-protein interaction (interactomics) discovery. Computer software is a key part of these emerging fields. This current review considers: 1) software tools for identifying the proteome, such as MASCOT or PDQuest, 2) online databases of proteomes, such as SWISS-PROT, Proteome Web, or the Proteomics Facility of the Pathogen Functional Genomics Resource Center, and 3) software tools for applying proteomic data, such as PSI-BLAST or VESPA. These tools allow for research in network biology, protein identification, functional annotation, target identification/validation, protein expression, protein structural analysis, metabolic pathway engineering and drug discovery.

  14. Web Implementation of Quality Assurance (QA) for X-ray Units in Balkanic Medical Institutions.

    PubMed

    Urošević, Vlade; Ristić, Olga; Milošević, Danijela; Košutić, Duško

    2015-08-01

    Diagnostic radiology is the major contributor to the total dose of the population from all artificial sources. In order to reduce radiation exposure and optimize diagnostic x-ray image quality, it is necessary to increase the quality and efficiency of quality assurance (QA) and audit programs. This work presents a web application providing completely new QA solutions for x-ray modalities and facilities. The software gives complete online information (using European standards) with which the corresponding institutions and individuals can evaluate and control a facility's Radiation Safety and QA program. The software enables storage of all data in one place and sharing the same information (data), regardless of whether the measured data is used by an individual user or by an authorized institution. The software overcomes the distance and time separation of institutions and individuals who take part in QA. Upgrading the software will enable assessment of the medical exposure level to ionizing radiation.

  15. The Toxicity Estimation Software Tool (T.E.S.T.)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to estimate toxicological values for aquatic and mammalian species considering acute and chronic endpoints for screening purposes within TSCA and REACH programs.

  16. Breast Cancer Diagnostic System Final Report CRADA No. TC02098.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubenchik, A. M.; DaSilva, L. B.

    This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and BioTelligent, Inc., together with a Russian institution (BioFil, Ltd.), to develop a new system (diagnostic device, operating procedures, algorithms and software) to accurately distinguish between benign and malignant breast tissue (Breast Cancer Diagnostic System, BCDS).

  17. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  18. Development of a non-contact diagnostic tool for high power lasers

    NASA Astrophysics Data System (ADS)

    Simmons, Jed A.; Guttman, Jeffrey L.; McCauley, John

    2016-03-01

    High power lasers in excess of 1 kW generate enough Rayleigh scatter, even in the NIR, to be detected by silicon based sensor arrays. A lens and camera system in an off-axis position can therefore be used as a non-contact diagnostic tool for high power lasers. Despite the simplicity of the concept, technical challenges have been encountered in the development of an instrument referred to as BeamWatch. These technical challenges include reducing background radiation, achieving high signal to noise ratio, reducing saturation events caused by particulates crossing the beam, correcting images to achieve accurate beam width measurements, creating algorithms for the removal of non-uniformities, and creating two simultaneous views of the beam from orthogonal directions. Background radiation in the image was reduced by the proper positioning of the back plane and the placement of absorbing materials on the internal surfaces of BeamWatch. Maximizing signal to noise ratio, important to the real-time monitoring of focus position, was aided by increasing lens throughput. The number of particulates crossing the beam path was reduced by creating a positive pressure inside BeamWatch. Algorithms in the software removed non-uniformities in the data prior to generating waist width, divergence, BPP, and M2 results. A dual axis version of BeamWatch was developed by the use of mirrors. By its nature BeamWatch produced results similar to scanning slit measurements. Scanning slit data was therefore taken and compared favorably with BeamWatch results.
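
    Beam widths of the kind reported here are conventionally computed as ISO 11146 second-moment (D4-sigma) widths of the background-subtracted intensity profile; the sketch below shows that calculation in outline (it is not BeamWatch's actual algorithm):

        import numpy as np

        def d4sigma_widths(img):
            """Second-moment (D4-sigma) beam widths, in pixels, of a
            background-subtracted intensity image; returns (dx, dy)."""
            img = np.clip(img, 0.0, None).astype(float)
            total = img.sum()
            y, x = np.indices(img.shape)
            cx = (x * img).sum() / total          # intensity centroid
            cy = (y * img).sum() / total
            var_x = (((x - cx) ** 2) * img).sum() / total
            var_y = (((y - cy) ** 2) * img).sum() / total
            return 4.0 * np.sqrt(var_x), 4.0 * np.sqrt(var_y)

        # Synthetic elliptical Gaussian: expected widths of roughly (42.4, 70.7)
        # pixels, since D4-sigma of exp(-(r/w)^2) is 4*w/sqrt(2).
        yy, xx = np.mgrid[0:200, 0:200]
        beam = np.exp(-(((xx - 100) / 15.0) ** 2 + ((yy - 100) / 25.0) ** 2))
        print(d4sigma_widths(beam))

    Widths measured this way at several planes along the caustic are what feed divergence, BPP, and M2 estimates.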

  19. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    ERIC Educational Resources Information Center

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  20. Role of the Educator in Social Software Initiatives in Further and Higher Education: A Conceptualisation and Research Agenda

    ERIC Educational Resources Information Center

    Minocha, Shailey; Schroeder, Andreas; Schneider, Christoph

    2011-01-01

    Higher and further education institutions are increasingly using social software tools to support teaching and learning. A growing body of research investigates the diversity of tools and their range of contributions. However, little research has focused on investigating the role of the educator in the context of a social software initiative, even…

  1. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation presents status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: Ingredients of a Guidance, Navigation, and Control System (GN&C); Selected GN&C Testbed example; Health Management of major ingredients; ISWHM testbed architecture; and Conclusions and next steps.

  2. Planning for land use and conservation: Assessing GIS-based conservation software for land use planning

    Treesearch

    Rob Baldwin; Ryan Scherzinger; Don Lipscomb; Miranda Mockrin; Susan Stein

    2014-01-01

    Recent advances in planning and ecological software make it possible to conduct highly technical analyses to prioritize conservation investments and inform local land use planning. We review these tools, termed conservation planning tools, and assess the knowledge of a key set of potential users: the land use planning community. We grouped several conservation software...

  3. Report on Automated Semantic Analysis of Scientific and Engineering Codes

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors-like the MCO error-are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.
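
    The MCO loss was ultimately a units mismatch (thruster impulse supplied in pound-force seconds to software expecting newton seconds); as a toy illustration of the physical-meaning checking this report advocates, the sketch below makes mismatched units fail loudly instead of silently (the class and values are invented):

        class Quantity:
            """Toy value-with-unit wrapper; real dimensional analysis is richer."""
            def __init__(self, value, unit):
                self.value, self.unit = value, unit

            def __add__(self, other):
                if self.unit != other.unit:
                    raise TypeError(f"unit mismatch: {self.unit} vs {other.unit}")
                return Quantity(self.value + other.value, self.unit)

        impulse_si = Quantity(450.0, "N*s")
        impulse_imp = Quantity(101.0, "lbf*s")   # should have been converted first
        try:
            total = impulse_si + impulse_imp
        except TypeError as err:
            print(err)   # unit mismatch: N*s vs lbf*s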

  4. Multi-hybrid instrumentations with smartphones and smartpads for innovative in-field and POC diagnostics

    NASA Astrophysics Data System (ADS)

    Hofmann, Dietrich; Dittrich, Paul-Gerald; Gärtner, Claudia; Klemm, Richard

    2013-03-01

    Aim of the paper is the orientation of research and development on a completely new approach to innovative in-field and point of care diagnostics in industry, biology and medicine. Central functional modules are smartphones and/or smartpads supplemented by additional hardware apps and software apps. Specific examples are given for numerous practical applications concerning optodigital instrumentations. The methodical classification distinguishes between different levels for combination of hardware apps (hwapps) and software apps (swapps) with smartphones and/or smartpads. These methods are fundamental enablers for the transformation from stationary conventional laboratory diagnostics into mobile innovative in-field and point of care diagnostics. The innovative approach opens so far untapped enormous markets due to the convenience, reliability and affordability of smartphone and/or smartpad instruments. A highly visible advantage of smartphones and/or smartpads is the huge number of their distribution, their worldwide connectivity via cloud services and the experienced capability of their users for practical operations.

  5. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  6. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance Encase Forensic 7 regarding its performance, functionality, usability and capability. We will show how these software tools work with large forensic images and how capable they are in examining complex and big data scenarios.

  7. Artificial intelligence and expert systems in-flight software testing

    NASA Technical Reports Server (NTRS)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  8. Real-time fluoroscopic needle guidance in the interventional radiology suite using navigational software for percutaneous bone biopsies in children.

    PubMed

    Shellikeri, Sphoorti; Setser, Randolph M; Hwang, Tiffany J; Srinivasan, Abhay; Krishnamurthy, Ganesh; Vatsky, Seth; Girard, Erin; Zhu, Xiaowei; Keller, Marc S; Cahill, Anne Marie

    2017-07-01

    Navigational software provides real-time fluoroscopic needle guidance for percutaneous procedures in the Interventional Radiology (IR) suite. We describe our experience with navigational software for pediatric percutaneous bone biopsies in the IR suite and compare technical success, diagnostic accuracy, radiation dose and procedure time with that of CT-guided biopsies. Pediatric bone biopsies performed using navigational software (Syngo iGuide, Siemens Healthcare) from 2011 to 2016 were prospectively included and anatomically matched CT-guided bone biopsies from 2008 to 2016 were retrospectively reviewed with institutional review board approval. C-arm CT protocols used for navigational software-assisted cases included institution-developed low-dose (0.1/0.17 μGy/projection), regular-dose (0.36 μGy/projection), or a combination of low-dose/regular-dose protocols. Estimated effective radiation dose and procedure times were compared between software-assisted and CT-guided biopsies. Twenty-six patients (15 male; mean age: 10 years) underwent software-assisted biopsies (15 pelvic, 7 lumbar and 4 lower extremity) and 33 patients (13 male; mean age: 9 years) underwent CT-guided biopsies (22 pelvic, 7 lumbar and 4 lower extremity). Both modality biopsies resulted in a 100% technical success rate. Twenty-five of 26 (96%) software-assisted and 29/33 (88%) CT-guided biopsies were diagnostic. Overall, the effective radiation dose was significantly lower in software-assisted than CT-guided cases (3.0±3.4 vs. 6.6±7.7 mSv, P=0.02). The effective dose difference was most dramatic in software-assisted cases using low-dose C-arm CT (1.2±1.8 vs. 6.6±7.7 mSv, P=0.001) or combined low-dose/regular-dose C-arm CT (1.9±2.4 vs. 6.6±7.7 mSv, P=0.04), whereas effective dose was comparable in software-assisted cases using regular-dose C-arm CT (6.0±3.5 vs. 6.6±7.7 mSv, P=0.7). Mean procedure time was significantly lower for software-assisted cases (91±54 vs. 141±68 min, P=0.005). In our experience, navigational software technology in the IR suite is a promising alternative to CT guidance for pediatric bone biopsies providing comparable technical success and diagnostic accuracy with lower radiation dose and procedure time, in addition to providing real-time fluoroscopic needle guidance.

  9. Initial Ada components evaluation

    NASA Technical Reports Server (NTRS)

    Moebes, Travis

    1989-01-01

    The SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV&V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV&V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or difficulty is greater than 190. The McCabe CCM indicated a high quality of software products.
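
    As a rough illustration of both families of metrics described above, the sketch below approximates Halstead's length and difficulty from token counts and McCabe's cyclomatic complexity from decision points in the syntax tree; the token classification is a simplification of Halstead's counting rules:

        import ast
        import io
        import keyword
        import tokenize

        def halstead_length_difficulty(source):
            """Approximate Halstead metrics: operators = punctuation and keywords;
            operands = names, numbers, strings.
            Length N = N1 + N2; difficulty D = (n1 / 2) * (N2 / n2)."""
            operators, operands = [], []
            for tok in tokenize.generate_tokens(io.StringIO(source).readline):
                if tok.type == tokenize.OP or keyword.iskeyword(tok.string):
                    operators.append(tok.string)
                elif tok.type in (tokenize.NAME, tokenize.NUMBER, tokenize.STRING):
                    operands.append(tok.string)
            n1, n2 = len(set(operators)), len(set(operands))
            N1, N2 = len(operators), len(operands)
            difficulty = (n1 / 2) * (N2 / n2) if n2 else 0.0
            return N1 + N2, difficulty

        def cyclomatic_complexity(source):
            """McCabe CC: one more than the number of decision points."""
            decisions = 0
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, (ast.If, ast.IfExp, ast.For, ast.While,
                                     ast.ExceptHandler)):
                    decisions += 1
                elif isinstance(node, ast.BoolOp):
                    decisions += len(node.values) - 1
            return decisions + 1

        src = (
            "def clamp(x, lo, hi):\n"
            "    if x < lo:\n"
            "        return lo\n"
            "    return hi if x > hi else x\n"
        )
        print(halstead_length_difficulty(src), cyclomatic_complexity(src))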

  10. An Algorithm and R Program for Fitting and Simulation of Pharmacokinetic and Pharmacodynamic Data.

    PubMed

    Li, Jijie; Yan, Kewei; Hou, Lisha; Du, Xudong; Zhu, Ping; Zheng, Li; Zhu, Cairong

    2017-06-01

    Pharmacokinetic/pharmacodynamic link models are widely used in dose-finding studies. By applying such models, the results of initial pharmacokinetic/pharmacodynamic studies can be used to predict the potential therapeutic dose range. This knowledge can improve the design of later comparative large-scale clinical trials by reducing the number of participants and saving time and resources. However, the modeling process can be challenging, time consuming, and costly, even when using cutting-edge, powerful pharmacological software. Here, we provide a freely available R program for expediently analyzing pharmacokinetic/pharmacodynamic data, including data importation, parameter estimation, simulation, and model diagnostics. First, we explain the theory related to the establishment of the pharmacokinetic/pharmacodynamic link model. Subsequently, we present the algorithms used for parameter estimation and potential therapeutic dose computation. The implementation of the R program is illustrated by a clinical example. The software package is then validated by comparing the model parameters and the goodness-of-fit statistics generated by our R package with those generated by the widely used pharmacological software WinNonlin. The pharmacokinetic and pharmacodynamic parameters as well as the potential recommended therapeutic dose can be acquired with the R package. The validation process shows that the parameters estimated using our package are satisfactory. The R program developed and presented here provides pharmacokinetic researchers with a simple and easy-to-access tool for pharmacokinetic/pharmacodynamic analysis on personal computers.
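
    As an illustration of the model class behind such link models (a sketch, not the paper's R implementation), the following computes concentrations from a one-compartment oral-absorption PK model and links them directly to an Emax PD model; all parameter values are invented:

        import numpy as np

        def conc_one_cpt_oral(t, dose, f_bio, ka, ke, vd):
            """One-compartment model, first-order absorption (Bateman equation)."""
            return (f_bio * dose * ka) / (vd * (ka - ke)) * (
                np.exp(-ke * t) - np.exp(-ka * t))

        def effect_emax(conc, emax, ec50):
            """Direct-link Emax pharmacodynamic model."""
            return emax * conc / (ec50 + conc)

        t = np.linspace(0.0, 24.0, 97)                       # hours
        conc = conc_one_cpt_oral(t, dose=500.0, f_bio=0.9,
                                 ka=1.2, ke=0.15, vd=40.0)   # mg, 1/h, L
        effect = effect_emax(conc, emax=100.0, ec50=2.5)     # % of max, mg/L
        print(f"Cmax = {conc.max():.2f} mg/L, peak effect = {effect.max():.1f}%")

    Fitting such a model to observed concentration-effect data, as the paper's program does, amounts to estimating the PK and PD parameters above and then simulating candidate doses.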

  11. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  12. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.

  13. Toxicity Estimation Software Tool (TEST)

    EPA Science Inventory

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  14. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  15. Diagnostic test accuracy of nutritional tools used to identify undernutrition in patients with colorectal cancer: a systematic review.

    PubMed

    Håkonsen, Sasja Jul; Pedersen, Preben Ulrich; Bath-Hextall, Fiona; Kirkpatrick, Pamela

    2015-05-15

    Effective nutritional screening, nutritional care planning and nutritional support are essential in all settings, and there is no doubt that a health service seeking to increase safety and clinical effectiveness must take nutritional care seriously. Screening and early detection of malnutrition is crucial in identifying patients at nutritional risk. There is a high prevalence of malnutrition in hospitalized patients undergoing treatment for colorectal cancer. To synthesize the best available evidence regarding the diagnostic test accuracy of nutritional tools (sensitivity and specificity) used to identify malnutrition (specifically undernutrition) in patients with colorectal cancer (such as the Malnutrition Screening Tool and Nutritional Risk Index) compared to reference tests (such as the Subjective Global Assessment or Patient-Generated Subjective Global Assessment). Patients with colorectal cancer requiring either (or all) surgery, chemotherapy and/or radiotherapy in secondary care. Focus of the review: The diagnostic test accuracy of validated assessment tools/instruments (such as the Malnutrition Screening Tool and Nutritional Risk Index) in the diagnosis of malnutrition (specifically under-nutrition) in patients with colorectal cancer, relative to reference tests (Subjective Global Assessment or Patient-Generated Subjective Global Assessment). Types of studies: Diagnostic test accuracy studies regardless of study design. Studies published in English, German, Danish, Swedish and Norwegian were considered for inclusion in this review. Databases were searched from their inception to April 2014. Methodological quality was determined using the Quality Assessment of Diagnostic Accuracy Studies checklist. Data were collected using a standardized data extraction form based on the Standards for Reporting Studies of Diagnostic Accuracy checklist. The accuracy of diagnostic tests is presented in terms of sensitivity, specificity, and positive and negative predictive values. In addition, the positive likelihood ratio (sensitivity / [1 - specificity]) and negative likelihood ratio ([1 - sensitivity] / specificity) were also calculated and presented in this review, to provide information about the likelihood that a given test result would be expected when the target condition is present compared with the likelihood that the same result would be expected when the condition is absent. Not all trials reported true positive, true negative, false positive and false negative rates; therefore, these rates were calculated based on the data in the published papers. A two-by-two truth table was reconstructed for each study, and sensitivity, specificity, positive predictive value, negative predictive value, positive likelihood ratio and negative likelihood ratio were calculated for each study. A summary receiver operator characteristics curve was constructed to determine the relationship between sensitivity and specificity, and the area under the summary receiver operator characteristics curve, which measures the usefulness of a test, was calculated. Meta-analysis was not considered appropriate; therefore, the data were synthesized in a narrative summary. One study evaluated the Malnutrition Screening Tool against the reference standard Patient-Generated Subjective Global Assessment. The sensitivity was 56% and the specificity 84%. The positive likelihood ratio was 3.100, the negative likelihood ratio was 0.59, the diagnostic odds ratio (95% CI) was 5.20 (1.09-24.90), and the Area Under the Curve (AUC) represented only poor to fair diagnostic test accuracy. A total of two studies evaluated the diagnostic accuracy of the Malnutrition Universal Screening Tool (MUST) (index test) compared to both the Subjective Global Assessment (SGA) (reference standard) and the PG-SGA (reference standard) in patients with colorectal cancer. In MUST vs SGA, the sensitivity of the tool was 96%, specificity was 75%, LR+ 3.826, LR- 0.058, diagnostic OR (95% CI) 66.00 (6.61-659.24), and the AUC represented excellent diagnostic accuracy. In MUST vs PG-SGA, the sensitivity of the tool was 72%, specificity 48.9%, LR+ 1.382, LR- 0.579, diagnostic OR (95% CI) 2.39 (0.87-6.58), and the AUC indicated that the tool failed as a diagnostic test to identify patients with colorectal cancer at nutritional risk. The Nutrition Risk Index (NRI) was compared to SGA, showing a sensitivity of 95.2%, specificity of 62.5%, LR+ 2.521, LR- 0.087, diagnostic OR (95% CI) 28.89 (6.93-120.40), and an AUC representing good diagnostic accuracy. In regard to NRI vs PG-SGA, the sensitivity of the tool was 68%, specificity 64%, LR+ 1.947, LR- 0.487, diagnostic OR (95% CI) 4.00 (1.23-13.01), and the AUC indicated poor diagnostic test accuracy. There are no single, specific tools used to screen or assess the nutritional status of colorectal cancer patients. All tools showed varied diagnostic accuracies when compared to the reference standards SGA and PG-SGA. Hence, clinical judgment, combined perhaps with the SGA or PG-SGA, should play a major role. The PG-SGA offers several advantages over the SGA tool: 1) the patient completes the medical history component, thereby decreasing the amount of time involved; 2) it contains more nutrition impact symptoms, which are important to the patient with cancer; and 3) it has a scoring system that allows patients to be triaged for nutritional intervention. Therefore, the PG-SGA could be used as a nutrition assessment tool, as it allows quick identification and prioritization of colorectal cancer patients with malnutrition in combination with other parameters. This systematic review highlights the need for the following: further studies need to investigate the diagnostic accuracy of already existing nutritional screening tools in the context of colorectal cancer patients. If new screening tools are developed, they should be developed and validated in the specific clinical context and within the same patient population (colorectal cancer patients). The Joanna Briggs Institute.
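
    The likelihood-ratio definitions quoted above are simple enough to state as code. The following sketch recomputes the review's two-by-two metrics from raw counts; the counts themselves are illustrative and not taken from any included study:

        def diagnostic_metrics(tp, fp, fn, tn):
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            return {
                "sensitivity": sens,
                "specificity": spec,
                "ppv": tp / (tp + fp),                 # positive predictive value
                "npv": tn / (tn + fn),                 # negative predictive value
                "lr_plus": sens / (1 - spec),          # sensitivity / (1 - specificity)
                "lr_minus": (1 - sens) / spec,         # (1 - sensitivity) / specificity
                "dor": (tp * tn) / (fp * fn),          # diagnostic odds ratio
            }

        print(diagnostic_metrics(tp=30, fp=10, fn=15, tn=45))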

  16. Characteristics of a Cognitive Tool That Helps Students Learn Diagnostic Problem Solving

    ERIC Educational Resources Information Center

    Danielson, Jared A.; Mills, Eric M.; Vermeer, Pamela J.; Preast, Vanessa A.; Young, Karen M.; Christopher, Mary M.; George, Jeanne W.; Wood, R. Darren; Bender, Holly S.

    2007-01-01

    Three related studies replicated and extended previous work (J.A. Danielson et al. (2003), "Educational Technology Research and Development," 51(3), 63-81) involving the Diagnostic Pathfinder (dP) (previously Problem List Generator [PLG]), a cognitive tool for learning diagnostic problem solving. In studies 1 and 2, groups of 126 and 113…

  17. An Instructional Feedback Technique for Teaching Project Management Tools Aligned With PMBOK

    ERIC Educational Resources Information Center

    Goncalves, Rafael Queiroz; von Wangenheim, Christiane A. Gresse; Hauck, Jean C. R.; Zanella, Andreia

    2018-01-01

    Contribution: An approach is presented to provide contextualized feedback for students using a project management (PM) tool. This approach covers the ten PM knowledge areas, guiding students through the planning of software projects. Background: Because software PM is unfeasible without the support of a PM tool, there is a growing demand that these…

  18. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2017-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  19. Database Administration: Concepts, Tools, Experiences, and Problems.

    ERIC Educational Resources Information Center

    Leong-Hong, Belkis; Marron, Beatrice

    The concepts of data base administration, the role of the data base administrator (DBA), and computer software tools useful in data base administration are described in order to assist data base technologists and managers. A study of DBA's in the Federal Government is detailed in terms of the functions they perform, the software tools they use,…

  20. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  1. Assessment of Semi-Structured Clinical Interview for Mobile Phone Addiction Disorder

    PubMed Central

    Alavi, Seyyed Salman; Jannatifard, Fereshteh; Mohammadi Kalhori, Soroush; Sepahbodi, Ghazal; BabaReisi, Mohammad; Sajedi, Sahar; Farshchi, Mojtaba; KhodaKarami, Rasul; Hatami Kasvaee, Vahid

    2016-01-01

    Objective: The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) classified mobile phone addiction disorder under “impulse control disorder not elsewhere classified”. This study surveyed the diagnostic criteria of DSM-IV-TR for the diagnosis of mobile phone addiction in correspondence with Iranian society and culture. Method: Two hundred fifty students of Tehran universities were entered into this descriptive-analytical and cross-sectional study. A quota sampling method was used. First, a semi-structured clinical interview (based on DSM-IV-TR) was performed for all the cases, and another specialist reevaluated the interviews. Data were analyzed using content validity, inter-scorer reliability (Kappa coefficient) and test-retest via SPSS18 software. Results: The content validity of the semi-structured clinical interview matched the DSM-IV-TR criteria for behavioral addiction. Moreover, their content was appropriate, and two items, “SMS pathological use” and “high monthly cost of using the mobile phone”, were added to promote its validity. Internal reliability (Kappa) and test-retest reliability were 0.55 and r = 0.4 (p<0.01), respectively. Conclusion: The results of this study revealed that the semi-structured diagnostic criteria of DSM-IV-TR are valid and reliable for diagnosing mobile phone addiction, and this instrument is an effective tool to diagnose this disorder. PMID:27437008

  2. New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools

    NASA Astrophysics Data System (ADS)

    Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo

    1999-09-01

    As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be completely evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility. It has also resulted in reduced cycle time for new equipment introduction.

  3. Experiences with a generator tool for building clinical application modules.

    PubMed

    Kuhn, K A; Lenz, R; Elstner, T; Siegele, H; Moll, R

    2003-01-01

    To elaborate the main system characteristics and relevant deployment experiences for the health information system (HIS) Orbis/OpenMed, which is in widespread use in Germany, Austria, and Switzerland. Over a deployment phase of three years in a 1,200-bed university hospital, during which the system underwent significant improvements, the system's functionality and its software design were analyzed in detail. We focus on an integrated CASE tool for generating embedded clinical applications and for incremental system evolution. We present a participatory and iterative software engineering process developed for efficient utilization of such a tool. The system's functionality is comparable to that of other commercial products; its components are embedded in a vendor-specific application framework, and standard interfaces are used for connecting subsystems. The integrated generator tool is a remarkable feature; it became a key factor of our project. Tool-generated applications are workflow enabled and embedded into the overall database schema. Rapid prototyping and iterative refinement are supported, so application modules can be adapted to the users' work practice. We consider tools supporting an iterative and participatory software engineering process highly relevant for health information system architects. The potential of a system to continuously evolve and to be effectively adapted to changing needs may be more important than sophisticated but hard-coded HIS functionality. More work will focus on HIS software design and on software engineering. Methods and tools are needed for quick and robust adaptation of systems to health care processes and changing requirements.

  4. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations, which were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low value of the relative error (RE = 0.09) and a high value of the Willmott d-index (d_Will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
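
    The two agreement statistics cited (relative error and the Willmott d-index) are standard measures; since the abstract does not give the authors' exact formulas, the conventional definitions are assumed in this sketch, and the predicted/observed values are invented:

        import numpy as np

        def relative_error(pred, obs):
            # Mean absolute error relative to the observed values.
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

        def willmott_d(pred, obs):
            # Willmott index of agreement: 1 is perfect, 0 is no agreement.
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            om = obs.mean()
            denom = np.sum((np.abs(pred - om) + np.abs(obs - om)) ** 2)
            return float(1 - np.sum((pred - obs) ** 2) / denom)

        obs = [0.82, 0.75, 0.64, 0.58]    # e.g. observed removal fractions
        pred = [0.80, 0.77, 0.66, 0.55]   # model-predicted values
        print(relative_error(pred, obs), willmott_d(pred, obs))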

  5. STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.

    PubMed

    Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X

    2009-08-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.

  6. ReGaTE: Registration of Galaxy Tools in Elixir.

    PubMed

    Doppelt-Azeroual, Olivia; Mareuil, Fabien; Deveaud, Eric; Kalaš, Matúš; Soranzo, Nicola; van den Beek, Marius; Grüning, Björn; Ison, Jon; Ménager, Hervé

    2017-06-01

    Bioinformaticians routinely use multiple software tools and data sources in their day-to-day work and have been guided in their choices by a number of cataloguing initiatives. The ELIXIR Tools and Data Services Registry (bio.tools) aims to provide a central information point, independent of any specific scientific scope within bioinformatics or technological implementation. Meanwhile, efforts to integrate bioinformatics software in workbench and workflow environments have accelerated to enable the design, automation, and reproducibility of bioinformatics experiments. One such popular environment is the Galaxy framework, with currently more than 80 publicly available Galaxy servers around the world. In the context of a generic registry for bioinformatics software, such as bio.tools, Galaxy instances constitute a major source of valuable content. Yet there has been, to date, no convenient mechanism to register such services en masse. We present ReGaTE (Registration of Galaxy Tools in Elixir), a software utility that automates the process of registering the services available in a Galaxy instance. This utility uses the BioBlend application program interface to extract service metadata from a Galaxy server, enhance the metadata with the scientific information required by bio.tools, and push it to the registry. ReGaTE provides a fast and convenient way to publish Galaxy services in bio.tools. By doing so, service providers may increase the visibility of their services while enriching the software discovery function that bio.tools provides for its users. The source code of ReGaTE is freely available on Github at https://github.com/C3BI-pasteur-fr/ReGaTE . © The Author 2017. Published by Oxford University Press.
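
    The harvesting step that ReGaTE automates can be pictured with a few lines of BioBlend. This is a simplified sketch, not ReGaTE itself: the field mapping is minimal, and the registry call shown is a placeholder (the real bio.tools API requires authentication and a full biotoolsSchema payload; see the ReGaTE source for the actual logic):

        import requests
        from bioblend.galaxy import GalaxyInstance

        gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")
        for tool in gi.tools.get_tools():              # metadata for each Galaxy tool
            entry = {
                "name": tool.get("name"),
                "version": tool.get("version"),
                "description": tool.get("description"),
            }
            # Placeholder push; the real registration is richer than this.
            requests.post("https://bio.tools/api/tool/", json=entry, timeout=30)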

  7. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
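
    The package specification language itself is not reproduced in this summary, so the following toy sketch only illustrates the core idea of turning a declarative component description into a makefile; the dictionary format below is invented for illustration and is not the real specification language:

        # Hypothetical component descriptions (not the packager's actual
        # specification language).
        components = {
            "server": {"sources": ["server.c"], "cc": "cc"},
            "client": {"sources": ["client.c"], "cc": "cc"},
        }

        rules = []
        for name, spec in components.items():
            objs = " ".join(s.replace(".c", ".o") for s in spec["sources"])
            rules.append(f"{name}: {objs}\n\t{spec['cc']} -o {name} {objs}")

        with open("Makefile", "w") as fh:
            fh.write("all: " + " ".join(components) + "\n\n")
            fh.write("\n\n".join(rules) + "\n")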

  8. Proceedings of the Thirteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  9. Portable Health Algorithms Test System

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Wong, Edmond; Fulton, Christopher E.; Sowers, Thomas S.; Maul, William A.

    2010-01-01

    A document discusses the Portable Health Algorithms Test (PHALT) System, which has been designed as a means for evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT System allows systems health management algorithms to be developed in a graphical programming environment, to be tested and refined using system simulation or test-data playback, and to be evaluated in a real-time, hardware-in-the-loop mode with a live test article. The integrated hardware and software development environment provides a seamless transition from algorithm development to real-time implementation. The portability of the hardware makes it quick and easy to transport between test facilities. This hardware/software architecture is flexible enough to support a variety of diagnostic applications and test hardware, and the GUI-based rapid prototyping capability is sufficient to support development, execution, and testing of custom diagnostic algorithms. The PHALT operating system supports execution of diagnostic algorithms under real-time constraints. PHALT can perform real-time capture and playback of test-rig data, with the ability to augment/modify the data stream (e.g., inject simulated faults). It performs algorithm testing using a variety of data input sources, including real-time data acquisition, test data playback, and system simulations, and also provides system feedback to evaluate closed-loop diagnostic response and mitigation control.
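
    The playback-with-fault-injection capability can be illustrated generically. This sketch is not NASA's implementation; the sample values and the constant-bias fault model are assumptions:

        def replay_with_fault(samples, fault_start, fault_end, bias):
            # Replay recorded samples, overlaying a simulated sensor-bias
            # fault over the chosen index window.
            for t, value in enumerate(samples):
                if fault_start <= t < fault_end:
                    value += bias
                yield t, value

        recorded = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.8]
        for t, v in replay_with_fault(recorded, fault_start=2, fault_end=5, bias=3.0):
            print(t, v)    # a diagnostic algorithm would consume this stream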

  10. Reusable science tools for analog exploration missions: xGDS Web Tools, VERVE, and Gigapan Voyage

    NASA Astrophysics Data System (ADS)

    Lee, Susan Y.; Lees, David; Cohen, Tamar; Allan, Mark; Deans, Matthew; Morse, Theodore; Park, Eric; Smith, Trey

    2013-10-01

    The Exploration Ground Data Systems (xGDS) project led by the Intelligent Robotics Group (IRG) at NASA Ames Research Center creates software tools to support multiple NASA-led planetary analog field experiments. The two primary tools that fall under the xGDS umbrella are the xGDS Web Tools (xGDS-WT) and Visual Environment for Remote Virtual Exploration (VERVE). IRG has also developed a hardware and software system that is closely integrated with our xGDS tools and is used in multiple field experiments called Gigapan Voyage. xGDS-WT, VERVE, and Gigapan Voyage are examples of IRG projects that improve the ratio of science return versus development effort by creating generic and reusable tools that leverage existing technologies in both hardware and software. xGDS Web Tools provides software for gathering and organizing mission data for science and engineering operations, including tools for planning traverses, monitoring autonomous or piloted vehicles, visualization, documentation, analysis, and search. VERVE provides high performance three dimensional (3D) user interfaces used by scientists, robot operators, and mission planners to visualize robot data in real time. Gigapan Voyage is a gigapixel image capturing and processing tool that improves situational awareness and scientific exploration in human and robotic analog missions. All of these technologies emphasize software reuse and leverage open source and/or commercial-off-the-shelf tools to greatly improve the utility and reduce the development and operational cost of future similar technologies. Over the past several years, these technologies have been used in many NASA-led robotic field campaigns including the Desert Research and Technology Studies (DRATS), the Pavilion Lake Research Project (PLRP), the K10 Robotic Follow-Up tests, and most recently we have become involved in the NASA Extreme Environment Mission Operations (NEEMO) field experiments. A major objective of these joint robot and crew experiments is to improve NASA's understanding of how to most effectively execute and increase science return from exploration missions. This paper focuses on an integrated suite of xGDS software and compatible hardware tools: xGDS Web Tools, VERVE, and Gigapan Voyage, how they are used, and the design decisions that were made to allow them to be easily developed, integrated, tested, and reused by multiple NASA field experiments and robotic platforms.

  11. Modular Analytical Multicomponent Analysis in Gas Sensor Arrays

    PubMed Central

    Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor

    2006-01-01

    A multi-sensor system is a chemical sensor system that quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation: a modular software package for the acquisition of data from different sensors. A signal evaluation algorithm referred to as the matrix method was used specifically for the software tool. This matrix method computes the gas concentrations from the signals of a sensor array. The software tool was used for the simulation of an array of five sensors to determine gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the present simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of the gas, even if the system was not previously exposed to this concentration; and (c) telling when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database, one can create, generate and manage scenarios and source files for the simulation. With the gas sensor database and the simulation software, an on-line Web-based version was developed, with which the user can configure and simulate sensor arrays on-line.
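
    The abstract does not spell out the matrix method; one common reading, assumed here, is a linear sensor model s = A·c, with A holding each sensor's sensitivity to each gas, solved for the concentration vector c by least squares. The sensitivity values and signals below are illustrative:

        import numpy as np

        A = np.array([            # rows: five sensors, columns: five gases
            [0.9, 0.1, 0.2, 0.1, 0.0],
            [0.1, 0.8, 0.1, 0.0, 0.1],
            [0.2, 0.1, 0.7, 0.1, 0.1],
            [0.0, 0.1, 0.1, 0.9, 0.1],
            [0.1, 0.0, 0.1, 0.1, 0.8],
        ])
        signals = np.array([1.2, 0.6, 0.9, 0.5, 0.7])

        conc, *_ = np.linalg.lstsq(A, signals, rcond=None)
        for gas, c in zip(["CH4", "NH3", "H2", "CO", "C2H5OH"], conc):
            print(gas, round(float(c), 3))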

  12. Methods to ensure the standardization of FORTRAN software. [PFORT, DAVE, POLISH, and BRNANL, for analysis and editing of codes, in FORTRAN for PDP-10 and IBM 360 and 370

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaffney, P.W.; Wooten, J.W.

    1980-05-01

    Four software tools, PFORT, DAVE, POLISH, and BRNANL, which may be used to ensure the standardization of FORTRAN software, are introduced. First, FORTRAN computer programs are loosely classified into three groups. Then reasons are given why the programs in two of these groups should adhere to a portable subset of the American National Standard (ANS) FORTRAN 1966. Next, the software tools PFORT, DAVE, POLISH, and BRNANL are briefly described, and examples of the output from PFORT, DAVE, and POLISH are given. Finally, the dissemination of information pertaining to the tools, together with their availability, is outlined. 11 figures.

  13. Volumetric neuroimage analysis extensions for the MIPAV software package.

    PubMed

    Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L

    2007-09-15

    We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.

  14. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  15. Payload Operations Support Team Tools

    NASA Technical Reports Server (NTRS)

    Askew, Bill; Barry, Matthew; Burrows, Gary; Casey, Mike; Charles, Joe; Downing, Nicholas; Jain, Monika; Leopold, Rebecca; Luty, Roger; McDill, David

    2007-01-01

    Payload Operations Support Team Tools is a software system that assists in (1) development and testing of software for payloads to be flown aboard the space shuttles and (2) training of payload customers, flight controllers, and flight crews in payload operations.

  16. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  17. Stability analysis using SDSA tool

    NASA Astrophysics Data System (ADS)

    Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa

    2011-11-01

    The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of an aircraft as early as the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment, which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand-alone software, and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free-flight experimental results to compare and validate the presented software. Two aircraft are considered: the EADS Ranger 2000 and the Warsaw University-designed PW-6 glider. For the two cases considered here, the SDSA software is shown to be an excellent tool for predicting the dynamic characteristics of an aircraft.

  18. Software Certification for Temporal Properties With Affordable Tool Qualification

    NASA Technical Reports Server (NTRS)

    Xia, Songtao; DiVito, Benedetto L.

    2005-01-01

    It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.

  19. Results of Software and Services Citations Review at ESIP

    NASA Astrophysics Data System (ADS)

    Hausman, J.; Gallagher, J. H. R.; Stall, S.

    2017-12-01

    Citations for software and services/tools are important, as they provide a way to improve the reproducibility of science, offer better provenance, and make it easier to attribute credit to the developers. Software citations are trickier than those for papers or data because software can be very dynamic, so it is a bit of a moving target. Citation is even more difficult for services/tools, as they usually take data as inputs, so a relation between the tool and the data is also needed. There are suggested citation formats, but they do not always contain enough information that can be easily gleaned or obtained by a metrics crawler. At the Summer 2017 Earth Science Information Partners (ESIP) meeting, a workshop was held to evaluate the effectiveness of a citation. This presentation will summarize those results and put forth adjustments to the format. These adjustments will make it easier to verify that the citation is for a service or software and to harvest information from it.

  20. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  1. Diagnostic and Therapeutic Cancer Care Equipment

    DTIC Science & Technology

    2009-07-01

    “Content of Premarket Submissions for Software Contained in Medical Devices” available at http://www.fda.gov/cdrh/ode/guidance/337.pdf and “Guidance for...Off-the-Shelf Software Use in Medical Devices” available at http://www.fda.gov/cdrh/ode/guidance/585.pdf. 5. It is unclear what the “Nellcor Puritan

  2. Monitoring the performance of the next Climate Forecast System version 3, throughout its development stage at EMC/NCEP

    NASA Astrophysics Data System (ADS)

    Peña, M.; Saha, S.; Wu, X.; Wang, J.; Tripp, P.; Moorthi, S.; Bhattacharjee, P.

    2016-12-01

    The next version of the operational Climate Forecast System (version 3, CFSv3) will be a fully coupled six-component system with diverse applications to earth system modeling, including weather and climate predictions. This system will couple the earth's atmosphere, land, ocean, sea-ice, waves and aerosols for both data assimilation and modeling. It will also use the NOAA Environmental Modeling System (NEMS) software superstructure to couple these components. The CFSv3 is part of the next Unified Global Coupled System (UGCS), which will unify the global prediction systems that are now operational at NCEP. The UGCS is being developed through the efforts of dedicated research and engineering teams and through coordination across many CPO/MAPP and NGGPS groups. During this development phase, the UGCS is being tested for seasonal purposes and undergoes frequent revisions. Each new revision is evaluated to quickly discover, isolate and solve problems that negatively impact its performance. In the UGCS-seasonal model, components (e.g., ocean, sea-ice, atmosphere, etc.) are coupled through a NEMS-based "mediator". In this numerical infrastructure, model diagnostics and forecast validation are carried out, both component by component, and as a whole. The next stage, model optimization, will require enhanced performance diagnostics tools to help prioritize areas of numerical improvement. After the technical development of the UGCS-seasonal is completed, it will become the first realization of the CFSv3. All future development of this system will be carried out by the climate team at NCEP, in scientific collaboration with the groups that developed the individual components, as well as the climate community. A unique challenge in evaluating this unified weather-climate system is the large number of variables, which evolve over a wide range of temporal and spatial scales. A small set of performance measures and scorecard displays are being created, and collaboration and software contributions from research and operational centers are being incorporated. A status of the CFSv3/UGCS-seasonal development and examples of its performance and measuring tools will be presented.

  3. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  4. Circulating miR-128 as a potential diagnostic biomarker for glioma.

    PubMed

    Liang, Ruo-Fei; Li, Mao; Yang, Yuan; Wang, Xiang; Mao, Qing; Liu, Yan-Hui

    2017-09-01

    miR-128 in circulation is a promising marker for early diagnosis of glioma. A meta-analysis was performed to evaluate the diagnostic accuracy and clinical value of circulating miR-128 in patients with glioma. A comprehensive literature search for relevant published articles (last search updated on December 29, 2016) was conducted in the Chinese Biomedical Literature Database, PubMed, and Embase. The quality assessment of diagnostic accuracy studies (QUADAS) tool was used to score the quality of the eligible studies. Meta-Disc 1.4 software was used to test for heterogeneity and to perform the meta-analysis. The three studies included in this meta-analysis enrolled a total of 191 patients with glioma and 73 individuals without tumor. Using a fixed-effect model analysis, the summary assessments revealed that the pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were 0.89 (95% CI: 0.84-0.93), 0.90 (95% CI: 0.81-0.96), 8.07 (95% CI: 4.21-15.46), and 0.13 (95% CI: 0.09-0.19), respectively. The diagnostic odds ratio (DOR) of miR-128 was 65.00 (95% CI: 26.90-157.10), indicating that the overall accuracy of the miR-128 test for detecting glioma was high. The value of I² was 0.0%, indicating that there was no significant heterogeneity among studies. The present meta-analysis showed that circulating miR-128 might be a promising noninvasive biomarker for diagnosing glioma. Copyright © 2017 Elsevier B.V. All rights reserved.
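
    For a single study, the diagnostic odds ratio and its confidence interval follow from the two-by-two counts via a normal approximation on the log scale; the counts in this sketch are illustrative, not data from the meta-analysis:

        import math

        tp, fp, fn, tn = 85, 7, 10, 66                # hypothetical counts
        dor = (tp * tn) / (fp * fn)                   # diagnostic odds ratio
        se = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)     # SE of ln(DOR)
        lo = math.exp(math.log(dor) - 1.96 * se)
        hi = math.exp(math.log(dor) + 1.96 * se)
        print(f"DOR = {dor:.2f} (95% CI {lo:.2f}-{hi:.2f})")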

  5. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BEVINS, R.R.

    2000-09-20

    The MPSS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPSS software. The document also describes the methodology for controlling and managing changes to the software.

  6. From Colorado to Guam: Infant Diagnostic Audiological Evaluations by Telepractice

    ERIC Educational Resources Information Center

    Hayes, Deborah; Eclavea, Elaine; Dreith, Susan; Habte, Bereket

    2012-01-01

    This manuscript describes a pilot project in which infants in Guam who refer on newborn hearing screening receive diagnostic audiological evaluation conducted by audiologists in Colorado over the Internet (telepractice). The evaluation is completed in real time using commercially-available software and personal computers to control the diagnostic…

  7. Cognitive Diagnostic Modeling Using R

    ERIC Educational Resources Information Center

    Ravand, Hamdollah

    2015-01-01

    Cognitive diagnostic models (CDM) have been around for more than a decade, but their application is far from widespread, mainly for two reasons: (1) CDMs are novel compared to traditional IRT models; consequently, many researchers lack familiarity with them and their properties, and (2) software programs doing CDMs have been expensive and not…

  8. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  9. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  10. Morphology in the Digital Age: Integrating High Resolution Description of Structural Alterations with Phenotypes and Genotypes

    PubMed Central

    Nast, Cynthia C.; Lemley, Kevin V.; Hodgin, Jeffrey B.; Bagnasco, Serena; Avila-Casado, Carmen; Hewitt, Stephen M; Barisoni, Laura

    2015-01-01

    Conventional light microscopy (CLM) has been used to characterize and classify renal diseases, evaluate histopathology in studies and trials, and educate renal pathologists and nephrologists. The advent of digital pathology, in which a glass slide can be scanned to create whole slide images (WSI) for viewing and manipulating on a computer monitor, provides real and potential advantages over CLM. Software tools such as annotation, morphometry and image analysis can be applied to WSIs for studies or educational purposes, and the digital images are globally available to clinicians, pathologists and investigators. New ways of assessing renal pathology with observational data collection may allow better morphologic correlations and integration with molecular and genetic signatures, refinements of classification schema, and understanding of disease pathogenesis. In multicenter studies, WSI, which require additional quality assurance steps, provide efficiencies by reducing slide shipping and consensus conference costs, and allowing anytime anywhere slide viewing. While validation studies for the routine diagnostic use of digital pathology still are needed, this is a powerful tool currently available for translational research, clinical trials and education in renal pathology. PMID:26215864

  11. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools has been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
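
    The Work Queue layer that Lobster builds on has Python bindings in cctools. A minimal master, sketched here (not Lobster itself, and with illustrative file names and commands), submits tasks and collects results:

        from work_queue import WorkQueue, Task

        q = WorkQueue(port=9123)                 # workers connect to this port
        for i in range(10):
            t = Task("python analyze.py input_%d.dat output_%d.dat" % (i, i))
            t.specify_input_file("analyze.py")
            t.specify_input_file("input_%d.dat" % i)
            t.specify_output_file("output_%d.dat" % i)
            q.submit(t)

        while not q.empty():
            t = q.wait(5)                        # finished task, or None on timeout
            if t:
                print("task", t.id, "exited with", t.return_status)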

  12. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    NASA Astrophysics Data System (ADS)

    Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.

    2013-04-01

    A novel technique based on harmonic analysis of bioimpedance microvariations, using an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M, for Mayer wave) and the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified according to the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder, as well as in the penile spectrum, characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted toward the norm in cases of effective therapy. Bioimpedance harmonic analysis seems to be a potent tool for assessing regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.
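
    The spectral decomposition described can be pictured with a discrete Fourier transform: take the spectrum of an impedance recording and read the peak in each rhythm band. The synthetic signal, sampling rate, and band edges below are illustrative assumptions:

        import numpy as np

        fs = 20.0                                    # sampling rate, Hz
        t = np.arange(0, 300, 1 / fs)                # five-minute recording
        z = (0.05 * np.sin(2 * np.pi * 0.10 * t)     # Mayer wave (M)
             + 0.03 * np.sin(2 * np.pi * 0.25 * t)   # respiration (R)
             + 0.02 * np.sin(2 * np.pi * 1.20 * t))  # cardiac rhythm (C)

        spectrum = np.abs(np.fft.rfft(z)) * 2 / len(t)
        freqs = np.fft.rfftfreq(len(t), 1 / fs)
        for label, lo, hi in [("M", 0.05, 0.15), ("R", 0.15, 0.5), ("C", 0.8, 2.0)]:
            band = (freqs >= lo) & (freqs < hi)
            peak = freqs[band][spectrum[band].argmax()]
            print(label, "peak at", round(float(peak), 2), "Hz")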

  13. Advances in molecular labeling, high throughput imaging and machine intelligence portend powerful functional cellular biochemistry tools.

    PubMed

    Price, Jeffrey H; Goodacre, Angela; Hahn, Klaus; Hodgson, Louis; Hunter, Edward A; Krajewski, Stanislaw; Murphy, Robert F; Rabinovich, Andrew; Reed, John C; Heynen, Susanne

    2002-01-01

    Cellular behavior is complex. Successfully understanding systems at ever-increasing complexity is fundamental to advances in modern science and unraveling the functional details of cellular behavior is no exception. We present a collection of prospectives to provide a glimpse of the techniques that will aid in collecting, managing and utilizing information on complex cellular processes via molecular imaging tools. These include: 1) visualizing intracellular protein activity with fluorescent markers, 2) high throughput (and automated) imaging of multilabeled cells in statistically significant numbers, and 3) machine intelligence to analyze subcellular image localization and pattern. Although not addressed here, the importance of combining cell-image-based information with detailed molecular structure and ligand-receptor binding models cannot be overlooked. Advanced molecular imaging techniques have the potential to impact cellular diagnostics for cancer screening, clinical correlations of tissue molecular patterns for cancer biology, and cellular molecular interactions for accelerating drug discovery. The goal of finally understanding all cellular components and behaviors will be achieved by advances in both instrumentation engineering (software and hardware) and molecular biochemistry. Copyright 2002 Wiley-Liss, Inc.

  14. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  15. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-06-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, geologist, and geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth from 8,000 to 10,000 ft.
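
    The clustering-plus-neural-network pattern named as the keystone of the tool kit can be sketched with scikit-learn; the synthetic seismic attributes and the phi-h relationship below are invented for illustration:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        attributes = rng.normal(size=(200, 4))       # e.g. amplitude, frequency...
        phi_h = attributes @ np.array([2.0, -1.0, 0.5, 0.0]) \
                + 0.1 * rng.normal(size=200)         # synthetic storage values

        facies = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(attributes)
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                             random_state=0).fit(attributes, phi_h)
        print("cluster sizes:", np.bincount(facies))
        print("example phi-h prediction:", model.predict(attributes[:1]))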

  16. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth D. Luff

    2002-09-30

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). Luff Exploration Company is applying these tools for analysis of carbonate reservoirs in the southern Williston Basin. The integrated software programs are designed to be used by a small team consisting of an engineer, a geologist and a geophysicist. The software tools are flexible and robust, allowing application in many environments for hydrocarbon reservoirs. Keystone elements of the software tools include clustering and neural-network techniques. The tools are used to transform seismic attribute data to reservoir characteristics such as storage (phi-h), probable oil-water contacts, structural depths, and structural growth history. When these reservoir characteristics are combined with neural network or fuzzy logic solvers, they can provide a more complete description of the reservoir. This leads to better estimates of hydrocarbons in place, areal limits and potential for infill or step-out drilling. These tools were developed and tested using seismic, geologic and well data from the Red River Play in Bowman County, North Dakota and Harding County, South Dakota. The geologic setting for the Red River Formation is shallow-shelf carbonate at a depth of 8,000 to 10,000 ft.

  17. Evaluation of work zone enhancement software programs.

    DOT National Transportation Integrated Search

    2009-09-01

    The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, VISSIM, and Spreadsheet models are the tools that MoD...

  18. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma; Kiliccote, Sila; McParland, Charles

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads, which include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power-flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on the change in voltage phase angle between two points in conjunction with new and existing distribution-grid planning and operational tools is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors’ sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve and ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
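
    The core quantity in the report is the change in voltage phase angle between two points. A minimal sketch of that calculation follows, assuming two hypothetical, time-aligned µPMU angle streams in degrees.

      # Wrapped phase-angle difference between two µPMU streams; the
      # sample angle values are invented and assumed time-aligned.
      import numpy as np

      def phase_angle_difference(theta_a_deg, theta_b_deg):
          """Wrapped difference theta_a - theta_b in degrees, in [-180, 180)."""
          diff = np.asarray(theta_a_deg) - np.asarray(theta_b_deg)
          return (diff + 180.0) % 360.0 - 180.0

      feeder_head = np.array([10.2, 10.4, 179.9, -179.8])
      feeder_end = np.array([9.9, 10.0, -179.9, 179.7])
      # Wrapping keeps differences near the +/-180 degree seam small.
      print(phase_angle_difference(feeder_head, feeder_end))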

  19. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."

  20. Environmental databases and other computerized information tools

    NASA Technical Reports Server (NTRS)

    Clark-Ingram, Marceia

    1995-01-01

    Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.

  1. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine them for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
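
    To illustrate the scheduling problem the review describes, the sketch below (with invented transitions, not taken from any reviewed package) computes how many retention-time windows overlap at once, which is the concurrency a scheduled MRM method must sustain.

      # Scheduled-MRM concurrency check; the transitions are invented.
      # Each transition: (precursor m/z, fragment m/z, expected RT in min, half-window in min).
      transitions = [
          (523.8, 659.3, 12.4, 1.0),
          (523.8, 788.4, 12.4, 1.0),
          (611.2, 904.5, 12.9, 1.0),
          (445.7, 573.2, 30.1, 1.0),
      ]

      def max_concurrent(transitions):
          """Peak number of transitions whose RT windows overlap."""
          events = []
          for _, _, rt, half_width in transitions:
              events.append((rt - half_width, +1))  # window opens
              events.append((rt + half_width, -1))  # window closes
          events.sort()
          peak = current = 0
          for _, delta in events:
              current += delta
              peak = max(peak, current)
          return peak

      print(max_concurrent(transitions))  # -> 3 for the windows above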

  2. Climate tools in mainstream Linux distributions

    NASA Astrophysics Data System (ADS)

    McKinstry, Alastair

    2015-04-01

    Debian/meteorology is a project to integrate climate tools and analysis software into the mainstream Debian/Ubuntu Linux distributions. This work describes lessons learnt and recommends practices for scientific software to be adopted and maintained in OS distributions. In addition to standard analysis tools (cdo, grads, ferret, metview, ncl, etc.), software used by the Earth System Grid Federation was chosen for integration, to enable ESGF portals to be built on this base; however, exposing scientific codes via web APIs exposes security weaknesses that could normally be ignored. How tools are hardened, and what changes are required to handle security upgrades, are described. Secondly, enabling libraries and components (e.g., Python modules) to be integrated requires planning by their writers: it is not sufficient to assume users can upgrade their code when you make incompatible changes. Here, practices are recommended to enable upgrades and co-installability of C, C++, Fortran and Python codes. Finally, software packages such as NetCDF and HDF5 can be built in multiple configurations. Tools may then expect incompatible versions of these libraries (e.g., serial and parallel) to be simultaneously available; how this was solved in Debian using "pkg-config" and shared library interfaces is described, and best practices for software writers to enable this are summarised.
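
    The pkg-config mechanism described above can be queried programmatically; a minimal sketch follows, with the caveat that the flavor module names used here (hdf5-serial, hdf5-openmpi) are assumptions to be checked against `pkg-config --list-all` on an actual Debian system.

      # Query pkg-config for the compiler/linker flags of a library flavor.
      # The flavor names below are assumptions; verify with `pkg-config --list-all`.
      import subprocess

      def pkg_flags(module):
          """Return (cflags, libs) reported by pkg-config for a module."""
          cflags = subprocess.run(["pkg-config", "--cflags", module],
                                  capture_output=True, text=True, check=True).stdout.split()
          libs = subprocess.run(["pkg-config", "--libs", module],
                                capture_output=True, text=True, check=True).stdout.split()
          return cflags, libs

      for flavor in ("hdf5-serial", "hdf5-openmpi"):
          try:
              cflags, libs = pkg_flags(flavor)
              print(flavor, cflags, libs)
          except (FileNotFoundError, subprocess.CalledProcessError):
              print(flavor, "not found")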

  3. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) Configuration Items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  4. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  5. The Role of Computers in Research and Development at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  6. A portable hardware-in-the-loop (HIL) device for automotive diagnostic control systems.

    PubMed

    Palladino, A; Fiengo, G; Lanzo, D

    2012-01-01

    In-vehicle driving tests for evaluating the performance and diagnostic functionalities of engine control systems are often time consuming, expensive, and not reproducible. Using a hardware-in-the-loop (HIL) simulation approach, new control strategies and diagnostic functions on a controller area network (CAN) line can be easily tested in real time, in order to reduce the effort and the cost of the testing phase. Nowadays, spark ignition engines are controlled by an electronic control unit (ECU) with a large number of embedded sensors and actuators. In order to meet the rising demand of lower emissions and fuel consumption, an increasing number of control functions are added into such a unit. This work aims at presenting a portable electronic environment system, suited for HIL simulations, in order to test the engine control software and the diagnostic functionality on a CAN line, respectively, through non-regression and diagnostic tests. The performances of the proposed electronic device, called a micro hardware-in-the-loop system, are presented through the testing of the engine management system software of a 1.6 l Fiat gasoline engine with variable valve actuation for the ECU development version. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
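
    A minimal sketch of the kind of diagnostic request/response exchange exercised on a CAN line follows, using the python-can package and its in-process virtual bus; the bus setup and the simulated ECU reply are illustrative, not the paper's micro-HIL software. The bytes follow the common OBD-II convention for mode 0x01, PID 0x0C (engine RPM).

      # Diagnostic request/response on a virtual CAN bus via python-can;
      # an illustration only, not the paper's micro hardware-in-the-loop system.
      import can

      tester = can.Bus(interface="virtual", channel="hil-demo")
      ecu = can.Bus(interface="virtual", channel="hil-demo")

      # OBD-II functional request: 2 data bytes, mode 0x01, PID 0x0C (engine RPM).
      tester.send(can.Message(arbitration_id=0x7DF,
                              data=[0x02, 0x01, 0x0C, 0, 0, 0, 0, 0],
                              is_extended_id=False))
      print("ECU saw:", ecu.recv(timeout=1.0))

      # Simulated ECU reply: RPM = ((A*256)+B)/4 = 1600 rpm with A=0x19, B=0x00.
      ecu.send(can.Message(arbitration_id=0x7E8,
                           data=[0x04, 0x41, 0x0C, 0x19, 0x00, 0, 0, 0],
                           is_extended_id=False))
      print("Tester saw:", tester.recv(timeout=1.0))
      tester.shutdown()
      ecu.shutdown()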

  7. cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.

    PubMed

    Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E

    2018-06-01

    Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
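
    The general pattern of submitting a containerized processing job to AWS from a local machine can be sketched with boto3 and AWS Batch; the queue, job definition, region, and command below are assumptions for illustration, and cryoem-cloud-tools' actual interface and choice of AWS services may differ.

      # General pattern for pushing a containerized job to AWS Batch with boto3.
      # Queue, job definition, region, and command are hypothetical; this is
      # not cryoem-cloud-tools' real interface.
      import boto3

      batch = boto3.client("batch", region_name="us-east-1")

      response = batch.submit_job(
          jobName="relion-refine-demo",
          jobQueue="cryoem-queue",        # hypothetical job queue
          jobDefinition="relion-gpu:1",   # hypothetical job definition
          containerOverrides={
              "command": ["relion_refine", "--i", "particles.star", "--o", "run1/"],
          },
      )
      print("Submitted job:", response["jobId"])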

  8. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software generation systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, and software development automation using CASE tools is said to enable a significant level of automation. Today's CASE tools usually offer a combination of several features: a model editor and a model repository in the traditional ones, and, in the most advanced ones, a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  9. Exoskeletons, Robots and System Software: Tools for the Warfighter

    DTIC Science & Technology

    2012-04-24

    Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am–12:00 pm. "The views... Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will... global hub for educating, informing, and connecting Information Age leaders." What is an exoskeleton? An exoskeleton is a wearable robot suit that

  10. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  11. Evaluation and Validation (E&V) Team Public Report. Volume 5

    DTIC Science & Technology

    1990-10-31

    aspects, software engineering practices, etc. The E&V requirements which are developed will be used to guide the E&V technical effort. The currently... interoperability of Ada software engineering environment tools and data. The scope of the CAIS-A includes the functionality affecting transportability that is... requirement that they be CAIS-conforming tools or data. That is, for example, numerous CIVC data exist on special-purpose software currently available

  12. An overview of 3D software visualization.

    PubMed

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. For many years, visualization in 2D space has been actively studied, but in the last decade researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects: visual representations, interaction issues, evaluation methods and development tools. We also perform a survey of some representative tools that support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude by identifying future research directions.

  13. Toward Intelligent Software Defect Detection

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
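
    A toy sketch of the learning-by-example idea follows: flagging suspicious source lines with bag-of-tokens features and logistic regression in scikit-learn. The training lines and labels are fabricated for illustration, and the paper's actual features and models are not reproduced here.

      # Toy line-level defect classifier; training lines/labels are fabricated.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      lines = [
          "if (p = NULL)",                               # assignment, not comparison
          "strcpy(buf, user_input);",                    # unbounded copy
          "if (p == NULL)",
          "strncpy(buf, user_input, sizeof(buf) - 1);",
          "free(ptr); free(ptr);",                       # double free
          "free(ptr); ptr = NULL;",
      ]
      labels = [1, 1, 0, 0, 1, 0]  # 1 = defective

      model = make_pipeline(
          CountVectorizer(token_pattern=r"\S+"),  # crude lexical tokens
          LogisticRegression(),
      )
      model.fit(lines, labels)
      # Probability that a previously unseen line is defective.
      print(model.predict_proba(["if (q = NULL)"])[0, 1])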

  14. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    PubMed Central

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  15. A meta-composite software development approach for translational research.

    PubMed

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  16. Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology

    PubMed Central

    Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron

    2010-01-01

    Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
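
    One of the named analyses, reachability over a metabolic network, reduces to a fixpoint computation: a reaction can fire once all of its substrates are reachable, and its products then become reachable. A minimal sketch with an invented reaction set (not Pathway Tools code):

      # Forward reachability over a toy metabolic network; names are invented.
      reactions = [
          ({"glucose", "ATP"}, {"G6P", "ADP"}),
          ({"G6P"}, {"F6P"}),
          ({"F6P", "ATP"}, {"F16BP", "ADP"}),
          ({"mystery"}, {"unreachable_product"}),
      ]

      def reachable(seed_metabolites, reactions):
          """Fixpoint: fire any reaction whose substrates are all reachable."""
          reached = set(seed_metabolites)
          changed = True
          while changed:
              changed = False
              for substrates, products in reactions:
                  if substrates <= reached and not products <= reached:
                      reached |= products
                      changed = True
          return reached

      print(sorted(reachable({"glucose", "ATP"}, reactions)))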

  17. Methodology for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1990-01-01

    Applying intelligent tutoring system (ITS) technology to shuttle diagnostics would not require the rigor of the Petri net representation; however, that rigor is important for providing the animated, simulated portion of the interface, and for meeting the demands placed on the system to support training, it is valuable to have a homogeneous and consistent underlying knowledge representation. By keeping the diagnostic rule base, the hardware description, the software description, user profiles, desired behavioral knowledge, and the user interface in the same notation, it is possible to reason about all of the properties of Petri nets on any selected portion of the simulation. This reasoning provides a foundation for the utilization of intelligent tutoring systems technology.
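
    For readers unfamiliar with the notation, the basic Petri net firing rule is simple: a transition is enabled when every input place holds a token, and firing consumes one token per input arc and produces one per output arc. A minimal sketch with invented places and transitions:

      # Minimal Petri-net firing rule; places and transitions are illustrative.
      marking = {"sensor_ok": 1, "test_requested": 1, "fault_isolated": 0}
      transitions = {
          "run_test": (["sensor_ok", "test_requested"], ["fault_isolated"]),
      }

      def enabled(name):
          inputs, _ = transitions[name]
          return all(marking[p] > 0 for p in inputs)

      def fire(name):
          inputs, outputs = transitions[name]
          assert enabled(name), f"{name} is not enabled"
          for p in inputs:
              marking[p] -= 1  # consume one token per input arc
          for p in outputs:
              marking[p] += 1  # produce one token per output arc

      fire("run_test")
      print(marking)  # {'sensor_ok': 0, 'test_requested': 0, 'fault_isolated': 1}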

  18. Development and evaluation of a Loop Mediated Isothermal Amplification (LAMP) technique for the detection of hookworm (Necator americanus) infection in fecal samples.

    PubMed

    Mugambi, Robert Muriuki; Agola, Eric L; Mwangi, Ibrahim N; Kinyua, Johnson; Shiraho, Esther Andia; Mkoji, Gerald M

    2015-11-06

    Hookworm infection is a major concern in sub-Saharan Africa, particularly in children and pregnant women. Necator americanus and Ancylostoma duodenale are responsible for this condition. Hookworm disease is one of the neglected tropical diseases (NTDs) that are targeted for elimination through global mass chemotherapy. To support this there is a need for reliable diagnostic tools. The conventional diagnostic test, Kato-Katz, which is based on microscopic detection of parasite ova in faecal samples, is not effective due to its low sensitivity, brought about mainly by the non-random distribution of eggs in stool and day-to-day variation in egg output. It is tedious, cumbersome to perform and requires experience for correct diagnosis. LAMP-based tests are simple, relatively cheap, offer greater sensitivity and specificity than existing tests, have high-throughput capability, and are ideal for use at the point of care. We have developed a LAMP diagnostic test for detection of hookworm infection in faecal samples. LAMP relies on auto-cycling strand-displacement DNA synthesis performed at isothermal temperature by Bst polymerase and a set of 4 specific primers. The primers used in the LAMP assay were based on the second Internal Transcribed Spacer (ITS-2) region and designed using Primer Explorer version 4 software. The ITS-2 region of the ribosomal gene (rDNA) was identified as a suitable target due to its low mutation rates and substantial differences between species. DNA was extracted directly from human faecal samples, followed by LAMP amplification at an isothermal temperature of 63 °C for 1 h. Amplicons were visualized using gel electrophoresis and SYBR green dye. Both specificity and sensitivity of the assay were determined. The LAMP-based technique developed was able to detect N. americanus DNA in faecal samples. The assay showed 100 % specificity and no cross-reaction was observed with other helminth parasites (S. mansoni, A. lumbricoides or T. trichiura). The developed LAMP assay was 97 % sensitive and DNA at concentrations as low as 0.4 fg was amplified. The LAMP assay developed is an appropriate diagnostic method for the detection of N. americanus DNA in human stool samples because of its simplicity, low cost, sensitivity, and specificity. It holds great promise as a useful diagnostic tool for use in disease control where infection intensities have been significantly reduced.
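
    The reported sensitivity and specificity follow from standard confusion-matrix arithmetic; a minimal sketch with hypothetical counts (the study's underlying sample counts are not given here):

      # Confusion-matrix arithmetic behind sensitivity/specificity figures.
      # The counts below are hypothetical, not the study's data.
      tp, fn = 32, 1   # infected samples: LAMP-positive vs LAMP-negative
      tn, fp = 40, 0   # uninfected samples: LAMP-negative vs LAMP-positive

      sensitivity = tp / (tp + fn)  # true-positive rate
      specificity = tn / (tn + fp)  # true-negative rate
      print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
      # -> sensitivity = 97.0%, specificity = 100.0%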

  19. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    PubMed

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining whether the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with a completely automated software tool, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/.
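
    DAISY's differential-algebra test is symbolic and global; a much simpler local, numerical check is to rank-test the output sensitivity matrix, where parameters that enter the output only as a product produce linearly dependent columns. A sketch on a toy model y(t) = a*b*exp(-c*t), in which a and b are not separately identifiable (this is an illustration of the concept, not DAISY's algorithm):

      # Local numerical identifiability check via sensitivity-matrix rank;
      # NOT DAISY's differential-algebra method, just an illustration.
      import numpy as np

      def output(params, t):
          a, b, c = params
          return a * b * np.exp(-c * t)  # a and b enter only as the product a*b

      t = np.linspace(0.1, 5.0, 50)
      p0 = np.array([2.0, 3.0, 0.7])
      eps = 1e-6

      # Finite-difference sensitivities dy/dp_i, one column per parameter.
      cols = []
      for i in range(len(p0)):
          dp = np.zeros_like(p0)
          dp[i] = eps
          cols.append((output(p0 + dp, t) - output(p0 - dp, t)) / (2 * eps))
      S = np.column_stack(cols)

      # Rank 2 < 3 parameters: (a, b) are not separately identifiable.
      print("sensitivity-matrix rank:", np.linalg.matrix_rank(S, tol=1e-6))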

  20. Software for enhanced video capsule endoscopy: challenges for essential progress.

    PubMed

    Iakovidis, Dimitris K; Koulaouzidis, Anastasios

    2015-03-01

    Video capsule endoscopy (VCE) has revolutionized the diagnostic work-up in the field of small bowel diseases. Furthermore, VCE has the potential to become the leading screening technique for the entire gastrointestinal tract. Computational methods that can be implemented in software can enhance the diagnostic yield of VCE both in terms of efficiency and diagnostic accuracy. Since the appearance of the first capsule endoscope in clinical practice in 2001, information technology (IT) research groups have proposed a variety of such methods, including algorithms for detecting haemorrhage and lesions, reducing the reviewing time, localizing the capsule or lesion, assessing intestinal motility, enhancing the video quality and managing the data. Even though research is prolific (as measured by publication activity), the progress made during the past 5 years can only be considered as marginal with respect to clinically significant outcomes. One thing is clear: parallel pathways of medical and IT scientists exist, each publishing in their own area, but where do these research pathways meet? Could the proposed IT plans have any clinical effect, and do clinicians really understand the limitations of VCE software? In this Review, we present an in-depth critical analysis that aims to inspire and align the agendas of the two scientific groups.

  1. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations.

  2. Cutoff Finder: A Comprehensive and Straightforward Web Application Enabling Rapid Biomarker Cutoff Optimization

    PubMed Central

    Budczies, Jan; Klauschen, Frederick; Sinn, Bruno V.; Győrffy, Balázs; Schmitt, Wolfgang D.; Darb-Esfahani, Silvia; Denkert, Carsten

    2012-01-01

    Gene or protein expression data are usually represented by metric or at least ordinal variables. In order to translate a continuous variable into a clinical decision, it is necessary to determine a cutoff point and to stratify patients into two groups, each requiring a different kind of treatment. Currently, there is no standard method or standard software for biomarker cutoff determination. Therefore, we developed Cutoff Finder, a bundle of optimization and visualization methods for cutoff determination that is accessible online. While one of the methods for cutoff optimization is based solely on the distribution of the marker under investigation, other methods optimize the correlation of the dichotomization with respect to an outcome or survival variable. We illustrate the functionality of Cutoff Finder by the analysis of the gene expression of estrogen receptor (ER) and progesterone receptor (PgR) in breast cancer tissues. The distribution of these important markers is analyzed and correlated with immunohistologically determined ER status and distant metastasis free survival. Cutoff Finder is expected to fill a relevant gap in the available biometric software repertoire and will enable faster optimization of new diagnostic biomarkers. The tool can be accessed at http://molpath.charite.de/cutoff. PMID:23251644
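
    One common outcome-based criterion for cutoff optimization is maximizing the Youden index over candidate cutoffs; the sketch below uses that criterion with simulated marker values as an illustration, and is not necessarily Cutoff Finder's exact method.

      # Cutoff selection by maximizing the Youden index (sensitivity +
      # specificity - 1); an illustration, not Cutoff Finder's exact method.
      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical marker values for outcome-negative and outcome-positive cases.
      neg = rng.normal(loc=1.0, scale=0.5, size=200)
      pos = rng.normal(loc=2.0, scale=0.5, size=200)
      values = np.concatenate([neg, pos])
      outcome = np.concatenate([np.zeros(200, bool), np.ones(200, bool)])

      best_cut, best_j = None, -1.0
      for cut in np.unique(values):  # every observed value is a candidate cutoff
          pred_pos = values > cut
          sens = (pred_pos & outcome).sum() / outcome.sum()
          spec = (~pred_pos & ~outcome).sum() / (~outcome).sum()
          j = sens + spec - 1.0
          if j > best_j:
              best_cut, best_j = cut, j

      print(f"optimal cutoff ~ {best_cut:.2f} (Youden index {best_j:.2f})")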

  3. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk has been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
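
    A minimal weighted-sum MCDA sketch follows, with invented criteria, weights, scores, and package names (the review itself scored packages against over 100 criteria):

      # Weighted-sum MCDA ranking; criteria, weights, and scores are invented.
      criteria_weights = {
          "documentation": 0.3,
          "active_support": 0.4,
          "user_defined_exposure": 0.3,
      }
      scores = {  # each criterion scored 0-5 per (hypothetical) package
          "PackageA": {"documentation": 4, "active_support": 5, "user_defined_exposure": 2},
          "PackageB": {"documentation": 3, "active_support": 3, "user_defined_exposure": 5},
      }

      def mcda_score(package_scores):
          """Weighted sum of a package's criterion scores."""
          return sum(criteria_weights[c] * s for c, s in package_scores.items())

      for name, s in sorted(scores.items(), key=lambda kv: -mcda_score(kv[1])):
          print(f"{name}: {mcda_score(s):.2f}")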

  4. Advances in the diagnosis of premenstrual syndrome and premenstrual dysphoric disorder.

    PubMed

    Futterman, Lori A

    2010-01-01

    Premenstrual disorders negatively impact the quality of life and functional ability of millions of women. The two generally recognized premenstrual disorders are premenstrual syndrome (PMS) and premenstrual dysphoric disorder (PMDD). These disorders are characterized by a wide variety of nonspecific mood, somatic and behavioral symptoms that occur only during the late luteal phase of a woman's cycle and disappear soon after the onset of menstruation. This paper reviews the diagnostic criteria for PMS and PMDD, describes some of the more common symptom diaries and other tools used to diagnose premenstrual disorders, and discusses the challenges inherent in diagnosing PMS and PMDD. A survey of peer-reviewed articles and relevant texts provided diagnostic criteria, descriptions of diagnostic tools and information about diagnostic challenges. The many nonspecific symptoms associated with premenstrual disorders complicate the diagnostic process. The use of proven symptom diaries and other diagnostic tools should aid in the differential diagnosis of premenstrual disorders. Patients need to report bothersome premenstrual symptoms, and clinicians should become more proficient in the diagnostic process in order to prevent underdiagnosis of these disorders.

  5. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    PubMed

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, based on a text-mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
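
    The extraction step is straightforward text mining; a minimal sketch in Python using regular expressions over hypothetical patent-page text and writing CSV output (the real IPAT is a C# tool targeting Excel):

      # Regex-based field extraction to CSV; the patterns and sample text are
      # illustrative, and the real IPAT is a C# tool writing to Excel.
      import csv
      import re

      sample_pages = [
          "Publication number US1234567B2 Priority date 2010-05-14 Assignee Acme Corp",
          "Publication number EP7654321A1 Priority date 2012-11-02 Assignee Beta Ltd",
      ]
      pattern = re.compile(
          r"Publication number (?P<pubno>\S+) "
          r"Priority date (?P<date>\d{4}-\d{2}-\d{2}) "
          r"Assignee (?P<assignee>.+)$"
      )

      with open("patents.csv", "w", newline="") as fh:
          writer = csv.DictWriter(fh, fieldnames=["pubno", "date", "assignee"])
          writer.writeheader()
          for page in sample_pages:
              match = pattern.search(page)
              if match:
                  writer.writerow(match.groupdict())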

  6. [Construction of educational software about personality disorders].

    PubMed

    Botti, Nadja Cristiane Lappann; Carneiro, Ana Luíza Marques; Almeida, Camila Souza; Pereira, Cíntia Braga Silva

    2011-01-01

    The study describes the experience of building educational software in the area of mental health. The software was developed to enable nursing students to identify personality disorders. In this process, we applied the pedagogical framework of Vygotsky and the theoretical framework of the diagnostic criteria defined by DSM-IV. From these references, characters with personality disorders were identified in stories and/or children's movies. The software's database was built with multimedia content: graphics, sound, and explanatory material. The software was developed as an educational game, with questions at increasing levels of difficulty, using Microsoft Office PowerPoint 2007. The authors believe in the validity of this teaching-learning strategy for the area of mental health nursing.

  7. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    EPA Science Inventory

    Computer-Aided Process Engineering has become established in industry as a design tool, with the establishment of the CAPE-OPEN software specifications for process simulation environments. CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  8. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  9. SafetyAnalyst : software tools for safety management of specific highway sites

    DOT National Transportation Integrated Search

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  10. Overview of T.E.S.T. (Toxicity Estimation Software Tool)

    EPA Science Inventory

    This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...

  11. Desktop Publishing.

    ERIC Educational Resources Information Center

    Stanley, Milt

    1986-01-01

    Defines desktop publishing, describes microcomputer developments and software tools that make it possible, and discusses its use as an instructional tool to improve writing skills. Reasons why students' work should be published, examples of what to publish, and types of software and hardware to facilitate publishing are reviewed. (MBR)

  12. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  13. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  14. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  15. Cementitious Barriers Partnership FY2013 End-Year Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G. P.; Langton, C. A.; Burns, H. H.

    2013-11-01

    In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 adds analysis of chloride attack and of dual-regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material-property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one to support the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis of the most probable degradation mechanisms of the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of an SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate the performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of the CBP software tools being developed in FY2014. The CBP software and simulation tools differ from other efforts in that all the tools are based upon specific and relevant experimental research on cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0. Also, efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.

  16. Platform-independent software for medical image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin

    1997-05-01

    We have developed a software tool for image processing over the Internet. The tool is a general-purpose, easy-to-use, flexible, platform-independent image processing software package with functions most commonly used in medical image processing. It provides for processing of medical images located either remotely on the Internet or locally. The software was written in Java, the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before he/she runs it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the Internet or on the local computer. Using a 512 X 512 X 8-bit image, a 3 X 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds while a 3 X 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computers. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need of any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and understanding of image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.
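
    The two benchmarked operations are easy to state precisely; a minimal NumPy sketch of a 3 X 3 box convolution and a window/level mapping on a synthetic 512 X 512 8-bit image follows (the original tool was written in Java; this is an illustration, not its code):

      # 3x3 mean convolution and window/level on a synthetic 8-bit image;
      # a NumPy illustration of the benchmarked operations, not the Java tool.
      import numpy as np

      img = np.random.default_rng(0).integers(0, 256, (512, 512)).astype(np.float32)

      # 3x3 box convolution via shifted weighted sums over the 510x510 interior.
      kernel = np.ones((3, 3), np.float32) / 9.0
      out = np.zeros((510, 510), np.float32)
      for di in range(3):
          for dj in range(3):
              out += kernel[di, dj] * img[di:di + 510, dj:dj + 510]

      # Window/level: map [level - window/2, level + window/2] onto [0, 255].
      window, level = 128.0, 100.0
      wl = np.clip((img - (level - window / 2)) * (255.0 / window), 0, 255).astype(np.uint8)
      print(out.shape, wl.dtype)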

  17. The Pathologist 2.0: An Update on Digital Pathology in Veterinary Medicine.

    PubMed

    Bertram, Christof A; Klopfleisch, Robert

    2017-09-01

    Using light microscopy to describe the microarchitecture of normal and diseased tissues has changed very little since the middle of the 19th century. While the premise of histologic analysis remains intact, our relationship with the microscope is changing dramatically. Digital pathology offers new forms of visualization, and delivery of images is facilitated in unprecedented ways. This new technology can untether us entirely from our light microscopes, with many pathologists already performing their jobs using virtual microscopy. Several veterinary colleges have integrated virtual microscopy in their curriculum, and some diagnostic histopathology labs are switching to virtual microscopy as their main tool for the assessment of histologic specimens. Considering recent technical advancements of slide scanner and viewing software, digital pathology should now be considered a serious alternative to traditional light microscopy. This review therefore intends to give an overview of the current digital pathology technologies and their potential in all fields of veterinary pathology (ie, research, diagnostic service, and education). A future integration of digital pathology in the veterinary pathologist's workflow seems to be inevitable, and therefore it is proposed that trainees should be taught in digital pathology to keep up with the unavoidable digitization of the profession.

  18. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners or very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as very large packages containing a variety of methods and tools. This results in software that is usually very expensive to acquire and also very difficult to use. The difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex software packages is to provide a powerful tool for a specific group of specialists. However, they are not necessarily required by the majority of the upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are only compatible with one specific operating system. Many point cloud customers are not point cloud processing experts or willing to pay the high acquisition costs of this expensive software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user-oriented design improves the user experience and allows us to optimize our methods for the creation of efficient software. In this paper we introduce the Pointo family as a series of connected software tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to point clouds.

  19. Development of the ITER magnetic diagnostic set and specification.

    PubMed

    Vayakis, G; Arshad, S; Delhom, D; Encheva, A; Giacomin, T; Jones, L; Patel, K M; Pérez-Lasala, M; Portales, M; Prieto, D; Sartori, F; Simrock, S; Snipes, J A; Udintsev, V S; Watts, C; Winter, A; Zabeo, L

    2012-10-01

    ITER magnetic diagnostics are now in their detailed design and R&D phase. They have passed their conceptual design reviews and a working diagnostic specification has been prepared aimed at the ITER project requirements. This paper highlights specific design progress, in particular, for the in-vessel coils, steady state sensors, saddle loops and divertor sensors. Key changes in the measurement specifications, and a working concept of software and electronics are also outlined.

  20. Teaching structure: student use of software tools for understanding macromolecular structure in an undergraduate biochemistry course.

    PubMed

    Jaswal, Sheila S; O'Hara, Patricia B; Williamson, Patrick L; Springer, Amy L

    2013-01-01

    Because understanding the structure of biological macromolecules is critical to understanding their function, students of biochemistry should become familiar not only with viewing, but also with generating and manipulating structural representations. We report a strategy from a one-semester undergraduate biochemistry course to integrate use of structural representation tools into both laboratory and homework activities. First, early in the course we introduce the use of readily available open-source software for visualizing protein structure, coincident with modules on amino acid and peptide bond properties. Second, we use these same software tools in lectures and incorporate images and other structure representations in homework tasks. Third, we require a capstone project in which teams of students examine a protein-nucleic acid complex and then use the software tools to illustrate for their classmates the salient features of the structure, relating how the structure helps explain biological function. To ensure engagement with a range of software and database features, we generated a detailed template file that can be used to explore any structure, and that guides students through specific applications of many of the software tools. In presentations, students demonstrate that they are successfully interpreting structural information, and using representations to illustrate particular points relevant to function. Thus, over the semester students integrate information about structural features of biological macromolecules into the larger discussion of the chemical basis of function. Together these assignments provide an accessible introduction to structural representation tools, allowing students to add these methods to their biochemical toolboxes early in their scientific development. © 2013 by The International Union of Biochemistry and Molecular Biology.
